May 22 2015
May 22

Last week, many of us were in sunny Los Angeles for DrupalCon 2015. Though many were seasoned veterans, it was my first time at a Con. It was a whirlwind of team building, a magical Prenote, great one-on-one conversations and plenty of Drupal talk. Needless to say, I'm still recovering! But one thing is certain: our team had a wonderful time. Here are some of their takeaways:

Mario Hernandez

Mario Hernandez - Front-end Developer

Having DrupalCon in my hometown of Los Angeles was great because I could enjoy the conference, meet with my friends and go home every night to my wife and kids.

I attended several sessions, mostly front-end related, but the biggest event for me was being able to speak at DrupalCon for the first time. The topic was Advanced Layouts with Flexbox.

I have to admit I was concerned because my talk was scheduled as one of the last sessions of the conference, right alongside my colleagues Matt Davis and Jason Smith, who would be talking about the front-end framework they put together for The Weather Channel. Not to mention it was also at the same time as Dries’ Q&A session. I was certain there would be no audience in my session.
The time came for me to speak and I was shocked when I realized the room was almost full. I estimated a minimum of 150 attendees, which completely surprised me. I thought the presentation went well and the feedback I received was positive. I walked out of the room singing in the back of my mind the words of Ice Cube: “Today was a good day!”

Mark Casias

I had a great time in LA with all my favorite, and newly favorite, Drupalists. I decided to take the twelve-hour drive from Albuquerque to LA (and back), and that wasn’t as horrible an idea as I thought it would be. This was also my first ‘con with an actual supporting company, so it was fun to see the other side. Talking to people and playing corn hole at the Mediacurrent booth was a blast. Additionally, it was a lot of fun hosting the karaoke machine at the Mediacurrent after party. I would be a jerk if I didn’t thank the fine Canadians at Digital Echidna, where I won the nifty Drupal hockey jersey, and at Opin, also from Canada, who gave me a TV! Of course, the reason that we go there is the learning. My most memorable session was given by David Diers of Four Kitchens. His session was API Design: The Musical, which was a great primer on how to create a good program API, along with his guitar and some pretty catchy original songs. I truly hope he gets that album to drop.

Bob Kepford

This is the second DrupalCon I've attended in my home state. As usual, my favorite part of DrupalCon was meeting and reconnecting with people. My favorite session this year was Decoupled Drupal: When, Why, And How with Amitai Burstein and Josh Koenig. These two teamed up to talk about Decoupled Drupal. They offered reasons why a decoupled approach is a good idea but also why it is not a panacea. When you take Drupal's head off you lose a lot that Drupal does well. Josh mentioned several times that Drupal has the opportunity to become one of the best backends for hot frontend frameworks like Angular and Ember. They raised a rallying cry to Drupal developers to solve the problems that decoupled Drupal creates. Mark, Jason, Mario, and I also recorded a live episode of our podcast, which was a lot of fun.


Damien McKenna

Having missed the past two DrupalCons, it was great to be back. My week focused on all things Panels, with a dash of Metatag and usability for good measure. On Tuesday I led a panel session providing a State Of The Union update for all things Panels-related. My fellow Panels-ists and I demonstrated the latest 'n greatest in the Panels world, concluding with a brief preview of things to come in Drupal 8. Later that day we held a BOF with a round-table discussion of how to use the Panels modules and some of the pain points ("pane-points"?) site builders experienced. On Wednesday there was a BOF on Panopoly, a customized distribution based upon many Panels modules. We rounded off the week with a code sprint focusing on stability improvements. The best part of this was getting so many maintainers and active users together to discuss our goals, pain points and how to collaborate further. It was also great to catch up with friends I'd not seen in several years and to finally meet many others I'd gotten to know via drupal.org and Twitter.

Nathan James

I have heard that it almost never rains in LA. Well, during my first DrupalCon, it absolutely rained!  Besides the water falling from the sky, it rained t-shirts, nerf guns, inspiring and informative sessions, great interactions with coworkers and the Drupal community, musical numbers, after parties, and valuable knowledge and ideas.  Being newer than most to the community, I found it empowering to get a history of Drupal in the Driesnote and a better vision of the impact this technology currently has and where it is going.  Seeing Mediacurrent give some sessions made me proud to be a part of this company.  I also appreciated the honest evaluation of Decoupled Drupal.  Going in expecting more hype talk, it was good to see Josh Koenig and Amitai Burstein give a reality check, highlighting that there are many things in Drupal that we take for granted and that would need to be rebuilt when using the Drupal backend with a different frontend.  Lastly, it would be simply wrong not to mention how Mediacurrent gave the most amazing after party.  Standing room only, in the loud roar of conversation and fun, you could just feel community growing and memories being made as good beer flowed and karaoke … happened.


Plugging into the Drupal community is a wonderful thing. Though we've wrapped up in Los Angeles, we're already gearing up for DrupalCon New Orleans! Will you be there? Laissez le bon temps rouler!

May 22 2015
May 22

Commercial Progression presents Hooked on Drupal, “Episode 9: DrupalCon LA 2015 Highlights with Steve Burge from OSTraining”.  In this special DrupalCon edition of Hooked on Drupal, we conferenced in Steve Burge of OSTraining for an on-the-ground report from Los Angeles.  Held May 11-15, 2015, DrupalCon LA was the premier event for the Drupal community.  Steve brings us the inside scoop on highlights and takeaways as the conference wraps up.  Additionally, Alex Fisher (also a DrupalCon veteran) shares his memories and insights from past DrupalCons.  Commercial Progression has recently sponsored OSTraining with a $5,000 Kickstarter backing to bring Drupal 8 upgrade training to the masses.  This new collection of video resources will be released in September 2015.  With Dries' call at DrupalCon to support Drupal as a public utility, this announcement seems especially timely.

Hooked on Drupal is available for RSS syndication here at the Commercial Progression site. Additionally, each episode is available to watch online via our YouTube channel, within the iTunes store, on SoundCloud, and now via Stitcher.

If you would like to participate as a guest or contributor, please email us at

[email protected]


Content Links and Related Information

OSTraining logo


Hooked on Drupal Content Team

ALEX FISHER - Founder of Commercial Progression

STEVE BURGE - Founder of OSTraining


Left, Alex Fisher, founder and owner of Commercial Progression in Northville, Mich.
Right, Steve Burge of Sarasota, Fla., founder and CEO of OSTraining

Podcast Subscription

Hooked on Drupal Episode 9 - DrupalCon 2015 Review with Steve Burge of OSTraining Podcast

May 22 2015
May 22
/**
 * Implements hook_action_info().
 */
function mymodule_action_info() {
  return array(
    'mymodule_update_products' => array(
      'type' => 'entity',
      'label' => t('Update products by 2%'),
      'configurable' => FALSE,
      'triggers' => array('any'),
      'pass rows' => TRUE,
    ),
  );
}

/**
 * Action callback: raises a product's price by 2%.
 */
function mymodule_update_products(&$entity, $context) {
  $product_id = $entity->product_id;
  // Commerce stores amounts as integers in minor units (cents).
  $price = $entity->commerce_price[LANGUAGE_NONE][0]['amount'];
  $updated_price = 1.02 * $price;
  $affected_rows = db_update('field_data_commerce_price')
    ->fields(array('commerce_price_amount' => $updated_price))
    ->condition('entity_id', $product_id)
    ->execute();
}

/**
 * Rounds a line item's unit price to the nearest dollar.
 *
 * Amounts are stored in cents, so a precision of -2 rounds to the
 * nearest 100 cents.
 */
function mymodule_round_up_line_item_price($line_item_id) {
  $line_item = commerce_line_item_load($line_item_id);
  return round($line_item->commerce_unit_price[LANGUAGE_NONE][0]['amount'], -2);
}
May 22 2015
May 22

The Drupal 8 multilingual team is really great at spreading know-how about the new things in the upcoming version, so we had our session (1h) and workshop (2h) recordings published and widely available. While we of course love our baby and can talk all day about it, who has hours when they just want to explore what is coming up? We addressed just that this week with the following.

1. New 2m22s introduction video with the key benefits

[embedded content]

2. A quick summary of key benefits and an easy-to-skim features list: it covers the top 12 benefits and provides the more detailed information in an easy-to-skim text form. And yeah, there's that 1h session video if you have the time.

3. Easy to launch demo to try features out

Thanks to our work on the multilingual workshops for DrupalCons, BADCamp and DrupalCamps, we have a demo with sample content in 4 languages that you can try out in your browser for 30 minutes, without any registration or local software install required.

4. Check out who voted with their feet already

Drupal 8 is not yet released, yet there are numerous live multilingual Drupal 8 sites helping with nature preservation, finding health professionals or concert tickets, among other good uses. Now there is a handy list of these sites to review.

If you like what you see, we still have guided workshops (those that last 2h). The next one is coming up right this Sunday at DrupalCamp Spain. We also believe that the multilingual team is one of the best to get involved with if you want to know Drupal 8 better and give back some to improve the new version as well. We have weekly meetings and a huge sprint coming up at DrupalCon Barcelona. Maybe we'll have some opportunity to celebrate as well. See you there!

May 22 2015
May 22

Years ago now, the Drupal community adopted Git as a version control system to replace CVS. That move has helped development, since the distributed nature of Git allows better tracking of work privately before uploading a patch to drupal.org.

Sandbox repositories allow contributors to clone an existing project to work on independently (therefore not needing permissions for the canonical repository), but there is currently no way that I know of to request that those changes are pulled back, facilitate a review of changes and then merge the changes in (a pull request).

Hopefully that functionality is on the way!

But as a community, the challenge is not just the development on drupal.org, collaboration with GitHub, or whatever form the technical change takes. Alongside those changes, we need the workflows that will help us better manage multiple versions, allow fast bug fixes whilst features are being tested, and provide for reviews without alienating developers. And the technical element goes hand in hand with the workflow.

As an example, for the Drupal PM module, we recently debated how to set up Git branches to allow more flexibility than the traditional "single line of code" inherited from CVS.

There were a few criteria that the new solution had to have:

  • Flexibility to apply bug fixes to a release more quickly: under the "single line of code" approach, releasing only bug fixes would require ad hoc branches and tags.
  • Fit with drupal.org infrastructure: in particular, we'd like users to be able to test a development version without cloning from Git. So the development release on drupal.org needed to correspond to an appropriate codeset for people to test.
  • Alignment to industry standard approaches where possible: Looking into what is used elsewhere in the software world, the Gitflow model has been received well.

Putting all of this together and discussing it on Skype and in a drupal.org issue, we came up with a branching model that seems to fit these criteria.

For each major version of the module (i.e., 7.x-1.x, 7.x-2.x, 8.x-1.x), we will have the following branches:

  • Release branches: There will be one release branch for each major version, named after the version (for example: "7.x-1.x"). The codebase in here will always be the release candidate for the next point release, and those point releases will always be tagged from this release branch.
  • Development branches: There will be one development branch for each major version, named "develop-[version]" (for example: "develop-7.x-1.x"). This will effectively be a staging branch for the next release but one. Features will be merged into here, and then this development branch will be merged into the release branch when the next release candidate is required.
  • Feature branches: There will be one feature branch for each feature (drupal.org issue), named "feature-[issue]-[title]" (for example, "feature-12345-add-feature"). These will be worked on until the given feature is finished. Once completed, the feature branch is merged into the development branch.
  • Hotfix branches: There will be one hotfix branch for each bug fix (drupal.org issue), named "hotfix-[issue]-[title]" (for example, "hotfix-12345-fix-bug"). These will be worked on until the bug is confirmed fixed. Once completed, the hotfix branch is merged into both the development and release branches.
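For illustration, here is how the branch flow above plays out with plain Git commands; this is a minimal sketch run in a throwaway repository, where the issue number 12345 and the demo identity are purely illustrative:

```shell
# Minimal sketch of the branching model in a throwaway repository.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "demo@example.com"     # local identity for the demo commits
git config user.name "Demo"
git commit -q --allow-empty -m "Initial commit"
git branch -M 7.x-1.x                        # release branch for the major version
git checkout -q -b develop-7.x-1.x           # development (staging) branch
git checkout -q -b feature-12345-add-feature # one feature branch per issue
git commit -q --allow-empty -m "Issue #12345: add feature"
git checkout -q develop-7.x-1.x
git merge -q --no-ff -m "Merge feature" feature-12345-add-feature  # feature into develop
git checkout -q 7.x-1.x
git merge -q --no-ff -m "Merge develop" develop-7.x-1.x            # develop into release
git tag 7.x-1.1                              # point release tagged on the release branch
```

A hotfix branch would follow the same pattern, except that on completion it is merged into both the development and release branches.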

We're just beginning to use this system in its entirety, and I hope that it works out.

One caveat is that the system only works for developers with permissions on the project repository. I would love for any contributor to be able to fit into this model and to have the pull request system available for the final merge... perhaps soon...

May 22 2015
May 22

If you're building a Drupal website with a lot of content for a community of users, chances are you'll need to set up some editorial controls. Starting with the Workbench and Workbench Moderation modules, you can create editorial workflows for content types. Nodes pass through different 'States', like Draft, Needs Review, and Published. Different User Roles control the flow of nodes through these different states. For example, you could create a Contributor role, who has permission to create new nodes and promote them to the Needs Review state. Content in the Needs Review state isn't published, so you could also create an Editor role, who can then promote nodes in the Needs Review state to Published, or reject nodes in the Needs Review state by setting them back to Draft. Drupal has fairly extensive documentation on working with Workbench and how to set up Moderation. 

Workbench Moderation works very well for high-level editorial controls, but if you need more granular control of your content, you should check out the Workbench Access module. For this example, we are going to enforce content moderation where only certain users can edit or moderate changes to Membership pages while other users can edit or moderate changes to Career pages. I like working with examples and screenshots (videos and screenshares make my eyes glaze over) so here is the basic setup.

We have three Content Types that we want to moderate with Workbench:

  1. Webinars: custom content type with date, time, and location.
  2. Internships: custom content type with dates, location, and description.
  3. Pages: Drupal's page content type out-of-the-box. This website has hundreds of pages, divided into many sections. For simplicity's sake, we'll just say there are three sections: Membership, Careers, and Professional Interests.

We have three levels of access on this site:

  1. Administrators: can pretty much do anything and everything, and can publish any new content or edits to existing content without asking anyone’s permission.
  2. Content Editors: are in charge of moderating content. They are the ones who have the final say in terms of approving new content for publication, as well as approving new edits to existing content for publication. In Workbench-ese, editors can promote nodes in the 'Needs Review' state to 'Published' or reject nodes in the 'Needs Review' state by setting them back to 'Draft'.
  3. Content Contributors: can edit content, but need their edits to be approved by editors before they are published. In Workbench-ese, contributors can only create new 'Drafts' and then submit them for review by editors by promoting nodes to the 'Needs Review' state. 

These three levels of access work well for specialized content types (for example, Webinars and Internships) where we can set permissions and editorial controls globally on a content-type basis. All contributors can create new Webinars and edit all Webinars, and all editors can approve Webinars for publication. This system breaks down for the page nodes. We need a way to differentiate the editorial controls of the pages in the Membership section from the pages in the Careers section and the Professional Interests section. This isn’t something Workbench Moderation does out of the box, so we will need to add Workbench Access for this granular control of the different sections.

I'm going to skip the Workbench Moderation setup, as this was pretty straightforward and is covered already pretty well in the Drupal documentation. 

The setup assumes the following:

  • You have downloaded and enabled Workbench, Workbench Moderation, and Workbench Access.
  • You have already configured workbench (admin/config/workbench/moderation) in terms of the states (draft, needs review, published) you need.
  • You have set up permissions correctly for Workbench Moderation so the correct roles can edit and approve different content types for publication.
  • You have made sure that any content type you want to moderate with Workbench has been set up correctly (admin/structure/types/manage/[content-type]). Make sure to select the Enforce Workbench Access control option.

workbench access
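If you prefer the command line, the prerequisite modules above can be downloaded and enabled with Drush; this is a sketch only, assuming Drush is installed and run from a Drupal 7 site root (the machine names are the projects' standard drupal.org names):

```shell
# Sketch only: assumes Drush and an installed Drupal 7 site.
drush dl workbench workbench_moderation workbench_access     # download the projects
drush en -y workbench workbench_moderation workbench_access  # enable the modules
drush cc all                                                 # clear caches afterwards
```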

OK, now on to getting Workbench Access configured. 

Set permissions

Contributors need to be able to be assigned to sections and view access information. Editors need to be able to administer all settings for Workbench Access while Contributors need to be able to access Workbench Access sections. 

workbench access permissions

Also, confirm contributors and editors have the necessary page node permissions - we want Editors and Contributors to be able to edit any page node. Workbench Access will then add more granular control over which pages (Membership vs Careers) they can edit.

workbench access node permissions

Editorial Section Taxonomy

Set up the Taxonomy you will be using for enforcing editorial controls. Workbench Access can use either Menus or Taxonomies for editorial sections, but to me a taxonomy seems more straightforward. This taxonomy isn’t used for tagging content, so doesn’t need to be assigned to any content type. You will be associating the taxonomy with the desired content types in Workbench Access. For this example, I have created an 'Editorial Sections' taxonomy with 2 terms, Membership and Careers.

Configure Workbench Access settings.

This is the first thing we'll do at  admin/config/workbench/access/settings. In this case, we’ll be using our newly created Editorial Sections Taxonomy on Page nodes.

workbench access settings

Add roles and users to the Editorial Sections

When adding roles or users to different Editorial sections, it’s important to realize that we are just assigning users to these sections - what the users can do in these sections (edit, approve edits for publication etc) is still controlled by Workbench Moderation and permissions. We’re adding a layer on top of the content moderation provided by Workbench Moderation. Add the administrator roles to all sections at /admin/config/workbench/access/roles - adding it to the whole taxonomy will then add it to all terms within this taxonomy. Click on the '0 roles' and then add all administrators and save. Back on the Roles tab you can now see one role has been assigned to all Editorial Sections.

Now we want to individually assign users to be able to edit or moderate pages in different editorial sections of the website. For demonstration purposes, I’ve created 4 users - Member Editor, Member Contributor, Career Editor, and Career Contributor. Here is how we're going to use Workbench Access:

  • Member Editor and Career Editor are both Content Editors, so can promote nodes in 'Needs Review' to 'Published'.
  • Member Contributor and Career Contributor are both Content Contributors, so can create new nodes and submit them for review.
  • Member Editor and Member Contributor will only have rights to edit/moderate pages in the Membership section.
  • Career Editor and Career Contributor will only have rights to edit/moderate pages in the Careers section.

Click on the Editors tab, and then the 0 roles link next to the Membership section. Add the users in the auto-complete field and then hit save. Don't get confused by Workbench Access' terminology - when it says 'Editors' it just means the individual users who will be able to edit or moderate a section. It has no relationship to the Content Editors role we set up previously.

workbench access add editors

Do the same for Editors of the Careers section, and you'll see now that 2 editors are assigned to each of these sections. 

workbench access editors

Add pages to sections

Now that we’ve got our Editorial sections set up and users and roles assigned to these sections, we need to add pages to these sections. Remember, we only set up the Taxonomy that will be used to divide the pages into different Editorial sections. We haven’t assigned pages to these sections yet. We can see this if we go to any of the pages on the site - they still say 'Editorial Section: Unassigned'.

There are two ways to assign pages to Editorial sections. You can edit the individual node and select your Editorial Section in the new field that Workbench Access has added, or you can do a batch update through the Content Overview page at admin/content. After you've done that, if you view one of the nodes you have updated, it should have the Workbench Access section assigned:

Confirm it's working

To make sure the controls are working the way you want, try logging in as the different editor and contributor users. You will see that the Career Editor/Contributor users can only edit and moderate nodes assigned to the Careers section, and the Membership Editor/Contributor users can only edit and moderate nodes assigned to the Membership section.

Important note: until pages have been assigned to an Editorial Section, Workbench Access isn’t being enforced. So, if we view a page node as the Member Contributor and it says ‘Editorial Section: Unassigned’, we will be able to edit that page.

Another important note: if you want to take this another step and integrate Workbench Access with Rules, you will need to apply the patch from the related drupal.org issue. The patch will then give you the ability to set conditions using Workbench Access.

May 22 2015
May 22

Earlier this week Matt Mullenweg, founder and CEO of Automattic, the parent company of WordPress.com, announced the acquisition of WooCommerce. This is a very interesting move that I think cements the SMB/enterprise positioning between WordPress and Drupal.

As Matt points out a huge percentage of the digital experiences on the web are now powered by open source solutions: WordPress, Joomla and Drupal. Yet one question the acquisition may evoke is: "How will open source platforms drive ecommerce innovation in the future?".

Larger retailers with complex requirements usually rely on bespoke commerce engines or build their online stores on solutions such as Demandware, Hybris and Magento. Small businesses access essential functions such as secure transaction processing, product information management, shipping and tax calculations, and PCI compliance from third-party solutions such as Shopify, Amazon's merchant services and, increasingly, solutions from Squarespace and Wix.

I believe the WooCommerce acquisition by Automattic puts WordPress in a better position to compete against the slickly marketed offerings from Squarespace and Wix, and defend WordPress's popular position among small businesses. WooCommerce brings to WordPress a commerce toolkit with essential functions such as payments processing, inventory management, cart checkout and tax calculations.

Drupal has a rich library of commerce solutions ranging from Drupal Commerce -- a library of modules offered by Commerce Guys -- to connectors offered by Acquia for Demandware and other ecommerce engines. Brands such as LUSH Cosmetics handle all of their ecommerce operations with Drupal; others, such as Puma, use a Drupal-Demandware integration to combine the best elements of content and commerce to deliver stunning shopping experiences that break down the old division between brand marketing experiences and the shopping process. Companies such as Tesla Motors have created their own custom commerce engine and rely on Drupal to deliver the front-end customer experience across multiple digital channels, from traditional websites to mobile devices, in-store kiosks and more.

To me, this further accentuates the division of the CMS market with WordPress dominating the small business segment and Drupal further solidifying its position with larger organizations with more complex requirements. I'm looking forward to seeing what the next few years will bring for the open source commerce world, and I'd love to hear your opinion in the comments.

May 22 2015
May 22

If you’re anything like me, right now you’re thinking: Finally! It’s a very exciting moment for those in our field who have craved ways to collaborate, learn from experiences and refine our craft. The DrupalCon team has heard our request loud and clear, and we can now enjoy the very first Project Management Track!

What makes this awesome news

We can finally dedicate these sessions to Project Management in its own right, instead of treating it like it’s some sort of sales or business solution that needs to be sold. Those of us in the field know this universal truth: it’s about delivery, not just sales!

Who should submit and who should attend

We want seasoned professionals to submit sessions, but anyone is welcome to attend. Sessions will assume some PM knowledge is already acquired by the audience. We only get 7 sessions, so we want to make them count!

Timeline reminder

- Submit by 8 June midnight CEST

- Selection from 8-20 June

- Sessions announced 28 June 

Do this to get your session picked

Don’t submit general project management sessions. Let’s get into the nitty-gritty. Keep the emphasis off business (e.g. how to sell your estimate) and on practice (e.g. how to make your estimate). Workshops would be great for this kind of content! Do propose sessions that solve real PM problems, explore new concepts, and challenge our preconceived notions. We want the good stuff: the advice and tips that come from years of experience and many projects under the belt. Propose a variety of content formats, from workshops that focus on teaching, to case studies, to panels, to more “classic” presentations.

Let’s make it happen people

Share this post! We need to get the word out!
Submit a session! You have until Midnight CEST (that’s 6pm Eastern) on 8 June 
Share your ideas! What do you want to learn from sessions? Tell us in comments
RALLY! Let’s get so much great content we will have a dickens of a time choosing!


Bonus: Content Ideas for you!

Back in the day, I proposed a lot of session ideas and it helped generate content, so I’m repeating the experiment! I highly encourage you to steal them, tweak them, and reinvent them -  whatever you want! I hope to inspire you to share your hard-earned wisdom, check out the list at the end of this post.

Here are a few of my session ideas up for grabs:

  • Project estimation techniques
  • Project planning tips
  • Scope and Change management and handling tough conversations
  • Risk management tools, techniques and handling frequent project risks
  • Your experiences with different PM methodologies: what worked or didn’t?
  • Fixed bids pros, cons, good, bad, ugly, how to avoid them, how to use them to your advantage
  • Managing Portfolios: how to keep track and report on what matters
  • Leadership and coordination approaches
  • Pre-sales estimation: Benefits/Risks, Tips & techniques
  • PM Tools comparison: what’s the best one? why?
  • The shift: changing methodologies, issues and advice on implementing new processes
  • Handling difficult and demanding customers who don’t know the meaning of “out of scope”.
  • Priority management and backlog grooming tips/advice/tactics
  • Defining Done: techniques to make things clear
  • Lessons Learned, Post-Mortems and Retrospectives: learning from past mistakes & successes
  • Team cohesion: staying united & motivated during tough projects
  • Authority: how to wield it without being a jerk
  • The long project: how to manage big, long, projects and how they differ
  • The big nasty: worst project you ever had, and what you learned from it
  • Difficult team members: bringing up the quality of work on projects, and how to deal with them
  • Drupal Risk Management: Common Drupal project risks and how to mitigate them
  • Watergile for Drupal: Hybrid PM Tips and Techniques for Agile + Waterfall Projects
  • Drupal Resources: How to PM Volunteer Projects
  • Drupal Iterations: An agile process case study
  • Drupal Estimation Panel: Tips and Techniques for Estimating S/M/L projects
  • Drupal Estimation: Did you forget something? (things we should estimate, but often don't)
  • Before You Win: Project Estimation and Assessment
  • Drupal Assumptions: Common Assumptions that Kill Projects (and how to educate your clients)
  • Drupal Project Reporting: The Good and Bad News about Reporting Progress
  • When you’re wrong: handling project estimations & assumptions that went south bigtime.
  • Turning it around: bringing projects back on track when they are off the rails
  • Drupal Projects: Good, Bad & Ugly -- what do we love and hate about Drupal projects
  • Testing: assumptions, estimations and pitfalls of drupal testing
  • Managing different kinds of projects, do some techniques work better than others?
  • How to make estimators out of developers
  • OTOBOS: umm, does it really exist across an entire project? Has anyone ever delivered every single iteration on time on budget on scope?? If not, why is that?
  • Chaos report: top contributing factors to project failure and techniques to mitigate those risks on your drupal projects
  • Do you hate agile or some other methodology? Why? What would make it “better”?
  • Best project you ever ran case study: why was it good, what worked, what did you learn?
  • Bad PM: worst mistakes you can make.

So, let’s get cracking! Submit your sessions, spread the word and come join us in Barcelona!


Shannon Vettes
Project Management Track Chair
DrupalCon Barcelona

May 22 2015
May 22

DrupalCon LA

So I did not make it along to DrupalCon Los Angeles, but I did spend some time reading twitter, and watching the sessions online. Here are some of the sessions I found entertaining and insightful and would recommend to others.

Driesnote Keynote

Dries, as always, sets the lay of the land with Drupal. He also goes into the early days of Drupal, and how some key people he was involved with have now gone on to form organisations that centre around Drupal.

Best quote:

Obstacles don’t block the path, they are the path

[embedded content]


Larry Garfield gives an interesting talk on why sometimes it is best to say NO in order to give focus to the things that actually matter.

Best quote:

Case in point: the new MacBook Airs, they say NO TO EVERYTHING.

[embedded content]

PHP Containers at Scale: 5K Containers per Server

David Strauss explains the history of web hosting, and how this is now far more complex. David is CTO of Pantheon, and they now run 100,000+ websites, all with dev + test + production environments. Pantheon run 150+ containers on a 30GB box (205MB each on average). Really interesting talk on how to run large amounts of sites efficiently.

[embedded content]

Decoupled Drupal: When, Why, and How

Amitai Burstein and Josh Koenig give a really entertaining presentation on monolithic architectures and some of the developer frustrations they cause. They then introduce REST web services in Drupal 8, and how these can be used to provide better interfaces for consumers built on other frameworks.

[embedded content]

Features for Drupal 8

Mike Potter goes through the role Features played in Drupal 7, and how Features will adapt in Drupal 8 now that CMI is in. Features in Drupal 8 will be going back to its roots, providing ‘bundles’ of configuration for re-use.

[embedded content]

Meet Commerce 2.x

Ryan and Bojan go through 1.x on Drupal 7, and how they have chosen to develop Commerce 2.x on Drupal 8. This is a complete rewrite. The hierarchical product model is really exciting.

[embedded content]

How, When and Why to Patch a Module

Joshua Turton goes over what a patch is, when you should patch contributed modules, and how to keep track of these with Drush make.

[embedded content]

My colleague Josh also wrote a blog post on how to use Drush make.

CI for CSS: Creating a Visual Regression Testing Workflow

A topic that I am passionate about is visual regression testing. Here, Kate Kligman goes through some tools that can help you test your site for visual changes, including PhantomJS, SlimerJS, Selenium and Wraith.

[embedded content]

Speeding up Drupal 8 development using Drupal Console

Eduardo and Jesus give us an introduction to your new best friend in Drupal 8. Drupal Console is a Symfony-based CLI application that helps you write boilerplate code, e.g. to create a new module. Personally, I am excited for the Form API generator, and the ability to create a new entity with a single command.

[embedded content]

For more information see

Q&A with Dries

With Drupal 8 heading down from 130 critical issues to 22 at the time of writing, what are people's key concerns? The questions are answered by Dries, xjm, webchick and alexpott.

[embedded content]

Where can I find more videos?

Don’t worry, there are plenty more videos on the Drupal Association YouTube page.

If you have any awesome sessions that I have missed let me know in the comments.

May 21 2015
May 21

There are many dirty little secrets in Drupal 7 core’s API when it comes to inconsistencies and oversights. It’s a big part of why so much care is being placed in D8 and why it’s taking so long: people realize this is a platform that’s used for the long haul, and core decisions today will have lasting impacts a decade from now.

That said, I discovered one of these a year or so ago and kept putting it off, hoping it would go away on its own. Well, it hasn’t, and here comes a potential scenario, detailed in an ELMSLN issue queue thread, that I like to call Role-mageddon. While this doesn’t only affect distributions, install profiles and features, you are a lot more likely to run into the problem with them; and so here we go.

Example Scenario

Site 1 (Profile A)

  • Developer Adds a Feature X that adds 2 roles
  • Then creates Views, Rules, and blocks and associates roles to access / visibility
  • Then they create Feature Y with 1 role and do the same as before

Site 2 (Profile A + the additions above)

  • Developer Enables Feature Y
  • Developer Enables Feature X
  • All access / visibility criteria of Roles / functionality supplied in Y is flipped with X
  • Oh Sh….

So What happened?

Roles in Drupal are stored as id, name, weight. The id is generated by incrementing the database’s serial counter, so anonymous is always rid 1 and authenticated is always rid 2. After that, it’s the wild west: whoever comes first gets the next id.

Well, if Roles 1 and 2 are created and then Role 3, they’ll get ids of 3, 4, 5.

If Role 3 is created first and then Roles 1 and 2, they’ll still get ids of 3, 4, 5, but now all views, rules, blocks, and anything else associated with the rid identifier is associated with the wrong role!

Without this knowledge you could have, oh, I don’t know, made all your admin blocks visible to the ‘bosswhopays’ role on production and not understood why. This would also happen if you’re in dev with a role that doesn’t move up to production, created prior to the others that are about to. You move the features up, and none of the settings are kept.
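To make the collision concrete, here is a minimal Python simulation of the serial-id behaviour described above (the role names and enable order are hypothetical):

```python
# Simulate Drupal 7's serial role-id assignment: rids 1 and 2 are
# reserved for anonymous/authenticated, custom roles get the next id.
def assign_rids(role_names, next_rid=3):
    return {name: next_rid + i for i, name in enumerate(role_names)}

# Site 1 enables Feature X's roles first, then Feature Y's role.
site1 = assign_rids(["editor", "reviewer", "instructor"])
# Site 2 enables Feature Y first, then Feature X.
site2 = assign_rids(["instructor", "editor", "reviewer"])

print(site1)  # {'editor': 3, 'reviewer': 4, 'instructor': 5}
print(site2)  # {'instructor': 3, 'editor': 4, 'reviewer': 5}
# Any block or view configured for rid 3 now means "editor" on
# site 1 but "instructor" on site 2.
```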

So how do we avoid Role-mageddon?

Role Export adds a column called machine_name to the role table, then uses the machine_name value to generate an md5 hash from which the rid is created. Then, so long as machine names are unique, it effectively guarantees that rids are unique and won’t collide with other roles that you import / migrate.

The import / export order no longer matters, because machine names will always map to the same rid.
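As a sketch of the idea (not the module’s exact algorithm; the way the digest is folded into an integer below is my own illustration), deriving the rid from a hash of the machine name makes it deterministic:

```python
import hashlib

def stable_rid(machine_name):
    """Derive a deterministic role id from a machine name.

    Illustrative only: fold part of the md5 digest into an integer,
    keeping rids 1 and 2 reserved for anonymous/authenticated.
    """
    digest = hashlib.md5(machine_name.encode("utf-8")).hexdigest()
    return int(digest[:8], 16) % (2**31 - 3) + 3

# The same machine name always maps to the same rid, no matter
# which order roles are created or imported in.
assert stable_rid("editor") == stable_rid("editor")
assert stable_rid("editor") != stable_rid("reviewer")
```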

Great for the future, but what about my existing site?

Role Export already had support for automatically remapping the updated rid so your users’ roles don’t get lost, as well as the admin role variable and the permissions associated with the role. That’s great; without those, this would be basically worthless for existing sites.

What my patch of infinite lack of sleep provides is the same exact thing, but for Views, Rules, Blocks and Masquerade settings (since that last one has security implications and is popular), as well as a hook that can be invoked to fix your other variables, like IMCE, Piwik, and LTI.

May 21 2015
May 21

Mediacurrent Dropcast: Episode 5

Our first public foray, recorded during DrupalCon Los Angeles. Bob, Jason and Mark interviewed anyone who showed up to our BOF (Birds of a Feather) live, and gave away fancy Weekly Drop T-shirts. We also talked about our favorite sessions of this year’s North American ‘Con. Special thanks to Benztown Radio for the use of their equipment.

Episode 5 Audio Download Link


Bob, Jason, Mario, Mark


May 21 2015
May 21

Javascript code

Business is keenly aware of the importance of page-load time, and its impact on conversion and search engine optimization. It’s now a priority at companies like Wal-Mart, Amazon, and Mozilla.

At Acquia, we hear about it from virtually every customer. They all want to know how our platform and services can improve the performance of their websites, and how much we can speed up the responsiveness of the digital experience they are offering their users and customers.

Performance is often considered to be primarily a back-end problem, but frankly, after we dig through back-end code, we often find that poor front-end optimization is the culprit, not Drupal itself.

While internet users don't have a page-load value in mind — they’re not counting seconds — they do want their content now. A content owner’s fear is that with a finger hovering over the back button, a user's brain is doing an automatic cost-benefit analysis on whether the loading content is worth the wait. If the site is too slow, they are impatiently wondering if they can get what they’re looking for somewhere else, somewhere quicker.

It’s important for business to understand the impact of design and feature-level decisions on performance, and the importance of balancing a sophisticated and elegant user experience with nimble performance. As Engagement Managers, Architects, and Developers, it’s up to us to inform stakeholders of the impacts of their choices, offer compromises where we can, and implement in smart and responsible ways. Regardless of the heroic efforts we are asked to make at the code level, we should all be able to agree on this:

Faster Page Loads = Happier Users

This article kicks off a series about optimizing the requests made by a Drupal site after the DOM loads. The goal of the series is to give site and product owners a new set of tools to evaluate their internal performance and to provide architects and developers specific recommendations. Today we’ll tackle image handling. Subsequent posts will cover JavaScript and CSS optimization, Content Delivery Networks (CDN), semantic HTML and better content selection. We’ll start with image handling because it’s low-hanging fruit and a front-end swing-and-miss we often see.

Our first post is divided in two: Theme Images, the images that make up your design, and Content Images, the images chosen and uploaded by authors, editors, and producers.

In Theme Images we cover sprites: why you should use them, how we employ them at Acquia, and some resources to get you going. In Content Images we explore how to deliver high quality images, optimized using compression and size adjustments, and how we accomplish this at Acquia. Finally, we’ll link to some additional resources.


Your images need to be optimized. Full stop. Apply some lossy compression to that 50-image gallery. Dump all your theme images into one sprite file. Don’t serve a retina-quality image to an outdated smartphone. All of these impact page-load times, and we’ll touch on each one here.

Theme Images

We have the most control over theme images because the end users who create content on a site rarely need to manipulate them. Theme images don’t change much after the designer has created them. That makes them ideal for combining into CSS sprite files. A sprite works by combining all theme images into one file and using the x and y positioning values of the “background” CSS property to control which portion of the image is visible.

Sprites hold the advantage of existing in a singular file that is almost always smaller than the sum of its would-be piecemeal parts, plus it can be downloaded with a single HTTP request and cached for reuse. While nothing new, if you’re unfamiliar or need a refresher on sprites, CSS Tricks has a great introduction.

There are a lot of ways to create sprites, including manually in Photoshop. Various Ruby gems and Grunt/Gulp plugins make the process easier. Here at Acquia, we tend to rely on Compass to do the heavy lifting for our Professional Services builds. When creating sprites with Compass, you can use directories to group images that will form separate sprites. So, instead of creating one enormous sprite for all of my styles, I'll break them up into logically grouped images based on their use. These almost always end up being PNGs. When employing icons, I try to use a font-icon or an SVG icon if possible. And if you’re considering SVGs because they look great at different resolutions and screen sizes, you can sprite those too.
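As a minimal Compass sketch (the `icons` directory and the `twitter` icon name are hypothetical), grouping PNGs in a directory yields a sprite map plus per-icon mixins:

```scss
// Compass builds one sprite image from every PNG in images/icons/
// and generates classes and mixins named after the directory.
@import "compass/utilities/sprites";
@import "icons/*.png";

// Emit a class for every icon in the sprite...
@include all-icons-sprites;

// ...or place a single icon's background position by hand.
.social-link {
  @include icons-sprite(twitter);
}
```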

Content Images

Content images differ from theme images in that we as designers don’t have full control. We’re shackled to the whims of a writer or a content producer with a burning desire for that full-window 50-image slideshow. Nevertheless, we need to make sure those 50 images hit a sweet spot for size and compression. That means we’re applying an acceptable amount of lossy compression on our JPGs and sizing them to correspond with viewport size and device resolution.

We see a lot of designers and developers getting around responsive challenges by simply loading a larger image than necessary, not declaring dimensions on the image, and scaling the image using styles.

Instead, we should use our current best option, Drupal’s Picture module. The Picture module uses the (soon to be standardized) HTML5 picture element and is a backport of Drupal 8’s Responsive Image module, which is part of Drupal 8 core. For many, the current preferred solution is to use an image tag with “srcset” and, yes, I am aware of the ongoing conversation around Drupal 8 image handling. Presently, however, the picture element plus a polyfill is Acquia’s go-to solution for responsive images. It uses the Breakpoints module to load the correct image according to viewport size and pixel density, and uses our defined image styles to create derivatives for different viewports.

This solution takes care of both image size and compression, doing the math to find that optimized sweet spot so you don’t have to.
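For reference, the markup this approach produces looks roughly like the sketch below; the file names, breakpoints, and derivative sizes are illustrative, not the module’s exact output:

```html
<!-- Serve smaller, more compressed derivatives to narrower viewports,
     and a 2x derivative only to high-density screens. -->
<picture>
  <source media="(min-width: 850px)" srcset="hero-large.jpg 1x, hero-large@2x.jpg 2x">
  <source media="(min-width: 500px)" srcset="hero-medium.jpg">
  <!-- Fallback img for browsers (and the polyfill) to hang on to. -->
  <img src="hero-small.jpg" alt="Hero image">
</picture>
```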


Drupal can be a speedy back-end workhorse, but sloppy front-end implementations can quickly undo all your hard work. Employing the strategies I’ve outlined here can decrease your page-load times by a significant amount. Using sprites for theme images reduces the number of HTTP requests, and enables caching for future use. Drupal’s Picture Module takes the guesswork out of image delivery, optimizing with appropriate compression and size manipulation.

And this is just a start towards your faster Drupal website. In the next post in this series, I’ll show you how to optimize your JavaScript and cascading style sheets -- two more ways you can improve your site’s front end to create faster page loads, and happier customers.

May 21 2015
May 21

Drupal is an awesome tool for building sites! You imagine, you create and finally you publish your work online.

But, if you are asking yourself “What now? Is all the work finished?” , then this track is exactly what you were looking for.

Every site needs to be deployed, hosted, monitored, upgraded, scaled, security patched and maintained. DrupalCon Barcelona DevOps track can help you to achieve those goals and ensure the success of your site.

DevOps bridges the gap between the world of developers and operations.

Drupal development is well understood by the community - it is all about code. Writing code, implementing code, testing code, re-writing code… relax, have a drink, and do some more code.

Operations is all about looking after the systems that run that code: figuring out how much CPU, RAM and disk you will need to run your Drupal site, ensuring security patching, automated testing, scalability, etc. Basically, how to keep the site running and growing flawlessly.

In this track we want to expand these core concepts of DevOps for the Drupal community, and help both developers and operations achieve the long-lasting success of Drupal sites. If you want to share your experience, submit your session now.

Submit a Session


Ricardo Amaro
DevOps Track Chair
DrupalCon Barcelona

May 21 2015
May 21

In this article, we will create a basic view to display a Drupal calendar with events.

By the end of this article, you will be able to configure a basic Drupal event calendar for your website that looks like this:

Drupal Event Calendar

In order to get started, I want you to download and unzip the following modules to your Modules folder:

1. Calendar module:

2. Date module:

Enable the modules after you are done with the downloads.

Click “Modules” on the top-level menu:

Search and enable the following modules:

  • Calendar
  • Date
  • Date API
  • Date Popup
  • Date Views

We assume you already have Views module downloaded and enabled.

Let’s now create a new content type to feature the events. I will be creating two events for June – World Environment Day and Father’s Day.

Click “Structure”:

Click “Content types”:

Click “Add content type”:

I am adding the new content type as “Events”. I have given “Title of Event” as the label. Click “Save and add fields”. View the screenshot below for more info:

Under Add new field, I have provided the value “Date of Event”. Under FIELD TYPE, I chose "Date" from the dropdown. The WIDGET field gets automatically updated with the value “Pop-up calendar”. Click “Save”:

The next screen that comes up is the FIELD SETTINGS screen. Here are the important fields you will come across:

  • Date attributes to collect: lets you choose which attributes the Date field collects. I have kept the default values.
  • Collect an end date: allows you to provide an end date for an event. I ticked the box.
  • Time zone handling: provide your time zone. I have gone with the default “Site’s time zone”.

Click “Save field settings”:

You won’t have to do anything in the new screen. Enter content for Help text if you wish to:

Scroll further below and click “Save settings”:

Notice the newly created “Title of Event” and “Date of Event”. Click “Save”:

Let’s now create the Events! Click "Content" on the top-level menu:

Click “Add content”:

Click “Events”:

In the new page, provide the title and details about your event. I have pasted from Wikipedia for this page:

Scroll below and you will be asked to fill in the timings for the event by using a pop-up calendar:

This is how the timings for the event look after the details have been filled in:

Click “Save” at the bottom of the page:

Your event for World Environment Day has been created:

Let’s create one more event:

Click “Save” and the page for Father’s Day shows up:

Let’s now configure the calendar to show these events. We can do this by creating a View that highlights the calendar.

Click “Structure”:

Click “Views”:

Click “Add view from template” to completely personalize your view:

You will be allowed to select a template based on a pre-configured value. In this case, Date.

In the new screen, search for the newly created field from above, “Date of Event”. Click “Add” on the far right. Refer to the screenshot below:

Click “Continue” to go to the next screen:

You will be taken to the following screen. It features basic configuration for your calendar. On the top-left, you will find several buttons – Month (default), Week, Day, Year, Block and Upcoming. Clicking them will take you to similar screens. I want you to click “Save” on the top-right:

Click “View Month” as shown below:

You will now be able to see the calendar with the events featured. Clicking on any event will take you to its respective page:

We have now completed the basic configuration to enable a Calendar view with events.

May 21 2015
May 21

The Drupalcon song - with actions!

I am never missing the #DrupalCon #prenote again. So brilliant.

— Kelley Curry (@BrightBold) May 12, 2015

DrupalCon always leaves me full of energy, and Amsterdam 2014 was no exception. The three of us – Adam Juran, me, and my wife Bryn – sat together on the short train ride back home to Cologne. Some chit chat and reminiscing quickly led to anticipation of the next DrupalCon, in LA. We were excited about the possibilities of this world-class host city. The home of Hollywood, Venice Beach, and Disneyland sounded like a great destination, but after three years of co-writing the DrupalCon “opening ceremony” with Jam and Robert, we were more excited about the possibilities for the Prenote. We knew we had to up the ante, make something new and different from previous years, and LA seemed like a gold mine of possibilities.

Every DrupalCon, before the keynote from Dries, this small group has staged a “pre-note.” The goal of the prenote is to break the ice, to remind everyone present that Drupal is a friendly, fun, and above all, inclusive community. It’s often themed after the host city: in Munich, Jam and Robert taught everyone how to pour a good Bavarian beer, and brought in a yodeling instructor for a singalong (yodel-along?) at the end. In Portland we held a “weirdest talent” competition, featuring prominent community members juggling and beat boxing. Every year it gets more fun, more engaging, and more entertaining for the audience.

Learning how to pour beer at the Drupalcon Munich prenote, 2012


On that train ride home, we threw around a lot of possibilities. Maybe the prenote could be set on a muscle beach, with Dries as the aspiring “98 pound weakling.” Or the whole thing could be a joke on a Hollywood party. We briefly considered a reality-TV style “Real coders of Drupalcon” theme, but nobody wanted to sink that low. That’s when the idea struck: we could do it as a Disney musical!

Part of Your World

The Prenote was Jam and Robert’s baby, though. We knew that we would have to have some absolutely knock-down material to convince them of our concept. With beer in hand, the three of us started work on Part of your world from the Little Mermaid, as the client who is excited for the worst website idea ever.

“I’ve got sliders and icons a-plenty,
I’ve got OG with breadcrumbs galore.
You want five-level dropdowns?
I’ve got twenty!
But who cares? No big deal.
I want more!”

We quickly moved on to the song for the coder who would save the day, You ain’t never had a friend like me from Aladdin. We got halfway through this fun number before we realized that the song titles alone could do a lot of the convincing. Another beer, and we had a list of potential songs. There was so much material just in the song titles, we knew that the music would take center stage.

Some of our favorite titles from this first list were ultimately cut. Maybe someday we’ll flesh them into full songs for a Drupal party, but in the meantime you can let your imagination run wild. Hakuna Matata from The Lion King was to become We’ll Build it in Drupal! The Frozen parody, Do You Wanna Build a Website was a big hit, and so was Aladdin’s A Whole New Theme.

We showed our idea to Jam and Robert the first chance we got. They took one look at our list of songs and said the three words we wanted to hear: “run with it.”

You Ain’t Never had a Friend Like Me


Forum One’s Adam Juran and Campbell Vertesi as “Themer” and “Coder” at the Drupalcon Austin prenote, 2014

We divided up responsibility for  the remainder of the songs and started to experiment with the script. What kind of story could we wrap around these crazy songs? How much time did we really have, and could we do all this music? We were all absorbed in our normal work, but every chance we got, the group of us would get together to throw ideas around. I don’t think I’ve ever laughed as much as while we wrote some of these songs.

Writing parody lyrics is entertaining on your own, but as a duo it’s a laugh riot.  More than once we checked the Drupal song lyrics project for inspiration. We riffed on ideas and tried different rhyme schemes until things seemed to just “fit.”

Heigh Ho, Heigh Ho

In the last few weeks leading up to DrupalCon, Adam and I met two and three times a week for long sessions, brainstorming new lyrics. We powered through writing the script around the whole thing, and started to address the logistical problems of backtracks, props, and costumes as well.

via Mendel at Drupalcon LA. Ronai Brumett as the perfect hipster Ariel


Finally we set about casting the different songs. Adam and I had always wanted to sing the Agony duet from Into the Woods, so that one was easy. We had a tentative list of who we wanted in the other songs, but we had no idea who would be willing. All of a sudden the whole endeavor looked tenuous again. Why did we think Dries would be OK with us making a joke about Drupal 8 crashing all the time? Would Jeremy Thorson (maintainer of the testing infrastructure) even be interested in getting up on stage and singing about testing? We realized that we’d never heard these people sing karaoke, much less in front of thousands of people!

One by one we reached out to the performers and got their approval. Some of them were more enthusiastic than others. Dries replied with “OK, I trust you guys,” while Larry Garfield and Jeremy Thorson insisted on rewriting some of their lyrics and even adding verses! The day before the show, Larry was disappointed that we couldn’t find giant foam lobster claws for his version of Under the Sea from the Little Mermaid. Aaron Porter bought a genie costume and offered to douse himself in blue facepaint for his role, and Ronai Brumett spent a weekend building the perfect “hipster Ariel” costume.

When You Wish Upon a Star

On the Monday of DrupalCon, the day before the show, the cast assembled for the first time for their only rehearsal together. I arrived a few minutes late, direct from a costume shop on Hollywood Boulevard. Jam had built karaoke tracks on his laptop, and Robert had put together a prompter for the script, so the group huddled around the two laptops and tried to work through the whole show.


Via Mendel at Drupalcon LA. The prenote cast rehearses. From left to right, Larry Garfield, Aaron Porter, Adam Juran, Jeffrey McGuire, Campbell Vertesi.

The rehearsal showed us what a hit we had created. The performers had embraced the motto: “if you can’t sing it, perform it” and they started to feed off each other’s energy. We all laughed at Ronai’s dramatic rendition of Part of My Site, and the Agony Duet raised the energy even further. It turned out that Dries had never heard When You Wish Upon a Star from Pinocchio before, but he was willing to learn as long as he could have someone to sing along with him!

via Mendel at Drupalcon LA. Aaron Porter codes with his butt - on Dries Buytaert's laptop!


The rehearsal really started to hit its stride when Aaron delivered You Ain’t Never had a Dev Like Me. Aaron had never sung in public before, and we could tell he was nervous. Then the backtrack started playing with its blaring horns, and he came alive. It’s a difficult piece, with lots of fast-moving text and a rhythm that can be hard to catch. Aaron launched into it with gusto. He had us in stitches when he shouted “can your friends do this!” and grabbed Dries’ laptop to start typing with his butt. When he nailed the high note at the end with a huge grin on his face, it was a deciding moment for the group.

From that moment on we were on a ride, and we knew it. Simpletest (to the tune of Be Our Guest from Beauty and the Beast) turned out to be a laugh riot, and Jeremy led us naturally into a kick line for the grand finale. We cheered Larry’s choreography skills during the dance break of RTBC, and Ben Finklea was a natural (as ever) at leading us all in Commit, to the tune of Heigh Ho from Snow White.

Forum One UX lead Kristina Bjoran had protested the most of everyone about having to sing, but the moment she started with our version of Let it Go from Frozen, we were caught up in the feeling of it. I don’t think anyone expected the goosebumps when we sang that chorus together, but we all appreciated what it meant.

Let it Go

The morning of the show saw the whole cast up bright and early. Though we joked about doing a round of shots before going on stage, no one seemed nervous. In fact we spent most of the setup time laughing at one another. Larry discovered that he has great legs for red tights. Aaron got blue face paint everywhere. We cheered at Jam and Robert’s Mickey and Minnie costumes, and laughed at Ronai’s perfect Hipster Ariel.

Some of us had last minute changes to make: Jeremy spent his time crafting oversized cuffs for his costume. I had forgotten the belt to my ninja outfit, so we made one out of duct tape. Kristina discovered that her Elsa costume limited her movement too much for the choreography she had planned. Dries was the only one who seemed nervous to me – this guy who has spoken in public countless times was afraid of a little Disney! We sang through the song together one last time, and it was time to go on.


via Mendel at Drupalcon LA. Jeremy Thorson leads the “Simpletest” song. Behind him, from left: Campbell Vertesi, Ronai Brumett, Adam Juran, Aaron Porter, Dries Buytaert

Everyone knows the rest – or at least, you can see it on YouTube. What you probably don’t know is how hard we all laughed as we watched the show backstage. Even knowing every word, the energy from the audience was infectious. In the end, there’s nothing quite like standing in front of three thousand people and shouting together: “we come for code, but we stay for community!”


May 21 2015
May 21

Broken links suck. It's incredibly frustrating to read a great article that links to an external resource covering a subtopic in detail, only to find the link is broken. In that moment I curse whatever developer or webmaster of the external site didn't think that creating a 301 redirect was worth the effort. I end up going to the root domain to hunt for the article by topic, hoping the link text or URL slug gives enough topic or keyword clues to find it. Sometimes the external resource is gone completely, and then I'm off to try to find a cached copy.

You know what's worse than a broken external link for your Drupal site? A broken internal link.

A broken internal link is a slap in the face for user experience. You didn't create a 301 redirect from the old URL to a relevant new URL. You didn't update the link you have control over. Do you care about your reader's experience at all? I want to help you prevent broken internal links in your Drupal sites and show you a process to automate link updates. Don't slap your users, follow this process!

What are internal links

Internal links are simply links to URLs on the same domain or website. For example, this is an internal link to our Drupal CMS guides: it points from one page on this domain to another page on the same domain. We have full control over both the link href (the anchor attribute which creates a hyperlink, i.e. the link destination URL) and the linking page. The opposite, external links, are simply links to a different domain. You don't have any control over the link destination.

Why internal links are important in Drupal

Inbound links (external links on other sites to your site) are important to build domain authority for SEO, improving search rankings. Similarly, internal links help create content relationships within your site and give context to content through their link text.

Sidenote: Search engines account for internal link anchor text. Ideally, you don't want anchor text to simply say "click here", "more", or some other non-contextual, non-descriptive anchor text.

How to create internal links in Drupal body text

The primary way you'll create Drupal internal links is through body text on pages. More than likely, your WYSIWYG offers a link button that'll pop-up a dialog to input a link URL. This gives you a few methods to create internal links.

Absolute internal links

If you create a link with a fully qualified URL (the complete http(s)://subdomain.domain.extension/page-name), that's an absolute link. No part of the link is "variable"; its destination will forever be the exact URL you've input. If any part of the destination URL changes in the future, your absolute links will break, and you'll need to update them manually.

Relative internal links

You can create links that are relative to the linking page. There are a few methods of creating these relative internal links.

If you were to simply add the href of "page-name-2" to a link on a page under the /topic/ path, the href would be relative to the current location, and so the link would actually go to /topic/page-name-2. The link destination retains the depth, targeting a sibling page ("page-name-2" within /topic/).

You could also link to a page on the current domain from the top level by using a leading forward slash. If your link href were set to "/page-name-2" in the above example, the destination would become /page-name-2. The leading forward slash causes the hierarchy to collapse, resolving from the top of the domain: "/topic" is dropped.

There are some additional advanced methods of creating internal links in HTML documents, but this should give you an idea of how they work. In essence, relative links leave parts of the destination variable, so that your links keep working when certain conditions change. If your domain name were to change but your content hierarchy remained the same, your relative links would still have destinations that match your hierarchy, now relative to the new domain name. This gives you more flexibility than absolute internal links, but it's still not the best method.
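To summarize, here are the three link forms side by side; the domain and paths are placeholders for illustration:

```html
<!-- From a page at http://example.com/topic/page-name-1 -->

<!-- Absolute: breaks if the domain or the hierarchy ever changes. -->
<a href="http://example.com/topic/page-name-2">Absolute link</a>

<!-- Relative to the current location: resolves to /topic/page-name-2. -->
<a href="page-name-2">Sibling page</a>

<!-- Root-relative (leading slash): resolves to /page-name-2. -->
<a href="/page-name-2">Top-level page</a>
```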

Canonical internal links

Drupal's internal URL scheme creates canonical locations for content. These are the "non-pretty" URLs. Drupal's node system is reflected in these internal URLs: /node/<nid>

In this scheme, the <nid> represents the internal node ID (a unique identifying integer) for a particular piece of content. This works similarly for Drupal's taxonomy and term system: /taxonomy/term/<tid>

The <tid> is the term ID, identifying a taxonomy term. Using our example above, this is the canonical URL to our Drupal CMS guides: /taxonomy/term/23

The "Drupal CMS guides" term ID is 23. If you inspect or hover the actual link above, you'll see that the URL is pretty and human readable: /guide/drupal

We've set up Drupal to have an alias of "/guide/drupal" to display our "Drupal CMS guides" term. Drupal creates these pretty aliases by default ("clean URLs").

However, this is the literal HTML code that Drupal stores for that body text internal link:

<a href="/taxonomy/term/23">Drupal CMS guides</a>

This is a relative internal link to a canonical URL. Our WYSIWYG is configured to translate these internal canonical URLs to their pretty aliases on the fly. This takes care of multiple issues:

  1. If we were to decide to change our domain or store this blog post on a subdomain, the relative link would automatically reflect the domain change
  2. If we were to alter the alias from "/guide/drupal" for this particular term to "/guides/cms/drupal", the canonical URL would point to the appropriate pretty URL
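Under the hood, this alias translation mirrors what Drupal 7's path API does at render time. A minimal sketch, assuming a bootstrapped Drupal 7 site and the example term ID above:

```php
<?php
// Canonical path to pretty alias (returns the input path if no alias exists).
$alias = drupal_get_path_alias('taxonomy/term/23'); // e.g. 'guide/drupal'

// The reverse: resolve a pretty alias back to its canonical internal path.
$source = drupal_lookup_path('source', 'guide/drupal'); // 'taxonomy/term/23'

// url() applies the alias automatically when building link hrefs.
$href = url('taxonomy/term/23'); // '/guide/drupal' with clean URLs enabled
```

Because the stored href is the canonical path, the alias can change at any time without touching the stored HTML.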

This works the same way for nodes: if we link to their canonical URL and their alias is updated, Drupal will automatically translate the canonical URL to the new pretty URL on the fly. Tracking down the canonical URL manually for each internal link can be a pain, though. We're getting to automating the internal link creation process, but first let's make sure we understand why it's important.

Why creating Drupal canonical internal links is important

As your Drupal website develops, you'll undoubtedly create more content: more nodes, more terms. You'll also come up with new URL strategies and new ways to classify your content through the pieces of your URLs. Perhaps you'll decide to add part of the date to the URL, or a category the content resides in, or you'll simply want to alter a piece of content's URL to make it shorter or better reflect its contents. This will happen over time. When it does, you don't want to have to go back and update all your internal links to reflect the change, or even worse, leave broken links behind. Broken internal links degrade your search engine performance in two ways:

  1. You'll lose the internal link anchor text context and relationship between pages
  2. Search engines will devalue your content due to having broken links

Don't cause broken internal links. You have full control over fixing them and implementing a strategy to avoid them in Drupal. Let's automate the process.

Drupal Internal Link modules

To automate internal link updates, we simply want to make sure that all of our internal link destinations (hrefs) point to canonical URLs wherever possible. You could do this by tracking down the internal path and ID of every term or node you want to link to, but that requires a lot of extra effort. There's a better way. Thankfully, Drupal's community of contributed modules comes to the rescue.

Drupal WYSIWYG Module and editor plugin (TinyMCE, etc)

If you're using the WYSIWYG module plus one of the editor plugins it supports, your best bet is the LinkIt module. LinkIt provides a button in the WYSIWYG and a dialog to search for content on your site to link to. Once you click the LinkIt button in your WYSIWYG, it's just a matter of following the prompts to create internal links:

Creating Internal Links in Drupal with LinkIt

LinkIt will automatically create your link with the relative canonical URL. You don't have to hunt down the content ID or fuss with the HTML of the anchor.

Drupal CKEditor Module and CKEditor Link Module

All of our Drupal projects now utilize the CKEditor module for a WYSIWYG. This is the direction Drupal 8 has gone, and so we try to make sure the eventual upgrade will be seamless for our Drupal services clients.

The CKEditor module has an extension or submodule that performs similar work to LinkIt, the CKEditor Link module. This allows you to use the lovely CKEditor WYSIWYG to create internal paths quickly. We love it.

After following the module instructions to set up and configure CKEditor Link to work with CKEditor, you'll have a simple process for creating internal, canonical links. This is our internal link workflow using the CKEditor and CKEditor Link modules:

CKEditor Link Workflow Internal Path Process How To

  1. Select the anchor text you wish to use in the WYSIWYG
  2. Click the CKEditor Link button
  3. Make sure the "Link type" in the dialog is "Internal path"
  4. Start typing the title of your content in the autocomplete text input, then select the right suggestion

This process creates a link to the internal canonical Drupal path to the content.

CKEditor Link creates internal Drupal paths

If you ever change your domain or alter the destination page's alias pattern, this internal link will still work. Combining CKEditor and CKEditor Link modules is our favorite way to handle internal path and link creation within Drupal.

Checking for broken internal links

Now that you have a process for creating proper internal links (if you don't, make changes now), you may need to check for existing broken internal links. Drupal offers an excellent module for broken link checking, but first let's look at Google's Webmaster Tools. This free tool offers excellent insight into broken links.

Using Google Webmaster Tools to check for broken internal links

You do have a Google Webmaster Tools account, right? If you don't, sign up. If you're a manager or webmaster, ask your developer to follow these steps to check for broken links. It's quick, easy, and free. This is a great way to fix search issues, raise your ranking, and properly distribute your internal traffic. Don't slap your users with broken internal links.

After logging into the Google Webmaster Tools (GWT) and selecting your domain, find your broken internal links this way:

Use Google Webmaster Tools to find broken links

  1. Expand the "Crawl" menu
  2. Select "Crawl Errors"
  3. Select the "Not found" tab

This will provide you with a list of URLs with problems. You're interested in those that aren't found (404 errors). The URLs listed tell you which URLs Google has stumbled across and received a 404 error. These are great candidates for creating 301 redirects to the right, or at least relevant, content on your site. What you really want to check for though is where these broken URLs are linked from. Are they your own internal pages? GWT to the rescue again! Click any of the URLs listed:

Find the source of broken internal links with Google Webmaster Tools

The pop-up dialog will indicate the broken URL and where it is linked from. Go to those pages and fix your broken internal links.

Automate broken link checking in Drupal with Link Checker module

If you've got a large Drupal site, manually checking GWT for broken links may not be the most effective way to find them. Drupal's Link Checker module will automate this process for you. Set up the module and it will scan your new and existing content for links, follow them, and identify any problems through their HTTP status codes. Fix any 403s or 404s! The Drupal Link Checker module works with both internal and external links.

Why are internal broken links bad for your Drupal site?

Search engine spiders or crawlers are very busy workers. There's an entire web worth of pages to crawl and re-crawl. When they run into a broken link, they may stop where they are and move onto the next page. Broken links are a negative signal to search crawlers. Don't give them a reason to devalue your page.

I'm sure you've run into a broken link or two in your life while reading a page on some website. I bet it tempted you to stop where you were on the page and try to find a more authoritative, updated page about what you were reading. If the page owner hasn't checked their content over time to verify the links work, what's to say the content is still accurate? Not only do broken links discourage search engines, they discourage users. It's especially embarrassing when the links are internal, because the webmaster has control over both the destination page and the linking page.

Broken internal links can easily be avoided. Create a workflow that uses canonical, relative internal links. Automate your internal linking process as much as possible by following the steps above. Identify any existing broken links with GWT or Drupal's Link Checker module, and set up a process to routinely check for broken internal links. Make sure to create 301 redirects if you update existing content paths, so external sites don't end up linking to now-nonexistent paths. Don't slap your users with broken links; they're quick to fix!

This post was inspired by Mike Gifford's spring cleaning tips for Drupal sites; be sure to show your Drupal site some love with routine maintenance! Did you like this article, or do you know someone that needs some link workflow help? Share it! If you're still having trouble getting your links in order, you can hire our Drupal team!

May 21 2015
May 21

This article was submitted by our Premium Hosting Supporter Linode.

We’ve all experienced these before: slooow server hardware; unlimited disk space that is capped once you begin to actually fill it; local directory software installs because you’re not allowed to alter the root system. Managed hosting emerged to help solve these problems. And it did - but sacrificed the true power of a host’s infrastructure. Fortunately, an alternative exists that overcomes the deficiencies of both shared and managed hosting. I call it a “Freedom Host.”

What is a Freedom Host?

A Freedom Host respects your needs and creativity. It gives you full root access to the server, along with the most powerful processors and lightning-fast, solid-state storage.

Why choose a Freedom Host?

“Getting off the Island.”

This counters a long-standing community practice of exclusively using Drupal. We now see large opportunities in combining Drupal with other powerful auxiliary software. Managed providers have long offered users click-to-deploy for Drupal; but where's the Node.js button? The HAProxy button? Split-DNS? Magento? These options don't exist on a managed host.

A Freedom Host allows you to run what you want when you want.

Security is a priority when running your Drupal website, right? You verify file permissions, sanitize all site forms and enforce strict password rules to protect against risky Internet traffic. But what about protection from other websites on the same server? What about local containers running on the same private subnet as your own? A Freedom Host, whether dedicated or VPS, offers you greater security than what’s provided through today’s shared-hosting or containers.

How do I get Managed comfort with Freedom’s power?

Deployment - Deploying on a Freedom Host doesn't have to be difficult. With pre-made images, StackScripts, cPanel, or Bitnami, plenty of Drupal deployment options exist.

Drush – You can install Drush in seconds with full functionality on any Freedom Host.

Control Panels - While many Freedom Hosts provide you with a remote terminal to get started, you can install and run the GUI you want, not just what you’re limited to.

Backups & Monitoring - Any reputable Freedom Host provides a backup solution, but additional options are limitless. Save your Drupal site as a tarball, dump your MariaDB/MySQL database, or mirror to an external slave server. You can even image the entire server to back up or test locally in VirtualBox. System metric software, including Longview, New Relic, or Piwik, can measure, graph, and store server traffic.

So, what can I do with all this Freedom?

While impossible to compile a full list, some interesting Drupal projects I’ve seen include:

  • swapping out Zend PHP for Facebook's HHVM for speed improvements in Drupal 8
  • testing Drupal 8 using PHP7
  • compiling Nginx to include custom features for Drupal
  • custom compiling a kernel for improved performance.

A Freedom Host provides options when choosing what and how you run your Drupal website. Options aside, a Freedom Host is more powerful and less expensive than most managed providers. You can’t lose with Freedom.

This article was written by Ricardo N Feliciano. He is currently a Developer Evangelist for Linode, and is an Information Systems Technician in the U.S. Navy.

May 21 2015
May 21

DrupalCamp 2015 St. Louis - SLU LAW

DrupalCamp St. Louis is scheduled for June 20-21, 2015, and will be held at SLU LAW in downtown St. Louis, MO. With the camp less than a month away, there are a few important bits of news:

DrupalCamp STL.15 Keynote Speaker: Alina Mackenzie (alimac)

Alina Mackenzie is a developer and system administrator based in Chicago. In the Drupal community she is a camp organizer, speaker and communications lead for DrupalCon mentored sprints. She is passionate about learning organizations, automation, and making open source friendly for beginners.

Alina's keynote will focus on "Finding the entrance: Why and how to get involved with the Drupal community".

You can find Alina's profile (alimac) on Drupal.org.

Session Submission Deadline: May 29

Please submit your session proposals by Friday, May 29—just over a week from today! We'll notify speakers on June 5th whether a session was accepted or not.

We hope to see you at DrupalCamp St. Louis 2015! Registration will open next Monday, and sessions will be announced on June 5th.

May 21 2015
May 21

It is one thing to be brought into a project as a team player, where the project is managed or you are delivering a predefined piece of it. However, that is typically not the way things happen when doing work for a small business or an initial project which will result in a business launch.

The more challenging and demanding opportunities for a freelancer are those one-man top-to-tail projects: creating the whole megillah[1]. Here is a brief look at the steps that could give the poor shlub[2] a fighting chance.

  1. Initial Meeting: The client transfers his vision to you. Determine what specifically makes or breaks its success.

    Output: Management Summary: Mirror the client’s vision back to him in your own words, for validation.

  2. Reference site(s): Ask the client to point to sites exemplifying functionality that works and functionality that doesn’t.

    Output: Create a spreadsheet that will contain a row for each function, and identify that function as either a launch requirement, nice to have for launch, post-launch, or unneeded.

  3. Conceptual design: Using the approved spreadsheet, decide what the site will look like.

    Output: Some conceptual prototype, such as wireframes, storyboard, etc.

  4. Functional design: The details behind the elements of the conceptual design; how front-end elements should work, as well as the back-end functionality, business rules, and the seemingly small details (being able to print receipts with a receipt printer, accepting input from card-swipers, syncing with third-party applications, etc).

    Output: Functional design document.

  5. Architectural design: With all the functionality designed, it’s time to determine the architectural stack that will support it. What is the caching strategy? What services will be required (wget, cron, curl, configuration outside the Drupal docroot, specific versions for applications or services, etc)?

    Output: Architectural design document.

  6. Database design: Of course, the Drupal schema has already been done for you, but there could well be additional tables needed, indexes to be defined, perhaps even the use of alternative databases, like MariaDB, NoSQL, etc.

    Output: Schema(s) supplemental to the core Drupal schema.

    At this point, you have everything you need to determine effort and pricing. But as the client reacts to sticker shock by removing features or altering priorities, additional iterations may be needed.

  7. Artistic design: It's time to develop the design from which the page layouts, templates, CSS, and themes will emerge. It may not be your responsibility to develop the artistic design, but merely to ensure that the design embraces the functional and conceptual designs.

    Output: Artist comps, Photoshop files, etc.

  8. Hosting selection: Armed with the architectural design, it’s time to consider what hosting service will be used. What pieces are needed beyond the normal LAMP stack and Drupal? What access will be required (ssh? cPanel?) and what granularity of control is needed?

    Output: Requirements list for the client to verify with the hosting service.

  9. Development: Naturally, this is where most of your time will be spent. I normally do my development locally, with a privately accessible test site for the client to use for review. I take the site offline unless I have something I want the client to review; otherwise he can get antsy not seeing any change.
  10. Customer testing: As pages, widgets, and features progress, it's good to have the client review them as soon as possible. There is no easy answer as to whether one should point at the agreement when the client rejects something that was done as designed. That said, successful element/feature/page tests should certainly be a foundation for an ultimate system test.
  11. Production environment readiness: It’s best to install the test environment on the production environment for readiness testing in advance. That could be difficult if the domain name cannot be used yet; not having it available could cause some things to fail. But for now, this is more a test of whether all the pieces on the architecture, and all the configuration settings for the LAMP stack (or whichever is being used), are correct and viable.

This is a greatly condensed, summarized, and simplified approach; nothing in Drupal development is ever one-size-fits-all. I hope it sparks your imagination and gives you a helpful framework with which to succeed.

Image: "Eastern Lowland Gorilla Infant in Kahuzi Biega National Park" by jpmckenna is licensed under CC BY 2.0

[1] From the Yiddish megile, the scroll from which the story of Purim is read. Used to mean a long embroiled story. And yes, the source of the name for The Magilla Gorilla Show (in one episode he says, “Such a megillah over a gorilla.”)

[2] Yiddish, meaning a pitiful person.

Other articles from this issue:


Michael J. Ross

Drupal 8 may not be released until Winter, 2015. Meanwhile, there is REST for the weary; a simple, effective technique for leveraging the RESTful module in Drupal 7.

Coming Soon


Amber Himes Matz

Drupal 8 may not be ready... Oh right, we said that. Okay, so meanwhile, you want speed, you want beauty, you want passion that leaps off the screen. Here are two great methods for exposing your views components as JSON. A walk in the park, a kiss in the dark...

Coming Soon

Larry Garfield

Drupal 8 will have built-in RESTful web services. Sit tight. Or follow Palantir's steps in setting up a major media client's smörgåsbord of movies, TV shows, and other video as an API for a web app, iPhone or Android app, or even a set-top box.

Coming Soon

May 21 2015
May 21
Tags: access, d8, Drupal

Drupal 7

In Drupal 7, a hook_node_access implementation could return NODE_ACCESS_IGNORE, NODE_ACCESS_ALLOW, or NODE_ACCESS_DENY. If any implementation returned NODE_ACCESS_DENY then access was denied. If none did but one returned NODE_ACCESS_ALLOW then access was allowed. If no implementation returned either of these values then the decision was made based on other rules, but at the end of the day some code needed to grant access explicitly or access was denied. Other entities didn't have access control.

Also, blocks had some sort of access control in a very strange way: hook_block_list_alter is used -- even by core -- to remove the non-visible blocks.

Drupal 8

Drupal 8 brings a unified entity API -- even blocks become entities. It also uses many of the same objects and concepts for routing access. Instead of constants, we now use objects implementing the AccessResultInterface. You can get an instance by calling the rather self-descriptive AccessResult::allowed(), AccessResult::forbidden(), and AccessResult::neutral() methods. If you are handed an AccessResultInterface object, you can figure out which one it is by calling the isAllowed, isForbidden, and isNeutral methods on it. Only one of them can return TRUE. Access results can be cached and so carry relevant cache contexts and tags -- this is why we bother with Neutral. This caching metadata is properly merged when doing various operations on access results.

These objects are returned by hook_entity_access implementations for entity access checks, and by the check method of services tagged with access_check, which are used for routing access.

Let’s first consider a node access case. The entity access checker fires hook_entity_access and hook_ENTITY_TYPE_access (in this case hook_node_access) and tries to put the results together in a sane way. If we do not want to change the behavior from D7, then any D7 deny (now called Forbidden) should result in a Forbidden end result. If there are no Forbiddens then any Allowed should result in a final Allowed return value. Finally, if there were neither Forbiddens nor Alloweds then only Neutrals were present (if anything at all), so return a Neutral. This is called the orIf operation: if you have two result objects, run $result1->orIf($result2) to get the end result. The order doesn’t matter; $result2->orIf($result1) is the same.
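As a sketch of how this plays out, here is a hypothetical Drupal 8 hook_node_access() implementation (the module name, bundle, and permission are made up); whatever each implementation returns is combined with the others via orIf:

```php
<?php

use Drupal\Core\Access\AccessResult;
use Drupal\Core\Session\AccountInterface;
use Drupal\node\NodeInterface;

/**
 * Implements hook_node_access().
 */
function mymodule_node_access(NodeInterface $node, $op, AccountInterface $account) {
  if ($op == 'view' && $node->bundle() == 'private_page') {
    // A Forbidden here wins over any Allowed from other implementations.
    return AccessResult::forbiddenIf(!$account->hasPermission('view private pages'))
      ->cachePerPermissions();
  }
  // No opinion: defer to other implementations and the node grants system.
  return AccessResult::neutral();
}
```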

Let’s take a look at the AccessResult::allowedIfHasPermission method. What results do we want? Obviously, if the permission is present we want an Allowed to be returned. But what if the permission is not there? It cannot result in Forbidden, because then any hook_node_access implementation simply returning the result of this method would immediately deny access to the node. It really can only return Neutral: the lack of a permission means this method can’t form an opinion about access. (Using this method is strongly preferred to if ($user->hasPermission()) { return AccessResult::allowed(); } because it adds the right caching metadata.)

Let’s say we want to determine whether a field is editable. This requires the entity to be editable and the field to be editable. For the sake of simplicity, let’s presume both are controlled by a permission, and say the user has the “edit entity” permission but doesn’t have the “edit field” permission. According to the previous paragraph, AccessResult::allowedIfHasPermission($account, ‘edit entity’) is Allowed while AccessResult::allowedIfHasPermission($account, ‘edit field’) is Neutral. We cannot return Allowed in this case! So we can’t use orIf -- we need another operation: this is called andIf. Much like orIf, we want any Forbidden input to result in a Forbidden output, and again like orIf we want two Alloweds to result in Allowed. The only difference is in the case detailed above: when one is Allowed and the other Neutral, orIf results in Allowed but andIf results in Neutral.

If you want to use your knowledge of and familiarity with AND/OR, then consider first the iron rule of “any Forbidden input results in Forbidden”, and only if that rule didn’t apply, treat Allowed as TRUE and Neutral as FALSE and apply the normal AND/OR to these values. This logic is called Kleene's weak three-valued logic, where the “third value” is Forbidden. Most misunderstanding and confusion results from trying to treat Forbidden as FALSE instead of the contagious “third value” it is. The name Neutral might make you think “oh, three values are not a problem, I will just erase any Neutral I see and the resulting two values can be evaluated like normal AND/OR” -- this is absolutely incorrect! In fact, if you have two variables and both are Allowed / Forbidden, then the result of $x->orIf($y) will be the exact same as $x->andIf($y)! The outcome will only differ if either $x or $y is Neutral (and the other is Allowed).
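A minimal sketch of that truth table, assuming Drupal 8's AccessResult class is available:

```php
<?php

use Drupal\Core\Access\AccessResult;

$allowed   = AccessResult::allowed();
$neutral   = AccessResult::neutral();
$forbidden = AccessResult::forbidden();

// Forbidden is contagious under both operations.
$allowed->orIf($forbidden)->isForbidden();  // TRUE
$allowed->andIf($forbidden)->isForbidden(); // TRUE

// The only case where the two operators disagree: Allowed combined with Neutral.
$allowed->orIf($neutral)->isAllowed();  // TRUE
$allowed->andIf($neutral)->isNeutral(); // TRUE
```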

Routing vs Entity Access

We have clarified the necessity for two operators, and we have two subsystems, each using one of them: routing uses andIf, entity access uses orIf. The difference is subtle -- as we have seen, the only difference is the end result of two access checks where one is Neutral and the other is Allowed. This becomes a Neutral result in routing and an Allowed result in entity access.

Making a Decision

All this three-valued logic is nice, but at the end of the day it is all internal. The user wants to see a page: can we show it to them or do we show a 403 page? Can we show this entity? The answer cannot be “I do not know”, it must be yes or no. So the system looks at the result and says yes if it is explicitly Allowed, otherwise it says no. For routing (remember, andIf) this is very simple: if every access checker answered Allowed, then and only then can we answer yes. For entity access (remember, orIf) there should be no Forbidden answers and at least one Allowed to say yes.

May 21 2015
May 21

28 pages of unmarred perfection. This book is pure unadulterated genius.

- Chris Arlidge

Never Be Shocked Again! - Budgeting your Web Project

Are you having trouble figuring out an appropriate budget for your next web project? Our whitepaper can help!

Download your FREE COPY, to learn the different types of website projects, understand the factors that play a role in the budgeting process, and determine where your web plans will fit when it comes to costs!

Don’t ever be shocked by web costs again! A clear guide to help you plan the budget for your next web project.

Imagine this scenario. You add three different blocks in your footer region and the mockups call for them to be side by side. Here are 3 ways to accomplish this:


Using regions, we could create more footer regions and call them Footer first column, Footer second column, third and fourth, just like in the Bartik theme. I personally don’t like this solution, because what if we decide in the future to only have three columns instead of four? Then we would need to remove regions, change the CSS, clear the cache, etc. Not ideal.


Another way to do this would be to do all the work in CSS, which is the way I’ve been doing it for years.

.region-footer .block {
  position: relative;
  float: left;
  width: 33%;
  padding: 0 20px;
}

Or if you’re using Sass or Less, you could extend Bootstrap’s default grid system classes like this:

.region-footer .block {
  @extend .col-md-4;
}

This works perfectly fine and is easy to edit in the future if we want to change from 4 columns to 3.


The final solution that I will show you today is to use Drupal’s preprocess hooks to change the HTML markup to take advantage of the existing styles that Bootstrap provides.

In your template.php file create or add to the existing block preprocess hook.

/**
 * Implements hook_preprocess_block().
 */
function themename_preprocess_block(&$variables) {
  if ($variables['block']->region == 'footer') {
    array_unshift($variables['classes_array'], 'col-md-4');
  }
}

This tells all the blocks within the “footer” region ($variables['block']->region == 'footer') to add Bootstrap’s col-md-4 class to the classes_array.

This same method can also be used to add grid classes to nodes and regions with their respective hooks: hook_preprocess_node() and hook_preprocess_region().
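For instance, here is a hedged sketch of the region variant (the theme name, region, and grid class are placeholders):

```php
/**
 * Implements hook_preprocess_region().
 */
function themename_preprocess_region(&$variables) {
  // Give the first sidebar a Bootstrap grid class; 'col-md-3' is arbitrary.
  if ($variables['region'] == 'sidebar_first') {
    $variables['classes_array'][] = 'col-md-3';
  }
}
```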

May 21 2015
May 21

Normally, I would do this video style, but I'm a wildcard, people, and today we write! Thanks to one of our Code Karate supporters, Pieter, I am going to walk you through how to use Drupal Views, the Date module, and content types to automatically hide/show nodes based on a date field. Let's get started.

Make sure that you have Drupal Views installed. If you're new to Drupal, Views is like oxygen to humans: you REALLY need it. You can learn more about Views by watching this video. I also installed the Drupal Date module, which I will use to set the date for the node (explained more below). Besides Views and Date, you shouldn't need any modules beyond what already comes with Drupal core.

For my example, I am going to display a list of training times people could sign up for and then once the training session has passed the view will no longer show that training. Make sense? Let's build!

A quick note before we get ahead of ourselves. When you enable the Date module, make sure that you enable Date, Date API, Date Popup, and Date Views. These additional modules will make your life a little easier. You're welcome.

I think we're ready. To start, I am going to create a Training content type. For the fields, I am going to keep the default Title and Body fields and add an additional Training Date field. Naturally, you can add other fields that fit your unique situation, but for the purposes of this example we won't need any others.

With that content type, I am going to create five different nodes. Each of these nodes will have a different date. For the sake of showing how to hide content based on date, I am going to give one of the nodes a date in the past. This node will be used to prove that my view only shows training dates in the future.

With the nodes created, we now need to create a view. Before we start excluding dates from showing in the view, let's make sure we get a view that displays all the content we want. To do this, make sure you add all the fields you want for the training content type. For me, that means I have added the Title, Body, and Training Date fields. Also, under the Filter Criteria section I have limited it to only showing training content types. Doing this will show that I can display all the training content.

Next, we need to add a filter that will only show the training dates that are in the future and exclude those that have already happened. Remember, in my example I gave my 1st training node a date in the past. If the view is built correctly, it should exclude the 1st training node but still show the remaining four training nodes.

The first step to do this is to add a filter to your view. For the field, you will want to select the Training Date field (or whatever field you are using for the date). For the settings, keep everything default except for the granularity. I change this to be minute. The reason for this is because I want the training to be available to be seen right up until they start. If I kept the granularity at a day, it would be hidden at the beginning of the day.

The filter criteria are where we do the filtering for the view. In other words, this is what tells the view what to show or not show. Again, the goal is to hide old trainings once their date and time have passed. To do this, select the Is greater than operator and change the drop down to Enter a relative date. Within the relative date text field, enter "now". Why now? "now" is a PHP relative date format that specifies the current date and time. So "now" is an ever-changing value, which is exactly what we want. There are tons of other PHP date strings that you can use if "now" isn't exactly what you are looking for. To learn more, see the PHP documentation on relative date formats.
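The relative date strings the filter accepts are, to my understanding, handed to PHP's strtotime(), so you can experiment with them in plain PHP:

```php
<?php
// "now" resolves to the current date and time each time it is evaluated.
$now = strtotime('now');

// Other relative formats work the same way in the filter's text field.
$tomorrow = strtotime('+1 day');
$nextWeek = strtotime('+1 week');
$monthAgo = strtotime('-1 month');

echo date('Y-m-d H:i', $now);
```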

That is it! Once you save your filter go ahead and take a look at the view. Again, if done correctly, we shouldn't see the 1st training node as that contains a date field less than "now".

Hopefully, you were able to follow along and get your view and content displaying correctly. Remember, you can use other PHP date formats to show/hide content based on your unique situation. There is an almost unlimited number of possibilities.

This was a very simple way to filter content. Do you use a different way? If so, share it in the comments below.

Until next time, Happy Coding!

May 20 2015
May 20

At Drupal Camp London 2015, I spoke with Piyush Poddar, Director of Drupal Practice at Axelerant. We talked about Piyush's history in Drupal, Drupal as a business-ready solution, India's coming of age in open source culture, and how that is driving business value.


"Drupal really has become a business-ready solution. It allows those of us running businesses or selling solutions to clients to really not think about anything more, just go ahead and use it for production sites–large, huge production sites."

On working in proprietary vs. open source software: "I've done both. We used to buy expensive books when we worked on [a proprietary technology], just to know how to do things the right way. Then we realized that it doesn't tell you how to do things the right way, just how to use the software. So best practices, how other folks are doing it, lessons learned, there was nothing there. Nothing to be shared. No platforms. No camps, no events. Now [using Drupal], it's out there. It just depends on you what you wanna grab."

On the Drupal community and the Drupal project: "Solid. Rock solid. It's an awesome project. Without community, I don't know if it would have been so useful. It's an awesome community, but without a project, what would we have been doing? Together, we are doing a wonderful job."

From consumption to contribution in India

Piyush says that it was pretty quiet on the Drupal front until Dries Buytaert, the Drupal Project Lead, visited India in 2011. "A lot of excitement happened. A lot of traction came into the ecosystem. There were a lot of camps in the country."

"The way Drupal started, it was seen as a job-based technology. We were consuming Drupal then. Most folks were looking at Drupal as a job, a 9 to 5 job ... Go to the office, work on a Drupal project, come back, forget about it. But now, companies and individuals have realized that it's not just using Drupal, not just consuming Drupal, but investing in the Drupal ecosystem locally, nationally (and perhaps internationally as well) is where the real value lies. And that's where you're getting good Karma and besides that, it's also about establishing your reputation, your stand, your maturity up in the marketplace. So organizations understood that over the years and a lot of them started that.

I would say the journey is still on. We still need to get to a stage where we can say we are all there. But a lot of companies are participating in domestic [Drupal] Camps and meet-ups in different cities in India, with 300-700 people at each of these meet-ups. I've seen a lot of companies starting to push their developers towards contribution, at times even during their day job. There are companies offering jobs with Drupal contribution as KPIs; Axelerant is one of them. We encourage a lot of contribution in-house," during both busy and less busy times, "And from a profit and loss perspective, we are absolutely fine about that." Axelerant, in fact, employs the top Indian contributor to Drupal 8, Hussain Abbas.

Contribution generates business value

I asked Piyush what Axelerant gets back out of encouraging and paying for so much contribution to Drupal.

"Being an IT company, acquiring good talent and retaining them is probably more important than sales itself. There's a lot of business, a lot of leads, a lot of opportunity out there. You can just go grab them, but you have to deliver them and you have to do that constantly. And for that you need a team, you need people who are excited about this thing, who know their stuff, people who are experts. Hiring is a big problem. The two most important things we've done to solve this problem are getting onto Drupal: people love working in Drupal and people love working with companies that are so active in the community. They feel they'll learn more and how to become like Hussain and other [role models]."

"Lately, a lot of clients are asking how active we are in terms of contribution. How mature are we from that perspective. Our Marketplace profile has really helped us a lot for that. We track our references and ask our clients how they came to us. A lot of them tell us that our profile is very strong, enough for them to build the first level of trust and give us the first few projects. I cannot imagine this sort of advantage in a proprietary technology."

"My personal conclusion on that front is that in India we believe a lot in Karma. And what we are seeing is that the karma we do is coming back to us already in business, in happiness, in these awesome people that we work with. It's happening."

Guest author dossier

Drupal Camp London 2015 CxO Day presentation

From Consumption to Contribution - Lessons from India

YouTube video

Interview video

[embedded content]

May 20 2015
May 20

Did you have a great time at DrupalCon Los Angeles but want something to show for it?  

We are happy to issue a certificate of attendance in PDF format for anyone who picked up their conference badge or signed in at a training.

Simply submit your request via our contact page with the subject "Request a Certificate of Attendance", and be sure to include the associated order number.

May 20 2015
May 20

How would you like to present at one of the largest PHP conferences in Europe? DrupalCon Barcelona is coming, and we are actively looking for sessions for our new PHP track.

Unlike the Coding and Development track, the PHP track is all about the larger PHP community. We're not looking for Drupal-specific talks but for sessions about PHP itself (PHP 7 anyone?), about related PHP tools like Guzzle, general PHP leading practices, software architecture, and so on.

Want to speak to the community that has reinvented itself in recent years as a modern PHP powerhouse? We want to hear about all the power of PHP that we haven't adopted yet, and how we can get even better at our favorite server-side language.

Have a look at our suggested topics, and send us your sessions!


Larry Garfield

PHP Track Team | DrupalCon Barcelona

May 20 2015
May 20

Yesterday (May 19), the Louisiana Legislature’s House Civil Law and Procedure Committee voted 10-2 to return HB707 to the calendar, effectively voting it down, at least for the current session. The bill would allow businesses to refuse, in accordance with religious beliefs, to provide goods and services on the basis of a patron’s sexuality.

Described as the protection of “the free exercise of religious beliefs and moral convictions”, were the bill to pass it would preclude the state from taking “any adverse action against a person, wholly or partially, on the basis that such person acts in accordance with a religious belief or moral conviction about the institution of marriage.”

However, hours after the committee’s vote, Louisiana Governor Bobby Jindal issued an executive order in an attempt to accomplish much of what HB707 is intended to achieve. We’re aware that at least some of the bill’s opponents doubt that the executive order can create substantive law. We’re also aware that the U.S. Supreme Court may issue a ruling (before its current term ends in late June) that preempts any contradictory Louisiana law.

Why We’re Talking About Louisiana

Earlier this year, we chose New Orleans as the site for DrupalCon North America 2016. Section 86-33 of New Orleans’ municipal code explicitly forbids discrimination by public businesses and stores. In much the same spirit as New Orleans’ code, we want to take this opportunity to unequivocally state that no one at any DrupalCon should be denied service, assistance, or support because of who they are or whom they love.

Community. Collaboration. Openness. This is our ethos. At our core, we’re as committed to these values being principles for how we treat each other as we are for how we do our work.

The very nature of open source means contributions can come from anyone. That means muting voices is inconsistent with our values. That means we believe inclusivity is progress. And that means it’s important we speak when our community asks questions about the risk of discrimination.

Along with logistics—such as available event space, and costs—our DrupalCon site selection process has always considered whether we’d be able to truly celebrate the diversity of the Drupal community and the spirit of the Drupal Code of Conduct. We believe, despite the bill and executive order, that we can still create a safe, diverse, celebratory space for our community in New Orleans next year. We’re happy to bring the diversity of DrupalCon to New Orleans, and we’re confident it’ll be a fantastic event.

Talk To Us

We want to hear about your experiences at DrupalCon New Orleans—any and all of them. Tell us your opinions, voice your perspectives, and share what you see. In the meantime, comment on this post, or email us, with your questions and insights.

May 20 2015
May 20

Promet Source Support Developer Kabenla Armah shares his game plan for DrupalCon LA in 2015. He came, he saw, he deftly avoided dairy products.

Normally I’m sitting at home in Iowa, rocking the boxer shorts on the couch. The Xbox controller is in arm’s reach. Or maybe I’m tearing up a single track course on my mountain bike. But this week, none of that. This week is different.

It’s DrupalCon, so you know I have to put away the games and pack up the power brick so I can join in the fun. I ran through this quick Q&A about my plan for DrupalCon with Promet's marketing team to help understand my goals for this year's big show.

Q: What's one new area of Drupal you want to explore?

A: I’ve done lots of PHP development, but I’ve never done any theming.

Q: What are your goals for this year’s Con? 

A: I'd like to pick up one new technology and then go experiment and be conversant in it. I think the winner is going to be using Twig to theme in Drupal 8. 

Q: Who do you want to see in person? Why him or her?

A: MortenDK. Because he’s been a main handler of theming in Drupal 8. And his personality is great. 

Q: Which game station has the hottest action? Ping pong? Air hockey? 

A: It's definitely ping pong. 

Q: How many members of your team made the journey with you? 

A: I'm actually the sole rep of my current project team to make the journey. No pressure, right? 

Q: Any foods you'd like to sample while on the west coast?

A: As long as it doesn't have any cheese in it, not even a single molecule, then I'm good. I'll try anything.

Want to share your highlights from DrupalCon LA? Fill out the form below and let us know about your experience!

May 20 2015
May 20

Mapping in Drupal with OpenLayers and Views is hella awesome.  Add in the heatmaps and you've got some pretty sweet visualizations.  Using the content type with our geolocation field from our first exercise, we first create a view - since we're using points, we'll need a vector data overlay to start with.  For those looking to test on their own site, I've added an updated feature with the mapping in it.  The settings for the overlay are also pretty straightforward.  I use EPSG 4326 for most projections - especially since I'm using Google's geocoder, which spits it out by default (if we had to, we could use ogr2ogr to transform the data, but for this project EPSG 4326 is just fine).  I also use WKT (well known text) for most of my mapping... again, this is handy with Google, and generally a pretty usable format.  Once your view is built, hit save - it's time to head over to the OpenLayers side for a bit....

In the OpenLayers interface (/admin/structure/openlayers/maps/list/) add a new map - you'll need to center the map where you expect your data to be (in my case, the US) and then set your projection - EPSG 4326 (aka WGS84) is what I used here again; consistency is what matters.  Then scroll down a bit and you'll see the layer that you created from your view - select it.  After you're done here, go on to the Behaviors.  There are a huge number of heatmap and tooltip settings in here, but basically the defaults are going to be your friend.  Hit save now and go back to the view you just created... it's time to render this sucker.  Create a page display in your existing view - you could create a new view, but why bother... it's nice to keep it all in one place in my experience.  That said, don't forget to use the "override for this display" settings so that when you select OpenLayers Map for your format you won't override any other displays.  Under settings, select your newly minted map and you're done!

We're almost done with the report - next we're going to look at how to preserve the data by passing it along to an international service, aka Open Spending... again, thanks to all who have helped thus far from the Charlottesville GOP: @Buddy Weber, Barbara Null, Richard Statman, John Pfaltz et al.  As noted in our discussions, what we find here is that Charlottesville is not really a big spender when it travels, and that in a given month we don't travel all that much.

May 20 2015
May 20

While developing a system to automate Drupal updates and using that technology to fulfill our Drupal support contracts, we ran into many issues and questions about the workflows that integrate the update process into our overall development and deployment cycles. In this blog post, I’ll outline the best practices for handling different update types with different deployment processes – as well as the results thereof.

The general deployment workflow

Most professional Drupal developers work in a dev-stage-live environment. Using feature branches has become a valuable best practice for deploying new features and hotfixes separately from the other features developed in the dev branch. Feature branches foster continuous delivery, although they do require additional infrastructure to test each feature branch in a separate instance. Let me sum up the development activity on the different branches.

Dev branch

This is where the development of new features happens and where the development team commits their code (or works in a derived feature branch). When using feature branches, the dev branch is considered stable and features can be deployed onward separately. Nevertheless, the dev branch is there to test the integration of your locally developed changes with the code contributions of other developers, even if the current code of the dev branch hasn’t passed quality assurance yet. Before going live, the dev branch is merged into the stage branch to be ready for quality assurance.

Stage branch

The stage branch is where code that’s about to be released (merged into the master branch and deployed to the live site) is thoroughly tested; it’s where quality assurance happens. Once the stage branch is bug-free, it is merged into the master branch, which is the code base for the live site. The stage branch is also where customer acceptance happens.

Master branch

The master branch contains the code base that serves the live site. No active changes happen here except hotfixes.

Hotfix branches

Hotfixes are changes applied to different environments without passing through the whole dev-stage-live development cycle. Hotfix branches are handled in the same way as feature branches, but with one difference: whereas feature branches start from the HEAD of the dev branch, a hotfix branch starts from the branch of the environment that requires the hotfix. In terms of security, a highly critical security update simply comes too late if it needs to go through the complete development cycle from dev to live. The same applies if there’s a bug on the live server that needs to be fixed immediately. Hotfix branches need to be merged back into the branch from which they were derived and into all preceding branches (e.g. if the hotfix branch was created from the master branch, it is merged back into master to bring the commits to the live site, and then into stage and dev as well, so that all code changes are available to the development team).
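In git terms, the merge-back flow described above can be sketched like this. The scratch repository setup and the file name are only there to make the example self-contained; in a real project the dev, stage, and master branches already exist:

```shell
set -e
# Scratch repository to demonstrate the flow; in a real project the
# dev, stage and master branches already exist.
tmp=$(mktemp -d) && cd "$tmp"
git init -q
git config user.email "dev@example.com"
git config user.name "Dev"
git commit -q --allow-empty -m "initial"
git branch -M master
git branch dev
git branch stage

# Cut the hotfix branch from the environment that needs the fix (master).
git checkout -q -b hotfix/security-fix master
echo "patched" > fix.txt
git add fix.txt
git commit -q -m "Apply critical security update"

# Merge back to master first, so the fix can reach the live site...
git checkout -q master
git merge -q --no-ff -m "Merge hotfix into master" hotfix/security-fix

# ...then propagate it to stage and dev so no branch misses the change.
git checkout -q stage
git merge -q -m "Merge hotfix into stage" hotfix/security-fix
git checkout -q dev
git merge -q -m "Merge hotfix into dev" hotfix/security-fix
echo "hotfix merged into master, stage and dev"
```

The same pattern applies if the hotfix was cut from stage instead; you simply merge back to stage first, then down to dev.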

Where to commit Drupal updates in the development workflow?

To answer this question, we need to consider the different types of updates: security updates (classified by how critical they are) and non-security updates (bug fixes and new features).

If we group them by priority, we can derive the branches to which they need to be committed and also the duration of a deployment cycle. If you work in a continuous delivery environment, where you ship code continuously, the best way is to use feature branches derived from the dev branch.

Low (<=1 month):
- Bug fix updates
- Feature updates

These updates should be committed by the development team and analysed for side effects. It’s still important to process these low-priority updates regularly, as high-priority updates build on all code changes from earlier updates; if a module hasn’t been updated for a long time, a high-priority update to it pulls in all the skipped changes at once, with little opportunity for quality assurance.

Medium (<5 days):
- Security updates that are neither critical nor highly critical

These updates should be applied in due time, as they relate to the site's security. Since they’re not highly critical, we might decide to commit them to the stage branch and send a notification to the project lead, the quality assurance team, or directly to the customer (depending on your SLA). Then, as soon as they’ve confirmed that the site works correctly, the updates are merged into the master branch and back to stage and dev.

High (<4 hours):
- Critical and highly critical security updates

For critical and highly critical security updates we follow a "security first" strategy, ensuring that all critical security updates are applied immediately and as quickly as possible to keep the site secure. If there are bugs, we’ll fix them later! This strategy instructs us to apply updates directly to the master branch. Once the live site has been updated with the code from the master branch, we merge the updates back to the stage and dev branch. This is how we protected all our sites from Drupalgeddon in less than two hours!

Requirements for automation

If you want to automate your Drupal security updates with the Drop Guard service, all you need is the following:

  • Code deployment with Git
  • A way to trigger the update of an instance by URL, using e.g. Jenkins CI, DeployHQ, or another service that manages your deployment; alternatively, Drop Guard can execute SSH commands from its own server.

Also to keep in mind:

  • Know what patches you’ve applied and don't forget to re-apply them during the update process (Drop Guard helps with its automated patch detection feature)
  • Automated tests reduce the time you spend on quality assurance

Conclusion

Where to commit an update depends on its priority and on the speed with which it needs to be deployed to the live site. Update continuously to ensure the ongoing quality and security of your project and to keep it future-proof. Feature and bug fix updates are less critical, but it's important to apply them in due time as well.

For those of you interested in Drop Guard to automate the process described in this blog post, please sign up for the free trial so you can test all its features and get a personal onboarding.

May 20 2015
May 20

There are days when I work on half a dozen different websites.  I'm sure some of you are in the same boat.  We make client edits and change requests with rapid efficiency.  We work locally, push to staging, test and review, then push to the live server and repeat.  I'd be lying if I said I never accidentally made a change on the live or staging site.

The Drupal Environment Indicator module allows you to name, color, and configure a multitude of visual cues for each of your different servers, or for other variables like Git branch or path.  It is very easy to install, and it can integrate with Toolbar, Admin Menu, and Mobile Friendly Navigation Toolbar so it takes up no additional screen space.

Once it's installed, grant the indicator-viewing permission to the roles that should see it.  You can adjust the general settings at /admin/config/development/environment-indicator/settings

Environment Indicator Settings

While you can create different indicators inside the admin UI, I prefer to set these in the settings.php file on each server so they are not overridden when we move databases from Production back to Staging and Dev.

To do this, add the following lines into the settings.php files on each of your servers. Adjust the names and colors as you see fit:

// Local/Development Server
$conf['environment_indicator_overwrite'] = TRUE;
$conf['environment_indicator_overwritten_name'] = 'Local';
$conf['environment_indicator_overwritten_color'] = '#bb0000';

// Staging Server
$conf['environment_indicator_overwrite'] = TRUE;
$conf['environment_indicator_overwritten_name'] = 'Stage';
$conf['environment_indicator_overwritten_color'] = '#00bb00';

// Production Server
$conf['environment_indicator_overwrite'] = TRUE;
$conf['environment_indicator_overwritten_name'] = 'Production';
$conf['environment_indicator_overwritten_color'] = '#0000bb';

Environment Indicator on Admin Menu

In this day and age, many of our clients have their own staging servers to play around with layout, copy, and style.  Environment Indicator gives us, and them, a sense of security that we are in the right place at the right time, doing what we need to do.

May 20 2015
May 20
135 - Writing the Book "Drupal 8 Configuration Management" with Anja Schirwinski and Stefan Borchert - Modules Unraveled Podcast


Writing a Book for D8

  • What’s it like writing a book for a piece of software that isn’t even officially released yet?
  • How long did the writing process take?
    • Packt Publishing sent us a proposal to write this book in December of 2013. We got started quickly, sending them an outline of the chapters and an estimated page count in the same month. The original estimate was 150 pages; it turned out to be around 120. We were given a pretty strict timeline, having to finish a chapter every two weeks, starting in December of 2013.
    • We managed to finish most chapters in two weeks, but some of the longer ones took a little longer, since in January we also started one of our biggest projects to date. That was pretty tough because that project took up way more than a regular full-time job, so we ended up writing all of the chapters late at night and on weekends. In May, all of our chapters went to the editors, and we didn’t hear back from the publisher for a really long time.
    • We also told them that we would have to rewrite a lot of the chapters, since there was so much work in progress in the Configuration Management Initiative and a lot was changing about how it worked, like going from the file-based default to the database default. I think it was in January of 2015 when the chapters came back with some feedback and we started rewriting every chapter, which was pretty painful at the time. We were able to update some of the documentation with the changes we found. It felt good to contribute at least a small part, since between our project and the book we had no time left to contribute code to Drupal 8 like we usually do.
    • We spent around 40 days on the book between the two of us.
    • In December, Packt brought in the first reviewer for the book. We had recommended one of our team members at undpaul, Tom, who has a similar amount of Drupal knowledge as Stefan. We really wanted someone from CMI to review the book, like Greg Dunlap. He had turned down reviewing the book after the first chapters were written, because too much would still change. Then, after the changes went in, we kept recommending Greg, but we never heard anything back; maybe he was busy or they didn’t bother to ask. At the beginning of this year they told us the book was planned to be published by March. We recommended waiting, because we didn’t expect a release candidate before the European DrupalCon and we would rather have had someone like Greg take the time to review, but Packt had another opinion :) Since most of the CMI changes were finished, we didn’t feel too uncomfortable about the timing, and it was also kind of nice to finally be done with this thing :) So it took a little over a year from start to finish. It was published on March 24th.
  • Do you expect to need to rewrite anything between now and when 8.0 is released?

The Book: Drupal 8 Configuration Management

  • What do you cover in the book?
    • We start off with a basic introduction to what Configuration Management in Drupal means, because it's a term in software development in general that doesn’t usually refer to what it means in Drupal, where it basically just means that configuration is saved in files, which makes deployment easier. In the first chapters, we make sure the reader understands what Configuration Management means and why version control is so important. We mention some best practices and then show how to use it for non-coders as well, since there’s a nice backend non-technical folks can use, even if you don’t use version control (which of course we don’t recommend). We also have a part that describes how managing configuration works in Drupal 7 (Features!) and then dive into code examples, explaining schema files, showing how to add configuration to a custom module, how to upgrade Drupal 7 variables to the new system, and covering configuration management for multilingual sites.
  • Who is the target audience of the book?
  • Why did you decide to write about Configuration Management?
    • We have used Features to deploy configuration changes for a very long time, I don’t recall not using it since we started the company 5 years ago. We have talked about it at several DrupalCamps and Drupal User Groups and always tried to convince everyone to use it. We were really excited about the Configuration Management Initiative and thought it was a very good fit for us.
  • Before we started recording, you mentioned that there is a companion website to the book. Can you talk about what content we’ll find there, and what purpose that serves?
  • Are you building any sites in D8 at Undpaul?
May 19 2015
May 19

As we dive deeper into visual regression testing in our development workflow we realize a sad truth: on average, we break our own CSS every week and a half.

Don't feel bad for us; in fact, I'd argue that it's pretty common across all web projects - most teams just don't know it. It seems we all need a system that will tell us when we break our CSS.

While we don't know of a single (good) system that does this, we were able to connect a few (good) systems to get just that, with the help of Travis CI, webdriverCSS, Shoov, BrowserStack/Sauce Labs, and ngrok. Oh my!

Don't be alarmed by the long list. Each one of these does one thing very well, and combining them together was proven to be not too complicated, nor too costly.

You can jump right into the .travis file of the Gizra repo to see its configuration, or check the webdriverCSS test. Here's a high-level overview of what we're doing: is built on Jekyll, but visual regression testing can be executed on any site, regardless of the underlying technology. Travis is there to help us build a local installation. Travis also allows adding encrypted keys, so even though the repo is public, we were able to add our Shoov and ngrok access tokens in a secure way.

We want to use services such as BrowserStack or Sauce Labs to test our local installation on different browsers (e.g. the latest Chrome and IE11). For that we need an external URL accessible to the outside world, which is where ngrok comes in: ngrok http -log=stdout -subdomain=$TRAVIS_COMMIT 9000 from the .travis.yml file exposes our Jekyll site inside the Travis box at a unique temporary URL based on the Git commit (e.g.
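The unique URL is simply derived from the commit hash. Assuming ngrok's classic custom-subdomain scheme (<subdomain>, the derivation can be sketched as:

```shell
# TRAVIS_COMMIT is set by Travis on real builds; fake one for this demo.
TRAVIS_COMMIT="${TRAVIS_COMMIT:-a1b2c3d4}"

# ngrok's -subdomain flag maps the commit hash to <hash>
TEST_URL=""
echo "Running visual regression tests against: $TEST_URL"
```

Because the subdomain changes with every commit, parallel builds never collide on the same tunnel name.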

WebdriverCSS tests are responsible for capturing the screenshots and comparing them against the baseline images. If a regression is found, it is automatically pushed to Shoov, and a link to the regression is provided in the Travis log. This means that if a test breaks, we can immediately see where the regression is and figure out whether it is indeed a bug - or, if not, replace the baseline image with the "regression" image.

Visual regression found and uploaded to


Some gotchas to be aware of:

Even though visual regression testing with BrowserStack or Sauce Labs takes more time than running it on PhantomJS, it's recommended to use such tools, since they test your site against real browsers.

Those tools cost money, but we find it's well worth it. We are currently using BrowserStack ($99/month), though we're running into some issues with it not having an internal queue system - so if you reach your limit on concurrent virtual machines, your tests simply fail. For that reason we might switch to Sauce Labs ($149/month), which also provides more concurrent VMs.

Blog post page tested on IE11, Windows 7

Travis is limited to 50 minutes of execution time. Capturing each image can take about 30-90 seconds, so once you have lots of tests, you should probably split them across multiple builds.

The free plan of ngrok allows only a single concurrent tunnel to be opened. Even though BrowserStack and Sauce Labs provide their own tunneling solutions, we decided to go with ngrok in order to provide a more generic solution. We happily upgraded to the $25/month business account following our excellent experience with the free account.

May 19 2015
May 19

The fourth month of the year brought reminders that Winter can show up at unexpected times, with snow flurries during the early parts of the month. It also reminded us that we can only juggle so much. With many of us involved in organizing regional events and preparing for DrupalCon, our code contributions waned for a second month, down to a rather low 20 hours.

Drupal 8

Once again we've been fortunate to be able to put a good amount of effort into porting modules to Drupal 8, primarily for a client that will heavily benefit from some of D8's newest features.

  • Michelle Cox continued work on Metatag and Webform, the former of which will shortly be available as an alpha release.
  • Paul McKibben ported the er_viewmode module and wrote er_view_formatter, both of which will be made available shortly; Paul also joined as a co-maintainer of the XMLSitemap module and committed a number of RTBC patches, pushing the module closer to a stable state for D8.
  • After much help from Michelle I've been able to eke out some tentative entity handling for the D8 port of Metatag and will be making an alpha release soon. There's a lot to do before it reaches 1.0, but it's a step in the right direction.

Other work

Additionally, work has been progressing on some personal projects.


There were several successful events during April:

  • The month kicked off with the sixth annual Florida DrupalCamp. Once again the weekend was a huge success, the organizers showing again that they put on one of the country's best camps. Mediacurrent was a gold sponsor this year and a few staff members shared their knowledge & experience in presentations.
  • Matt Goodwin and I hosted the first of our “NHDevDays” code sprints in Keene, NH. Although we had a small turnout, it went well and the day ended with several new people successfully climbing the Drupal Ladder.

May-be More

With DrupalCon last week, many of us spent much of May finalizing our presentations and attendance plans. At Mediacurrent we had a very busy week, and we were glad so many of our readers and followers joined us at our booth, our evening party on Tuesday, and our many sessions. Now that we’re back home I expect our code contributions to pick up speed again.

Additional Resources

Introducing Mediacurrent's Contrib Committee | Mediacurrent Blog Post
Contrib Committee Status Report, March 2015 | Mediacurrent Blog Post

May 19 2015
May 19

Here’s a tangent:

Let’s say you need to randomly generate a series of practice exam questions. You have a bunch of homework assignments, lab questions and midterms, all of which are numbered in a standard way so that you can sample from them.

Here’s a simple R script to run those samples and generate a practice exam that consists of references to the assignments and their original numbers.

## exam prep script

## build hw data: 12 homework sets with 17 problems each
hw <- expand.grid(problm_set = paste0("hw", 1:12),
                  problm = 1:17,
                  stringsAsFactors = FALSE)

## build exam data: 8 exams with 22 problems each
exam <- expand.grid(problm_set = paste0("exam", 1:8),
                    problm = 1:22,
                    stringsAsFactors = FALSE)

## create practice exam: sample 22 problems from the combined pool
prctce <- rbind(exam, hw)

prctce_test <- prctce[sample(1:nrow(prctce), size = 22), ]

row.names(prctce_test) <- 1:nrow(prctce_test)

## a bare top-level expression, so Rscript prints the result
prctce_test


As the last line indicates, the final step of the script is to output a prctce_test … that will be randomly generated each time the script is run, but may include duplicates over time.

output from r script

Sure. Fine. Whatever.

Probably a way to do this with Drupal … or with Excel … or with a pencil and paper … why use R?

Two reasons: 1) using R to learn R, and 2) scripting this simulation lets you automate things a little more easily.

In particular, you can use something like Bash to execute the script n number of times.

for n in {1..10}; do Rscript examprep.R > "YOUR_PATH_HERE/practice${n}.txt"; done

That will give you 10 practice test txt files, all named with a tokenized number, from just one command. And of course that could be written into a shell script that's automated or run on a scheduler.
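For instance, assuming the script lives in ~/examprep (the path and schedule are illustrative), a crontab entry could regenerate the files every Monday at 6am:

```
0 6 * * 1 cd $HOME/examprep && for n in $(seq 1 10); do Rscript examprep.R > "practice${n}.txt"; done
```

Since the sampling is random, each scheduled run produces a fresh set of practice tests.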

automatically generated practice tests with bash and r script

Sure. Fine. Whatever.

OK. While this is indeed a fairly underwhelming example, the potential here is kind of interesting. Our next step is to investigate using Drupal Rules to initiate a Bash script that in turn executes an algorithm written in R. The plan is to also use Drupal as the UI for entering the data to be processed by the R script.

Will document that here if/when that project comes together.

May 19 2015
May 19

DrupalCon Los Angeles took place from May 11 to 15.

Three of our team joined over 3100 people who crowded into downtown L.A.

We'll recap our favorite sessions in another blog post, but here are some thoughts on DrupalCon L.A. and where DrupalCons are headed.

Thoughts on Los Angeles as a venue

The L.A. Convention Center was a great choice. Some recent DrupalCons have been inside huge venues where 3000 people can seem like a small group.

L.A. had the perfect size of venue for 3,000 people. The auditoriums, session halls, birds-of-a-feather and exhibitor rooms were right next to each other, so getting around was easy.

In fact, the whole location was great. The convention center is right next to the Staples Center, where the LA Lakers and LA Clippers play. There was NBA playoff basketball going on for two of the nights we were there, which added to the excitement around the venue. The local area was full of hotels, restaurants and things to do.

Because it was L.A., there were parties in wonderfully ornate and interesting hotels. Pantheon hosted an enormous party that took up a whole block in the heart of L.A.:


(thanks to Kim Pepper for the photo)

Thoughts on DrupalCon as an event

If you've only been to PHP, WordPress, Joomla or other open source events, you really should make the effort to attend a DrupalCon. Everything is larger and more professional than you're accustomed to. The Drupal Association really knows how to do a conference well. The whole event flowed smoothly. Even the wi-fi ran without a hitch all week.

However, there was a definite drop in the size of the event compared to Austin. Here's an overview of attendance in recent years:

  • Los Angeles 2015: 3,186
  • Austin 2014: 3,500
  • Portland 2013: 3,000
  • Denver 2012: 3,127
  • Chicago 2011: 2,881
  • San Francisco 2010: 3,000

So attendance was about average for the last 6 years, but a drop of around 10% compared to Austin. There was also a decline in the number of sponsor booths. Why was there a drop?

  • Perhaps it was cost. DrupalCon tickets have risen from $200 in 2010 to over $500 this year.
  • Perhaps L.A. was not exciting enough as a destination. Certainly the choice of New Orleans next year should fix that.
  • Perhaps the decision to offer many different sponsorship levels this year meant that not everyone needed a booth.

DrupalCon does seem to have hit a growth plateau since 2010, but nevertheless, 3,000 attendees is enough for a great event:


(thanks to Amazee Labs for the photo)

Thoughts on the professionalism of DrupalCon

We had one member of our team in L.A. who was new to DrupalCon, and in fact to Drupal as a whole. He was really impressed by the professionalism of the event, and particularly by the Acquia employees and Acquia Partners. The fact that these companies are competing successfully and building enterprise-level sites for Fortune 100 companies was something he hadn't expected from an open source CMS.

The biggest sponsor splash at the event was made by the new FFW, which formed after the merger of Blink Reaction and Propeople. With over 400 employees, they claim to be the biggest Drupal agency in the world. They had an enormous booth, and FFW people were everywhere at the event.


(Thanks to FFW for the photo)

Overall thoughts

DrupalCon seemed like an interregnum conference. Drupal 7 is still the king, but Drupal 8 is tantalizingly close yet not ready for use. DrupalCon itself is still a high-class event, but it's not growing in North America.

It will be fascinating to see what happens next year in New Orleans. Will Drupal 8 kickstart another era of expansion or was Austin 2014 the high watermark for DrupalCon North America?

Maybe DrupalCon's expansion will be overseas. The conference closed by handing over to DrupalCon Barcelona (the European events continue to grow) and to DrupalCon Asia in Mumbai where growth has only just started.

May 19 2015
May 19

Drupal 8 offers unprecedented support for creating RESTful APIs. This was one of the major goals of the Web Services and Context Core Initiative (WSCCI), and one we've delivered on pretty well. However, as with most things that are worth doing, just because Drupal core “supports” it doesn't mean you'll see good results without an understanding of what's going on. In this article, we'll explore some of these principles, so that when it comes time to design with those systems, you'll know how to think about the problem.


Drupal 8 ships with support for encoding and representing its entities (and other objects) via the Hypermedia Application Language (HAL) specification. HAL can currently be expressed in JSON or XML, and is a specification for describing resources. As the specification says, HAL is “a bit like HTML for machines.”

What that means is that a HAL API can provide enough data for a machine agent to visit the root ("/") of a website, then navigate its way through the remainder of the system exclusively by following the links provided in responses. Humans do the exact same thing by visiting a page and clicking on links. The notion that machines might also want to do this is a relatively obvious idea, but one that has, until recently, rarely been followed on the web.

Still, though, it's pretty abstract. To really understand why HAL is powerful – and what it does for us in Drupal – it's necessary to go back to the basic constraints and capabilities of the problem space it operates in: HTTP and REST. The crucial documents there are RFC2616 and Roy Fielding's thesis, both well-worth [re-]reading. But a more easily digestible version comes in the form of the Richardson Maturity Model, first laid out by Leonard Richardson in 2008, and since revisited by Martin Fowler and Steve Klabnik.


The Richardson Maturity Model helpfully suggests a set of four “maturity” levels into which HTTP APIs fall:

  • Level 0, The Swamp of POX Your service is nothing more than a few RPC passthrough endpoints; the use of HTTP is incidental.
  • Level 1, Resources The passthrough endpoints are broken down into unique URIs that identify resources.
  • Level 2, HTTP Verbs The interaction with resources is further refined through the correct use of the full set of HTTP’s verbs, rather than simply GET and POST.
  • Level 3, Hypermedia Controls Media types capture a resource’s representational form(s), and hyperlinks are included for navigation.

These ideas have already been generally explored quite well in the previously-linked articles, and I don't want to just duplicate that work. Instead, I'll aim for a brief, illustrative treatment, so that we can spend more time focusing on the implications for Drupal.

Level 0 – The Swamp of POX

At this level, HTTP is being used essentially as a tunnel for an RPC service. Many calls are made to individual endpoints, with purely application-defined restrictions on what sort of work that endpoint can perform. POX refers to “Plain Old XML,” reflecting that there are essentially no boundaries on what sort of data is either sent into, or returned from, such a system.

In short, it's the Wild West.

Level 1 – Resources

The most glaring issue with our generic service endpoint is that it lumps together a bunch of different, potentially unrelated things under a single endpoint. In non-HTTP APIs, this may be less of an issue, but HTTP and the web are grounded in the idea of Uniform Resource Identifiers (URIs), which identify individual resources. Thus, the requirement for the first level is representing our system as a set of resources, and assigning a unique URI to each of them.

Of course, that begs the question, what is a “resource”?

Well, there's some wiggle room. As we'll see later, Drupal 8’s HAL output is primarily built around representing individual entities as resources, and that correlation generally works well. But what about lists of entities – what about a View?

That’s where things get interesting. I just described resources as “individual things,” which would seem to suggest that a View can't be a resource. Not so. Lists are still resources, but their relevant properties (not including the list items themselves) are things like, say, sort order. If sort order seems trivial or incidental, consider its role for a site like Reddit, where sort order is how the social apparatus is expressed.

All of this is pretty old hat; most systems – Drupal included – have been using URIs for a long time. Level 2 may be slightly less familiar.

Level 2 – HTTP Verbs

Having URIs for all our resources is great, but there's an immediate problem: how do I interact with those resources in different ways, for example to perform Create/Read/Update/Delete (CRUD) operations? This is the domain of HTTP methods, often called verbs.

Much as resources were a refinement of POX, verbs refine resources. Instead of having to create separate URIs for a resource in order to perform different actions on it, we access it using the same URL, but with a different HTTP verb to describe our intent. To get a representation of the resource, we use GET; to delete it, we use DELETE, and to update or create it, we use PATCH, PUT, or POST. (The differences between those last three are important, but outside the scope of this article.)
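The idea can be sketched with raw HTTP request lines against a hypothetical resource (the path is illustrative only): the URI stays constant while the verb carries the intent.

```
GET /articles/42 HTTP/1.1       (read a representation of the resource)
PATCH /articles/42 HTTP/1.1     (partially update it; the body carries the changes)
DELETE /articles/42 HTTP/1.1    (remove it)
```

One resource, one URI, several intents.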

Resources keep a single URI; the verb is an effective way to encode more information onto a single request, allowing us to better honor the one-resource-one-URI guideline. There is ongoing debate as to the efficacy of using all the verbs “correctly,” but it's still worth understanding the intent.

Level 3 – Hypermedia Controls

The fourth level encompasses two ideas: different representations of resources via media types, and linking between resources. The web generally adheres decently enough to the first two levels, but it's pretty awful on this one.

Media types are another refinement on the notion of a resource. They allow the client to request – and/or the server to respond with – varying representations of a resource. A simple example is HTML vs. JSON. The client can request a particular content type using the Accept HTTP request header, and the server indicates the type of its response with the Content-Type HTTP response header. HTML is text/html and JSON is application/json.

But JSON is really just a raw data serialization format, not a document type like HTML. It can't be turned into something meaningful without additional context that specifies how to interpret it. Not all media types are created equal. An excellent example of how this issue crops up is the other major component of Level 3 – hyperlinks.

Links are easy to understand; we've been using them forever in HTML. They're a pointer from the current resource to another resource. Without them, you'd have to input URLs by hand into the address bar, copying them from a separate document. Ridiculous! And yet, that's exactly what we expect API implementers to do: interact with our REST APIs by copying in URIs from a specifications document.

But if the server instructs REST clients on how to construct URIs for the system by providing links to other resources in its responses, then, as with human users and HTML, clients can navigate their own way through. So long as all resources are linked together somehow, clients will be able to reach them all.

In fact, actually providing the links is a bit of a trick. The referent itself is easy – it's just a URI. The question is, how does the REST client know to interpret those URIs as links?

For example, here's text/html which knows how to handle links just fine:

  <title>It's Jamal's Birthday!</title>
    <p>Happy birthday, <a href="">Jamal!</a></p>

And here's a representation of roughly the same resource in application/json:

    "title": "It's Jamal's Birthday!",
    "body": "Happy birthday, Jamal!",
    "????": ""

What named key should the client look for to know that its values are hyperlinks?

Sure, you can make one up for your own purposes, but then it's application-specific. And besides, most of us are in the business of creating applications, not defining what a link is. Which brings us back to HAL.

HAL is a generic spec for hypermedia applications. It provides an answer for this link-property-naming question (and many others). If your HTTP API produces compliant HAL data and reports it with the media type application/hal+json (or application/hal+xml), then it's immediately navigable by any generic HAL client. In fact, hal-browser provides a web UI for browsing HAL trees. (You can experiment with one online.)
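To illustrate, here is roughly the same birthday resource as HAL might represent it. HAL reserves the `_links` property (conventionally including a `self` link), so a generic client knows exactly which key holds the hyperlinks. The URIs here are hypothetical:

```json
{
    "title": "It's Jamal's Birthday!",
    "body": "Happy birthday, Jamal!",
    "_links": {
        "self": { "href": "/posts/jamals-birthday" },
        "author": { "href": "/users/jamal" }
    }
}
```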

This also makes HAL APIs largely self-describing, as there's less need for API docs when clients can just walk the API – HAL even defines a standard way to link to a resource's documentation!


So, we've reached Level 3 – now what? HATEOAS, that's what. HATEOAS stands for Hypertext As The Engine Of Application State. Pronounce it like Cheerios on a bad day.

Essentially, HATEOAS is the functional result of achieving Level 3 maturity. If you have good, clean resources at unique URIs that fully represent the entire state of your system, and they can be manipulated through a sufficient suite of HTTP methods, and are variously representable through appropriate media types, then it's possible to drive (i.e., state transitions in) the application entirely through hypertext interactions. Which is to say, every state transition – roughly speaking, CRUD operation – can be driven through a series of steps that looks something like this:

  1. Start from either the root or a known URI.
  2. Walk through the links provided in responses to GET requests.
  3. Once you arrive at the entity you'd like to manipulate, follow a link – possibly a self-link – to the appropriate URI and send a request using the appropriate method (PUT, PATCH, DELETE, or POST) for your desired state change.

All of this should be familiar; it's how we use a browser.

Achieving HATEOAS for your entire application means you're pretty much 100% RESTful. And that’s a lovely goal...right?

Well, probably.

REST and the resource model is not necessarily appropriate for all types of applications. And, after all, one could design an API that's probably just as capable using a POXy, Level 0 approach. Certainly, plenty of other RPC endpoint-style protocols work. So, why care at all?

For me, it's an issue of design. REST was designed in a way that fits very nicely into the constraints of HTTP. It scales better into complexity – something that Drupal projects often desperately need. If your application is appropriate for REST, then not aspiring to HATEOAS for your HTTP API is sort of like sending Morse code over video chat by toggling your camera on and off: you're not using the medium as intended, and your utility will suffer for it.

HAL, HATEOAS, and Drupal 8

Drupal 8 ships with three modules that, together, seek to allow Drupal to act as a hypermedia API via HAL:

  1. serialization The serialization module is built atop Symfony's serialization component. It provides interfaces for (surprise!) serializing different types of data into strings, and deserializing them back into data.
  2. rest The REST module provides a framework for expressing Drupal's data as web resources.
  3. hal Building on the previous two, it facilitates HAL encoding of Drupal's data types to create application/hal+json output for HTTP responses.

These modules are not enabled by default, but if you turn them on, Drupal will start serving HAL for entities. By default, these are only available to authenticated users. You can either enable the basic authentication module to allow clients to authenticate as part of their request, or simply open up the permissions. Note that currently, there is only support for relaying content entities (as opposed to, say, config, menu, or user entities) via REST.
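Assuming Drush is available, getting started might look like the transcript below (the site URL, credentials, and node ID are placeholders; note that Drupal 8 selects the response format with the _format query parameter):

```
# Enable the REST stack, plus HTTP Basic authentication for clients.
drush en serialization rest hal basic_auth -y

# Fetch node 1 as HAL.
curl --user admin:admin 'http://example.com/node/1?_format=hal_json'
```

The response should carry Content-Type: application/hal+json and include a _links section alongside the entity's fields.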

Now the basic pieces are in place, but there's still work to be done. For example, Views can generate a HAL representation of their output, but it's not automatic in quite the way entities are; you have to configure a separate display. This starts to get into murky territory with respect to REST's ideal expectations, as a separate display likely entails a separate resource (though not necessarily). Media types are only supposed to be a way of providing a different representation of the same resource, not a maybe-the-same-but-maybe-not resource.

A similar but more intractable problem exists with entities: if you enable field-level access controls, it becomes impossible to guarantee the same resource at a given URI, as it is now user-specific. To remain compliant with the ideals of REST, your response would have to encode per-user information into the media type, as in:
Content-Type: application/hal+json; user=<username>.

This is a classic Drupal challenge: it doesn't prevent you from making standards-noncompliant and possibly poor decisions. That’s why an understanding of the underlying principles at work is so important; without it, you won't even know the tradeoffs you're facing, and it's easy to find yourself off the path, wandering the woods of pseudo-maintainability.

In a way, though, this is “by design.” Roy Fielding believes that:

A REST API should spend almost all of its descriptive effort in defining the media type(s) used for representing resources and driving application state, or in defining extended relation names and/or hypertext-enabled mark-up for existing standard media types.

While HAL relieves us of much of the basic drudgery around specifying a media type, it cannot solve the whole problem. Determining what links should exist between resources, and the types of those links, is a crucial design choice that remains in the hands of the implementer – as it should. That is the design half of the challenge. The other half will be learning what possibilities Drupal 8 allows that you should not use in order to remain as standards-friendly as possible.


May 19 2015
May 19


This tutorial is written for new Drupal developers, or PHP developers who want to learn Drupal.

We are going to cover the following in this tutorial:

  • Creating the file structure for the module.
  • Creating a simple table.
  • Registering a path to display the custom form.
  • Creating a custom form with 4 fields.
  • Capturing and saving the form values in the database.
  • Retrieving and re-populating the form with the user's input.

Let's get started.

Step 1: File Structure

Create the structure for our first module. We are going to call it simple_recipe.

Create the following folder and files:

  • sites/all/modules/custom/simple_recipe
  • sites/all/modules/custom/simple_recipe/
  • sites/all/modules/custom/simple_recipe/simple_recipe.install
  • sites/all/modules/custom/simple_recipe/simple_recipe.module

*You don't have to put the module in the custom directory.

Step 2: Storage

We are going to need a few columns to store our recipe:

    a) A unique auto increment ID to identify the recipe.

    b) An int column for user id, so we know who the author is.

    c) A column to store all the form values; this way, we get the flexibility to add/remove Form API fields.

    d) A status field for this record. 1 for active 0 for disabled.

    e) A Unix timestamp for the creation time.

Now that we have that covered, let's declare the schema in the install file.

*Please note that with the Drupal platform, there are endless ways to solve the same problem.

We could use a node to store the data; however, for demonstration purposes, we are going to store the whole form_values array instead.
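The original code listing for the .install file is missing at this point in the page, so here is a sketch of what the schema declaration could look like, based on the columns described above (the field names are my assumptions, not necessarily the original author's):

```php
<?php
// simple_recipe.install

/**
 * Implements hook_schema().
 */
function simple_recipe_schema() {
  $schema['simple_recipe'] = array(
    'description' => 'Stores saved recipes.',
    'fields' => array(
      // a) Unique auto-increment ID for the recipe.
      'rid' => array('type' => 'serial', 'unsigned' => TRUE, 'not null' => TRUE),
      // b) The author's user ID.
      'uid' => array('type' => 'int', 'unsigned' => TRUE, 'not null' => TRUE, 'default' => 0),
      // c) The serialized form values.
      'form_values' => array('type' => 'text', 'size' => 'big'),
      // d) Status: 1 for active, 0 for disabled.
      'status' => array('type' => 'int', 'size' => 'tiny', 'not null' => TRUE, 'default' => 1),
      // e) Unix timestamp of the creation time.
      'created' => array('type' => 'int', 'not null' => TRUE, 'default' => 0),
    ),
    'primary key' => array('rid'),
  );
  return $schema;
}
```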


Enable the module and make sure we have the simple_recipe table in the database.

Step 3: Register paths

We need to register a path for this module. Let's edit simple_recipe.module and add the following code.
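The original code block is missing here; a plausible sketch of the menu registration, using the %user wildcard described below (the path and callback names are illustrative):

```php
<?php
/**
 * Implements hook_menu().
 */
function simple_recipe_menu() {
  $items['user/%user/simple-recipe'] = array(
    'title' => 'Simple recipe',
    'page callback' => 'drupal_get_form',
    // Pass the auto-loaded user object (position 1 in the path) to the form.
    'page arguments' => array('simple_recipe_form', 1),
    'access callback' => 'simple_recipe_access',
    'access arguments' => array(1),
    'type' => MENU_NORMAL_ITEM,
  );
  return $items;
}
```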


Now flush cache.

The variable %user in the path will auto-load a user object by its ID and pass it to our access callback. We will implement the access callback later.

Step 4: Create the form

We are going to have four (4) fields for this form.

  1. Recipe name (text field)
  2. Ingredients (textarea)
  3. Instructions (textarea)
  4. Difficulty (select)

Edit the simple_recipe.module and add the simple_recipe_form() function.
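The listing itself did not survive on this page; here is a sketch of what a four-field form builder could look like (the field keys, the simple_recipe_load() helper, and the difficulty options are my assumptions):

```php
<?php
/**
 * Form builder for the recipe form.
 */
function simple_recipe_form($form, &$form_state, $account) {
  // Load any previously saved values so we can repopulate the form.
  $values = simple_recipe_load($account->uid);

  $form['recipe_name'] = array(
    '#type' => 'textfield',
    '#title' => t('Recipe name'),
    '#default_value' => isset($values['recipe_name']) ? $values['recipe_name'] : '',
  );
  $form['ingredients'] = array(
    '#type' => 'textarea',
    '#title' => t('Ingredients'),
    '#default_value' => isset($values['ingredients']) ? $values['ingredients'] : '',
  );
  $form['instructions'] = array(
    '#type' => 'textarea',
    '#title' => t('Instructions'),
    '#default_value' => isset($values['instructions']) ? $values['instructions'] : '',
  );
  $form['difficulty'] = array(
    '#type' => 'select',
    '#title' => t('Difficulty'),
    '#options' => array(1 => t('Easy'), 2 => t('Medium'), 3 => t('Hard')),
    '#default_value' => isset($values['difficulty']) ? $values['difficulty'] : 1,
  );
  $form['submit'] = array(
    '#type' => 'submit',
    '#value' => t('Save'),
  );
  return $form;
}
```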


Please pay attention to the form element property '#default_value'; this is how we repopulate the form when data is present. For textfields and textareas, clean the content before it's output, for better security.

Step 5: Save the form

At this point if you click save, the form will be submitted. We will need to add a submit handler to capture and store the data.

Add the following function to simple_recipe.module file.


This function will be called once the form gets submitted.

Now that we have the form values, we will call drupal_write_record() to save them.
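The submit handler listing is also missing from the page; a sketch of how it could capture the values and hand them to drupal_write_record() (the column names match the schema described in Step 2 and are assumptions):

```php
<?php
/**
 * Submit handler: serialize the submitted values and save them.
 */
function simple_recipe_form_submit($form, &$form_state) {
  global $user;
  $record = array(
    'uid' => $user->uid,
    // Store the whole set of form values, serialized, for flexibility.
    'form_values' => serialize($form_state['values']),
    'status' => 1,
    'created' => REQUEST_TIME,
  );
  drupal_write_record('simple_recipe', $record);
  drupal_set_message(t('Your recipe has been saved.'));
}
```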


Step 6: Re-populate the form

We need to load the recipe data from the database, then extract it and feed it back to the Form API.

Add the following function to simple_recipe.module file.
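Since this listing is missing here too, below is a sketch of a loader that fetches the latest active row for a user and unserializes it (the function and column names are my assumptions, consistent with the sketches above):

```php
<?php
/**
 * Loads the most recently saved recipe values for a user.
 */
function simple_recipe_load($uid) {
  $row = db_select('simple_recipe', 'sr')
    ->fields('sr', array('form_values'))
    ->condition('uid', $uid)
    ->condition('status', 1)
    ->orderBy('created', 'DESC')
    ->range(0, 1)
    ->execute()
    ->fetchObject();
  return $row ? unserialize($row->form_values) : array();
}
```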


However, if you have a complicated form with lots of form elements, saving each value to its own column can be quite time consuming. Serialization is our friend.

I have included the source code in a zip file for you to check out.

Reference here:

May 19 2015
May 19

Using video as part of your content strategy is an excellent way to not only increase user engagement and connect to your audience, but also entertain your readers by using video to complement your text content. Social networks such as Twitter and Facebook are embracing the trend that video marketing has an integral role in user engagement. Reports predict that “by 2018, mobile video will represent 69 percent of global mobile data traffic, up from 53 percent in 2013.” If you haven’t integrated video marketing into your content strategy, it's time to get started. It's easier than ever with the Drupal Media module and its submodule, Media: YouTube.

Getting Started with the Drupal Media and Media: YouTube Modules

To help you get started, I have created a tutorial on how to add YouTube videos to your Drupal 7 site using the Media and Media: YouTube modules. First, let's review what makes the Media and Media: YouTube modules valuable for our project.

The Drupal Media module is one of the most popular Drupal modules as it serves as a framework for multimedia-related contributed modules. Its primary use is as a file browser for multimedia, whether internal or external files. The Drupal Media module can replace the Drupal core upload field. Drupal Media module submodules exist to build on top of this framework. They provide hooks into third-party APIs and external data sources so that the Media module can consume those data sources and display them within the multimedia library or browser to be used throughout your Drupal project.

One such submodule is the Media: YouTube module. This module enables Media to consume YouTube videos and display them as Media field data sources, hooking into all the features and functionality the Drupal Media module offers. This means that you'll be able to display YouTube videos on Nodes, in Views, Panels, or as part of Taxonomy Terms. The Media: YouTube module even provides for a way to display the YouTube video thumbnail and style the thumbnail with your site's image styles. All you'll need is a YouTube video URL.

Take a deep breath; it's time to make your first move. If you feel like you're in over your head, or Drupal's Media module is making you queasy, you can always hire the Drupal experts.

Download and Install the Modules

First, head over to Drupal.org and to the project pages for our key modules: Media and Media: YouTube. Be sure to grab the dependencies, too.

If you're using Drush, you'll just need to execute this command:

drush en media media_youtube -y

Primary modules


For this Drupal tutorial, I’m going to assume that you already know how to download and install modules through Drush or manually, so I’ll skip that step. 

Module and Drupal Core versions

I will be using Media module 7.x-2.0 alpha 4 and Media: YouTube module 7.x-2.0-rc5. You'll need to use Drupal 7 to follow this tutorial.

Configuring Drupal Media and Media: YouTube Modules

Manage the Video File Display

After you have downloaded and enabled Media and Media: YouTube modules, let’s head over and configure the display settings for your new Video file type. You’ll access this configuration under Structure -> File types.

Most of the time, you won’t need to edit the settings under manage fields or manage display for videos unless you have specific customizations that are required for your project. Navigate to manage file display.

Drupal manage Media Module file display for Video

As you see, there’s a set of predefined File Display Modes already in place, so I’ll choose the display handler I wish to use.

Drupal Media file video handler

You will need to tweak the Display Modes to fit your needs. Typically, I’ll use the Default File Display Handler for my video style that will be used in my Content Type to actually play the videos, and I use the Teaser File Display Handler to display a smaller YouTube preview image in my Views.

The images below illustrate the configuration settings for the Default File Display Handler.

Drupal Media file video display

Drupal Media file video display, further

Save the configuration.

If you want to use the YouTube preview images or thumbnails in a View or other location on your site, you will want to update the Teaser File Display Handler settings. Although I will show you an example of how to set up the Teaser File Display in this tutorial, I will touch more on how to add video to Views in my next tutorial.

The image below illustrates the configuration settings for the Teaser File Handler Display. The image styles available can be configured under Configuration -> Media -> Image Styles.

Drupal Media video teaser display

Add and Configure the Video Field on the Content Type

Now that you have set up your Video File Displays, you can head over to your Content Type to add and configure your YouTube Video field. In my case, I want to add YouTube videos to my Article Content Type. So, I’ll go to Structure -> Content Types and then select Manage Fields on my Article Content Type.

Drupal Content Type video field setting

Since the YouTube video is a new field, under Add New Field, choose a Label for your new video field and then select File for the Field Type. Your widget should automatically be set to the Media Browser Widget. Save your configuration.

Drupal Content Type add video field

The below images illustrate the configuration for the YouTube Video field settings.

Drupal Content Type configure video field setting

Drupal Content Type configure video field description

As you can see in the images above, I have chosen the options to enable the Library and Web Plugins. This allows my users to add the URL of a YouTube video or choose a previously added video from the Library. I have also chosen Video as the Allowed File Type, and have chosen my URI schemes. It's important to note here that I have selected Enable Description Field. Enabling the Video Description field helps your video's SEO and is worth making a common practice. Save your configuration.
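The same field and instance can be created programmatically with the Drupal 7 Field API. This is a sketch, not the module's required setup: the field name, label, and description are examples, and the widget settings keys follow Media 7.x-2.x, so verify them against your installed version:

```php
// Create the base file field (example machine name).
$field = array(
  'field_name' => 'field_youtube_video',
  'type' => 'file',
  'cardinality' => 1,
);
field_create_field($field);

// Attach an instance of the field to the Article content type.
$instance = array(
  'field_name' => 'field_youtube_video',
  'entity_type' => 'node',
  'bundle' => 'article',
  'label' => 'YouTube Video',
  'description' => 'Paste a YouTube URL or pick a video from the library.',
  'widget' => array(
    // 'media_generic' is the Media browser widget.
    'type' => 'media_generic',
    'settings' => array(
      // Mirror the UI choices: Video file type, youtube:// scheme,
      // and the Library and Web browser plugins.
      'allowed_types' => array('video' => 'video'),
      'allowed_schemes' => array('youtube' => 'youtube'),
    ),
  ),
  'settings' => array(
    // Corresponds to the Enable Description Field checkbox.
    'description_field' => 1,
  ),
);
field_create_instance($instance);
```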

Manage Display Settings on the Content Type

Next, you’ll need to manage your display settings on your Content Type.

Drupal manage display for Content Type video field

For your YouTube Video field, select Rendered File in your Format Settings. Now, select your View Mode. For me, the View Mode that I want to use is Default, which is the File Display Mode that we set up earlier. Save configuration.
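In code, the same display choice means pointing the field instance at file_entity's Rendered file formatter. A small sketch, assuming the example field name from above:

```php
// Load the existing instance, switch its default display to the
// Rendered file formatter, and use the Default file view mode
// configured under Structure -> File types.
$instance = field_info_instance('node', 'field_youtube_video', 'article');
$instance['display']['default'] = array(
  'label' => 'hidden',
  'type' => 'file_rendered',
  'settings' => array(
    'file_view_mode' => 'default',
  ),
);
field_update_instance($instance);
```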

Drupal view mode Media module

Add an Article Node with Video Content

Now, you can start adding content. Go to Content -> Add content -> select Article (node/add/article). Add your title and other information for your Node. For the Video field, select Browse to add your YouTube Video.

Drupal Media module YouTube field browse

Add the URL of your YouTube Video.

Drupal Media module YouTube field add video URL

Once added, you will see a thumbnail of your video. Save your content.

Drupal Media module YouTube field thumbnail

You should now see your YouTube video and be able to play your video directly on your site.
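If you ever need to create this content in code, for example in a migration, the Media Internet submodule can turn a YouTube URL into a managed file. A hedged sketch; the URL, title, and field name are placeholders:

```php
// Turn a YouTube URL into a file entity via the Media Internet provider.
$provider = media_internet_get_provider('https://www.youtube.com/watch?v=VIDEO_ID');
$file = $provider->save();

// Create an Article node and attach the video file to our field.
$node = new stdClass();
$node->type = 'article';
$node->language = LANGUAGE_NONE;
node_object_prepare($node);
$node->title = 'My video post';
$node->field_youtube_video[LANGUAGE_NONE][0] = array(
  'fid' => $file->fid,
  'display' => 1,
);
node_save($node);
```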

Drupal Media module YouTube player as field

Next Steps for your Media: YouTube fields

Displaying your YouTube video and YouTube player is just the first step in utilizing the Media: YouTube module. Now that we have our YouTube video as field data, any of our listings that use the Teaser display will show the thumbnail of the YouTube video. You can use regular image styles in the file display options to alter the thumbnail's size or aspect ratio, rather than settling for the original image.

You'll probably want to use that thumbnail in Views, or even the player in Views as a field display. You might have designs that call for multiple video thumbnails in a carousel or a sidebar. You can integrate your YouTube videos anywhere Views can expose fields from content: in FlexSliders, Views Slideshows, Colorbox dialogs—there are numerous possibilities. All of this is now possible by combining the Media, Media: YouTube, and Views modules.

Having trouble getting your videos to display? Say hello on Twitter. Did you enjoy the video guide? Subscribe on YouTube for more. Think this guide is useful? Don't forget to share!

May 18 2015

As mentioned during Dries's DrupalCon LA keynote, the Drupal Community Working Group is now accepting nominations for the Aaron Winborn Award, to honour Drupal community members who demonstrate personal integrity, kindness, and above-and-beyond commitment to the Drupal community.

Nominations are open until Monday 15 June 2015, and the selected recipient will receive a scholarship and stipend to attend DrupalCon with recognition during a plenary session at the event.

Submit your nominations here: