Oct 31 2014

I doubt that many people who read this will be unaware of the extremely severe security vulnerability that was discovered, reported, and patched a couple of weeks ago, or of the later release and the many related blog posts pointing out exactly how critical early updates and patching are.

If this has ruined a little of your free time recently, and/or meant your agency really earning the cost of any maintenance contracts you offer, consider how terrifying some of the press and its implications are for owners of unsupported Drupal sites, many of whom will be small charities and local organisations.

I would like to think that local Drupal groups and the 'Drupal community' in general would step up and help out; if enough of us do that, then we could generate some positive press. Yes, we have a security team, and that is good, but how are we going to help out?

My local Drupal group will attempt to answer questions and find people to provide a little support (anybody else?).

I appreciate that many people are not in a position to provide a lot of effort for free, but even a small amount of advice could get people on the right track, and Drupal groups are likely to know good freelancers who can afford to help a small company for considerably less than typical agency fees.

All those hackathons, sprints, and efforts to drive D8 forward: is anybody brave enough to divert some of that effort towards auditing and fixing local sites? I hope so.

BTW, I have slight doubts about this site, although I did fix it by hand based on this commit (this is an old alpha). I will be trashing this server shortly and migrating to a new beta 2 site on a fresh server.

Oct 30 2014

Tim Erickson, stpaultim, @stpaultim from Triplo
Alina, alimac, @czaroxiejka, from University of Illinois at Chicago
Cathy Theys, YesCT, @YesCT from BlackMesh

DrupalCon Amsterdam Sprints

DrupalCon is a great place to enhance your Drupal skills, learn about the latest modules, and improve your theming techniques. Sure, there are sessions, keynotes, vendor displays, and parties... like trivia night!

But there is also the opportunity to look behind the curtain and see how the software really gets made. And, more importantly, to lend your hand in making it. For six days, three before and three after DrupalCon, there are dedicated sprint opportunities where you can hang out with other Drupalistas testing, summarizing issues, writing documentation, working on patches, or generally contributing to the development of Drupal and the Drupal community.

We want to share some details about the DrupalCon Amsterdam Sprints (and pictures to reminisce about the good times) and mention some upcoming sprints that you can hopefully attend.

Our sponsors helped us have:

  • Space:
    • Co-working space Saturday and Sunday before the con.
    • Sprint space at the venue Monday-Thursday.
    • Big sprint space Friday.
    • Co-working space Saturday and Sunday after the con.
  • Food and coffee all of the days.
  • Sprint supplies: task cards, stickers, markers, signs, flip charts.
  • Mentor thank you dinner.

Pre-con sprints

During the weekend before DrupalCon, 60 people gathered on Saturday and 100 on Sunday at The Berlage, a fantastic old castle just blocks from the central train station. On most days the Berlage serves as a co-working space. For 48 hours it was home to contributors working together on Drupal core, contrib projects, distributions and Drupal.org itself. Our supportive sponsors supplied lunch and coffee on both days while contributors worked on a number of initiatives: Multilingual, Drupal 8 criticals and beta blocking issues, Headless Drupal and REST, porting contrib projects to Drupal 8, Drupal 8 Frontend United, Search, Drupal.org, Behat (Behavior Driven and javascript/frontend testing), Commerce, Panopoly, Rules, Media, Documentation, Migration, Performance, Modernizing Testbot, and more.

Outside of the Berlage
The outside of the Berlage co-working space (castle) with the Drupal Association banner.
(photo: @gaborhojtsy)

Sprinters working inside the Berlage castle
Sprinters sprinting inside the cool looking Berlage.
marthinal, franSeva, estoyausente, YesCT, Ryan Weal
(photo: @gaborhojtsy)

We had lots of rooms for groups to gather at the Berlage.

Field API room at the Berlage sprint.
pwolanin, dawehner, wimleers, Hydra, swentel
(photo: @Schnitzel)

Field API room at the Berlage sprint.
Sutharsan, yched, Berdir
(photo: @Schnitzel)

On Monday sprint attendance grew to 180 sprinters. We moved to the conference venue, Amsterdam RAI. Other pre-conference events taking place included trainings, the Community Summit, and the Business Summit. At this particular DrupalCon there was much excitement about the anticipated beta release of Drupal. Many people did a lot of testing to make sure that the beta would be ready.

Sprinters working together, talking.
Discussing a beta blocker issue they found.
lauriii, sihv, Gábor Hojtsy, lanchez
(photo: @borisbaldinger)

Mauzeh, Tobias and the sprint location at the venue.
Lots of people sprinting and testing the beta candidate, with support from experienced core contributors walking around and helping.
tstoeckler, mauzeh
(photo: @borisbaldinger)

During the con

Sprinting continued during the conference, Tuesday through Thursday. And, to prepare for Friday's mentored sprint, the core mentoring team scheduled a series of 8 BOFs (‘Birds of a Feather’ or informal sessions). Preparations included mentor orientation, setting up local environments, and reading, updating, and tagging issues in the Drupal issue queue. Mentoring BoFs were open to all conference participants.

Mentors sitting in a BoF room
Mentor Training
YesCT, sqndr, -, -, lazysoundsystem, neoxavier, Mac_Weber, patrickd, roderik, jmolivas, marcvangend, -, realityloop, rteijeiro
(photo: stpaultim)

To promote contribution sprints, mentors volunteered at the mentoring booth in the exhibition hall during all three days of DrupalCon. Conference attendees who visited the booth learned about the Friday sprints. Mentors also recruited additional mentors, and encouraged everyone to get involved in contributing to Drupal.

The mentor booth with lots of colorful signs
The mentor booth with lots of signage, and welcoming people.
mradcliffe, kgoel
(photo: stpaultim)

At the booth, conference attendees were able to pick up our new contributor role task cards, which outlined some of the various ways that people can contribute to Drupal, along with stickers recognising the specific roles that they already play.

picture of the cards and stickers
Task cards and stickers
(photo: @HornCologne)

Mentored Sprint

In Amsterdam, 450 people showed up to contribute to Drupal on Friday.

lots of people at the Friday sprint
lilGemVinny, with lots of others
(photo: _SteffenR)

People gathered in groups to work on issues together.

A group around a table.
-, -, -, -, -
(photo: @peterlozano)

For many people the highlight of the week is the large “mentored” sprint on Friday. 180 of the 450 participated in our First-time sprinter workshop designed to help Drupal users and developers better understand the community, the issue queues, and contribution. The workshop helped people install the tools they would use as contributors. Another 100 were ready to start work right away with our 50 mentors. Throughout the day people from the first-time sprinter workshop transitioned to contributing with other sprinters and mentors. Sprinters and mentors helped people identify issues that had tasks that aligned with their specific skills and experience.

The dark workshop room with presentation screen, filled with people.
The workshop room.
(photo: stpaultim)

Mentors checking in with tables full of sprinters.
Mentors (in orange shirts): rachel_norfolk, roderik
(photo: stpaultim)

Mentored Core Sprint room sign, modified with marker to have a list of good and not good novice tasks.
Hand written signs were everywhere!
(photo: stpaultim)

A group picture of some of the mentors.
mradcliffe, Aimee Degnan, alimac, kgoel, rteijero, Deciphered, emma.maria, mon_franco, patrickd, 8thom, -, lauriii, marcvangend, ceng, Ryan Weal, YesCT, realityloop, -, lazysoundsystem, roderik, Xano, David Hernández, -, -, -, -
(photo: @Crell)

Near the end of the day, over 100 sprinters (both beginners and veterans) gathered to watch the work of first time contributors get committed (added) to Drupal core. Angie Byron (webchick) walked the audience through the process of evaluating, testing, and then committing a patch to Drupal core.

webchick celebrating and showing her screen on the overhead. Contributors clapping.
Live commit by webchick
webchick, dmsmidt, valgibson, marcvangend
(photo: Pedro Lozano)

Extended sprints on Saturday and Sunday

On Saturday after DrupalCon, 80 dedicated contributors moved back to the Berlage to continue the work on Drupal core, and 60 people came to contribute on Sunday. During these final days of extended sprints, Drupal beginners and newcomers had the chance to exercise their newly acquired skills while working together with some of the smartest and most experienced Drupal contributors in the world. The value of the skills exchanges and personal relationships that come from working in this kind of environment cannot be overstated. While there is an abundance of activity during Friday's DrupalCon contribution sprints, the atmosphere during extended sprints is a bit more relaxed. Attending the pre- and post-con sprints gives sprinters time to dive deep into issues and tie up loose ends. After a number of hallway and after-session conversations, contributors working on specific Drupal 8 initiatives meet to sketch out ideas, using whiteboards or any other means of note-taking to make plans for the future.

Sprinters in the front end room.
LoMo, Outi, pfrenssen, lauriii, mortendk, emma.maria, lewisnyman
(photo: stpaultim)

Berlage first room filled with people at all the tables, some high tables.
Aimee Degnan, Schnitzel, dixon, -, Xano, alimac, boris, Gábor Hojtsy, realityloop, YesCT, justafish, eatings, fgm, penyaskito, pcambra, -
(photo: stpaultim)

Drupal.org sprint room at the Berlage.
-, jthorson, opdavies, drumm, RuthieF, -, -, killes, dasrecht
(photo: stpaultim)

Feedback about the sprints

Please contact me to get your DrupalCon Amsterdam sprint related blog added to the list here.

Upcoming sprints

Plan your travel for the next event so you can sprint with us too!


If there are corrections, for example of names of people in the pictures, please let me know. -Cathy, @YesCT, or Drupal.org contact form.

Oct 27 2014

A couple of months ago, after a harrowing cascade of git merge conflicts involving compiled CSS, we decided it was time to subscribe to the philosophy that compiled CSS doesn't belong in a git repository. Sure, there are other technical solutions teams are tossing around that try to handle merging more gracefully, but I was more interested in simply keeping the CSS out of the repo in the first place. After removing the CSS from the repo, we suddenly faced two primary technical challenges:

  • During development, switching branches now needs to trigger a recompilation of the stylesheets
  • Without the CSS in the repo, it's hard to know how to get the code up to Acquia

In this article, I'll describe the solutions we came up with to handle these challenges, and welcome feedback if you have a different solution.

Local Development

If you're new to using tools like Sass, Compass, Guard and LiveReload, I recommend taking a look at a project like Drupal Streamline. For the purpose of this post, I'm going to assume that you're already using Compass in your project. Once the CSS files have been removed, you'll want to run compass compile to trigger an initial compilation of the stylesheets. However, having to remember to compile every time you switch to a new branch introduces not only an inconvenience, but also a strong possibility for human error.

Luckily, we can use git hooks to remove this risk and annoyance. In this case, we'll create a post-checkout hook that triggers compiling every time a new branch is checked out:

  1. Create a file called post-checkout in the .git/hooks folder
  2. Add the following lines to that file:
    #! /bin/sh
    # Start from the repository root.
    cd ./$(git rev-parse --show-cdup)
    compass compile
  3. From the command line in the repository root, type chmod +x .git/hooks/post-checkout

Assuming you have compass correctly configured, you should see the stylesheets getting re-compiled the next time you git checkout [branch], even if you're not already running Guard and LiveReload.
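The three numbered steps above can also be scripted. Here's a hedged sketch that sets up the hook in a throwaway repository (it assumes git is installed; compass is only needed later, when the hook actually fires on checkout):

```shell
# Demo of the steps above in a throwaway repository. The hook is created
# and made executable, but not run, so compass isn't needed yet.
git init -q hook-demo

# Steps 1 and 2: create .git/hooks/post-checkout with the compile commands.
cat > hook-demo/.git/hooks/post-checkout <<'EOF'
#! /bin/sh
# Start from the repository root.
cd ./$(git rev-parse --show-cdup)
compass compile
EOF

# Step 3: make it executable so git will run it after `git checkout`.
chmod +x hook-demo/.git/hooks/post-checkout
ls -l hook-demo/.git/hooks/post-checkout
```

In a real project you would of course write the hook into your existing repo's .git/hooks folder rather than a demo repository.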

Deploying to Acquia

Now that CSS is no longer being deployed when we push our repo up to Acquia, we need to figure out how we're going to get it there. It would be possible to force-add the ignored stylesheets before I push the branch up, but I don't really want all those additional commits on my development branches in particular. Luckily, Acquia has a solution that we can hack which will allow us to push the files up to Dev and Stage (note, we'll handle prod differently).

Enter LiveDev

Acquia has a setting that you can toggle on both the dev and test environments that allows you to modify the files on the server. It's called 'livedev', and we're going to exploit its functionality to get our compiled CSS up to those environments. After enabling livedev in the Acquia workflow interface, you are now able to scp files up to the server during deployment. Because I like to reduce the possibility of human error, I prefer to create a deploy script that handles this part for me. It's basically going to do three things:

  1. Compile the css
  2. scp the css files up to Acquia livedev for the correct environment
  3. ssh into Acquia's server and checkout the code branch that we just pushed up.

Here's the basic deploy script that we can use to accomplish these goals:


#! /bin/sh

REPO_BASE='[project foldername here (the folder above docroot)]'
# Set the css_path and livedev path
CSS_PATH='[path/to/compiled/css/folder]'
# Replace [user@devcloud.host] with your real Acquia Cloud SSH host
# Available in the AC interface under the "Users and keys" tab
ACQUIA_LIVEDEV='[user@devcloud.host]'
CURRENT_DIR="${PWD##*/}"

# Check that we are running from the repository base
if [ ! "$CURRENT_DIR" = "$REPO_BASE" ]; then
  echo 'Please be sure that you are running this command from the root of the repo.'
  exit 2
fi

# Figure out which environment to deploy to
while getopts "e:" var; do
    case $var in
        e) ENV="${OPTARG}";;
    esac
done

# Set the ENV to dev if 'e' wasn't passed as an argument
if [ "${#ENV}" -eq "0" ]; then
  ENV='dev'
fi

if [ "$ENV" = "dev" ] || [ "$ENV" = "test" ]; then
  # Get the branch name
  BRANCH_NAME="$(git symbolic-ref --short HEAD 2>/dev/null)" ||
  BRANCH_NAME="detached"     # detached HEAD

  echo "Pushing $BRANCH_NAME to acquia cloud $ENV"
  git push -f ac $BRANCH_NAME # This assumes you have a git remote called "ac" that points to Acquia

  echo "Compiling css"
  compass compile

  # Upload to server
  echo "Uploading styles to server"
  scp -r $CSS_PATH "$ACQUIA_LIVEDEV:~/$ENV/livedev/$CSS_PATH"

  # Pull the updates from the branch to livedev and clear cache
  echo "Deploying $BRANCH_NAME to livedev on Acquia"
  ssh $ACQUIA_LIVEDEV "git checkout .; git pull; git checkout $BRANCH_NAME; cd docroot; exit;"

  echo "Clearing cache on $ENV"
  cd docroot
  drush @[DRUSH_ALIAS].$ENV cc all -y

  echo "Deployment complete"
else
  # If not dev or test, throw an error
  echo 'Error: the deploy script is for the Acquia dev and test environments'
  exit 1
fi

Now, I don't pretend to be a shell scripting expert and I'm sure this script could be improved; however, it might be helpful to explain a few things. To start with, you will need to chmod +x [path/to/file]. I always put scripts like this in a bin folder at the root of the repo. There are a few other variables that you'll need to change if you want to use this script, such as REPO_BASE, CSS_PATH and ACQUIA_LIVEDEV. Also, the script assumes that you have a git remote called "ac", which should point to your Acquia Cloud instance. Finally, the drush cache clear portion assumes that you have a custom drush alias created for your livedev environment for both dev and test; if not, you can remove those lines. To deploy the site to dev, you would run the command bin/deploy; to deploy to the staging environment, bin/deploy -e test.

Deploying to Prod

Wisely, Acquia doesn't provide keys to run livedev on the production environment, and this approach is probably more fragile than we'd like anyway. For the production environment, we're going to use an approach that force-adds the stylesheet when necessary.

To do this, we're again going to rely on a git hook to help reduce the possibility of human error. Because our development philosophy relies on a single branch called "production" that we merge into and tag, we can use git's post-merge hook to handle the necessary force-adding of our stylesheet.

#! /bin/sh

CSS_PATH='[path/to/compiled/css/folder]'

BRANCH_NAME="$(git symbolic-ref --short HEAD 2>/dev/null)" ||
BRANCH_NAME="detached"     # detached HEAD

if [ "$BRANCH_NAME" = "production" ]; then
  compass compile
  git add $CSS_PATH -f
  git diff --cached --exit-code > /dev/null
  if [ "$?" -eq 1 ]; then
    git commit -m 'Adding compiled css to production'
  fi
fi
As with the post-checkout hook, you'll need to make sure this file is executable. Note that after the script stages the css files, git is able to confirm whether there are differences in the current state of the files, and only commit the files when there are changes. After merging a feature branch into the production branch, the post-merge hook gets triggered, and I can then add a git tag, push the code and new tag to the Acquia remote, and then utilize Acquia's cloud interface to deploy the new tag.
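If the staged-change check in the hook looks opaque, you can watch git diff --cached --exit-code behave in isolation. A throwaway-repository sketch (assumes git is installed; the css file here just stands in for compiled output):

```shell
# `git diff --cached --exit-code` exits 0 when the index matches HEAD,
# and 1 when something is staged -- the condition the hook tests.
git init -q diff-demo
git -C diff-demo -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m 'initial commit'

git -C diff-demo diff --cached --exit-code > /dev/null
echo "clean index exit code: $?"       # prints 0: nothing staged, no commit needed

echo 'body { color: red; }' > diff-demo/styles.css
git -C diff-demo add styles.css
git -C diff-demo diff --cached --exit-code > /dev/null
echo "staged change exit code: $?"     # prints 1: time to commit
```

This is why the hook only creates the extra commit when the compiled CSS actually changed.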


While this may seem like a lot of hoops to jump through to keep compiled CSS out of the repository, the deploy script actually fits very nicely with my development workflow, because it allows me to easily push up the current branch to dev for acceptance testing. In the future, I'd like to rework this process to utilize Acquia's Cloud API, but frankly, my tests with the API thus far have returned unexpected results, and I haven't wanted to submit one of our coveted support tickets to figure out why the API isn't working correctly. If you're reading this and can offer tips for improving what's here, sharing how you accomplish the same thing, or happen to work at Acquia and want to talk about the bugs I'm seeing in the API, please leave a comment. And thanks for reading!


Dave Reid made a comment below about alternatives to LiveDev and the possibility of using tags to accomplish this. As I mentioned above, LiveDev works well for me (on dev and test) because it fits well into my typical deployment workflow. The problem I see with using tags to trigger a hook is that we are in the practice of tagging production releases, but not dev or test releases. Thinking through Dave's suggestion, however, led me to an alternative approach to LiveDev that still keeps the repo clean, using Git's "pre-push" hook:

#! /bin/sh

ACQUIA_REMOTE='ac' # put your Acquia remote name here
REMOTE_NAME="$1"   # git passes the remote's name as the first argument

if [ "$REMOTE_NAME" = "$ACQUIA_REMOTE" ]; then
  compass compile
  git add docroot/sites/all/themes/ilr_theme/css/ -f
  git diff --cached --exit-code > /dev/null
  if [ "$?" -eq 1 ]; then
    git commit -m "Adding compiled css"
  fi
fi
The hook receives the remote as the first argument, which allows us to check whether we're pushing to our defined Acquia remote. If we are, the script then checks for CSS changes, and adds the additional commit if necessary. The thing I really like about this approach is that the GitHub repository won't get cluttered with the extra commit, but the CSS files can be deployed to Acquia without livedev.

Oct 25 2014

While working on the console project I have found (so far) two Drupal 8 core bugs. In this blog post I will explain how I found these bugs and what I have done to fix them.

First bug - Wrong service definition

I found the first one when adding a command for debugging registered services within the container.

Using the console command:

$ bin/console container:debug


The service definition for “controller.entityform” in the "core/core.services.yml" file was wrong: the configured arguments being injected are not the ones the class expects when creating a new instance of the object.

The "core.services.yml" is where all Drupal core services are defined.

Current declaration:

controller.entityform:
    class: Drupal\Core\Entity\HtmlEntityFormController
    arguments: ['@class_resolver', '@controller_resolver', '@service_container', '@entity.manager']

More information about which services are registered on Drupal 8: https://api.drupal.org/api/drupal/core!core.services.yml/8

How to reproduce:


The line of code used to reproduce the error is part of the console project; as you can see, it is in the project repository. https://github.com/hechoendrupal/DrupalAppConsole/blob/0.2.15/src/Comman...

The get($id) method returns the service object associated with the given $id from the container. You can see the code in the Symfony Dependency Injection Component repository: https://github.com/symfony/symfony/blob/2.5/src/Symfony/Component/Depend...

The following error was given:

Recoverable fatal error: Argument 1 passed to Drupal\Core\Entity\HtmlEntityFormController::__construct() must implement interface Drupal\Core\Controller\ControllerResolverInterface, instance of Drupal\Core\DependencyInjection\ClassResolver given in Drupal\Core\Entity\HtmlEntityFormController->__construct() (line 39 of core/lib/Drupal/Core/Entity/HtmlEntityFormController.php).

Proposed resolution

Inject the proper arguments on the service definition. Based on the API documentation for the class HtmlEntityFormController: https://api.drupal.org/api/drupal/core%21lib%21Drupal%21Core%21Entity%21...

Service declaration should look like this:

    controller.entityform:
        class: Drupal\Core\Entity\HtmlEntityFormController
        arguments: ['@controller_resolver', '@entity.manager', '@form_builder', NULL]

Even though the final resolution was to remove the service definition for “controller.entityform”, running the console command helped me find this bug in Drupal 8 core and get it fixed.

More information on the issue page:
- Name: Remove service definition for controller.entityform from core/core.services.yml
- Link: https://www.drupal.org/node/2303823 - Closed (fixed)

Second Bug - Invalid Composer version

I found this bug when installing the Console project:

$ COMPOSER_BIN_DIR=bin composer require --dev drupal/console:@stable


I was not able to complete the installation because the PHP version required in the "composer.json" file was not valid.

Current declaration:

"require": {
"php": ">5.4.4-13",

How to reproduce:

A) Running the installation process of the Console project. Admittedly, not everyone is using the console project, even though Larry Garfield, AKA @Crell, recommended it on Twitter after seeing my Drupal 8 Console Lightning Talk at DrupalCon Amsterdam.

The D8 console from @jmolivas looks even better than before! Every D8 developer should use it. #DrupalCon

— Larry Garfield (@Crell) October 2, 2014

B) Validating the composer file by running the composer validate command. The composer documentation also recommends using this command:

“You should always run the validate command before you commit your composer.json file, and before you tag a release. It will check if your composer.json is valid.” https://getcomposer.org/doc/03-cli.md#validate

$ composer validate

After running the validate command, the following error was given:

./composer.json is invalid, the following errors/warnings were found: require.php : invalid version constraint (Could not parse version constraint >5.4.4-13: Invalid version string "5.4.4-13")

Proposed resolution

Drupal 8 requires at least PHP 5.4.5, so the recommendation was to update the "composer.json" file as follows:

"require": {
"php": ">=5.4.5",

This time the patch was accepted and the bug was fixed.

More information on the issue page:
- Name: Composer require.php : invalid version constraint
- Link: https://www.drupal.org/node/2354301 - Fixed

Oct 23 2014

Recently one of my clients had a problem with a large portion of transactional email never being seen. The emails were being directed to the recipients' spam folders and were generally being overlooked. These were important emails regarding things like membership confirmations, invoices and event information, and were critical to the experience of the members.

Why was this happening? Mostly because the emails were being sent by the web server. I switched it to Mandrill, a service designed to take care of the headaches of sending transactional email, and this greatly improved the delivery rate.

It is notoriously difficult to ensure emails from your application (such as Drupal) actually get delivered without getting caught in spam filters. Email providers like Mandrill have the expertise to maximise delivery rate. You are unlikely to have the time or expertise to manage this process for your own web server.

Mandrill provides great stats so that you can gain a greater understanding of email delivery: whether messages are getting caught by spam filters, bounce rates, open rates, etc. You can also test different versions of the same email to see which one performs best in terms of open rates.

Difference between Mandrill and MailChimp

It is easy to get confused between services like Mandrill and services like MailChimp. Mandrill is actually owned by MailChimp and uses the same delivery infrastructure, but the two services have very different purposes. As mentioned, Mandrill is intended for transactional email like confirmation emails, password resets, invoices sent by email. Services like MailChimp are intended to manage email lists for things like newsletters and drip email campaigns.

Setting up Mandrill

Mandrill is relatively simple to install and setup on a Drupal site.

The requirements are:

  • Mandrill module
  • Mandrill library
  • Mail System module
  • Mandrill account

After installing, go to Configuration > Web services > Mandrill.

Add your Mandrill API key (which you need to get from your Mandrill account).

To switch email to use Mandrill, you need to configure Mail System Settings. Go to Configuration > System > Mail System. Here you will see all of the mail classes available and you can switch them to MandrillMailSystem.

MailSystem settings

In my case, with the client project, a lot of the transactional emails were a result of a customer buying a product with Ubercart. The emails were triggered using Drupal Rules.

Mandrill Test Module

While testing, you can enable the Mandrill Test module. When an email is sent, the subject will be replaced with "Altered test subject" and the body will be replaced with "Altered mail content". Just don't enable this on the production site!

Sending a test email

You can send a quick test email from Configuration > System > Services > Mandrill. You can send the test email to any email address and this will be sent via Mandrill.

Send test email via Mandrill

Mandrill pricing

Mandrill is very competitive price-wise. The first 12,000 emails are free. After that, they are $0.20 per 1000 emails. That is not per email, but per 1000 emails! So if you send 13,000 emails, the total cost is $0.20. You can send 17,000 emails for $1.
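That arithmetic can be sanity-checked with a few lines of shell. This is just an illustration: it assumes Mandrill bills in whole 1,000-email blocks, rounded up, and the function name is mine, not Mandrill's:

```shell
# Estimate cost in cents, assuming a 12,000-email free tier and 20 cents
# per (whole, rounded-up) block of 1,000 emails after that.
mandrill_cost_cents() {
  emails=$1
  free=12000
  rate_cents=20
  billable=$(( emails > free ? emails - free : 0 ))
  blocks=$(( (billable + 999) / 1000 ))  # round up to whole 1000-email blocks
  echo $(( blocks * rate_cents ))
}

mandrill_cost_cents 12000   # prints 0   (still free)
mandrill_cost_cents 13000   # prints 20  ($0.20)
mandrill_cost_cents 17000   # prints 100 ($1.00)
```

So even at 17,000 emails a month, the bill stays around a dollar under these assumptions.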

Wrapping up

Most Drupal websites will need to send some form of email. The last thing you want is to have these emails unnecessarily caught by spam folders. Fortunately this is a problem that is relatively easy to fix with a service like Mandrill and very cost efficient.

Note: I am in no way affiliated with Mandrill or MailChimp!

Oct 22 2014

I recently built my first site with Drupal 8, off of the public beta1. One thing I found very quickly was that online learning resources for Drupal 8—as compared to Drupal 7—were essentially non-existent. Most of the tools that I normally reach for—drupal.org issue queues, Stack Exchange, tech blog tutorials, and YouTube tutorials—simply haven't built up a good stock of Drupal 8 answers yet. To make matters worse, what little knowledge I could find was often hopelessly out of date by the time I encountered it.

This is, of course, nobody's fault. D8 is moving quickly, and needs to keep moving quickly in order to get a release out in a timely manner. It's impossible to document such a moving target, and the people most able to write such documentation are hard at work building the actual Drupal core.

Without the ability to Google my Drupal 8 questions, I had to figure out things by intuition, and by analogy with other components. I figured out the theme structure by inspecting the source of Bartik, and I figured out how to create new modules largely from looking at the core Comment module. Slowly, through trial and lots and lots and lots of error, I found answers to all of my questions. Those answers follow.

  • Where do I download a base theme for Drupal 8?

    As of this writing, you don't. It's true that there are a few themes that have tried to keep pace with Drupal 8 in the pre-beta phase. These are admirable efforts, but none have kept up with the rapid pace of D8 development.

    I tried my normal go-to, Adaptive Theme, but the D8 version was unusable, and threw tons of fatal errors as soon as I installed it. I took a look at Zen, but the D8 version hadn't been updated in a year, and its own release notes called it "completely broken." Bootstrap offered a similar caution on their project page.

    I'm sure circumstances will improve, but as of today, if you want to theme your D8 site, you're going to have to work from scratch. But I promise it's really not so scary.

  • How do I create a new theme in Drupal 8?

    Here's the folder structure of my custom theme.

    The only file that absolutely must exist in order for your theme to work is theme_name.info.yml. As you might have guessed, this is equivalent to the old theme_name.info that we used in D6 and D7. Of course, a theme with just an info file doesn't do very much—enabling a theme with just an info file is essentially the same thing as enabling Stark (still shipping with D8, by the way). Here's my info file; you can probably intuit most of what it does:

    name: 'matt'
    type: theme
    description: 'My first custom drupal 8 theme'
    stylesheets:
      all:
        - css/styles.css
    scripts:
      js/modernizr.custom.04204.js: {}
    regions:
      header: 'Header'
      messages: 'Messages'
      navigation: 'Navigation'
      sidebar: 'Sidebar'
      sidebar_two: 'Sidebar Two'
      content: 'Main content'
      footer: 'Footer'
    features:
      - favicon
      - logo
    version: 8.0.x
    core: 8.x

    A side note here that I won't dwell on: you'll probably find YML pretty hard to read for the first little while. I still haven't gotten all the way used to it, but it's gotten a lot better in just a few weeks. I also find that liberal use of code folding in Sublime Text makes YML a lot easier to digest.

    The other files that you'll almost definitely want are theme_name.theme, a php file equivalent to template.php and used mostly for preprocess functions, and page.html.twig, equivalent to page.tpl.php in D7.

  • How do I declare a new template file in my Drupal 8 theme?

    No real changes here from Drupal 7, other than the fact that php templating is gone, and replaced with Twig templating. Just like Drupal 7, you can instantiate new template files in the root of your custom theme folder simply by giving them the same name as one of the template files in core.

    Like Drupal 7, you can limit the scope of your template file by giving it a specific name, according to that element's "theme suggestions." (The new documentation is no longer calling these "theme hook suggestions.") So for instance, you can create a node.html.twig which will apply to all nodes, and then a node--article.html.twig which will apply only to nodes of the type article. Note that the available theme suggestions are vastly improved in D8, giving you out-of-the-box ability to target by content type, view mode, and a wide variety of other conditions. Improved theme suggestions are one of my favorite new features in D8.

  • How do I find out what template files I can use?

    The easiest thing to do here is to just turn on Twig debug mode. This is by far my favorite new feature of D8. When it's turned on, every template's markup is wrapped in HTML comments that list the available template suggestions, with output like:
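    The screenshot that belonged here is gone; as a rough sketch, the debug output in the page source looks something like this (the suggestion names below are for a hypothetical article node, not from the original post):

```html
<!-- THEME DEBUG -->
<!-- THEME HOOK: 'node' -->
<!-- FILE NAME SUGGESTIONS:
   * node--1--full.html.twig
   * node--1.html.twig
   * node--article--full.html.twig
   * node--article.html.twig
   * node--full.html.twig
   x node.html.twig
-->
```

    The "x" marks the template file that is actually being used.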

    You can also accomplish a pretty much identical thing by implementing a new API hook added for D8, hook_theme_suggestions_alter.

  • How do I turn on twig debug mode?

    D8 ships with a new file called services.yml. I'm honestly still learning how to use it, but the gist is that it contains global configuration for PHP services. See instructions on drupal.org. You'll find the file at sites/default/services.yml. Here's my services.yml:

        # Twig debugging:
        # When debugging is enabled:
        # - The markup of each Twig template is surrounded by HTML comments that
        #   contain theming information, such as template file name suggestions.
        # - Note that this debugging markup will cause automated tests that directly
        #   check rendered HTML to fail. When running automated tests, 'twig_debug'
        #   should be set to FALSE.
        # - The dump() function can be used in Twig templates to output information
        #   about template variables.
        # - Twig templates are automatically recompiled whenever the source code
        #   changes (see twig_auto_reload below).
        # For more information about debugging Twig templates, see
        # http://drupal.org/node/1906392.
        # Not recommended in production environments
        # @default false
        debug: true
        # Twig auto-reload:
        # Automatically recompile Twig templates whenever the source code changes.
        # If you don't provide a value for twig_auto_reload, it will be determined
        # based on the value of twig_debug.
        # Not recommended in production environments
        # @default null
        auto_reload: true
        # Twig cache:
        # By default, Twig templates will be compiled and stored in the filesystem
        # to increase performance. Disabling the Twig cache will recompile the
        # templates from source each time they are used. In most cases the
        # twig_auto_reload setting above should be enabled rather than disabling the
        # Twig cache.
        # Not recommended in production environments
        # @default true
        cache: false
        # Default key/value storage service to use.
        # @default keyvalue.database
        #default: keyvalue.database
        # Collection-specific overrides.
        #state: keyvalue.database
        # Default key/value expirable storage service to use.
        # @default keyvalue.database.expirable
        #default: keyvalue.database.expirable
  • Why do I have to clear cache after every change in my template file?

    You have to set auto_reload to true in services.yml (see the services.yml sample in the previous answer, specifically the auto_reload and debug settings).

  • I did set twig auto_reload to true, why do I still have to clear cache all the time to see my Twig changes?

    Some elements have an additional layer of caching. I ran into this trying to edit the 'submitted by' information in node.html.twig. To disable this during development, do the following:

    Then, in settings.php, uncomment these lines:
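    The snippet itself didn't survive here. Based on D8's example.settings.local.php (treat the exact keys as an assumption; they shifted between betas), the relevant line points the render cache at the null backend:

```php
<?php
// Assumed from example.settings.local.php: send the render cache to the null
// backend so Twig changes show up without a full cache rebuild. This relies
// on the null cache service defined in development.services.yml.
$settings['cache']['bins']['render'] = 'cache.backend.null';
```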

  • How do I preprocess my template files?

    It's about the same as D7 actually.

    Note that, like D7, only the base hook can be used for preprocessing templates. So if you declare node--article--teaser.html.twig you cannot declare hook_preprocess_node__article__teaser, but instead must use hook_preprocess_node and then add conditional logic inside to limit the scope of your preprocessing.
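    As a minimal sketch, that conditional-logic approach looks like this in matt.theme (the class being added is my own illustration, not something from the original post):

```php
<?php
/**
 * Implements hook_preprocess_node() for the matt theme.
 *
 * Only the base hook is available for preprocessing, so the scope is
 * narrowed with conditional logic inside the function.
 */
function matt_preprocess_node(&$variables) {
  if ($variables['node']->bundle() === 'article' && $variables['view_mode'] === 'teaser') {
    // Hypothetical example: add a class only to article teasers.
    $variables['attributes']['class'][] = 'article-teaser';
  }
}
```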

  • Why am I unable to save services.yml?

    That's a great question, I have no idea. I was unable to save this file regardless of file permissions or ownership, using Sublime Text or PHP Storm.

    I didn't encounter this problem with any other D8 file, and it is no longer reproducing for me in D8 beta2. I was ultimately able to save using Atom. ::shrug::

  • Where is jQuery?

    jQuery is still shipping with D8 (version 2.1.0 by the way!), but it is not injected into the page by default.

  • How do I enable jQuery in Drupal 8?

    It's pretty easy actually. In your_theme_name.theme:

    /**
     * Add jquery.
     */
    function matt_page_alter(&$page) {
      $page['#attached']['library'][] = 'core/jquery';
    }
  • How do I add my own custom Javascript in Drupal 8?

    This took me a while to figure out. I wanted to add a custom version of the Modernizr library. Though a version of Modernizr ships with D8, I wanted a different version. Here's how I attached it:

      • Save your custom script in themes/theme_name/js/script_name.js. Technically this can be any path.
      • Declare your script in themes/theme_name/theme_name.libraries.yml:

          matt-corescripts:
            version: VERSION
            js:
              js/modernizr.custom.04204.js: {}

        Line 4 is just the path to the javascript file. I'm not sure what line 2 ("version: VERSION") is all about, but I'm being literal here: you actually write the word version in all caps like that. I guess there's a way to allow for multiple versions of the same library, though I'm not sure how to do it.

      • In theme_name.theme, do this:

        /**
         * Add custom scripts.
         */
        function matt_page_alter(&$page) {
          $page['#attached']['library'][] = 'matt/matt-corescripts';
        }

        Note that "matt" in 'matt/matt-corescripts' is just the name of the theme, and "matt-corescripts" is the library name I chose on the first line of matt.libraries.yml.

    Here's an overview of the way these files are connected:

    more info here

  • How do I add external CSS libraries? (e.g. google fonts)

    I'm pretty sure there's a more native way to do this, but the only solution I could find was to do it in html.html.twig. Note line 7.

    <!DOCTYPE html>
    <html{{ html_attributes }}>
      <head>
        {{ page.head }}
        <title>{{ head_title }}</title>
        {{ page.styles }}
        <link href='http://fonts.googleapis.com/css?family=Open+Sans' rel='stylesheet' type='text/css'>
        {{ page.scripts }}
      </head>
      <body{{ attributes }}>
        <a href="#main-content" class="visually-hidden focusable skip-link">
          {{ 'Skip to main content'|t }}
        </a>
        {{ page_top }}
        {{ page.content }}
        {{ page_bottom }}
        {{ page.scripts('footer') }}
      </body>
    </html>
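    For what it's worth, I suspect the more native route is declaring the external file in matt.libraries.yml and attaching that library; treat the syntax below as an educated guess rather than something I've verified:

```yaml
# Hypothetical library definition in matt.libraries.yml for an external font.
matt-fonts:
  css:
    theme:
      'http://fonts.googleapis.com/css?family=Open+Sans': { type: external }
```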
  • How do I disable commenting on articles?

    Comments on nodes are now fields, and thus controlled with field settings.

    Or, better yet, just delete the field:

  • How do I change the submitted by information?

    This is one of the first things I change on any Drupal site build. I can't stand the format "Posted by admin on 2013-02-03 24:32:34". This has mercifully been moved into node.html.twig for D8:

    Just copy this file into your custom theme and edit as needed.

  • Why can't I disable modules in Drupal 8?

    Technically, D8 has dropped the distinction between "disable" and "uninstall," but that's fairly academic. Let's not dwell on it. As it turns out, the check boxes on the modules page serve no purpose other than to confuse you—I assume they'll be removed in the full release of D8. To uninstall a module, simply navigate to admin/modules/uninstall.

  • How do I create a custom module in Drupal 8?

    Creating a bare-bones, "hello world" type module is about the same in Drupal 8 as it was in 7. You only need two files: your_module.info.yml and your_module.module. I'm told that the .module is actually optional now, but I've yet to figure out how to produce a functioning module without it, so I'm not sure if that's true.

    name: Matthew
    description: A simple hello world module.
    core: 8.x
    package: Custom
    type: module
    /**
     * @file
     * Code for the matthew.module.
     */

    function matthew_page_alter(&$page) {
      drupal_set_message('Hello world');
    }

    My first Drupal 8 module

  • What the hell happened to all the fonts on the node add page?

    I have no idea. I think this happened after I disabled CKeditor. The following screen shot is definitely not how the node add page looks out of the box:

    I never solved this, but I also never reproduced it. Once I reproduce it, I'll open an issue thread on drupal.org, but for now I don't really have anything to report.

  • How do I version my configuration with git?

    You've probably heard a lot about the configuration management initiative over the last few years. You probably also heard recently that active configuration is now stored in the database by default. In the end, this turns out to be a pretty trivial change to the overall architecture of CMI as far as most users are concerned, because the only CMI files that you meaningfully interact with are still stored in code.

    If you're familiar with the Features module from D7, the new Config module follows exactly the same paradigm. The upshot is that this is how you export your configuration:

      drush @yoursite.local config-export
      git add -f sites/default/files/config_wNOLcmycPFZCrXJ9wis9dCdSR4lpYILdBsFxSWuK5Hzhcr-irILQ0u25dfasd9sdfsadWaUDwMg
      git commit -m 'Exporting configuration to code.'
      git push origin master
      drush @yoursite.production config-import

    CMI is alive and well

    One last thought, that confused the hell out of me initially: the "config" directory comes with two subfolders, "active" and "staging". In the default configuration, only "staging" will ever be used, and "active" will remain empty forever.

    Of course this language "default configuration" implies that there's some other not default strategy in which the "active" directory would be used. I haven't figured out how to do that yet, but I'm reasonably sure it would be done in services.yml.

    Note also that .gitignore is still shipping with Drupal core, but it's now called example.gitignore. The idea is that you copy example.gitignore to your git root, which would be above the drupal root, and then rename it ".gitignore" so that you're not modding core. For my first build, I just made my git root and my Drupal root the same directory, and renamed this file in place.

  • How do I deploy this thing?

    To deploy this to my server on Digital Ocean, I did the following:

    • Exported my database from my local with Sequel Pro.
    • Imported my database to production with Sequel Pro.
    • SFTPed my code to production with Cyberduck.
    • SSHed to production and edited my database credentials with Nano.
    • While SSH'ed to production, edited services.yml to match its default, production ready state.
    • Ran drush @matt.production cr
    • Navigated to my new site in Chrome. It works!
    • Navigated to the performance settings page in the Drupal UI, enabled caching, CSS aggregation, and javascript aggregation.

    One important note: if your server is currently running PHP 5.3 or lower, you're going to have to upgrade to 5.4 or greater.

  • How do I upgrade to php 5.4?

    Of course, the upgrade procedure is going to vary wildly with OS; for me, I was running OS X locally and Ubuntu on the server.

    Mac OS X

    On my local (Mac OS X 10.8.5) I installed PHP 5.4 with Homebrew and then edited my httpd.conf file to point at the new 5.4 binary provided by Homebrew:
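    The snippet that belonged here is missing. With Homebrew's php54 formula, the change amounts to a single LoadModule line along these lines (the exact path is an assumption and depends on your Homebrew setup):

```apache
# Load Homebrew's PHP 5.4 module instead of the system-provided PHP.
LoadModule php5_module /usr/local/opt/php54/libexec/apache2/libphp5.so
```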

    Then I restarted Apache with the command sudo apachectl restart


    I found this answer here.

    Ubuntu

        sudo add-apt-repository ppa:ondrej/php5-oldstable
        sudo apt-get update
        sudo apt-get upgrade
        sudo apt-get install php-pear php5-cli php5-common php5-curl php5-dev php5-gd php5-mcrypt php5-mysql php5-pgsql php5-xdebug
  • Why am I getting a white screen of death in production?

    Tons of possible reasons, but for me all I needed to do after importing my database was clear the cache with drush one time. I couldn't access any page (including the performance settings page with the "clear cache" button) until this was done.

  • Why is `drush cc all` not working?

    It's `drush cr` now. Stands for 'cache rebuild.' Also, you're gonna need to upgrade to Drush 7, more on that below.

  • Why isn't drush working at all with Drupal 8?

    You need to upgrade to Drush 7.

  • How do I upgrade to Drush 7?

    You need to install Composer to install drush. So do this:
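    The command block didn't survive in this post. As a sketch, the usual route at the time looked like this (the Drush version constraint is an assumption on my part; check the Drush installation docs for the current one):

```shell
# Install Composer globally (assumes PHP and curl are already available).
curl -sS https://getcomposer.org/installer | php
sudo mv composer.phar /usr/local/bin/composer

# Install Drush 7 via Composer (still a dev release at the time of writing).
composer global require drush/drush:dev-master

# Make sure Composer's global bin directory is on your PATH.
export PATH="$HOME/.composer/vendor/bin:$PATH"
drush --version
```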

  • OK so I upgraded to php 5.4, installed composer, installed drush 7, and then ran `drush cr` in production now what?

    Now your site is ready. Enjoy :-)

  • How do I integrate with Varnish?

    Much like Drupal 7, Varnish configuration is more or less out-of-the-box.

    Note that this page is trying to guide you toward using Varnish without really saying so. If you don't plan on running Varnish, you'll probably want to check "Use internal page cache".

    One thing I always do when working on a site for which I'm the only logged-in user is omit the "vary cookie" in settings.php.
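    The snippet itself is missing from this post; the setting (present, commented out, in D8's default settings.php) looks like this:

```php
<?php
// Don't vary cached pages on the session cookie, so a reverse proxy like
// Varnish can serve cached pages even when a cookie is present.
$settings['omit_vary_cookie'] = TRUE;
```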

    This bumps your varnish hit rate to just about 100%. In D7 it also broke user login. In D8, as far as I can tell, user login works even without the vary cookie, so there might be no downside here.

    Oct 22 2014
    Oct 22

    As I mentioned in my hello world post, I've been learning Ansible via Jeff Geerling's great book Ansible for Devops. When learning new technologies, there is no substitute for diving in and playing with them on a real project. This blog is, in part, the byproduct of my efforts to learn and play with Ansible. Yet embedded within that larger goal were a number of additional technical requirements that were important to me, including:

    1. Setting up a local development environment using Vagrant
    2. Installing Drupal from a github repo
    3. Configuring Vagrant to run said repo over NFS (for ST3, LiveReload, Sass, etc.)
    4. Using the same playbook for both local dev and remote administration (on DigitalOcean)
    5. Including basic server security
    6. Making deployments simple

    In this blog entry, we'll look at the first three requirements in greater detail, and save the latter three for another post.

    Configuring Local Development with Vagrant

    At first glance, requirement #1 seems pretty simple. Ansible plays nicely with Vagrant, so if all you want to do is quickly spin up a Drupal site, download Jeff's Drupal Dev VM and you'll be up and running in a matter of minutes. However, when taken in the context of the 2nd and 3rd requirements, we're going to need to make some modifications to the Drupal Dev VM.

    To start with, the Drupal Dev VM uses a drush make file to build the site. Since we want to build the site based on our own git repository, we're going to need to find a different strategy. This is actually a recent modification to the Drupal Dev VM, which previously used an Ansible role called "Drupal". If you look carefully at that github repo, you'll notice that Jeff accepted one of my pull requests to add the functionality we're looking for from this role. The last variable is called drupal_repo_url, which you can use if you want to install Drupal from your own repository rather than Drupal.org. We'll take a closer look at this in a moment.

    Installing Drupal with a Custom Git Repo

    Heading back to the Drupal Dev VM, you can see that the principal change Jeff made was to remove geerlingguy.drupal from the dependency list and replace it with a new task defined in the drupal.yml file. After cloning the Dev VM onto your system, remove - include: tasks/drupal.yml from the tasks section and add - geerlingguy.drupal to the roles section.

    After replacing the Drupal task with the Ansible Drupal role, we also need to update the vars file in the local repo with the role-specific vars. There, you can update the drupal_repo_url to point to your github url rather than the project url at git.drupal.org.
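    As a sketch, the relevant part of the vars file ends up looking something like this (the variable names come from the geerlingguy.drupal role; the values are the ones this post uses elsewhere):

```yaml
# Example overrides for the geerlingguy.drupal role's variables.
drupal_domain: "a-fro.dev"
drupal_core_path: "/var/www/a-fro.dev"
# Point the role at your own repository instead of git.drupal.org.
drupal_repo_url: "git@github.com:a-fro/a-fro.com.git"
```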

    Configuring Vagrant for NFS

    At this point, we would be able to meet the first two requirements with a simple vagrant up, which would provision the site using Ansible (assuming that you've already installed the dependencies). Go ahead and try it if you're following along on your local machine. But there's a problem, because our third requirement complicates this setup. Currently, Drupal gets downloaded and installed on the VM, which makes it hard to edit the files using our IDE of choice and to run the necessary Ruby gems like Sass and LiveReload.

    When I was initially working through this process, I spent quite a few hours trying to configure my VM to download the necessary Ruby gems so I could compile my stylesheets with Compass directly on the VM. The biggest drawback for me, however, was that I didn't really want to edit my code using Vim over ssh. What I really needed was to be able to share my local git repo of the site with my Vagrant box via NFS, hence the 3rd requirement.

    In order to satisfy this 3rd requirement, I ended up removing my dependency on the Ansible Drupal role and instead focused on modifying the Drupal task to meet my needs. Take a look at this gist to see what I did.

    Most of the tasks in that file should be pretty self-explanatory. The only one that might be surprising is the "Copy the css files" task, which is necessary because I like to keep my compiled CSS files out of the repo (more on this coming soon). Here's a gist of an example vars file you could use to support this task.

    One other advantage of our modified Drupal task is that we can now specify an install profile to use when installing Drupal. I currently have a pull request that would add this functionality to the Ansible Drupal role, but even if that gets committed, it won't solve our problem here because we're not using that role. We could, however, simply modify the "Install Drupal (standard profile) with drush" task to install our custom profile if that's part of your typical workflow. If I were installing a D7 site here, I would definitely use a custom profile, since that is my standard workflow, but since we're installing D8 and I haven't used D8 profiles yet, I'm leaving it out for now.

    The next step we need to take in order to get our site working correctly is to modify the Vagrantfile so that we share our local site. You might have noticed in the vars file that the drupal_css_path variable points to a folder on my system named "a-fro.dev", which is, not surprisingly, the folder we want to load over NFS. This can be accomplished by adding the following line to the Vagrantfile:

    config.vm.synced_folder "../a-fro.dev", "/var/www/a-fro.dev", :nfs => true

    Note that the folder we point to in /var/www should match the {{ drupal_domain }} variable we previously declared. However, since this is now pointing to a folder on our local system (rather than on the VM), we'll run into a couple of issues when Ansible provisions the VM. Vagrant expects the synced_folder to exist, and will throw an error if it does not. Therefore, you need to make sure to point to an existing folder that includes the path specified in {{ drupal_core_path }}. Alternatively, you could clone the a-fro.com repo into the folder above your drupal-dev-vm folder using the command git clone git@github.com:a-fro/a-fro.com.git a-fro.dev. Additionally, you will probably receive an error when the www.yml task tries to set permissions on the www folder. The final change we need to make, then, is to remove the "Set permissions on /var/www" task from provisioning/tasks/www.yml

    With this final change in place, we should now be able to run vagrant up and the site should install correctly. If it doesn't work for you, one possible gotcha is the task that checks whether Drupal is already installed. That task looks for the settings.php file, and if it finds it, the Drush site-install task doesn't run. If you're working from a previously installed local site, the settings.php file may already exist.


    This completes our first three requirements, and should get you far enough that you could begin working on building your own local site and getting it ready to deploy to your new server. You can find the final working version from this post on GitHub. In the next blog post, we'll look more closely at the last three requirements, which I had to tackle in order to get the site up and running. Thanks for reading.

    Oct 21 2014
    Oct 21

    At Studio.gd we love the Drupal ecosystem, and it has become very important to us to give back and participate.
    Today we're proud to announce a new module that we hope will help you!

    The Inline Entity Display module helps you handle the display of referenced entity fields directly in the parent entity.
    For example, if you reference a "Tags" taxonomy term from an Article node, you will be able to display the tags' fields directly on the article's manage display page. This becomes very useful with more complex referenced entities, like field collections for example.

    SEE THE MODULE: https://www.drupal.org/project/inline_entity_display


    - You can control, for each compatible reference field instance, whether the fields from the referenced entities are available as extra fields. Disabled by default.

    - You can manage the visibility of the referenced entities' fields on the manage display form. Hidden by default.

    - View modes are added to represent this context and manage custom display settings for the referenced entities' fields in this context: {entity_type}_{view_mode}. Example: "Node: Teaser" is used to render referenced entities' fields when you reference an entity into a node and view that node as a teaser. If there are no custom settings for this view mode, fields are rendered using the default view mode settings.

    - Extra data attributes are added on the default field markup so that fields of the same entity can be identified.

    Compatible with Field group on manage display form.

    Compatible with Display Suite layouts on manage display form.


    - Entity API
    - One of the compatible reference field modules.


    The simplytest.me install of this module will come automatically with these modules: entity_reference, field_collection, field_group, display suite.

    SEE THE MODULE: https://www.drupal.org/project/inline_entity_display

    We are currently developing a similar module for Drupal 8, but more powerful and more flexible. Stay tuned!

    Oct 21 2014
    Oct 21

    I'm a fan of American Football. In fact, it's the only sport I watch. This is probably a good thing since I'm a Boston native and my wife is a fan of the other team. Once I was talking geek, as I tend to do, during a game's halftime and someone asked me, "Wait? What is responsive web design?" That day, a metaphor was born.

    There's a misconception in our business that a responsive website and a mobile website are the same. They couldn't be more different. Sure, a properly designed responsive experience is "mobile-friendly," but it's decidedly not a mobile website. So what is it? Put simply, without any technical jargon thrown in, it's this: same content. Tailored experiences.

    Users expect different experiences in everyday life. When a viewer consumes an NFL game, they can do so through a variety of mediums. Let's make the assumption that the viewer is not actually at the game for this example. How could someone consume the content of their favorite team?

    • HDTV

    • An official (or unofficial) NFL app on their mobile device.

    • Live Radio Broadcast

    • Newspaper (yes, kids, these still exist) the following morning.

    In each of these mediums, a viewer can consume the content most important to them.

    There's no expectation for the person who reads the newspaper or listens to the radio broadcast that they will have the same experience as a viewer on an HDTV. They expect the relevant content to be there but understand some additional features will be lost to them. After all, it wouldn't be much of an article to waste space talking about the guy in Row 11 who had the blue face paint when you have precious column inches (pixels) to communicate the important parts.

    In this way, it's the job of the game announcers, journalists and development staff to ensure that the content is distributed to the audience in a meaningful way while being tailored to the delivery channel. Newspaper readers don't call the editors the next morning and ask why the game wasn't printed frame by frame in the Sports section. Radio listeners don't call in and ask why every score from every game wasn't read to them over the air as they might see in the ticker at the bottom of a screen. Why should we, as users, expect that our experience on Internet Explorer 9 for a desktop will exactly match that in a modern iPhone running Safari?

    That's the key. Deliver the content while simultaneously accepting that users will experience the content in a different way based on the capabilities of the device they're using. You'll save money and your developers will save hair.

    Fill out the form below to receive our free ebook on Responsive Web Design and learn how to improve your users' experience.

    e-book Form

    Oct 20 2014
    Oct 20

    Last week the Drupal security team announced the existence of a major security vulnerability in all versions of Drupal 7. This vulnerability is rated as “highly critical” because it allows an attacker to take full control of your site remotely, without needing to log in as a privileged user. Attacks using this vulnerability are already being reported.

    If 4Site built your Drupal 7 site, if we handle your site maintenance, or if you're just looking for someone to help you apply the update on your site and keep your site secure, please contact us!

    Oct 17 2014
    Oct 17

    I was going to write a blog about Drupalcon Amsterdam and our commitment to Drupal and then I realized the best way to say it was to show it.

    Thursday, October 16, 2014

    Memo to all staff:

    I am pleased to announce that starting this quarter Blink will significantly increase our efforts in support of Drupal. 

    As you may know, back in June we hired our first full time contributor to the Drupal project - Drupal 8 Solutions Engineer Jesus Olivas. 

    Now we’re setting the stage for everyone - all 100+ of us - to become a contributor. 

    In the next few weeks you’ll see an announcement inviting everyone in our NJ office to attend a special Friday sprint prep session to get us ready to sprint the next day around core Drupal 8 issues. Our goal will be to repeat this once a month and get as many participants as possible as often as possible. 

    After NJ we’ll duplicate this effort in our other offices in MA and Europe. We’ll consider remote sprints too - but we’ll have to figure out some way to have the pizza delivered. ;-) 

    In addition to these sprints I’m asking our Solutions Team to add a core contribution track to our technical mentorship program - this means that when utilization falls below set goals we want you to be ready to work on the issue queues. 

    In combination with the regular sprints I think we can collectively make a significant contribution to Drupal. 

    This isn’t all. You’ll hear more very shortly from me, the solutions team and Ray - our very own Drupal Evangelist. 

    We’ll roll out special incentives to encourage contributions, help get the ball rolling and keep building momentum. 

    Finally I want to encourage everyone to watch Dries’ keynote from Amsterdam.  

    Though we participated in the European Drupalcon in the past, this was the first time I was able to attend. 

    Dries’ keynote and meeting a whole new great Drupal community inspired me to accelerate the community contribution plans we started earlier this year. 

    If you watch it I think you’ll be inspired too. 

    Nancy Stango, CEO

    Oct 09 2014
    Oct 09

    We all know the importance of backing up the database for each Drupal site we build and maintain. But it is not uncommon for this to be put on the back burner and never actually implemented. Fortunately, it is really easy to set up with a combination of Amazon S3 and Backup and Migrate.

    Why Amazon S3?

    Some people back up to the same server that the database is on using a cron job. While backing up to the same server is better than not backing up at all, it is better to back up to an external location. Amazon S3 is a popular choice for this.

    Create Amazon S3 account

    If you don't have an Amazon S3 account, you can create one here: http://aws.amazon.com. Amazon offers a lot of services as part of AWS and you will be presented with all of them. All you need is S3.

    Find the S3 option in the AWS Management Console

    You will need to create a bucket for your database backups. Bucket names must be globally unique, so no other account can have the same bucket name. You might want to prepend your site name to the bucket name to ensure it is unique. You will find more information about buckets in the AWS documentation.

    When you create the bucket, you will need to select the region. It is generally a good idea to choose one that is in the same region as you, or as close as possible. This will reduce latency and also (probably more importantly for backups) ensure you comply with local regulations. For example, if you are inside the EU and your database contains user data, that data should be stored within the EU (or a safe harbour if outside the EU). Therefore, EU based residents or businesses might want to choose Ireland, which is Amazon's location within the EU.

    Select a region

    Now that you have your Amazon S3 account and bucket, you can integrate it with Drupal.

    Setup Backup and Migrate

    The easiest way to do this is using the wonderful Backup and Migrate module.

    Here is how:

    1. Install Backup and Migrate

    Install the Backup and Migrate module
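    If you prefer the command line, the same thing can be done with standard drush commands:

```shell
# Download and enable Backup and Migrate.
drush dl backup_migrate
drush en backup_migrate -y
```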

    2. Backup and Migrate - Destinations tab

    Go to the Backup and Migrate settings page. You will find it at Configuration > System > Backup and Migrate.
    Go to the Destinations tab and click on Add Destination and then Amazon S3 Bucket.

    Fill in the Backup and Migrate S3 settings

    You are going to need the following from your Amazon Web Services account:

    • Host: s3.amazonaws.com
    • S3 Bucket: The bucket name you created above
    • Access Key ID: Go to Security Credentials (click on your name in the top navigation), expand Access Keys (Access Key ID and Secret Access Key) and create new access key.

    Find the Security Credentials link

    Expand the Access Key group

    3. Create a Schedule

    You can create a schedule, which is the frequency the backup will run. You can create a new schedule from the Schedules tab in the Backup and Migrate settings. Decide on a frequency that makes sense for your site. If you publish content daily, or multiple times per day, you might choose to backup daily. If you publish weekly, then a weekly backup is probably fine.

    You can also set the number of backups to keep. Leave this as 0 if you want to keep all of them. But remember, you are paying Amazon to store these backups, so you might want to set a limit.

    In order for backups to run, you need to have cron running. The easiest way to do this is to head over to Configuration > System > Cron. Set the frequency under Run cron every.

    Drupal cron settings

    Cron is used to run a variety of tasks and the frequency that you run it will depend on your specific needs. For my site, I run cron once a day. It is not uncommon for cron to be run hourly for sites with more frequently added content.

    With the cron frequency set, it will get triggered to run automatically when people visit the site. This can slow the site down, and you may elect to run cron overnight instead. This is outside the scope of this tutorial, so head over to the Drupal handbook for more information.

    The backup source will normally be the Default Database, and the Settings Profile the Default Settings.


    Restoring a database is a one-click job. Beware though: you will lose any content that has not yet been included in a backup. To see a list of all of the database backups currently held in S3, go to Destinations and click on List files next to S3 Backup. From here you can download a backup locally, restore it to the site right away, or delete it.

    This will actually show all files that are currently stored in your S3 Bucket. If you only want to see database files from the current site in this list, then you'll need to create a dedicated S3 Bucket for it.

    Using the database locally

    One of the advantages of using Backup and Migrate is that you can easily pull down a database from the live site and restore it to your local development site. That way, you get the same content that the live site has.

    Wrapping up

    Having a robust backup strategy is a critical part of running a website. As this tutorial has shown, there are tools that make the process relatively painless to set up and run on autopilot.

    Oct 06 2014

    Let’s face facts: I am not a coder. With a lot of caffeine, much googling and more time than is reasonable, I *can* code my way out of a paper bag, but that’s about it.

    So it is highly unlikely you will ever see my username tied to a module or on a list of contributors. Sure, I create the occasional new issue on a module’s issue queue or provide feedback for a patch I needed, which in itself is a form of contributing. But messing around with core? Funny. Me writing a patch? Nope. Sprinting? I only run when being chased.

    According to d.o then, I am not a contributor. 

    The outward problem with this is that the language around contributing back to Drupal is code-centric. The current system places emphasis on how many commits you have and how many projects you maintain. But there is hope for those of you who, like me, won't be contributing back code anytime soon. 

    I am a co-organizer for the Fox Valley Drupal Meetup Group in the western suburbs of Chicago. We held our first camp in 2013 and I was part of the team that helped pull it off, and we recently wrapped our 2014 camp.

    When the idea of the inaugural MidCamp was getting kicked around, I offered up my logistics help for that as well. And I'm on deck as the logistics lead for 2015.

    Through my non-Drupal day job, I have extensive print experience and do a fair amount of video production work tied to the annual conference we host. So I was all over session records for all three camps, and I'm working on a rebooted session recording kit that the Drupal Association is very interested in learning more about. 

    My print skills have been tapped by the core mentor team, mostly because I was hanging around a bunch of them at Drupalcon Austin and they needed materials printed for the mentored sprints at Drupalcon Amsterdam. 

    Hell, I even got roped into catering the extended sprints at Austin mainly because I am passionate about food, especially when it comes from something with four wheels and an engine.

    My point: there are many opportunities to give back to the community and the project as a whole in real life. It took me a while before I realized that yes, I am a contributor. Just not in a way that is currently measured. But that's not why I do it. I am forever indebted to all the heavy code lifters that I depend on for my work. It just feels good to be able to give back.

    So while it’s highly unlikely you’ll ever see any kind of percentage powered by kthull on a Drupal site, I’ll continue to lend my time and talents where I can. You should too.

    Oct 03 2014

    This is a continuation of the discussion started here: http://bit.ly/DrupalAVKits

    The session record kits we tested at DrupalCamp Fox Valley 2014 show a lot of promise for easy-to-use, affordable recording stations. There are some issues that need to be worked out and some additional testing to be done before we can approach the Drupal Association to consider making them available for camps.

    While most sessions were recorded flawlessly, we ran into a few issues:

    • One presenter laptop (MacBook Air) never successfully made a connection, but luckily we were able to capture a QuickTime screen record
    • There is no indicator of the audio levels, and three sessions were lost due to no audio
    • The record is stopped if the presenter laptop goes to sleep, so we lost a session due to that
    • The touch audio panel is visually misleading to presenters, and very touch sensitive
    • There is only one audio input, so to record multiple presenters, we need to test a small mixer to accept multiple inputs and output one audio channel to the recorder
    • The projector must be able to take a 1920x1080 signal
    • The VGA to HDMI adapter didn’t hold a tight connection to the VGA cord for the projector, so we scored some zip ties from the venue AV department. This was inconvenient when we had to switch out the cords
    • The audio was a bit too quiet, so we should have used the +20db boost for the records
    • There is a detectable clicking on some of the audio records, though we can’t say why
    • Additional dongles need to be purchased and tested to capture from various tablets for presenters that come in without a laptop

    Next Steps

    Before this can be ready for prime time, the audio issue definitely needs to be overcome. I’m hoping to find a digital audio recorder that can feed audio out, which would then pipe into the recording device. This would give direct feedback via the audio recorder as well as a backup audio channel. If we had this, we could have saved three session records, since the video capture was perfect. 

    The mics proved to be the weakest link. Three sessions were lost because of no audio channel. Hard to say why, but it’s possible that the on/off switch was inadvertently switched off after initial setup, or the unit was muted (seems less likely). It would be worth testing if a portable audio recorder can feed audio in to the record device. That would also overcome the issue of multiple presenters.

    Also, with better communication with the presenters, after we hook up the kit, they can be in charge of the start/stop of the record, since that big red “easy” button can’t get any simpler. Maybe a simple printed sheet listing the various indicators on the device. This would eliminate the need to trim and re-process in post.

    Ultimately, with predictable/boosted audio and no need of trimming, session videos could be uploaded directly from the thumb drives.

    Bonus points if there is a converter out there to take the 1080 signal out of the recorder and downsample it for older projectors.

    The Beta Kit

    Record Device - $140
    Hauppauge HD PVR

    This device provides a pass-through record of the presenter’s laptop directly onto a USB thumb drive. The movie format is an H.264 1920 x 1080 AAC 30 FPS MP4 video. 

    The unit takes HDMI or component video (with a provided adapter cable) in and provides HDMI out. For audio, there is a 3.5mm microphone jack. To start and stop the record, you basically push the big red button. The audio touch panel lets you mute/unmute the microphone, increase the volume, and add a 20db boost. 

    Powered Microphone - $32 
    Audio-Technica ATR-3350 Lavalier Omnidirectional Condenser Microphone  

    HDMI to VGA (connects to projector) - $10
    VicTsing 1080P HDMI Male to VGA Female Video Converter Adapter Cable For PC Laptop DVD HDTV PS3 XBOX 360 and other HDMI input

    VGA to HDMI (support non-HDMI PC laptops) - $25
    IO Crest VGA to HDMI Convertor with Audio support (SY-ADA31025)

    Mini Display Port to HDMI (support non-HDMI Mac laptops) - $10
    PNY A-DM-HD-W01 Mini DisplayPort to HDMI Adapter

    Additional Untested Equipment

    For 2-4 presenters, if a standalone digital audio recorder does not work

    4-channel mixer - $17.44
    Nady MM-141 4-Channel Mini Mixer

    1/8” to 1/4” - $2.30
    Hosa GPM-103 3.5mm TRS to 1/4" TRS Adaptor

    1/4” to 1/8” - $1.95
    Hosa Cable GMP386 1/4 TS To 1/8 Inch Mini TRS Mono Adaptor

    Various tablet support and alternate dongles

    Cable Matters SuperSpeed USB 3.0/2.0 to HDMI/DVI Adapter for Windows and Mac up to 2048x1152/1920x1200 in Black - $47.99

    VicTsing Dock to HDMI AV Cable Adapter for iPhone 4 4S iPad 1 2 New iPad (1080P) - $11.99

    Lightning Digital AV Adapter - $43.37

    EnjoyGadgets Thunderbolt to HDMI Video Adapter Cable, with Audio Support - $9.98

    Micro HDMI (Type D) to HDMI (Type A) Cable For Microsoft Surface - $5.99

    Skiva MHL Micro USB to male HDMI cable (6.5 feet) for Samsung Galaxy S3 (SIII LTE i9300 L710 i747 i535 T999), Samsung Note 2, Galaxy S2, Galaxy Note, HTC One X, LG Optimus HD and other MHL Devices (HD-X3) - $11.99

    LinkS Micro USB to HDMI MHL cable + Micro 5pin to 11pin adapter + 3 Feet Charging Cable in Black Kit (compatible with any MHL-enabled smartphones and tablets) - $12.99

    Cable Matters Gold Plated DisplayPort to HDMI Adapter (Male to Female) with Audio in Black - $11.99

    Samsung ET-H10FAUWESTA Micro USB to HDMI 1080P HDTV Adapter Cable for Samsung Galaxy S3/S4 and Note 2 - Retail Packaging - White - $28.91

    USB A Male to Mini 5 pin (B5) Female Adapter - $2.97

    Various HDMI converters - $13.99
    AFUNTA Hdmi Cable Adapters Kit (7 Adapters)

    HDMI cable - $7.69
    Twisted Veins 1.5ft High Speed HDMI 3 Pack

    VGA to component video (would still need component video cables) - $7.24
    StarTech.com 6-Inch HD15 to Component RCA Breakout Cable Adapter - M/F (HD15CPNTMF)

    Personal Voice Recorder Option

    Personal Voice Recorder with audio line out - $160
    Zoom H2n Handy Recorder

    Zoom APH2n Accessory Pack for H2n Portable Recorder - $40
    AC adapter, case, wired remote, tripod, and other goodies

    3.5mm audio cable - $9
    FRiEQ® 3.5mm Male To Male Car and Home Stereo Cloth Jacketed Tangle-Free Auxiliary Audio Cable (4 Feet/1.2M)

    32GB SD Card - $17
    SanDisk Ultra 32GB SDHC Class 10/UHS-1 Flash Memory Card Speed Up To 30MB/s

    Oct 03 2014

    WYSIWYG editors are the bane of my existence, yet they are a necessary evil if you have clients that want to edit their site content.

    But somewhere between all the inline styles they create to editing source code, there simply had to be a solution that would let me open up my theme css to content creators. 

    After much searching and testing, I have found that unicorn. 

    CKEditor populates the style drop down menu from a js file, and it lets you override it. Problem is, doing it as stated in the docs doesn't work. But there were a few workable options posted in the CKEditor module issue queue. 

    First off, you create a new js file containing a function that builds the drop down select items and the parameters for each. Contrary to what has been suggested in various posts, I put this in my theme's js folder. That way it won't get overridden by a module or library update, and it just makes sense since it's tied to my theme. Only CKEditor will be looking for this file, so there's no need to call it in your template files nor add it to your .info file. 

    For example, I created a file named ckeditor_styles.js like so:

    ( function() {
        CKEDITOR.stylesSet.add( 'my_styles', [ // this is the styles set name you will call later
            { name: 'Teal Heading 2', element: 'h2', attributes: { 'class': 'teal' } }, 
            { name: 'Teal Text', element: 'span', attributes: { 'class': 'teal' } },
            { name: 'Unbold Heading', element: 'span', attributes: { 'class': 'unbold' } }
        ] );
    } )();

    It's pretty straightforward. The name parameter is what you will actually select in the drop down. The element is where you specify where to inject the class. If it's a block-level element (h1, h2, div, p, etc.), the class will be added to it. If it's a span, then the selected text will be wrapped in a classed span. As for attributes, that's where you specify that you are calling a class, and also provide the name of the class you want to inject. The resulting drop down will be split into block styles and inline styles.
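
    To see the data shape that CKEDITOR.stylesSet.add() receives, here is a runnable sketch with a stubbed CKEDITOR object. The stub is purely illustrative; the real object comes from the CKEditor library itself.

    ```javascript
    // Stub of the CKEDITOR global, for illustration only; the real
    // object is provided by the CKEditor library itself.
    var CKEDITOR = {
      stylesSet: {
        sets: {},
        add: function (name, styles) { this.sets[name] = styles; }
      }
    };

    // The same registration shape used in ckeditor_styles.js.
    CKEDITOR.stylesSet.add('my_styles', [
      { name: 'Teal Heading 2', element: 'h2', attributes: { 'class': 'teal' } },
      { name: 'Teal Text', element: 'span', attributes: { 'class': 'teal' } }
    ]);

    // Block-level elements get the class added in place; a span wraps
    // the current selection, so the element name decides block vs inline.
    var blockTags = ['h1', 'h2', 'h3', 'h4', 'div', 'p'];
    var kinds = CKEDITOR.stylesSet.sets['my_styles'].map(function (style) {
      return blockTags.indexOf(style.element) !== -1 ? 'block' : 'inline';
    });
    console.log(kinds); // prints [ 'block', 'inline' ]
    ```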

    The second step is to let CKEditor know where to find this file, via the advanced options section in the configuration. Navigate to admin/config/content/ckeditor and edit the profile you wish to add this to, most likely Full HTML. Docs will say you only need to set config.stylesSet, but as gleaned from the issue queues (and tested personally), you need to also set config.stylesCombo_stylesSet.

    Expand the Advanced Options field set and add the following to the Custom JavaScript Configuration with your styles set name and the path to your js file:

    config.stylesCombo_stylesSet = 'my_styles:/sites/all/themes/mytheme/js/ckeditor_styles.js';
    config.stylesSet = 'my_styles:/sites/all/themes/mytheme/js/ckeditor_styles.js';

    Clear your caches and you should now be able to pick styles from your drop downs that will add either standard elements or spans with the desired classes. 
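
    For example, applying 'Teal Heading 2' to a heading and 'Teal Text' to a text selection would produce markup along these lines (the content is illustrative):

    ```html
    <h2 class="teal">Our services</h2>
    <p>We build <span class="teal">modular</span> Drupal sites.</p>
    ```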


    Oct 03 2014

    At DrupalCorn Camp 2014, there seemed to be a fairly high number of camp organizers in attendance, so we held what I like to think of as a SuperBOF. I think we pulled four banquet tables together in order to fit everyone.

    The purpose was to share pain points and just brainstorm camp stuff. Notes were taken and the doc is shared publicly here: http://bit.ly/drupal-camps

    Most of the discussion was centered around information sharing and coordination of efforts and how to accomplish that. We had thought that creating a private group on g.d.o for organizers to share not-ready-for-prime-time information would work. Turns out, that's not the case, as "private" only means there is moderation on who can join, but all posts are fully public.

    Why a private group? Well, mostly for shared contact lists, proposed dates for coordinated planning before announcements, things like that. The intent also is to publicly share as much knowledge as we can, but in a centralized place that's a little less cumbersome than g.d.o.

    In addition, we created a #drupalcamp IRC channel, and you should totally join if you are a camp organizer. 

    So stay tuned, add your name to the doc if you want to be included on the proposed quarterly meeting, and join the channel so collaboration can start now!

    Sep 29 2014

    Drupalcon Amsterdam kicks off today and it looks like it is going to be a great event, especially with Drupal 8 Beta about to be released! Sadly, we can't all be there. But that doesn't mean we have to miss out entirely.

    Watch it live

    There is a live stream of the event, which you can view out here: https://amsterdam2014.drupal.org/live

    You'll probably want to check out the keynotes, which are on Tuesday (Drupal founder Dries Buytaert) and Wednesday (Cory Doctorow) at 9am local time.

    Times for the keynote:
    London: 8am
    Lagos: 8am
    Amsterdam: 9am
    Cairo: 9am
    Delhi: 12:30pm
    Tokyo: 4pm
    Sydney: 5pm
    Wellington: 8pm

    Watch it later

    You can normally view the sessions at a later date via the schedule page: https://amsterdam2014.drupal.org/program/schedule

    Alternatively, check out the Drupal Association's YouTube page: https://www.youtube.com/user/DrupalAssociation

    Follow the official social media channels

    If you want the latest news about what is happening, the best source is probably the official Drupalcon twitter account: https://twitter.com/drupalconEUR

    All the official social media channels

    Google+: https://plus.google.com/102324880700983313003/posts
    Twitter: https://twitter.com/drupalconEUR
    Facebook: https://www.facebook.com/DrupalCon
    Flickr: https://www.flickr.com/groups/drupalconamsterdam

    You can also keep an eye on the Twitter hashtag: #DrupalCon

    Meet others in your local area

    With Drupal camps, user group meetups, sprints and other events happening all over the world, you can still meet up with like minded people even if you can't make it to DrupalCon. See if there is an event near you here: http://www.drupical.com

    Sep 23 2014

    I want to go to all the...

    But, there is so much awesomeness happening at DrupalCon that there are conflicts. So, here is my actual schedule:

    Saturday and Sunday before






    • Sprints! They are for everyone.
    • We will have lots of mentors.
    • Mentor Thank-you dinner for the mentors who have helped plan and mentored on Friday.

    Saturday and Sunday after


    Focus and Sprinting

    Thankfully ... there are extended sprints Saturday and Sunday before and after. A chance to connect with people and get work done. We try very much to work in teams and pairs, so please join us! You will have a good time, learn a lot and be super productive. Read more about extended sprints. If you can't join us in Amsterdam, then please do at the next event you attend. Remember there are *often* extra sprint days at events, so book your travel to give you extra time to sprint! 

    Sep 16 2014

    On Sept 15, 2014, Searchmetrics released their 2014 Ranking Factors Study. In it, they analyzed 10,000 search results and created correlations between characteristics of websites and their rankings. In other words, websites that rank high do x. Sites that rank low do y. For this blog post, I’m leaving out things like Backlinks (factors 4, 9, 12, etc.) because - as far as I know - there just aren’t that many modules or settings that can help you with it.

    Now, with all the usual caveats about correlations not equaling causation, here’s a list of their top correlated ranking factors that can be influenced with the proper use of Drupal and/or a module. (A quick note about correlations. Um...NM. Just read this.)

    Factor 1: Click-Through Rate

    People that click in the search engines, want to visit relevant and interesting websites.

    Correlation: .65 (Pretty Strong)

    Now, take this with a grain of salt. Of course sites with high rankings have a high click-through rate. They're at the top of Google. Still, there are some things you can do to increase your click-through rate and that's never a bad thing.

    How to influence your website's click-through rate in Google.

    Make your listing in Google as interesting as possible to make it stand out from everyone else. Use your target keyword at least once in the title (Factor 45) and in the description (Factor 40). Make sure the keyword is used as close to the beginning of the Title tag as you can (Factor 27 & 29). Google bolds words that match the search so your listing will stand out.

    Google Search Appearance Overview

    Module(s) that increase your click-through rate:

    • Metatag - Write great, optimized Title Tags and great Meta Descriptions (Factor 35).

    • Custom Breadcrumbs - If they’re available, Google search results will list breadcrumbs instead of the URL. It looks nicer.
      source: loseyourmarbles.co

    • Schema.org - Highlights events or product ratings that will make your listing stand out and give you extra links in the search results.

    Factor 2: Relevant terms

    People search for topical content, not just specific keywords. Including keywords that are not exact or are on related topics can help your rankings.

    Correlation: .34 (Weak)

    How to increase the number of SEO relevant terms on your Drupal website.

    Think about topics and organization based on topical areas, not just keywords. Create topical silos in your site content. Write your content using a list of terms, not just a single term.

    Module(s) that increase the SEO relevant terms on your site:

    • Path & PathAuto: Create paths that naturally organize your content by topical areas.
      Pathauto example

    • Taxonomy: Tag content with appropriate terms. Tags link to term pages. Term pages link to related content. That connection helps.

    Factor 3: Google +1

    People love to share great content so top ranking content tends to have a lot of shares. This also encompasses Facebook Shares (Factor 5), Facebook Total (Factor 6), Facebook Comments (Factor 7), Pinterest (Factor 8) Facebook Likes (Factor 10), and Tweets (Factor 11). Social is very important to SEO!

    Correlation: .33 (Weak)

    How to increase your social shares on a Drupal website

    Write great, unique, sharable content. Make it easy to share by sharing it first. (Retweets and likes are easier than sharing it yourself.)

    Module(s) that increase social sharing on Drupal

    By the way…if this blog post is helpful, please share it to your favorite social network! :)

    (Note: Factor 4 - 18 are almost all either Linking or Social. These are very important factors that are outside the scope of this article.)

    Factor 18: Number of Internal Links

    Linking to yourself is a good indicator of the quality of a piece of content.

    Correlation: .16 (Very Weak)

    How to increase the number of internal links

    Link to your own great content! Use keywords in your internal links for extra credit. (factor 30)

    Module(s) for internal linking on a Drupal website

    • aLinks - Use this module judiciously. For example, set up links to your taxonomy term pages for your top keywords or topics.

    • Menu - Build menus of great content. Use them throughout your site. Those links are valuable!

    • Taxonomy - As mentioned above, tag your content. Drupal automatically creates the links.

    • Solr's More Like This - Adds links to related content using Apache Solr.

    Factor 20: Keywords in the Body

    It’s just logical. If you want to rank for a certain term, you’ve got to have that term on the page.

    Correlation: .15 (Very Weak)

    How to use keywords in the body

    Use the target keyword once or twice in the body field of each node. Don’t write like a robot, though. That’s bad.

    Module(s) to increase keyword use in the body

    Factor 21: HTML Length

    Longer articles tend to rank better than shorter ones. I’m going to lump Text Character Length (factor 22) and Word Count (factor 23) in here as they’re practically the same correlation and meaning.

    Correlation: .14 (Very Weak)

    How to increase HTML Length

    Write longer content. (Seems pretty obvious...)

    Modules(s) to help you write longer content

    • Rules or Workbench would allow you to create workflows that require a certain body length.

    • Field Validation module could be set to require a certain length. Seems draconian to me but certainly possible.

    Factor 24: Site speed

    People don’t like to wait so don’t make them!

    Correlation: .11 (Very Weak)

    How to increase your Drupal 7 website speed

    Make your pages lean and mean. Use site speed testers available online, such as the one in Google Webmaster Tools or (my favorite) the one built into Chrome (hit command-i). Fix any problems or suggestions.

    Module(s) that speed up Drupal 7

    That’s it! Covering those 21 factors (7 major factors with another 14 mixed in for good measure) should be fairly straightforward for any Drupal 7 website owner. There are other factors as well but with correlations weaker than very weak, I’m just not sure they matter that much. Read about correlations here, by the way.

    Miscellaneous SEO Factors and the Drupal Modules that affect them

    Here’s a quick shotgun list of a lot of the remaining low-correlation factors and modules that might help.

    Did I miss anything? Let me know in the comments.

    Here's the full infographic if you'd like to see for yourself:

    Sep 10 2014

    The Institute for Market Transformation (IMT) is a Washington, DC-based nonprofit organization promoting energy efficiency, green building, and environmental protection in the United States and abroad. They approached 4Site to redesign the BuildRating.org website, an international exchange for information on building rating disclosure policies and programs, from the ground up.

    Sep 05 2014

    You know how every box of LEGO® comes with that pretty picture on the front of what it’s supposed to look like at the end...and how you might build it that way once, just once, to see how it looks, before tearing it down and making a spaceship out of the oxcart? It’s the same with websites. We all start off with this great picture in our mind of what a website will look like. And yes, the website might actually make it to that glorious utopia...until someone decides, no, they don’t actually like that thing there. No, this should be bigger. Or smaller. “I know I asked for an oxcart, but please tear it down and make a spaceship out of it.”

    At 4Site, when we go about ‘designing’ a website, we think of the website as a sum of parts--planning ahead to how the site’s going to look at the end, how it’s going to feel. It’s not just a set of fields that then get some css (or scss!) and js/php applied. It’s something that’s eventually going to be touched by dozens, hundreds, thousands of people, both as users and as content managers behind the scenes. It’s something that needs to be flexible. Something that can move on the fly, something that can be used on a wide-variety of platforms, in a variety of uses, by a variety of users. Something easy. Something modular.

    Like LEGO®.

    That’s the beauty of Drupal and other modular CMS. You can take these pieces and make damn near anything out of them. So where is this metaphor leading? You need a great workflow to start, a great system to work within. You need your giant tub of pieces.

    Once you have your basic “what do we need?” question answered, you can expand on what else you might want. The beauty of Drupal is that you have this extraordinarily flexible system--you can modify your content types on the fly. You can add taxonomies if you want to suggest related content to users. Your layouts are flexible, so you can use them to organize all types of content, albeit with a few minor tweaks. Add some fields to your content types and you can build a brand new view. Add some taxonomy terms and you can link your leaders together and allow them to connect with people that share the same interests.

    Use chunks, not blobs - structured content in your CMS; build it like LEGO

    The point of this is that you need the pieces first. To build a spaceship, you need to start with the 4x4 squares, the crystal cylinder things, and the LEGO® head with the 70s hair. It’s the same with websites. Karen McGrane says to use chunks, not blobs. So you break down your content into consistent and logical pieces. And that will give you the most flexibility on the back end to organize and display that information, for different audiences, on different platforms, and over time as your needs evolve.

    What don’t you want to do? You don’t cram every last bit of information you have into a WYSIWYG, because if you need it later, too bad, you’re writing it again. That would be akin to gluing your LEGO® together. If you ever want to use those pieces for anything else, too bad.

    If you had a custom field or post type, you’ve got that piece already at your fingertips, ready for use wherever and whenever you want. You need that information later? No worries, it’s a custom post type. It’s a taxonomy reference. It’s right there in front of you. You decide you want your author on something else? It’s a term reference. An entity reference. A node reference.

    So go build your castle or spaceship, your saloon or alien moon-base. Just make sure you have your big tub of pieces first, in case you decide later you want to add a mean laser cannon on the top. Or a Mars rover. I don’t know. You're the designer. 

    We just provide the pieces.

    This blog post is an extrapolation of a session I presented at FuseCon 2014. Please let us know if you'd like any help sorting out your content into manageable, reuseable pieces.

    Sep 04 2014

    The Drupal Features module covers a lot of our needs with automating the deployment of database settings for Drupal 7. It allows you to export configuration to code and nicely wrap it up as a module. This means you can easily deploy your changes to the live site (or a staging site) without having to repeat the configuration changes. It also means you can apply the changes to a different site.

    If you are new to Features, check out my Drupal 7 Features tutorial.

    But using Features is not always the best approach. Even if you can export something using Features, it doesn't mean you should.

    An example

    Consider the following: You have a setting in an admin form that you want to deploy to the live site. For example, the site name and slogan, which you can set in Administration > Configuration > System.

    Drupal site name and slogan fields

    You can add that to a feature by using the Strongarm module. Strongarm allows you to add any Drupal variable to a feature. System settings like the site name and slogan are stored as Drupal variables. (If you would like to learn how to create your own admin form that stores variables like this, check out my post Create your first Drupal admin interface.)

    But what happens when the administrator wants to change the site name or slogan setting on the live site? The feature becomes overridden because the setting stored in the database on the live site no longer matches the setting stored in code in the feature.

    At some point in the future, you do another deployment to the live site. You have updated the same feature again and you decide you need to revert the feature for its changes to take effect (it is generally good practice to do a features revert all on every deployment). At this stage, you are blissfully unaware that the administrator has changed a setting that happens to be stored in the feature you just re-deployed.
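
    If you use Drush (an assumption; the Features module provides the command below through its Drush integration), that revert-all step is typically a one-liner:

    ```
    # Revert every enabled feature to the state stored in code.
    # Any overridden database settings they contain are lost.
    drush features-revert-all -y
    ```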

    The site name and slogan setting change that the administrator made is gone. Nuked. Destroyed. Why? Because you just reverted the feature, which reverts any database settings that the feature controls back to the settings that are stored in code. You probably won't even realise that you have destroyed the administrator's setting change because you fully intended to revert the feature.

    You are going to end up with one unhappy administrator. Best case scenario? The administrator informs you so you can fix it. Worst case scenario? He/she doesn't notice and you have a bug. The middle case scenario is that he/she changes it back but doesn't tell you. You then do another deployment in the future and it gets reverted again. The administrator thinks it is a weird glitch in the matrix and changes it again. This song and dance can carry on for quite a while before the administrator mentions it to you. I have seen this happen a lot.

    The Solution

    Right, so we know the problem. How do we solve it?

    Simple: Don't export settings in a feature if you fully intend for it to be changed on live.

    Now I am sure you are thinking that this means going back to the old school way of manually updating the live site with the setting when you first do the deployment. Not the case. The goal is still to do as much in code as possible, without having to make manual configuration changes to the live site.

    To set an admin setting during deployment, you just need to use the variable_set() function in a deployment module.

    Create a site deployment module

    Create your module with these three files:

    1. site_deployment.info

    2. site_deployment.module

    3. site_deployment.install

    The info file

    The info file should contain the following:

    name = Site Deployment
    description = Deployment module, used to automate changes to this site
    core = 7.x
    version = "7.x-1.x-dev"

    The module file

    The module file doesn’t need to contain any working code. We will simply include a comment.

    <?php
    /**
     * @file
     * Module file for Site Deployment
     */

    The install file

    In the install file, you write an implementation of hook_update_N() for each deployment. This is the heart of the site deployment module.

    Let’s take a look at the code for the first function.

    <?php
    /**
     * @file
     * Install file for Site Deployment
     */

    /**
     * Deployment function for 1st deployment.
     */
    function site_deployment_update_7000() {
      variable_set('variable_name', 'variable value');
    }

    This sets the variable for the setting that the administrator is likely to change on the live site. For this to take effect, you just need to deploy site_deployment to the live site and run update.php.

    In our example, we want to set the site name and site slogan. This will achieve our goal of automatically deploying changes with code and still allow it to be edited on the live site.

    To do that, we just need to add two variable sets to an update function in the site deployment module. The first one sets the site name and the second the site slogan.

    /**
     * Set the site name and slogan.
     */
    function site_deployment_update_7000() {
      variable_set('site_name', 'Wonderful website');
      variable_set('site_slogan', 'A wonderful website about Drupal');
    }
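    Each subsequent deployment then gets its own numbered update hook, which Drupal runs exactly once. As a hedged sketch (site_frontpage is a standard Drupal 7 core variable, but the scenario and node path are made-up examples), a later deployment might look like this:

    ```php
    /**
     * Second deployment: point the front page at a new landing node.
     *
     * Illustrative sketch; 'site_frontpage' is a Drupal 7 core variable,
     * but the node path here is a made-up example.
     */
    function site_deployment_update_7001() {
      variable_set('site_frontpage', 'node/1');
    }
    ```

    Because update hooks are numbered, Drupal tracks which ones have already run, so redeploying the module never re-applies an old change.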

    Finding the variable name

    The variable name for the site name is site_name and for the site slogan is site_slogan. You might be wondering, how do you find the variable name and value for another variable?

    There are a few ways:

    1. When you look at the admin form, view the source (or inspect the element). The HTML name attribute is the name of the variable.
      Variable name as HTML attribute
    2. Look in the variable table in the database. This will show you all of the variables that are saved to the database.
      Variable table in the database
    3. Use the variable editor that comes with the Devel module.
    4. Use the Drush vget command. Get a list of all current variables and their values with drush vget. Or find variables from specific modules by piping the output to grep. E.g. find variables from Pathauto: drush vget | grep pathauto. [Thanks David for suggesting this in the comments.]

    Once you get used to writing update hooks with variable_set(), it becomes second nature. Even when I am updating a setting on my local environment, I normally write the variable_set() call. This is the full process:

    1. Test out the setting change the traditional way, by clicking through the admin form.
    2. Once you are happy with what you have, change it back.
    3. Create a new update hook in your deployment module's install file.
    4. Run update.php (or drush updb) on your local environment, then check the setting and make sure it has changed.
    5. Deploy your deployment module to your development server, or whichever server is next in line (it could be straight to QA or staging), and run update.php there. Now the client, reviewer etc. can view the change and sign it off (or not). You no longer have to remember to manually make this change on the live site at some future date: you just deploy the deployment module, run update.php, and you're done.

    Wrapping up

    Using code to store and deploy database settings is a wonderful thing but the Features module is not the right hammer for every nail. Setting variables in a deployment module is one of the quickest ways to get started with using a deployment module and it solves the problem of a feature becoming overridden when settings are changed on a live site.

    Aug 14 2014
    Aug 14

    A few weeks ago I was ready to turn off the comments on my blog. Despite having Mollom running, I was left with a non-trivial amount of spam comments to deal with manually each day. It felt like a waste of my time. I love the great comments I get, but there are always people who want to ruin the party, and on the web that's spammers.

    On its own, Mollom is not effective enough.

    Mollom does a great job at reducing spam, but it does leave behind enough spam to make you question allowing comments at all.

    But here is the thing - you don't have to rely on one single system. As soon as I mentioned on Twitter that I wanted to turn off comments due to spam, I got some great replies with various options.

    Outsource comments

    One option is to use a 3rd party system like Disqus, Livefyre, Facebook or Google to handle the comments. These are nice solutions and they take a lot of the pain away. However, it does mean outsourcing your comments to a 3rd party, which is something I decided against.

    Use Hashcash

    According to its website, Hashcash is "a proof-of-work algorithm, which has been used as a denial-of-service counter measure technique in a number of systems."

    Sounds funky. Friends of mine have said that it worked for them.

    There is a Drupal module for it, too.

    There is also HashCash.io and the Drupal module Proof-of-Work CAPTCHA, which uses the same concept (but a different algorithm).

    Add Honeypot

    Honeypot tricks spam bots by adding a field that only they fill in. The field is named something like "homepage" so that bots think it is a real field, but it is hidden from real people. If the field is filled in, the comment gets blocked. Honeypot also uses time detection: if a comment is submitted in less than 5 seconds, it is more likely to come from a spam bot than a human, so it gets blocked.

    Check out the Drupal Honeypot module
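    The two checks described above can be sketched in plain PHP. This is purely illustrative (it is not the Honeypot module's actual code, and the function name and parameters are made up):

    ```php
    /**
     * Illustrative sketch of Honeypot-style checks (not the module's real code).
     *
     * @param array $values
     *   Submitted form values.
     * @param int $form_build_time
     *   Timestamp recorded when the form was rendered.
     */
    function example_honeypot_check(array $values, $form_build_time) {
      // 1. The hidden "homepage" field is invisible to humans; bots fill it in.
      if (!empty($values['homepage'])) {
        return FALSE;
      }
      // 2. A submission within 5 seconds of the form being built is
      // probably a bot, not a human.
      if ((time() - $form_build_time) < 5) {
        return FALSE;
      }
      return TRUE;
    }
    ```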

    Add captcha questions

    Image-based captchas are generally annoying for users and not totally effective; spam bots have found ways to complete them. Mollom actually uses an image-based captcha if it detects a possible bot.

    But there is an alternative which people have had success with: question-based captchas. Rather than showing the user a difficult-to-read image, present them with a simple question, such as "what is two plus ten" or "what is the capital of France". This seems to be more effective against bots, and is also a nicer experience for humans. Sure, you have to apply some thought to come up with the answer, but isn't that easier than trying to decipher the characters in image-based captchas?

    Check out the Drupal Captcha question module.
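    A question-based captcha boils down to a simple lookup and comparison. The following is a hedged sketch of the concept only, with made-up names, not the Captcha question module's real implementation:

    ```php
    /**
     * Illustrative sketch of a question-based captcha check (not the
     * Captcha question module's actual code).
     */
    function example_captcha_question_check($question, $answer) {
      // The site builder defines the question/answer pairs.
      $questions = array(
        'What is two plus ten?' => '12',
        'What is the capital of France?' => 'paris',
      );
      $expected = isset($questions[$question]) ? $questions[$question] : NULL;
      // Case-insensitive, whitespace-tolerant comparison.
      return $expected !== NULL && strtolower(trim($answer)) === $expected;
    }
    ```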

    Other modules to check out

    Here is a list of Drupal modules to take a look at, in addition to the ones mentioned above.

    What did I do?

    I wanted to try the option that would be least intrusive for users first and see how effective it was. I am already running Mollom, so the obvious next step was to add the Honeypot module. Mollom has honeypot built in, but adding the Honeypot module seems to give it an extra boost.

    And I am happy to report that it has eliminated pretty much all spam! There is still some left over, but it is significantly lower in number and therefore totally manageable. Success.


    Big thanks to the following people who helped me on Twitter: Michael Prasuhn, Tine Müller, Sean Burlington and Marcin Pajdzik

    Aug 11 2014
    Aug 11

    Putting together a conference (no matter how many attendees you expect) is tough work. I got involved in organizing Drupalaton 2014 shortly after Drupal Developer Days Szeged 2014 ended, where I was also a member of the organizer team, so I still had fresh memories of how these kinds of things go. After the success of DevDays we had a lot of great people from the Drupal community who were interested in joining us at Keszthely, so all we needed to do was prepare the best Hungarian DrupalCamp ever. Now I can proudly say that we did.

    The previous Drupalaton in 2013 was a totally different event - this year we had half- and full-day workshops and trainings instead of 1.5-hour sessions, which basically changed the face of the event.
    A lot of interesting and useful workshops, 4 days of code sprints, free beers, Túró Rudis and chocolate marzipan ladybugs, a new Drupalsong and a great cruise party - these are the things everyone will recall about this year’s Drupalaton.

    The Cheppers team was present at the event with 9 colleagues - Dávid Segesvári held a workshop about Drupal Commerce, Gergely Csonka and I volunteered during the whole event, and the rest of our colleagues were excited to join the workshops. I can safely say that everybody had fun - meeting some rockstars of the international Drupal community was the highlight of these 4 days for us. Besides sponsoring the event at diamond level, the company allowed me to dedicate some of my business hours to working on the to-dos around the event, and I’m really grateful for this.

    Planning for next year’s Drupalaton has already started - Cheppers is very excited to be part of it again and to help realize it! We’ll let you know about it in time ;)

    I personally want to say a huuuuuge thank you again to my partners in 'crime': Tamás ‘York’ Pintér, who was the mastermind and local do-everything of the whole event; István Csáki, who helped us create the website and supported us with a lot else; Bálint Fekete, whom we praise for his awesome online and offline designs; Gábor Hojtsy, who makes event marketing sound like a piece of cake; István 'PP' Palócz, who helped us provide the financial background for the event; and last but not least Tamás ‘TeeCee’ Szügyi, our photographer, who documented the whole fun. See you next year!

    Check out the pictures from Flickr below, or on Twitter and our Facebook page!

    Aug 07 2014
    Aug 07

    I've just released the OG Group Content module (https://www.drupal.org/project/og_groupcontent) on drupal.org. Here is the description of the module from drupal.org:

    "The OG Group Content module creates all the basic popular content types often used in a group setting, blogs, wikis, events, polls, announcements and creates with very little configuration a nice looking framework for your organic groups. There are even special types of blogs for instructors, and there is a syllabi content type available. There is also a nice calendar which displays events or syllabi if those content types are used. If you don't want to use a content type, not a problem, you can pick and choose which ones you want to use on the site."

    The Drupal Commons distribution is awesome as a totally out-of-the-box solution for a community site (we used it for quite a while), but what if you're ready for the next step, you're outgrowing Commons, and you want more customization? The OG Group Content module gives you a more basic community framework, without as much architectural overhead, so you can make more of the decisions that are right for your business. If you want to really customize your community site and organic group environment, this new module is a great first building block. Also, because it was created at Babson College, there are some great (optional) features that schools will love - a great syllabus content type, and special blogs ideal for classrooms (there are regular blogs as well for everyone else). I've heard many at the Boston Drupal Meetup at MIT say that Commons is great, but often has more features than needed, and can sometimes be hard to customize and add on to. So here's another option for the community, another lego brick for the colorful wall :)

    Anyhow, a special shout out to lokapujya, who's volunteered to help maintain the module.


    Barnettech (Babson College)

    Aug 06 2014
    Aug 06

    Recently, a very mysterious problem cost us quite a lot of time and some headache too. Before I explain what happened, I want to mention that it occurred on a Drupal 7 site on an MS SQL Server database. According to an issue on drupal.org, which I will mention later, it may also occur with PostgreSQL, but most likely not with MySQL. But I also have a general piece of advice regarding Pathauto patterns and best practices...

    What happened?

    We built a website for a client a few weeks ago. We built all the functionality, the templates, etc. and also set up a user account for our customer. A few days ago, the client needed some extra features, like a download area with customizable access permissions. For testing the new functionality, we needed to create an additional user. Creating a user in Drupal is nothing special, is it? Of course, it's one of the most basic operations you'll have on every website, but BAAAAAAMMMMM - this time not! We saw the success message that the user had been created, but the account didn't exist. Looking into the database, you could find entries in the users_role table, and also in the metatag table, but not in the users table itself! Same problem with updating existing users! It seemed that the most important database transaction failed kind of silently, while subsequent insert statements defined in insert or update hooks were successful.

    Examining the user_save() function

    So we started debugging the code, from the form submission down to the user_save() function. What happened there was at first (and second and third) sight not really understandable. First of all, the user_save() function starts a DB transaction in a try-catch block, where it rolls the transaction back and re-throws the exception in the catch block. Also, the SQL Server driver module sets the PDO error mode correctly - like the DB drivers shipped with Drupal core - so that every error throws an exception. So obviously, no exception was thrown in our case. Nor was there a problem with the drupal_write_record() call inside, because user_save() would also exit on receiving a FALSE return value from that call. At this moment, Drupal still thinks that the user was saved correctly, already has its user ID, and proceeds to call insert/update hooks and do other things, like creating the success message or - if configured - sending out registration e-mails referring to the ghost account.
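    For reference, the transaction pattern that user_save() follows looks roughly like this in Drupal 7 (a simplified sketch of the core pattern with example names, not core's exact code):

    ```php
    /**
     * Simplified sketch of the Drupal 7 transaction pattern used in
     * user_save() (not core's exact code; names are illustrative).
     */
    function example_save($record) {
      $transaction = db_transaction();
      try {
        if (drupal_write_record('users', $record) === FALSE) {
          // user_save() bails out here with a FALSE return value.
          return FALSE;
        }
        // Insert/update hooks and other writes happen while the
        // transaction is still open; the commit occurs when $transaction
        // goes out of scope without a rollback.
      }
      catch (Exception $e) {
        $transaction->rollback();
        watchdog_exception('example', $e);
        throw $e;
      }
      return TRUE;
    }
    ```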

    At this point, I found a workaround: as I saw that the non-transactional insert/update statements in the update hooks (alias, metatag, ...) were executed successfully, I tried commenting out the DB transaction in the user_save() function. With that workaround, we could at least test our download area, but of course it didn't satisfy us, so I continued analyzing the problem. We enabled the PDO error log and found an error where an integer column was queried with an NVARCHAR argument. Unfortunately, the log entry didn't mention in which query that happened. Activating the query log of Devel didn't help either: it didn't show the insert/update queries resulting from the POST request on save, only the ones from the overview page where you get redirected after saving. Further, I was still wondering why the transaction failed silently.

    Finally, Pathauto was found guilty

    As a next step, I concentrated on the triggered insert/update hooks, and soon found out that something wrong happened within the Pathauto implementation. This also looked strange to me at first sight, as updating an existing user without changing the username should never trigger an update statement, and certainly nothing that could stop our transaction. But I was already too close to the solution to simply quit here and disable path aliases for users.

    After following the calls inside its hook implementation, I landed in the _pathauto_path_is_callback() function, which has a try/catch block and a comment referring to the "PostgreSQL: PDOException: Invalid text representation when attempting to load an entity with a string or non-scalar ID" issue on drupal.org. There, Pathauto checks whether the new path alias is an already registered menu callback, as defined in a hook_menu() implementation. For paths that are registered for Drupal entities and expect a number, the menu_get_item() call raises an exception when you provide a string instead. That's why they have wrapped a try-catch around it. Example: "node/xxx" instead of "node/123", or "user/username" instead of "user/34234".

    After a closer look at the defined Pathauto patterns, I saw that one of our co-workers had defined "user/[user:uid]" as the pattern for users instead of "users/[user:uid]". This conflicts with the existing menu callback of the user module ("user/%user"). Although it should theoretically be allowed, it is not a wise decision to mix alias patterns up with existing menu callbacks. Just imagine if a user's name were only a number, e.g. "911" for user ID "234". Then user/911 would be a path alias for user/234 - crazy, confusing and not recommendable!

    Lessons learned

    If you define Pathauto patterns, always try to avoid conflicts with existing menu callbacks!
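    If you set Pathauto patterns in code (for example in a deployment-style update hook), the same rule applies. A hedged sketch follows; 'pathauto_user_pattern' is, to my knowledge, the Drupal 7 Pathauto variable for the user pattern, but verify it in your own variable table before relying on it:

    ```php
    // Safe: 'users/...' does not collide with any registered menu callback.
    variable_set('pathauto_user_pattern', 'users/[user:uid]');

    // Risky: 'user/...' collides with the user module's menu callback
    // user/%user, which is exactly what caused the failure described above.
    // variable_set('pathauto_user_pattern', 'user/[user:uid]');
    ```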

    Aug 06 2014
    Aug 06


    So to start, you'll obviously have to be working with Sass/Compass for this to work (pardon the pun!). I won’t go into detail on how to install these lovely things, as that could take up an entire post itself. So if you know how to install Sass/Compass on your machine, or have a handy co-worker you can annoy (preferably a sys-admin monkey), you’re good to go.

    With Sass and Compass there are, as always, issues if you're not working with specific versions. The most reliable combination for me was this one:

    • Susy 1.0.9
    • Ruby 1.9.3
    • Sass 3.3.0.alpha.134 (I suppose Sass 3.3.x will work fine too)
    • Compass 0.12.4.sourcemaps

    The thing about source maps

    For Compass you need an entirely different version from the one you probably have; it’s called “compass-sourcemaps”.

    In order to get that juicy source map action you'll have to open up your shell and type in the following:

    sudo gem install compass-sourcemaps --pre

    This will install a Compass version with source map support; in the future, source maps will (hopefully) be included in the regular Compass versions.

    So what is compass-sourcemaps exactly, you ask? Well, first off, it’s super fantastic; even if you don't want all the workspace/auto-reload mayhem, you should take a look at it.

    For example, on a casual day while you’re working with that Sass-generated CSS, the inspector can’t really tell you where the real definitions come from. It only shows you the line within the generated CSS file. This is where source maps come in: an additional .map file is generated for every .css file, which tells your browser where each CSS definition comes from. There are 4 things you'll have to do after you've installed all the necessary tools:

    1. Enable “CSS source maps” in the General Chrome DevTools Settings under “Sources”
    2. Enable it in your config.rb file by adding the appropriate line (make sure it’s not already there)
    3. In the shell, run your compass just like you always have
      • compass watch
    4. Enjoy it!

    Prepare your Drupal

    Disable the “Aggregate and compress CSS files” option in your Drupal 7 installation at “YOURSITE/admin/config/development/performance”.

    Next up: download and enable this small module: https://github.com/AmazeeLabs/cache_buster

    This will remove that pesky query string behind your .css files. Normally this would be a bad thing and you should never use it on a production site. However, for Chrome to properly track your local files it needs a permanent link to them, in other words a path that doesn't change. I got the code from an issue queue (I think…) on drupal.org but I can’t remember which one; I simply put it inside a module for easy handling. So credits go to the unknown contributor, thank you very much (and sorry)!



    Prepare your Chrome

    Go back to your general DevTools settings and enable “Auto-reload generated CSS”; it’s right below the source maps option:

    After that open up your local project with Chrome and navigate to “Sources” in DevTools. You should see something that looks a little like this:

    What you’re going to do next is add your local site as a workspace inside Chrome; it will remain there until you manually remove it. I like to take the entire theme folder; you could also add the entire site - that’s all up to you. Once you've picked the folder you want, right-click and pick “Add folder to workspace”. Navigate to the exact same folder on your local machine and select it.

    At this point Chrome will ask you for write permissions; just oblige and never think of it again. I mean, it’s Google; what could possibly go wrong, amiright?

    You should now see a new folder at the bottom of the sources tab inside your DevTools, it’s named after the folder you've just picked. Navigate to where the main .css file is (or any other .scss file), right click and select “Map to Network Resource”

    Chrome will now bring up a selection of files from your site, match it to your local file.

    And finally Chrome will ask you if it’s okay to reload DevTools; you are totally fine with that so pick “ok” - and you're done!

    You can open any .scss files from your workspace or use the inspector to directly open a file and make all the changes you want. You can save using your standard cmd+s and even open files using cmd+o. Everything will be saved just as if it were done within a proper IDE, except it’s Chrome!

    But there’s one more thing

    If you right-click on any of your .scss files you can select “Local Modifications”; this will bring up a general “History” of all your changes and you can even revert them!

    Aug 05 2014
    Aug 05

    We believe that a much better approach is to choose an agency based on their portfolio and history and consult them to narrow down the list of features - they can help you in thinking through the service you want to build both structurally and functionally. This way they will have a better understanding of your needs, and with this knowledge it will be easy to transform these ideas into an awesome product.

    With several years of experience and many successful projects behind us, I am now confident that the developers at Cheppers truly understand what makes a web development process successful. Below I will walk you through the steps that are usually taken to prepare plans for a successful web project.

    1. Project budget

    Before you begin, you should decide how much money you want to spend on your website.

    There are three kinds of costs to consider in your plan:

    1. Preparation & planning costs, including cost of graphic designs.
    2. Development costs.
    3. Post-launch costs: hosting, maintenance, support, and development of new features and modifications to current functionality.

    If you are not an expert, you probably won’t have a clue how to divide your budget between these. No. 3 is a little hazy anyway, because you just don’t know yet what kind of new features you might want after launch, but it is still a good idea to come up with at least an estimate.

    Consultants take your available budget into consideration and tell you in advance what can be covered by it and what should be delayed or skipped in this phase. Furthermore they can suggest possible alternatives for certain features to make sure the whole project fits into your budget.

    2. Planning

    Think through and define at least these four things:

    • What is the purpose of the website or the service?
    • What is the goal you want to accomplish with your website?
    • Who is the target audience?
    • What is the main message?

    If you are uncertain about these topics, it will be a lot more challenging to define the scope. Review your needs, get to know your competitors, and figure out what makes you different and better.

    3. Functionality

    Figure out and plan thoroughly what functions and features you need. E.g. if you need a webshop, go through the registration, purchase and payment processes step by step, trying not to miss anything.

    4. Sitemap

    Collect the typical pages of your website. In the case of a webshop this would be the main page, the category pages, a product page, a search page, pages with static content (company introduction, information about the payment process, privacy policy, etc.), and also the pages for the payment and checkout process.

    Once you have all of the pages in front of you, it is easy to create a sitemap showing the page hierarchy. Don’t forget - your website is not for yourself but for your customers: they will be the users, so the structure should be logical and match their understanding - the main goal is not to mirror your own internal process. Remember, an external consultant can help you with this step as well.

    5. Design brief

    To create an awesome website design that matches your concept, it is important to make the designer understand your thoughts and ideas; we call this the ‘design brief’. This is also the point when you have to decide whether you want a responsive website that works on tablets and mobile phones as well. It is also helpful if you can create a list of websites that you really like, especially if you can put into words why you like them.

    The information above can be put together easily by yourself or with the help of a consultant. This will make it easier for your potential partner to provide you with an appropriate cost estimate. If the estimate exceeds your budget, you can still indicate an acceptable budget, and by reducing the functional requirements the price can be reduced as well.

    6. Wireframe

    Once you have collected the information mentioned in the points above, the process of creating the website’s wireframes can finally start. These are built based on your requests and needs, to create a good overall user experience for your website. The wireframes contain each page element without the actual graphic design - this gives you an overview of the page layout and structure and how the different items and blocks will be displayed. It is a lot easier to implement modifications at this stage than when the site is fully designed and developed. Furthermore, functional and planning deficiencies can also be discovered easily in this phase, saving you time and money.

    7. Final specification

    When the wireframes are finished and accepted, the graphic design phase begins. The final specification includes all known and discovered details of the functionality and structure, as well as the design layout details.

    When all of this information is available, the developers can start working on estimating the amount of development hours needed to provide you with an accurate development cost estimate.

    Please note - no matter how you look at it, points 6 and 7 can only be done by professionals - so you need to be prepared to give them an assignment.

    +1: What if you want to re-do an already existing website?

    A. Switch to Drupal

    If the website was not created with Drupal, there is not much use in reusing the existing code. If you don’t want to make large changes to the website design, then most of the previously mentioned points still apply - so the planning process is easier and quicker. If you want to migrate the content (so you don’t have to re-upload the articles and products manually), a migration plan must be created as well.

    B. If you already have a Drupal site

    If your website was created in Drupal and you would like to extend it with new functionality, but there is no documentation of the current structure, then the new developers first need to review exactly what’s under the hood. In the case of more complex systems this might take days.
    The result is a site review: a report that includes all issues and concerns the developers found in the system. Sadly, there are a lot of poorly built sites out there with unnecessary modules installed, and sometimes the code itself is not documented at all, so it takes time to discover why and how a task was solved by the previous developer. Working on a website without a proper site review carries great risk; moreover, it is not possible to estimate a job like this with 100% certainty.

    I hope this blogpost gave you a useful overview of the pre-development tasks. If you feel like you have a good idea that you want realized, or an idea that should be developed, feel free to get in touch with us!

    Jul 29 2014
    Jul 29

    Drupal has always been a great platform for product marketing, and its longtime support for Google Analytics - aka "GA" - is just one element in that ongoing success and expanding uptake among marketers. That said, as with all marketing services, Google Analytics has an Achilles heel when it comes to managing, updating or customizing your website's marketing assets in response to changing marketing needs: your marketers have to engage a technical resource to implement and deploy changes. With staff schedules, miscommunication and formal deployment procedures in the way, even simple marketing-related site updates can be complicated, delayed or derailed completely.


    Until now. Google Tag Manager - aka "GTM" - can go a long way toward solving this exact problem and more.


    Here's the big idea: think of GTM as providing you with a funnel where at any time you can add, change, or remove marketing tags running on your production website. The spout of the funnel is lodged in your website, the funnel itself is a single GTM tag - aka GTM "Container" - running on your site, and the GTM dashboard is the open end of the funnel through which you can manipulate your marketing tags from any service provider - e.g. Google Analytics, Google Adwords, Double Click, Mediaplex, AdRoll, Bizo, etc.


    GTM is a large system with the power and flexibility to let marketers invent and deploy all kinds of marketing assets. But we have found that with that power comes enough complexity that GTM has remained little known, and perhaps misunderstood, since its initial release in October 2012. However, with even a limited understanding of its most basic features, marketers can take a quantum leap forward in quickly deploying marketing assets and gathering targeted website usage data.


    The power of GTM is so great that we are going to take an in-depth look at the product in a series of blog posts. We will look at GTM from two distinct vantage points - the marketer and the web developer - hopefully encouraging you to consider how GTM might expand the marketing punch of your Drupal-powered website. Stay tuned.

    Jul 29 2014
    Jul 29

    Posted on July 29, 2014 by agentrickard

    I'll be heading out to Denver to give a Sunday keynote at DrupalCamp Colorado.

    The theme of the event is "Enterprise Drupal," so we'll be diving into what that phrase actually means for development firms.

    If you're in Denver, please come on down and say hello.

    Jul 26 2014
    Jul 26

    The last few months I was busy with a friend's art project. Today I'm very happy to announce that it went public on July 15th and is doing well so far.

    Jule, the founder of Port of Art, approached me last summer, asking if I could help her build an online marketplace for artworks. Working primarily as a freelance Drupal developer, knowing that her budget was tight and that she was certainly not the first one with this idea, I hesitated. But I gave it some thought, and after several meetings I agreed. I liked the idea and I liked Jule's approach, which is very trusting and positive without being naive. I like good people ;) She also gave me the impression of being able to value constructive input, even if it means changing previous ideas. That is a good feature in clients!

    Basic ideas with a special flavor

    The basic requirements were pretty simple:

    • Content management for static content pages as well as for special content like the artworks that are sold on the site
    • Search artworks by different filters
    • Legal compliant checkout process
• Integration of external payment providers (limited to PayPal for the time being)
    • Contact forms
    • Multilingual content and communication
    • Integration of social media
    • Some map views for geo visualization
    • SEO, customizability, ...

So far that was relatively straightforward, and we all love Drupal for that.
But there were some special requirements too, which had a huge impact on my choice of modules to realize this with.

• Artworks don't fit a basic warehouse approach. Each one is unique and can be bought only once. Therefore there was no need for a shopping cart either.
    • Artworks can be bought for a fixed price or as an auction.
    • Artworks under a certain price are not sold via the site, but instead the customer and the artist are put in touch directly and have to figure out the details independently of the platform.
• Artists should be able to upload their artworks, pay a fee to get them published and then manage the selling and delivery on their own.
    • Artworks expire after a certain time that depends on the publishing fee that the artist is willing to pay.
    • Once an artwork has been sold on the site, an additional fee has to be paid.
    • Fully customizable e-mails

    The main content is obviously the artwork. This is a node type with additional fields to represent attributes of an artwork. Then there are static pages, artschools, faqs and webforms. On the user side we have two frontend user roles for customers and artists that get enhanced using the Profile 2 module.

    Additional considerations

The situation our development team faced: a small budget, a tiny team (only two people), and a project concept still a little in flux. The founder had no technical background or previous experience with Drupal, but needed a customized shop system that she could actually manage after we finished the project and moved on to other things. So one of the goals during development was always to make things configurable. Special text on a certain page? Build a setting for that. A special criterion that controls logic during checkout? Don't hardcode it somewhere! Build a setting for it, as it might change later and you don't want to change code for simple things. I love Drupal for its easy variable management and quick form building capabilities. Building an admin form to control certain behaviors rarely takes more than 10 minutes. Obviously there are things that you can't build that way, but when you can, do it. I feel much better with it, and the client loves it too because it gives her control.

    Conception and development process

One of the things I knew before, but that got confirmed again: communication is key. The client had never done a web project before. That meant that certain good practices and workflows, concerning the development process as well as the final product, were not clear to her. So we (the designer and I) spent a good amount of time helping her figure out what was realistic and which compromises needed to be made in order to deliver the product without a cost explosion or an exaggerated time frame. Being honest and communicating potential problems early on, as well as the client's openness towards constructive input, contributed a lot to the perceived quality of the development process. Including the client in development and design decisions also allowed us to educate her on the technical aspects of the product and raise awareness of technical implications, making her see advantages and restrictions in areas that she hadn't considered in the beginning.
We didn't formalize the process, but we ended up with some kind of agile development with three distinct roles: conception and design by the client, frontend by the designer, and backend logic and architectural design by me. That worked very well for us.

    Obvious modules that we still didn't use

First, there is Rules. A crazy wonderland for workflow configuration that amazes me every time I look at it. But I've almost never used it. Call me old fashioned, but when business logic or complex relations must be built, I prefer to build them on my own. I want as much logic as possible in the code, not in the database. So for all the power Rules provides, I still prefer not to use it.

Then there is Commerce. We had never built a real-world website with it, so our experience was very limited. We thought about it. Very seriously. Then we decided against it. From today's perspective that was probably an error. But given the special requirements, we were afraid of having to spend too much time customizing and altering the workflow that Commerce proposes. This was more of a gut feeling, and in the end I'm not sure it was the right decision. We ended up conceiving and building a full-fledged product management system, including the purchase logic and payment. The obvious advantage when you write something like this on your own is that you have a lot of fine-grained control over flow and design. But the price is pretty high considering the amount of time necessary. In the end we have a considerable code base that needs to be maintained. So next time, I hope I'll remember this and give Commerce a more in-depth examination regarding its potential for the problem at hand.

    Crucial contrib modules / add ons

It's hardly necessary to mention, but we couldn't have built the site so easily without the usual candidates: Views, Webform, Better Exposed Filters, Address Field, CTools, i18n, References, Profile 2, Geofield, Global Redirect, Libraries, ...

    The fantastic wookmark jquery plugin is responsible for the display of the central search component of the site. Our designer loves it!

    Some modules that got born or advanced

I built MEFIBS for this site. I had needed that functionality before, but never quite as strongly as this time, so I decided to solve it as a self-contained module instead of hacking things together. Though there are currently some problems with a few new features that I added recently, it is already in production and doing pretty well. Have a look at the filter and sorting blocks on the artwork search page: two independent blocks without duplicating a views display or intensive custom form altering. That's pretty neat.

Hopefully the jQuery Update module will also profit. During development I ran into issues with the admin version feature introduced here: https://www.drupal.org/node/1524944. I wrote about it in jQuery version per theme. This resulted in a feature patch that is currently well on its way to getting committed.

    I also found a bug in the PayPal for Payment module: https://www.drupal.org/node/2052361 that will hopefully get fixed soon.

Another module I find myself using often is my sandbox module Mailer API. It's a bit cumbersome to use as a developer, but for the client it's perfect. She can customize practically every mail that will be sent by the system. It's all on a single configuration page and supports multilingual setups. A test mail feature is also included to preview what mails will look like, as well as a batch mailer that the client often uses to reach a bunch of people. It makes for very easy home-made promotional mails with a consistent look and feel. Made the client happy.

For frontend eye candy we built a jQuery plugin that is responsible for the collapsible checkbox filter elements in the left sidebar.

    Some module discoveries

    During the work on www.port-of-art.com I found some modules that I didn't know before.

    The Form API Validation module allows you to simplify validation rules in custom forms, using predefined validation rules. And you can also add your own rules which we used for the price entry validation needed when artists publish their artworks.

The Physical Fields module provides fields for physical dimensions and weight attributes. That was exactly what we needed for physical goods, and it saved us the time of configuring such fields in field collections ourselves.


At the end of the project I can say that everyone involved had a good and productive time and enjoyed the process and the result. The client is happy with all the things she can do with the site; now she can concentrate on managing the business and extending marketing. The designer was happy too, even if some of the design decisions might not have been the best ones judged against today's requirements profile. I feel positive that the system fully matches the client's expectations and that it'll be a valuable tool for developing her business. If the site manages to establish itself, it's more than probable that we would rebuild the system, at least some substantial pieces like the shop component.

    We as the site builders are happy too. We feel that we have done a good job and that we managed to keep resources and expectations in balance. I would do it again, which always feels like a good measure.

    Jul 25 2014
    Jul 25

    I needed to do some custom validation of fields on a form. So, I decided to use #element_validate. One of the fields I was validating appeared a bit strange to me, though. When I displayed its $form_state['values']['field_face_palm'] information I saw that it looked like:

    $field_face_palm['und'] = 'you_knucklehead'

    instead of like:

    $field_face_palm['und'][0]['value'] = 'you_knucklehead'

    Well, I figured that's just the way it was. I couldn't imagine why Drupal would be formatting the information differently, but I couldn't change it. So, I wrote and tested my code on my local development environment and pushed it up to the public test environment.

    The validation did not work properly on the public test environment. To start debugging, I once again displayed the field information. This time it did correspond to the usual pattern.

    I was completely stumped as to what was different between the two environments and figured that the error must be on my local environment. So, I was going to need to do more experimentation in my local environment.

    In the process of testing different things, I used a $form['#validate'] and noticed that in that validation function the field was also formatted as I would have expected. So, the field was getting formatted properly at some point.

I guess I need to come clean at this point. Even though I was using element validation, I was comparing one element to others in some of the validation functions, as opposed to looking only at the element being validated. My issue was that the element in question had not yet been fully formatted when I was inside a different element's validator function.

I still don't have an answer as to why the element was fully formatted in the public test environment but not in my local development environment. I was using Features, which should guarantee that the content types were defined in exactly the same way. Nonetheless, it makes sense that one shouldn't make any assumptions about the state of any other fields when inside the validation function for a particular field. The proper approach when one field needs to be compared to another is to use a form validation function.
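For the record, here is a minimal sketch of the pattern that does work, assuming two hypothetical fields field_a and field_b that need to be compared (the field names and module prefix are illustrative, not from the project above):

```php
/**
 * Form-level validation callback (attached via $form['#validate'][]).
 * By the time this runs, all element values have been fully mapped into
 * $form_state['values'], so cross-field comparisons are safe here.
 */
function mymodule_node_form_validate($form, &$form_state) {
  $values = $form_state['values'];
  // Hypothetical fields; LANGUAGE_NONE is 'und' in Drupal 7.
  $a = $values['field_a'][LANGUAGE_NONE][0]['value'];
  $b = $values['field_b'][LANGUAGE_NONE][0]['value'];
  if ($a === $b) {
    form_set_error('field_b', t('Field B must be different from field A.'));
  }
}
```

An #element_validate callback, by contrast, should only trust the element it is handed; sibling values may not be fully processed yet.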

    Jul 22 2014
    Jul 22

    One of the key features of a Drupal module is an admin interface. An admin interface enables you to make a module's settings configurable by a site editor or administrator so they can change them on the fly.

    You may be tempted to hardcode settings in your module or theme, but if the site editor wants to change the settings in the future, they will have to come to you. You will then need to make the change and deploy it. I'm sure you'll have more interesting things to do than making simple setting changes! One of the reasons why you would use a flexible CMS like Drupal in the first place is so that such things are editable without the need for a code change.

    The example we are going to use to create the admin settings interface is a module which displays a message to users when they login. The message itself will be configurable in the admin interface.

    Module Setup

    The first step is to setup the module itself. The module is called Welcome.

    Steps to create the module:

    • Create a folder inside “sites/all/modules/custom” called “welcome”.
    • Create a file called “welcome.info”
    • Create a file called “welcome.module”

    In welcome.info, add the following:

name = Welcome Module
description = Display a configurable welcome message when users login
core = 7.x
files[] = welcome.module

    In welcome.module, add the following:

<?php

/**
 * @file
 * Module file for Welcome Module
 */

    Now enable this module in the module page (or using Drush).

    Display a message when a user logs in

    To do something when a user logs in, you need to implement hook_user_login(). Here is the basic implementation of hook_user_login():

/**
 * Implements hook_user_login().
 */
function welcome_user_login(&$edit, $account) {
}

hook_user_login() takes two arguments, $edit and $account:

$edit - Form values submitted by the user on the user login form
$account - The user object for the user that is logging in

    To display a generic message to all users, you could simply add a drupal_set_message() inside the implementation of hook_user_login().

/**
 * Implements hook_user_login().
 */
function welcome_user_login(&$edit, $account) {
  drupal_set_message('Thank you for logging in and welcome!');
}

    And here is the message that users see when they login.

    The welcome message displayed to the user

    The Admin interface

    Now you need to make the message configurable. For that, you need an admin interface with an admin form where the welcome message can be edited. The first step is to create a path for the form, which is where editors will access the form. In welcome.module, you need to implement hook_menu().

/**
 * Implements hook_menu().
 */
function welcome_menu() {
  $items['admin/config/people/welcome'] = array(
    'title' => 'Welcome message configure',
    'page callback' => 'drupal_get_form',
    'page arguments' => array('welcome_form'),
    'access arguments' => array('administer users'),
    'type' => MENU_NORMAL_ITEM,
  );
  return $items;
}

    In the hook_menu() implementation above, you are creating a new path: admin/config/people/welcome. This will be available to users who belong to a role with permission to administer users.

    The page callback is the function that is called as part of the request for the path. In other words, when a user hits admin/config/people/welcome in the browser, the callback function will be called. You can define a custom callback function but in this case you are calling drupal_get_form(), which is a built-in Drupal function that uses the Form API to build a form.

    Form API

With any page callback, you can pass page arguments to it. An array of arguments will be passed to the callback function. When you use drupal_get_form as the callback function, you need to pass the form ID as the argument. In this case, you are using 'welcome_form' as the ID of the form. You can use anything you want as the form ID as long as it is unique. Best practice is to include the module name as the first part of the form ID to ensure uniqueness. Spaces and hyphens are not allowed in form IDs.

drupal_get_form() will return an array which another built-in function, drupal_render(), will convert into a rendered HTML form.

    Drupal's hook menu maps to the callback function
    Figure 1: Menu page argument matches implementation of hook_form().

    The form needs to provide a textarea for users to add and edit the message that is displayed to users.

    Here is the function welcome_form, which defines the form:

/**
 * Implements hook_form().
 * Admin form to configure the welcome message.
 */
function welcome_form($form, &$form_state) {
  $form['welcome_message'] = array(
    '#type' => 'textarea',
    '#title' => t('Welcome message'),
    '#rows' => 5,
    '#required' => TRUE,
  );
  return system_settings_form($form);
}

This is a very simple and small form, with just one field. A form is defined using a multidimensional array. Each element of the form array has its own key, so $form['welcome_message'] is the key for the field. This becomes the name of the element in the final rendered HTML form. The key must be unique.

    Each form element is in itself an array with attributes. The attributes we are defining here are:

#type - The type of field, such as textarea, textfield or select
#title - The title of the field
#rows - For a textarea, you can define how many rows it has
#required - If set to TRUE, the field is required and a validation error will occur if the user submits the form with it empty

    Attributes and properties begin with a hash (#).

    In the return statement, $form is passed through another function, system_settings_form(). This handy function takes care of common tasks that are needed for any admin form, such as submit buttons and saving form data to the database.
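To see what system_settings_form() is doing for you, here is a rough sketch of the submit handler you would otherwise have to write yourself (an approximation for illustration, not the actual core implementation):

```php
/**
 * Roughly what system_settings_form() does on submit: save each
 * submitted value as a persistent Drupal variable.
 */
function welcome_form_submit($form, &$form_state) {
  variable_set('welcome_message', $form_state['values']['welcome_message']);
  drupal_set_message(t('The configuration options have been saved.'));
}
```

Letting system_settings_form() attach this behavior for you keeps the module short and guarantees the variable name matches the form key.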

    Go to the path admin/config/people/welcome and the form should look like this:

    The welcome message admin form
    Figure 2: Welcome message form

    The following graphic illustrates the process Drupal goes through to build the admin form.

    How Drupal builds the admin interface
    Figure 3: Admin form building process

    You may be wondering, what happens to the data after the form is submitted? I mentioned that system_settings_form() takes care of saving form data but where is it stored?

    Data from admin forms is stored in a special Drupal database table called variable. By using system_settings_form(), you are telling Drupal to automatically store the data from this form in the variable table. The variable table has just two columns, name and value. Open up the database and take a look.

    The Drupal variable table
    Figure 4: Variable table

    At this stage, there is no variable for welcome_message. That is because you have not submitted the form yet. Go ahead and add a message and submit the form. Refresh the variable table, you will see a welcome_message variable.

    When using system_settings_form() the name of the variable matches the element key in $form.

    The Drupal variable table with the welcome message
    Figure 5: welcome_message form key matches variable name

    Setting defaults

    After you submitted the form, you may have noticed something odd. The textarea is blank. That is going to be annoying if you want to edit the form again and can’t remember what you entered last time. This is very easy to fix. All you need to do is get the variable from the database and use it as the default value.

function welcome_form($form, &$form_state) {
  $form['welcome_message'] = array(
    '#type' => 'textarea',
    '#title' => t('Welcome message'),
    '#rows' => 5,
    '#required' => TRUE,
    '#default_value' => variable_get('welcome_message', 'welcome'),
  );
  return system_settings_form($form);
}

    variable_get() is a built-in Drupal function that will get any variable. The first argument is the variable name. The second is a default value, which will be used if the variable does not yet exist in the database. The default value is required, so you need to add something.

    And finally, you need to display the saved message to users when they log in. To do that, change welcome_user_login() to the following:

function welcome_user_login(&$edit, $account) {
  $message = variable_get('welcome_message', 'welcome');
  drupal_set_message(check_plain(t($message)));
}

You are retrieving the stored message from the database using variable_get('welcome_message', 'welcome'). If the variable has not been set yet, then Drupal will use the default value of 'welcome'. Drupal displays the message using drupal_set_message(). The message is passed through the check_plain() and t() functions to handle security and translation.
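As an illustration of what check_plain() buys you (a sketch; in Drupal 7, check_plain() is essentially htmlspecialchars() with UTF-8 handling): any markup that ends up in the stored message is shown as literal text instead of being rendered or executed.

```php
// Any HTML in the stored message comes out escaped:
$message = '<script>alert("hi")</script>Welcome!';
print check_plain($message);
// Escaped result: &lt;script&gt;alert(&quot;hi&quot;)&lt;/script&gt;Welcome!
```

Since the message is free text entered in the admin form, skipping this step would let anyone with that permission inject markup into every user's login page.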

    And that is it! The welcome message is now fully configurable in an admin interface.

    The welcome message displayed to the user

    Continue learning

    If you liked this tutorial, you'll enjoy my book, Master Drupal Module Development. If you want to learn more about how to develop your own Drupal modules, check it out.

    Jul 15 2014
    Jul 15

The Atomic Heritage Foundation is a nonprofit organization in Washington, DC, dedicated to the preservation and interpretation of the Manhattan Project, the Atomic Age, and their legacy. They approached 4Site to develop an easier method for their website visitors to access historical records and other information about the people and places that played an important role in the Atomic Age.

    Jul 15 2014
    Jul 15

Children's Law Center helps provide underprivileged children and their families with legal services and resources, acting as an advocate both in and out of the courtroom to help the children of the District of Columbia stay on the path to a better education, good health, and a stable, loving family. When Children's Law Center approached 4Site, their goal was clear: to modernize their brand and to improve public understanding of exactly what their organization does, in order to get more people to donate time, money, or legal counsel to help the District's children.

    Jul 08 2014
    Jul 08

    We collaborate a lot at Blink Reaction. Not just internally, but with client development teams, our partners and, of course, the Drupal community.

Recently, a co-worker asked a question in our developer chat room about the value of coding standards. He clearly understood it himself; however, the client he was working with was committing code to a specific project's git repo that was not in line with Drupal coding standards. The question he was essentially asking was, "How do I sell this client on the value of coding standards?"

Every developer has their own taste governing the use of white space, line breaks, underscores, and variable names. Some prefer to_separate_words_with_underscores, othersPreferCamelCase. This is fine when we're writing code for personal use, but when it comes time to merge work of divergent styles from multiple developers, you end up with an inconsistent mess. Luckily, the Drupal community has this problem largely solved with the existing Drupal coding standards. It can be difficult for a new Drupalist to understand why we have so many rules about code style, but anyone who's worked on a large project knows you would be helpless without them. There are good strategic and functional reasons to abide by Drupal coding standards, and I'll go over both. As you'll see, it's well worth following the rules, even when you don't agree with them.

The strategic reason we abide by Drupal coding standards:

Abiding by coding standards makes working in teams far easier. Many teams will make small personalizations to the core set of Drupal coding standards, but it's important to have some set of standards, because otherwise everyone wastes time trying to (literally) decode each other's work. When you spend a long time working with a given codebase you start to get a feel for how something should look. When you scroll through a document quickly, in search of, say, a function definition, you're looking for familiar shapes and patterns to stop on and inspect more closely.

It's the same reason that stop signs are all octagons: not because octagons are innately "stoppyer" than other shapes, but because the repetition allows drivers to quickly intuit the sign's meaning at high speed and get on with the business of driving. If some town made their stop signs crescent-shaped, we'd probably all figure it out, but we would lose valuable time doing so.


Many countries employ the standard red octagon with the word "stop," including countries that do not use the Latin alphabet. Above, a stop sign in Greece. Credit: http://www.ilankelman.org/stopsigns.html

On average, code is read roughly two or three times more frequently than it's written. Anything you can do to improve its structure pays future dividends every time it's read. There's a great video that explains this on buildamodule.com.

     A simple for-instance:

    Here's a simple example of this principle in action. Drupal coding standards call for no spaces before or after the parameters in a function call.  Usually that looks fine, but if you happen to be declaring an array inside your function call, it results in this mess:

     return l('Log in', 'user/login', array('attributes' => array('target' => '_blank')));

     Personally, I can't stand this, and I think it's unreadable.  My preference would be, by far, something with spaces more like:

     return  l(  'Log in', 'user/login', array(  'attributes' => array(  'target' => '_blank'  )  )  );

When I'm writing for my private use, I always leave spaces in my nested function calls. But now imagine this scenario: we discover a bug in one of our custom modules that causes a fatal error every time a nested array is declared inside the l() function. So we come up with a bit of regex to find all the instances of a nested array declaration inside the l() function, which is actually pretty simple, so long as everyone is abiding by the standards: "l\(.*array\(.*array\(". But if everyone has followed whatever style they please, writing a regex that will match every possible variation would be close to impossible. So I follow this rule, even though I think the rule is wrong, because I know others will need to deal with the code I write.

    And, as an aside, in a real-world situation, I'd probably refactor the above as follows:

// Settings for use in the l() function to make links open in a new tab.
$open_in_new_tab = array(
  'attributes' => array(
    'target' => '_blank',
  ),
);

// Create a link to the user login page.
return l('Log in', 'user/login', $open_in_new_tab);
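As a quick sanity check that a standards-dependent search like this actually works, here is a hedged sketch (the pattern is slightly generalized so that it also matches l() calls whose first arguments are strings, as in the example above):

```php
// Find l() calls that declare a nested array inline; this search only
// works reliably because everyone formats the call the same way.
$line = "return l('Log in', 'user/login', array('attributes' => array('target' => '_blank')));";
if (preg_match('/l\(.*array\(.*array\(/', $line)) {
  print "Found a nested array declaration inside l().\n";
}
```

In practice you would run such a pattern across the whole codebase, e.g. with grep, rather than against a single line.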

The functional reason we abide by Drupal coding standards:

Having a predictable, uniform code format allows us to build and use tools that depend on having a predictable structure. I think this point is fairly obvious to experienced developers, but allow me to give some Drupal-specific examples:

Standard comments can be parsed by the API module to create automatic documentation (https://drupal.org/node/1354):

"The API module parses documentation and code in PHP files, and it expects documentation to be in a format similar to other code/documentation parsing systems such as PHPDoc, JavaDoc, etc. It was originally based on Doxygen, but it has evolved into something that has its own set of tags and a lot of Drupal-specific functionality."
Standard function and class definitions can be parsed by IDEs and text editors to implement features like PhpStorm's ctrl+click jump to function definition.


The @ tags in docblocks are particularly useful, and can be used to generate automatic lists in a wide variety of contexts.


Drush can parse docblocks to show useful output during deployment. Much of the advanced functionality in Drush is made possible by the fact that Drupal coding standards are so widely followed. For instance, this install file:


    gives you this drush output:


Autoformatters like Sublime's PHP Tidy can take all the thinking out of code style, but if two people on the same project use different coding styles with autoformatters, they'll just constantly re-format each other's code, making a giant mess of the git history.

    In Conclusion:

Of course, your team might decide some aspects of Drupal coding standards are not right for you; that's not the end of the world. What's really important is that your whole team settles on one standard and sticks to it, because in the end:

    Jul 01 2014
    Jul 01

    For those of us living at the speed of Drupal each and every day, Austin seems light years away already. We’ve begun planning in earnest for Drupalcon Amsterdam and even Drupalcon Bogota and Drupalcon LA in 2015.

    We are still in the midst of our Austin follow-up of course, having met so many wonderful people and being introduced to so many great opportunities. Consequently I’ve moved on to reflection.

    I know I’m not alone when I say there was something special about Drupalcon Austin. Here’s what it was for me; everything about Drupalcon Austin was friendly. I mean really nice, fun and supportive friendly. Now if you’ve been to a few Drupalcon’s before you might be thinking, 'what’s different about that?' Drupalcon’s are always friendly, I will grant you that. But there were three big differences between Austin and previous Drupalcons that made it especially friendly.

    Austin - The Friendly City

Did you know that before Austin was known for its live music or 'keeping it weird' it was known as 'The Friendly City'? Seriously. Look it up. From the airport to taxi rides and pedi-cabs and restaurants, Austin was the friendliest host yet. Thank you Austin!

    Our Friendly Drupal Association Staff

    Beginning with Jacob Redding, IMHO the Drupal Association staff has always been awesome. What was so great about Austin is that it has become evident that there is finally a critical mass of staff to truly accelerate a pro-community agenda. This helps all Drupal users, especially because each and every member of the DA staff is so accessible and - you guessed it - so friendly. Are you a member yet?

    Drupal 8 - The Friendly Platform

    With so much wonderful Drupal 8 content in Austin as evidence, Drupal 8 will be the friendliest Drupal of them all. In fact I’ve already started to think about it as ’Drupal 8 - The Friendly Platform’.

    I can’t think of a single group or market segment for which Drupal 8 won’t be more friendly. Ignoring FUD* chatter and notwithstanding any and all reasonable concerns expressed by other respected members of the community, Drupal 8 will be more friendly to everyone. Drupal 8 will be easier and more accessible to designers, themers, developers, site builders, content managers, content creators, dev ops, project managers, product owners, marketing and sales units, NGOs, public agencies, SMBEs, enterprise organizations, digital agencies,  system integrators and, well just everyone.

    With the release of Drupal 8 within reach, Drupal will turn a corner and become known not just as the most powerful CMS available, not just as an optimal solution platform for so many challenges, but especially as the friendly platform from which you can continue to realize your goals and dreams. In fact, I always did think the Druplicon looked like a smiley face. 

    There were many more than just three special things about Drupalcon Austin. Because I think they’re worth noting I’ve listed some brief highlights. Please add yours and tell us why Austin, and Drupal 8 are special for you.

    • our Symfony2 Introduction/Getting ready for Drupal 8 was awesome! We sold out all 60 seats and helped make 60 new Drupal 8 developers. We’re offering it remotely since it was so successful.
• we were so proud to kick off the Drupal Career Trailhead. Our CEO Nancy Stango opened with her presentation on the Drupal Career Landscape and got rave reviews. We think the folks over at Drupal Easy are great. Read this first installment in their career series
    • we met so many great developers and we’ve already begun the process of matchmaking and hiring. If we missed you, we're sorry but it's not too late! Email John
    • in case you didn’t know it, Drupal and the Enterprise are perfect together. Miss Matt Schelessman’s presentation? Send him an email and maybe he’ll send you some Reese’s.
    • more people than ever are asking about Drupal and Symfony training. Are you one of them? We can help!
    • we launched our Brilliance Velocity campaign to raise money for the Drupal Association. It's not too late to contribute, visit http://www.blinkreaction.com/whatsyourbrilliancevelocity


    *Fear, Uncertainty, Doubt

    Jun 30 2014
    Jun 30

    If your troll has a dynamic IP address, send him a cookie and check for it in all subsequent page requests, something along the following lines:

    global $user;
    // Tag the troll's account with an innocuous-looking, long-lived cookie.
    if ($user->uid == 12345) {
      setcookie("_utmc_c", "fs442428977", time() + 31557600);
    }
    // On every subsequent request, whatever IP he comes from, serve a fake error.
    if (!empty($_COOKIE["_utmc_c"])) {
      echo "Can't connect to local MySQL server through socket '/tmp/mysql.sock'";
      exit;
    }
    Jun 25 2014
    Jun 25

    Looking back at DrupalCon Austin, and forward to upcoming events!

    Me and BlackMesh!

    At DrupalCon Austin, I had the pleasure of standing up in a crowd, waving to 3,000-some Drupal folk as Eric Mandel announced that I had joined the BlackMesh team. I was amazed and grateful at the support and encouragement that flooded in, and this year’s DrupalCon was shaping up to be super awesome.

    Austin had a lot of sprinters! 

    Friday Big Sprint
    Photo by Mike Gifford

    My favorite time was spent during the sprints. On Friday alone we had about 500 participants contributing and mentoring. During the sprint, sprinters focused on conquering the issue queue in preparation for the upcoming D8 release, including working on Drupal 8 Core issues of course, and also: porting contrib projects, improving Drupal 8 documentation, and working on Drupal.org infrastructure to get it ready for supporting the Drupal 8 release. 260 of them were new sprinters who were empowered to contribute by the end of the day. So many new sprinters! … which is energizing for the project, the mentors and the participants. Watch this live video of reviewers, manual testers, patchers, and committers working together from @HornCologne to see all the different roles people take on to get an issue actually into Drupal Core.

    Live Core commit on Friday June 6, Matthew Moen (fearlsgroove), Katherine Druckman (KatherineD), Angie Byron (webchick), Marcus Deglos (manarth)
    Photo by Michael Schmid

    Cons are great! So much going on! 

    Extended Sprinting Sunday June 1, Brian Gilbert (realityloop), Jared Smith (jsmith), Cathy Theys (YesCT)
    Photo by Mike Gifford

    Community Summit Monday June 2, Addison Berry (add1sun), Lauren Shey (lshey)
    Photo by Paul Johnson 

    Between the extended sprints Saturday and Sunday before, the Drupal Software Working Group all-day meeting on Monday (which was a conflict with the Community Summit), BoFs to prepare mentors and sprinters, the mentoring booth, my Core Conversation, the other amazing sessions and Core Conversations I attended, social evenings, trivia night, the huge Friday Sprint, and more extended sprinting Saturday and Sunday after, this was a busy and exciting DrupalCon!

    Mentor Booth in the exhibit hall, Chris McCafferty (cilefen), Brian Gillbert (realityloop)
    Photo by Michael Schmid

    My core conversation: People want to help. They don't know what to do! Let's make d.o issue picking easier
    Photo by Mike Gifford

    Trivia Night Thursday June 5, Scott Reeves (Cottser), Emma Karayiannis (emma.maria), Lewis Nyman (LewisNyman)
    Photo by Michael Schmid


    So, what’s next?

    A ton of cool things!


    Let’s see, there is DrupalCorn Camp in just 3 weeks, July 17-20. They have a great sprint day planned focusing on D8 contributed module porting, and the Backdrop CMS.

    Capital Camp and DrupalCamp Colorado

    At the end of July, I’ll be in our nation’s capital, Washington, DC, for Capital Camp & Gov Days, which starts July 30. Unfortunately, I may have to leave Capital Camp & Gov Days a few hours early to get to DrupalCamp Colorado to catch a bit of their first day: August 1. I'm pretty sure there will be a room for sprinting all three days in CO.

    Twin Cities

    TCDrupal in the Twin Cities, Minnesota, is August 7-10. In addition to shaping up to be a well-planned, welcoming camp, just like last year, it is going to be a central gathering place for some serious sprinting every day. See the sprint sign-up. I'll be getting in August 6th and leaving early on the 11th, so I won't miss an hour of sprinting.

    DrupalCon Amsterdam

    Then, the event I am most excited for is DrupalCon Amsterdam, not only because it’s in Amsterdam and I love traveling (I'm getting in early so I can tourist on September 25th and 26th), but also because I’ll get the chance to work with folks who didn't make it to Austin but whom I will get to sprint with in Amsterdam, like the big Multilingual Team! Extended sprints start Saturday, September 27th. :)

    A camp near you! 

    There are a lot of awesome Drupal camps happening in the upcoming months. Check them out on Drupical.com. Each event is a great opportunity to meet new faces and start participating in sprints. It is so cool to see the valuable contributions lots of people are making toward the upcoming D8 release. If you haven’t been to a sprint, I strongly encourage you to attend one! I'll tweet from @YesCT about events and updates on where I am heading off to next. I hope to see you at one of the many upcoming sprints!

    Jun 17 2014
    Jun 17

    At the Blink Institute, we offer training for small businesses and enterprises, on-site, off-site, and in custom webinars. Our trainers and instructors are experts in their respective fields and have a passion for sharing information with others.