Jul 27 2015
Jul 27

Since the last Drupal 8 core update, the API module maintainers started looking for co-maintainers, and Two-Factor Authentication was rolled out to anyone with the Community role on Drupal.org (among other improvements).

What's new with Drupal 8?

Drupal 8's minimum PHP version increased to 5.5.9, and minimum PostgreSQL version increased to 9.1.2. Also, tim-e handed off co-maintainership of the Contact module to Jibran Ijaz and Andrey Postnikov; and Frando stopped being a maintainer of the Entity, Form, and Render systems — special thanks to both tim-e and Frando for their amazing contributions!

Some other highlights of the month were:

How can I help get Drupal 8 finished?

See Help get Drupal 8 released! for updated information on the current state of the software and more information on how you can help.

We're also looking for more contributors to help compile these posts. Contact mparker17 if you'd like to help!

Drupal 8 In Real Life

Whew! That's a wrap!

Do you follow Drupal Planet with devotion, or keep a close eye on the Drupal event calendar, or git pull origin 8.0.x every morning without fail before your coffee? We're looking for more contributors to help compile these posts. You could either take a few hours once every six weeks or so to put together a whole post, or help with one section more regularly. If you'd like to volunteer for helping to draft these posts, please follow the steps here!

Jul 27 2015
Jul 27

By Steve Burge 27 July 2015


One of the most frequent questions we get asked by OSTraining members is this:

"Is there an easy shopping cart for Drupal?"

Don't get me wrong. Drupal Commerce is a great system, and we have a detailed video class explaining how to use it. But no-one would describe Drupal Commerce as easy, and Ubercart is equally difficult.

There are few other valid e-commerce options. One, called Basic Cart, works great but doesn't actually have any payment options.

So in this tutorial, we'll explain how to use Shopify.com and Drupal together. Shopify is a robust option and can reduce the burden of maintaining an e-commerce store.

Install the Shopify Modules and Library

To get started, make sure you have a Drupal site, plus a valid Shopify.com account.

Install these modules, plus any dependencies:

You will also need to download the Shopify API library and upload it to /sites/all/libraries/shopify/.

How to Integrate Drupal and Shopify
  • Go to Configuration > Web Services > Shopify API and you should see this success message:

Connect Your Drupal Site to Shopify

Now we're going to import your Shopify.com details into Drupal.

  • Log in to your Shopify store at shopify.com
  • Click "Apps" on the left-side
  • Click "Private Apps" in the top-right corner of the page.
  • Enter a name for your application. This is private and the name isn't important.
  • Click the "Save App" button.

Now you will have the information you need to enter into your Drupal site.

  • Go back to Configuration > Web Services > Shopify API
  • Enter the "Password" as the "Token" value.
  • Enter the "API key".
  • Enter the "Secret".
  • Enter your Shopify store URL as the "Domain". It should be in the format of (store-name).myshopify.com.
  • Click the "Save configuration" button.

If you entered all the information correctly, you should see a "Shopify" tab appear in the admin menu.

  • Click "Shopify" and you'll see a series of tabs. Click the "Sync" tab.

Now we're going to import your Shopify products into your Drupal site:

  • Click the "Sync Products" button and wait for the import process to finish.
  • Click the "Sync Collections" button.
  • Check your site's content. If the import was successful, any products in your Shopify.com store will now be Drupal nodes:

Now we're going to allow Drupal to update our Shopify.com store.

  • Click the "Webhooks" tab.
  • Click "Webhooks Registered"
  • Check all the boxes.
  • Click the "Save configuration" button.

Add a Drupal-friendly theme to Shopify

In this step, we're going to add a custom theme to Shopify. This is a really important step. This theme will redirect the user to your Drupal site if they attempt to use any store features covered by this module.

The theme will also disable most Shopify store features except for the shopping cart and customer login area.

  • Click the "Theme" tab.
  • If your site is online, click "Upload and Publish Automatically".
  • If you are on localhost, click "Download Only".
  • Go to your Shopify.com store, click "Themes" and upload your theme.
  • Click "Publish theme".

Work with your Shopify products

Now that the integration is complete, you'll be able to treat your products as if they were normal Drupal nodes:

You'll be able to add fields to extend your products:

A video on Shopify and Drupal

The developers of the Shopify module have a video to guide you through the integration:

[embedded content]

Jul 27 2015
Jul 27

Whether you're counting Business Summit attendees or conference registrants with C-Suite titles, last year DrupalCon Europe saw about 500 attendees who were highly interested in the business side of Drupal. As we saw in the Business Track and the business-related BoFs, there is strong interest at Cons not only in learning to code better, but also in making your business better, and DrupalCon Barcelona will be no different.

One session we’d like to highlight is 'Self-Managing Organizations: Teal is the New Orange', found in the Business track. It will be presented by Lukas Smith (lsmith77) and Tonio Zemp (tonsibonsi). Lukas, a coder by origin, was the co-release manager of PHP 5.3 and is part of the core team of Symfony. Tonio is a product owner on the Drupal team in Zurich. Both are also part of the management team at Liip, where they explore new ways to build sustainable tech businesses based on empowerment and self-management.

While it can be easy to focus on one track, we recommend that you keep your eye out for sessions outside your comfort track like this one -- they can add a different kind of value to your Con experience. Join Lukas and Tonio in Room 111 on Tuesday from 15:45-16:45; they will walk the audience through their experience at Liip as it has grown and how they have dealt with those changes as an organization. Their talk will touch on business, culture and contribution as they move towards becoming a ‘teal’ organization focused on self-management.

Even if business isn’t your usual area of interest at a Con, 'Teal is the New Orange' is packed with material that will benefit everyone from coders to the C-Suite and we look forward to seeing you there.

Jul 27 2015
Jul 27

Last year we conducted a Drupal Job Market survey to better understand the opportunities for those who know Drupal. The survey showed strong demand for Drupal skills and demonstrated why Drupal is a rewarding and potentially lucrative career path. We are conducting another survey this year. 

Take the Survey

This year we are adding questions about compensation to help Drupal talent and hiring organizations benchmark themselves.

You can expect to see the results from the survey published in late August. Thank you for taking the survey!   

Jul 27 2015
Jul 27

27 Jul 2015

While you may have already moved your existing site to Drupal, the dynamic, open-source website development platform, it is time to make the shift yet again. Having enjoyed the highly flexible application framework and the benefits of this enormously versatile Content Management System, it is only natural that you move on to your chosen version of Drupal. The question many now face, however, is whether to opt for Drupal 7 or 8.

Should you opt for Drupal 6 to 7 migration, or go straight to 8? For marketers who are willing to take the leap, the question of why not jump directly to Drupal 8 may arise before choosing an option. Though Drupal 8 is still in its early stages of development, enterprises are teaming up with organizations that are experts in Drupal technology to help them make this leap easily. For those of you who want to migrate to Drupal 7, however, here is a comprehensive list of facts stating why Drupal 6 to 7 migration will be a significant and beneficial move.

Why Drupal 7:

Drupal 7 now enjoys much longer support, with bug and security fixes in place, and this makes it a much more sophisticated version of the existing CMS. Drupal 7 in many ways has an enhanced security system, with several features such as the following.

  • A more secure password system
  • An augmented secure log-in system
  • Enhanced, systematic and secure implementation of scheduled tasks
  • An array of modules that can be easily updated through the web
Drupal migration can enormously benefit e-commerce sites that are highly complex or have unique workflow logic. The shift will prove beneficial to sites that engage in activities such as crowd-funding, partial payments, donations, etc. Through its Commerce module, Drupal 7 provides the innate flexibility to incorporate various business protocols and logic into the e-commerce features of the site.

Drupal 7, with all its enhanced features, also brings in some revolutionary usability improvements over its previous version. Administrative links to edit existing page elements are now available on each page, eliminating the need to visit the administrative pages each time a user wants to make a change. Drupal 7 also boasts increased support for the integration of WYSIWYG editors, along with an added drag-and-drop mode for administrative tasks.


Welcoming Drupal 8: 

As the more advanced version, this one is already on many organizations’ waiting lists. Naturally the newer version comes with more sophisticated features and is easier to use. With a big focus on effortless content authoring, Drupal 8 prioritizes WCM for an enhanced digital experience. Being ultra responsive, Drupal 8 comes with a “mobile first” strategy. Along with this it also boasts:

  • An improved developer experience
  • A clear focus on multilingual and integrations

So whether you go for Drupal 7 or 8 migration, what is more important is who you team up with for a seamless process. It is crucial that you choose an experienced Drupal partner who understands your needs and who has a proven track record of providing the right solutions at the right time. 

The Faichi Advantage: Faichi boasts proven methodologies for helping businesses carry out a seamless Drupal migration. So, be it Drupal 6, 7 or 8, we have a highly expert team of professionals on board to facilitate the making of multilingual, highly responsive portals in the most efficient manner. With a range of consultants and specialists, enormous community support and, most importantly, pre-coded modules, migration becomes very easy: it replicates the original and also helps preserve the URLs for the future. With the onslaught of newer versions of Drupal, Faichi takes on every new version to help organizations decide what’s best for them and why. Truly innovative in its approach, Faichi is the one-stop solution for you when it comes to migrating your site to the next version of Drupal.

Jul 27 2015
Jul 27

Since July 2014, Drupal 8 has had a way to override backend-specific services. There are over 25 services in Drupal core that are overridable.

Three of these are within the user module, so let’s look at overriding one.

A simple service is the “user.data” service. It allows data relating to a user account to be stored, fetched and deleted. For this there is an interface, UserDataInterface, which is implemented by UserData. We need to create a module that adds a new service that implements UserDataInterface.

Firstly, create a folder for your module, something like alternative_userdata, then the file alternative_userdata.info.yml. In there we can define the module’s info:

name: Alternative UserData
type: module
description: Adds alternative storage for user data.
core: 8.x
dependencies:
  - user

So we are setting the name of the module, the type, a description of it, the core version, and that it depends on the user module.

Next we need to define our service. For this you will need to create the file alternative_userdata.services.yml. In here we can add the service called alternative_userdata.user.data and set the class.

services:
  alternative_userdata.user.data:
    class: Drupal\alternative_userdata\AlternativeUserData

Next is to create the class, add a folder named “src” (following PSR-4 autoloading standards) and within this a file called AlternativeUserData.php.

<?php

/**
 * @file
 * Contains \Drupal\alternative_userdata\AlternativeUserData.
 */

namespace Drupal\alternative_userdata;

use Drupal\user\UserDataInterface;

// Here you may need to add further use statements to pull in any backend client.

/**
 * Defines the alternative user data service.
 */
class AlternativeUserData implements UserDataInterface {

  // You may need to add a protected property here for your backend client.

  /**
   * Constructs a new user data service.
   */
  public function __construct() {
    // Here you can create a new instance of your backend client.
  }

  /**
   * Implements \Drupal\user\UserDataInterface::get().
   */
  public function get($module, $uid = NULL, $name = NULL) {
    // This needs to return the user data from your backend.
  }

  /**
   * Implements \Drupal\user\UserDataInterface::set().
   */
  public function set($module, $uid, $name, $value) {
    // This needs to save the user data to your backend.
  }

  /**
   * Implements \Drupal\user\UserDataInterface::delete().
   */
  public function delete($module = NULL, $uid = NULL, $name = NULL) {
    // This needs to delete the user data from your backend.
  }

}

Now that we have a service defined in our module, the module can be enabled. However, it won’t do anything until we set an alias in your site’s services.yml file.

Edit services.yml and add the following:

services:
  user.data:
    alias: alternative_userdata.user.data

Now when Drupal looks to use the user.data service it will actually use the alternative_userdata.user.data service from your module.
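To see the override in action, any code that asks the container for user.data now receives our class. A minimal sketch, assuming a bootstrapped Drupal 8 site (the module name and data key below are made up for illustration):

```php
<?php

// The container resolves the 'user.data' alias to our service.
$user_data = \Drupal::service('user.data');
// $user_data is now an instance of AlternativeUserData.

// Store a value for user 1, namespaced by module and key.
$user_data->set('mymodule', 1, 'favourite_colour', 'blue');

// Read it back; returns whatever your backend stored.
$colour = $user_data->get('mymodule', 1, 'favourite_colour');

// Remove it again.
$user_data->delete('mymodule', 1, 'favourite_colour');
```

Callers never need to know they are talking to the alternative backend, which is the point of overriding at the container level.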

Simple, right?

Jul 27 2015
Jul 27

This is tutorial #2 in the Drupal Commerce tutorial series. In the previous article, we showed you how to add a basic product along with a display to showcase that product in the front end. This article makes product addition and management smoother by using the Inline Entity Form module.

By the end of this article, you will be able to present users with the ability to choose from multiple products in a dropdown as shown in the screenshot below. What we have done here is create an exclusive custom content type "Inline Form Flashlight Display" and associated a new field "Choose a Flashlight" with it. This field will support the functionality associated with the Inline Entity Form module.

The benefit of this module is that it conveniently allows us to add more than one product to a display. You can easily add, delete or edit multiple products from a single page by clicking "Add content" for the newly created content type.

To get started, you will have to download, unzip and enable the Inline Entity Form module. Go to www.drupal.org/project/inline_entity_form to access the latest tar.gz file.

Let’s get started by enabling the newly downloaded module. Click “Modules” on the top-level menu.

Search for “Inline Entity Form”. Enable this module.

Click “Save configuration” at the bottom of the page.

You will have to create a new content type that can satisfy the module requirements of “Inline Entity Form”. Go to "Structure" -> "Content types". 

Notice the content type “Flashlight Page Display” that was created for the earlier article. Let’s create an entirely new content type. Click "Add content type".

I have entered “Inline Form Flashlight Display” as the title for the new content type. The Description field has been left blank. 

As mentioned in the earlier article, click on Display settings and disable the checkbox against “Display author and date information”. This will ensure that it doesn’t appear like a blog post.

Similarly, go to Comment settings and select “Closed” from the dropdown for “Default comment setting for new content”.

Click “Save and add fields”.

Create a new field here. In the earlier article, we had created a Product field to manage the display. I am typing in "Choose a Flashlight" under Add new field. Under FIELD TYPE, select “Product reference” from the dropdown. Under WIDGET, select “Inline entity form - multiple values”. This widget becomes available once the Inline Entity Form module is enabled.

Choosing “Inline entity form - multiple values” will ensure that more than one product is displayed via the entity form on the product display page.

Click Save at the bottom of the page. A new page appears. Click Save field settings.

Your field has been added.

Scroll further down this page for more configuration. Tick the box against “Product” under Product types that can be referenced. Also, tick the box against Allow users to add existing products. This will allow you to add existing products to the page display.

Select “Starts with” under Autocomplete matching and tick the boxes for all the remaining fields except for Auto generate the product title. 

Go further below and select “Unlimited” under the field for Number of values. Leave the Options List Limit blank to ensure there is no limit on the number of products displayed. Click Save settings.

The content type and field have now been successfully added.

Let’s now create a page display using this new content type. Click Add Content. Click Inline Form Flashlight Display. 

Provide a title for your display. Enter a description only if you want to. I have used “Flashlights for all seasons”. I have entered a description as well – “Choose from our range of flashlights to have an impact on your marketing campaign”.

Go further below and you will see how the Inline Form functionality has been implemented for this display. You will be able to either add a new or existing product. Let's click Add existing product.

Type in the first few letters over here and your product will be autocompleted for you.

Click Add product. The page will be updated as shown below.

Click Edit under OPERATIONS and you will be able to update the product you just added on this page display itself! The level of convenience offered by the Inline Entity Form module is immense. I am going to alter the price, increasing it from $5.40 to $7.25. Click Update product.

The product has now been updated. Let’s click Add new product. Fill in a unique SKU value and price. Notice how a new product can be created from the display page itself. Click Create product.

We are adding another new product.

Here is a summary of the products associated with this display:

Click Save.

The new product display will look as shown below. Notice how multiple products are presented in a dropdown. Inline functionality can be tweaked to help online store customers differentiate products based on attributes such as pricing, brand or color. With this article, you now have basic knowledge of how to use Inline Entity Form to manage products on a page.

Next: Adding fields and metadata to the product

Jul 27 2015
Jul 27

Recently, I wrote a blog post on the benefits of integrating your website and CRM, and Anthony followed up with another on the typical integration patterns you commonly see. Annertech have a lot of experience integrating Drupal websites with various CRMs, so this is the start of a new series on CRM integration where we will go into more detail on some of the more popular CRMs we’ve worked with. First up: Salesforce!

I first started integrating Drupal with Salesforce back in 2009 with Drupal 6 for Trócaire. Since then we’ve worked with it multiple times with both ‘web-to-lead’ integration and ‘two-way synchronisation’ integration, with both Drupal 6 and 7.

With ‘web-to-lead’ integration, the website acts as a data capture mechanism, generating leads for Salesforce. When a user submits a form on the website, the data is instantly sent to Salesforce and a ‘lead’ object is created. With the two-way ‘synchronisation’ option, by contrast, Salesforce remains the canonical data source, but the website can update Salesforce with the latest data captured, and can also pull down the latest updates from Salesforce to ensure the user always sees the latest version of the data. This is normally used for managing membership details: members can view and update their contact information online while administrators can modify it via the Salesforce UI, and both systems have the latest copy of the data.


When we worked on our first Drupal-Salesforce integration in 2009, only a SOAP interface was available for communicating with Salesforce. Since then they have introduced a REST interface, which offers some benefits over SOAP, including being faster, more efficient, and easier to integrate with.


The module landscape has also changed with Drupal 7. The powerful Salesforce Suite is still the go-to module for any Drupal-Salesforce integration, but it changed drastically between Drupal 6 and 7. The main change is that it is now entirely entity driven. This means it will only create or update records in Salesforce when an entity is created or updated. This is great for updating Salesforce with new users, new commerce shop orders, or any other Drupal entity you may have on your site. However, this means that webform submissions are not supported, which I’ll discuss in more detail shortly.

Salesforce Suite provides a number of hooks you can leverage to manipulate the data before it gets sent/received, as well as hooks for altering the mappings and other useful integration points. The main one we availed of was hook_salesforce_push_params_alter() which allows you to alter the data that gets sent and add additional mappings. I found this one particularly useful for conditional mappings - e.g. if the data that gets sent for a mapping varies based on the data entered or a combination of values entered.
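As a sketch of such a conditional mapping, something like the following could work. The hook parameters are simplified and the Salesforce field names are invented for illustration; check salesforce.api.php in the Salesforce Suite module for the exact signature:

```php
<?php

/**
 * Implements hook_salesforce_push_params_alter().
 */
function mymodule_salesforce_push_params_alter(&$params, $mapping, $entity_wrapper) {
  // Hypothetical conditional mapping: only push a phone number when the
  // contact has opted in; otherwise strip it from the outgoing params.
  if (empty($params['Contact_Opt_In__c'])) {
    unset($params['Phone']);
  }
}
```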

Salesforce Suite supports both push and pull integration, so it is suitable for web-to-lead and two-way synchronisation. For pull integration, the module will pull down the latest version of the data from Salesforce. This pull operation is usually triggered in some way, for example when a user goes to edit their profile, or renew their membership, etc. However, it could also be triggered on a periodic cron run. Whatever you use to trigger it, just be sure not to issue too many API calls as you’ll just slow down the site and probably upset the folks at Salesforce! Once triggered, the associated entity is updated with the newly fetched data.


The reliance of the Salesforce Suite module on entities posed a number of problems for us, however. Most sites we create have a number of webforms, from lead generation forms to ‘volunteer with us’ type forms and donation forms. For new sites, there is an alternative to the Webform module: Entityform. With the Entityform module, every time the form is submitted a new entityform submission entity is created. As it’s an entity, there is no problem using the Salesforce Suite module to send the submitted data to Salesforce.

However, it’s not always feasible to use the Entityform module. This was the case for us where we had an existing site with a large number of webforms already in place, including one that used the Pay module for processing donations. Instead we opted to use the Salesforce Webform Data Integration module. Unfortunately this doesn’t integrate with Salesforce Suite and uses the old SOAP interface, so you will have to configure the Salesforce credentials twice, potentially in two different ways. There is a patch in the issue queue which integrates the two modules to avoid this issue, and there are a few other patches from the issue queue that you will need too.

There is an older Salesforce Webform Integration module, maintained by myself for Drupal 6, but it relied on the old non-entity-driven Salesforce Suite architecture, so I’ve deprecated it in favour of the Salesforce Webform Data Integration module. Other Salesforce-webform integration modules have been released since and have matured over the past couple of years, though I have yet to try them. Given the current lack of activity on the Salesforce Webform Data Integration module, these might be worth considering for newer projects, but Salesforce Webform Data Integration is still the one most actively in use. There’s a good comparison of these quite similar modules on drupal.org which is worth a look before choosing which module to use for your project.

However, one of the nice features about the Salesforce Webform Data Integration module is the ability to configure, on a per-form basis, whether that form should send data to Salesforce and then on a per-field basis configure which fields in Salesforce the data should be mapped to. We found this particularly useful when the client wanted to be able to create new forms and manage the mappings themselves without having to come back to the developers.

As this module works with webform submissions, and not with entities, it only supports web-to-lead integration. If the data in Salesforce is updated, there is no way to update the website with the latest changes.

It also provides a number of hooks, including hook_salesforcewebform_data_alter(), which, similarly to the Salesforce Suite hook, allows you to manipulate the data before sending it to Salesforce. For example, we had to change the single on/off checkbox values from 1 and 0 to TRUE and FALSE.
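That checkbox conversion might be sketched like this. The hook's exact parameter shapes may differ from what is shown here, and for brevity the loop naively converts every field, whereas in practice you would target only the checkbox fields:

```php
<?php

/**
 * Implements hook_salesforcewebform_data_alter().
 */
function mymodule_salesforcewebform_data_alter(&$data, $node, $submission) {
  // Salesforce expects the strings TRUE/FALSE for checkbox fields,
  // while webform stores single on/off checkboxes as 1 and 0.
  foreach ($data as $field => $value) {
    if ($value === '1' || $value === 1) {
      $data[$field] = 'TRUE';
    }
    elseif ($value === '0' || $value === 0) {
      $data[$field] = 'FALSE';
    }
  }
}
```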


One of the features all of the Salesforce modules seem to be lacking, and most external-CRM integration modules for that matter, is the ability to record which transactions have made it through to the CRM successfully.

On all external CRM integrations we’ve had to implement, I think we’ve had to maintain a custom logging mechanism to track which submissions to Salesforce have made it through and which ones have failed, and with what error. For example, submissions can fail to get through if there is a problem with the data (like the 1/0 vs TRUE/FALSE issue mentioned previously) or if a network issue prevented them from reaching Salesforce. To implement this functionality we were able to use hooks like hook_salesforcewebform_submission_pre_send() and hook_salesforcewebform_submission_post_send() to record submissions sent to Salesforce, and then update their status afterwards.
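A sketch of that logging approach, using Drupal 7's database API. The log table, its schema, and the hook parameter shapes here are assumptions for illustration only:

```php
<?php

/**
 * Record a submission before it is sent to Salesforce.
 */
function mymodule_salesforcewebform_submission_pre_send($submission, $node) {
  db_insert('mymodule_sf_log')
    ->fields(array(
      'sid' => $submission->sid,
      'status' => 'pending',
      'created' => REQUEST_TIME,
    ))
    ->execute();
}

/**
 * Update the log entry once we know whether the send succeeded.
 */
function mymodule_salesforcewebform_submission_post_send($submission, $node, $result) {
  db_update('mymodule_sf_log')
    ->fields(array(
      'status' => empty($result->success) ? 'failed' : 'sent',
    ))
    ->condition('sid', $submission->sid)
    ->execute();
}
```

A status column like this is what makes the retry interface described below practical: failed rows can simply be queried and re-sent.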

We’ve also provided a user interface for the client to find submissions that failed, and they can either force it to reattempt the submission, or wait until it attempts it again on the next cron run.

So, that's a short introduction to some of the technical side of implementing your Drupal website with a Salesforce CRM. If you think your organisation could benefit from some of our knowledge in this area, why not give us a call on 01 524 0312 or drop us an email at [email protected].

Jul 27 2015
Jul 27

This was our 9th critical issues discussion meeting in a row to be publicly recorded. (See all prior recordings.) Here is the recording of the meeting video and chat from Friday, in the hope that it helps more than just those who were on the meeting:

[embedded content]

If you also have significant time to work on critical issues in Drupal 8 and we did not include you, let me know as soon as possible.

The meeting log is as follows (all times are GMT real time at the meeting):

10:08 WimLeers

10:08 WimLeers

10:08 WimLeers
10:09 Druplicon
https://www.drupal.org/node/2524082 => Config overrides should provide cacheability metadata [
=> 147 comments, 39 IRC mentions

10:09 WimLeers
10:09 Druplicon
https://www.drupal.org/node/2429617 => [PP-1] Make D8 2x as fast: SmartCache: context-dependent page caching (for *all* users!) [
=> 226 comments, 21 IRC mentions

10:10 WimLeers
10:10 Druplicon
https://www.drupal.org/node/2499157 => Auto-placeholdering [
=> 2 comments, 3 IRC mentions

10:14 pfrenssen
10:14 Druplicon
https://www.drupal.org/node/2524082 => Config overrides should provide cacheability metadata [
=> 147 comments, 40 IRC mentions

10:14 pfrenssen
10:14 Druplicon
https://www.drupal.org/node/2525910 => Ensure token replacements have cacheability + attachments metadata and that it is bubbled in any case [
=> 176 comments, 29 IRC mentions

10:18 alexpott
10:18 Druplicon
http://drupal.org/node/2538228 => Config save dispatches an event - may conflict with config structure changes in updates [
=> 6 comments, 1 IRC mention

10:20 alexpott
10:20 Druplicon
https://www.drupal.org/node/2538514 => Remove argument support from TranslationWrapper [
=> 12 comments, 4 IRC mentions

10:25 WimLeers
lauriii: welcome!
10:29 lauriii
WimLeers: little late because I'm in a sprint and was helping people ;<

10:45 alexpott
The upgrade path we're talking about http://drupal.org/node/2528178
10:45 Druplicon
http://drupal.org/node/2528178 => Provide an upgrade path for #2354889 (block context manager) [#2528178]
=> 143 comments, 1 IRC mention

10:52 alexpott
10:52 Druplicon
https://www.drupal.org/node/2538514 => Remove argument support from TranslationWrapper [#2538514]
=> 12 comments, 5 IRC mentions

10:52 WimLeers

11:02 dawehner
11:02 catch
\Drupal\block\Plugin\Derivative\ThemeLocalTask also.

11:19 alexpott
berdir: is talking about http://drupal.org/node/2513094

11:19 Druplicon
http://drupal.org/node/2513094 => ContentEntityBase::getTranslatedField and ContentEntityBase::__clone break field reference to parent entity [
=> 36 comments, 1 IRC mention

Jul 27 2015
Jul 27

As one of Canada’s most successful integrated media and entertainment companies, Corus has multiple TV channels and websites for each channel.

It had been a challenge to display multiple channels' live schedule data on their websites. All the data comes from a central repository, which is not always available. We had used the Feeds module to import all the schedule data, with each channel website keeping a live copy of it. Things got worse because of the way we updated the program items: we deleted all the current schedule data in the system and then imported it again from the central repository. Sometimes our schedule pages became empty because the central repository was not available.

Pedram Tiv, the director of digital operations at Corus Entertainment, had a vision of building a robust schedule service for all channels. He wanted to establish a Drupal website as a schedule service provider - content as a service. The service website downloads and synchronizes all channels' schedule data. Our content managers can also log in to the website and edit any schedule item, and the site keeps all the revisions for the changes. Since the central repository only provides raw data, it is helpful that we can edit the scheduled show title or series name.

I loved this brilliant idea as soon as he explained it to me. We are building a Drupal website as a content service provider; in other words, a CMS for other CMS websites. Scalability is always challenging for a modern website. To make it scalable, Pedram added another layer of cache protection: an S3 cache between the schedule service and the front-end web servers. With it, the schedule service can handle more channels and millions of requests each day. Front-end websites download schedule data from the Amazon S3 bucket only. We create and upload seven days' worth of schedule data to S3 with a cron job. Every day, it uploads thousands of JSON schedule files covering the next seven days for different channels in different time zones.
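The daily export could be sketched roughly as follows. Every function name below except drupal_json_encode() is invented for illustration; a real implementation would use the AWS SDK for PHP and the site's own schedule storage:

```php
<?php

/**
 * Implements hook_cron().
 *
 * Publishes the next seven days of schedule data, one JSON file per
 * channel, time zone and day, to the S3 bucket the front ends read from.
 */
function mymodule_cron() {
  foreach (mymodule_get_channels() as $channel) {
    foreach (mymodule_get_timezones($channel) as $timezone) {
      for ($day = 0; $day < 7; $day++) {
        $date = date('Y-m-d', strtotime("+$day days"));
        $items = mymodule_load_schedule($channel, $timezone, $date);
        $key = "schedules/$channel/$timezone/$date.json";
        mymodule_s3_put($key, drupal_json_encode($items));
      }
    }
  }
}
```

Because the files are plain static JSON, the front-end sites never need to touch the schedule server directly, which is what makes the seven-day grace period possible.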

This setup offloaded the pressure on the schedule server and lets it serve unlimited front-end users. It gives a seven-day grace period, allowing the schedule server to be offline without interrupting the service. One time, our schedule service was down for three days, yet the front-end sites were not affected because we had seven days of schedule data in the S3 bucket. Using S3 as another layer of protection provides excellent high availability.

Our schedule service has been up and running for many months without a problem. There are over 100,000 active nodes in the system. For more detail about importing large amounts of content and building an efficient system, see our other blog posts about this project.

Sites that are using the schedule services now:

Jul 26 2015
Jul 26

So, it turns out the Drupal CMS can be beautiful. I kid you not! Anditko has updated the Adminimal theme with a material skin based on Android Lollipop. I've mentioned Adminimal before, an admin theme that greatly improves the look and feel of Drupal’s CMS, and the latest update takes it that step further into the land of stunning.

The updated design takes the best features from Lollipop whilst maintaining the additional features and added usability that Adminimal brought with it.

Page layouts

Pages as a whole have a warmer feel and make full use of the material design guidelines.

The definition of blocks and fieldsets within pages has been updated, as have tables.


Buttons have been recoloured and designed with hover and click effects:

Select lists

Checkboxes and radio buttons are also included in the redesign, with added animations when they are selected and unselected.

All in all I’m loving the new skin; it also paves the way for other developers to create skins of their own.

Obviously, as this is still in dev, there are some minor issues with layout, and I would also like to see some more work done on fields when adding/editing content. With continued development the issues will be worked out, and we may even see additional design work.

Check out the latest version and download it from the Drupal project page.

Jul 25 2015
Jul 25

Faster, more secure, more maintainable. Three nice benefits we get from our new standard Drupal server architecture.

This year we're replacing our old "traditional" LAMP stack with an entirely less pronounceable LNDMPS version. We still use Linux, MariaDB and PHP, of course, but instead of Apache we've moved to Nginx, and we've added Docker and Salt.

If you're a technical person and haven't heard of Docker, you must have been offline for a couple of years. Docker is a system for managing Linux containers. You might think of Linux containers as a form of "cheap virtualization." However, the way the Docker community has come to use them is more like a chroot jail -- a way of isolating a single process into a container that protects the rest of the system if that process gets compromised -- rather than a full operating system running multiple processes. So the best practice is to have a separate container for each necessary service, not a single container with the whole set.

Host layout, container linking

So far we have put PHP, MariaDB, Apache Solr, and a bunch of other supporting services into containers. On our production servers, we have kept Nginx, Postfix, and DNSMasq outside the containers and installed directly on the host.

When using Docker, each container has a different IP address bridged on the host, and can expose different TCP ports, or sockets in the filesystem, to allow connections. We configure DNSMasq on the host so that you can reach any container on the host by its name, and set each container to use the host's DNS. Using this pattern, we can link all the containers together by a simple hostname, without having to link specific containers together.

There are a couple gotchas with this approach:

  • There is a lag before the host DNSMasq picks up new container addresses -- we currently have an update script that polls every 10 seconds for changes, and Nginx needs a reload to see a change in a container IP address.
  • We ran into problems having a wildcard DNS entry set up (*.freelock.com) when the hostname was part of that domain -- most requests for a hostname that was not fully-qualified resolved to the wildcard DNS entry instead of the local DNSMasq entry. We didn't end up solving this -- we just removed the wildcard DNS entry and then everything worked fine.

We also run a local Postfix instance on the host that relays mail to our mail server. Inside each container that might need to send email, we've added/configured SSMTP and pointed it at the host.

Also on the host: all data, databases, deployment tools (git, drush, composer, etc), cron jobs (using drush for Drupal cron).

Security improvements

Our philosophy on production Docker use is that containers are ephemeral -- they get destroyed and recreated all the time, and so there should not be anything in the container we care about losing.

Docker is still a pretty young technology, and it clearly opens up some new avenues of attack, which need to be carefully considered. For example, adding somebody to the "docker" group quite effectively gives them full root access to the entire host. These are early enough days that I'm not entirely confident somebody gaining a shell on the host couldn't somehow attack the Docker process itself to gain root access and modify whatever they want on the host.

That said, I'm reasonably confident that Linux containers do an effective job of restricting processes running inside them to just what's visible to those processes. And in the vast majority of website attacks we see (5 so far this year), the attacker plants a malicious script in an executable environment and runs it. So the first place to secure is the executable environment -- PHP itself.

In our former LAMP setup, we used filesystem permissions to prevent the Apache web user from writing anywhere on the filesystem other than the directories holding the assets Drupal manages on disk (images, videos, aggregated CSS/JS, etc.), and to prevent execution in the directories where Drupal can write.

With Docker, we take this one step further: we mount the web root into the Docker container as a read-only volume -- so even if an attacker somehow gained root through the PHP interpreter itself, the attacker still cannot plant their executable code into the site.
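As a rough sketch (the paths, container name and image are invented, not our actual setup), such a container start looks like the following; the command is assembled as a string here so the sketch can be inspected without a running Docker daemon:

```shell
# Hypothetical illustration of the read-only code mount described above.
# The code volume is mounted :ro, while the Drupal-managed files
# directory stays writable.
run_cmd="docker run -d --name example-php \
  -v /var/www/example.com/web:/var/www/html:ro \
  -v /var/www/example.com/files:/var/www/html/sites/default/files \
  registry.example.com/php-fpm:5.6"
printf '%s\n' "$run_cmd"
```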

A couple other big wins here: we support a few "legacy" installations of older Drupal and some non-Drupal sites -- we're now able to contain and isolate those from other sites on the same server, so a compromise on one site cannot hop over and infect another site. And we're able to easily run different versions of PHP on the same server.

Performance improvements

Docker gives us no performance improvements whatsoever, but it doesn't penalize us either.

Our new architecture, running Nginx and PHP-FPM, is providing very noticeable performance improvements. Sites that formerly took 3 seconds to load now take 0.3 seconds for the HTML, and by moving away from mod_php, the webserver can handle many more simultaneous connections for downloading assets.
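The handoff from Nginx to a PHP-FPM container is a plain FastCGI proxy. Here is a minimal sketch of a per-site server block, with invented hostnames and paths (the container name resolves through the host DNSMasq as described earlier):

```nginx
# Hypothetical per-site configuration; names and paths are illustrative.
server {
    server_name example.com;
    root /var/www/example.com/web;

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        # "example-php" resolves via the host DNSMasq to the container IP
        fastcgi_pass example-php:9000;
    }
}
```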

The speed improvements are especially noticeable for admin users working on their sites.

Managing upgrades

I keep hearing people rave about how much easier it is to manage upgrades with Docker. And I pretty much completely disagree. In a production environment, if you want to upgrade a container for a new PHP release, for example, it's quite a bit more complex than a simple "apt-get dist-upgrade." You have to:

  1. Build a new Docker image.
  2. Stop the old docker container.
  3. Remove the old container.
  4. Create and run a new replacement container based on the new image.
  5. Fix all the links with the other parts of the system.

This is not simple.

Fortunately, Salt can manage this entire process!

Salt is a configuration management tool, very similar in purpose to Puppet, Chef, Ansible, CFEngine, and others. I think people who think Docker is a useful deployment/configuration tool don't have experience with one of these far more powerful configuration management tools.

These days Ansible seems to be getting a lot more attention than Salt -- my impression is Ansible is easier to learn, but Salt is more powerful. (That sounds familiar!) We've built up some Salt states to manage both containers and sites in containers.

So to upgrade a container, our process now looks like:

  1. Build a new docker image.
  2. Push into our private Docker registry.
  3. Run salt state.highstate and let Salt do the rest of the work!

Salt pulls the new image down to each host, and if it detects that the image a container is based upon has changed, it stops/removes the container and starts a new one in its place. Then the "update-dns" script we deployed earlier (also via salt) detects the new IP address for that container's name, and reloads Nginx.

This process has not been flawless, and so at this point we're running the "highstate" command that applies these updates manually, so we can address any issues that might arise -- so far we've had two failures, both of which I chalk up to the relative immaturity of Docker:

  • Docker container filesystem type -- this is configured when Docker is installed. On an Ubuntu host, it sounds like the current recommendation is the older AUFS filesystem, on other systems, Devicemapper seems to be the current standard. Our original systems ran AUFS, and AUFS upgrades have gone smoothly -- however, the systems we deployed with Salt ended up using Devicemapper, and that's broken multiple times when Docker itself came out with a new release, breaking all containers. We've eventually switched all our hosts to use AUFS and haven't seen further issues.
  • docker-py unable to pull from a V2 registry, errors on container creation -- You can (and should) run your own Docker registry to store Docker images. This allows you to build an image once and distribute it across your infrastructure. We only started using Docker a few months ago, so we never bothered to deploy a version 1 registry and went straight to V2. However, the Python docker-py library, which Salt and Ansible use to manage Docker, has lagged behind the Docker API, and again after various upgrades it has suddenly stopped working, sometimes at the worst possible time.

We're nearly ready to turn the automatic "highstate" back on, but we want to go through a couple more upgrade cycles to make sure this goes smoothly first. As Docker and the tools mature, I'm sure these issues will be far less frequent.

The biggest improvement

Docker containers are cool. Docker is fun to work with. It's great to be able to quickly roll back to an earlier version of a container -- particularly for some of the one-off servers we support, Docker makes us feel far more confident in the environment and in being able to quickly roll the entire server back to an older version.

But the biggest improvement we're seeing is in the process of creating the Docker containers in the first place: the Dockerfile and the startup scripts. None of this is something we couldn't do before -- it's just that we didn't do it before; we never mapped out the steps of creating our production environment. We had a bunch of ad-hoc scripts and miscellaneous Salt states to assist with adding sites to a server, tuning MariaDB, etc. But this was all jumbled, messy, and hard to maintain.

Now we're getting a much cleaner and well-defined separation of a standard environment build, and the run-time configuration. And that starts with the Dockerfile.

Docker images

You don't have to use a Dockerfile to create a docker image -- you can just start a container based on somebody else's image, make some changes, and "commit" it to have an image. But then you end up with an image that's hard to replicate if something upstream changes.

I've read about people using their configuration management system to update their containers, but this is backwards -- you need to install more software inside your container to make it capable of being managed!

The Dockerfile is a simple recipe for creating an image, and since "docker build" is baked into Docker itself, it's incredibly easy to use.

We see the Dockerfile not just as the recipe for creating a Docker image, but also a self-documenting map of the configuration itself. You do need to think about what variables need to change in different container instances -- for example, our PHP images can be tweaked by passing in different variables for max ram, max clients, and max execution time, and we declare these variables in the Dockerfile so it's easy to see what parameters you can pass at runtime, even though they're not necessary to build the image.
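A Dockerfile fragment in that spirit might look like the following; the package list, variable names and defaults are invented for illustration, not our actual images:

```dockerfile
# Hypothetical PHP-FPM image sketch; packages and defaults are invented.
FROM debian:jessie

RUN apt-get update && apt-get install -y php5-fpm ssmtp \
    && apt-get clean

# Runtime knobs: declaring them here documents what can be overridden
# with `docker run -e ...`, even though none are needed to build the image.
ENV PHP_MAX_CLIENTS=10 \
    PHP_MAX_RAM=128M \
    PHP_MAX_EXECUTION_TIME=120

COPY start.sh /start.sh
CMD ["/start.sh"]
```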

Even though we typically run Ubuntu servers, most of our Docker images end up based on Debian Jessie or other derivative images.

So we have a git repository of Dockerfiles for all our images. When doing an update, we "docker build" the new image and push it into our registry. This means we've been able to move a lot of the build information out of Salt into Docker.

Runtime configurations in Salt, Startup scripts

Inside a Docker container, you generally don't run any kind of init system -- you just run the service itself, in the foreground. So most images need to be able to set up the environment before starting the service, and we end up using a hand-built bash script for this. Most often, this script simply replaces values in a configuration file with the values of environment variables provided to the container when it was started, and then starts the service.

You do need to consider that this script will get called whenever the container is started -- both the very first time it's launched, and also if the host gets rebooted or the container stopped/started for any other reason. Other than that, the startup scripts are very straightforward, because you don't need to consider shutdowns, status checks, or any of the other things you typically need in an init script.
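A minimal sketch of such a startup script follows. The variable names, paths and inline template are invented for illustration; a real image would ship the template at build time and end by exec-ing the service:

```shell
#!/bin/sh
# Hypothetical container entrypoint: substitute environment variables
# into a config template, then (in a real image) start the service in
# the foreground. Names, paths and the template are all invented.
: "${PHP_MAX_CLIENTS:=10}"
: "${PHP_MAX_RAM:=128M}"

# A real image would ship a template at build time; we inline one here.
cat > /tmp/www.conf.tpl <<'EOF'
pm.max_children = {{MAX_CLIENTS}}
php_admin_value[memory_limit] = {{MAX_RAM}}
EOF

sed -e "s/{{MAX_CLIENTS}}/${PHP_MAX_CLIENTS}/" \
    -e "s/{{MAX_RAM}}/${PHP_MAX_RAM}/" \
    /tmp/www.conf.tpl > /tmp/www.conf

# The last step would hand PID 1 to the service, e.g.:
# exec php-fpm --nodaemonize
cat /tmp/www.conf
```

Because the script only templates the config and then execs, re-running it on every container start (including after a host reboot) is safe, which matches the restart behavior noted above.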

The catch is, you can't change any of these environment variables after the container is created. So that means if you do need to change them, you need to create a new container and replace the old.

If you have a dozen variables to set at container creation, you need to keep track of them somewhere. We started by just putting them in our project management system and cutting and pasting the startup line. Docker is developing Docker Compose for this purpose, to let you store these variables in a config file for easy startup and to orchestrate multiple containers.

But we've found Salt able to handle that task very, very well. We've built out a set of salt "states" that automatically provision the necessary containers, in a very elegant way, with the run-time values stored in salt "pillar".

Because we're primarily managing Drupal sites, we've set up pillar data to make it easy to focus on one layer at a time:

  • sites/sitename.sls -- contains information about a particular site: drush alias, git alias, public URLs, database credentials, site root path, assets path
  • server/servername.sls -- defines which containers to create on a particular host, based on which images, along with any changes to the runtime defaults and a list of sites to mount into that container. It also includes each site state file to provision on that server, and designates which container it runs in.
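As a sketch of that layout (every name, path and credential here is invented), the two pillar files might look something like:

```yaml
# sites/example_com.sls (hypothetical)
example_com:
  drush_alias: '@example.prod'
  git: 'git@git.example.com:sites/example_com.git'
  urls: ['example.com', 'www.example.com']
  root: /var/www/example_com/web
  assets: /var/www/example_com/files
  db:
    name: example_com
    user: example_com
    password: changeme

# server/web01.sls (hypothetical)
containers:
  php56:
    image: registry.example.com/php-fpm:5.6
    env:
      PHP_MAX_CLIENTS: 20
    sites:
      - example_com
```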

Once these pillars have been populated with appropriate data, Salt now ensures that a whole bunch of configuration is done on the server:

  • Latest site code is checked out of git
  • Permissions are set correctly for the site code and assets
  • Containers are running with the latest images
  • Site code is mounted as a read-only volume, and assets mounted as read-write inside the appropriate container
  • Nginx has a site configuration for each site, pointing to the appropriate PHP container
  • The Drupal settings file is written with appropriate database credentials and any other variables specific to the production environment

... and there's a couple easy next steps we haven't quite gotten to: scheduling the Drupal cron job in the host, and setting up the user account inside the database.

Other than that, completing a move of a site into this architecture consists of importing the production database, copying over the site assets from wherever they are, running "highstate" again to fix the permissions, and updating DNS to point to the new server!

From a disaster recovery standpoint, this is huge. We used to spend a couple hours dialing in the environment on a new server, working from checklists and error messages as we build it up. Now we simply copy the container configuration to the pillar for a new server, run highstate, and import a backup site database and assets and we're off and running.

This is a bit of a long and rambling post, but I hope it illuminates the big picture of how we use all these different systems to deliver what's really becoming a great result -- a fast, secure, replicable environment for running Drupal sites. We're still new to Docker, but happy to share our experiences further, and if you have any suggestions, questions, or obvious things we're missing, please comment below!

Jul 25 2015
Jul 25

I like to be technology/platform agnostic, but for the last couple of years I’ve built everything on top of Drupal. I get this question a lot: “Why not use something else?”. My answer is usually: “I became so good at using it that it only makes sense to me”.

I tried to come up with some objective reasons, to rationalise my future decisions.

1. It’s open source

Software is the bricks and mortar of your business. If you don’t own it, then someone else has control over your startup. Open source also means no up-front cost for licenses. And since so many people know how to work with Drupal, you are not locked in with any one developer.

2. Integrates with 3rd party services

Do you really want to spend your time building custom payment solutions, analytics or notification systems? We live in a time when there is an API service for everything, and there is also a Drupal module for every popular API service. This enables you to build your solution quicker and cheaper, without developing the integrations yourself.

3. Safe and reliable code

With thousands of people working on the code, Drupal has become one of the biggest open source communities in the world. One of its biggest challenges has been ensuring that the code in all 10,000+ modules is secure and reliable. Drupal has a centralised system for modules, and it is very rare that you would download a module from GitHub or a private website. This enables moderators to control who releases what. There is also a special security team watching over the code.

4. Enterprise oriented software

When I first joined the community, Drupal was compared to WordPress more often than it is today. From one perspective, I would say WordPress has won that battle. On the other hand, Drupal has won the battle for enterprise CMS platforms by far. Why is this important to your startup? It puts you shoulder to shoulder with Twitter, Cisco, Tesla and many others.

5. It’s a safe investment

There is a 90% chance you will fail. So if you pick a technology that you will invest your time in learning, pick something you can use on your next project too. There is also a big demand for Drupal developers out there, if you ever want to get a regular job and take some time off from the startup madness.

Would love to hear from you too, what platforms would you recommend to me, and why?

Jul 24 2015
Jul 24

By Steve Burge 24 July 2015

When you first install the Views module, it comes with several example views.

One of the most popular examples is the Glossary view, which takes a large amount of content and organizes it all by the first letter of the content title. This is useful in a lot of situations, especially when you're creating a directory of businesses or people.

Here's what the Glossary view looks like:

Glossary View in Drupal

Even though the Glossary view exists when you install Views, it's not always easy for beginners to understand. The Glossary view does use some of the more advanced Views features.

In this video below, we show you how to create your own Glossary view:

And, now that we understand how new Glossary views are created, let's take a look at how to edit the default Glossary view:

These videos are part of the Drupal Glossary View class.

Jul 24 2015
Jul 24

Drupal 8, which has been in beta for a few months now, is causing plenty of excitement. In the beginning, Drupal made confident claims that it would be a major step forward:

Drupal 8 will set a new standard for ease of use, while offering countless new ways to tailor and deploy your content to the Web. Easily customize data structures, listings, and pages, and take advantage of new capabilities for displaying data on mobile devices, building APIs, and adapting to multilingual needs.

Drupal 8 has many new features, and Drupal.org also describes it as having a "leaner, meaner core," an "easier migration process from earlier versions," and "in-place content editing tools." Modules and themes will also become more powerful because of Drupal 8's adoption of OOP (Object Oriented Programming).

1. Creating a Bridge for New Developers

And speaking of the new OOP approach, this new feature might be one of the more exciting aspects of Drupal 8.

It's finally building a bridge to new developers.

Drupal 8 is much more compatible with the programming standards of PHP, and this means that new developers who may not know Drupal very well can still come in with their OOP PHP familiarity and contribute to projects.

Besides adding a wider door through which more developers can pass, Drupal 8 is improving things in all the ways you might expect in our current Age of the Smart Device:

2. Drupal 8 is Actually Mobile-First, Not Just Mobile-Friendly

Drupal 8, not surprisingly, will be mobile-first. It's a good thing too. According to SmartInsights.com, 80% of Internet users now own a smart phone, and the majority of digital media consumption is now done on mobile devices. Mobile-first is the new standard for web design, and Drupal 8 is embracing that trend. For example, Drupal 8's built-in themes are all responsive, and the administration toolbar is mobile-first.

3. File System-Based Configuration

The new configuration management system makes it easier to track and deploy configuration changes, with greater consolidation and versatility (which translates into fewer headaches). Here's how Drupal describes it:

Drupal 8 has a whole new configuration system to store configuration data in a consistent manner. All of your site configuration from enabled modules through fields, contact categories to views, are stored with this system. The system is designed to make it much easier than prior Drupal versions to make changes, export site configuration to files, and import those changes back into the site.

4. HTML 5-Based Markup

HTML 5-based markup means, among other things, native input tools that make it simple to design for mobile. It also means that the output templates have simpler elements. It's definitely a much-needed feature. And they were thorough with it, as their initial list of HTML 5 objectives reveals:

The main goals of this initiative will be to implement HTML5 in Drupal core in a way that will:

  • Have the most benefit for end users.
  • Enable contributed modules and themes to evolve using HTML5.
  • Allow theme developers to control where to use the new semantic elements, and opt out entirely if they so choose.
  • Adding support for the new form elements to Drupal's Form API.
  • Adding new semantic elements in core templates in an appropriate way.
  • Adding ARIA roles to markup to improve accessibility.
  • Simplifying style and script elements.
  • Ensuring input filters and functions accept HTML5 elements.

5. Much Easier Editing

Drupal 8 features a new WYSIWYG configuration, two-column editing, improved draft-saving, and the ability to edit content without reverting to the full edit mode.

6. Drupal 8 Will Speak Your Language

Drupal has its eyes on the global prize, and Drupal 8 is clear evidence of this. It has powerful multilingual features: built-in interfaces that can translate anything in the system, software translation updates fetched automatically from Drupal.org, and, according to Drupal, the ability to "build pages with Views language filtering and block visibility."

Powerful indeed.

Contact us or subscribe to our newsletter for more helpful tips and insights about Drupal.

Jul 24 2015
Jul 24

Since February 2012 I have been the maintainer of the Statistics module in Drupal core. Since then I have overseen two pretty big changes to the module.

Admittedly, this was after I tried, back in 2011, to get the module removed from Drupal core.

The first big change was to remove a good chunk of the module, the access log. This tracked things like referrers and visitors, which most people do in a service like Google Analytics. This was committed by Dries to Drupal 8 in early 2013.

The second big change was to use an AJAX call to count a node view. This allowed the module to work even when a site was using a reverse proxy cache such as Varnish. As well as getting committed to Drupal 8, this was also back ported to Drupal 7.

Now I am on a mission to breathe new life into the module.

A patch I’ve been working on for the last 3 years gives the Statistics module a swappable, overridable backend. Since July 2014 it has been possible to define a service as being “backend_overridable”, and this is exactly what I’m proposing for the Statistics module. The patch waiting to be committed creates a service and moves all database queries into it; this can be (and already is) overridden by a contrib module.
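For readers unfamiliar with the mechanism: a service tagged backend_overridable can be swapped out per site. A sketch of what the module's service definition looks like with that tag (the names here are approximate; check the issue for the actual patch):

```yaml
# Approximate shape of the service definition; see the core issue for
# the real patch.
services:
  statistics.storage.node:
    class: Drupal\statistics\NodeStatisticsDatabaseStorage
    arguments: ['@database', '@state', '@request_stack']
    tags:
      - { name: backend_overridable }
```

A contrib module (or a site's settings) can then point statistics.storage.node at its own storage class, for example one backed by Redis or MongoDB.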

This opens the door to a big performance boost by allowing statistics to be written to CouchDB, MongoDB, Redis, or anywhere else.

The next step is to get the Statistics module to count views for all entity types, not just nodes. Counting only nodes is a legacy limitation left over from Drupal 6, which didn’t have entities. Hopefully this can be committed to Drupal 8.1.x.

If you want to help out, please join the issue queue.

Jul 24 2015
Jul 24

This article is the first in a tutorial series that teaches beginners how to configure a Drupal Commerce site. Follow this series to gain a basic understanding on how to build online stores of your choice. If you would like to see the full list of articles in this series, go to Drupal Commerce Tutorial page.

The focus of this article is on installing Drupal Commerce on an existing Drupal installation and then adding a product and displaying it on the site. By the end of this article, you will be able to create a basic product page in Drupal with an “Add to cart” button as shown below:

Getting Started

There are two ways to install Drupal Commerce:

  1. Manual method: You will have to individually download, install and enable modules from drupal.org. This gives you a level of control over the look and feel of your basic e-commerce site.
  2. Commerce Kickstart method: Commerce Kickstart is an independent Drupal installation profile that can quickly set up e-commerce functionality on your machine. It comes with a core and all modules required to make e-commerce a success for your site.

While Commerce Kickstart is an excellent distribution if you need all of the features it provides without much customization, the manual method is better if you want to tailor the store to your needs. We are going to use the manual method in this series of tutorials.


Here are the modules you should install on your machine:

You will have to download the necessary files to your site’s modules folder, and make sure you install all of the secondary modules before installing Drupal Commerce itself. You can use Drush, the Drupal UI, or any command-line tool to perform the installation.

After downloading, go to the Administration panel and enable all newly downloaded modules. Here is a snapshot of me enabling the Commerce module:

After you have enabled all necessary modules, you will notice a change in the top-level menu. A new link has been added – “Store”.

Clicking “Store” will reveal a page as shown below:

This is where your store's configuration and administration will take place.

Adding A Product

We now have a basic online store in place. Let’s start by adding a product to this store. Clicking on “Products” in the window above will reveal a new page:

Click “Add a product” to get the following screen:

Here are the fields you will have to fill:

Product SKU: It is best to follow a consistent nomenclature when assigning this field. To demonstrate, I have chosen RC-Prod-0001. Each SKU must be unique so that database lookups succeed.

Title: Type in your product's name. I have chosen 3-LED Flashlight.

Price: Provide the exact price of your product. I have given a random price – $5.40.

Status: Specify whether the product is active or disabled.

CHANGE HISTORY: This optional field allows you to provide details on every update made to the product.

Click “Save Product” to save the product’s details. The product will then be successfully added and will be displayed as shown below:

Understanding How Content Is Viewed In The Drupal Commerce module

There is a big difference between content created in Drupal in general and content created through the Drupal Commerce module. Content added through Drupal core can be viewed in the Content tab, because each piece of content is a node. This is not the case with the Drupal Commerce module: products are their own entity type, not nodes.

For example, a new product you create won't show up in Drupal's interface unless you specify a Product Display. For this, you will have to create a brand-new content type.

The steps below will explain how to make sure your products are displayed in the front-end through a Product Display, which is basically a node.

Setting Up Product Display

Let’s first create a new content type that can represent product display.

Click “Structure” -> “Content types”:

Click “Add content type”:

The screen below is where you will be able to add a new content type. I have chosen “Flashlight Page Display” as the title because I want the display to be associated with the Flashlight product created above. Type in a description if you want to.

You can use this new content type as a page display for multiple products uploaded in the future.

Click “Display settings” and uncheck the box against “Display author and date information”. 

After this, click “Comment settings” and select “Closed” from the dropdown for “Default comment setting for new content”.

The reason we made these changes to “Display settings” and “Comment settings” is that a product’s display page is not an ordinary content page: most product pages don’t display comments or author information. If you do want users to add reviews to your page, though, don’t make any changes to “Comment settings”.

Let’s click “Save and add fields” to associate the product with this display. Fill in the following:

  • Add new field: Type in Product
  • FIELD TYPE: Choose Product Reference to signify the type of product to store
  • WIDGET: Select Autocomplete text field from dropdown to ensure the product’s name can be searched and found.

Click “Save”. The following page is displayed. 

You don’t have to do anything on this page for now. Just click “Save field settings”. The following page is displayed:

Your product field has been added. Click “Save” at the bottom of the page.

Go to “Content” and click “Add content”:

You will be taken to the following page. Notice that “Flashlight Page Display” is now featured as a content type. Click it to create a product display for your product:

Fill in the following fields in the new page:

  • Title: Type in the title of the product you want highlighted on the page.
  • Body: Provide a description if you want to.
  • Product: This newly created field is the most important part of this page. Enter a few beginning letters/numbers of your product's SKU or name and the field will automatically display it in the format SKU: Product.

The Product field on this same page is as shown below:

Click “Save”.  You will be taken to the newly created product page as shown below:

You have now successfully created a display page for your product. Notice the “Add to cart” button at the bottom. Since we used the “Product Reference” field in the “Manage Fields” page, Drupal Commerce knows that the node is displaying a product and “Add to cart” button is visible by default.

Next: Adding multiple SKUs of a product with different attributes and letting the user choose one to purchase

Jul 24 2015
Jul 24

So, what happened???.... The FreeScholar in me got really tired of hearing developers who are excited about some proprietary solutions/tools that happen to work with Drupal. Some of these solutions are being touted as the way to go in the future. I believe that our community has no shortage of genius, creative minds and brilliant ideas, so I encourage us all to think deeply about the tools we use and our personal freedoms.

About 2 months ago, I invited Richard Stallman to our monthly meetup at MIT. I wanted him to meet the Drupal community and take a look at how we work together on projects that are dear to our hearts, helpful to our communities and good for society – we make things happen. He gave a short lightning talk about free software and hardware, then we had a Q and A with him. Many of us got a better understanding of what his mission is and how we can be a part of the educational outreach for fsf.org. Learning to explain how free software is key to autonomy, privacy and human rights is a big help for the movement.

Next, I invited RMS to NYCcamp... On Saturday July 18th, he gave the keynote speech to a packed house at the United Nations - http://nyccamp.org/keynote/2015-keynote. After the keynote and standing ovation, he connected with the Aegir team and began a discussion on web hosting platforms and free server tools which led to a larger group convening at a strategy session with the Aegir team lead, Chris Gervais and RMS. They led an engaging round table discussion with about 30 people - http://nyccamp.org/session/aegir-strategy-session-richard-stallman-and-c...

What an excellent time - your voice and thoughts are needed, let's free our future with free software as the foundation.

I love Drupal almost as much as I love freedom.

Jul 24 2015
Jul 24

Drupal Commerce is a distribution capable of building e-commerce sites. In this series of tutorials, you will learn how to create a Drupal Commerce site from scratch. You will learn how the individual modules in the Drupal Commerce suite fit together to build an e-commerce store, whether you are selling products, services, or subscriptions. The following topics are covered:

  1. Adding and configuring products
    1. Adding a product and displaying it on the site
    2. Adding multiple SKUs of a product with different attributes and letting the user choose one to purchase
    3. Adding fields and metadata to the product
    4. Managing inventory
  2. Shopping cart
  3. Checkout
    1. Modify the checkout flow to collect more information
    2. Express checkout to increase conversion
  4. Show me the money. Accepting payments.
    1. Credit cards using Authorize.net
    2. PayPal
  5. Adding taxes
    1. VAT
    2. Sales tax
  6. Discounts
    1. Discount coupons for your customers on special occasions
    2. Discount when order total exceeds a certain amount
  7. Shipping
    1. Flat rate shipping for all products
    2. Change shipping rate based on user's location

If you are interested in learning anything else about creating your e-commerce site using Drupal Commerce, please write it in the comments below and we'll write a blog post about it.

Jul 24 2015
Jul 24

Every day, companies and organizations with lots of content are weighing the pros and cons of adopting Drupal. Often, this decision takes the form of “to what extent should we adopt Drupal” - meaning whether an organization will want to move toward managing all, or possibly only some of its content in Drupal. Having chosen some form of the latter (as practical concerns often warrant), organizations and their technical teams must delve into the territory of integrating Drupal with third party or sometimes proprietary data sources.

 We’re going to focus on one specific facet of this problem today: what to do about custom Apache Solr cores that need to be searchable on a Drupal webpage.

When we hear “Apache Solr” and “Drupal” in the same sentence, the first thing that comes to mind is the Search API module and its dependent Search API Apache Solr. This combo is great if you want to index content managed in Drupal (i.e. lists of nodes, products from Drupal Commerce, users, etc.). But imagine that you already have a Solr index, and you’ve spent years customizing it to be exactly what you need. Maybe it feeds multiple existing web properties, or maybe it is fed by an ERP system. Any of these factors would make it troublesome to migrate the indexed content to Drupal.

Fortunately the Sarnia module offers an effective way to bring your custom index into Drupal and at the same time leverage the power of Search API and its Views and Facet API integration.

The Sarnia module provides its own comprehensive installation guide. I followed it and it works, so I don’t want to simply repeat what it recommends. Instead I’m going to focus on a few points of interest that I gleaned while setting up this module.

Search API Apache Solr dependency

One key feature of Sarnia is that although it relies on the Search API Solr module, you do not have to create a Search API Solr server and/or index for this module to work. Sarnia lets you add its own type of server, which accepts Solr connection input. The module automatically generates a Sarnia index when you enable a “Sarnia entity type” for the server (edit the server and click the “Sarnia” tab).

Understanding the Sarnia Entity

The whole point of Sarnia module is so that you can get Search API features to work on indexes managed outside of Drupal, right? Then why does Sarnia module define the “Sarnia entity type” that claims to “represent data from Solr?” At first I thought this implied that the module was going to replicate indexed data in a table. However, Sarnia entity types are rather unusual:


'label' => $type['label'],
'controller class' => 'SarniaController',
'fieldable' => TRUE,
'static cache' => TRUE,
'uri callback' => 'sarnia_uri',
'view callback' => 'sarnia_view_multiple',
'base table' => NULL, // Prevent undefined array index errors from Views.
'entity keys' => array(
  'id' => 'id',
  'revision' => FALSE,
  'bundle' => FALSE,
),

Most importantly, base table is NULL, so entities of this type are not stored. What is the point, then? It turns out that these entity types exist mainly because Search API requires an entity type to work.

You’ll notice that in the “Sarnia” menu scope for Sarnia Search API servers, there are “Manage fields” and “Manage display” tabs. It looks like at some point there was an initiative to allow developers to store field values for Solr documents so that Drupal can remember things about them. However, the Sarnia devs note:

“It is possible to add fields here, but there is no corresponding interface for editing field content; saving content has not been tested, even programmatically.”

I doubt that you would be successful trying to save values for Sarnia entity fields, because the Solr ID is not an integer and the field data tables require integers for entity ids. Fortunately, if you’re using the Sarnia module, saving field data in Drupal about your indexed Solr documents is more of an edge case.

Solr field typing

The Sarnia module attempts to assign types to the fields that it finds in its target index. One weak point of this module is that these field mappings are not very robust. Properties are ingested into Search API as either “text” (if the Solr field is fulltext) or “none” (everything else).

In our case, we needed to use one of the Sarnia fields as a group-by field in one of our Solr queries, and group-by does not work on “text” fields. We needed to convert one of our fields to type “string.” We had to employ the following hook implementations:


/**
 * Implements hook_search_api_index_load().
 */
function mymodule_search_api_index_load($indexes) {
  // Sarnia module only sets a type for fulltext fields, so we set it manually.
  if (!empty($indexes['sarnia_sarnia_test']->options['fields']['ss_field_pattern$url'])) {
    $indexes['sarnia_sarnia_test']->options['fields']['ss_field_pattern$url']['type'] = 'string';
  }
}

/**
 * Implements hook_entity_property_info_alter().
 */
function mymodule_entity_property_info_alter(&$info) {
  // Add a definition for our grouping field. Left to its own, Sarnia module
  // only adds properties for fulltext fields. This causes errors to be
  // thrown when we do our grouping implementation.
  if (isset($info['sarnia_sarnia_test']['properties'])) {
    $info['sarnia_sarnia_test']['properties']['ss_field_pattern$url'] = array(
      'type' => 'string',
      'label' => 'ss_field_pattern$url',
    );
  }
}

The Sarnia devs gave clues on how to produce this code, as they say in a comment explaining how the module field typing generally works:

 “We have to inject the Solr properties both in hook_entity_load() and in hook_entity_property_info()”

Thus I employed a similar approach to alter the Solr properties. We did not test the Facet API integration, but my guess is that some similar work would have to be done to prepare the facetable Solr fields with a data type that Search API deems facetable.


Whether you are an organization adopting Drupal as a CMS but still wanting to use an externally managed Solr core to power web searches, or a Drupal agency going for a “land and expand” strategy by providing service to clients that may want only limited Drupal integration on day one, keep the Sarnia module in mind if Solr search is part of the project scope.

Additional Resources

Preparing for Solr in Four Easy Steps | Mediacurrent Blog Post
Your Intranet on Drupal | Mediacurrent Blog Post

Jul 24 2015
Jul 24

Contributing to Drupal from a Junior Developer’s Perspective


Here at FFW, we are acutely aware that Drupal is an open-source environment, and as such, we appreciate the many thousands of hours that volunteers have put into its development. So, when developers at FFW are between projects, we are encouraged to do "contrib" work, meaning we find open issues in the Drupal issue queue, solve them, and get the fixes pushed out into the community. Until very recently in my year at FFW I was on a single project for one of our largest clients. When I finished my engagement on that project, and before I started another one, I found myself with some free time and the opportunity to work on my first Drupal commit! This is a fantastic company culture and policy; everybody wins because of it, and I feel lucky to work here. At any given time, we always have someone doing contrib work, mainly with Drupal 8 core. It helps us learn as individuals, helps the presence and reputation of our company, and of course helps Drupal, which is the reason we’re all here, and the reason you’re reading this blog series.

That being said, this will be a post about how to contribute and the process involved. There will be no (or very few) code examples; there are plenty of resources online for that. Rather, this is for the novice to intermediate Drupal developer who’s ready to give back to Drupal for the first time and doesn't quite know where to start.

The thing that's so great about open source software, and the way Drupal contrib in particular works, is that anyone can contribute, regardless of past experience. Take me, for example: I've made a career of Drupal for the past five years, but have never given back until now. Shame on me! There's no certification you need in order to contribute, no permissions – just an account at drupal.org and a willingness to learn. And don't worry – like I did at first – you can't break anything. The contrib and approval process is sophisticated enough that only correct, community-approved patches will get committed. There's nothing to fear, so…

Let’s Get Started

First I'll summarize the steps you go through to contribute, then I'll dive into each one, pointing out tips and gotchas along the way. So, at its most basic level, contributing goes like this:

1. Find an issue you would like to, and are able to, contribute to.

2. Download the latest Drupal core to your local machine. I’ve been working on Drupal 8 core issues, so that's where we’ll start.

3. Create a new branch, and download and apply patches that already exist for the issue, so you're working from the most recent version of the code.

4. Complete your work locally.

5. Create a patch and an interdiff (more on that later) and upload them to the issue.

6. Await the automated testing results and recommendations or approval from the community.

7. Repeat steps 4 through 6 until your work is approved and your patch is ready.

8. Get a commit in Drupal core!
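Steps 3 through 5 are mostly git mechanics. As a minimal sketch of the patch-creation part using a throwaway repository – the file name, branch name, and issue number below are placeholders, not a real Drupal checkout:

```shell
set -e
# Stand-in for a Drupal core checkout: a throwaway repo with one file.
demo=$(mktemp -d)
cd "$demo"
git init -q .
git config user.email "demo@example.com"
git config user.name "Demo"
echo "original line" > example.txt
git add example.txt
git commit -qm "base commit"

# Step 3: work on a topic branch named after the (hypothetical) issue.
git checkout -qb 1234567-fix-example

# Step 4: make your change locally.
echo "fixed line" > example.txt

# Step 5: export the uncommitted work as a patch file to upload.
git diff > fix-example-1234567-2.patch
```

Against a real core checkout you would branch from, and diff against, the 8.0.x branch instead; patches are conventionally named with the issue and comment numbers, as in the sketch.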

I know it sounds like a lot, but once you do it a few times it will become easier and faster. Put it this way: my first patch took me almost a day to complete. Now I can create and upload one in a few minutes. Like anything else, the more you do it the faster (and more confident) you get. And it is so worth it to learn this process. I can’t overstate how excited I was when I got my first commit into core! In Monday’s post I’ll get deeper into each step of this process.

Jul 24 2015
Jul 24

When faced with the task of managing videos in Drupal, the number of available solutions might seem overwhelming. Where should you store the videos? What is the difference between CDNs (content delivery networks), cloud storage services, and hosted video solutions? Which Drupal modules should you use with which service?

Drupal Modules

By using some of the most popular modules for video handling, you can quickly set up a reliable video solution:

  • Media module: Although not specialized for video, this widely used module can be combined with others; some cloud services have their own modules integrated with the Media module, too.
  • MediaElement module: Provides an HTML5 player (with Flash fallback). With the MediaElement module you can stream video to mobile devices as well, and the player can integrate with your responsive theme.
  • Video module: Can handle video upload, transcoding, and playback; generates thumbnails; and can deliver videos from cloud systems.

Content Delivery Networks

CDNs are largely distributed systems optimized for delivering content to end-users, over the Internet, with high availability and performance. For video content, they are often coupled with a transcoding server.

They can be expensive, but are a good choice for improving performance and delivering content for high-traffic websites. As the data centers are distributed, they will be faster than the usual hosting providers for most visitors to your site. Also, consider using a CDN if you already have a transcoding server.

The CDN module provides easy Content Delivery Network integration for Drupal sites. It alters file URLs so that files are downloaded from a CDN instead of your web server.

Cloud Storage Services

Cloud storage services aren’t optimized for delivering rich media content on high traffic sites, but can be a cheaper alternative to CDNs – if you don’t have a huge number of videos and your site traffic isn’t very high.

The Media module alone doesn't provide full cloud storing services (like Amazon S3, Rackspace Cloud, and Google Cloud Platform), but together with the Remote Stream Wrapper module, you can reach external files from Drupal.

Some modules for full cloud storage service integration:

  • Storage API is a low-level framework that can be extended to work with any storage service.
  • Google Cloud Storage allows you to replace the local file system with Google Storage. Files will still be managed by Drupal, but instead of being stored on the local server, they will be stored on the Google Cloud Platform.

Cloud-hosted Video Solutions

Hosted video platforms are specialized for video storage, transcoding, and playback. They already include the transcoding software, and additional features like players (for different devices), playlists, ads, live streaming or DRM (Digital Rights Management).

DRM is a robust content protection program that enables publishers to enforce policies and rules for usage of their content.

  • Brightcove offers a highly-scalable secure video hosting platform for delivering and monetizing video across connected devices.

    The Brightcove integration module adds Brightcove videos or playlists to your content, and accesses information from the Video Cloud from within your site. Brightcove actively maintains their Drupal module by paying a company from the community – Pronovix (where I work) – for development and support.

  • The Platform offers different online video packages from an enterprise-class video management system to businesses with smaller video libraries. They also provide a transcoding engine. Their Drupal module, Media: thePlatform mpx, integrates with the Media module.
  • Wistia is a professional video hosting solution with analytics and video marketing tools. In Drupal, you can either use it with the Media:Wistia module or the Video Filter module that has extended Wistia support.
  • Vzaar is a video hosting solution offering quick, secure, and reliable services with much-praised support. Use it with the Media:vzaar module for Drupal integration.
  • Viddler offers Overture, a video hosting platform for enterprise corporations, and Arpeggio, a responsive HTML5 video player that enables you to create interactive video content and timeline commenting. Use the Viddler module for Drupal integration. (Note: The Drupal 7 version is in development, currently for testing only with limited functionality.)

Self-hosted Video Solutions

For an open source solution, you can opt for Kaltura – the world's first open source online video platform – or set up your own hosted video platform with OctopusVideo.

  • Kaltura provides enterprise level commercial software and services, as well as free open-source community-supported solutions for video publishing, management, syndication, and monetization. You can use it as a one-stop solution for all your Rich Media, including images, audio, and documents.
    Using the Kaltura module, you can either leverage the Kaltura hosted platform or grab the free source code and self-host the entire platform.
  • For your own hosted video platform, try OctopusVideo, an open source video transcoding and delivery platform built on Drupal.

Video Sharing Sites

If simply embedding YouTube or Vimeo videos in your Drupal site is enough for your project, you have more options to choose from:

  • Use the Embedded Media Field module and the YouTube integration for the Media module in both Drupal 6 and 7.
  • Use the Media module to embed videos from YouTube or Vimeo, by adding the URL to a field or through an editor.
  • In Drupal 7, use the Video Embed Field module to create a simple field type that allows you to embed videos from YouTube and Vimeo or show their thumbnail previews by entering the video's URL.
  • Use the Media Embed plugin for CKEditor that opens a Dialog where you can paste an embed code from YouTube or Vimeo.


As you see, there is a wide variety of available solutions, differing widely in price and performance: choose the approach that best suits your needs. This overview should help you get started!

Image:"Networking" by npobre is licensed under CC BY 2.0


Jul 24 2015
Jul 24

For sites that do not need user registrations, the most effective way to prevent registration spam is to disable visitor registrations. If you need users to register to post comments, consider using third-party comment services like Disqus and Livefyre.

For some inexplicable reason I missed the memo that said I didn’t have to have user registrations and comments on every Drupal site. On every site I created, I diligently installed a number of anti-spam modules. I settled on Captcha, reCaptcha and Honeypot as my trusted combination of anti-spam registration modules. For me, spam user registrations became a fact of life. I accepted that a few determined spammers would find their way through the protective wall of modules I installed. When doing maintenance on the sites I maintain, one of the tasks I perform is deleting spam user registrations. It took a post on Stack Exchange for me to see something that had always been right before my eyes. The option to turn off user registrations can be found at this path in your Drupal site: admin/config/people/accounts.

Drupal Disable User Registration

The advantage of managing your own list of users is that you have the flexibility to control your engagement with your users. After all, the number of users you have is one measurement of the success of your site. In retrospect I realise that most of the sites I manage are brochure sites. Usually there are one or two administrators from the client organisation who occasionally update the content on the site. That is, if they remember how to do it. Mostly they just send me an email and ask me to update a particular page. I can safely say that at least 80% of the sites I have created do not need users to register. I have spent years battling spam registrations, having accepted that user registration is a necessary feature for every site. I realise now that the opposite is true: few sites need users to register. Not every Drupal site is a community or e-commerce site.

For sites that need comments I recommend using one of Disqus, Livefyre Comments or Facebook Comments. If you really need to let users register, in addition to my listed modules above I also recommend you install the Mollom module. Be sure to set aside some time to clean up your site every other week. Remember time is the most precious resource you have, use it wisely.

Jul 24 2015
Jul 24

Over the last few weeks I’ve been spending a lot of time with Drupal 8 and Composer. This has led to me building a PoC for a client and diving into the issue queues and IRC. In this post I wanted to document some of the processes I’ve been looking at.

Creating a project

The people behind drupal-composer have put together a template; a Drupal project can be started from it using the command `composer create-project drupal-composer/drupal-project:8.x-dev drupal --stability dev --no-interaction`. This will create a folder called “drupal”, and in there you will find a “web” directory containing the Drupal installation. It also downloads drush and two modules, devel and token.

Adding modules

If you open up composer.json in your drupal project folder you will see a repository with the URL https://packagist.drupal-composer.org defined. This is a custom version of Packagist setup by the drupal-composer team. If required packages are not found on Packagist they will get pulled from here.

You will also notice that the “require” section of composer.json is where drupal/token and drupal/devel are added. You can add any module on the Drupal Packagist to this section, then run the command composer update to update your project and download the newly added modules.
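For illustration, the relevant slice of the template's composer.json looks roughly like this (the package names and version constraints are examples only – your generated file may differ):

```json
{
  "require": {
    "drupal/drupal": "8.0.*@dev",
    "drupal/devel": "8.1.*@dev",
    "drupal/token": "8.1.*@dev"
  }
}
```

Adding one more line under "require" and running composer update is all it takes to pull in another module.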

In the “extra” section of composer.json you will see a number of installer paths; these tell Composer (via the required composer/installers package) where to put things. You will see everything goes in the web directory: Drupal core in “web/core”, modules in “web/modules/contrib”, etc. Therefore, when you run composer update to add the new modules you required, they automatically go into the correct directory, ready for Drupal to use. Composer knows these are Drupal modules because the modules have composer.json files too (often dynamically added by the Drupal Packagist, because Drupal doesn’t require modules to have a composer.json yet). In this composer.json the type is set to “drupal-module” for modules, “drupal-theme” for themes, etc.
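As a sketch (the exact entries in your generated composer.json may differ), the installer-path mapping looks something like this:

```json
{
  "extra": {
    "installer-paths": {
      "web/core": ["type:drupal-core"],
      "web/modules/contrib/{$name}": ["type:drupal-module"],
      "web/themes/contrib/{$name}": ["type:drupal-theme"],
      "web/profiles/contrib/{$name}": ["type:drupal-profile"]
    }
  }
}
```

Each key is a destination path (with `{$name}` replaced by the package name), and each value lists the package types routed there.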

Patching modules

Greg Anderson went into this in a lot of detail on the Pantheon blog earlier this week, but using the cweagans/composer-patches package you can define a patch file to use. For example, add the following to the “extra” section of your composer.json file to patch the token module:

"patches": {
  "drupal/token": {
    "Description for reused fields not correct": "https://www.drupal.org/files/issues/token-Fix_description_for_reused_fields-2497251-5.patch"
  }
}

You’ll see the package to patch is defined, then within that a name or description of the patch, followed by the URL for the patch file.

Custom modules

There are a number of ways you can handle custom modules here. You could create a folder at web/modules/custom and just put them in there, or you could add them via Composer. In the PoC I’m working on we have many custom modules that will be added to multiple projects. The custom modules have their own git repos, and if they’re not on Packagist or the Drupal Packagist (which they shouldn’t be if they’re custom modules) we need to tell Composer about the repository. In composer.json, under the “repositories” section, add something like:

{
  "type": "vcs",
  "url": "https://github.com/timmillwood/couchdb_statistics.git"
}

In this case we’re adding a repository that contains the “drupal/couchdb_statistics” package. You could add "drupal/couchdb_statistics": "dev-master" to the “require” section of your composer.json to add this Drupal module to your modules directory.

If you have a lot of custom modules you’re adding to multiple sites, it might be worth setting up Toran Proxy. This is a project by Jordi Boggiano, the developer behind Composer and Packagist. It allows you to proxy your git repos and Packagist. You then add your Toran Proxy installation to the “repositories” section of your composer.json, then require any packages you have there. Give Toran Proxy a GitHub token and it can grab your private repos too.
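Hooking a Toran Proxy instance in is just another entry in the "repositories" section – the URL below is a placeholder for your own installation:

```json
{
  "repositories": [
    {
      "type": "composer",
      "url": "https://toran.example.com/repo/private/"
    }
  ]
}
```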

One thing to note about repositories is:

> Repositories are only available to the root package and the repositories defined in your dependencies will not be loaded.

For the PoC project mentioned earlier I looked at creating a sub-project which just contained a composer.json. This then required all of the common modules, custom and contrib. However, the custom ones were not getting pulled in, because the custom repositories were only defined in the sub-project, not in the root project. Having Toran Proxy as a single source meant that we could add it to the root project and all dependencies could be pulled from there.

Post install

You may have noticed the drupal-composer template has a scripts directory, and this is defined in composer.json as a “post-install-cmd”, which runs after composer install. The script adds settings.php, services.yml and the files directory. You could customise this to do a number of other things: run drush commands, set up a Vagrant box, etc.
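The hook itself is declared in composer.json along these lines (the script path is illustrative – check the template for the actual file name):

```json
{
  "scripts": {
    "post-install-cmd": [
      "sh scripts/composer/post-install.sh"
    ]
  }
}
```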


Composer is here in Drupal 8 and it’s awesome. You can run, develop and deploy your whole Drupal 8 project with Composer. You can add and patch contrib and custom modules, as well as themes, profiles and other PHP packages. The drupal/couchdb_statistics module mentioned earlier requires a CouchDB client; this will all get pulled in via Composer with a single command.

Move over Drush make, this is how you should be running Drupal 8.

Jul 24 2015
Jul 24

I’m in the middle of several Drupal Camps / Cons (any event over 1,000 people is no longer a “Camp”, but that’s for another time) and it’s occurred to me: I can no longer learn by going. Now, this is “learn” in the traditional sense of what I used to go to Camps for (I’ve been coming to camps for 8 years now).

It’s not that I’m advanced beyond my younger self; it’s that this ecosystem is an endless rabbit hole of buzzwords, frameworks, CLIs, VMs, architectural components, issue queues, PRs, and other words that must make almost anyone new to the web immediately freak out and hide. And so, in coming to events I feel I rarely get to learn about implementing BEM or SASS or Drupal Console or Composer or Dependency Injection, because 1 hour or 50 minutes for a talk isn’t close to enough to scratch the surface of these concepts.

What IS important at Camp sessions, though:

  1. Community: hearing from peers and discovering together that none of us knows everything
  2. Learning new buzz words, which is critical because I don’t know what to Google!

For example, I assumed everyone already knew what Drupal Console was, only to find out that most people I talk to go “Duh-huh-waaaa?”. And so, to heap some more buzzwords on you all from the Camp circuit, and why I’m so excited for them :)


Drush

Drush is the original command-line Drupal tool (you’ll see why I frame it that way soon). It allows you to communicate with your Drupal site via the command line, issue commands, routinize boring tasks and do complex, scary tasks as single commands. Other modules can supply Drush plugins, and you can also write standalone plugins without a module.

Drupal Console

Drupal Console is a bit of the new kid on the block. It was originally met with resistance, as it’s another CLI for Drupal, but it has since started to find its place in the early D8 CLI communities that are developing. What’s cool about Drupal Console is that it’s starting to find a happy middle ground of coverage for things that Drush was weak at. What else is cool is that these communities are working together to reduce overlap in capability AND (more importantly) allow each to call the other natively. This means you’ll be able to start execution threads like `drush drupal-console ….` or `drupal drush en views`.

Console doesn’t have support for running off and grabbing Views. It’s much more about building out the things you need to work with your Drupal site, but without writing all the code yourself. Think of it more as a utility to help you work with Drupal on the development side of the house. While Drush has plugins for things like module building, code review and entity scaffolding, they were never its strong suit. This is where Drupal Console is focusing its efforts.

Drupal Console also has Symfony-based plugins, since it’s built against Symfony Console. Drush, on the other hand, is your traditional Drupal architecture, supporting multiple versions of Drupal, etc.

Why you should care

Because if this PR / thread in Drush-ops moves forward, it would mean the two could call each other. This will allow for two different ways of developing for CLI devs: pulling in Symfony components, or traditional Drupal guts. It also gets you a bigger community working on things at the low level, so that stuff at the high level (site building, theming, etc.) is easier and more scriptable. With Drush you’ll (still) be able to build make files and get all the dependencies for your site, and with Console you’ll be able to write new modules and sub-themes a lot faster because of the templating / file trees that it will allow you to build.

As we get towards a stable D8, Drush, and Drupal Console, we'll have so much raw HP under the hood that you'll be able to automate all kinds of things. It also means these communities can tackle different sides of the under-the-hood problem space for getting Drupal going.

For example, I maintain Drush Recipes, which allows for tokenized Drush chaining. This lets Drush call itself, and via arguments you get a lot of stuff done without much effort. Jesus Olivas just made me aware that command chaining is being worked on for Drupal Console. This way you could string together commands in Console (allowing Console to call itself, basically) to get something more functional as an end result, rather than having to manually type the same kinds of commands over and over (he gives an example in the blog post).

The future Drupal Development environment / workflow

Here are a few projects that, if they merge efforts even a little bit in capabilities over the next year or so, will produce some insane automation; working with Drupal will make anything else look painful by comparison (again, looking ahead; right now… yowza!).

What this would give you workflow wise:

  • An Ansible / YML based provisioning of a server, netting all dependencies for D8, Console, Drush, etc
  • You could log in and be presented with a prompt like Nittany Vagrant provides, asking what kind of site you want to build
  • With even minimal logic in the script ("yes, I'd like a Drupal site that's for commerce," for example), we could issue a drush recipe that…
  • Runs a make file, runs `si minimal` on itself, grabs dependencies that weren't in the make file, sets variables, and imports default configuration from features to make the OOTB "distribution" a rock-solid starting point for anyone to build off of.

Then we'd ask other questions, like "What's the name of this client?" Answering something like "bigbank" would allow…

  • Drush Recipes to tokenize the input of Drush to call Drupal Console
  • Console would then be told "Hey, we need to build new modules called bigbank_custom_paygateway, bigbank_helper, and bigbank_theme" and would create all the components for the types of custom modules that we all use on every deployment with anyone
  • Then enable these modules in the new site
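A purely hypothetical sketch of that scaffolding step from the shell. The `bigbank_*` names come from the example above, and Drupal Console's `generate:module` options vary between releases, so run `drupal list` to check what your version supports:

```shell
# Hypothetical scaffolding pass; module names are the example ones above.
# Verify the generate:module flags against your Console release first.
for name in custom_paygateway helper theme; do
  drupal generate:module \
    --module="Bigbank ${name}" \
    --machine-name="bigbank_${name}"
done

# Then hand off to Drush to enable them on the new site.
drush en -y bigbank_custom_paygateway bigbank_helper bigbank_theme
```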

Eventually we can get into automatic sub-theme creation, asking what kind of theme you want to base off of (Zurb, Bootstrap, Mothership, custom, etc.) and automating a lot of the setup in that area too. We could probably get to the point with theming where you can ask Drupal Console for the template files (defaults, obviously) that you want, so that it generates those too.

The future is going to get so insane (buzzword-wise) that we'll need to keep investing in automation just to keep up. We'll tell each other to just download XYZ and run through the workflow; there will be no more "hey, go get all these dependencies and…". No, things will just be awesome!

Now, create a PuPHPet-style interface that builds the YML customizations (or maybe like this thing), asks the crappy bash questions, and tokenizes everything downstream… stuff that system in a simple Drupal site and hand it to a project manager to kick off a new build… and THEY'VE started building the site. True empowerment and liberation of workflows is upon us. Now let's all go drink coffee!

Jul 24 2015
Jul 24

George Miller has a vision of the future. Judging by the non-stop mayhem and desolation that is Mad Max: Fury Road, if I had the same vision, I wouldn’t sleep very much. As a piece of action cinema, however, Fury Road succeeds on every level. I couldn’t look away – but the second time I saw it, I was surprised by the number of carefully rendered details I missed the first time. I want to see it again.

After you emerge into the sunlight and finally manage to blink after two hours of wide-eyed apocalyptic rapture and think about Fury Road as a piece of content produced by a distributed team that had an audience in mind, there are plenty of lessons we can take away from it.

It takes time to make great content

Miller first tried to make Fury Road in 2001, took it up again as a live-action film in 2011, and wrapped photography in 2013. The movie itself didn’t come out until mid-2015. Typically, Hollywood calls that sort of timeline "development hell", and it presages an Ishtar-scale flop.

When you consider recent successes like The Lego Movie (4 years in production) and the fantastic Boyhood (12 years in production), it’s clear that, with the right people involved, movies benefit from allowing directors to realize their vision. Expand that to books and music, and the quickly-created masterpiece is the clear outlier.

To think that your organization can churn out content that best serves its mission without careful thought and robust process, then, would be a mistake.

There’s a long-standing notion that people don’t read on the Internet, backed up by careful research. I believe people don’t read on the Internet because, by and large, the available content is crap.

When you think about your own habits, though, haven’t there been at least a few pieces you’ve read almost word for word? I certainly absorbed the New Yorker’s recent terrifying article about how the Cascadia Subduction Zone is going to reduce Portland to the set of the next Mad Max. And ESPN’s Outside the Lines regularly produces content I read carefully – because they’re well-written pieces about topics that interest me.

Research shows that people who read for pleasure read more carefully. As Slate’s Michael Agger points out, even Jakob Nielsen, one of the founders of "People don’t read on the Internet", believes people will read content that interests them:

Nielsen's idea is that people will read (and maybe even pay) for expertise that they can't find anywhere else. If you want to beat the Internet, you're not going to do it by blogging (since even OK thinkers occasionally write a great blog post) but by offering a comprehensive take on a subject (thus saving the reader time from searching many sites) and supplying original thinking (offering trusted insight that cannot be easily duplicated by the nonexpert).

That sounds to me like the very definition of great content – and great content takes time to produce. I’m not talking about three-levels-of-bureaucratic review time, but about putting effort and craft into your explanations of what your organization believes is important to the world, be it through written word, a video, a podcast, or even just an image.

That can seem overwhelming, particularly when most nonprofits don’t have a cadre of trained writers on staff. There are plenty of tools out there to help you. Editorial calendars. Page tables and content templates. (Yes, content strategy, when it comes to implementation, involves a lot of spreadsheets.)

But it comes down to this: Don’t try to do too many things well. Try to do a few things better than everybody else.

You’re never going to bring everybody on board with your cause, so writing for the masses may not be the most effective strategy. Take time to create at least some content for those who care enough to read it.

Technical infrastructure should be so good, it renders itself invisible

If you take the time to produce great content, you want to make sure it’s displayed in the best possible light – and by that, I mean your users shouldn’t notice the technology at all.

Fury Road benefits from a huge number of practical effects: things look like they’re blowing up because they are, not because some computer rendered its idea of what an explosion should look like. But almost every shot was still digitally enhanced. This allowed Miller to create the impression of large crowds:

Mad Max aerial shot

Fury Road also makes extensive use of compositing to expand its visual palette:

Mad Max compositing

When you’re actually watching the film, however, the "How did they do that?" gives way to breathless enjoyment of the chase. The movie feels real because the effects are integrated so well, they become part of the story instead of superseding it.

The same has to be true of your technology platforms.

Nobody cares what email platform you're using if the contents are interesting and easy to absorb – but they will notice broken HTML. And your constituents aren’t coming to your website to ogle its features, they’re coming for the content. If they notice the underlying functionality, then your technology is not serving your mission.

Stay true to your vision

When you think about it empirically, Fury Road should not have succeeded with a mainstream audience. It’s a two-hour chase scene. Its nominal hero’s face is obscured by a mask for almost half the film’s run-time. Its night scenes were filmed in bright daylight. It prominently features a tanker truck full of breast milk. And yet it has grossed nearly $400 million worldwide.

Fury Road succeeds because it stays true to its director’s vision. George Miller knew what he wanted – the entire film was storyboarded and the cast largely worked without a script – and put exactly that, and only that, on film.

By Google’s count, Fury Road has roughly 3600 spoken words. Even a relatively action-oriented movie like Jupiter Ascending has nearly 9000 – largely because it’s burdened by the presumed need to explain what’s going on to the audience through background exposition:

Your planet is just now entering its genetic age. You understand very little about something which is a vital part of our reality. In our world, genes have an almost spiritual significance. They are the seeds of our immortality. When the exact same genes reappear in the exact same order, it is for us what you would call reincarnation.


Fury Road doesn’t care about telling you what’s going on or why it’s happening, just that it is. Why doesn’t Furiosa have an arm? How did Immortan Joe come to control all the water? It doesn’t matter in the visceral thrill of the chase. We trust Miller because we know he’s thought through all of the backstory and decided it didn’t matter here. He’s right. And cutting the movie to its barest bones serves his vision perfectly.

All that to say: If you produce content for a nonprofit, you have a built-in advantage because you have your Mission, Vision, and Values as touchstones. You know the backstory about why your organization does the work it does, and that can – and should – inform every piece of content you produce.

Think about the story you’re telling in terms of narrative arcs. Most of the content you produce, while illuminating some aspect of your Mission, Vision, and Values, can’t tell the entire story of your work. Instead, capture the interest of your audience, delight them with carefully crafted, finely honed content, and link back to the bigger picture. Hyperlinks were created before content strategy was even a phrase, but they allow us to lay out our story in pieces, tied back to a central narrative.

Prepare for the haters

The Internet is pretty great at disseminating information. Now that it’s easy for anybody and everybody to post their opinions in a public forum, our access to information is limited more by our imaginations than the media gatekeepers of old who decided what story should land on the front page.

The Internet has both improved public discourse – and degraded it. Because here’s the thing: if you have an opinion, it’s almost guaranteed that somebody out there has an opposing point of view. While these used to be confined to local conversations, when you post that opinion online, they can find you. They will find you.

Social media has been great for nonprofits in terms of community building and direct interactions with constituents. But it has also allowed the haters to find each other. What might have been a marginal response to your organization’s work in the past becomes amplified by technology.

In the case of Fury Road, that response was a call for boycott from a group of "meninists". (I know! I had no idea that was a thing, either.)

After the movie came out, noted misogynist Aaron Clarey wrote some pretty hateful things on his blog: "Let us be clear. This is the vehicle by which they are guaranteed to force a lecture on feminism down your throat… This is the Trojan Horse feminists and Hollywood leftists will use to (vainly) insist on the trope women are equal to men in all things, including physique, strength, and logic."

You can read the entire article if you need a little anger in your day, but Clarey ends with a Call to Spitefulness: "So do yourself and all men across the world a favor. Not only REFUSE to see the movie, but spread the word to as many men as possible."

Never mind that everything that happens in Fury Road serves the story, not a philosophy. The plot, such as it is, hinges on the escape of five women from captivity. As director George Miller puts it, "Initially, there wasn’t a feminist agenda... I needed a warrior. But it couldn’t be a man taking five wives from another man. That’s an entirely different story. So everything grew out of that."

He’s supported by Chris Hansen, the director of the film and digital media division at Baylor University: "We’re used to women being in the background, not men. But Miller isn’t doing it as a statement, he’s doing it because that’s what the story calls for."

Because the story your organization is telling supports and builds upon your Mission, Vision, and Values, somebody, somewhere is going to assume you are trying to diminish them in some way. And they will spew their hatred like a firehose.

As part of your content strategy, you need to be prepared for reactions to the information you put out into the world. Carie Lewis, Director of Communications Marketing for the Humane Society of the United States, knows too well that "When sensitive topics come up, trolls come out in droves, and misinformation spreads... In today’s world, the only thing you can do is have a crisis plan in place for what to do if/when you get attacked. Because you just never know what the internet is going to glom onto."

The producers of Fury Road ended up benefitting from the "men’s rights" backlash because it stirred up a controversy that made people want to see the movie even more. We probably can’t hope for that.

Carie notes that, "Internal education is key. Help your staff navigate the waters with social media policies and trainings to protect them as well as the organization."

The HSUS has standards and procedures for responding to negative publicity, and those have been documented. Carie says, "It’s important for us to be honest while at the same time not drawing unnecessary attention to the issue... We develop talking points that address the issue but don’t get into internal details. They get routed through PR, membership, social media, and the executive offices."

Building on that, The HSUS tries "to respond to everyone who comes to us on one of our social media channels with a legitimate question or concern, and within 24 hours. That means all questions unless it's someone we know is just trying to stir up trouble. There are some people who live to cause trouble and that you will never win over. You have to know when to stop, and when to not even start. That comes with time and experience."

As a final word of advice, Carie offers that "One thing I see a lot is organizations trying to talk over the issue. We learned the hard way that approach just doesn’t work; people will see right through you and it will only make matters worse."

Your mission should be at the heart of the content you produce. When that’s the case, it will be easier to defend the work you do – and it will energize the people already passionate about your work.

Everybody wants to be loved, but the prospect of online backlash shouldn’t stop you from crafting great content that articulates the reasons you do the work that you do.

Now go see the movie already!

Thanks to Ivan Boothe and April Lambert for their edits and additions.

All images copyright 2015 Warner Bros. Pictures.

Jul 23 2015
Jul 23


2015-07-22 00:00 - 23:30 UTC


The next beta release for Drupal 8 will be beta 13! (Read more about beta releases.) The beta is scheduled for Wednesday, July 22, 2015.

To ensure a reliable release window for the beta, there will be a Drupal 8 commit freeze from 00:00 to 23:30 UTC on July 22.

Jul 23 2015
Jul 23

Do you need to translate a website but don't have the time or money to pay someone to do it? This training will provide you with an overview of internationalization (i18n), localization (l10n), multilingual issues, and translation basics using the Lingotek Translation module. You will also receive mentored help translating a copy of your Drupal website (or a different one, if you'd rather).

Most of those familiar with adding multilingual capabilities to a website in Drupal 7 know that it's all about corner cases. Every Drupal site is unique! As a result, your multilingual strategy will likely be quite different from others. To maximize your productivity, this training session will be more of a guided workshop where you can work through issues you are facing.

In the morning, attendees will receive a brief overview of the multilingual landscape in Drupal 7. The trainers will cover a handful of the most common considerations site administrators must decide on when making an existing website multilingual. We will also set up and then machine translate and post-edit an example site using Drupal's multilingual capabilities and the free tools available with the Lingotek translation module for Drupal. Then we will discuss the common problems site builders encounter when translating an existing site, and how to fix them.

In the afternoon, attendees will be mentored as they enable a copy of their own website for multilingual and then machine translate it using Lingotek. You will quickly be able to identify problem areas in your site, such as modules or themes that are not ready for multilingual. Our mentors will share best practices for your particular mix of modules and help you come up with a strategy for completing your multilingual projects with time to spare.

Meet the trainers from Lingotek

  • Christian López Espínola (penyaskito), Senior Software Developer
  • Rob Bailey (robertdbailey), Senior Software Developer

Christian and Rob have years of experience working with clients to implement multilingual sites using Drupal. Christian is one of the most active contributors to the Drupal 8 multilingual core initiative, and Rob and Christian are the two primary maintainers of the Lingotek translation module for Drupal 7 and Drupal 8.

Attend this Drupal Training

This training will be held on Monday, 21 September from 09:00-17:00 at the Barcelona International Convention Center. The cost of attending this training is €500 + VAT and includes coffee and pastries before the training, lunch and coffee breaks. A DrupalCon ticket is not required to register to attend this event.

Our training courses are designed to be small enough to provide attendees plenty of one-on-one time with the instructors. However, each training course must meet a minimum number of attendees by 14 August in order for the course to take place. You can help ensure your training course takes place by registering before this date and reminding friends and colleagues to attend.

Register Now

Jul 23 2015
Jul 23

By Steve Burge 23 July 2015

Several times, our members have asked about finding the Node ID of individual pieces of Drupal content. The Node ID is the primary key in the database for Drupal content and it's useful in many situations.

If you don't have the Pathauto module installed, this information is easy to find. By default, the Node ID is directly in the URL of the content.

However, if you have the Pathauto module enabled (as most sites do) the Node ID can be hard to find. Here's the solution ...

  • Visit the content whose Node ID you want to find and look at the tabs. Here's an example of a node, with extra tabs provided by the Webform module:
Find the Node ID of Drupal Content
  • Hover your mouse over one of the tabs. In this example, we'll use the "Edit" tab, but this technique should work for all tabs.
  • Down in the bottom-left corner of the browser, you'll see the URL for the link, and it contains the Node ID. In this example, the Node ID is 2.

Things to note:

  • I'm using Firefox in these examples.
  • Even though the URL for the "Edit" page is actually /content/sponsor-registration, the browser is showing you Drupal's original URL.

You can also use this trick on other parts of the site. For example:

  • Go to Structure > Content, where you can see the original URL by hovering over the "edit" link.
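If you have command-line access, there's also a quicker route: query the database directly. A hedged sketch, assuming Drush is installed and the default Drupal 7 `url_alias` table (swap in your own alias):

```shell
# Prints the internal path for the alias, e.g. "node/2".
drush sql-query "SELECT source FROM url_alias WHERE alias = 'content/sponsor-registration'"
```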
Jul 23 2015
Jul 23


Back in June of 2014, the monkeys headed for Austin, Texas. We stormed the Drupalcon with adventure gear, and a few clever and even controversial tee shirt giveaways. One of the most popular tee shirts was the "We are Drupal - Resistance is Futile" tee.

We have brought it back for you in a free wallpaper version. Feel free to download the zip file (a whack of various sizes are available). We hope you enjoy, and feel free to spread the word.

Jul 22 2015
Jul 22

Attending a conference with thousands of people from around the globe can be quite daunting. It’s easy to get overwhelmed with so many opportunities to network with people from around the world, but once you take the plunge, you’ll find that you have a great time building on your online relationships, advancing business opportunities, and becoming more involved in the community.

With attendees coming from all around the world, DrupalCon Barcelona is so full of chances to network that it can be hard to know where to begin. That’s why we’ve put together some tips to help everyone, from introvert to extrovert, to take advantage of the opportunities for connections at the Con.

As part of a small mini-series of networking tips, we will be providing some helpful hints on how to get the most out of your time at DrupalCon Barcelona.

Preparing to network:

  • Determine your goals for networking at DrupalCon

    As soon as you’ve booked your ticket to Barcelona, set aside some time to think about what you’re trying to accomplish with networking at the Con. Are you looking to find a job? Do you want to meet people you normally only see on IRC? Are you interested in meeting your mentor?

    Your goals will help shape your DrupalCon plans: they can determine which sessions you pop into and which social events you attend in the afternoon. By determining what you’re looking to accomplish, you’ll be better able to plan your day and set yourself up with a schedule to get it all done.

  • Research the people you want to meet

    You can easily see people who have already registered to attend DrupalCon Barcelona on the Community page. With so many people coming out to the event, there are great networking options for everyone. That’s why we’ve made the decision to include titles on the page -- so you can more easily determine if someone might be a good connection for you.

    You can also search our attendee database by name or company if you’re interested in meeting a specific person. To make sure that you are searchable when other people are looking at whom to reach out to, make sure that your Drupal.org profile is updated with your name and Drupal.org handle, your most recent job title, and your current employer.

    Don’t forget to take some time to check over the sessions you’re looking to attend, and do a little research on the presenters -- one of them may be a great connection for you. You can also see who is planning to lead a BoF, and get to know who is interested in the same topics you are, when you browse the BoF pages on the DrupalCon website.

  • Reach out before the Con

    A great way to break the ice before you even get to the Con is to email, tweet, or ping the people you want to meet in Barcelona. Introduce yourself and let them know you’re looking forward to meeting them in Spain, and that you’d like to pick their brains, network, or talk about Drupal. This helps other people plan their schedules, too, as they’ll know to keep an eye out for you. Plus, having a little prior context for your first conversation makes those icebreakers easier.

Time is ticking and DrupalCon Barcelona will be here before we know it, so start doing your research and setting up your schedule. Between sessions, BoFs and the keynotes, make sure to leave yourself some networking time!

Jul 22 2015
Jul 22


Back in high school, Brett Meyer (ThinkShout Director of Strategy) played the entire Ultima Series, right through Ultima IV; Stephanie Gutowski (ThinkShout Community Engagement Organizer) confesses to playing video games “since I was old enough to hold a controller.”
Together, they wrote a nifty article – “Got Game?” – in Drupal Watchdog 5.01 (Subscribe now! https://drupalwatchdog.com/subscribe/2015), analyzing Dragon Age: Inquisition from a Drupalist perspective.

BONUS! A revealing answer to a provocative question: Would you be in a relationship with someone who practices open source?

Jul 22 2015
Jul 22

I'm working on porting Security Review to Drupal 8 as my Google Summer of Code project this year. 8 weeks have passed since the beginning of the coding period, and the port is ready to be reviewed. In the remaining 4 weeks I'm going to address issues found by reviewers, possibly add more functionality, and work through issues filed against the old version of the module, prioritizing those that are already solved in the D8 port.

What is Security Review?

Security Review automates checking many of the configuration errors that lead to an insecure Drupal site and looks for existing vulnerabilities and attack attempts. The primary goal of the module is to elevate your awareness of the importance of securing your Drupal site. 

How can you help?

If you would like to help, you could review the ported module and post your findings in this issue. It helps if you have used Security Review before.

The 8.x-1.x branch of the code can be downloaded from here. For installation instructions check README.txt.

Alternatively, you can use simplytest.me and you won't even have to leave your browser. Start typing "Security Review" in the first input box, choose the 8.x-1.x branch, and start the sandbox! After going through the Drupal installation, enable the module on /admin/modules (Extend) and you are ready to start testing. Note: the module has a Drush command that won't be testable this way.

Developer blog for Week 8

This is mainly a developer blog post, so let's walk through what I've worked on this week.

Added status icons

I've added some icons to the first column of the table on Run & review, and that instantly made it look a lot better. Below are the results.

Security Review status icons

The icons are loaded from /core/misc/icons, which has some weirdly named subdirectories inside it; I'm sure there's an explanation for that, but I haven't looked into it.

Added progress bar (Batch)

I've implemented the Batch API in the hope that it would let the user know which check runs slowly on their system. Sadly, the progress bar doesn't provide that information, as it can't update itself at the right times. Still, it does let the user know that something is happening, and it might prevent a few timeouts, so implementing it was worthwhile.

Wrote tests

I've added a test module that defines 2 security checks. Both checks fill the findings array with some random integers and strings; the difference is that one stores them in the State system and the other does not. This made some tests more controlled (they don't use the real security checks), and it also provides an example implementation of a module that defines security checks.

Fixed code style issues

I've run pareview.sh over the project to check for code style issues. I was stunned by the number of errors it listed, but I've successfully addressed all of them (except false positives). This is the commit that fixed them all.

Jul 22 2015
Jul 22

The Drop isn't the only thing that is moving; so is PHP. If you have a Drupal 6 site you are most likely running PHP 5.3 or older, versions that stopped being supported in 2014 or earlier. Now that PHP 5.5 has moved out of active support, some hosts, such as Acquia, are dropping support for anything older, and sites will be forced to upgrade, ready or not. The good news is that Drupal 6 can be made to work with PHP 5.6, which is actively supported. Drupal 6 core needs just a few patches, but many contributed modules will need to be updated and/or patched.


So what patches and updates does your site need? To find out, you first want to get a list of what errors your site is producing now. If you have dblog running and your watchdog table is empty, you can skip this step. If your site is like most, though, there are a bunch of little errors that will fill the logs and make it difficult to see what problems the upgrade is causing. To make a baseline list:

  • Make sure the dblog module is enabled.
  • Truncate the watchdog table to clear out old errors so we just have the current ones. (Be sure to keep a backup if you have a need for the original error log.)
  • Spider the site as an admin to hit all of the pages, including the admin section. I used wget for it following these instructions from jesstess:
    wget --post-data='edit-name=USERNAME&edit-pass=PASSWORD&next=' --save-cookies=cookies.txt --keep-session-cookies http://EXAMPLE.COM/user
    wget -R logout -r --spider --load-cookies=cookies.txt --save-cookies=cookies.txt --keep-session-cookies http://EXAMPLE.COM
  • Try out some manual things that the spider won't hit like saving a node or editing an existing one.
  • Put this script into a text file called watchdog_distinct.php (you will need to add an opening php tag at the top):
    $result = db_query("SELECT DISTINCT message, variables, location FROM {watchdog} WHERE type = 'php' GROUP BY message, variables");
    while ($row = db_fetch_object($result)) {
      if (isset($row->message) && isset($row->variables)) {
        if ($row->variables === 'N;') {
          $message = $row->message;
        }
        else {
          $message = t($row->message, unserialize($row->variables));
        }
        drush_print($row->location . ' ~ ' . $message);
      }
    }
  • Run the following command via your terminal or command prompt (note: requires Drush to be installed):
    drush scr watchdog_distinct.php > current_errors.txt

The text file, current_errors.txt will now contain a list of the errors your site is currently having that you can compare against.


Now that you have a baseline to compare against, it's time to upgrade and find out what breaks. Do the following steps on a development copy of your site.

  • (optional) Upgrade core and contrib to the newest versions. This isn't strictly required and you could get by only upgrading those that turn out to need an upgrade to work with PHP 5.6. However, this is a good time to get your site completely up to date and the newest versions of contrib modules are most likely to have the necessary fixes.
  • Apply known patches to the modules you are using. See the references section below for links to the three PHP upgrade issue tags on drupal.org.
  • Set your development site to run on PHP 5.6.
  • Go through the same process as above with truncating watchdog, spidering, and running the script, but give the results text file a new name.
  • Compare the errors in the new text file to those in the old one to see what is broken that wasn't broken before.
  • For each of the errors, check for patches or write your own. Make sure to contribute back any patches you make and to tag them with the PHP version number to help others find them.
  • Repeat the testing, patching, and fixing until no new errors occur.
  • Give your site a good manual going over to spot anything the spider missed.
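The comparison step above can be sketched with standard command-line tools. This is a minimal example using demo data in place of the two drush runs; current_errors.txt is the baseline file from earlier, and new_errors.txt is an assumed name for the post-upgrade results file:

```shell
# Demo inputs standing in for the baseline and post-upgrade error lists.
printf 'Error A\nError B\n' > current_errors.txt
printf 'Error A\nError B\nError C\n' > new_errors.txt

# Sort both lists so comm can line them up, then print only the
# errors that appeared after the PHP upgrade.
sort current_errors.txt > baseline_sorted.txt
sort new_errors.txt > upgraded_sorted.txt
comm -13 baseline_sorted.txt upgraded_sorted.txt   # prints: Error C
```

Anything this prints is an error introduced by the upgrade and worth chasing down.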

Once this process is completed with no new errors showing – congratulations, your site is now ready to handle PHP 5.6!


Jul 22 2015
Jul 22

Two of the services Realityloop provide are Drupal Site Audits and Drupal Performance Audits. We almost always see that images used in a site haven’t been optimised, something I find surprising given images are often the weightiest parts of a webpage.

There are really 2 separate places where images can be optimised: in the theme, and in site content. I personally think that optimising theme images is a no-brainer, as you can usually just do it before the site goes live. Content images aren’t something you can really expect your content creators to optimise; luckily we’ve implemented something to deal with that, which I will outline in my next post.

Image Types

There are essentially 2 types of images. Raster images define a grid of pixels, each a different colour, arranged to display an image. Vector images are made of mathematically defined paths that describe each shape and what colour it is bordered or filled with.

Raster images can display nuances in light and shading at their created resolution but cannot be made larger without sacrificing quality. Vector images are scalable, allowing the same images to be designed once and resized for any application.

Here are my key tips for optimising the graphics in your theme.

What filetype... GIF, PNG, JPEG or SVG?

For raster images you’ll usually want to use either PNG or JPEG. SVGs are your go-to web-friendly vector format.

Basically, GIFs are almost always larger than a well-optimised PNG, and JPEGs are usually better than PNGs when there aren’t any sharp edges, text, or transparency.

If you’re not sure whether you should use a raster or vector file, follow this simple rule of thumb: If you’re drawing something from scratch with only a few colors, go with vector. If you’re editing a photo with multiple colors and gradients, go with raster.

Work with the grid (JPEG)

Nearly everyone has seen a heavily compressed JPEG image that shows lots of artifacts; this is because JPEGs are made up of a series of 8x8 pixel blocks. We can use these blocks to our advantage in 2 key ways:

Align rectangular objects to this 8x8 grid:

The above image is not aligned to the 8 pixel grid; its file size is 2.53KB.

The above image is aligned to the 8 pixel grid; its file size is only 1.84KB, using the exact same save settings in Photoshop.

If JPEG images are allowed for upload, I try to make the dimensions of any related image styles a multiple of 8, to work with the 8x8 compression system of JPEG.
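That rule of thumb is easy to apply when scripting image-style dimensions; a minimal sketch (the 150-pixel width is just a hypothetical example):

```shell
# Round a desired image-style dimension down to the nearest
# multiple of 8 so it aligns with JPEG's 8x8 compression blocks.
width=150
aligned=$(( (width / 8) * 8 ))
echo "$aligned"   # prints 144
```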


Use Image Sprites in your theme

Image sprites allow you to reduce the HTTP requests required to load your webpages. You create them by making a single image that contains all of the required images in a grid, and then using CSS to manage the display of the correct part of the image.

The simplest way I’m aware of to create image sprites is to use Compass sprites, though this only works with PNGs.

I personally prefer to use SVGs so that a single asset can be used for all breakpoints within the site; unfortunately the only way I know of to do this is to manually create the sprite using an SVG editor (Inkscape or Illustrator, for example).


Compress your images

Photoshop has “Save for web”, but you can get much better compression using other applications. If you do use lossy compression, I also recommend running a lossless optimiser straight afterwards to get further filesize savings, where the filetype you are compressing supports it.

Lossy Compressors

Lossless Compressors
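As a sketch of what a lossless pass might look like in practice (optipng and jpegoptim are just two common tool choices, not the only options; each is skipped if it isn't installed, so the pass is always safe to run):

```shell
# Losslessly recompress every PNG and strip metadata from every
# JPEG in the given directory. Lossless passes never degrade
# image quality, so this can be re-run at any time.
optimise_images() {
  for png in "$1"/*.png; do
    [ -e "$png" ] || continue
    if command -v optipng >/dev/null 2>&1; then
      optipng -o2 "$png"
    fi
  done
  for jpg in "$1"/*.jpg; do
    [ -e "$jpg" ] || continue
    if command -v jpegoptim >/dev/null 2>&1; then
      jpegoptim --strip-all "$jpg"
    fi
  done
}

# Example: optimise everything in the current directory.
optimise_images .
```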

Learn to use Pixel Fitting when shrinking images with hard lines

For a vector graphic to be displayed, the computer has to perform a translation from the mathematical vectors into something that can be displayed with pixels.

This translation process is relatively simple: the computer takes vector lines, lays them on top of a pixel canvas, and then fills in each pixel that the lines enclose. So, for important icons and logos (really, for all rasterised vector images) you should fit the pixels to the grid and ensure they are as sharp as possible.

An excellent description of this can be found at http://dcurt.is/pixel-fitting

For the greatest reduction in filesize, optimise files manually

To gain the greatest saving in file size you will need to choose a lossy compression method, so you will pretty much need to play with the settings to adjust the image quality versus file size tradeoff.

Stay tuned for my upcoming post about how to implement on-the-fly lossless compression for image uploads on all image fields in your Drupal site.

Jul 22 2015
Jul 22

Leonid Makarov (inqui), Chief Architect at FFW US East joins Ryan to talk about Docker and FFW's (Blink Reaction's) internal developer environment, Drude.


Three Stories


Picks of the Week

Upcoming Events

Follow us on Twitter

Five Questions (answers only)

  1. Skiing with his son.
  2. Revolute (credit card app)
  3. Bungee from a Bridge in New Zealand.
  4. Icelandic Horses.
  5. When he discovered hooks, and stopped hacking core.

Intro Music

"Agony (Coder vs Themer)" - from the DruaplCon Los Angeles pre-note performed by Campbell Vertesi and Adam Juran (starts at 12:37).


Subscribe to our podcast on iTunes or Miro. Listen to our podcast on Stitcher.

If you'd like to leave us a voicemail, call 321-396-2340. Please keep in mind that we might play your voicemail during one of our future podcasts. Feel free to call in with suggestions, rants, questions, or corrections. If you'd rather just send us an email, please use our contact page.

Jul 22 2015
Jul 22

TL;DR: If the internet reflects society, what does your content say about you? Having flexible and accessible content will allow you to adapt to how people consume your content.


"I miss when people took time to be exposed to different opinions, and bothered to read more than a paragraph or 140 characters. I miss the days when I could write something on my own blog, publish on my own domain, without taking an equal time to promote it on numerous social networks; when nobody cared about likes and reshares.

That’s the web I remember before jail. That’s the web we have to save."

I recently read the trending article The Web We Have to Save, by blogger Hossein Derakhshan ('Hoder'), who had been imprisoned in Iran for six years. In the article, he talks about how the internet had changed over that time. Content on the internet has been consumed and discovered in different ways over time, from directory listings, to search, to blogs, and social media. The writer had been an influential blogger (credited with starting the blogging revolution in Iran), but on coming out of jail, he found that quality blogs no longer had the position they once did. Instead, content is largely discovered and read by people in 'streams' on social media apps. Quality can be drowned out; what is important is diluted in amongst the trivial.

Personally, I believe any expression of culture will reflect the society it flows from. The internet is a global society, so it incorporates many different aspects of humanity - good and bad. We see how cultural expressions can reveal something about a society in the news all the time. Music is a creative expression that will inevitably present the good and bad of a culture. Each genre of music often goes together with a subculture, so certain themes come up, often telling the stories of the society that the subculture represents. Football fan culture around the world is another classic case. In one country, it might be middle-class, sanitised and highly commercialised. In another, it is raw, dangerous and associated with criminal activity. The internet has become one of the biggest and most global cultural expressions ever known. While it is diverse, the way we consume it is perhaps becoming less so - something Derakhshan has been in a unique vantage point to spot.

What does the internet say about our global society? If you contribute to the internet with websites and social media activity, what do your contributions say about your place in the world and how you relate to it? I believe that we should all take responsibility to some extent - especially those of us in the business of websites and content on the internet! Can we contribute to a more responsible internet? Are we equipped to do so?

A responsible internet should be diverse and inclusive. I believe a responsible internet could be positive and encouraging, with negative and destructive aspects present but drowned out by the good that the world has to offer. Maybe that's utopian, but it really does come down to decisions made that shape our content and the way we present it. Content should be accessible to any user, on any device. That means thinking of users with accessibility needs, thinking of users that do not speak English as their first language, and building solutions that will be future-proof to some extent.

Drupal, with its community-driven ecosystem of modules, can cater for accessibility and internationalization needs. Drupal 8 will be the most translatable product yet, and will also have responsive design for any screen included 'out of the box'. At ComputerMinds, we have experience in getting the most out of Drupal, and going well beyond its core capabilities, to maximise how 'responsible' our websites are in each of these areas.

Part of being responsible is being prepared to constantly improve, while also having the foundations to cope with potential change. ComputerMinds' bespoke Drupal websites are built with future-proofing in mind - user needs change over time, as does the infrastructure of the internet (servers, browsers, connections) - so we have to think ahead. A site built exclusively for the conditions of the current time may not last long, and will not be able to serve future visitors. We improve as individuals and as a team with every project we work on. Best practices are identified and developed, and we work with the best tools & modules as they mature. We help move projects in the Drupal community along with support, fixes and improvements. We won't stand still, and we'll ensure that the websites we build will last despite the inevitable change of the internet.

As a content management system (CMS), Drupal is well placed to be equipped for a responsible internet. Content will always be the core of what users consume, in whatever form or wherever it appears, so it's essential to have a flexible & powerful CMS for highly manageable content. The Drupal CMS framework, paired with our own depth of experience in modelling content and giving power to editors, enables successful, and responsible, websites. An example of this, where we are also constantly improving, is our use of the new 'Paragraphs' system to make responsive rich content that works for site visitors and site editors. The raft of SEO and social media integration modules for Drupal helps the content of a site to succeed outside of its own domain, whether it appears in users' social media 'Streams', or search engine results. We have plenty of experience in fine-tuning these modules and creating custom solutions to make content work for its intended audience.

So whether the web needs saving or not, while the way we all consume content on the internet changes, the real key is to work towards a responsible internet. We (if you're reading this, I'm including you!) are key contributors to that. Are you equipped to add to a responsible internet which is inclusive, diverse and high in quality? What tools and methods (technical or not!) do you use that help build towards that? Let us know in the comments below!

Jul 22 2015
Jul 22

Hacked sites. Security flaws. Lost data. Loss of trust. Lost customers. Lost revenue. Nightmare.

Just thinking about themes such as these in the media can send a shiver down your spine. It can all seem very daunting, and more than a bit scary, when you start to think about it. This article aims to paint a clear picture of what you should be aware of as a site owner - where the security weak points are, and strategies to avoid them.

My Website Has a Password - That Makes it Secure, Right?

Security, like the technology behind a modern website, has many facets and layers. Alas, merely password protecting your site admin screens is not enough. Having said that, I do remember, in the bad old days, being presented with a site without even that, such that anyone who guessed the admin URL could edit all the site content.

Unlike back then, security is now a serious business and needs to be treated as such.

Starting from the first point of contact and working down, the security layers are:

  • Password protected user accounts
  • Appropriately set permissions for user accounts
  • Protected forms
  • Secure file location
  • Securely written site code
  • Up to date site code
  • Up to date server applications
  • Up to date server operating system
  • A secure location for your server
  • Encryption for traffic to and from your site

User Accounts

It may seem obvious that a strong password is important, but alas, people don't seem to take this very seriously, as evidenced by this article from TechCrunch.

Fortunately, there are Drupal modules to help avoid chronic passwords, e.g. Password Policy and Password Strength to name but two.

A password is only half the battle, though. Drupal ships with a powerful and fine-grained permissions system that allows a site administrator to dictate what users can and cannot do. It is critical that proper attention be paid to user permissions when setting up a site or introducing new features.

Protected Forms

Following on from user account permissions, forms - e.g. content editing forms, comment forms, contact forms - should all be viewed as potential areas of attack and need to be locked down. The safest thing to do is simply restrict who has access to a form, e.g. only site editors can post new content. But in the event that other users can use forms, strategies to limit the potential for harm include using a text filter on text inputs so that no potentially harmful tags, such as <script>, can be used, or enforcing a publishing review policy so that all new content is reviewed by a trusted editor before publication. Other tags to be wary of include <img>, <iframe>, <object> and <embed>.

Secure File Location

If users can upload files to your server (images to go with a blog post or products in an e-commerce catalogue), you need to make sure that the directory these files are being uploaded to is secure and that web users cannot upload files there without using the proper form fields. You also need to make sure that you restrict the types of files that are allowed to be uploaded. We've seen examples where malicious scripts have ended up in the sites/default/files folder and been used to exploit the website and server.
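As a rough illustration of why this matters, here is a sketch of a simple audit for stray PHP scripts in the public files directory (a scratch demo directory is used here; on a real site you would point find at sites/default/files itself):

```shell
# Build a scratch copy of a typical public files directory with a
# planted malicious script, to show what the search would report.
mkdir -p demo/sites/default/files
touch demo/sites/default/files/evil.php

# Any PHP file found under the public files directory deserves
# immediate investigation.
find demo/sites/default/files -type f -name '*.php' -print
```

On a healthy site this search should return nothing at all.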

Securely Written Site Code

Filtering text input is an important concept. Drupal filters content on display (rather than on input), which means that content is passed through several functions before display to make sure that nothing harmful reaches the screen. The Drupal API has various security-focused functions built in, such as check_plain(), filter_xss() and many more excellent sanitization functions available to developers.

There is also the database abstraction layer, which developers can use to avoid SQL Injection attacks and the Form API which protects against CSRF attacks.

Drupal.org also has a guide on how to write secure code.

The Drupal community puts a lot of care and effort into writing code that works and is safe to use.

The Drupal Security Team exists to work with module maintainers to resolve security issues and manage the announcement and release process.

(Note: Annertech is the only Irish Drupal agency privileged to have a member on the Drupal security team.)

Up To Date Site Code

Security issues happen. It is the nature of software to evolve; as machines get faster, new methods are developed, new protocols emerge and people have ideas. What was secure last year could well be insecure next week. This is why we need to keep on top of it.

When someone finds an issue with a Drupal module, they report it to the security team, who will work with the module maintainer to fix the flaw and plan a release. Only when the issue has been resolved and a patch has been accepted is the issue announced and a new version of the module released. Security releases are scheduled and so updates can be planned for - every Wednesday for contributed modules, every third Wednesday of the month for core releases.

Keeping your modules and Drupal core up to date with security releases is of major importance for the ongoing health of your site. Fortunately, if Annertech hosts your website, our hosting service includes all the security updates as part of the package, leaving you one less thing to worry about!

Secure Servers

Even if your site is up to date, written securely, has tight permissions and strong passwords, there are still avenues for attack by dastardly evil-doers. The server itself will have application software on it, e.g. the webserver, and an operating system, all of which will need to be kept up to date - a daunting task for those who are not familiar with the inner workings of servers. Often, many people make the mistake of renting a server without thinking of how it will be maintained, or by whom. A cunning solution is to rent a fully managed server, where the server provider undertakes to do all the maintenance for you.

Needless to say, along with covering all your site security updates, our hosting service also covers all operating system and server application updates so you can rest easy and concentrate on your own business, rather than the upkeep of your website. Our servers live very comfortably in a highly secure data centre in Dublin, so you can be sure that your site is safe.


SSL

SSL, or Secure Sockets Layer (also commonly referred to as Secure HTTP - HTTPS), is a way of encrypting traffic between your web server and a client browser, so that nobody else listening in on the network can find out what information is being sent back and forth. Where once SSL was treated as a luxury for special pages, e.g. e-commerce checkout, it is becoming more and more common, with many sites opting to serve all pages over HTTPS only.

Recently Google announced that it was lending weight to sites which served all pages over SSL. If the mighty Google is using SSL as a factor in its algorithms one can only assume that this level of security is a good thing and a big deal.

Your customers - your site visitors - also love to see the SSL symbol, be it a padlock, a little green shield, or however your browser displays a secure connection. It instills confidence in your site and increases the chance of return visits and sales.

What to Do Now?

All that is a lot to take in, but don't worry. We're here to help. If you're not sure, we can offer a complete site security audit, and between our support service and hosting, we've got your back.

Need help? Call us. We're listening.

Jul 21 2015
Jul 21

By Steve Burge 21 July 2015

Everyone is excited for the launch of Drupal 8.

Come and join us and Acquia on August 20th for a fast-paced Drupal 8 webinar.

Rod will introduce you to all the major advances in Drupal 8, covering user-friendly features including the mobile-friendly admin interface and the in-place WYSIWYG editor, plus improvements in theming and module development.

What you’ll walk away with

  • An overview of all the key innovations in Drupal 8
  • How to talk to your team about what’s coming in Drupal 8
  • A greater level of confidence in choosing Drupal 8 for your next project
  • An appreciation of the freedom that comes with Drupal 8’s move to widely-accepted coding standards

Webinar Details

  • Date & time: Thursday, August 20, 2015 01:00 PM EDT
  • Duration: 60 minutes
  • How to register: Visit this link


Jul 21 2015
Jul 21

Submitted by petednz on July 21, 2015 - 12:52

Drupal 8 is likely to be released around September. Fuzion have been driving the initiative to have CiviCRM ready to roll when Drupal 8 is released. We have CiviCRM currently working on the latest beta, and old friends like Views are working fine. But there is still lots to do to get Rules, Entities and Webform working. We expect a MIH (make it happen) campaign to be launched soon, but we have already had some generous sponsors for our work to help get us this far.

You can give it a try at http://civid8.fudev.co.nz/ and you can help progress this work by chipping in here: http://www.fuzion.co.nz/civicrm/contribute/transact?reset=1&id=4, or you can request that we set you up with your own test suite.

If you are happy to try out your own installation, details can be found at https://civicrm.org/blogs/torrance/give-civicrm-drupal-8-test-out