Jul 31 2015

React.js, CouchDB, Node.js, de-coupling Drupal; if any of that sounds cool to you, then this is the podcast for you. Kyle Hofmeyer gathered several of the Lullabots who helped create the new lullabot.com to learn what kind of wizardry was used to make this thing purr like a happy kitten. Jared Ponchot talks about the advantages this process provided for him and his design team. Sally Young talks about the guts of the site and the magic that went into making this de-coupled Drupal site a success. We are also joined by Kris Bulman, Wes Ruvalcaba, and Betty Tran as they share their experience building the site. From front-end advantages to lazyboyDB, this podcast has it all.

Jul 31 2015

Mmmm… love that new website smell!

Some history

It's been nearly 10 years since we launched our first company website at lullabot.com. During that time, we've done five full redesigns of the site. The company has grown from two people to 62. We've expanded from a small Drupal consulting and education company to a full-service agency with a complete Design team, dedicated front-end developers, and of course, the expert Drupal back-end development which has always been our foundation.

As we've grown, our site design has reflected our focus and skills. The first site that Matt and I put together back in 2005 was intentionally sparse – not exactly beautiful, but functional and simple to maintain for just 2 or 3 people. As we hired talented designers and skilled front-end developers, site redesigns became more complex. In 2010, we split our Drupal education services into Drupalize.Me and the main focus of lullabot.com became our client services work, showcasing our design and development projects and sharing insights from our team.

Revving up the new Lullabot.com

The newest iteration of Lullabot.com is our most ambitious to date. As with most of our client engagements, the project started with research. Our Design team interviewed existing and potential clients, site visitors, and the Lullabot team to understand how people were using our site – what they wanted to get out of it, and why they visited. Our team distilled all they'd learned into goals and early wireframes for the site. They then worked with our Development staff to try to come up with the most flexible way of achieving these goals so that we could have full control of the site in ways that Drupal often doesn't afford. They wanted full <html> to </html> blue-sky design of any arbitrary page on the site without losing Drupal's amazing content management capabilities.

The technical team settled on a decoupled, isomorphic approach using Facebook's React, Node.js, CouchDB (a noSQL database) and Drupal as the backend CMS.

Content management is what Drupal does best, and it happens through a purpose-built subsite where the Lullabot team can log in to post articles and podcasts and manage their bios. Drupal pushes content into CouchDB, which exposes a REST API for React to consume. React is an isomorphic library (its code can run both on the server and in the client), which means that when a visitor first visits the site, they receive the HTML of the entire page. From then on, navigation happens client-side, updating just the parts of the page that differ from the current one. Furthermore, React is written to be completely backward compatible with older browsers.
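To make the flow concrete, here is a minimal sketch of the kind of code that consumes a CouchDB view response. All names and the document shape are hypothetical illustrations, not Lullabot's actual schema:

```javascript
// Hypothetical sketch: querying a CouchDB view with include_docs=true
// returns {"rows": [{"doc": {...}}, ...]}. A plain mapping function like
// this can run identically on the server (Node.js) and in the browser,
// which is the essence of the isomorphic approach.
function mapCouchRows(viewResponse) {
  return viewResponse.rows.map(function (row) {
    return {
      id: row.doc._id,
      title: row.doc.title,
      type: row.doc.type, // e.g. "article" or "podcast"
    };
  });
}

// Example response in CouchDB's view-result shape:
var response = {
  rows: [
    { doc: { _id: "a1", title: "Hello", type: "article" } },
  ],
};
console.log(mapCouchRows(response)[0].title); // "Hello"
```

A React component would then render the mapped objects: on the server for the first request, and in the browser for every navigation after that.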

Our clients are often in need of API-driven native mobile apps, television-based apps, and content ingestion on connected devices. We've implemented these things in less holistic ways with our clients in the past. But the new Lullabot.com gave us a chance to experiment with some methodologies that weren't quite tried-and-tested enough to recommend to our clients. But now that we've had a chance to see the type of flexibility they give us on lullabot.com, we'll be adding this to the array of architectural strategies that we can consider for our clients in the future.

Look ma, no hands!

The results are amazing: high speed, high performance, and superlative flexibility. In layman's terms, this means our Design and Front-end people can go crazy – implementing blue-sky ideas without the usual Drupal markup constraints. The new site is fully responsive. Articles and portfolio work pages can have giant, dazzling, full browser-height background images or videos. Articles have big text that is easy to read at any scale, from large desktop monitors to the smallest phone screens. Furthermore, we did everything with an eye toward blazing-fast page loads. We omitted jQuery, trading convenience in the development process for speedy page loads. Then we looked at every HTTP request, every image, every library to make sure our website was as snappy on an older smartphone as it was on the desktop. Best of all, we off-loaded much of the heavy lifting to the client side with React.

Design-wise, the new site is uncluttered, sparse, and relatively simple. But whether you're looking for our vast archive of articles or podcasts, information about what services Lullabot offers, who we've worked with and what we've done, or you're curious to know what it's like to work at Lullabot, it's all there.

Over the coming months, we will be writing a series of articles and doing a few podcasts talking about different aspects of the new site. Please subscribe to the Lullabot email newsletter below and you'll be the first to know when new articles are published.

Jul 31 2015

I've been working with Drupal 8 for a long time, keeping Honeypot and some other modules up to date, and doing some dry-runs of migrating a few smaller sites from Drupal 7 to Drupal 8, just to hone my D8 familiarity.

Raspberry Pi Dramble Drupal 8 Website

I finally launched a 'for real' Drupal 8 site, which is currently running on Drupal 8 HEAD—on a cluster of Raspberry Pi 2 computers in my basement! You can view the site at http://www.pidramble.com/, and I've already started posting some articles about running Drupal 8 on the servers, how I built the cluster, some of the limitations of at-home webhosting, etc.

Some of the things I've already learned from building and running this cluster for the past few days:

  • Drupal 8 (just core, alone) is awesome. Building out simple sites with zero contributed modules, and no custom code, is a real possibility in Drupal 8. Drupal 7 will never feel the same again :(
  • Drupal 8 is finally fast; not super fast, but fast enough. And with some recent cache stampede protections that have been added, Drupal 8 is running much more stable in my testing—stable enough that I was finally comfortable launching a site on Drupal 8 on these Raspberry Pis!
  • My (very) limited upload bandwidth isn't yet an issue. I only have 4-5 Mbps up, and as long as I host most images externally, serving up tiny 8-10 KB resources for normal page loads allows for a pretty large amount of traffic without a hiccup. Or, more importantly, without interfering with my day-to-day Internet use as a work-from-home employee!
  • It's really awesome being able to see the live traffic to the servers using the LEDs on the front. See for yourself: Nginx Load Balancer Visualization w/ LEDs. It's fun watching live traffic a few feet away from my desk, especially when I do things like tweet the URL (immediately following, I can see all the requests come in from Twitter-related bots!).
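The arithmetic behind that bandwidth claim is easy to sketch; the numbers below are taken from the post, rounded to the low end (4 Mbps) and the high end (10 KB):

```javascript
// Rough capacity of a 4 Mbps uplink serving ~10 KB page loads.
var uplinkBitsPerSec = 4 * 1000 * 1000; // 4 Mbps (low end of 4-5 Mbps)
var pageBytes = 10 * 1024;              // ~10 KB per page load
var pagesPerSec = uplinkBitsPerSec / (pageBytes * 8);
console.log(Math.floor(pagesPerSec)); // 48 page loads per second
```

Dozens of requests per second is far more traffic than most personal sites ever see, which is why the uplink isn't the bottleneck.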

I'm hoping to continue writing about my experiences with Drupal 8 (especially on the Pi cluster), etc. in the next few weeks, both here and elsewhere!

Jul 31 2015

By Nick Savov 31 July 2015


One of the biggest factors to a slow page load is unoptimized image sizes. The bigger the image, the more time it takes the browser to load it.

This tutorial will show you the two free tools that I use to optimize images quickly.

These services will be useful whether you use WordPress, Drupal, Joomla or any other platform.

Background information

When you access a URL, it sends your request to the host server. The server then sends over the requested information.

Three main factors determine the total speed of the page load:

  • Your speed: internet connection, hardware, and software speed all contribute to this
  • The host speed: internet connection, hardware, and software speed all contribute to this
  • The total size of the information being transferred

Image optimization focuses on the 3rd part, the information size. The smaller the images, the less that's needed to be transferred. But, you don't want the images too small so that they are blurry or users can't see them. Finding the right balance is best.

The faster the site, the more users like it. Also, it will rank better in Google and other search engines.

Crop and scale down: WebResizer.com


Keep your images as big as you need them, and no more.

New developers make the mistake of using HTML to resize the image after it's loaded. But, that slows down the page load. The browser still downloads the big image.

If your image will only be displayed at 175px by 175px, you don't need it at 2000px by 2000px. Scale the image down to the size you need.
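As an illustration (the filenames are made up), the difference looks like this:

```html
<!-- Anti-pattern: the browser still downloads the full 2000px file,
     then scales it down to 175px -->
<img src="photo-2000px.jpg" width="175" height="175" alt="Product photo">

<!-- Better: resize the file itself to 175px before uploading -->
<img src="photo-175px.jpg" width="175" height="175" alt="Product photo">
```

Both render at the same size, but the first one wastes bandwidth on every page load.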

I use WebResizer to crop and scale down my images. It's free, easy to use, and effective.

Optimize the weight: ImageRecycle.com

Once you're done cropping and scaling in WebResizer, upload the finished image to ImageRecycle.com. It will reduce the size even more, about 50-80% more. And it's free!

Voila! You now have an extremely optimized image, perhaps with a 90%+ reduced size from the original.

A Note of Caution

These two services are web services, so the images are uploaded to 3rd-party websites. As a result, I only recommend this approach for public images.

For images that contain sensitive information (ex: contractual information), use secure personal software. Focus on software that only allows you to have access to the images.

For example, GIMP would be a good option. It stores the files on your computer and doesn't share them with anyone (even its developers).

Jul 31 2015

Many website owners have recently received an email from Google with the title, "Googlebot cannot access CSS and JS files".

It doesn't matter whether you're running WordPress, Drupal, Joomla or another platform entirely. Google has sent these emails to hundreds of thousands of sites.

I'm going to explain the issue that Google is complaining about and how you can resolve it.

What is the email that Google is sending out?

Google Webmaster Tools has been sending out this notification to sites:

Google systems have recently detected an issue with your homepage that affects how well our algorithms render and index your content. Specifically, Googlebot cannot access your JavaScript and/or CSS files because of restrictions in your robots.txt file. These files help Google understand that your website works properly so blocking access to these assets can result in suboptimal rankings.

Here's what the email looks like:


What does the message mean?

This message is all about Google's new focus on responsive design.

Your robots.txt file is telling Google not to use some of your CSS and/or Javascript files.

Google now needs these files to test your site for mobile friendliness. Without them, your ranking will go down. We covered almost exactly the same problem back in April during Google's "Mobile Armageddon".

What is the solution?

Go to Google's Mobile-Friendly Test and check your site. Here's a screenshot of a site that's received Google's e-mail.

Thankfully, Google's tester is very informative and points out the exact issue for this particular site. It's that the robots.txt file is blocking Google from accessing the resources it needs.

You can find the exact resources that are blocked by clicking on the "Show resources" link on the right-hand side of Google's tester. Compare the files there to your robots.txt to determine what needs unblocking.


How to fix the problem

The solution will depend on the CMS you use, because each platform stores CSS and Javascript files in different locations.
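As a hedged illustration (any blocked paths on your site will differ by CMS), explicit Allow rules like these in robots.txt let Googlebot fetch your assets even when a directory is otherwise blocked:

```
User-agent: Googlebot
Allow: /*.css$
Allow: /*.js$
```

Googlebot honors the `*` and `$` wildcards, so these rules match any CSS or JS file regardless of which directory it lives in.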

Test your site again

After saving your robots.txt file, go back to Google's tester and run another test. If Google shows that your site is mobile-friendly, you've likely fixed the problem.

Jul 31 2015

Hi, Drupalers! Enjoying your summer? Time to make plans for your awesome autumn! We have checked the forecasts, analyzed the position of the stars, and written PHP scripts to determine the luckiest date and place for you. Here you go, the answer is: October 17-18, Lviv Euro DrupalCamp. The time and place cannot be changed ;)

Yes, it’s officially announced! And welcome to its new website!

Great news for everyone! We are happy to say that Lviv Euro DrupalCamp 2015 is officially announced and its new amazing website is launched. Those who have not yet been to Lviv Euro DrupalCamp during previous years will visit the website, learn more about the event and know it’s time to pack their bags ;) Those who have already been to Lviv Euro DrupalCamp will become even more sure that it’s time to pack their bags again and get ready for something unforgettable :)

Drupal reports + great fun = 100% joy for real drupalers

What are we all going to do at the Camp? Discuss our favourite Drupal and listen to interesting reports from cool IT geeks, improve our skills and learn a lot of new things. But that’s not all! We will make new friends and have a great time in Lviv. Yes, we have prepared a really special surprise for you. Have we intrigued you? Keep reading! We will tell everything later in this post.

Three times more options for complete drupalization

At Lviv Euro DrupalCamp 2015, cool Drupal reports will come in three streams. This means you will have a three times greater choice of the most delicious “dishes” from the best Drupal “menu”. Enjoy! Latest trends, useful tips, hot discussions! You will be a part of the great Drupal show. Have interesting ideas? Want to deliver a report yourself? You're welcome to submit the form.

What about the place? It’s great, fantastic, unbe-LVIV-able!

The host city of Lviv Euro DrupalCamp 2015 deserves special attention. The ancient, mysterious, warm, homelike Lviv, with the scent of coffee, the taste of chocolate and breathtaking architecture, welcomes you! You will discover interesting places at your every step. Especially when it comes to our big surprise ;)

A unique quest through the unique city

We won’t keep you waiting any longer. Our special surprise for all Lviv Euro DrupalCamp 2015 participants is an unforgettable quest through the city of Lviv. Whether you have already been to Lviv or are just going to discover it for the first time, we promise that you have never seen it from such an angle.

Ticket prices: early bird catches the discount!

We have prepared flexible pricing options for Lviv Euro DrupalCamp 2015. It all depends on the time when you decide to buy your ticket. Of course, it’s wise to be an early bird!

Golden autumn is the time for golden opportunities. Enjoy your best October with Lviv Euro DrupalCamp 2015!

Jul 31 2015

How does Drupal work? Let's find out.

For years now I've wanted to dig through Drupal core, line by line, and understand how the big pieces do what they do. I'm finally doing that, and writing up my notes as I go.

Drupal 7 Deconstructed is the in-progress result of that.

If you've ever wondered what happens in the bootstrap process, or how Drupal's Form API works, or how exactly Drupal figures out which menu callback to run per page request, then this is the place to go.

It's just getting started, and so far I've only gone through the bootstrap process and the menu router, but I'm having a great time and learning a ton, so I expect to fill it up quickly.

Who could benefit from this?

Any developer who has ever wondered how Drupal works could get some value out of reading this. You'll need to know at least a little about Drupal development to understand parts (for example, I don't explain what hook_menu() is when talking about the menu router), but you shouldn't need to be an expert or anything.

If you feel like that describes you, but you don't understand a part, please let me know so that I can make it more approachable.

Want to help?

If you're interested in helping out, the best thing to do would be to keep an eye on the repo and proofread or review things as they're written.

Pull requests are also greatly appreciated, whether you want to fix a typo or submit a whole new chapter.

Or, if nothing else, just let me know if you like this idea! Knowing that this could be helpful to people besides just me is a huge motivational boost to keep things moving.

What's the end game?

I don't know. I could see this staying on GitHub forever, or being published on Leanpub, or ending up as a blog series.

Any suggestions?

Why Drupal 7? What about Drupal 8?

I chose Drupal 7 because it still has a pretty long shelf life left. Drupal 8 Deconstructed definitely needs to be written though, and I'd love to dive into that after 7 is complete.

What about contrib?

I would love to take apart some of the more commonly used contrib modules like Views, CTools, Panels, Webform, Pathauto, etc., as well, but one step at a time!

Please check out Drupal 7 Deconstructed and let me know what you think so far!

Jul 31 2015

We met again today to discuss critical issues blocking Drupal 8's release (candidate). (See all prior recordings). Here is the recording of the meeting video and chat from today in the hope that it helps more than just those who were on the meeting:

[embedded content]

If you also have significant time to work on critical issues in Drupal 8 and we did not include you, let me know as soon as possible.

The meeting log is as follows (all times are CEST real time at the meeting):

[11:03am] jibran: I think it is sorted by name
[11:03am] jibran: the order in the hangout
[11:03am] WimLeers: y
[11:07am] jibran: We have to look at google hangout code base for that.
[11:08am] WimLeers: https://www.drupal.org/node/2499157#comment-10172426
[11:08am] Druplicon: https://www.drupal.org/node/2499157 => [meta] Auto-placeholdering [#2499157] => 5 comments, 4 IRC mentions
[11:11am] WimLeers: amateescu's issue link: https://www.drupal.org/node/2336627#comment-10160850
[11:11am] Druplicon: https://www.drupal.org/node/2336627 => Deadlock on cache_config (DatabaseBackend::setMultiple()) [#2336627] => 39 comments, 24 IRC mentions
[11:12am] WimLeers: plach: yay for vacation :D
[11:12am] GaborHojtsy: VACATIOOOOOON!
[11:12am] GaborHojtsy: sometime, sometime :)
[11:12am] alexpott: https://www.drupal.org/node/2542762 is the nearly ready issue
[11:12am] Druplicon: https://www.drupal.org/node/2542762 => hook_entity_type_update doesn't get the entity in the new revision after addTranslation and setNewRevision [#2542762] => 11 comments, 4 IRC mentions
[11:13am] alexpott: https://www.drupal.org/node/2542748 is the gnarly update issue
[11:13am] Druplicon: https://www.drupal.org/node/2542748 => Automatic entity updates are not safe to run on update.php by default [#2542748] => 21 comments, 7 IRC mentions
[11:15am] plach: WimLeers: :)
[11:15am] WimLeers: alexpott: yay :)
[11:15am] dawehner: https://www.drupal.org/node/2540416
[11:15am] Druplicon: https://www.drupal.org/node/2540416 => Decide whether we need hook_upgrade_N()/upgrade.php front controller [#2540416] => 27 comments, 4 IRC mentions
[11:16am] WimLeers: dawehner: cache tables are auto-created
[11:16am] naveenvalecha|af left the chat room. (Read error: Connection reset by peer)
[11:16am] WimLeers: but yeah, router table etc… #sadpanda
[11:19am] WimLeers: The issue that originally turned it from a separate PHP file into a route + controller: https://www.drupal.org/node/2250119
[11:19am] Druplicon: https://www.drupal.org/node/2250119 => Run updates in a full environment [#2250119] => 21 comments, 1 IRC mention
[11:34am] WimLeers: plach: ROFL
[11:34am] WimLeers: plach++
[11:43am] WimLeers: "a foam of circles" lol
[11:48am] alexpott: https://www.drupal.org/node/2542748
[11:48am] Druplicon: https://www.drupal.org/node/2542748 => Automatic entity updates are not safe to run on update.php by default [#2542748] => 21 comments, 8 IRC mentions
[11:55am] dawehner: WimLeers: well but those tables aren't auto fixed
[11:56am] dawehner: WimLeers: so just imagine what happens if you need to change the cache_ tables
[12:00pm] WimLeers: dawehner: ohhh!
[12:02pm] WimLeers: plach: is that the church bells in Venice that I'm hearing?
[12:03pm] WimLeers: dawehner: lol
[12:03pm] WimLeers: :)
[12:05pm] plach: WimLeers: yeah, sorry :)
[12:05pm] WimLeers: plach: made me feel like I was on vacation, ever so briefly
[12:05pm] WimLeers: :D
[12:05pm] plach: :)
[12:23pm] dawehner: alexpott: are the issues the new thing or the solutions ;)
[12:24pm] alexpott: dawehner: well we have better ideas

Jul 30 2015

By Steve Burge 30 July 2015

There are several tools that make designing for Drupal much, much easier.

In particular, we recommend the Theme developer module and also Firebug.

Many of you may know about Firebug already. It's a browser tool that allows you to inspect any CSS, HTML or JavaScript elements.

Think of Theme developer as a Drupal-specific version of Firebug. Using Theme developer you can click on any element of your Drupal site and get a breakdown of how it was built.

This video is part of a complete Drupal theming video class.

Note: please make sure to install an older version of the simplehtmldom API module. You need version 7.x-1.12, not the latest release.

Jul 30 2015

Making themes for Drupal, especially advanced ones, has never been an easy task: it requires a considerable amount of Drupal knowledge and, in most cases, at least a bit of programming. So it comes as no surprise that, despite the popularity of Drupal, web designers are reluctant to create themes for it. Hopefully it will become a bit easier with the release of Drupal 8, but there is still a lot of work to do. The module I'm going to introduce can considerably simplify theming and eliminate or reduce the programming required for making almost any sort of Drupal theme.

When we create themes in Drupal, there are a great number of recurring tasks we have to do, like adding IE conditional comments, removing or replacing some core or contributed modules' CSS/JS files to prevent conflicts with the theme, putting some JavaScript at the bottom of the page, or even adding inline CSS or JS.

Unfortunately, we can't do any of these common tasks using Drupal's theme .info file. Surprisingly, however, we can do most of them using Drupal 7's JS/CSS API! But not easily, and not without programming. So, as a themer with no knowledge of programming or Drupal's API, we have no choice but to work around Drupal and directly modify the HTML (as most Drupal themers do). By doing so, we not only lose all the great features that Drupal's modularity brings, like all sorts of CSS/JS optimizations, CDN support, and so on, but we also have to manually resolve the problems this causes for core and contributed modules' UI and functionality.

Wouldn't it be great if we had total control over CSS/JS files via the theme .info file, without having to know programming? That's exactly the purpose of the CSS JS Total Control module. It extends Drupal's theme .info file, adds loads of new features for handling JavaScript and stylesheets, and is fully compatible with core and all the related contributed modules. No more programming or working around Drupal to handle JavaScript and stylesheets.

Download the module from [here] and start using it right away :) Don't forget to send feedback!

So let's have a look at the supported features:

  • Full support for drupal_add_css and drupal_add_js parameters and even more!
    • Adding external files
    • Defining where to include it : header / footer
    • Adding inline css/js
    • Whether to display on all pages or not
    • Defining style/script group : theme / module / core
    • Weight (the order of adding to the page)
    • Support for the defer parameter
    • Enable/Disable caching per style/script
    • Enable/Disable preprocessing
    • Enable/Disable using core
    • Adding attributes like id to stylesheet/javascript include tags
    • Support for IE conditional comments for both styles and scripts
    • Defining style media : print/all/screen
  • Manipulating existing styles/scripts
    • Creating a white-list or blacklist to decide which style/scripts should be added to the page
    • Possibility of replacing and overriding core and contributed modules styles and scripts using only the info file
  • Possibility of altering the scripts and styles (hook_js_alter and hook_css_alter support for Drupal 6)
  • Compatible with most style and script manipulation modules
  • Adds a theme_path variable to be used by template files, plus a css_js_total_control_get_theme_path function

Some examples for demonstration. You can read the full documentation, plus practical examples, [here].

Replacing core jQuery!

scripts-settings[filter][rules][0][function] = regular_expression
scripts-settings[filter][rules][0][pattern] = %misc/jquery|jquery_update%
scripts-settings[filter][type] = blacklist

scripts-extended[js/vendor/jquery.min.js][scope] = header
scripts-extended[js/vendor/jquery.min.js][weight] = 0
scripts-extended[js/vendor/jquery.min.js][group] = core

Adding an inline script at the bottom of the HTML!

scripts-extended[js/menu-effect.inline.js][scope] = footer
scripts-extended[js/menu-effect.inline.js][type] = inline

Adding a stylesheet only for IE 7

stylesheets-extended[css/font-awesome-ie7.min.css][condition-string] = if IE 7

Adding an id to a stylesheet's include HTML tag (mostly useful for dynamically changing the theme style via JavaScript)

stylesheets-extended[css/menu/styles/lblack.css][media] = all
stylesheets-extended[css/menu/styles/skins/lblack.css][attributes][id] = custom_menu

Moving a script before all the other scripts

scripts-extended[js/vendor/jquery.min.js][scope] = header
scripts-extended[js/vendor/jquery.min.js][weight] = 0
scripts-extended[js/vendor/jquery.min.js][group] = core

Adding an inline script at the bottom of the page (prints the content of the file)

scripts-extended[js/menu-effect.inline.js][scope] = footer
scripts-extended[js/menu-effect.inline.js][type] = inline

Adding a JavaScript library (relies on the Libraries module's API to load it)

scripts-extended[easing][type] = library
scripts-extended[easing][version] = default

Adding settings to the Drupal.settings JavaScript variable (we can use these settings later in our custom JS files)

scripts-extended[mythemename][type] = setting
scripts-extended[mythemename][setting][name] = special

Allowing only necessary stylesheets and removing the rest to prevent conflicts with theme styles

stylesheets-settings[filter][rules][0][function] = regular_expression
stylesheets-settings[filter][rules][0][pattern] = %settings|admin|misc|jquery_update%
stylesheets-settings[filter][type] = whitelist

The END.

Jul 30 2015

It is becoming more and more common, with continuous integration and testing, to do all configuration through code. This is true of enabling modules as well as reverting Features, using hook_update_N() to carry those actions out. The problem with just putting module_enable() in an update hook is that even if the module enable fails, the hook_update_N() is considered a success, so there is no way to re-run that specific update hook. With a pair of helper functions, it is possible for the update hook to be considered a failure if a module does not enable. Considering it a failure allows you to resolve the issue and re-attempt the update hook.

Here is what the update hook to enable two modules might look like. Notice it calls a custom function rather than just module_enable().

/**
 * Enables book and og_book.
 */
function mymodule_update_7001(&$sandbox) {
  $modules = array(
    'book',
    'og_book',
  );
  $return = mymodule_module_enable($modules);

  return $return;
}

Here are the two helper functions that enable the modules and check to make sure they were actually enabled.

/**
 * Check to see if the modules are actually enabled.
 *
 * @param array $modules
 *   An array of module machine names to check for being enabled.
 *
 * @return string
 *   Message indicating the modules are enabled.
 *
 * @throws DrupalUpdateException
 *   Calls the update a failure, preventing it from registering the update_N.
 */
function mymodule_check_modules_enabled($modules = array()) {
  $modules = (array) $modules;
  $t = get_t();
  // Check to see if each module is enabled.
  foreach ($modules as $module) {
    if (!module_exists($module)) {
      // This module is not enabled, throw an exception.
      throw new DrupalUpdateException($t('The module @module was supposed to be enabled by this update, but was not. Please investigate the problem and re-run this update.', array('@module' => $module)));
    }
  }
  $module_list = implode(', ', $modules);

  return $t('The modules @enabled were enabled successfully.', array('@enabled' => $module_list));
}

/**
 * Enables an array of modules and checks to make sure they were truly enabled.
 *
 * @param array $modules
 *   An array of module machine names to enable.
 *
 * @return string
 *   Message indicating the modules are enabled.
 *
 * @throws DrupalUpdateException
 *   Calls the update a failure, preventing it from registering the update_N.
 */
function mymodule_module_enable($modules = array()) {
  $modules = (array) $modules;
  $enable_good = module_enable($modules);
  if (!$enable_good) {
    // Enable command failed, most likely because a module is missing.
    $module_list = implode(', ', $modules);
    $t = get_t();
    throw new DrupalUpdateException($t('The requested modules @modules to be enabled by this update, were not, because one of them does not exist in the codebase. Please investigate the problem and re-run this update.', array('@modules' => $module_list)));
  }
  // There are cases where $enable_good is TRUE but the module may not have
  // enabled, so double check.
  $success = mymodule_check_modules_enabled($modules);

  return $success;
}

Jul 30 2015

At Isovera, we like to build websites that people like to use. One of the best ways to enhance a user's experience is to give them what they are looking for fast! Site performance is often overlooked or relegated to the end of a project, but it could be the most important factor in the success of your project.

While we would like to make every page we build load as fast as possible, sometimes you have to make compromises — whether for design, features or budget. That said, in most cases there is really no excuse to have a page that takes more than a few seconds to load on a fast connection. Pages that take more than two or three seconds to load fail to meet users' expectations and lose visitors.

Testing your site

Just loading the page to see if your site seems slow might be a good place to start. Most often, though, I want a little more data, and in that case there are a few tools that I like to use when assessing the speed of a site:

I like http://www.webpagetest.org/ because it gives you a visual timeline of the page load. Often, you don’t have to wait for the entire page to load in order to start using it, so seeing when the page looks right can be very helpful.

Google’s PageSpeed Insights is another nice tool that gives you a clear sense of how well your site is doing using its PageSpeed Score, and also lists actionable steps you can take to improve things.

Performance budget

The average size of the top 1000 websites has increased from 1.2MB to 2.1MB during the past three years. While some may argue that a 2MB webpage is far too big, if you are just starting to think about performance as a goal for your web projects, then it's probably a good guideline to start with.

Using the Isovera website as an example, the home page is about 2.8MB and loads the visible part of the page in about two seconds. While there is always room for improvement, we are not doing too bad. One thing to note, though, is that these numbers are all based on a very fast broadband connection. If you want users to have a fast experience on their phones, with a decent 3G connection for example, then optimizing your site becomes even more important.

Chrome has a nice feature in its developer tools that not only lets you view your site at different device screen sizes but also simulates different mobile network speeds. While this is only a simulation, and other factors may impact page load times (that's why Isovera has a device lab), it provides a good starting place if you want to see what mobile users' experience might be like.

So what should you do?

While tools like these are a good start, it is important to incorporate the underlying best practices that make a site performant from the beginning. In some cases you might be able to just change a setting or tweak some code, but in other cases there are more fundamental architectural decisions that need to be made.

For example, using web fonts has become commonplace but often designers don’t realize how much of a performance impact this can have if you are not careful. Each different font style variation (bold, italic, etc.) requires an additional file to be loaded, so it is important to work with the designer to limit the number of different fonts and styles required for the site.

The list of performance issues that need to be considered is far too long to go into here. You need developers who think about performance issues all day long to be a part of the website design and development process — from beginning to end — if you want your site to delight your users. The basic idea here is to keep an eye on your page load times. Everyone on the team should be keeping performance in mind, and it should be a part of your testing process. In the best-case scenario this is integrated into your automated testing or continuous integration process.

Jul 30 2015
Jul 30

28 pages of unmarred perfection. This book is pure unadulterated genius

- Chris Arlidge

Never Be Shocked Again! - Budgeting your Web Project

Are you having trouble figuring out an appropriate budget for your next web project? Our whitepaper can help!

Download your FREE COPY to learn the different types of website projects, understand the factors that play a role in the budgeting process, and determine where your web plans will fit when it comes to costs!

Don’t ever be shocked by web costs again! A clear guide to help you plan the budget for your next web project.

Have you ever been working on a site and had your QA department or your client come back with issues because, when logged in, the local tabs (View, Edit, etc.) distort the page layout? Or maybe there are pages with so much content that it has become frustrating for site admins to scroll all the way back up to edit the page? Wouldn't it be great to still have easy access to the local tabs without having them add extra bulk to the page layout and content?

We have begun using a few different tweaks to move the local tabs into the shortcuts menu area of a few of our main administration menu modules. So far, we've added this to toolbar, admin_menu, and nav_bar.

The first step is to remove the rendering of the tabs in your page.tpl.php file(s). In most themes, you'll find that the following code renders out the tabs.

<?php if (!empty($tabs)): ?>
 <?php print render($tabs); ?>
 <?php if (!empty($tabs2)): print render($tabs2); endif; ?>
<?php endif; ?>

Simply delete or comment out those lines.

The second step is to add the local menu tabs and local actions to the shortcuts menu. All three modules have a preprocess or an alter hook that allows us to add some extra menu items to the shortcuts menu area in each one. I'll showcase the code we use for adding extras to admin_menu's shortcuts area, but will also provide an example custom module which has examples for all three.

For admin_menu, we'll utilize hook_admin_menu_output_alter() to make our alterations.

function admin_menu_shortcuts_admin_menu_output_alter(&$build) {
  // Attach our overriding stylesheet so the new links match the menu styling.
  $path = drupal_get_path('module', 'admin_menu_shortcuts') . '/admin_menu_shortcuts.css';
  drupal_add_css($path);

  // Ensure our additions render last.
  $build['shortcut']['shortcuts']['toolbar'] = array(
    '#weight' => 99,
  );
  $build['shortcut']['shortcuts']['toolbar']['menu_local_tabs'] = menu_local_tabs();
  $build['shortcut']['shortcuts']['toolbar']['menu_local_tabs']['#primary'][] = menu_local_actions();

  // Add the workbench moderation information if needed.
  if (module_exists('workbench_moderation')) {
    $build['shortcut']['shortcuts']['toolbar']['workbench'] = array(
      '#prefix' => '<div class="workbench-information">',
      'button' => array(
        '#prefix' => '<a href="#" class="expand-workbench-information">',
        '#markup' => 'Workbench information',
        '#suffix' => '</a>',
      ),
      'content' => workbench_block_view(),
      '#suffix' => '</div>',
    );
  }
}
As you can see, there really isn't much to it once you know the structure you're adding to. However, with admin_menu, just adding the alter function doesn't get us all the way there: admin_menu has two caching options which will nullify our new addition, since it caches the output for one page and won't be contextual to the page you are on. If you set the two variables admin_menu_cache_client and admin_menu_cache_server to FALSE, caching is disabled and your users can enjoy the easier-to-use links.
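One minimal way to set those variables is a hard override in settings.php (they could equally be set with variable_set() or drush vset); this is just a sketch of that approach:

```php
<?php
// In sites/default/settings.php: disable both admin_menu caches so the
// shortcut additions stay contextual to the current page.
$conf['admin_menu_cache_client'] = FALSE;
$conf['admin_menu_cache_server'] = FALSE;
```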

And, as you've probably noticed, we have included a CSS file. In an effort to avoid conflicts with front-end and admin theme styling, we have created a custom overriding stylesheet that applies the module's shortcuts menu styling to the new links, giving a nice uniform look and feel across the site.

Another item to note is that our examples use a module, which means the changes will apply to both front-end and admin themes. We realize that with admin themes, having the tabs show is more practical and useful, so not everyone will want the tabs in the shortcuts menu in both themes. We like to be consistent, so we use the module option most often, but you can also add all of these hooks to your custom theme's template.php file to have them apply to the front end only.

Now, as promised, here is the full example module.

Jul 30 2015
Jul 30

The logical way to analyse data is to start by looking at summarised data before looking at the detail. This is referred to as drilling down. In this post I demonstrate how to define drill-down functionality between two Drupal Views. This is a continuation of my series of posts showing how Drupal can be used as a BI or data analysis front end.

In a previous post, using the Chinook sample data set, I created a database view that I imported as a Drupal View. The view displays sales per annum of the different genres of music sold by Chinook. I am going to use this view as the detailed data.

I created a summarised database view by removing the year from the list of fields. The summary view shows total sales per genre for all years. I wanted to drill down on each genre to see the detailed sales per year. The summarised view SQL is shown below.

CREATE VIEW vw_genre_total AS
SELECT
    c.Name AS Genre_Name
    ,SUM(a.Quantity) AS Total_Quantity
    ,SUM(a.Quantity*a.UnitPrice) AS Total_Amount
FROM
    InvoiceLine a
    JOIN Track b ON a.TrackId = b.TrackId
    JOIN Genre c ON b.GenreId = c.GenreId
    JOIN Invoice d ON a.InvoiceId = d.InvoiceId
GROUP BY
    c.Name;

I started implementing the drill down in the detail view by defining a contextual filter on the genre field.

Contextual filter

Then in the summary view I rewrote the genre field as a link to the detail view. The two views must have a common field that can be passed from the summary view as a filter to the detail view. Below is the Genre field configuration form showing how the link is defined.

Views Drill Through Link
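In essence, the "Output this field as a link" setting takes a link path that is the detail view's path with the genre replacement token appended. The path below is illustrative, and the token name will match your own field's replacement pattern:

```text
genre-detail/[Genre_Name]
```

The contextual filter on the detail view then picks the genre up from the URL.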

The summary view is shown below.
Views Drill Through Link

You can define many levels of drill downs. Staying with the Chinook example, you could define a drill down on the yearly genre total to the monthly totals. You can also define drill downs to different views giving the end user the option to see different detail data. The drill downs defined in the sample are on an external database table which could be your data warehouse. This functionality is a powerful way for users to access and analyse their data.
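A monthly drill-down level like the one described above could be backed by another database view. This is only a sketch, assuming MySQL and the standard Chinook schema (the view name is my own invention):

```sql
CREATE VIEW vw_genre_month AS
SELECT
    c.Name AS Genre_Name
    ,YEAR(d.InvoiceDate) AS Invoice_Year
    ,MONTH(d.InvoiceDate) AS Invoice_Month
    ,SUM(a.Quantity) AS Total_Quantity
    ,SUM(a.Quantity*a.UnitPrice) AS Total_Amount
FROM
    InvoiceLine a
    JOIN Track b ON a.TrackId = b.TrackId
    JOIN Genre c ON b.GenreId = c.GenreId
    JOIN Invoice d ON a.InvoiceId = d.InvoiceId
GROUP BY
    c.Name, YEAR(d.InvoiceDate), MONTH(d.InvoiceDate);
```

Imported as a Drupal View with contextual filters on genre and year, this would serve as the target of the second drill-down link.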

Jul 30 2015
Jul 30

The life of a Drupal developer… it can be a drag. After all, we sit at our chairs for hours on end, sipping on tea or Coke, and staring blankly at the screen. We do what we can to keep ourselves awake. We occasionally divert our eyes to relieve the strain or get up and walk around to ease the tension in our back.

But, is it really that bad?


The life of a Drupal developer is also very busy. After all, the inbox is overflowing with emails asking about a progress update and there are deadlines you need to meet by week’s end. What are you to do? It seems like a never-ending cycle of work.

When you do finally start working, you may feel like you’re being productive; but, the reality is… your workload is piling up. Do you really need to burn the midnight oil… just to catch up? As a Drupal developer, you have a few things to learn.

Are you ready to learn?

What Is The Issue With Productivity Science?

Have you ever noticed the number of productivity studies circulating the Internet? If you sit down to read through the various studies, it's not going to take you long to realize how many nitwits are actually writing these studies. Many people just sit at their computers, making Facebook updates, playing Candy Crush and just surfing the 'Net.

It’s been documented that what you read in the majority of scientific studies needs to be taken with a grain of salt. However, you need to ask yourself if it’s possible to have a business that increases productivity.  When it comes to productivity studies, there is an array of invaluable information that everyone can learn from.  

After all, there is work that must be done and you need to complete it. You have clients that depend on you to ensure their website is up and going… all the time. And, as much as you love to build websites or modules, you still have interests outside the Drupal environment.

In order to be more productive, you need to know what to put your attention on if you are to get more things done for the day.

What should you realize if you’re going to better your productivity levels? Well, you need to realize that there isn’t a perfect answer. After all, what works for one person may not work for someone else.

Find A Balance In The Business

When it comes to productivity, it doesn't mean you need to cram more coding hours into your day. It means you need to do more coding in the same amount of time, or in less time. When your projects begin piling up, or when clients constantly contact you about those pesky little deadlines, it's so easy to add screen time to compensate for it all.

Here’s what you need to understand: the more time behind the computer you spend, the less productive you actually become.

Therefore, take some time away from your career life – away from all the Drupal setup – and throw yourself into your life. You need to eat right, exercise and spend time with your family and friends in order to boost your productivity level.  Without that balance, your work is going to suffer… in time. And, all that productiveness you’ve been aiming for is going to get thrown out the window.

It’s important you lay out the different parts of your life so you know what part is most important to you. After you’ve done this, decide the amount of time you should dedicate to it. Remember, there are only 24 hours in a day. Thus, you need to figure out how much time to give each one.

  • Work
  • Time w/ Family
  • Time w/ Friends
  • Hobbies
  • Fitness
  • Sleep

So, if you must spend more time on client projects, you must take away time from other areas. Make sure you choose which one is less important to give your “work” time to.

What’s Your Work Environment Like?

Did you know that your work environment can have a huge effect on your productivity level? Look at things around you. How organized is your desk? Is it cleaned off? Does the desk look like a tornado went through it?

It’s worthwhile to have a physical and digital clean desk. The less distraction you have, the more creative your mind can be.

Set Up Intervals To Maintain Structure

Okay, so you’re looking at intervals as something only athletes do, right? Well, when it comes to Drupal development, intervals are a thing too.  You see, people are often told that sitting down for seven to nine hours a day, with a couple of breaks in between, is the way to be productive. If you’re lucky, you can have a little energy to exercise, play with the kids or have a talk with your significant other. 

What if you threw interval training in your Drupal development business? Your schedule could look a little like this:

  • 8 a.m. – Wake up
  • 8:15 a.m. – Answer emails, work on projects for clients
  • 10:30 a.m. – Breakfast
  • 10:45 a.m. – Go to the gym
  • 12:30 p.m. – Work on client projects
  • 2 p.m. – Answer emails
  • 2:30 p.m. – Lunch
  • 3 p.m. – Social media and marketing
  • 3:30 p.m. – Work on client projects
  • 4 p.m. - Take a 20 minute walk
  • 4:30 p.m. – Work on home projects
  • 5 p.m. – Spend time with family, social gathering and dinner
  • 9 p.m. – Plan the next day out
  • 9:45 pm. – Relax
  • 11 p.m. – Go to bed

Yeah, it seems like a pretty segmented day and, by design, it really is. You’ll need to be flexible and give it some personality. What you’ll notice is that the shorter the timeframe, the more intense the energy bursts are going to be. 

Better Prioritization and Focus

If you’re going to boost your productivity level, you must effectively prioritize your day.  Think of it this way: 20% of what you do during your day will be responsible for 80% of your household income. Be sure to prioritize that 20%, scheduling the important tasks first thing in the morning.

When you sit in front of a computer screen all the time, you’re hit with distractions from every angle – family, friends, business associates, etc. And, when you get sidetracked by the distractions, you may find it difficult to get back on track. There are going to be times when these distractions cannot be accepted. Your first and second work intervals must be when you are 100% engrossed in the job. This means nothing else must be scheduled during this time period.  Your focus on these work times should only be WORK.

Make Plans To Stay Productive

Have you gone through a day without a plan? Many people have, and these kinds of days are never very productive. As you get to your desk, look over your list of 20 tasks that you haven’t prioritized.  Do you suddenly feel overwhelmed?  Do you feel like you don’t know where to go? You may find yourself an hour into work with nothing to show for it.

The last thing you should do every day is plan out the next day’s work, making sure you label what’s important and what’s not as important.  Write these tasks down or type them out.

How To Get Started

The surefire way you’ll get the productive working environment you are after is to get started with the tasks you’re faced with. Of course, it’s the most obvious thing to do; but, you may suddenly realize that you’ve gone through the day and didn’t do a single thing that would be considered productive.

A huge barrier in productivity is failing to get right into the task. If you handle an array of client projects, it’ll be easier to avoid the task you need to be working on.  Planning will certainly solve that issue. When you notice you’re having issues with the tasks at hand, ask yourself if you had a prioritized list. Create this all-important list the night before.

Be Open To The Possibility Of Experimentation

The above concepts are important, but that doesn't mean they're all going to work for you. Most will, however, make a huge difference. The steps you need to take are fairly easy to implement:

  • Create a systematized, clean work environment
  • Lay out the things in your life
  • Prioritize and put attention on tasks and areas considered important
  • Use intervals to arrange your day
  • Make plans ahead of time
  • Get moving

Lastly, you need to be creative. This means you need to experiment with your schedule, and find what does and does not work for you. You will find that your productivity level is going to improve drastically!

Business vector designed by Freepik

Jul 30 2015
Jul 30

We are HUGE fans of simplicity, automation and open source. Here's one of our solutions for keeping track of Drupal module versions, custom/contrib code, deploying and taking backups. And more!

A while back I was introduced to a script which was used to download a given version of Drupal core with a handful of modules, AND copy any custom modules/profiles/themes under a given path inside the Drupal installation, AND run the Drupal installation from the shell! SO COOL! This completely eliminated the need to have 3rd party code in our repositories and it made (at least) my life a whole lot more worry-free.

So as it happened, we quickly adopted the script in our workflow and everyone was happy with it. Time went on and we found the script lacking and started modifying it, improving it, and finally after a couple of rewrites we ended up with build.sh.

Things it does for you:

  • download and install Drupal
  • update Drupal core and modules
  • copy & link custom code directories and/or files
  • handle separate settings.php for each environment
  • take backups

Our continuously evolving workflow is:

  1. Grab a copy of build.sh
  2. Modify conf/site.make to our liking
  3. Enable contrib & custom modules as a dependency in code/profiles/wk/wk.info
  4. Run ./build.sh new
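The conf/site.make file mentioned in step 2 is a standard Drush make file. A minimal sketch (the module choices and version numbers here are purely illustrative):

```ini
; conf/site.make -- versions below are illustrative.
core = 7.x
api = 2

; Drupal core to download.
projects[drupal][version] = 7.38

; Contrib modules pulled from drupal.org.
projects[views][version] = 3.11
projects[ctools][version] = 1.7
```

Bumping a version number here and re-running the build is all it takes to update a component.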


If things go well, this should result in a folder called drupal; this is where your fresh installation is.

Next, we might want to update the version of Drupal core.

  1. Bump up the version of Drupal core in conf/site.make
  2. Run ./build.sh update


But wait, there's more! build.sh allows you to define your own commands that do whatever you want. Have a look at the README.md for further information.


Jul 30 2015
Jul 30

Submitted by cividesk on July 29, 2015 - 21:33

Cividesk is one of the leading CiviCRM service providers. Giving Back is a cornerstone of our company culture, and we proudly support many charitable and humanitarian organizations with pro-bono or reduced-cost services. While our Giving Back program is usually full, we still have a few Drupal openings this summer and would therefore be glad to support nonprofits that need Drupal work with pro-bono services.

Criteria for being part of the program are simple: being a primarily volunteer-run non-profit organization providing free and secular services to the underprivileged (every word is important!).

If your organization fits these criteria, please drop us a line at [email protected] with your wishes and we might turn them into reality!

Jul 30 2015
Jul 30

What's happening in the world of Drupal Commerce.

Commerce Kickstart 2.27 was released today, and includes quite a few bugfixes and features. Recently Commerce Kickstart 2 upgraded from Features 1.x to the Features 2.x API, and we've added some measures to help with the upgrade process! If you're not using Features Override yet, go on, get it! Use it to save your customizations to the distribution and have a smoother upgrade. For more information, see the Installing & Upgrading guide.


  • TravisCI integration improvements. Utilizing Drush and Composer caches to speed up build times, testing Features Overrides support
  • Migrate API implementation upgraded. Using Migrate 2.8
  • Better support with Features 2.x and upgrades. Features Overrides included, but not enabled
  • Provides method to allow overrides in field_base and field_instance for 2.23+ upgrades
  • We're now testing Features Overrides integration in upgrade tests
  • View for products filters out unpublished nodes
  • Added admin_views for administration improvements of admin/people page
  • Added and enabled Distribution Status Manager to reduce code in profile for "hiding" contributed updates, streamline the bundled module update notification experience
  • Update Commerce Features to newest 7.x-1.1 release

Full changelog since 7.x-2.26:

  • #1701958: Clarify "reset demo store content" action
  • Improve .travis.yml
  • #2325349 by caschbre: Use the new Distribution Update Status Manager module
  • #2532292 by mglaman: Commerce Kickstart Feature 2.x changes the display field settings to hidden
  • #2061473: Make spinner JS configurable so module plays nicely with other modules
  • #1774632: Kickstart profile creates a duplicate user menu
  • #2224985: Unsupported operand types commerce_autosku.module on line 23
  • #2264127: display_products view should filter out unpublished products
  • #2411435 by mglaman, lsolesen: Facet drop-down lists do not work in mobile browsers
  • #1871678: Convert People screen to use Views (like the Content screen)
  • #2118059: Missing H1 / Title on default taxonomy view
  • Prevent MySQL server from going away
  • Fix Behat Drupal\DrupalExtension settings
  • #2317105: Flat rate rules components are not exportable with features
  • #2393519: Show breadcrumbs on account/profile pages
  • #2242495: Better default User menu link weights
  • #2461727: Features overriden in fresh install
  • #2536070: Add Features Override
  • #2402929: Hover text on user agreement opt-out says "I agree to the user agreement"
  • #2399141: Notice: Undefined offset: 1 in FacetapiAdapter->processActiveItems() line 312
  • #2534138 by mglaman: field_base_features_rebuild doesn't catch exceptions
  • #2533236 by mglaman: Error on update to 2.24 - 2.26
  • #2498645: Social Menu updates links to CommerceGuys
  • Update Migrate API integration
  • Run Travis on container infrastructure for faster tests.
  • #2106687 by bc: Commerce search error with newer jquery versions (above 1.9)

Try it out

Ready to get started? You can download the update directly from Drupal.org here, or better yet give it a try on Platform.sh. (check out the banner on our homepage for a free-trial code)

Matt Glaman

Posted: Jul 29, 2015

Jul 29 2015
Jul 29

(Picture of Ryu and Ken by FioreRose)

Michael Prasuhn recently sent out a tweet regarding Composer vs Drush Make:

I'm gonna go out on a limb here: composer has a long way to go to catch up with Drush make.

In the brief discussion that ensued on that thread, it was pointed out that Composer and Drush Make are fairly similar in terms of feature parity; however, there remain some differences between them, and the topic of the pros and cons thereof is more complicated than the 140-character limit of Twitter allows. I therefore decided to explain the current differences between these two options.

If you are not familiar with Composer yet, see the Composer Tools and Frameworks for Drupal presentation that Doug Dobrzyncski and I did for DrupalCon LA. It’s easy to get started quickly with Composer today with starter projects such as example-drupal7-circle-composer and example-drupal7-travis-composer—but is Composer right for your project? Let’s examine some of its strengths and weaknesses:

Recursive Dependency Management

Composer is, first and foremost, a dependency manager. Each dependency that a project requires can itself declare what its dependencies are; if multiple libraries require the same thing, but specify different sets of versions that they work with, Composer will analyze the whole dependency tree and either pick a compatible version, or explain which components do not work together. While Drush Make does allow for recursive make files, no analysis of the dependency tree is done.

Composer’s dependency manager is a point in favor of Composer for projects that need to make use of php libraries outside of the Drupal ecosystem of modules and themes. 

Generation of the Autoload File

One of the best features of Composer is the autoload.php file that it creates during every install or update operation. The autoload function allows php code to instantiate instances of classes without having to know where the source code for each class is located. Drush Make users can get autoloading through the xautoload module; with Composer, it's built in, and covers code that is not part of any Drupal module.

Composer’s handling of the autoload file is a benefit for projects that want to use object-oriented php code.
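For illustration, consuming the generated autoloader takes a single require statement. The Monolog dependency below is just an example of a non-Drupal php library, and this sketch assumes it was added with `composer require monolog/monolog`:

```php
<?php

// One require exposes every class managed by Composer.
require __DIR__ . '/vendor/autoload.php';

// No include/require of Monolog's source files is needed; the generated
// autoloader locates each class file on demand.
$log = new Monolog\Logger('site');
$log->pushHandler(new Monolog\Handler\StreamHandler('php://stderr'));
$log->warning('Autoloading works.');
```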

Recording the Exact Components Used in a Build

Composer has a file called composer.lock that records the exact version of each component that was selected in the most recent build. The components listed in the composer.json file can either be locked to a single version, or can be allowed to be updated when new versions are available. With Drush Make, in order to capture the exact version of each component used in the build, you must specify the exact version to use in the make file itself. What people usually do to update a make file is use drush pm-update, which itself has a locking function, if it is needed.

So, Make has feature parity with Composer for this function, but Composer does it better, and is more convenient.

Patching and External Libraries

Composer's support for patching and external libraries is provided by plugins, whereas this is standard functionality in Drush Make. Composer supports patch files via cweagans/composer-patches; see my previous blog post, Patching and the Composer Workflow, for more information on this custom installer. It works better, and has a much more compact representation of the patch file list, than the previous alternatives. Javascript libraries such as ckeditor can be managed with Composer using generalredneck/drupal-libraries-installer-plugin.

Composer has feature parity with Drush Make for patching and external library use, but you have to know which plugins to use.
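As a sketch of what the composer-patches configuration looks like in composer.json (the patched project and patch URL below are purely illustrative):

```json
{
  "require": {
    "cweagans/composer-patches": "~1.0",
    "drupal/ctools": "1.*"
  },
  "extra": {
    "patches": {
      "drupal/ctools": {
        "Short description of the fix": "https://www.drupal.org/files/issues/example-fix.patch"
      }
    }
  }
}
```

Each patch is keyed by a human-readable description, which keeps the applied-patch list self-documenting.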


Maturity

The biggest advantage that Drush Make has is its maturity. Since it has been in use for so long, is based directly on the releases repository maintained on drupal.org, and is itself used in the profile packaging system on drupal.org, you won't need to think twice about the availability of any module that you want to use. Composer users have packagist.drupal-composer.org at their disposal, which contains Composer packages for all drupal.org projects with tagged releases. Minor road bumps may be encountered here and there; some projects might not exist yet on drupal.org or packagist.drupal-composer.org, or a Drush extension might be mis-labeled as a Drupal module in packagist.drupal-composer.org. These sorts of problems are resolvable, but might cause a little extra head-scratching for new users. As Composer adoption increases, these situations will be found, reported, and fixed at a greater rate.

Drush Make is more mature than Composer, but Composer is still very usable today.

Score Card

So, which one should you use? It is going to depend on which factors are most important to your project. Here's the scorecard from our comparison:

Both are powerful tools that have all of the capabilities needed to get the job done; Composer is stronger in the area of modern features, while Drush Make currently has the most mindshare among projects within the Drupal ecosystem.


Drush Make has served a lot of projects well for a long time, and there certainly is no need to switch maintenance projects over to Composer. Composer is a mature and standard tool in the broader php community, though, and the Drupal community would be well served by adopting it as well over time. Composer has achieved feature parity with Drush Make, and is more modern, more standard, and more convenient. Projects that are under active development using object-oriented code or external php libraries would be well served by a switch to Composer today.

Topics Drupal Planet, Drupal
Jul 29 2015
Jul 29

Rudy Grigar (Infrastructure Manager, Drupal.org) would love to have learned Drupal in pre-school, but alas, he had to wait till third grade.
Here, he opens up about Git commits and DevOps which sounds very hush-hush. As I probe further, Rudy lets slip controversial remarks about Drupal’s potential for subversion, the NSA’s consequential attempts to suppress open source (if I understand him correctly), and an upcoming article he’ll write for Drupal Watchdog. (Hurry, subscribe! https://drupalwatchdog.com/subscribe/2015)

Jul 29 2015
Jul 29

[embedded content]

Commercial Progression presents Hooked on Drupal, “Episode 10: Summer of Drupal with Special Guests Hillary Lewandowski and Michael Zhang”. In this episode of Hooked on Drupal, the usual crew is joined by two new members of the CP team. Hillary Lewandowski, the latest member of the development team, brings her wisdom from a formal education in computer science.

Michael Zhang of Northville DECA, the world's happiest intern

Additionally, Michael Zhang is one of two summer interns from Northville High School and an active member of Northville DECA, with a focus on marketing.

Hooked on Drupal is available for RSS syndication here at the Commercial Progression site. Additionally, each episode is available to watch online via our YouTube channel, within the iTunes store, on SoundCloud, and now via Stitcher.

If you would like to participate as a guest or contributor, please email us at

[email protected]

Content Links and Related Information

We experienced this year's DrupalCon vicariously through our last Michigan Drupal meetup and our previous podcast with Steve Burge from OSTraining.  This summer proves to be quite busy with new team members, projects, and Drupal 8 investigations.

As the Commercial Progression team size grows, our development team has begun to specialize.  Brad has focused on developing new processes for site architecture and shares his discoveries for preparing a Drupal project for design and development. Other team members share their personal project subject matter.

OOP In Drupal 8

In addition to working with the new WYSIWYG Fields and Conditional Fields,  Hillary shares some of her thoughts and computer science background with Object Oriented Programming and Drupal 8 in her latest blog post.

Personalize Module

Chris and Shane discuss the Acquia contrib Personalize module based on Lift technology for content personalization via URL based campaign parameters, geography, visitor cookies, A/B or Multi-variate testing, and a host of other variable session data.

Paragraphs Module

Inspired by Jeff Eaton and the Battle for the Body Field DrupalCon presentation, Brad dug into the Paragraphs module and put together a popular paragraphs blog post with some best practices for winning the battle for the body field.  When Brad is not fighting the good fight for the supremacy of the Paragraphs module, he has also created an automated competitive marketing intelligence research script… yeah I know, really.

Hooked on Drupal Content Team


CHRIS KELLER - Developer





Jul 29 2015
Jul 29

Using MailChimp with Drupal 7

This week we'll be continuing our Using MailChimp with Drupal 7 series, and like last week, all the tutorials are free thanks to MailChimp's sponsorship. Last week we looked at creating, and collecting contacts for, a MailChimp mailing list. This week we'll look at all the different ways we can send email to our lists. Before we jump in and hit the send button on our first campaign we'll look at the elements that make up a good marketing campaign. Then, we'll look at how to send email through both the MailChimp UI and directly through Drupal allowing us to compare and contrast the two and talk about the use-case for each.

We'll also look at how to preview an email before sending it, and how to create a new custom email template that will work both in the MailChimp UI and when sending campaigns from Drupal using the MailChimp Campaigns module. For campaigns that go out on a regular basis it can be nice to automate the process, so we'll look at creating an RSS feed in Drupal using Views and then use that to power a MailChimp RSS-based email campaign with a couple of RSS-specific merge tags.

Finally, we'll wrap things up in this series by looking at how to integrate Drupal with MailChimp's webhook system so that our site can receive a ping from MailChimp for actions that happen on the MailChimp server, allowing Drupal to keep its cached data up-to-date. The last step will be to review the MailChimp Activity module, which allows for tracking the activity of any entity subscribed to a MailChimp list and displaying that data in the Drupal UI.

This week's free tutorials:

Our next series to come out, in August, will be the next installment of our Object-oriented PHP tutorials. It will pick up where the Introduction to Object-oriented PHP series left off and dive into PHP services and some object-oriented programming best practices.

Jul 29 2015
Jul 29


Realityloop has a long history with the Melbourne Drupal community; we've been heavily involved in the monthly Drupal meetups and started the monthly mentoring meetups. The mentoring, however, only came about after a failed experiment in community-based web development.

While that experiment may have failed, the idea of community driven Drupal development has been of great interest to me as it truly embraces the spirit of open source.

In the last two weeks I have released two websites for the Drupal Melbourne community: a DrupalMelbourne community portal and a landing page for the upcoming DrupalCampMelbourne2015. In this tutorial I will demonstrate how anyone can get involved in the development of these sites, and how the process can work for any other community-based website.

The workflow:

  1. Build the codebase
  2. Setup and install the site
  3. Make your changes
  4. Update features and makefiles
  5. Test the changes
  6. Fork repository / push changes / pull request

Build the codebase

As usual for Realityloop, these sites are built using a slim install profile / makefile approach, with the GIT repository tracking custom code only, which means you will require Drush to build the site codebase.

If you are not familiar with Drush (DRUpal SHell), I highly recommend familiarising yourself with it; it's not only incredibly useful in everyday Drupal development, it is also a requirement for this tutorial. Installation instructions can be found at http://www.drush.org/en/master/install/

Assuming you have Drush ready to go, building the codebase is as simple as running the following, relevant command:

  • DrupalMelbourne: drush make --working-copy=1 https://raw.githubusercontent.com/drupalmel/drupalmel/master/stub.make drupalmel-7.x
  • DrupalCampMelbourne2015: drush make --working-copy=1 https://raw.githubusercontent.com/drupalmel/drupalcampmel/2015.x/stub.make dcm-2015.x

The resulting codebase contains the specified Drupal core (currently 7.38) along with the relevant install profile containing all custom and contrib code (modules, themes and libraries).

Note: --working-copy=1 is used to retain the link to the GIT repository.

Setup and install the site

Once your codebase is built, you simply need to install the site as you normally would, ensuring that you use the relevant installation profile (DrupalMelbourne or DrupalCampMelbourne).

The sites are constructed in such a way that there is absolutely no need to copy down the production database; any content is either aggregated from external sources or generated as dummy content via the Devel generate module for development purposes. This means no laborious data sanitization process is required, allowing contributors to get up and running in as short a time as possible.

For more details on how to setup a Drupal site, refer to the Installation Guide or your *AMP stack.

Make your changes

No change is insignificant, and a community-driven site thrives on changes; if you think you can make the site look better, have found a bug that you can fix, or want new functionality on the site, the only thing stopping you is you!

Update features and makefiles

Once you're happy with the changes you've made, be it content, theme or configuration, you need to ensure that they're deployable; as we're not dealing with databases at all, this means you need to update the codebase: features and makefiles.

If you're not familiar with Features or makefiles, then, much like Drush, I highly recommend them; again, they are required for this particular approach to community-driven development.

You can find more details on Features, makefiles and Drush in my DrupalSouth 2012 talk "Ezy-Bake Drupal: Cooking sites with Distributions".


Features allows you to capture most of your configuration in the filesystem, allowing it to be deployed via GIT.

In the case of these sites, there is only one feature, which encapsulates all configuration, as these sites have a relatively straightforward purpose. Some sites may warrant more, but that is a discussion for another day.

Updating the feature is an extremely simple process:

  1. Navigate to the relevant path within your site:
    • DrupalMelbourne: admin/structure/features/drupalmel_core/recreate
    • DrupalCampMelbourne2015: admin/structure/features/drupalcampmel_core/recreate
  2. Add or remove any required components (Page manager, Strongarm, Views, etc).
  3. Expand the Advanced options fieldset and click the Generate feature button.

More information can be found at the Features project page.


Makefiles are recipes of modules, themes and libraries that get downloaded by Drush make, including their versions and patches.

Updating a makefile is relatively straightforward; it's just a matter of opening the file in your IDE / text editor of choice and updating the entries.

There are two makefiles in these sites, stub.make and drupal-org.make; stub.make contains Drupal core and the install profile (and any relevant patches), and drupal-org.make contains all third-party (contrib) code.

Any new or updated modules, themes or libraries (and any relevant patches) need to be added to this file, as no third-party code is tracked in the GIT repo.

The makefiles are organized into three primary sections: Modules, Themes and Libraries. Below are some examples of how an entry should be defined:

Bean module, version 1.9:

projects[bean][version] = 1.9

Reroute Email module, specific GIT revision with patch applied:

projects[reroute_email][download][revision] = f2e3878
; Variable integration - http://drupal.org/node/1964070#comment-7294928
projects[reroute_email][patch][] = http://drupal.org/files/reroute_email-add-variable-module-integration-19...

Note: It is always important to include a version; if you need a development release, use a GIT revision, as otherwise what you build today may be drastically different from what you build tomorrow.

Bootstrap theme, version 3.1-beta2:

projects[bootstrap][type] = theme
projects[bootstrap][version] = 3.1-beta2

Note: As the default project 'type' is set to module, themes need to specify their type. This default is a personal choice in the Drush make configuration, as it is highly likely you will always have more modules than themes.

Backbone library, version 1.1.2:

libraries[backbone][download][type] = get
libraries[backbone][download][url] = https://github.com/jashkenas/backbone/archive/1.1.2.zip

Note: As libraries are not (in general) projects hosted on Drupal.org, you need to specify the URL from which the files can be downloaded or cloned.

More information can be found on the Drush make manual page.

Other changes / hook_update_N()

Sometimes changes don't fall under the realm of features or makefiles, either due to a module's lack of integration with Features, or when dealing with content rather than configuration. These changes still need to be deployable via the codebase, which can be done with a hook_update_N() function.

A hook_update_N() is a magic function that lives in a module's .install file, where hook is the machine name of the module and N is a numeric value, formatted as a 4-digit number in the form XY##, where X is the major version of Drupal (7), Y is the major version of the module (1) and ## is a sequential value, from 00 to 99.

Example: drupalmel_core_update_7100() / drupalcampmel_core_update_7100()
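The numbering scheme can be sanity-checked mechanically (a throwaway shell sketch; the parts mirror the real update function shown further below):

```shell
# Compose a hook_update_N() name: Drupal major 7, module major 1, sequence 05.
module=drupalcampmel_core
printf '%s_update_%d%d%02d()\n' "$module" 7 1 5
```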

The contents of a hook_update_N() is whatever you wish it to be: any Drupal API function or plain PHP.

An example of one such function is:

/**
 * Assign 'ticket holder' role to ticket holders.
 */
function drupalcampmel_core_update_7105() {
  $query = new EntityFieldQuery();
  $results = $query->entityCondition('entity_type', 'entityform')
    ->entityCondition('bundle', 'confirm_order')
    ->execute();
  if (!empty($results['entityform'])) {
    $entityforms = entityform_load_multiple(array_keys($results['entityform']));
    foreach ($entityforms as $entityform) {
      $user = user_load($entityform->uid);
      $user->roles[3] = 'ticket holder';
      user_save($user);
    }
  }
}

For more details, refer to the hook_update_N() API documentation.

Test the changes

Once you’ve made your changes and prepared your features and makefiles, it’s ideal to ensure that everything is working as expected before you push it up to the GIT repo.

This is a multi-step process, but it's easy enough, especially given that we don't have a database to worry about.


  1. Take a database dump of your local (development) site; safety first.
  2. Re-install the site with the relevant install profile:
    • DrupalMelbourne: drush si drupalmel -y
    • DrupalCampMelbourne2015: drush si drupalcampmel -y
  3. Test to ensure your changes are present and working as expected.


Testing your makefile can be a little trickier than testing your features: when you download a module, theme or library there are various places it can be stored, and it's easy to get a false positive.

  1. Build a --no-core version of the drupal-org.make file into a temporary directory. A --no-core build is exactly what it sounds like: it builds the makefile excluding Drupal core.


    cd ~/temp
    drush make --no-core --no-gitinfofile ~/Sites/drupalmel-7.x/profiles/drupalmel/drupal-org.make dm-temp

  2. Run a diff/merge tool over the --no-core build’s sites/all directory and your local (development) site’s relevant profile directory (e.g., profiles/drupalmel).

    I personally use Changes on OS X, but free and paid diff/merge tools are available for every operating system.

  3. Ensure that all third-party (contrib) code is identical on both sides of the diff/merge; any discrepancies imply that you may be missing an entry in your makefile, or that your local copy of the code is in the wrong location.

If your changes aren’t working as expected, or something is missing, simply restore your database dump and go back to the Update features and makefiles step.
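The diff check above can be illustrated with a self-contained toy example (nothing here touches the real sites; mktemp directories stand in for the --no-core build and the profile directory, and the stray module name is made up):

```shell
# Toy version of the makefile diff check: identical trees produce no output;
# code present on only one side shows up as "Only in ...".
a=$(mktemp -d); b=$(mktemp -d)
mkdir -p "$a/modules/bean" "$b/modules/bean"
touch "$a/modules/bean/bean.module" "$b/modules/bean/bean.module"
# A module that exists locally but not in the build = missing makefile entry.
mkdir -p "$b/modules/stray_module"
diff -rq "$a" "$b"
```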

Fork repository / push changes / pull request

Now that you have made your changes and everything is good to go, it’s time to push those changes back to the repository.

For the sake of a manageable review process, it's preferable that all changes be made in a fork, with a pull request back to the master repository.

If you've only ever lived in the realm of Drupal.org, this may be an entirely alien process, but it is again relatively straightforward.

Note: If you don't have a Github account, you will need one. Sign up for free at https://github.com/join

  1. Go to the relevant Github repository:
  2. Click the Fork button (top right of the page) and follow the onscreen instructions.
  3. Click the Copy to clipboard button on the clone URL field (in the right sidebar).
  4. Add a new GIT remote to your local (development) site with the copied URL.


    cd ~/Sites/drupalmel-7.x/profiles/drupalmel
    git remote add fork [email protected]:Decipher/drupalmel.git

  5. Commit and push the changes to your fork.
  6. Create a pull request via your Github fork by clicking the Pull request button, providing as much detail as possible of what your changes are.

If all goes well, someone will review your pull request and merge the changes into the relevant website.

The review process

So this is the not-so-community-friendly part of the process: in a perfect world the community would be able to run itself, but Github isn't necessarily set up this way, nor is Drupal.org. Someone has to specifically approve a pull request, and currently that is only myself and Peter Lieverdink (cafuego).

I'm absolutely open to suggestions on how to improve this; comment below if you have any thoughts.

The uncommitables

Not everything should be committed, especially in a public repository. A perfect example of something that shouldn’t be committed is an API key.

The DrupalMelbourne website integrates with the Meetup.com API to pull in all DrupalMelbourne Meetups, but exposing the API key to the codebase would open the DrupalMelbourne meetup group to abuse and spam.

To deal with this, API keys and other sensitive items can be included directly on the server or in the database, and placeholders can be used for local development.

Open source your site?

Exposing your website codebase is definitely not normal practice, and it's absolutely not for everyone; I couldn't imagine trying to convince a client to go down this road. But for a community site, especially a Drupal-based community site, it just makes sense. While I wouldn't expect every visitor with Drupal knowledge to volunteer their time to help with the site's development, anyone who does is 100% more than you'd get otherwise.

Jul 29 2015
Jul 29

So, Doug Vann emailed the two of us a while ago and said that I should have you on because

"I'm on the record as a huge fan of Tom's for his well educated and well rounded perspective on proprietary and Open Source software solutions. ... I'd be excited to hear Tom interviewed on the topic of how Drupal 8 will continue to erode into the proprietary market."

I thought that sounded good, so here we are!

Web Content Management

  • When you replied, you said that Drupal 8 is the most important product release in the history of the WCM market. Can you start out by explaining what WCM stands for and what qualifies software as a WCM product?
  • You also mentioned that the 2nd most important release was Day Software’s CQ5. What is that?
  • When I hear about Drupal’s competitors, I generally hear about Wordpress and Joomla. Why aren’t either of those number two?

Drupal’s Place in the WCM Market

  • How has Drupal fared in the WCM market so far?
  • What do you see Drupal 8 bringing to the table that sets it apart from other products?

Questions from Twitter

  • Jacob Redding

    • How does Drupal 8 change the comparison with AEM? Specifically what are the features with Drupal 8 that bring Drupal to a more level playing field with AEM? Is there a single specific feature that Drupal does hands down better than AEM?
  • Doug Vann

      • Drupal promotes an "Ownership Society" where universities, media companies, governments, etc. hire in Drupal talent and build sites in-house. How does D8 impact that trend? Is D8 more for shops and agencies and less for DIYers, or is that just F.U.D. talking?
    • Any Drupaler would state that Drupal has been "disruptive" insofar as we have allowed highly visible sites to ditch their proprietary CMS in favour of Drupal.
      • To date, has that success been "truly disruptive" by your definition?
      • With the astounding advancements baked into D8, are you looking forward to an even more disruptive presence in the CMS playing field?
    • Shops
      • Is Drupal 8 ushering in a new era which will see a fundamental shift in how Drupal is delivered in the areas of customer procurement, engagement, and delivery?
      • To reword that. Are Adobe CQ5 and Sitecore shops operating significantly different than Drupal shops today AND are we going to see Drupal shops retooling and reshaping to a more enterprise looking organization?
      • In the past 18+ months, it seems that more people are willing to admit that Drupal 8 is moving Drupal "Up Market." Agencies are often the vendor of choice in those deep waters. Should we expect more mergers and acquisitions which will ultimately empower agencies to deliver Drupal services in-house?
    • The little guys
      • Where are the little guys in the D landscape?
      • Do you still see the $10K and the $45K range websites feeding the smaller end of the Drupal ecosystem?
Jul 28 2015
Jul 28

There was a time when search engine bots would come to your site, index the words on the page, and continue on.  Those days are long past.  Earlier this year, we witnessed Google's ability to determine if our sites were mobile or not.  Now, the evolution of the Googlebot continues.

I would say it was not uncommon for web developers to receive at least a few emails like this from Google Search Console today.

To: Webmaster...
Google systems have recently detected an issue with your homepage that affects how well our algorithms render and index your content. Specifically, Googlebot cannot access your JavaScript and/or CSS files because of restrictions in your robots.txt file. These files help Google understand that your website works properly so blocking access to these assets can result in suboptimal rankings.

Well, that's a little bit of information I never thought about before: Google wants to understand how my website works, not just its content and structure. Turns out, Google has been working toward this since October of last year.

Update your robots.txt

To allow Googlebot to access your JavaScript and CSS files, add a specific User-agent section for Googlebot, repeating the rules you already have and adding the new "Allow" rules:

Allow: /*.js*
Allow: /*.css*

User-agent: *
# Directories
Disallow: /includes/
Disallow: /misc/
Disallow: /modules/
Disallow: /profiles/
Disallow: /scripts/
Disallow: /themes/
# Files
Disallow: /CHANGELOG.txt
Disallow: /cron.php
Disallow: /INSTALL.mysql.txt
Disallow: /INSTALL.pgsql.txt
Disallow: /INSTALL.sqlite.txt
Disallow: /install.php
Disallow: /INSTALL.txt
Disallow: /LICENSE.txt
Disallow: /MAINTAINERS.txt
Disallow: /update.php
Disallow: /UPGRADE.txt
Disallow: /xmlrpc.php
# Paths (clean URLs)
Disallow: /admin/
Disallow: /comment/reply/
Disallow: /filter/tips/
Disallow: /node/add/
Disallow: /search/
Disallow: /user/register/
Disallow: /user/password/
Disallow: /user/login/
Disallow: /user/logout/
# Paths (no clean URLs)
Disallow: /?q=admin/
Disallow: /?q=comment/reply/
Disallow: /?q=filter/tips/
Disallow: /?q=node/add/
Disallow: /?q=search/
Disallow: /?q=user/password/
Disallow: /?q=user/register/
Disallow: /?q=user/login/
Disallow: /?q=user/logout/

User-agent: Googlebot
# Directories
Disallow: /includes/
Disallow: /misc/
Disallow: /modules/
Disallow: /profiles/
Disallow: /scripts/
Disallow: /themes/
# Files
Disallow: /CHANGELOG.txt
Disallow: /cron.php
Disallow: /INSTALL.mysql.txt
Disallow: /INSTALL.pgsql.txt
Disallow: /INSTALL.sqlite.txt
Disallow: /install.php
Disallow: /INSTALL.txt
Disallow: /LICENSE.txt
Disallow: /MAINTAINERS.txt
Disallow: /update.php
Disallow: /UPGRADE.txt
Disallow: /xmlrpc.php
# Paths (clean URLs)
Disallow: /admin/
Disallow: /comment/reply/
Disallow: /filter/tips/
Disallow: /node/add/
Disallow: /search/
Disallow: /user/register/
Disallow: /user/password/
Disallow: /user/login/
Disallow: /user/logout/
# Paths (no clean URLs)
Disallow: /?q=admin/
Disallow: /?q=comment/reply/
Disallow: /?q=filter/tips/
Disallow: /?q=node/add/
Disallow: /?q=search/
Disallow: /?q=user/password/
Disallow: /?q=user/register/
Disallow: /?q=user/login/
Disallow: /?q=user/logout/
# ========================================= #
Allow: /*.js*
Allow: /*.css*
# ========================================= #


Starting from the root of the site (/), the first * covers the path and file name; the second * covers any version strings or query variables that come after the file extension.
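Robots.txt wildcards aren't regular expressions, but the effect of those two Allow rules can be approximated with grep to see which URLs they would match (a standalone sketch; the sample paths are hypothetical):

```shell
# '*' in robots.txt matches any character sequence, so /*.js* and /*.css*
# behave roughly like the regex below; only the two asset URLs match.
printf '%s\n' '/misc/drupal.js?v=7.38' \
              '/themes/bartik/css/style.css' \
              '/node/1' \
  | grep -E '^/.*\.(js|css)'
```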

Fetch and Verify

Google Search Console, Fetch as Google

Once you have completed the changes and published them to your production server, log in to Google Search Console to verify.

  1. Under Crawl, select Fetch as Google
  2. Enter robots.txt in the URL box, and click Fetch and Render. Once the query finishes, click on the double arrows >> to verify it grabbed the correct file.
  3. If all is well, return to the page, and click the Fetch and Render button again to fetch the homepage. Click the double arrows >> to view the results.

If you need to exclude other files, update your robots.txt and start from the first step.  If you want to test other pages, repeat step 3 until you are satisfied.

You may get a "Partial" result if you have external scripts, that is, scripts you are loading on your page from other websites or services. There is no real fix for that, and I find it hard to believe that Google will penalize you for assets you cannot allow it to index.

Jul 28 2015
Jul 28

After some more tinkering with the Raspberry Pi Dramble (a cluster of 6 Raspberry Pis used to demonstrate Drupal 8 deployments using Ansible), I finally was able to get the RGB LEDs to react to Nginx accesses—meaning every time a request is received by Nginx, the LED toggles to red momentarily.

This visualization allows me to see exactly how Nginx is distributing requests among the servers in different load balancer configurations. The default (not only for Nginx, but also for Varnish, HAProxy, and other balancers) is round-robin distribution, meaning each request is sent to the next server in turn. This is demonstrated first in the video below, followed by a demonstration of Nginx's ip_hash method, which pins each visitor to one backend server based on a hash of their IP address:
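The two distribution methods can be sketched in Nginx configuration (a minimal fragment; the backend addresses are illustrative, not the Dramble's actual ones):

```nginx
# Round-robin is the default: each request goes to the next backend in turn.
upstream dramble {
    server 10.0.1.61;
    server 10.0.1.62;
    server 10.0.1.63;
}

# Adding ip_hash pins each client IP to one backend instead.
upstream dramble_sticky {
    ip_hash;
    server 10.0.1.61;
    server 10.0.1.62;
    server 10.0.1.63;
}

server {
    listen 80;
    location / {
        proxy_pass http://dramble;
    }
}
```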

It's fun to be able to visualize things like Drupal deployments, Nginx requests, etc., on this cluster of Raspberry Pis, and in addition to a presentation on Ansible + Drupal 8 at MidCamp, and Ansible 101, I'll be showing the Dramble in a soon-to-be-released episode of Jam's Drupal Camp from Acquia—stay tuned!

Jul 28 2015
Jul 28

Mediacurrent Dropcast: Episode 8

This episode we welcome Shellie Hutchens, Mediacurrent’s Marketing Director, to talk about upcoming webinars and the fact that Mediacurrent is hiring. Ryan picked Stage File Proxy as the Module of the Now. We discuss our first non-Drupal article, from Four Kitchens, about Saucier (pronunciation TBD). Mark stumbles through some D8 news, and of course we finish off with some great conversation during Ryan’s Final Bell.

Episode 8 Audio Download Link

Show Updates:

Weekly Drop

Mediacurrent Blog Mentions:

  • Part 1: Testing, Testing, 1, 2, 3: Tips and Tricks To Start User Testing  by Beth Davenport - Senior Digital Strategist - her very first post.

    A roadmap to usability testing. User feedback is very important, and you want it often and early. Once a website is ready, it is important to do usability testing with the stakeholders so you can get feedback and make any adjustments before delivery of the product. Some of the key elements for success:

    • Schedule the testing session and do not make it too long.
    • Make sure you have the right testers.
    • Have scenarios and scripts for them to execute.
    • Record the testing session.
    • Document any issues found.
  • 17 Tips for Leading Effective Conference Calls by Rob McBryde - Project Manager

    Good blog post on having efficient and productive conference calls. Rob sent out a survey to the Mediacurrent team about what they would like to see done better in conference calls. Here are some of the tips they came up with:

    • Join the call at least a minute or two before the scheduled meeting time.
    • Don’t start unless the key people are there so you don’t have to repeat things
    • Address someone by name if you want them to respond
    • When not speaking, attendees should mute their phone/mic
    • Provide a meeting recap at the end of the call

Special Guest: Shellie Hutchens

  • Upcoming Webinars
  • Upcoming Camps / Cons
  • MC’s Culture of Content
  • We’re Hiring (make sure you tell ‘em we sent ya)


  • Jeff from Innoppl sent us an article about assembling a Drupal 8 Development Team. PS: He addressed it to someone named Robyn.. Do we have one of those on our podcast?

Module of the Now:


This week in Drupal 8:

The Final Bell:

Jul 28 2015
Jul 28

Here we go again! It's your monthly summary of all things board meeting at the Drupal Association. This month we covered board governance (there's a seat opening up), the D8 Accelerate campaign, and the Association strategic frame. Plus, as a bonus, the board approved the Q2 financials for publication. As always, if you want to catch up on all the details, you can find everything you need to know about the meeting online, including minutes, materials, and a recording. If you're just here for a summary view, read on!

Meeting Minutes

Related Materials

Video Recording

Board governance

Angie Byron's term on the board is going to be up this fall, and she has expressed her desire not to renew that term. We're going to be very sad to see Angie go, but thrilled that she will have one less hat to talk about when explaining which hat she is wearing at any given point during your next meeting with her. Seriously - she's brought so much thoughtfulness and passion to the board. She's not leaving us yet (her term expires 10/31), but our Governance Committee will be working with the Nominations Committee to recruit candidates and help the board make the next selection.

D8 Accelerate

As I write these words there are just 10(!) release blockers standing between us and a release candidate for Drupal 8. Part of the momentum this year has come from Drupal 8 Accelerate. We've made over 40 grants, worth more than $120,000 so far. That's helped us close nearly 100 issues, addressing some really important features, like a beta to beta upgrade, security bugs, and performance. If you're curious about what's getting funded, you can always see the full list. And, we're getting close to reaching our goal - we've raised $223,000. You can help us reach our $250,000 goal by making a donation today!

Drupal Association Strategic Frame

Why are we doing the work we do? Because everyone at the Association wants to have a positive impact for Drupal. The best way for us to have an impact is to pick a few goals that we are going to focus on achieving. The Association board used their January retreat to set some 3-5 year goals for the Association:

  • To develop sufficient professionals to meet global demand for Drupal
  • To lead the community in focused, efficient, effective development of Drupal
  • To ensure the sustainability of the Drupal project and community
  • To increase Drupal adoption in target markets
  • To increase the strength and resilience of the Drupal Association

We've been working since then to select the right strategies and objectives (1 year to 18 month time frame) for our work. You can see the directions we're headed in the presentation we shared. It's important to note that we expect to revisit our strategies and objectives on a quarterly basis to adjust as we go. The world of Drupal moves fast, and we need to as well. So, although we are setting 12 to 18 month objectives, we will be adjusting the frame much more frequently, and won't be sticking with objectives that we find don't really support the work.

2015 Q2 Financials

And in the most exciting news of all, the second quarter financials were approved by the board. You can always find whatever financials have been released in the public financials folder. If you have never taken a look at the financials before, I recommend it. Although I tease about them being boring, I love financial statements! A while back, I wrote up a post about how to read our financial statements. I also like pointing out that each Con has its own tab in our financial statements, so you can see exactly how that money comes in, and where it is spent.

See you next time!

And that's it for this summary. But, if you have questions or ideas, you can always reach out to me!

Flickr photo: Joeri Poesen

Jul 28 2015
Jul 28

Find some useful tips on Drupal website data security from our guest blogger Jack Dawson, founder of Big Drop Inc.

In Drupal web development, there are a number of things you can do to ensure a superior user experience and consistency, as well as save the webmaster time and pain down the road. First, keep in mind Drupal's theme structure and how you intend to organize your content, so you can take advantage of Drupal's best aspects and make the site efficient.

As a CMS, Drupal provides a host of advantages, especially its flexibility, allowing you to customize virtually everything about the site design. While this is great, it would all go to the dogs if you forget the most important aspect of running a Drupal site: regular and consistent backups.

According to research last year, hackers penetrated around 30,000 websites daily. That’s scary, right? You can believe that number is even higher today. Post-hacking, not all affected websites end up losing their data; in fact, hacking is just one of many ways website data can be lost, alongside corrupted hard drives, botched updates, server hiccups and human error.

Whether for migration or just to be safe, you must have a robust plan to back up your Drupal site consistently. In this article, we’ll discuss how this can be done manually and then automatically.

Backing up Drupal manually

There are two parts to your Drupal site, the database and the files, both of which you'll be backing up. File backup is very straightforward; you only need to:

  • Run your preferred FTP client to access your hosting server
  • Download your Drupal directory and save it to a safe location – use a cloud solution for added safety, such as your Dropbox account for individuals/smaller businesses or full-scale cloud services for large enterprises. The FTP manual can direct you if you’re unfamiliar with the process

To back up the database itself, which stores your configuration, settings and content, use phpMyAdmin. The steps are given below:

  • Gain access to phpMyAdmin from the cPanel
  • Choose the database to back up from the left-hand menu by clicking on it. This opens all tables found in the database
  • On the top menu, click “Export”
  • Click Custom – "display all possible options” which offers further options. Ensure you have selected all tables.
  • Mark the box with “Add DROP TABLE/VIEW/PROCEDURE/FUNCTION/EVENT statement” which is found under “Object Creation Options”. This configures existing tables to be automatically replaced on database import
  • Select “Go” to save the database in a cloud location of your choice.

Backing up Drupal Automatically

Manual backup is not only time-consuming; you also cannot guarantee that you'll always have the time and/or presence of mind to consistently back up every few days. Luckily, Drupal provides for automatic backups through modules.

The leading module used by more than 250,000 websites today is the Backup and Migrate module. It supports a host of backup storage destinations including FTP, DropBox, email, Amazon S3 etc. You can also schedule multiple backups, exclude tables and restore databases among many other useful backup applications.

In conclusion, remember that this module doesn't back up files, only the database; you can, however, install a different module for file backup. For best results, Backup and Migrate should be used with a cloud storage service like Amazon S3, which is supported out of the box, or Dropbox, for which an additional module is required.


Author bio: Jack Dawson is a web developer and UI/UX specialist at BigDropInc.com

He works at his own design, branding and marketing firm that he founded 9 years ago.

He likes to share his knowledge and points of view with readers.

Jul 28 2015
Jul 28

Entity Data is a handy little API that makes module builders' lives easier. If you need to build a module that adds functionality and data to an entity, you no longer have to implement your own CRUD and export/import support.

A module builder's dilemma

Fields are a powerful way to add data to Drupal entities. However, fields can sometimes be rather cumbersome, particularly when you want to add something and thus attach fields to entities that already exist.

The alternative is to roll your own. Use a hook alter to add a UI for your data to the entity edit form using regular Form API fields. Then, on submit, save the data to a custom table created by the module.

This approach was the primary way of adding functionality to nodes and other entities prior to Drupal 7, before fields were in core. This technique is still utilized quite often in Drupal 7.

Each module that needs to manage entity data without fields has to implement a common pattern for properly storing data:

  • Provide a schema to create a new table to store the custom data.
  • Implement CRUD functions for saving, loading and deleting data, including deleting records when the entity is deleted.
  • Make sure the data exports and imports with the entity

A module builder's dream

OK, maybe the header is a bit of hyperbole, but on recent projects where we had to add data to nodes and various custom entities, Entity Data really made our lives a lot easier.

We created the Entity Data module to centralize the functionality of the common pattern above into a simple-to-use API.

The primary functions for interacting with Entity Data are:

entity_data_set($entity_type, $entity_id, $name, $value, $revision_id = 0, $language = LANGUAGE_NONE);
entity_data_get($entity_type, $entity_id, $name, $default = NULL, $revision_id = 0, $language = LANGUAGE_NONE);
entity_data_del($entity_type, $entity_id, $name, $revision_id = 0, $language = LANGUAGE_NONE);

They work pretty much the same way as Drupal's handy variable_get, variable_set, and variable_del. Except instead of storing data in a global context, the data is associated with an entity. And rather importantly, data is loaded only when the entity is loaded, not globally like core's variable_* methods.

In addition to using the getter/setter functions, data is auto-loaded and saved with the entity. Any data that has been saved to an entity will be automatically loaded into the entity's entity_data property, where it can be accessed directly.


Additionally any data added to the entity_data property will automatically be saved on entity_insert or entity_update.

This also means that entity_data is automatically exported with the entity and will be maintained on import.
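As a rough sketch of the API described above (the function signatures come from this post; the entity ID, data name, and values are made up for illustration):

```php
<?php
// Hypothetical usage inside a Drupal 7 module. Requires the
// Entity Data module; this will not run standalone.

// Store a value against node 42.
entity_data_set('node', 42, 'highlight_color', 'blue');

// Read it back later; 'none' is returned if nothing was saved.
$color = entity_data_get('node', 42, 'highlight_color', 'none');

// Remove it when it is no longer needed.
entity_data_del('node', 42, 'highlight_color');
```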

So next time you have to write a module to extend entities, save yourself some time and code by letting Entity Data handle your CRUD.

Have questions or comments about Entity Data? Leave them in the comments below and we'll answer ASAP!

Jul 28 2015
Jul 28

Business and Strategy - Track

Our very own Dania Gerhardt talks about how to diversify your business beyond Drupal.

Coding and Development - Track

Josef Dabernig (who held his first keynote at DrupalCamp North last week) shares insights on coding with Rules in Drupal 8 alongside Wolfgang Ziegler (fago) and Klaus Purer (klausi).

Core Conversation - Track

Michael Schmid, together with Lauri Eskola, talks about how to avoid contribution burnout in the first place, and how to survive it.

DevOps - Track

He is the master of log files at Amazee Labs. Bastian Widmer will show you how to visualize log files with ELK stacks.

Project Management - Track

Sharing is caring. That is why Dagmar Muth and Michael Schmid tell all in their session AMA: Drupal Shops explain how they do it, together with Emma Jane Hogbin Westby (emmajane), Steve Parks (steveparks) and emkay.

Site Building - Track

Building layouts from 7 to 8: Coding vs. Clicking is Josef Dabernig's second session, in which he shares his site building expertise. But we've got even more for you: Anna Hanchar and Claudine Braendle present how to create a great experience for content editors.

See you soon in Barcelona!

Jul 28 2015
Jul 28

The Problem

Drupal traditionally excels in the area of content organization – not only as a content management system, but also in allowing you to create structured data, thanks to the entity and field systems.

However, while flexibility in Drupal 7 has grown – compared to Drupal 6 – the preview and revisioning systems have been very limited (and still are in Drupal 8, as of now). The only possibility in Drupal 7 was to click “preview” and see a very rough outline of how the content might look styled with the admin theme.

Trying to use the same CSS and/or JS in the admin and default themes is a difficult endeavor. Solutions include AJAX callbacks and iframes, but those solutions are neither optimal nor in widespread use.

The Drupal 7 core revisioning system is also limited and mainly allows auditing and reverting back to another revision; any saved revision is immediately live and overwrites the state of the old revision. Therefore, it is impossible to have different stages of the same piece of content once it has been published.


The workflow needed by most larger content teams is that each article starts in a “draft” stage, is then revised by an editor and, finally, approved by a content publisher.

While the workflow provided by the Workbench module is already quite good at this, it still lacks something that even bigger teams need: The possibility to publish content together as a “pack.”

One example of this is a large marketing campaign that has several articles which, taken together, form the new front page and show several subpages. In order to properly review these changes, editors and content publishers need to be able to see the set of changes on the site as a whole. CPS fills this gap, because it allows you to view the whole site as if the content was already published – but your live site remains unchanged!

How Does it Work?

CPS divides your site into changesets, called ‘site versions’ in the UI.

Every editor has their own ‘site version’ (though collaboration and moving of drafts between changesets is possible) and can see the site overlayed with all the changes they have made.

A ‘site version’ tracks all the entities that have changed over time, together with the ‘draft’ revision that was last saved within the ‘site version’.

[ IMAGE: CPS diagram for entities and revisions. Caption: "In the 'Sandbox' site version, Revision 2 is shown for Entity 1, while Revision 1 is shown for the live version. Entity 2 and Rev 4 do not yet exist on the live site." ]

Whenever Drupal is asked to load an entity or create a list via Entity Field Query or Views, CPS checks which site version is active and replaces the entities retrieved with the correct revision that applies to this site version.

Under the hood, CPS uses the Entity Status and Drafty modules to allow creation of non-published revisions by creating the new revision – via the core mechanism – then immediately reverting to the old “live” revision.

Example Workflow

CPS includes the cps_workflow_simple submodule, which allows you to set up a simple editorial workflow and define the people who will receive e-mails at each stage of the workflow. Thanks to a rich hook system, CPS can easily be extended for more complex workflows.

CPS starts with an initial site version which contains all the content first published when CPS was installed.

To integrate CPS with your site, download the CPS module, enable cps, cps_node and cps_workflow_simple, and all associated modules asked for by CPS, like entity_status, drafty, iib, and diff. You will now see a Site version bar at the top of your site. Click on the plus sign (“+”) to create a new site version – your very own “playground.”

Please note that once CPS is active, entities under CPS control can no longer be added or edited when not within a site version.

Revision Chart

Step 1: Create a New Site Version

Create and edit some entities. Notice how when you use the ‘site version’ switcher widget at the top, you can view your site with and without your changes.

Step 2: Review Changes

Next, click on the gear to the right of the site version switcher widget.

On this screen you will see a diff of all changes that have been made in this site version. You can review all the changes and then send the ‘site version’ with a message to the editor, or approve and publish it directly.

Congratulations! You have made changes, but they are not visible on the live site, only on the internally visible site version.

Step 3: Publish the Site Version

When you are ready, click the Publish button, confirm, and after a few seconds, your site version will be published.

All your changes are now visible to everyone and you have completed the editorial workflow.

Known Limitations

There are some caveats: Only entities that are revisionable can be used together with CPS.

For example, to use Menus with CPS, you need to create a menu_link content type and use Views to populate the menus. Similarly, you would need to use a simple custom entity type to replace taxonomies, enable file_entity_revisions for media module, and so on.


CPS is already being used in production, although it’s still under heavy development. But testing it can give you an idea of what a powerful editorial workflow it provides for Drupal, even now.

Care should be taken when combining this module with the Field Collection module. While CPS is now based on Drafty – which takes care of all the low-level revisioning parts – this is still an area where both Workbench and CPS had major bugs in the past: proceed with caution.

Other open areas for contribution and testing are the Title and Entity Translation modules.

Please file any bugs you find in the issue queue and help us move on down the road to a stable release.


Jul 28 2015
Jul 28

Yesterday we hosted our first Drop Guard webinar. For those who couldn’t attend, I share the video with you below. Both the participants’ interaction and the number of attendees far exceeded our expectations: more than 70 people watched the free webinar and learned how to update Drupal automatically, with integration into development and deployment workflows. First of all, I want to thank everybody who has supported us from the initial idea through to the first closed beta phase and helped us to improve the service. Without the help of the awesome Drupal community we would never have reached our ambitious goal of building this product!
Here I’d like to share a couple of tweets we got during the webinar, such as @drop_guard @tweetsBS Chapeau, you guys build something really great! #drupal and Drupal updates as a service? @drop_guard looks like a powerful solution. Thanks! You make us very proud and prove we’re on the right track.

A couple of questions have come up that I want to summarize and share with you here.

"How do you update servers on @acquia or @getpantheon when you don't have direct access to staging servers?"
In the current version, we support "Events and Actions" to trigger deployment actions such as "call a URL" or "execute SSH command". With these actions you can call scripts on a separate server that trigger deployment actions on pantheon.io, Acquia Cloud, platform.sh or Freistil Box hosting. In the future we’ll consider how to provide seamless integration with these hosting providers. As they all expose an API or a CLI, you can use scripts that are triggered by the "execute SSH command" action in the events tab (see the video for further details). We’ve already started our first talks about cooperating with Freistil Drupal hosting and platform.sh, and we hope to have similarly interesting and valuable conversations with other hosting platform providers. If you have concrete ideas for integration scenarios, I’m happy to discuss them in the comments.

"Is there a way to let Drop Guard work via FTP only?"
An FTP-based workflow is planned for the future, in 2016 at the earliest. We plan to provide a feature that lets you connect your FTP account, and Drop Guard will copy the code base to operate on a local Git repository. Currently, Drop Guard requires a Git repository to which your code is committed.

"Can I integrate Drop Guard with Jira?"
You can do this in the current version by using the webhook integration to create a task. To change statuses and trigger deployment actions accordingly, you'll need to wait until we release our REST API, which lets you execute actions on tasks.

"Does Drop Guard need a copy of my database?" No, Drop Guard doesn't require you to grant access to or copy your live database. To avoid this, we based our architecture on services exposed by the Drop Guard module that you install on your live site. This module exposes an API and transfers the information that Drop Guard needs from your live installation via an encrypted connection; this is also how Drop Guard receives information about installed modules and their versions to determine available updates.

"Is there an agency partner program for Drop Guard partners?" Our pricing model targets Drupal shops' need for scale: you pay per site and use Drop Guard as a white-label service to sell reliable update services to your clients. You can also add your agency fee to the price as needed. By the end of this year there will be an affiliate program that rewards successful recommendations of Drop Guard.

For those of you who couldn’t attend the webinar live, here’s the full video:

[embedded content]

and the slides are available as well:

Jul 28 2015
Jul 28

Drupalize.Me Tutorial

In this tutorial, you will learn how to place panels as a block in any region of your theme using Drupal's Blocks UI and a module packaged within the Panels project: Mini panels. This tutorial is based on a free video in our library, Placing Panels in Blocks with Mini Panels.

A common use case for mini panels is "mega menus"—a collection of blocks grouped together that displays as an enhanced menu when the user activates a menu item. But mini panels can be used anytime you want to place panels as a block into any region of your theme, either using the core Blocks UI or contributed module Context. Instead of using the Panels module to create custom content within the primary content region, the Mini panels module allows you to expose panels as blocks that can be placed in any region of a page template, such as a header or footer region.


This tutorial assumes that you have at least an intermediate understanding and familiarity with Drupal, especially Blocks and Menus, that you can download and enable modules using a method of your choice, and that you have some experience or exposure to Panels, even if your only exposure is walking through any of the Panels video tutorials in our Drupalize.Me Library.

What You Will Learn

By the end of this tutorial, you will know how to find and enable the Mini panels module, create a mini panel that uses a 3-column layout containing 3 separate menus, and use the Blocks UI to place this mini panel in the footer region of our theme.

Why Mini Panels?

Panels are useful containers for Views, Menus, Blocks, and other widgets provided by Drupal core or contributed modules. But did you know that you can also expose panels as blocks? This can be a useful option, especially when you want to place panels in specific places in various regions of your theme—outside of the main content region.

For example, on the recent Lullabot redesign of blastr.com, mini panels containing views are placed as blocks using the Context module. These views leverage taxonomy terms as a contextual argument, and make use of the sitewide context that Context module provides. You can see this in action at blastr.com by activating the visible primary navigation items or by activating the collapsed mobile-friendly menu. (This menu is denoted by 3 horizontal bars, which is colloquially referred to as the "hamburger menu").

Mini Panels on blastr.com

What We Are Building

On my Drupal demo site, I want to create an enhanced footer region that displays 3 different menus in 3 separate columns. Even though my site currently uses Panels, the only region that Panels displays content in is the main Content region. The other regions of my theme, such as the header and footer regions, aren't configured to be accessible to Panels.

Additionally, my footer region is a full-width region, but I want to make use of a 3-column panels layout that allows me to place a different menu in each column.

So, is it possible to use panels in the footer region? The short answer is yes! Using the Mini panels module, we can create a mini panel and place it as a block in the footer region using the Blocks UI. Let's get started!

Enabling Mini Panels

Mini panels is a module contained within the Panels project. If you haven't already downloaded Panels, head over to https://www.drupal.org/project/panels and download the latest recommended release.

To enable the Mini panels module:

  1. In the Administrative Menu, go to Modules
  2. Locate the Mini panels module in the Panels group
  3. Check the box next to Mini panels, then click the Save configuration button at the bottom of the page

Creating Our Mini Panel

  1. From the Admin Menu, go to Structure > Mini panels (admin/structure/mini-panels)
  2. Click +Add to create a new mini panel
  3. In the Administrative title field, type "Footer Menus", which generates the machine name of footer_menus
  4. Complete the Administrative Description field for the benefit of your future self or anyone else wondering about the purpose of this mini panel (e.g.,"Navigation, user, and main menus in a 3-column mini panel.")
  5. Optional: Add a category to help you organize your mini panels.
  6. Click Continue
  7. Optional: Add contexts (none used in this tutorial)
  8. Click Continue
  9. On the Layout page, from the Category drop-down, select Columns 3
  10. Select the radio button for the Three column (33/34/33) layout
  11. Click Continue
  12. On the Content page, in the "Left side" region, click the gear then select Add content
  13. Select Menus on the left then click Navigation then the Finish button
  14. In the "Middle column" region, click the gear and select Add content
  15. Select Menus, click User menu, then click the Finish button
  16. In the "Right side" region, click the gear and select Add content
  17. Select Menus, click Main menu, then click the Finish button
  18. Under Live preview, click the Preview button to see the content from each menu displayed in the 3-column mini panel
  19. Optional: To change the content pane titles (Navigation, User menu, Main Menu), click the gear on the menu pane and select Settings. Check Override Title and enter a title for the pane. To remove the pane title completely, check Override Title and enter <none> into the Title field.
  20. When you are done configuring your mini panel, click the Finish button on the Content page.

Adding the Mini Panel to the Footer

All mini panels are exposed as blocks. Next, we'll navigate to the Blocks administration page and add our new mini panel to the footer region of our theme.

  1. From the Administrative Menu, go to Structure > Blocks
  2. Scroll down to the Disabled blocks section and locate the block Mini-panel: "Footer Menus"
  3. Select the cross-hairs to the left of Mini-panel: "Footer Menus" and drag it to the Footer region. Alternatively, select Footer from the dropdown menu in the center column of the Mini-panel: "Footer Menus" row.
  4. Click Save blocks

Check Your Work

Now navigate to the home page by clicking the home icon from the Administrative menu, and scroll down to the footer. The footer now shows the three menus we added in our mini panel!

Two Places for Configuration

The important thing to remember is that this component is both a block and a mini panel—that is, the mini panel is exposed as a block. So, configuration can happen in 2 places.

Block Visibility and Configuration

To control visibility settings or decide which page this block should appear on, go to Structure > Blocks and click the Configure link next to the Mini panel block. (If you're using the Context module, the block's visibility and placement settings may be configured using the Context UI instead.)

Mini Panel Configuration

There are several ways to access the mini panel's configuration to add or remove content, add a context, or change layout.

When viewing the mini panel, hover the cursor over the mini panel until a gear appears in the upper right. Select the gear then choose Configure block. From the block's configure screen, there will be a link to Manage the mini-panel. This link will take you directly to the Content settings page in the mini panel administrative section.

You can also access the mini panels from the Administrative menu. Go to Structure > Mini panels. Next to the mini panel you want to edit, from the dropdown menu on the right, select Edit.

One thing to note about the administrative interface for Mini panels: the main navigation (Settings » Context » Layout » Content) is not a linear or wizard-type configuration, despite its appearance. Once the mini panel has been created, you don't have to proceed through these settings from left to right. To access Content, simply click the word Content. You can then add or remove content, move panes, or change pane settings.

Resources and Inspiration

This tutorial is also available as a free video, Placing Panels in Blocks with Mini Panels in the Drupalize.Me Library, or you can find it on the Drupalize.Me YouTube channel.

Create Mega Menus with Mini Panels! (What is a Mega Menu? (sitepoint.com).) You can use mini panels to create "mega menus" with the Drupal contributed module Menu Minipanels.

Explore the Lullabot redesign of blastr.com, and check out the sweet mini panels driving the main menu ("hamburger menu") and primary navigation (Trending, Features, Interviews, etc.).

Jul 28 2015
Jul 28

This is a follow-up to a post where I showed how you can use VDC to display data from an external database table in a Drupal View. In this post I display an external database view as a Drupal View. This is another step towards showing how Drupal can be used as a Business Intelligence (BI) or data analysis platform.

Most reports use data from more than one database table. To cater for multiple tables, Drupal Views provides functionality to create relationships. At the time of writing, you could not create relationships with VDC. This is not a problem, because VDC allows you to use database views as a data source for your view. You can create database views that combine data from different tables by using SQL joins. You can think of SQL joins as relationships in Drupal language.

Using the Chinook sample data set, I created a database view that showed total sales by genre per annum. The SQL code to create the database view is shown below.

CREATE VIEW vw_genre_total_year AS
SELECT
    DATE_FORMAT(InvoiceDate, '%Y') AS Calendar_Year
    ,c.Name AS Genre_Name
    ,SUM(a.Quantity) AS Total_Quantity
    ,SUM(a.Quantity*a.UnitPrice) AS Total_Amount
FROM
    InvoiceLine a
    JOIN Track b ON a.TrackId = b.TrackId
    JOIN Genre c ON b.GenreId = c.GenreId
    JOIN Invoice d ON a.InvoiceId = d.InvoiceId
GROUP BY
    DATE_FORMAT(InvoiceDate, '%Y'), c.Name;

Having already created my data source, the next step was to import the database view as a Drupal View using VDC. As you can see from the screen shot below VDC gave me the option to select database views or database tables.

VDC Database View

Once I had imported the database view, I created some exposed filters to make interacting with the data easier for the end users. The screenshot below shows the database view data shown in a Drupal View along with some filters defined.

VDC Database View Data & Filters

You have to apply your mind to database performance when creating database views. Your database view cannot have millions of lines that need to be summarised every time the view is accessed via the Drupal View. If you have considerable data that needs to be summarised you should use materialised views or create aggregated tables for better performance.

Using views or preferably materialised views will give you a measure of control on the SQL queries that you do not have when you rely on Drupal to generate the SQL queries. You will be able to use SQL features such as SQL UNIONS, OUTER JOINS and sub selects which I have found hard to implement using Drupal Views. Database views also give you a layer of abstraction. You could change the underlying database tables in the database view definition and not affect the Drupal View.
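As a sketch of the aggregated-table approach (the aggregate table name here is an assumption; the view name comes from the example above), you could pre-compute the summary on a schedule instead of summarising on every page view:

```sql
-- Hypothetical aggregate table, refreshed periodically (e.g. via cron);
-- point VDC at this table instead of the view for faster reads.
DROP TABLE IF EXISTS agg_genre_total_year;
CREATE TABLE agg_genre_total_year AS
SELECT Calendar_Year, Genre_Name, Total_Quantity, Total_Amount
FROM vw_genre_total_year;
```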

Jul 28 2015
Jul 28

MariaDB is a community-developed fork of the MySQL relational database management system, intended to remain free under the GNU GPL. Follow the link to learn more about MariaDB and its features. If you want to try MariaDB without losing MySQL, here is a tutorial for running MariaDB alongside MySQL.

Let's start with the steps to install MariaDB alongside MySQL:

  • Download the latest version (mariadb-10.0.20-linux-i686.tar.gz, as of writing this article) from here and extract the files into the /opt directory.
  • Create a directory for storing the mariadb data
[root@knackforge opt]# mkdir mariadb-data
  • Create symlinks from mariadb-10.0.20-linux-i686 to mariadb
[root@knackforge opt]# ln -s mariadb-10.0.20-linux-i686 mariadb
[root@knackforge opt]# ls -al
total 32
lrwxrwxrwx  1 root       root         26 Jun 24 10:06 mariadb -> mariadb-10.0.20-linux-i686
drwxr-xr-x 12 mariadb    mariadb    4096 Jun 24 10:05 mariadb-10.0.20-linux-i686
drwxr-xr-x  9 mariadb    mariadb    4096 Jun 24 09:42 mariadb-data
  • Create group mariadb and user mariadb and set correct ownerships:
[root@knackforge opt]# groupadd --system mariadb
[root@knackforge opt]# useradd -c "MariaDB User" -d /opt/mariadb -g mariadb --system mariadb
[root@knackforge opt]# chown -R mariadb:mariadb mariadb-10.0.20-linux-i686/
[root@knackforge opt]# chown -R mariadb:mariadb mariadb-data/
  • Create a new my.cnf in /opt/mariadb from support files:
[root@knackforge opt]# cp mariadb/support-files/my-medium.cnf mariadb-data/my.cnf
[root@knackforge opt]# chown mariadb:mariadb mariadb-data/my.cnf
  • Edit the file /opt/mariadb-data/my.cnf. We need to add custom paths, socket, port, user, data directory and base directory. Finally the file should look something like the following:
[client]
port            = 3307
socket          = /opt/mariadb-data/mariadb.sock

[mysqld]
datadir         = /opt/mariadb-data
basedir         = /opt/mariadb
port            = 3307
socket          = /opt/mariadb-data/mariadb.sock
user            = mariadb
  • Copy the init.d script from support files to correct location:
[root@knackforge opt]# cp mariadb/support-files/mysql.server /etc/init.d/mariadb
[root@knackforge opt]# chmod +x /etc/init.d/mariadb
  • Edit /etc/init.d/mariadb file, we need to replace mysql with mariadb as below:
- # Provides: mysql
+ # Provides: mariadb
- basedir=
+ basedir=/opt/mariadb
- datadir=
+ datadir=/opt/mariadb-data
- lock_file_path="$lockdir/mysql"
+ lock_file_path="$lockdir/mariadb"
  • We need to tell MariaDB to use only one cnf file. These changes need to be made carefully. In the start section, after $bindir/mysqld_safe, add --defaults-file=/opt/mariadb-data/my.cnf. Finally the lines should look like:
# Give extra arguments to mysqld with the my.cnf file. This script
# may be overwritten at next upgrade.
$bindir/mysqld_safe --defaults-file=/opt/mariadb-data/my.cnf --datadir="$datadir" --pid-file="$mysqld_pid_file_path" $other_args >/dev/null 2>&1 &
  • The same change needs to be made to the mysqladmin command below in the wait_for_ready() function so that the mariadb start command can properly listen for the server start. In the wait_for_ready() function, after $bindir/mysqladmin add --defaults-file=/opt/mariadb-data/my.cnf. The lines should look like:
wait_for_ready () {
    if $bindir/mysqladmin --defaults-file=/opt/mariadb-data/my.cnf ping >/dev/null 2>&1; then
  • Run mysql_install_db by explicitly giving it the my.cnf file as argument:
[root@knackforge opt]# cd mariadb
[root@knackforge mariadb]# scripts/mysql_install_db --defaults-file=/opt/mariadb-data/my.cnf
  • Now MariaDB can be started by
[root@mariadb-near-mysql opt]# /etc/init.d/mariadb start
Starting MySQL...
                                 [  OK  ]
  • To make MariaDB start at system boot, we need to do the following:
[root@knackforge opt]# cd /etc/init.d
[root@knackforge init.d]# chkconfig --add mariadb
[root@knackforge init.d]# chkconfig --levels 3 mariadb on
  • Finally test that you have both instances running:
[root@knackforge ~]# mysql -e "SELECT VERSION();"
[root@knackforge ~]# mysql -e "SELECT VERSION();" --socket=/opt/mariadb-data/mariadb.sock
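The init-script edits from the steps above can also be scripted with sed; here is a sketch run against a stub copy of the file (on a real system the target is /etc/init.d/mariadb):

```shell
# Create a stub with the four lines we need to change, then apply
# the same substitutions shown in the diff above.
printf '%s\n' '# Provides: mysql' 'basedir=' 'datadir=' \
  'lock_file_path="$lockdir/mysql"' > mariadb.init
sed -i -e 's/^# Provides: mysql/# Provides: mariadb/' \
       -e 's|^basedir=$|basedir=/opt/mariadb|' \
       -e 's|^datadir=$|datadir=/opt/mariadb-data|' \
       -e 's|\$lockdir/mysql|\$lockdir/mariadb|' mariadb.init
```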
Jul 28 2015
Jul 28

XHProf is a hierarchical profiler for PHP. It reports function-level call counts and inclusive and exclusive metrics such as wall (elapsed) time, CPU time and memory usage. A function's profile can be broken down by callers or callees. The raw data collection component is implemented in C as a PHP Zend extension called xhprof. XHProf has a simple HTML based user interface (written in PHP). The browser based UI for viewing profiler results makes it easy to view results or to share results with peers. A callgraph image view is also supported.

Install XHProf

I made sure xhprof-0.9.2 was installed inside the www-directory of the webserver.

wget http://pecl.php.net/get/xhprof-0.9.2.tgz
tar xvf xhprof-0.9.2.tgz
cd ./xhprof-0.9.2/extension/
./configure --with-php-config=/usr/local/bin/php-config
sudo make
sudo make install
sudo make test

The php-config file on my machine was located in /usr/local/bin/php-config. php-config is a script that reports the locations of your PHP installation; xhprof needs it for its configuration. If you cannot find the file, you can simply execute ./configure without the parameter.

If phpize is not found, you have to install php5-dev

sudo apt-get install php5-dev

The first time, make install and make test both failed for me, but xhprof worked nevertheless.

Set up PHP

In php.ini I set up this at the end of the file:
extension=xhprof.so
xhprof.output_dir=/var/tmp/xhprof


Create the directory: /var/tmp/xhprof

xhprof will keep its logfiles in there (files that are needed for the page analysis)

Set up your webserver

Create an entry for /path/to/xhprof-0.9.2/ in your webserver’s configuration (point xhprof.localhost to /path/to/xhprof-0.9.2/). If working locally, set up your hosts file accordingly (e.g. xhprof.localhost).

Restart php and your webserver.

Set up Drupal

When using Drupal, install Devel, activate the XHProf settings (admin/settings/devel), point it to /path/to/xhprof-0.9.2/, and set the URL of the XHProf UI: http://xhprof.localhost/xhprof_html

Devel will create a link at the end of each page that points to XHProf.
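Outside of Devel, you can also profile a script by hand; here is a sketch using XHProf's own API and the helper classes bundled with the 0.9.2 sources (the include paths and the profiled function are assumptions):

```php
<?php
// Requires the xhprof extension and the xhprof-0.9.2 sources;
// this will not run without them.
xhprof_enable(XHPROF_FLAGS_CPU + XHPROF_FLAGS_MEMORY);

slow_code_under_test();   // hypothetical function to profile

$data = xhprof_disable();
include_once '/path/to/xhprof-0.9.2/xhprof_lib/utils/xhprof_lib.php';
include_once '/path/to/xhprof-0.9.2/xhprof_lib/utils/xhprof_runs.php';
$runs = new XHProfRuns_Default();   // writes to xhprof.output_dir
$run_id = $runs->save_run($data, 'mysite');
// View at: http://xhprof.localhost/xhprof_html/index.php?run=<run_id>&source=mysite
```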


If you get the error that the function xhprof_enable is unknown, XHProf wasn’t properly installed. Try installing it again and make sure you have restarted PHP.

If the link in Drupal to XHProf doesn’t work, check if the run parameter in the url is supplied. If not, try to install XHProf again and make sure you have restarted php. A typical error for this is: “No XHProf runs specified in the URL.”

For errors concerning php-config and phpize, see above.

Jul 28 2015
Jul 28

This post is about the progress I have made while porting the Print module to Drupal 8 as part of GSoC 2015. Below is an excerpt about the work done in week nine, while the work done in week eight can be tracked over here.

This week I revisited PDF-generating libraries, and this time I managed to add dompdf to the PDF submodule. It is an HTML-to-PDF converter and is compliant with CSS 2.1. The library is available on GitHub over here and was easily installed via Composer by adding the following lines as a requirement in the composer.json file of the module:


"require" : {
   "dompdf/dompdf" : "0.6.*"
}

The library makes use of php-font-lib; hence, when installing it with Composer, it is recommended to set the DOMPDF_ENABLE_AUTOLOAD parameter, defined in the dompdf_config.inc.php file, to false. This was done by adding the following lines in the generator:


define('DOMPDF_ENABLE_AUTOLOAD', false);
require_once '/path/to/vendor/dompdf/dompdf/dompdf_config.inc.php';

After following the above procedure I was finally able to generate a PDF of my node's content. One limitation of this library is that it does not support rendering SVG images, but the theme logos shipped with Drupal 8 are SVG files, so they cannot be rendered in the PDF. I am still searching for a solution, but for now a user who wants a logo in the PDF will have to convert the SVG to a compatible format such as PNG or GIF themselves. The dompdf generator still lacks functions for generating the header and footer, which I will be adding in the coming week.
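For reference, a minimal sketch of how a generator can drive dompdf 0.6 (class and method names as in that release; the HTML string is only an illustration, not the module's actual markup):

```php
<?php
// Assumes DOMPDF_ENABLE_AUTOLOAD was set to false and
// dompdf_config.inc.php has already been loaded, as shown above.
$dompdf = new DOMPDF();

// Feed the rendered node markup to the converter.
$dompdf->load_html('<h1>Node title</h1><p>Node body</p>');

// Lay out the document and fetch the PDF bytes.
$dompdf->render();
$pdf = $dompdf->output();
```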

This week I was also able to add some more unit tests to the module, mainly related to testing the configuration of tabs and block generation. These can be browsed over here. Next week, after completing the dompdf generator, I will take care of some of the loose ends in the code and then add some more functional tests.

Some of the progress on the PDF part of the module can be viewed over here and the rest over here.

Jul 27 2015

Drupal 8 is the talk of the town, and hopefully by Barcelona it will be the headliner. But with all of the excitement come changes as well. Drupal developers have grown used to relying on hooks throughout Drupal's history, but "times are changing," as Joe Shindelar (eojthebrave) notes in his session about the new patterns Drupalistas will need to learn to make Drupal 8 work.

In Altering, Extending, and Enhancing Drupal 8, in room 117 on Tuesday from 11:00-12:00, eojthebrave will draw on his experience to help everyone in the audience leave with the best practices for coding with Drupal 8. This session will provide the information needed to make informed decisions about how your custom code can, and should, talk to Drupal in order to write modules that are easy to maintain and play well with others in a post-hook world.

After drawing a large crowd at DrupalCon Los Angeles, this is a Coding and Development session you should make sure to add to your schedule!

Jul 27 2015

Since the last Drupal 8 core update, the API module maintainers started looking for co-maintainers, and Two-Factor Authentication was rolled out to anyone with the Community role on Drupal.org (among other improvements).

What's new with Drupal 8?

Drupal 8's minimum PHP version increased to 5.5.9, and its minimum PostgreSQL version increased to 9.1.2. Also, tim-e handed off co-maintainership of the Contact module to Jibran Ijaz and Andrey Postnikov, and Frando stepped down as a maintainer of the Entity, Form, and Render systems. Special thanks to both tim-e and Frando for their amazing contributions!

Some other highlights of the month were:

How can I help get Drupal 8 finished?

See Help get Drupal 8 released! for updated information on the current state of the software and more information on how you can help.

We're also looking for more contributors to help compile these posts. Contact mparker17 if you'd like to help!

Drupal 8 In Real Life

Whew! That's a wrap!

Do you follow Drupal Planet with devotion, or keep a close eye on the Drupal event calendar, or git pull origin 8.0.x every morning without fail before your coffee? We're looking for more contributors to help compile these posts. You could either take a few hours once every six weeks or so to put together a whole post, or help with one section more regularly. If you'd like to volunteer for helping to draft these posts, please follow the steps here!