May 22 2016

The basics

If you run an online business you should take analytics very seriously. Improving sales, conversions and any other objectives your web application has is an iterative process that needs to be based on measurable and meaningful indicators.

Google Analytics is the most widely used tool to track user data and interactions with your web application. But if you don't have a clear strategy about what you expect and what you are going to do with this data, you will easily find yourself failing to extract any value from the overwhelming amount of data that GA collects by default.

If your current digital presence provider is just "enabling" analytics for you - relying on the fact that GA captures a lot of data out-of-the-box - you are wasting your time because:

  • It is difficult - and sometimes not possible - to map the default collected data to the specific processes and interactions of your web application.
  • Although it might be "fun" to track page views, visitors and other global indicators, if you are not integrating this into a defined strategy with objectives, it will be useless.
  • Conversions - something that you will always want to track in an e-commerce application - do not work out of the box. Avoid "URL based" conversion tracking or "configurable" conversion tracking through GA. Conversions should always be reported to GA with laser precision from within your application. Stay away from conversions measured through "thank you pages" and anything similar.

Although GA can get extremely complex, there are some basic steps you can take to ensure that you are at least extracting some real value from this tool:

  • Identify the key processes/interactions in your web application. Group them conceptually (e.g. the different pages during a checkout process are all part of the checkout process). Of course, this includes regular conversions (a sale, a subscription, etc.) but can also be extended to processes the user has to go through before performing the conversion, and to other non-conversion interactions that relate to global goals such as improving engagement or reducing bounces.
  • Define meaningful indicators for these processes. When possible, choose indicators or groups of indicators that answer a specific question and that can point to potential areas of improvement.
  • Ask your digital provider to track this data as GA events. No excuses here. Everything can be tracked, both from server side and client side, with a couple of lines of code. Even e-mail views and opens (for example from autoresponders in your application) can be tracked as conversions or GA events, as shown in the sketch after this list.
  • Track this information for a reasonable period of time. No need to seek statistical significance here, just use your common sense.
  • Go through the data to detect areas of improvement. If needed, define new indicators to expose an area or opportunity, or collect additional data that will allow you to propose improvements.
  • Make your changes.
  • Wait.
  • Repeat.
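
For example, a server-side hit such as an e-mail open can be reported straight to GA through the Measurement Protocol. Here is a minimal PHP sketch of that idea; the tracking ID, client ID and event names are placeholder values you would replace with your own:

// Minimal sketch: report a server-side interaction (e.g. an e-mail open)
// as a GA event through the Measurement Protocol. 'UA-XXXXX-Y' and the
// client ID are placeholders; when available, reuse the client ID from
// the visitor's _ga cookie so hits are attributed to the right session.
$params = [
  'v'   => '1',                     // Protocol version.
  'tid' => 'UA-XXXXX-Y',            // Your GA tracking ID.
  'cid' => '35009a79-1a05-49d7',    // Client ID (placeholder).
  't'   => 'event',                 // Hit type.
  'ec'  => 'email',                 // Event category.
  'ea'  => 'open',                  // Event action.
  'el'  => 'autoresponder-welcome', // Event label.
  'ev'  => 1,                       // Event value.
];
$context = stream_context_create([
  'http' => [
    'method'  => 'POST',
    'header'  => "Content-Type: application/x-www-form-urlencoded\r\n",
    'content' => http_build_query($params),
  ],
]);
// Fire-and-forget POST to the GA collection endpoint.
file_get_contents('https://www.google-analytics.com/collect', FALSE, $context);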

Besides tracking specific events, you should always:

  • Track conversions (sales, subscriptions, etc.)
  • Ensure that your web application is properly managing the User Id functionality of GA. The User Id functionality allows Google to track everything a visitor has done on a site, even across devices, if you have an authentication system in place. Imagine that you could open a sale on your e-commerce site, retrieve the UserId, go to the GA panel and see exactly everything that customer did prior to the purchase (how many visits, how much time, what pages, etc.). This is what the User Id is for.
  • If you are spending money on Google AdWords, make sure that you are perfectly propagating conversion values to GA. If you can't precisely measure the ROI of what is being spent in AdWords, you are throwing money away. AdWords now allows you to use any event from GA to track conversion values.

Tools for doing this in Drupal

From a technical point of view you need the following to support the above strategies:

  • Embed the GA tracking script
  • Track client side events
  • Track server side events
  • Manage the lifetime of the UserId and integrate it with your application

Embed the GA tracking script

The first thing you need to do is add the GA tracking script to your application. You can do so with the Google Analytics Drupal module or embed the script programmatically. If you use the module, there are some tweaks you can make and some extended tracking you can set up.
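
If you go the programmatic route in Drupal 8, a minimal sketch could look like the following. The mymodule name is hypothetical and UA-XXXXX-Y is a placeholder for your tracking ID; hook_page_attachments() itself is the standard core hook for attaching markup to the page head:

/**
 * Implements hook_page_attachments().
 *
 * Attaches the standard analytics.js bootstrap snippet to every page.
 */
function mymodule_page_attachments(array &$attachments) {
  $script = "(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){"
    . "(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),"
    . "m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)"
    . "})(window,document,'script','https://www.google-analytics.com/analytics.js','ga');"
    . "ga('create', 'UA-XXXXX-Y', 'auto');ga('send', 'pageview');";
  // html_head attachments take a render array plus a unique key.
  $attachments['#attached']['html_head'][] = [
    ['#tag' => 'script', '#value' => $script],
    'mymodule_ga_snippet',
  ];
}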

Track client side events

There's the Google Analytics Event Tracking Drupal module that lets you define jQuery selectors in a server side hook that will trigger GA events on the client side. This might be a good starting point, but as soon as you want to track interactions that cannot be declared through a selector (for example, scrolled to the end of the page, or hovered over an area) or a combination of interactions, you should go manual. Don't worry, this is super easy.

To trigger an event client side just use this sample code:

// Sample values shown: build your own category/action/label taxonomy.
ga('send', {
  'hitType': 'event',
  'eventCategory': 'checkout',
  'eventAction': 'personal_data_submit',
  'eventLabel': 'My product',
  'eventValue': 0
});

This is just a sample: you need to decide when to fire the event, and consider situations such as the same event being fired twice for the same user. You can do whatever you want here with some code.

Track server side events

The Google Analytics Events Drupal module exposes a small API that lets you trigger GA events during the execution of server side code. These events are sent to the client via JavaScript on both page loads and AJAX calls.

Use the following code to trigger an event:

// Queue a GA event from server side code; it is flushed to the client
// with the next page load or AJAX response.
\Drupal\google_analytics_events\EventService::getInstance()->queueEvent(
  (new \Drupal\google_analytics_events\Event())
    ->setHitType('event')
    ->setEventCategory('checkout')
    ->setEventAction('personal_data_submit')
    ->setEventLabel($curso->titulo->value())
    ->setEventValue(0)
);

Manage the lifetime of the UserId and integrate it with your application

This one is a little tougher to implement, as you need to make some operational decisions that depend on the nature of your web application.

For example, if you have a 100% anonymous checkout e-commerce where users never log in (those exist and work quite well if properly crafted), you can manage the lifetime of the UserId using client side cookies, and then store this data server side to match a UserId with whatever you use to store conversions (i.e. a sale).
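
As a minimal sketch of that idea (the module name, cookie name and storage details are hypothetical), you could mint the UserId server side, persist it in a long-lived cookie, and hand it to the tracker through drupalSettings so the client can call ga('set', 'userId', ...). Storing the same value next to each conversion record is what lets you later match a sale to its full GA history:

/**
 * Implements hook_page_attachments().
 *
 * Sketch only: assign a stable UserId to every visitor and expose it
 * to the GA tracker. Remember to also persist this value with your
 * conversion records (e.g. on the sale entity).
 */
function mymodule_page_attachments(array &$attachments) {
  $user_id = isset($_COOKIE['mymodule_ga_userid'])
    ? $_COOKIE['mymodule_ga_userid']
    : \Drupal\Component\Utility\Crypt::randomBytesBase64(16);
  // Renew the cookie for one year on every visit.
  setcookie('mymodule_ga_userid', $user_id, REQUEST_TIME + 31536000, '/');
  $attachments['#attached']['drupalSettings']['mymodule']['gaUserId'] = $user_id;
}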

Here are Google's guidelines to implement this feature:

https://developers.google.com/analytics/devguides/collection/analyticsjs...

May 08 2016

[Check out our free Installing Drupal 8 on PHP7 tutorial]

Drupal 8 was released with full PHP7 support:

[META] Support PHP 7

But the fact that the Drupal 8 codebase is able to run on PHP7 does not mean that you can run PHP7 Drupal 8 deployments. Why? Because it takes more than just "Drupal" to run a decently performing Drupal 8 based application.

There are at least 3 additional components that need full PHP7 support:

  • In-memory caching (Wincache/APCu)
  • Database PDO driver
  • Key/Value store or NoSQL alternative for high performance caching

Depending on your software stack, these might or might not be ready yet.

Our stack of choice is Wincache + SQL Server + Couchbase. The reasons are varied, but these (at least the last two of them) are enterprise level products that outperform their competitors in many ways, will scale without issues and keep innovating at a faster pace than their competitors do.

For example, SQL Server 2016 with the improved in-memory tables feature is able to speed up database operations (including transactional ones, joins and other complex situations) up to 15x without changing a single line of code (you will need some extra RAM, but with today's pricing that is not an issue).

Couchbase is 4.5x faster than MongoDB.

So how ready is each one of those components for PHP7 and Drupal 8?

  • In-memory caching - Wincache: I have been working closely with the maintainer on the last releases, and both 2.0.0.6 (for PHP 7) and 1.3.7.10 (for PHP < 7) are production ready. Special thanks to Drophone for his continued - and personal, he is not getting paid to work on this - efforts on this PHP extension. We uncovered and (he) fixed some tough bugs that were affecting Drupal 8. The Drupal Wincache module has also been brought up to date, updated to Drupal 8 and made PHP 7 ready.
  • Database PDO driver - MSSQL: The MS team is working intensively on making PHP 7 on SQL Server a reality, and on providing Linux support. You can follow their work here. With the latest release (4.0.4), installing Drupal 8 using PHP 7 is now a reality. The Drupal SQL Server driver has been tested and updated to work with PHP 7 and the latest PDO driver from Microsoft. Bugs in PHP itself were fixed in order to make the MSSQL PDO driver work, and the minimum required PHP7 version is 7.0.6.
  • Key/Value - NoSQL - Couchbase: Early work has been posted by the Couchbase team, but there are no official PHP binaries yet. We recommend holding off on trying PHP7 with Couchbase until a more stable release is made and an official statement is issued regarding PHP7 support.

Overall, you can start deploying Drupal 8 on PHP7 for applications that do not need to rely on Couchbase. I am wondering if that will ever be needed, considering that with the new in-memory enhancements of SQL Server 2016, cache backends can be tuned to work up to 15x faster with just some more RAM.

I would like to give special thanks to Drophone and the MSSQL Server team for making this a reality.

What to expect from here on?

Now that there is full SQL Server and Wincache support for Drupal 8 since - more or less - day 1, our customers are asking for Azure Apps support. The latest Drupal SQL Server driver is 100% compatible with Azure SQL V12. In the upcoming months, I will be releasing specific guides for deploying Drupal 8 (and probably 7) on Azure Apps.

Why use Azure Apps? There are lots of reasons; these are some:

  • Host anything from brochureware (from €5/month) to sites with millions of hits with predictable performance (yes, this is not crap shared cloud hosting)
  • Infrastructure as code.
  • Automatic deployment slots per branch.

There's much, much more to Azure Apps, but you can see that this is just awesome devops from a reliable provider.

May 01 2016

This post is about how we implemented a simple (yet effective) BigPipe-like rendering strategy for Drupal 7.

Why is BigPipe so important?

BigPipe is a render strategy that assumes that not all the parts of your page are equally important, and that a loading delay on some of them is acceptable as long as the "core" of the page is delivered ASAP. Furthermore, instead of delivering those delayed pieces with subsequent requests (AJAX), it optimizes network load by using a streamed HTTP response so that you get all those delayed pieces in a single HTTP request/response.

BigPipe does not reduce server load, but it dramatically improves your website load time if properly integrated with your application.

Sounds great, and it will work very well in some scenarios.

Take for example this landing page (apologies for the poor UX, that's a long story...):

This page has about 20 views and blocks. All of those views and blocks are cached, but can you imagine what a cold cache render of that page looks like? A nightmare....

What if we decided that only 4 of those views were critical to the page, and that the rest of the content could be streamed to the user after the page has loaded? It will roughly load 70% faster.

UPDATE: Adding support for content streaming has opened the door to awesome successful business strategies - without penalizing initial page load times - such as geolocating (or even customizing per user) blocks, advertising and other content. All of that while keeping page cache turned on and being able to handle similar amounts of traffic on the same hardware, and without resorting to custom AJAX loading (and coding).

We decided to take a shot and try to implement a big-pipe like render strategy for Drupal 7. We are NOT trying to properly do BigPipe, just something EASY and CHEAP to implement and with little disruption of current core - that's why this is going to be dubbed Cheap Pipe instead of Big Pipe.

Furthermore, it was a requirement that this could be leveraged on any current Drupal application without modifying any existing code. It should be as easy as going to a block or view's settings and telling the system to stream its contents. It should also provide programmatic means of defining content callbacks (placeholders) that should be streamed after the page is served.

We made it, and it worked quite well!

Now every block has a "Cheap Pipe" rendering strategy option:

Where:

  • None: Block is rendered as usual.
  • Only Get: Cheap pipe is used only on GET requests
  • Always: Cheap pipe is used on GET/POST and other HTTP methods.

Cheap pipe is never used on AJAX requests no matter what you choose here.

Why these options? Because some blocks might contain logic that could misbehave depending on the circumstances, and we don't want to break anything. So you choose which blocks should be cheap piped, how, and in what order.

What happens after you tell a block (the example is for blocks but there is an API to leverage this on any rendered thing) to be cheap-piped?

  • The block->view() callback is never triggered and the block is not rendered but replaced with a placeholder.
  • The page is served (flushed to the user) and becomes fully functional by artificially triggering the $(document).ready() event. The </body></html> tags are removed before serving so that the rest of the streamed content is properly formed.
  • After the page has been served to the user, all deferred content is streamed by flushing the PHP buffer as content gets created and rendered.
  • This content is sent to the user in such a way that it leverages the Drupal AJAX framework (although this is not AJAX) so that every new piece of content that reaches the page gets properly loaded (Drupal behaviors attached, etc...)
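
To make the streaming phase more concrete, here is a conceptual sketch; the function and property names are made up and this is not the actual module code. It assumes the page has already been flushed to the client with a placeholder <div> for each deferred element:

// Conceptual sketch of the Cheap Pipe streaming phase for Drupal 7.
function cheap_pipe_stream(array $deferred) {
  // Honor the weight property: lower weights are streamed first.
  usort($deferred, function ($a, $b) { return $a['weight'] - $b['weight']; });
  foreach ($deferred as $item) {
    // Render the deferred content only now, after the page was served.
    $start = microtime(TRUE);
    $markup = call_user_func($item['render_callback']);
    $elapsed = microtime(TRUE) - $start;
    // Ship a small script that injects the markup into its placeholder,
    // re-attaches Drupal behaviors and embeds the render time stats.
    printf(
      '<script>jQuery("#%s").html(%s);Drupal.attachBehaviors();/* rendered in %.3fs */</script>',
      $item['placeholder_id'],
      json_encode($markup),
      $elapsed
    );
    // Flush PHP's output buffers so the chunk reaches the browser now.
    flush();
  }
}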

Take a look at this end-of-page sample:

The output markup even gives you some stats to see what time it took to render each Cheap Piped piece of content.

Because cheap piped elements are generated sequentially, if an element is slow, it will delay the rendering of the rest of the elements. That's why we implemented a "weight" property so that you can choose in what order elements are cheap-piped.

What kind of problems did we find?

  • Deferred content that called drupal_set_message() was, obviously, not working because the messages had already been processed and rendered. Solved by converting the messages rendering into Cheap Pipe and making it the last one to be processed (thanks to the weight property).
  • Deferred content that relied on drupal_goto() (such as forms in blocks) would not work because the page had already been served to the user. drupal_goto() had to be modified so that if cheap pipe rendering had already started, the redirection was done client side with JavaScript.
  • When fatals are thrown after the main content has been served, your page gets stuck in a weird visual state. There is nothing we can do about this, because after a fatal you lose control of PHP output.
  • Server load skyrocketed. What used to be anonymous pages served from cache now requires a full Drupal bootstrap to serve out the streamed content.
Apr 27 2016

As of this writing, the only readily available site building module is the Plupload File Widget.

This module depends on the Plupload form element provided by the Plupload integration module, which is a more developer oriented module.

With the Plupload widget/integration you can:

  • Have a drop-in replacement for core's file widget that supports big uploads (HTML5 based)
  • Have a powerful FAPI element to upload big files

What kind of things can you do with this? See this video from our "Powerpoint to Files" post, where a user uploads a several-hundred-megabyte PowerPoint file that gets converted to high quality images on the server side:

Once you have deployed the Plupload integration module you can use the FAPI widget to add uploading capabilities with code like this:

  /**
   * {@inheritdoc}
   */
  public function form(array $form, FormStateInterface $form_state) {
    $form = parent::form($form, $form_state);
    // Wrap the form array in our custom FAPI helper.
    $form = Form::Instance('', $form);

    $form->AddElement('slides', [
      '#type' => 'plupload',
      '#title' => t('Upload files'),
      '#autoupload' => TRUE,
      '#autosubmit' => TRUE,
      '#description' => t('Upload here the slide files that will be attached to the presentation.'),
      '#submit_element' => '#slides-upload',
      '#upload_validators' => [
        'file_validate_extensions' => ['zip ppt pptx'],
      ],
      '#plupload_settings' => [
        'runtimes' => 'html5,html4',
        'unique_names' => TRUE,
        'max_file_size' => static::MAX_FILESIZE_UPLOAD,
        'chunk_size' => '1mb',
      ],
    ]);

    $form = $form->Render();

    return $form;
  }

To use the site building capabilities of the Plupload File widget you simply pick the Plupload widget:

After that you will be presented with a Plupload widget to upload very big files comfortably:

What is important about this module is the fact that it is an extremely simple drop-in replacement for the core File widget. Indeed, most of the logic is inherited from the core File widget, making it future proof with very little maintenance.

Apr 14 2016

Drupal 8 performance: the Supercache module

The Supercache module is the result of an attempt to improve Drupal 8 efficiency when dealing with cache tag management and other design issues with several caching components that make it a pain to deal with Drupal 8 based applications that change a lot. 

An out of the box Drupal 8 install will issue about 2,100 database statements for a simple task such as logging in and creating two articles.

With a little setup and the Supercache module I was able to bring that down to 240 statements.

Here is video proof that these numbers are real. The statement count is being measured in real time thanks to the awesome SQL Server Profiler tool.

[embedded content]

The impact of the Supercache module - that was for a while a core patch - was benchmarked and proved to reduce wall times by about 25% and database queries by as much as 50% after things change (doing a cache write).

How does the Supercache module do this?

  • Drupal's cache system got heavier in Drupal 8 in exchange for making it more precise thanks to cache tags. But there are situations where you simply do not need all that bloat. The Supercache module introduces a new and sleeker cache layer (of course, without cache tags). A simple cache tag stored in Drupal's cache system takes up 196 bytes; the new caching system only uses 12 bytes. This does not seem like a big deal - after all, it's just a few bytes of difference - but it translates into being able to store 65,000 cache tags in 1MB of APCu/Wincache instead of just 5,000. And that is not the only advantage of this cache layer:
    • Reduced storage size, up to 12x smaller for small cache items.
    • Leverage native functionalities provided by most storage backends such as touch, counter, increment, etc.
    • Faster processing due to the lack of cache tags and other extras.
    • Scalar types are stored natively, so you can batch operate on the cache items themselves if the storage backend allows you to do so (database, Couchbase or MongoDB)
  • Drupal 8 introduced the very useful ChainedFastBackend (which you can easily use in Drupal 7). But the current implementation of that backend has some design flaws, such as invalidating the whole fast backend when doing a cache write and not using cache tags in the fast backend. Supercache replaces the ChainedFastBackend implementation with one that solves those two issues, improving hit rates in the fast backend on systems with lots of writes.
  • Replaces the default cache tag invalidator services (that works directly on the database) for one that leverages a similar concept to the ChainedFastBackend.
  • Introduces the ability for the key value storage to work similarly to the ChainedFastBackend.

To properly leverage what the Supercache module has to offer, you should set up support for a centralized caching backend such as Couchbase.
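
As a rough sketch of how that wiring looks in settings.php (the Couchbase backend service name is illustrative and depends on the module providing it), core lets you choose the default cache backend and pin individual bins:

// Use core's chained fast backend by default: a local in-memory backend
// (APCu/Wincache) chained with a shared consistent backend.
$settings['cache']['default'] = 'cache.backend.chainedfast';
// Pin specific bins to the shared backend when they should not be
// duplicated in local memory (service name is illustrative).
$settings['cache']['bins']['render'] = 'cache.backend.couchbase';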
