May 29 2020

Happy to announce we're attempting to kick off a new community initiative primarily focused on fixing and closing bugs.

We're hoping to

  • reduce the total number of open core bugs
  • reduce the average age of open bugs
  • increase the number of closed bugs
  • mentor those looking to get more involved in contributing to core
  • give community members from countries in the Eastern Hemisphere the chance to participate in a core initiative in their local time zone.

We need you!

Resolving core bugs requires a wide range of skills - we're looking for volunteers to join us who can help with one or more of the following tasks:

  • Triaging and classifying bugs
  • Writing good bug reports, with steps to reproduce
  • Writing patches to resolve the bug
  • Writing automated test cases to confirm the bug exists, and that it is resolved
  • Reviewing patches through the lens of each of the core gates, e.g. technical, accessibility, performance, backwards-compatibility, documentation, internationalization and usability.
  • Communicating with sub-system experts in order to gain sign-off on non-trivial changes
  • Writing documentation as required
  • Manual testing of the patch
  • Communicating any changes via blog posts, change records, release note snippets etc.
  • Coordination and management

If you are looking to get involved - come join us on Slack in the #bugsmash channel.

We will meet asynchronously in the #bugsmash Slack channel fortnightly on a Tuesday at 0400 UTC, which maps to the following times:

  • 2pm AEST
  • 12 noon AWST, CST
  • 4pm NZST
  • 9.30am IST
  • 6.00am CEST
  • 5.00am BST
  • 8.00pm PST
  • 11.00pm EST
Nov 28 2016

At Drupal South 2016, I attempted to summarize some of my pain points with Drupal core.

The session 'At 16 years of age, does Drupal have an identity problem' is the result of this.

You can watch the recording, or download the slides below.

Recording

[embedded content]

Slides

View the slides.

Drupal South Drupal Core Drupal 8 Install profiles
Sep 09 2016

There has been some movement of late around adding some default content to the standard profile.

This was originally reignited by Roy Scholten in his getting something in the box post.

As author and co-maintainer of the default content module for Drupal 8, I wanted to share my thoughts on the potential of adding it to Drupal core.

The Snowman initiative is not a new one. And no-one would dispute that the 'You haven't got any content' message you see as soon as you install Drupal is a fairly ordinary first experience.

So it makes sense to try and improve that experience by adding some sample content.

We've been talking about it for ten years (yes that is a 5 digit issue number).

However it's not as simple as 'adding default content module to core as an experimental module' and away we go.

Default content is great insofar as it doesn't rely on any modules other than those provided by core.

But it's a two-edged sword.

It works by looking for default content in the form of json files inside a module's folder structure. These files live in content/{entity type} inside the module.

This content is stored in hal+json format as output by core's Rest and HAL modules.

The content is imported during hook_modules_installed which fires when a module is installed, but before its configuration is imported.

Issue 1

This is one of the first hard problems. Adding default content to the standard profile would mean that none of the profile's configuration would have been imported yet, which would result in exceptions or partial imports. You cannot import content into the page content type if it does not exist.

Profiles that make use of default content do things a little differently.

They put their configuration in a module which the profile depends on.

They put their default content in another module, which depends on the configuration module. For an example, see how aGov does it.

That way the install order happens like this:

  1. The configuration module is enabled and all the content types and fields etc are created
  2. The default content module is enabled, creating the content as required. An option to disable this can be added to the installer screen steps, since not everyone wants to start with default content.
  3. The install profile is enabled. It is always enabled last - this is non-negotiable.

This isn't how the current standard profile is structured - so there is considerable work to be done to reorganize how the configuration is structured.

Issue 2

The second issue here is that the default content module relies on the Rest, HAL and serialization modules in core. The HAL module provides the hal+json format. The Rest module provides the link managers that allow reverse resolving of embedded links to entity types and fields. The serialization module provides the serializer service which handles normalizing entities to arrays and then serializing them to a given format - in this case hal+json.
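As a rough illustration of what those three modules give you together, exporting a single node might look like this sketch (the node ID is illustrative):

// Assumes the serialization, rest and hal modules are enabled.
$serializer = \Drupal::service('serializer');
$node = \Drupal\node\Entity\Node::load(1);
// Normalizes the entity to an array and encodes it as hal+json, with
// entity references embedded as _links/_embedded entries.
$hal_json = $serializer->serialize($node, 'hal_json');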

The issue is that you shouldn't need to enable these three modules just to import default content. If your site isn't providing an API in the form of Rest endpoints, you shouldn't need to have them available just to install default content. Those who build Drupal sites for a living normally steer clear of enabling modules they don't need. Comment module in the standard profile is a good example of something that core enables by default but many sites don't use. It's the same issue here.

To resolve this, we need to do a lot of shuffling.

There is a valid argument that the serialization module is critical to core's functionality and hence should be dissolved into the \Drupal\Core namespace instead of \Drupal\serialization. Being able to turn our data objects such as entities, field item lists and field items into arrays and back is a genuine core piece of Drupal functionality. We should support this in the low-level plumbing, particularly for things like the MapItem typed data type. Using PHP's built-in serialize/unserialize to convert our objects is a bad idea - just ask the PHP maintainers - that functionality was added to support user sessions and was never meant to be used the way it is.

So bringing serialization services into core resolves one of the three dependencies. The issue for that is here.

The next dependency is the HAL module. The key piece of functionality it provides is the embedding of entity references in the body of the serialized entities during normalizing. For example, if you normalize a node, you also get a reference to the author embedded - with the link done by UUID so it can be de-referenced. Large parts of the default content module are concerned with sorting out this dependency tree so that content is imported in the right order, with the dependencies intact. This is probably the easiest bit to swap out - support for it has existed in the serialization module since Drupal 8.1.x. The default content module hasn't been updated to use it yet, because we don't want to break people's existing exports.

This leaves the Rest module as the final dependency. The main pieces it provides are the link manager services that allow generating canonical links to referenced/embedded entities and then de-referencing these into entity type, field name and bundle information during denormalizing. E.g. When you normalize a node, the author reference is stored using a URI that contains information about the entity type, the bundle and the field name. During denormalizing (which happens on content import) this information can be extrapolated so things end up in the right place. Fortunately we have an issue to move these out of the Rest module.

Issue 3

The next issue is perhaps the biggest one. There are shortcomings in core's normalizers. The main ones are around fields that resemble entity references but really aren't and fields with calculated values. And then there's normalizing files and images. The issues here are as follows:

Issue 4

The final issue is probably the most contentious. Just because we could add default content to the standard profile - does that mean we should? The standard profile isn't a product. A product is created to solve a problem or fill a need. The standard profile does not do that. It is a random assortment of features that once resembled a blogging site but no longer does.

We need a real product. Or several products. This was the basis of the platform initiative and also the Snowman group. Personally I feel that is a better use of our limited time and resources. Much of the platform initiative was put on hold while we focussed on the unofficial framework initiative. This was a five-year effort focussed on untangling the spaghetti-like nature of core. We got a long way towards that goal.

Perhaps now it's time to revisit the platform initiative.

I have some thoughts around how this might look - but will cover those in a future post.

Drupal 8 Platform initiative Default Content Framework Initiative Normalizers REST
Aug 23 2016

Now that we've got the experience of a number of production D8 sites under our belt, we've taken the time to consolidate our CMI workflow into some useful drush commands.

And naturally we've open sourced them.

Read on to find out more about our drush CMI tools.

Use case

Say you're working on a local development environment for a project where the client is adding and editing configuration. For example, the project might be using contact forms or YAML forms for user interaction. Each of these, and their associated fields, form and view displays, is a config object. As the client is editing these, you don't want that configuration tracked in your source control.

So when you start working on a new feature, the first thing you do is sync down a QA or production database and then run your config import so that your local environment is in a clean state.

You work on some features and the time has come to export those changes to your config export folder, in order to check the new work into git ready for deployment.

Enter drush cexy

drush cexy

It's like drush cex but with some powersauce.

So the normal drush cex command comes with a --skip-modules option that prevents configuration from, say, the devel module from being exported. But let's go back to our original use case.

We want to export all configuration, but we want to exclude certain patterns.

This is where the --ignore-list option of drush cexy comes in.

In our project we have a ./drush folder, so we stick a file in there called config-ignore.yml with contents as follows.

ignore:
  - field.field.contact_message.*
  - field.storage.contact_message.*
  - contact.form.*
  - core.entity_form_display.contact_message*
  - core.entity_form_display.contact_form*
  - core.entity_view_display.contact_message*
  - core.entity_view_display.contact_form*
  - system.site
  - workbench_email.workbench_email_template.*

You'll note there are some wildcards there. We're ignoring all contact message fields and forms as well as any form or view display configuration. Additionally we're ignoring Workbench Email templates and the system site settings.

So now we run drush cexy like so

drush cexy --destination=/path/to/config-export --ignore-list=/path/to/drush/config-ignore.yml

So what this does is export the active configuration, and then apply the ignore list to remove unwanted configuration.
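Under the hood the ignore list is just pattern matching against config names; a simplified, hypothetical illustration of the idea (paths and patterns are examples, not the actual drush cexy code):

// Apply an ignore list to a folder of exported config.
$ignore = ['field.field.contact_message.*', 'system.site'];
foreach (glob('/path/to/config-export/*.yml') as $file) {
  $name = basename($file, '.yml');
  foreach ($ignore as $pattern) {
    if (fnmatch($pattern, $name)) {
      // Remove configuration matching an ignored pattern after export.
      unlink($file);
      break;
    }
  }
}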

So now when you run git status you should only see changes you want to commit.

Single install configuration

So let's assume you're working on a feature branch that requires installation of the Google Analytics module.

You download and enable the module

drush dl google_analytics
drush en -y google_analytics

And then you export your configuration with drush cexy using your build tool of choice (make in this case - cause remembering all the flags is bound to go wrong)

make export

After running that you find you have a new file in your exported configuration folder:

google_analytics.settings.yml

Now you know that you want this configuration to be editable by the client, as they'll have different GA urchin codes on different environments.

So you don't want to check this into git. But you do need it to be deployed at least once. And that's where drush cexy's sibling, drush cimy, comes in.

drush cimy

drush cimy is the import equivalent of drush cexy. We've found it significantly increases our CMI workflow productivity.

So returning to our single install of the google analytics settings. You'd just exported your config using drush cexy and found yourself with a new google_analytics.settings.yml file that you needed to deploy, but only once.

drush cimy combines the following features

  • The power of drush cim --partial
  • The ability to perform config deletes
  • The ability to perform one-time installs

The format is as follows

drush cimy --source=/path/to/config-export --install=/path/to/config-install --delete-list=/path/to/config-delete.yml

So we move the google_analytics.settings.yml out of our config-export folder and into our config-install folder. And then we add it to our drush/config-ignore.yml file, so it doesn't get exported in the future.

Partial imports

As alluded to above, drush cimy is similar to drush cim --partial in that it does partial imports.

The way drush cim --partial works is equivalent to the following

  • firstly it creates a temporary folder
  • then it exports all active configuration
  • then it copies your nominated config-export folder (the one under source control) over the top (in the temporary folder)
  • then it imports from the temporary folder

So what gets imported is all active config plus any new config from the config export, with changes in the exported config taking precedence over the active config.

The main pain point with using --partial is you don't get config deletes.

e.g. if you delete a config file from git (so it is no longer in your config-export folder), it still remains after import because it is still present in the active configuration.

So why is this a problem? Let's consider a scenario where someone enabled the dblog module on QA and saved its settings, so that dblog.settings.yml is in the active configuration.

Your core.extension.yml that is tracked in git does not contain dblog module. But dblog.settings.yml depends on dblog module.

So you work away on your feature and go to deploy to QA. But the import step of your deployment automation fails, because --partial places a copy of dblog.settings.yml in the temporary folder and tries to import it, but because dblog module is going to be disabled by the import, you have an unmet config dependency.

This is where the --delete-list flag kicks in. Let's look at a sample delete list file

delete:
  - dblog.settings

So this is where drush cimy varies from drush cim; its (equivalent) logic is as follows:

  • As with drush cim --partial, first it creates a temporary folder
  • Move one-time install configuration into the folder first - so that active configuration takes precedence over initial state
  • Export all active configuration
  • Delete any configuration found in active configuration that is listed in the delete list
  • Copy the nominated config-export (tracked in source control) over the top, taking final precedence
  • Import the result
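
Expressed as a rough sketch (paths are illustrative and the real drush_cmi_tools implementation differs), those steps boil down to something like this:

// Hypothetical outline of the staging logic behind drush cimy.
$staging = sys_get_temp_dir() . '/config-staging';
mkdir($staging, 0777, TRUE);
// 1. Seed with one-time install config so everything else wins over it.
foreach (glob('/path/to/config-install/*.yml') as $file) {
  copy($file, $staging . '/' . basename($file));
}
// 2. Export all active configuration over the top (e.g. drush cex).
// 3. Remove anything on the delete list.
foreach (['dblog.settings'] as $name) {
  if (file_exists("$staging/$name.yml")) {
    unlink("$staging/$name.yml");
  }
}
// 4. Copy the tracked config-export folder over the top - final precedence.
foreach (glob('/path/to/config-export/*.yml') as $file) {
  copy($file, $staging . '/' . basename($file));
}
// 5. Import the merged result (e.g. drush cim --source=$staging).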

So this means you get the good bits of partial imports, without the dependency dramas that can result. It also allows you to perform valid deletes, something that isn't possible with drush cim --partial - for example you might want to delete a field from active configuration. Previously you'd have to write an update hook to do that before you performed your config import. Now you just list it in the config-delete.yml file.

Installation

cd ~/.drush
wget https://raw.githubusercontent.com/previousnext/drush_cmi_tools/8.x-1.x/drush_cmi_tools.drush.inc
drush cc drush
drush CMI Drupal 8
Aug 22 2016

As you'd be aware by now - Drupal 8 features lots of refactoring of procedural code to object-oriented code.

One such refactoring was the way forms are built, validated and executed.

One cool side-effect of this is that you can now build and test a form with a single class.

Yep that's right, the form and the test are one and the same - read on to find out more.

Background

Firstly kudos here to Tim Plunkett who pointed this out to me, and to all of those who championed much of the refactoring that made this even possible.

Testing forms and elements in Drupal 7

In Drupal 7 to test a form, or an element you need the following:

  • A test module with:
    • A hook_menu entry
    • A form callback
    • (optional) A validate callback
    • (optional) A submit callback
  • A web-test (a test that extends from DrupalWebTestCase)

Drupal 8 forms

As you're probably aware from all the example code and posts out there, forms in Drupal 8 are objects that implement Drupal\Core\Form\FormInterface.

Luckily, you can write a test that both extends from KernelTestBase (Drupal\KernelTests\KernelTestBase) and implements FormInterface. This means you don't need all of the additional routing plumbing you needed in Drupal 7.

Let's look at an example in Drupal - PathElementFormTest (\Drupal\KernelTests\Core\Element\PathElementFormTest). This test is to test core's PathElement (\Drupal\Core\Render\Element\PathElement) - a plugin that provides a form element where users can enter a path that can be optionally validated and stored as either a \Drupal\Core\Url value object or an array containing a route name and route parameters pair. So in terms of testing, the bulk of the logic in the element plugin is contained in the #element_validate callback - PathElement::validateMatchedPath.

There are several different combinations of configuration for the path element as follows:

  • Required and validated with no conversion
  • Required and non validated with no conversion
  • Optional and validated with no conversion
  • Optional, validated and converted into a route name/parameter pair
  • Required, validated and converted into a route name/parameter pair
  • Required, validated and converted into a Url object

So we need to set up several instances of the element on a test form.

So because our test extends from KernelTestBase, but also implements FormInterface, we just build a normal form array with all of these configurations in our implementation of FormInterface's ::buildForm method - see \Drupal\KernelTests\Core\Element\PathElementFormTest::buildForm to see how this is done. We're not interested in doing any additional validation or submission, so our implementation of FormInterface's ::submitForm and ::validateForm can be blank.
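
A stripped-down skeleton of that pattern (the class name, namespace and single element are illustrative - core's real test declares many more configurations) looks like this:

namespace Drupal\Tests\mymodule\Kernel;

use Drupal\Core\Form\FormInterface;
use Drupal\Core\Form\FormStateInterface;
use Drupal\KernelTests\KernelTestBase;

/**
 * Kernel test that is also the form under test.
 */
class ExamplePathElementFormTest extends KernelTestBase implements FormInterface {

  /**
   * {@inheritdoc}
   */
  public function getFormId() {
    return 'test_path_element';
  }

  /**
   * {@inheritdoc}
   */
  public function buildForm(array $form, FormStateInterface $form_state) {
    // One element per configuration we want to exercise.
    $form['required_validate'] = [
      '#type' => 'path',
      '#required' => TRUE,
      '#title' => 'required_validate',
    ];
    return $form;
  }

  /**
   * {@inheritdoc}
   */
  public function validateForm(array &$form, FormStateInterface $form_state) {
    // Nothing to do - the element's own #element_validate is what we test.
  }

  /**
   * {@inheritdoc}
   */
  public function submitForm(array &$form, FormStateInterface $form_state) {
    // Nothing to do.
  }

}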

Testing the element behaviour

So to test the element validate works as expected for each of the fields, we need to trigger submission of the form. Now in a web-test, we'd use the internal test browser to visit the form on a route and then use a method like BrowserTestBase::submitForm to actually submit the form. But as we're using a kernel test here, there is no internal browser - so instead we can submit directly through the form_builder service (\Drupal\Core\Form\FormBuilderInterface). The code in PathElementFormTest looks something like this:

$form_state = (new FormState())
  ->setValues([
    'required_validate' => 'user/' . $this->testUser->id(),
    'required_non_validate' => 'magic-ponies',
    'required_validate_route' => 'user/' . $this->testUser->id(),
    'required_validate_url' => 'user/' . $this->testUser->id(),
  ]);
$form_builder = $this->container->get('form_builder');
$form_builder->submitForm($this, $form_state);

So firstly we're building a new FormState object and setting the submitted values on it - this is just a key-value pair of form values. Then we're getting the form builder service from the container and submitting the form.

From here, if there were any errors, they'll be present on the form state object. So we can do things like check for expected errors, or check for expected values.

For example, to check that there were no errors.

$this->assertEquals(count($form_state->getErrors()), 0);

Or to check that the conversion occurred (i.e. the input path was upcast to a route name/parameter pair or Url object).

$this->assertEquals($form_state->getValue('required_validate_route'), array(
  'route_name' => 'entity.user.canonical',
  'route_parameters' => array(
    'user' => $this->testUser->id(),
  ),
));

Or to check for a particular error.

$this->assertEquals($errors, array('required_validate' => t('@name field is required.', array('@name' => 'required_validate'))));

Summing up

So why would you want to use this approach?

Well for one, the test is damn fast. Kernel tests don't do a full site install, and because there is no HTTP to fetch and submit the form, you get fast feedback. And when you get fast feedback, you're more likely to practice good test driven development.

So if you're building an element plugin for a contrib or client project, I encourage you to start with a test, or rather a form, or rather both. Specify the various configurations of your element and test the expected behaviour.

I'm sure you agree, this is another clear case of Drupal 8 for the win.

Drupal 8 Drupal Development Testing
Jul 11 2016

Drupal's Batch API is great - it allows you to easily perform long-running processes with feedback to the user.

But during Drupal 8's development process it was one of the remaining systems that didn't get the full object-oriented, service-based architecture.

Much of the batch API is largely unchanged from Drupal 7.

But that doesn't mean you can't write unit-testable callbacks.

Let's get started.

Our goal

Our goal here is to end up with a method that we can test with PHPUnit, without an installed Drupal and without the service container. So we're talking about a pure unit test. Not an integration or functional test. Or in Drupal 8 terms, a class that extends from UnitTestCase, not KernelTestBase or BrowserTestBase.

Our starting point

So let's start with a hypothetical example of the Drupal 7 norm for batch callbacks: a global function. This would typically live in a .module file or perhaps a .inc file. Loading this file would be performed by the Kernel during bootstrap, or by the batch processing if we used the file key. We're going to use a hypothetical example of a batch callback that examines a product and, if it is on sale but the sale date has passed, removes it from a fictional search index. The key point here is that we need some services from the container and we have some logic that is worth testing. Other than that, it's purely fictional.

/**
 * Batch callback.
 */
function mymodule_batch_callback($product_id, &$context) {
  $repository = \Drupal::service('mymodule.product.repository');
  $promotion_search_index = \Drupal::service('mymodule.promotion_search_index');
  /** @var \Drupal\mymodule\ProductInterface $product */
  $product = $repository->find($product_id);
  if ($product->isOnSale()) {
    $end = $product->getSaleEndDateTime();
    $now = new \DateTime();
    if ($end->getTimestamp() < $now->getTimestamp()) {
      // Sale is finished.
      $promotion_search_index->delete($product);
    }
  }
}

Our first step towards refactoring is to move this into a static method on an object. That way we can use the auto-loader to take care of loading files. PHPUnit can't find code hidden in .module files without the Kernel to bootstrap loading of those files. Now that it's an object, the autoloader will load it for PHPUnit.

I like to call these objects batch workers, cause they do the work of the callback. So let's name our class ProductSaleExpiryBatchWorker. Our code now looks something like this and lives in mymodule/src/Batch/ProductSaleExpiryBatchWorker.php.

namespace Drupal\mymodule\Batch;
/**
 * Batch Worker to handle processing expired sale items.
 */
class ProductSaleExpiryBatchWorker {

  /**
   * Process the expiration for a given product.
   *
   * @param int $product_id
   *   Product ID to test.
   * @param array $context
   *   Batch context.
   */
  public static function process($product_id, &$context) {
    $repository = \Drupal::service('mymodule.product.repository');
    $promotion_search_index = \Drupal::service('mymodule.promotion_search_index');
    /** @var \Drupal\mymodule\ProductInterface $product */
    $product = $repository->find($product_id);
    if ($product->isOnSale()) {
      $end = $product->getSaleEndDateTime();
      $now = new \DateTime();
      if ($end->getTimestamp() < $now->getTimestamp()) {
        // Sale is finished.
        $promotion_search_index->delete($product);
      }
    }
  }

}

As you can see, all we've really done is move the code into the static process method of an autoloaded class. We can use a static method because the Batch API uses call_user_func_array() to execute the callback, which works with static methods too. But we still can't unit test this code, because it calls \Drupal::service(), which needs a bootstrapped container.
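
For example, registering the static method as a batch operation might look like this (the surrounding submit handler and the $product_ids variable are assumed):

$batch = [
  'title' => t('Checking sale expiry'),
  'operations' => [],
];
foreach ($product_ids as $product_id) {
  // A [class, method] callable works fine with call_user_func_array().
  $batch['operations'][] = [
    ['\Drupal\mymodule\Batch\ProductSaleExpiryBatchWorker', 'process'],
    [$product_id],
  ];
}
batch_set($batch);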

Unfortunately, we can't use a service in our callback. Many places in Drupal core use the controller resolver service for their callbacks/executables, which supports using serviceid:method notation. E.g. you can do something like this for many form API attributes that support callbacks:

$output['comment_form'] = [
  '#lazy_builder' => ['comment.lazy_builders:renderForm', [
    $entity->getEntityTypeId(),
    $entity->id(),
    $field_name,
    $this->getFieldSetting('comment_type'),
  ]],
  '#create_placeholder' => TRUE,
];

So that uses the renderForm() method on the comment.lazy_builders service as its callback. But Batch API didn't get controller resolver integration. So we're stuck with things that can be passed to call_user_func_array(). But all is not lost. Enter the factory method.

The factory method

One of the key tenets of unit testability is dependency injection. And the most common method of dependency injection is constructor injection. So let's take our static method, and instead of having it do the processing, let's make it a factory method.

Our code now looks like this:

namespace Drupal\mymodule\Batch;

use Drupal\mymodule\ProductRepositoryInterface;
use Drupal\mymodule\SearchIndexerInterface;

/**
 * Batch Worker to handle processing expired sale items.
 */
class ProductSaleExpiryBatchWorker {

  /**
   * The product ID we're processing.
   *
   * @var int
   */
  protected $productId;

  /**
   * @var \Drupal\mymodule\ProductRepositoryInterface
   */
  protected $repository;

  /**
   * @var \Drupal\mymodule\SearchIndexerInterface
   */
  protected $searchIndexer;

  /**
   * Constructs a new ProductSaleExpiryBatchWorker object.
   *
   * @param \Drupal\mymodule\ProductRepositoryInterface $repository
   *   Product repo.
   * @param \Drupal\mymodule\SearchIndexerInterface $search_indexer
   *   Search indexer.
   * @param int $product_id
   *   Product ID.
   */
  public function __construct(ProductRepositoryInterface $repository, SearchIndexerInterface $search_indexer, $product_id) {
    $this->productId = $product_id;
    $this->repository = $repository;
    $this->searchIndexer = $search_indexer;
  }

  /**
   * Process the expiration for a given product.
   *
   * @param int $product_id
   *   Product ID to test.
   * @param array $context
   *   Batch context.
   */
  public static function process($product_id, &$context) {
    $repository = \Drupal::service('mymodule.product.repository');
    $promotion_search_index = \Drupal::service('mymodule.promotion_search_index');
    $worker = new static($repository, $promotion_search_index, $product_id);
    $worker->dispatch($context);
  }

  /**
   * Process the expiration for a given product.
   *
   * @param int $product_id
   *   Product ID to test.
   * @param array $context
   *   Batch context.
   */
  protected function dispatch(&$context) {
    $product = $this->repository->find($this->productId);
    if ($product->isOnSale()) {
      $end = $product->getSaleEndDateTime();
      $now = new \DateTime();
      if ($end->getTimestamp() < $now->getTimestamp()) {
        // Sale is finished.
        $this->searchIndexer->delete($product);
      }
    }
  }

}

So we're changing the primary function of the static process method to be

  • Creating a new instance (factory method)
  • Calling the dispatch method

Now we have the bulk of our logic in the dispatch method.

And the dispatch method no longer relies on the container. It uses the injected repository and search indexer.

So now we can write our unit test case. We can mock products that are on sale, or have past end dates and we can wire up a mock repository to return them.
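
A sketch of such a test, using Prophecy mocks (and assuming dispatch() is made public, or exercised via a small test-only subclass, so it can be called without the container):

namespace Drupal\Tests\mymodule\Unit;

use Drupal\mymodule\Batch\ProductSaleExpiryBatchWorker;
use Drupal\mymodule\ProductInterface;
use Drupal\mymodule\ProductRepositoryInterface;
use Drupal\mymodule\SearchIndexerInterface;
use Drupal\Tests\UnitTestCase;

/**
 * @coversDefaultClass \Drupal\mymodule\Batch\ProductSaleExpiryBatchWorker
 */
class ProductSaleExpiryBatchWorkerTest extends UnitTestCase {

  /**
   * An expired sale item must be removed from the search index.
   */
  public function testExpiredSaleItemIsRemovedFromIndex() {
    // A product whose sale ended yesterday.
    $product = $this->prophesize(ProductInterface::class);
    $product->isOnSale()->willReturn(TRUE);
    $product->getSaleEndDateTime()->willReturn(new \DateTime('-1 day'));

    $repository = $this->prophesize(ProductRepositoryInterface::class);
    $repository->find(42)->willReturn($product->reveal());

    // The search index must be asked to delete the expired product.
    $indexer = $this->prophesize(SearchIndexerInterface::class);
    $indexer->delete($product->reveal())->shouldBeCalled();

    $worker = new ProductSaleExpiryBatchWorker($repository->reveal(), $indexer->reveal(), 42);
    $context = [];
    $worker->dispatch($context);
  }

}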

Wrapping up

So while we didn't get all the object-oriented advantages in core, it doesn't mean you can't write unit-testable batch callbacks.

If you're interested in working on modernizing the Batch API - as always - there's an issue for that.

Drupal 8 Batch API Unit testing Refactoring
Jun 29 2016

This one tripped me up on a recent Drupal 8 project.

Easy to miss when you're working in a development-oriented environment with things like JavaScript preprocessing turned off.

A JavaScript file was being added just fine with aggregation turned off, but not getting added with it turned on.

While working on a Drupal 8 client project we were using our module's .libraries.yml file to add a custom plugin for jQuery Validation. Our plugin was using the Moment.js date library to add strict date checking so we could check for overflows. The default date validation in that plugin treats dates like 55/55/5555 as valid - because they are cast to valid JavaScript dates by the browser. We needed to detect overflows and report an error.

It was working all fine locally, but when I sent a Pull request, it didn't work in the Pull request environment (we have per pull-request environments).

After some head scratching I found the issue.

My libraries.yml definition looked like this:

moment_date:
  version: VERSION
  js:
    js/moment_date: {}
  dependencies:
    - clientside_validation/jquery.validate
    - mymodule/moment
    - core/modernizr

If you picked it, I've missed the .js suffix on the file name.

Locally I was working with developer optimised settings, so I had a settings.local.php with the following

$config['system.performance']['js']['preprocess'] = FALSE;

i.e. I was disabling JavaScript aggregation so I could rapidly iterate, something you'd normally do.

Problem was on the Pull Request environment JavaScript aggregation is turned on (as it should be).

And mysteriously this made a difference.

My libraries.yml file was just plain wrong, it should have been

moment_date:
  version: VERSION
  js:
    js/moment_date.js: {}
  dependencies:
    - clientside_validation/jquery.validate
    - mymodule/moment
    - core/modernizr

But with JavaScript aggregation turned off, my webserver was adding the file, sending moment_date.js when moment_date was requested - silently hiding the bug from me.

A tricky one, but one worth sharing.

Drupal 8 libraries JavaScript
Jun 27 2016

On a recent Drupal 8 client project our client was building listing pages using views exposed filters and adding these to the menu.

This resulted in several menu URLs pointing to the same base path, but with the query arguments determining the difference.

However Drupal 8's default menu-trail calculation was resulting in the menu highlighting all instances when one of them was viewed.

Luckily the active trail calculation is done in a service and it was simple to modify the default behaviour.

Read on to see how we did it.

The problem

So the site included a view that displayed all of the different Venues the client managed, with exposed filters that allowed filtering the listing into groups.

The client used the URL generated by the filters to add different menu entries. For example there was a list of 'Community centres' in one section of the menu, linking to a pre-filtered view. In another section of the menu there was a link to 'Outdoor art spaces', also a link to a pre-filtered view.

However Drupal 8's default menu active trail calculation uses the \Drupal\Core\Menu\MenuLinkManager::loadLinksByRoute() method to calculate the active trail. As indicated by the name, this only loads matches based on the route name and parameters, but doesn't consider query arguments such as those used by Views exposed filters.

The solution

Luckily, the menu active trail calculation is handled in a service. This means we can override the definition and inject an alternate implementation or arguments.

Now there are two points we could override here. We could inject a new menu link manager into the menu active trail service and change the way loadLinksByRoute() works to also consider query arguments - however the active trail service is heavily cached, so the first match would be cached and any subsequent ones would not work.

Instead we need to run our code after the values are fetched from the cache, so the logical point is to override Drupal\Core\Menu\MenuActiveTrail::getActiveTrailIds() method to filter out matches and their parents that don't match the current query arguments.

So to do this we need an implementation of \Drupal\Core\DependencyInjection\ServiceModifierInterface. Ours looks something like this:

<?php

namespace Drupal\my_module;

use Drupal\Core\DependencyInjection\ContainerBuilder;
use Drupal\Core\DependencyInjection\ServiceModifierInterface;
use Symfony\Component\DependencyInjection\Reference;

class MyModuleServiceProvider implements ServiceModifierInterface {

  /**
   * {@inheritdoc}
   */
  public function alter(ContainerBuilder $container) {
    // Get the service we want to modify.
    $definition = $container->getDefinition('menu.active_trail');
    // Inject an additional service, the request stack.
    $definition->addArgument(new Reference('request_stack'));
    // Make the active trail use our service.
    $definition->setClass(MyModuleMenuActiveTrail::class);
  }
}

For more information, see our previous blog post on overriding Drupal 8 service definitions.

Filtering on query parameters

Now we have our new active trail service, we need to filter out the links that match on route, but not on query arguments.

To do this, we need to get the query arguments from the current request. In our service alter above you'll note we injected an additional service into our active trail class, the request stack.

This allows us to get the current request and therefore the query arguments.

So first we need a constructor to handle the new argument, and a class property to store it in.

<?php

namespace Drupal\my_module;

use Drupal\Core\Cache\CacheBackendInterface;
use Drupal\Core\Lock\LockBackendInterface;
use Drupal\Core\Menu\MenuActiveTrail;
use Drupal\Core\Menu\MenuLinkManagerInterface;
use Drupal\Core\Routing\RouteMatchInterface;
use Symfony\Component\HttpFoundation\RequestStack;

/**
 * Defines a class for menu active trail that considers query parameters.
 */
class MyModuleMenuActiveTrail extends MenuActiveTrail {

  /**
   * Current request stack.
   *
   * @var \Symfony\Component\HttpFoundation\RequestStack
   */
  protected $requestStack;

  /**
   * {@inheritdoc}
   */
  public function __construct(MenuLinkManagerInterface $menu_link_manager, RouteMatchInterface $route_match, CacheBackendInterface $cache, LockBackendInterface $lock, RequestStack $request_stack) {
    parent::__construct($menu_link_manager, $route_match, $cache, $lock);
    $this->requestStack = $request_stack;
  }

}

Now we have the pieces in place, we just need to add the code to filter out the links and their parents that don't match on query parameters.

/**
 * {@inheritdoc}
 */
public function getActiveTrailIds($menu_name) {
  // Get the existing trail IDs from the core implementation.
  $matching_ids = parent::getActiveTrailIds($menu_name);
  // If we don't have any query parameters, there's nothing to do here.
  if (($request = $this->requestStack->getCurrentRequest()) && $request->query->count()) {
    // Start with the top-level item.
    $new_match = ['' => ''];
    // Get all the query parameters.
    $query = $request->query->all();
    // Get the route name.
    $route_name = $this->routeMatch->getRouteName();
    if ($route_name) {
      $route_parameters = $this->routeMatch->getRawParameters()->all();

      // Load all links matching this route in this menu.
      $links = $this->menuLinkManager->loadLinksByRoute($route_name, $route_parameters, $menu_name);
      // Loop through them.
      foreach ($links as $active_link) {
        $match_options = $active_link->getOptions();
        if (!isset($match_options['query'])) {
          // This link has no query parameters, so cannot match, ignore it.
          continue;
        }
        if ($match_options['query'] == $query) {
          // This one matches - so we add its parent trail to our new match.
          if ($parents = $this->menuLinkManager->getParentIds($active_link->getPluginId())) {
            $new_match += $parents;
          }
        }
      }
    }
    // Replace the existing trail with the new trail.
    $matching_ids = $new_match;
  }
  return $matching_ids;
}

Wrapping up

Drupal 8's service based architecture gives us new levels of flexibility, personally I'm really enjoying building client projects with Drupal 8. I hope you are too.

Drupal 8 Menu active trail Views Request Stack Service Modifier
Jun 20 2016

Drupal 8 includes a datetime field type and widget out of the box.

The widget uses the HTML5 date element on supported browsers, providing a polyfill to a text field with jQuery UI's datepicker for browsers that don't yet support HTML5 date inputs.

However - HTML5 date elements always work in ISO 8601 format - i.e. YYYY-MM-DD - which isn't very user-friendly for those using Firefox and Internet Explorer.

Luckily, with a few tweaks you can easily swap this into DD/MM/YYYY format for those browsers and then switch it back server side into the format Drupal expects.

Step 1 - Telling jQuery UI what format to display the date in.

Luckily, the JavaScript that provides the polyfill interrogates a data attribute on the field to decide what date format to use when the user selects a value.

So our first step is to make sure this attribute is set.

We start with a hook_element_info_alter() and add a process callback:

/**
 * Implements hook_element_info_alter().
 */
function datetime_tweaks_element_info_alter(array &$types) {
  $types['datetime']['#process'][] = 'datetime_tweaks_datetime_set_format';
}

And then in our process callback, we make sure we set the data-drupal-date-format and #date_date_format attributes. We also set a nicer title value - both for user feedback and also for modules like Clientside validation - which use this for error messages.

/**
 * Element process callback for datetime fields.
 */
function datetime_tweaks_datetime_set_format($element) {
  // Use d/m/Y format.
  $element['date']['#attributes']['data-drupal-date-format'] = ['d/m/Y'];
  $element['date']['#date_date_format'] = 'd/m/Y';
  $element['date']['#attributes']['title'] = t('Enter a valid date - e.g. @format', [
    '@format' => (new \DateTime())->format('d/m/Y'),
  ]);
  return $element;
}

Now when the user selects the value in the jQuery time-picker, the value is pasted in the more friendly d/m/Y format. However we have a problem.

Drupal expects the incoming values to be in Y-m-d format, as that is how the HTML5 date element works. The values are sent/received in Y-m-d and the browser formats them according to the user's locale settings. E.g. all of the world except the US would get d/m/Y while the US would get m/d/Y.

So we need to make sure we switch the values back.

Step 2 - switching values back.

Again we go to our old friend hook_element_info_alter(), this time we add a #value_callback to override the default one provided by the datetime element - (\Drupal\Core\Datetime\Element\Datetime::valueCallback).

/**
 * Implements hook_element_info_alter().
 */
function datetime_tweaks_element_info_alter(array &$types) {
  $types['datetime']['#value_callback'] = 'datetime_tweaks_datetime_value';
  $types['datetime']['#process'][] = 'datetime_tweaks_datetime_set_format';
}

Then in our value callback we do the switch if the incoming value is in the format we were expecting. We use the very handy \Drupal\Component\Datetime\DateTimePlus::createFromFormat inside a try-catch block, that way we only convert dates that match the format we switched the jQuery UI datepicker to use. For browsers that aren't using the polyfill, they'll already be sending the value in Y-m-d and we don't want to intervene. Finally now that we've switched the date back to the expected format, we let the default value callback run.

/**
 * Element value callback for browsers that don't support HTML5 type=date.
 */
function datetime_tweaks_datetime_value(&$element, $input, FormStateInterface $form_state) {
  if ($input !== FALSE) {
    try {
      if ($date = DrupalDateTime::createFromFormat('d/m/Y', $input['date'], !empty($element['#date_timezone']) ? $element['#date_timezone'] : NULL)) {
        // Core expects incoming values in Y-m-d format for HTML5 date elements.
        $input['date'] = $date->format('Y-m-d');
      }
    }
    catch (\Exception $e) {
      // Date is not in d/m/Y format - nothing to do.
    }
  }
  return Datetime::valueCallback($element, $input, $form_state);
}

Now we have the values being sent in d/m/Y from browsers that don't support the date element, and switched back into the format Drupal expects on the server. However, Drupal will still be sending default values in the format it expects - Y-m-d - so we need to handle those too.

Step 3 - Default values

So for browsers that don't support the date element, the incoming default values in the DOM will be in Y-m-d format, but we want to display them to the user in d/m/Y format. So we go back to our process callback and attach some JavaScript that uses Modernizr to check for date support, and applies a polyfill to switch the formats clientside.

/**
 * Element process callback for datetime fields.
 */
function datetime_tweaks_datetime_set_format($element) {
  // Use d/m/Y format.
  $element['date']['#attributes']['data-drupal-date-format'] = ['d/m/Y'];
  $element['date']['#date_date_format'] = 'd/m/Y';
  $element['date']['#attributes']['title'] = t('Enter a valid date - e.g. @format', [
    '@format' => (new \DateTime())->format('d/m/Y'),
  ]);
  $element['#attached']['library'][] = 'datetime_tweaks/default_date';
  return $element;
}

And then in our new polyfill for default date

/**
 * @file
 * Default date values.
 */

(function ($, Drupal) {

  'use strict';

  Drupal.behaviors.datetimeTweaksDefaultDate = {
    attach: function (context, settings) {
      var $context = $(context);
      // Skip if date is supported by the browser.
      if (Modernizr.inputtypes.date === true) {
        return;
      }
      $context.find('input[data-drupal-date-format]').once('default-date').each(function () {
        var $el = $(this);
        var val = $el.val();
        // If default date is in Y-m-d format, switch to d/m/Y for browsers
        // that don't support html5 date format.
        if (val.match(/[0-9]{4}-[0-9]{2}-[0-9]{2}/)) {
          var parts = val.split('-');
          $el.val(parts[2] + '/' + parts[1] + '/' + parts[0]);
        }
      });
    }
  };

})(jQuery, Drupal);

So we now have the pieces we want.

Bonus points - removing seconds from time element.

In addition, the default behaviour for browsers that support the date element (e.g. Chrome) is to require seconds for time, which is rarely needed, and can also be easily changed.

Back in our process callback, we add a nicer title, and change the step attribute. The default step attribute is 1, meaning the element accepts time increments of 1 second. Switching this to 60 means the smallest unit is minutes and the seconds element isn't added by the browser.

/**
 * Element process callback for datetime fields.
 */
function datetime_tweaks_datetime_set_format($element) {
  // Use d/m/Y format.
  $element['date']['#attributes']['data-drupal-date-format'] = ['d/m/Y'];
  $element['date']['#date_date_format'] = 'd/m/Y';
  $element['date']['#attributes']['title'] = t('Enter a valid date - e.g. @format', [
    '@format' => (new \DateTime())->format('d/m/Y'),
  ]);
  $element['time']['#attributes']['title'] = t('Enter a valid time - e.g. @format', [
    '@format' => (new \DateTime())->format('h:i'),
  ]);
  // Remove seconds in browsers that support HTML5 type=date.
  $element['time']['#attributes']['step'] = 60;
  $element['#attached']['library'][] = 'datetime_tweaks/default_date';
  return $element;
}

Wrapping up

So if this sounds useful for your project, all of the code can be found in this GitHub repository.

Thanks to all those that worked on Date support in core, especially the maintainers - Jonathan Hedstrom 'jhedstrom' and Matthew Donadio 'mpdonadio'.

If you want to help out - they're hard at work on adding support for end dates in a single field - why not head over there and help out with reviews and manual testing?

Drupal 8 Date Time HTML5 Polyfill
Jun 15 2016

Whilst working on a Drupal 8 project, we found that cache tags for a Block Content entity embedded in the footer weren't bubbling up to the page cache.

Read on to find out how we debugged this and how you can ensure this doesn't happen to you too.

The problem

On our site we had a Block Content entity embedded in the footer that contained an Entity Reference field which allowed site admins to highlight author profiles. The issue was that if the admin edited this block and changed the referenced authors, or the order of the authors - the changes didn't reflect for anonymous users until the page cache was cleared.

This immediately sounded like an issue with cache tags bubbling.

About cache tags

So what are cache tags? Well, let's quote the excellent handbook page:

Cache tags provide a declarative way to track which cache items depend on some data managed by Drupal.

So in our case, as the page is built, all of the content that is rendered has its cache tags bubble up to the page level cache entry. When any of the items that form the page are updated, all cache entries that match that item's tags are flushed. This ensures that if a node or block that forms part of a page is updated, the cache entry for the page is invalidated. For entities, the tags are in the format {entity type}:{entity id} - e.g. node:2 or block_content:7
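
For reference, a render array opts into this system via its #cache key; a minimal illustrative example (assuming $block_content is a loaded entity):

$build = [
  '#markup' => $block_content->label(),
  '#cache' => [
    // Bubble the entity's own tags (e.g. block_content:7) so any page
    // cache entry containing this build is invalidated when it changes.
    'tags' => $block_content->getCacheTags(),
  ],
];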

Clearly this wasn't happening for our block.

Debugging cache tags

So the first step with debugging this issue is to see what cache tags were associated with the page.

Luckily, core lets you do this pretty easily.

In sites/default/services.yml you'll find this line:

http.response.debug_cacheability_headers: false

Simply change it to true, and rebuild your container (clear caches or drush cr). Then browse to your site and view the headers in the Network panel of your developer toolbar. You'll start seeing headers showing you the cache tags like so:

Response headers

Check out the X-Drupal-Cache-Tags header to see what tags make up the page.

So in our case we could see that the block we were rendering wasn't showing up in the cache tags.

Digging into the EntityViewBuilder for the block and block content entity, we could see that the right tags were being added to the $content variable, but they just weren't bubbling up to the page level.

Rendering individual fields in Twig templates

Now, this particular block had its own Twig template; we were controlling the markup to ensure that one field was rendered in a grid layout and another was rendered at the end. The block type had two fields, and we were rendering them using something like this:

<div{{ attributes.addClass('flex-grid__2-col') }}>
  {% if label %}
    <h2{{ title_attributes.addClass('section-title--light') }}>{{ label }}</h2>
  {% endif %}
  {% block content %}
    <div class="flex-grid">
      {{ content.field_featured_author }}
    </div>
    <div class="spacing--small-before">
      {{ content.field_more_link }}
    </div>
  {% endblock %}
</div>

i.e. we were rendering just field_featured_author and field_more_link from the content variable. And this is the gotcha. You have to render the content variable to ensure that its cache tags bubble up and end up in the page cache.

The fix

There were only two fields on this block content entity, and we wanted control over how they were output. But we also had to render the content variable to make sure the cache tags bubbled. This was a chance for the Twig without filter to rescue the day. The new markup was:

<div{{ attributes.addClass('flex-grid__2-col') }}>
  {% if label %}
    <h2{{ title_attributes.addClass('section-title--light') }}>{{ label }}</h2>
  {% endif %}
  {% block content %}
    {{ content|without('field_featured_author', 'field_more_link') }}
    <div class="flex-grid">
      {{ content.field_featured_author }}
    </div>
    <div class="spacing--small-before">
      {{ content.field_more_link }}
    </div>
  {% endblock %}
</div>

In other words, we still render the fields on their own, but we make sure we also render the top-level content variable, excluding the individual fields using the without filter.

After this change, we started seeing our block content cache tags in the page-level cache tags and as to be expected, changing the block triggered the appropriate flushes of the page cache.

Twig Caching Drupal 8 Cache Tags
Aug 23 2015

Earlier in the year we worked with a household Australian name to help them build a next-generation more-performant version of their energy-comparison site.

We used Blackfire.io to optimize the critical elements of the site.

Read on to find out more about our experience.

Project background

We've built some pretty complex sites and were thrilled when we were approached by a household Australian name to help them build a next-generation more-performant version of their energy-comparison site.

We took a fairly bold approach to the project, focussing on some of the key pain points of the previous build such as:

  • Slow to load search results page
  • Difficult for admin users to add and edit energy offers.

We settled on a site for admin users to maintain the energy offers with file upload support and a separate consumer site to power the offer search.

In order for both of them to access shared data, the admin user site stores and updates data using AWS DynamoDB as the canonical data-source.

Published energy offers (available for consumers to search) are stored in an ElasticSearch index with access to both DynamoDb and Elastic abstracted behind a generic PHP library that contains both infrastructure concerns as well as the domain model.

This means the bulk of the code is decoupled from Drupal, and the two sites interact with the storage and retrieval via the domain model interfaces. This decoupling meant we could test domain logic without needing the full Drupal stack.
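
As an illustration only (the real library and its interfaces are specific to the project), the shape of such a domain interface might be:

namespace Acme\EnergyOffers;

/**
 * Hypothetical storage-agnostic access to published energy offers.
 */
interface OfferRepositoryInterface {

  /**
   * Finds published offers available for a given postcode.
   *
   * @param string $postcode
   *   The consumer's postcode.
   *
   * @return object[]
   *   Matching offer objects, regardless of whether they live in DynamoDB
   *   or Elasticsearch behind the scenes.
   */
  public function findByPostcode($postcode);

}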

The key piece of the site functionality is the ability to enter your current energy usage and have the system show you an indicative bill for each of the offers available in your area.

Obviously this means you can't use reverse-proxy caching technologies like Varnish as each search relies on user-posted data to feed into the complex calculations performed by the algorithm.

Throughout the build, we identified a number of places to optimize the algorithm but in the interest of avoiding premature optimisation, waited until the bulk of the site was built before profiling.

Enter Blackfire.io

Having had success using Blackfire.io to profile the Drupal 8 installation process, we opted to use it to help us squeeze the most out of our application.

Setting up was a breeze, we simply added the required repositories, installed the packages and completed the configuration.

Because we were interested in profiling the POST requests to the search form, where the most intensive calculation algorithm was, we used the browser to submit the form and then grabbed it as a cURL request from the console.

Getting POST data as cURL

From here it was straightforward to use the blackfire CLI binary with the curl string to profile a search submission and the algorithm.

First pass and eliminating HTTP requests

We ran our first pass to generate a baseline for comparison. The test was run using a local VM against a staging AWS DynamoDB instance with realistic data and Elasticsearch running on an EC2 instance. We were expecting some network latency, but found that HTTP requests accounted for 94% of the page-load time. Even allowing for latency being lower in the same data-centre than from a local VM, this was surely low-hanging fruit.

The bulk of our search data was stored in Elasticsearch, but each record was associated with the retailer that offered it. Thanks to profiling we were able to see that the bulk of the load time was loading each retailer entity in turn from the DynamoDb store. Given changes to retailer details (address, logo etc) would be very infrequent, we added a cache-layer to this, which was easy to do as access to this data was behind a domain-model interface. Thanks to the trusty service container we simply created a new cached decorator for our retailer storage service.

Saving time by eliminating http requests
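
A cached decorator along these lines (the interface and cache IDs are illustrative, not the project's actual code) is only a handful of lines:

namespace Acme\EnergyOffers;

use Drupal\Core\Cache\CacheBackendInterface;

/**
 * Hypothetical storage-agnostic retailer lookup.
 */
interface RetailerStorageInterface {

  /**
   * Loads a retailer by ID.
   */
  public function load($retailer_id);

}

/**
 * Caches retailer lookups in front of the DynamoDB-backed storage.
 */
class CachedRetailerStorage implements RetailerStorageInterface {

  protected $inner;
  protected $cache;

  public function __construct(RetailerStorageInterface $inner, CacheBackendInterface $cache) {
    $this->inner = $inner;
    $this->cache = $cache;
  }

  /**
   * {@inheritdoc}
   */
  public function load($retailer_id) {
    $cid = 'retailer:' . $retailer_id;
    if ($cached = $this->cache->get($cid)) {
      return $cached->data;
    }
    $retailer = $this->inner->load($retailer_id);
    // Retailer details (address, logo etc.) change rarely, so cache them
    // permanently and rely on explicit invalidation when they are edited.
    $this->cache->set($cid, $retailer);
    return $retailer;
  }

}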

Second pass and hashing

After implementing this cache layer, we had reduced the page-load time by 90% compared with our first pass. Digging into the second pass we found a lot of time being spent hashing data returned from the Elasticsearch index. In order to track whether any changes needed saving on the admin user site, we implemented a hash calculation in the Offer object's constructor, which could be recalculated at any time to determine if changes existed that needed to be persisted. Because the domain model allowed storage of offers in both DynamoDB and Elasticsearch, we found that offers loaded back from Elasticsearch were also calculating this hash, but because the consumer site was read-only this was redundant. So we slightly modified the constructor to only calculate the hash if it was missing, and saved 56% of the CPU time of the already improved page.

Saving time by eliminating needless hashing

Third pass and calculate on write

The bulk of the algorithm for calculating the energy offer estimate requires determining how many days a given tariff overlaps with a given season. On our third pass we saw a lot of time being spent calculating these intersections. But these are static, and only change when the offer changes, so they could be calculated and persisted with the offers when they were saved. After changing these to be calculated at write time we saw another 10% reduction in CPU time. We were now an order of magnitude faster than the baseline.

Saving time by calculating intersections at write time

Fourth pass and optimising \DateTime creation

On our fourth pass we noticed a lot of CPU time being spent constructing \DateTime objects. Many of the offer properties are timestamps which are stored as strings and converted back into \DateTime objects in the denormalizer. We were using the generic \DateTime constructor in a utility factory, but in all cases we knew the stored format of the string, so switching to the more performant \DateTime::createFromFormat() and statically caching a single timezone object instead of creating a new one each time saved us some more time.

Date object instantiation eating CPU cycles
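
The change was essentially along these lines (the function name, format string and timezone are illustrative):

/**
 * Parses a stored timestamp string into a \DateTime object.
 */
function acme_parse_timestamp($value) {
  // Reuse a single timezone instance instead of constructing one per call.
  static $timezone;
  $timezone = $timezone ?: new \DateTimeZone('Australia/Melbourne');
  // The stored format is known, so parse it directly rather than letting
  // the generic \DateTime constructor work it out on every call.
  return \DateTime::createFromFormat('Y-m-d\TH:i:sP', $value, $timezone);
}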

Wrapping up

There are a number of profiling options on the market for PHP, but Blackfire is by far the easiest we have found to install, utilize and interpret. The ability to switch between CPU cycles, load time, memory use and I/O time and quickly pinpoint pain points means you can yield real performance improvements in just a few passes - as seen in this case study.

I look forward to using blackfire on my next project and in my open-source contributions.

This post originally appeared on the Blackfire.io blog

Aug 03 2015

Drupal 8 comes with a service-based architecture allowing clean dependency injection, separation of concerns and another way to modify how Drupal works without hacking core.

You've probably heard that Drupal 8 lets you swap out a core service for your own implementation, hey, I even said it myself here and here, but how do you achieve that?

Read on to find out how to manipulate Drupal 8 services at run-time and how this compares to other popular PHP frameworks like Laravel, Silex and Symfony.

Services based architecture

Drupal 8 comes with a service-based architecture. Put simply, services are objects that represent operations which cannot be modelled as value-objects or entities. They are responsible for providing infrastructure concerns and for executing operations on domain objects (value-objects and entities).

Services in Drupal 8 are managed and instantiated using the dependency injection container, also known as the service container.

The service container is an implementation of the Inversion of Control design pattern. Its primary role is to build services on behalf of client services, to decouple the clients from the burden of knowing how the dependent service is constructed.

In Drupal 8, the primary registration of services is done via YAML based configuration, in the form of core and each enabled module's services.yml file.

An example if you will

Let's consider the forum index storage service as an example. This service is responsible for managing the association between forum posts (nodes) and forums (taxonomy terms) in an optimized format to allow forum module to operate in the most efficient manner. It implements \Drupal\forum\ForumIndexStorageInterface. Everywhere the forum index storage service is required, it is typehinted using the interface. This complies with the Liskov substitution principle and allows an alternate implementation of ForumIndexStorageInterface to be substituted. The default forum index storage service uses the database connection to read and write to the {forum_index} table. Let's assume you're working on a client project that needs forum module, but for performance's sake you want to store forum posts in a No-SQL database like MongoDB or perhaps the managed DynamoDb service from Amazon AWS.

Modifying services and the container

Tim Milwood gave a static example of how you might edit a service in the container, but there are advanced use cases that it does not handle: what if you want to change the arguments, or what if you need to perform dynamic modifications, i.e. only when a particular module is installed, or based on some other state of the container? Consider again the forum index storage example. The default service definition looks like this:

forum.index_storage:
  class: Drupal\forum\ForumIndexStorage
  arguments: ['@database', '@forum_manager']
  tags:
    - { name: backend_overridable }

But in our example, we don't want to inject the database. Let's assume we're using DynamoDb and we have a Dynamo client service which we need injected instead. The alias approach doesn't allow us to do this.

This is where \Drupal\Core\DependencyInjection\ServiceModifierInterface and \Drupal\Core\DependencyInjection\ServiceProviderInterface come in. To implement either of these interfaces, we first need to give our class a magic name. This is one of the few non-hook places in core where convention prevails over configuration, but for those familiar with hooks, magic names are nothing new.

Start by taking your module name and converting it into camel case, then add the ServiceProvider suffix. For example, if our module for the DynamoDb implementation of forum index storage has a machine name of forum_dynamo, we'd need a fully qualified class name of \Drupal\forum_dynamo\ForumDynamoServiceProvider. You can also extend from \Drupal\Core\DependencyInjection\ServiceProviderBase if you like, which already implements the interfaces for you. Let's look at the two interfaces in detail.

ServiceProviderInterface

This interface is for when your module needs to register services in the service-container in a dynamic fashion. YAML based service configuration is great, and very easy to parse - but what if you need to alter your service definition based on another condition in the environment - such as which modules are enabled. This is where ServiceProviderInterface comes in. It consists of a single method - register as follows:

interface ServiceProviderInterface {

  /**
   * Registers services to the container.
   *
   * @param ContainerBuilder $container
   *   The ContainerBuilder to register services to.
   */
  public function register(ContainerBuilder $container);

}

The register method is called during container building, before dumping to disk. From here we can interact with other services. If you need to check which modules are enabled you can use the getParameter method on the container builder argument like so:

$modules = $container->getParameter('container.modules');

A great example of this in core is Drupal\language\LanguageServiceProvider, which is responsible for registering the language_request_subscriber and path_processor_language services, only if the site is multi-lingual. Clearly this couldn't be done in YAML alone.
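
As a hedged sketch of the same pattern, a register() implementation that only adds a service when another module is installed might look like this (the module, service ID and subscriber class are invented for illustration):

<?php
namespace Drupal\mymodule;

use Drupal\Core\DependencyInjection\ContainerBuilder;
use Drupal\Core\DependencyInjection\ServiceProviderBase;

class MymoduleServiceProvider extends ServiceProviderBase {

  /**
   * {@inheritdoc}
   */
  public function register(ContainerBuilder $container) {
    $modules = $container->getParameter('container.modules');
    if (isset($modules['search'])) {
      // Only register this subscriber when search module is installed.
      $container->register('mymodule.search_subscriber', 'Drupal\mymodule\EventSubscriber\SearchSubscriber')
        ->addTag('event_subscriber');
    }
  }

}
?>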

ServiceModifierInterface

This interface is the one we want for modifying existing service definitions. Heading back to our example, we want to change the forum.index_storage service.

It consists of a single method as follows:

interface ServiceModifierInterface {

  /**
   * Modifies existing service definitions.
   *
   * @param ContainerBuilder $container
   *   The ContainerBuilder whose service definitions can be altered.
   */
  public function alter(ContainerBuilder $container);

}

So we start by getting a reference to the forum.index_storage definition like so:

$definition = $container->getDefinition('forum.index_storage');

Then we need to change the arguments like so - assuming that the Dynamo client has the service ID 'dynamo.client':

$definition->setArguments([
  new \Symfony\Component\DependencyInjection\Reference('dynamo.client'),
]);
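
Putting the pieces together, a minimal sketch of the whole service provider for our hypothetical forum_dynamo module could look like this (DynamoForumIndexStorage and dynamo.client are assumed names, not existing code):

<?php
namespace Drupal\forum_dynamo;

use Drupal\Core\DependencyInjection\ContainerBuilder;
use Drupal\Core\DependencyInjection\ServiceProviderBase;
use Symfony\Component\DependencyInjection\Reference;

/**
 * Swaps forum index storage over to a DynamoDb-backed implementation.
 */
class ForumDynamoServiceProvider extends ServiceProviderBase {

  /**
   * {@inheritdoc}
   */
  public function alter(ContainerBuilder $container) {
    $definition = $container->getDefinition('forum.index_storage');
    // Use our own implementation of ForumIndexStorageInterface and inject
    // the Dynamo client in place of the database connection.
    $definition->setClass('Drupal\forum_dynamo\DynamoForumIndexStorage');
    $definition->setArguments([new Reference('dynamo.client')]);
  }

}
?>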

Again Drupal\language\LanguageServiceProvider provides a great example of this in core, changing the default language manager provided in core to the configurable one provided by the language module, configuring the language config factory override service and setting a container parameter for the default languages.

Parallels with other PHP frameworks

Now what Drupal does in this space, registering and altering, is not dissimilar to comparable approaches in other PHP frameworks such as the Symfony full-stack framework, Laravel 5 and Silex.

Symfony

Symfony bundles can define an extension; in this case the extension has the same name as the bundle, except the 'Bundle' suffix is replaced with 'Extension'. The extension class needs to implement Symfony\Component\DependencyInjection\Extension\ExtensionInterface, but normally would just extend from Symfony\Component\DependencyInjection\Extension\Extension. This has a load method which is analogous to Drupal's register method.

Silex

In Silex, services are provided by service providers. Because Silex is geared towards bespoke applications rather than generic portable code, each service provider is manually registered in the application using $app->register(). In this case a service provider is a class implementing Silex\ServiceProviderInterface, which consists of a boot and a register method. The register method is analogous to Drupal's ServiceProviderInterface. The boot method serves the same use-case as the alter method in Drupal's ServiceModifierInterface.

Laravel

Like Silex, Laravel is also concerned with bespoke applications so service provider registration is done manually in your config/app.php. Service providers extend from Illuminate\Support\ServiceProvider. Just like Silex there are boot and register methods, and both serve the same use case as Drupal's alter and register methods respectively.

Conclusion

Being able to modify services adds a whole new level of customisation to Drupal, but as seen by the similarities to other frameworks, one which should feel familiar to developers coming to Drupal from other PHP frameworks. I think as time goes by, modifying services will become one of the essential skills in a Drupal developer's bag of tricks.

If you want to hear more about how Drupal 8 will be game-changing, we're running a series of 'Get Ready for Drupal 8' seminars in many Australian capital cities on Thursday August 6th 2015. These are free to attend - but registration in advance is required. Hope to see you there.

Mar 17 2015
Mar 17

Drupal 8 comes with two extension points for module developers to allow other modules to interact with their code.

The trusty alter hook, the linchpin of Drupal versions past is still there - allowing other modules to interact and intervene in the behaviour of your module.

But there is a new kid on the block, the event system.

So as a module developer, how do you decide whether to use the alter system or the event system?

To alter or to fire an event?

This was the premise of a recent IRC conversation between myself, dawehner and bojanz, so for posterity I present to you the pros and cons we came up with.

Pros of events

  • Object oriented. Hooks still rely on procedural code, meaning unit-testing is possible but more involved. As events are object-oriented and support dependency injection, you can unit-test your events - although in practice your events should be thin, like your controllers, with logic deferred to the larger parts of your application. Either way, object-oriented code presents a nicer developer experience than working with procedural functions.
  • Stopping propagation. Like a JavaScript event, the event dispatcher allows any single event to stop propagation, preventing subsequent event listeners from firing. This can't be done with an alter hook.
  • Firing twice on the one event. By default with the hook system, your hook is fired based on alphabetical sorting, i.e. barfoo_some_hook will go before foobar_some_hook because barfoo comes before foobar in the alphabet. If you want to alter the order in which the hooks fire, you can alter the module's weight - which applies globally for all of its hooks - or implement hook_module_implements_alter to re-order on a per-hook basis. What you can't do, however, is fire your hook twice. With the event system, you can register two listeners for the one event from the one module, so your module can listen both first and last when the event is dispatched, as the sketch below shows.
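
A rough sketch of registering the same subscriber twice for one event, at different priorities (the event name, class and methods are hypothetical):

<?php
namespace Drupal\mymodule\EventSubscriber;

use Symfony\Component\EventDispatcher\EventSubscriberInterface;

/**
 * Hypothetical subscriber that listens both first and last for one event.
 */
class BookendSubscriber implements EventSubscriberInterface {

  public static function getSubscribedEvents() {
    return [
      // Two listeners for the one event: the high priority runs first, the
      // negative priority runs last.
      'mymodule.something' => [
        ['onBefore', 100],
        ['onAfter', -100],
      ],
    ];
  }

  public function onBefore($event) {
    // Runs before other listeners.
  }

  public function onAfter($event) {
    // Runs after other listeners.
  }

}
?>

Registered as a tagged event_subscriber service in mymodule.services.yml, both methods fire for the one dispatch.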

Pros of alter hooks

  • One thing the event system doesn't easily allow is for one module to remove another module's implementation. This can be done with hook_module_implements_alter in the hook system. For your particular site-build you may find one module's implementation of a hook problematic or even broken; with the hook system, you can implement hook_module_implements_alter to remove that implementation and re-implement it in your own way, as the example below shows. To do something similar in the event-dispatcher system would require you to alter the container definition and remove the event listener you don't want.
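
For example, dropping another module's implementation looks roughly like this (the module and hook names are illustrative):

<?php
/**
 * Implements hook_module_implements_alter().
 */
function mymodule_module_implements_alter(&$implementations, $hook) {
  if ($hook == 'form_alter' && isset($implementations['othermodule'])) {
    // Drop othermodule's problematic implementation; mymodule re-implements
    // the parts it still needs in its own hook_form_alter().
    unset($implementations['othermodule']);
  }
}
?>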

Code sample for hooks

<?php
function mymodule_dosomething() {
  $data = mymodule_get_some_data();
  // alter() passes $data to implementations by reference and returns nothing,
  // so return the altered data rather than the return value of alter().
  \Drupal::moduleHandler()->alter('my_module_hook', $data);
  return $data;
}
?>

Code sample for events

<?php
namespace Drupal\mymodule;

use Symfony\Component\EventDispatcher\EventDispatcherInterface;
use Drupal\mymodule\Events\SomethingEvent;

/**
 * Defines a class for doing stuff.
 */
class MyModuleSomething {

  protected $eventDispatcher;

  /**
   * Constructs a MyModuleSomething.
   */
  public function __construct(EventDispatcherInterface $event_dispatcher) {
    $this->eventDispatcher = $event_dispatcher;
  }

  /**
   * Does something.
   */
  public function doSomething($with_this) {
    $event = new SomethingEvent($with_this);
    $this->eventDispatcher->dispatch(SomethingEvent::JUST_DO_IT, $event);
    return $event->showMeTheDoneStuff();
  }

}
?>
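
For completeness, the SomethingEvent used above could be as simple as the following hedged sketch - the constant, constructor argument and accessors are whatever your module needs:

<?php
namespace Drupal\mymodule\Events;

use Symfony\Component\EventDispatcher\Event;

/**
 * Event fired when mymodule does something.
 */
class SomethingEvent extends Event {

  /**
   * Name used when dispatching the event.
   */
  const JUST_DO_IT = 'mymodule.something';

  protected $stuff;

  public function __construct($stuff) {
    $this->stuff = $stuff;
  }

  /**
   * Allows listeners to swap in their own value.
   */
  public function setStuff($stuff) {
    $this->stuff = $stuff;
  }

  /**
   * Returns the (possibly altered) value after dispatch.
   */
  public function showMeTheDoneStuff() {
    return $this->stuff;
  }

}
?>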

So what will it be?

Are you building a module for Drupal 8 and have already decided to use events instead of alter hooks? Let us know what influenced your decision in the comments - let's keep the conversation going and build out a resource to help others make their decision.

Drupal 8 Event Dispatcher Event Listener Hooks
Jan 15 2015
Jan 15

In an effort to continue the velocity of work on Drupal 8 criticals from the Ghent criticals sprint, we've taken it upon ourselves to get together for at least one hour each Friday to focus on Drupal 8 criticals.

Read on to find out what we got up to in the first week, but also how you can get involved.

Critical Office Hours

At the time of writing, there are 85 remaining criticals for Drupal 8. Of these, 53 were updated in the last week, so as a community we're swarming on these remaining issues to get us to a Drupal 8.0.0 release candidate.

So in order to make working on these issues as accessible as possible, every Friday from 12 noon to 1 pm GMT + 10 (Brisbane time) we're meeting online in #drupal-au to work on Drupal 8 criticals.

Week one recap

During week one we worked on four issues as follows:

So three fixed criticals in the first week is not too shabby in my books, two of those were a direct result of our work, and the third one had a couple of new patches and has since gone in.

Get involved

If you're interested in helping, say hi on #drupal-au to Jibran Ijaz (jibran), David Forbes (dashaf), Magda Kostrzewa (miss_phing), Kim Pepper (kimb0) or Myself (larowlan) and let us know you'll be coming along.

Drupal 8 Critical Office Hours Core Mentoring
Dec 03 2014
Dec 03

What's new with Drupal 8?

Where's Drupal 8 at in terms of release?

Since the last Drupal Core Updates, we fixed 18 critical issues and 12 major issues, and opened 9 criticals and 18 majors. That puts us overall at 110 release-blocking critical issues and 705 major issues.

Part of managing Drupal 8 toward its release is continuously reassessing what must block a release of 8.0.0. (Remember, hundreds of thousands of people will benefit from all the great new functionality in Drupal 8, so we need to be smart about what does or doesn't hold up that release!) The chart below illustrates not only those newly discovered and newly fixed critical issues each week, but also issues that are promoted to critical and demoted from critical based on our latest understanding. For more information on what is (and isn't) release-blocking, see the handbook page on issue priority.

Incoming and outgoing Drupal 8 critical issues per week

Current focus

The current top priority in Drupal 8 is to resolve issues that block a beta-to-beta upgrade path (critical issues tagged 'D8 upgrade path'). We also need core contributors to continue evaluating issues for the beta phase based on the beta changes policy.

Finally, keep an eye out for critical issues that are blocking other work.

How to get involved

If you're new to contributing to core, check out Core contribution mentoring hours. Twice per week, you can log into IRC and helpful Drupal core mentors will get you set up with answers to any of your questions, plus provide some useful issues to work on.

If you are interested in really digging into a tough problem and helping resolve a stagnating release blocker, or if you are stuck on a critical currently, join the #drupal-contribute IRC channel during weekly critical issue office hours on Fridays at 12:00p PST. See chx's office hours reports for an idea of what we've done so far!

If you'd like to contribute to a particular Drupal 8 initiative or working group, see the regularly scheduled meetings on the Drupal 8 core calendar.

You can also help by sponsoring independent Drupal core development.

Notable Commits

The best of git log --since "1 week ago" --pretty=oneline (70 commits in total):

  • Issue 2359369 by mpdonadio, Berdir, bdurbin: Render cache is not cleared when module is uninstalled - cache invalidation is always hard :)
  • Issue 2377281 by hussainweb, dawehner: Upgrade to Symfony 2.6 stable - getting close to the 2.7 LTS release
  • Issue 2342593 by znerol, grendzy, David_Rothstein: Remove mixed SSL support from core - aligning Drupal with the wider web trends regarding https
  • Issue 2369781 by larowlan: Ensure twig_debug output has needed sanitization - another critical security fix down
  • Issue 2384581 by cilefen, Wim Leers: Security: Update CKEditor library to 4.4.6 - brings some security improvements
  • Issue 2384163 by yched: Entity render cache is needlessly cleared when an Entity*Fom*Display is modified - performance++
  • Issue 2368275 by martin107, dawehner, znerol, Crell, Wim Leers: EntityRouteEnhancer and ContentFormControllerSubscriber implicitly depend on too many services - ensuring our critical execution path is as lean as possible
  • Issue 2348459 by larowlan, alexarpen: Fields of type 'Text (formatted)' do NOT save values - a critical that was causing data loss when editor module was enabled
  • Issue 2235901 by alexpott, mdrummond, iMiksu, sun, Wim Leers: Remove custom theme settings from *.info.yml - theme system using config objects like everything else
  • Issue 2212335 by jhodgdon: Separate out NodeSearch::execute() into finding vs. processing results
  • Issue 2377397 by Wim Leers, alexpott: Themes should use libraries, not individual stylesheets - moving us towards simplifying ajax page state and a smaller JavaScript settings object, and hence increased performance

You can also always check the Change records for Drupal core for the full list of Drupal 8 API changes from Drupal 7.

Drupal 8 Around the Interwebs

Drupal 8 in "Real Life"

Whew! That's a wrap!

Do you follow Drupal Planet with devotion, or keep a close eye on the Drupal event calendar, or git pull origin 8.0.x every morning without fail before your coffee? We're looking for more contributors to help compile these posts. You could either take a few hours once every six weeks or so to put together a whole post, or help with one section more regularly. Read more about how you can volunteer to help with these posts!

Nov 27 2014
Nov 27

On further research, my thinking that your syntax was outdated was due to the fact that I had used this with fields and it works differently there. I posted an issue to try and get some clarification on why it is different for fields. If you have any insight, I'd appreciate a comment. https://www.drupal.org/node/2479551

Thanks!

Nov 13 2014
Nov 13

I think you briefly touched on it two or three ways without mentioning it specifically, but to go along with the communication and understanding: just as what you say might offend someone else because of the language barrier, something they say might offend you because of that same barrier, without them even knowing they've done it. Keep in mind that Google Translate doesn't always produce acceptable results, and don't retaliate to someone's comment mindlessly.

Jan 09 2014
Jan 09

What's new with Drupal 8?

It's been three weeks since our last 'This week in core' post, but with holidays providing a welcome break for many, core development has continued at its usual rapid pace. Time flies! The session submission deadline for Drupal Dev Days Szeged is just around the corner on the 15th of January.

Give Drupal a birthday present

Next week is Drupal's 13th birthday! Want to give Drupal a birthday present? Why not tackle an issue, or help mentor someone else to do so. There's also a Reddit AMA appearance by Dries (that's Ask me anything for those who don't use Reddit - I had to look it up) and we're also planning a special "This Year in Drupal Core".

New Drupal core commit schedule

We're trying out a new commit schedule to increase core momentum. For one week starting January 15 up until the release of Drupal 8.0-alpha8 on January 22, core maintainers will commit only critical and major patches. (Normal and minor patches will be committed again starting January 23.) Read more about the new commit schedule.

Where's Drupal 8 at in terms of release?

Last week, we fixed 11 critical issues and 14 major issues, and opened 9 criticals and 11 majors. That puts us overall at 135 release-blocking critical issues and 482 major issues.

10 beta-blocking issues were fixed last week. There are still 61 of 98 beta blockers that must be resolved and 48 change records that must be written before we can release a Drupal 8 beta.

Where can I help?

Top criticals to hit this week

Each week, we check with core maintainers and contributors for the "extra critical" criticals that are blocking other work. These issues are often tough problems with a long history. If you're familiar with the problem space of one of these issues and have the time to dig in, help drive it forward by reviewing, improving, and testing its patch, and by making sure the issue's summary is up to date and any API changes are documented.

More ways to help

If core's toughest criticals aren't on your to-do list this week, there are lots of other places to jump in and help with conversions and cleanups in core. The Drupal 8 "meta meta", compiled by vijaycs85, is a great place to start if you want to dig your teeth into a technical problem but aren't sure where to start. Or if coding isn't your thing, there are plenty of issues tagged as Needs change notification. Writing these is a great way to keep abreast of recent changes - see more on change records to get started.

Additionally:

  • The CMI initiative are planning to hold biweekly meetings to gain momentum around the remaining critical issues
  • As always, if you're new to contributing to core, check out Core contribution mentoring hours. Twice per week, you can log into IRC and helpful Drupal core mentors will get you set up with answers to any of your questions, plus provide some useful issues to work on.

Notable Commits

The best of git log --after=2013-12-19 --pretty=oneline (192 commits in total!):

  • #2005716 by tim.plunkett, msonnabaum, dawehner, alexpott, effulgentsia: Promote EntityType to a domain object - this moves much of the entity type annotation/info to a first-class object. The existing entity type info/annotation was a large array with seemingly arbitrary keys. Much of the code had isset checks to see if a particular array key exists. This makes entity type an object with methods for retrieving the various information.
  • #2098119 by beejeebus, alexpott, chx: Replace config context system with baked-in locale support and single-event based overrides. As the name suggests, this bakes locale support into the config system and removes the config context system. Modules wishing to change config based on other non-locale circumstances (e.g. domain module) can subscribe to an event and react accordingly.
  • Issue #2130811 by alexpott, Gábor Hojtsy, vijaycs85, sun, Wim Leers: Use config schema in saving and validating configuration form to ensure data is consistent and correct. This ensures that configuration is in the correct format, as defined in the schema, before the config files are written, e.g. integers are cast from text, etc.
  • Issue #2042807 by tim.plunkett, pwolanin, ianthomas_uk, jhodgdon: Convert search plugins to use a ConfigEntity and a PluginBag. One of the last plugin conversions and an important blocker of other criticals.
  • Issue #2068471 by dawehner, Crell, tim.plunkett, jibran, fubhy, larowlan: Normalize Controller/View-listener behavior with a Page object. This sets the stage for the last push to finish up WSCCI (the Web Services initiative) and for SCOTCH (the Blocks and Layouts initiative) to be completed in contrib.

You can also always check the Change records for Drupal core for the full list of Drupal 8 API changes from Drupal 7.

Drupal 8 Around the Interwebs

Blog posts about Drupal 8 and how much it's going to rock your face.

Drupal 8 in "Real Life"

  • Jan. 25-26: Mark your calendars for the next Global Sprint Weekend. Join local user groups around the planet for a weekend of Drupal 8 contribution. The sprint weekend will be a great opportunity to engage your local community, and there are lots of resources on the sprint page to help get new people involved. Events are already planned everywhere from Illinois to Budapest to Montréal to Spain. Add yours today!
  • Feb 14-17 DrupalSouth. Join Australian, New Zealand and international guests for a weekend of Drupal. Signup for the sprint day is now open - come along and sprint on Drupal 8.

Whew! That's a wrap!

Do you follow Drupal Planet with devotion, or keep a close eye on the Drupal event calendar, or git pull origin 8.x every morning without fail before your coffee? We're looking for more contributors to help compile these posts. You could either take a few hours once every six weeks or so to put together a whole post, or help with one section more regularly. Contact xjm if you'd like to help communicate all the interesting happenings in Drupal 8!

Oct 16 2013
Oct 16

What's new in Drupal 8?

This week saw a large number of bug fixes and cleanup tasks as we work hard cleaning up our APIs and codebase for a beta release. The next alpha (alpha4) is slated for October 18th, since it has been a while since alpha3 was released and also in order to have something stable to port modules from during the various sprints/sessions at BADCamp. There are still some remaining issues we'd like to see in alpha4, so help is always appreciated.

Notable Commits

The best of git log --since "1 week ago" --pretty=oneline (123 commits in total):

  • Issue #2109601 by tim.plunkett, benjy: Update MAINTAINERS.txt for block.module - great to have more maintainers come on board. Trivia Question - benjy brings the total number of Australian based core maintainers to four. Can you name them?
  • Issue #1757452 by amateescu, Xano, chx: Support config entities in entity reference fields. This opens up more data-modelling opportunities for Drupal.
  • Issue #2004626 by plach, kfritsche, vijaycs85, Pancho, penyaskito: Make non-configurable field translation settings available in the content language settings, where built-in properties get support for translatability configuration as well. This is currently only implemented for node titles, but will expand to other node properties very easily soon. Still more to go in entity translation, but this was an important step.

Important API and data model changes

Commits that stabilize our API and data model, moving us closer to a Drupal 8 beta.

  • Issue #1589176 by nod_: Fixed Use data-* to store #states api informations
  • Issue #2101709 by tstoeckler: Remove the bundle_prefix concept from the entity
  • Issue #2095399 by Berdir: Merge DatabaseStorageController and DatabaseStorageControllerNG.

Hardening our APIs

One of the primary use cases for the CMI initiative is to allow export and import of configuration. Until recently we were testing these in isolation or for a single site, but now, thanks to some lateral thinking by chx, we have test coverage for exporting and importing between two distinct installs. We also now have UUIDs on all default configuration, which means the migrate initiative (IMP) is now unblocked.

  • Issue #2106171 by chx, webchick: Write tests for configuration deployment scenarios.
  • Issue #1969800 by swentel, mtift, tayzlor, chx, xjm, herom: Add UUIDs to default configuration.
  • Issue #2023563 by Berdir, smiletrl, fago: Convert entity field types to the field_type plugin.

User experience (UX) improvements

Drupal 8 makes great strides in improving the user experience for administrators, site builders, content authors, and end users. Here's a sampling of some of the small fixes that are adding up to a big UX boost:

  • Issue #2072533 by swentel: Group together 'Decimal', 'Float' and 'Integer' in the field type drop-down menu.
  • Issue #1851414 by nod_, quicksketch, frega, dawehner, damiankloip: Convert Views to use the abstracted dialog modal. A big win - now all of our dialogs use the same API, so any accessibility (a11y) or other issues only need to be fixed in one place.

Drupal 8 Around the Interwebs

Blog posts about Drupal 8 and other related lint.

Drupal 8 in "Real Life"

  • October 24-27: BADCamp 2013 is next week! The four-day event includes lots of great Drupal 8 content for site builders, module authors, and core contributors, plus a full-day core summit on October 27. Also don't miss the extended sprints which run from October 21 all the way through to October 29. The teams for Views, Configuration, Multilingual, Web Services, Documentation, Usability, Media, and Twig all need your help to make Drupal 8 awesome! Sign up now.

Other important dates in the next couple of weeks

  • November 1: Session submissions close for DrupalSouth, Asia Pacific's biggest Drupal conference - get your sessions in.
  • November 7th, 6-9pm (GMT+11): Drupal 8 Tour writing sprint in Sydney, Australia. See the event page for more details.

Keep watching the queues!
