Oct 19 2017

Salesforce Suite is a group of modules for Drupal that allows for pulling data from Salesforce into Drupal, as well as pushing data from Drupal to Salesforce. The module API provides some very useful hooks, including the hook_salesforce_pull_entity_presave hook invoked by the Salesforce Pull module. In this blog post, we’ll look at using that hook to pull three Salesforce custom fields (select lists) into Drupal as taxonomy terms in three vocabularies.

Create a custom module called <sitename>_salesforce to house the hook, and create a <sitename>_salesforce.module file. In that file, drop in the presave function, copied from salesforce.api.php in the Salesforce Suite module:

/**
 * Act on an entity just before it is saved by a salesforce pull operation.
 * Implementations should throw a SalesforcePullException to prevent the pull.
 *
 * @param $entity
 *   The Drupal entity object.
 * @param array $sf_object
 *   The Salesforce query result array.
 * @param SalesforceMapping $sf_mapping
 *   The Salesforce Mapping being used to pull this record
 *
 * @throws SalesforcePullException
 */
function hook_salesforce_pull_entity_presave($entity, $sf_object, $sf_mapping) {
  if (!some_entity_validation_mechanism($entity)) {
    throw new SalesforcePullException('Refused to pull invalid entity.');
  }
  // Set a fictional property using a fictional Salesforce result object.
  $entity->example_property = $sf_object['Lookup__r']['Data__c'];
}

Rename the hook_ prefix to <sitename>_salesforce_ so Drupal picks up your implementation, then take a look at the example code in the function body and remove it.

The hook gets called during the salesforce_pull_process_records function with this line:

// Allow modules to react just prior to entity save.
module_invoke_all('salesforce_pull_entity_presave', $wrapper->value(), $sf_object, $sf_mapping);

So, that’s where we will intervene with our custom code. With this hook, we have access to the data queried from Salesforce, and the entity that is about to be saved into Drupal, so it's a perfect time to do any translations between the two data sets.

The first problem we have to address is that, by default, the Salesforce Pull module will create a new node as it processes each Salesforce record instead of modifying the existing nodes on your Drupal site. If you don’t want this behavior, add this code:

// first of all, don't create new nodes
if (isset($entity->is_new) && $entity->is_new == TRUE) {
  throw new SalesforcePullException('Tried to create a new node.');
}

You may also want to look at the hook_salesforce_pull_mapping_object_alter hook to help pre-match existing nodes to incoming Salesforce records.
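For instance, here is a rough, hypothetical sketch of such an implementation. The hook signature, the field_legacy_id field, and the Legacy_Id__c Salesforce field are all assumptions for illustration only; confirm the real signature against salesforce.api.php in your version of the module.

/**
 * Implements hook_salesforce_pull_mapping_object_alter().
 *
 * Hypothetical sketch: point an incoming Salesforce record at an existing
 * node, matched on a shared legacy ID, instead of letting a new one be created.
 */
function <sitename>_salesforce_salesforce_pull_mapping_object_alter(&$mapping_object, $sf_object, $sf_mapping) {
  // Assumption: a field_legacy_id field in Drupal mirrors Legacy_Id__c in Salesforce.
  $nid = db_query('SELECT entity_id FROM {field_data_field_legacy_id} WHERE field_legacy_id_value = :value',
    array(':value' => $sf_object['Legacy_Id__c']))->fetchField();
  if ($nid) {
    $mapping_object->entity_id = $nid;
    $mapping_object->entity_type = 'node';
  }
}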

Then, we need to define our taxonomy vocabularies:

// lookup table
$names_vids = array(
  'exampleVocabularyA' => array('vid' => 1, 'field' => 'field_example_vocabulary_a'),
  'exampleVocabularyB' => array('vid' => 2, 'field' => 'field_example_vocabulary_b'),
  'exampleVocabularyC' => array('vid' => 3, 'field' => 'field_example_vocabulary_c'),
);

Gather the terms from $sf_object like this:

// Gather terms, guarding against Salesforce fields that are missing or empty.
$incoming = array(
  'exampleVocabularyA' => explode(';', !empty($sf_object['Terms_A__c']) ? $sf_object['Terms_A__c'] : ''),
  'exampleVocabularyB' => explode(';', !empty($sf_object['Terms_B__c']) ? $sf_object['Terms_B__c'] : ''),
  'exampleVocabularyC' => explode(';', !empty($sf_object['Terms_C__c']) ? $sf_object['Terms_C__c'] : ''),
);

You’ll want to clean up the incoming data. Note that array_walk_recursive() discards trim()’s return value, so we need to trim by reference:

array_walk_recursive($incoming, function (&$value) {
  $value = trim($value);
});
$incoming = array_map('array_filter', $incoming);

Then, we need to iterate over the incoming terms and create a new term if it doesn’t already exist in Drupal. Finally, we set the tids on the desired nodes:

foreach ($incoming as $vname => $term_names) {
  $tids = array();
  foreach ($term_names as $term_name) {
    $terms = taxonomy_get_term_by_name($term_name, $vname);
    if (empty($terms)) {
      // Add the term if we don't already have it.
      $newterm = new stdClass();
      $newterm->name = $term_name;
      $newterm->vid = $names_vids[$vname]['vid'];
      taxonomy_term_save($newterm);
      $tid = $newterm->tid;
    }
    else {
      // taxonomy_get_term_by_name() returns an array of term objects keyed by tid.
      $tid = array_keys($terms)[0];
    }
    $tids[] = $tid;
  }
  // Set tids on the target field, first unsetting all existing values.
  $field = $names_vids[$vname]['field'];
  $entity->{$field} = array();
  foreach ($tids as $i => $tid) {
    $entity->{$field}[LANGUAGE_NONE][$i]['tid'] = $tid;
  }
}

This will keep your Drupal nodes in sync (on each cron run) with any terms added or deleted on the Salesforce objects.

(If you are having trouble getting Salesforce to complete its whole queue on a single cron run, I recommend this blog post for troubleshooting tips: Drupal Salesforce Not Updating Records. In particular, we recommend the Queue UI module.)

Oct 19 2017

Mediacurrent’s commitment to innovation has been clear since our founding ten years ago. From helping spearhead the Atlanta Drupal community in our early days, to architecting the world’s highest-trafficked Drupal site, to building and open sourcing a Drupal 8 theme generator, Mediacurrent has lived up to its pledge to give team members internal time to work on their own initiatives, and to give back to the wider open source community in diverse and innovative ways. The choice to create a new Labs department and a Director of Emerging Technology role flows from that commitment, but also aims to go further. So what will these changes entail, and how will they help ensure the success of our clients and of Drupal?

More and more, we are finding that staying on the cutting edge of building with Drupal means also keeping up with where non-Drupal technologies are heading. Our clients are increasingly moving from traditional websites, through mobile-first thinking, and towards truly multi-channel technology experiences. Our ability to serve their needs as an agency, and Drupal’s ability to serve their needs as a piece of software, requires new ways of thinking and growing. Mediacurrent Labs is a formalization of our commitment to doing just that.

What does all this mean in practice? First and foremost, it means encouraging experimentation with new technologies within Mediacurrent, and contributing the lessons learned (and hopefully some useful tools as well) back to the open source community. The work I’ve been doing on Contenta Angular falls nicely into this category, as does our recently featured work on blockchains, Alexa skills and TypeScript integrations with Drupal behaviors. When these things are driven by a client’s needs, we also explore how we can generalize what we’ve learned and apply it elsewhere; perhaps by building tools to make the process simpler next time, or by documenting publicly our best practices. And when we’re excited about a new technology’s possibilities but don’t have a specific client asking for integration (yet), we can use internal proof-of-concepts to educate ourselves, then share what we learn with the world.

Speaking of sharing, our commitment isn’t just to fostering Mediacurrent’s thought leadership within the Drupal world; we are also committed to elevating outside voices of innovation. Being co-lead organizer of the new conference that debuted in New York City this August, Decoupled Dev Days, is a great example of how we help foster the momentum around rethinking Drupal’s future over the next decade. The conference, which was dreamt up less than four months ago at DrupalCon Baltimore, brought together speakers from seven Drupal agencies, as well as core members of two major JavaScript frameworks!

This is just a taste of what we are setting out to achieve. As we consider Labs projects in the coming months we want to hear about the emerging technologies you are interested in. Should we be exploring virtual reality or augmented reality integrations with Drupal? Maybe Drupal-backed IoT beacons are the next big thing? Or is there a good chance that blockchain technology will completely upend the way we think about Drupal today? We don’t have all those answers, but we are committed to exploring the questions collaboratively with our clients and community.
 

Additional Resources
5 Thoughts About Blockchain Technology | Video
Integrating Amazon Alexa with a Drupal 8 Site | Blog
Rinnai | Case Study

Oct 19 2017

Filtering content by date, and in particular by year, is a common requirement. So how do you filter a view's results by year, based on a date field? One immediate solution uses the Search API module coupled with Facets: the latter makes it very easy to add a facet to a view, based on a date field of a content type, and to choose the granularity (year, month, day) exposed to visitors. But if you don't already have these two modules installed for other reasons, it would be a shame to add them just for this. We can achieve the same result fairly quickly with a native Views option: the contextual filter. Let's walk through it in a few images.

Creating the content view to be filtered by year

Let's say we have a content type named bulletin, which has a Date field, and we want to filter by year. To do this, we design a view that lists all bulletin content. Here is the view's general, very standard, configuration.

View bulletin general configuration

The view's settings include a Contextual filters section. Contextual filters can be added and configured so that the view's results are filtered by values supplied at runtime: from the view's URL, from a specific context such as a particular field of the content currently being displayed (if the view is shown alongside a node), from a query parameter, and so on. The possibilities are nearly endless, and we can even supply our own logic very easily by implementing a Views plugin of type @ViewsArgumentDefault. Finally, contextual filters are created against a particular field of the content the view displays.
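As an aside, here is a minimal sketch of such a plugin, one that would supply the current year as the default argument. The module name, plugin ID and class are invented for illustration; the class would live in src/Plugin/views/argument_default/CurrentYear.php.

namespace Drupal\my_module\Plugin\views\argument_default;

use Drupal\views\Plugin\views\argument_default\ArgumentDefaultPluginBase;

/**
 * Default argument plugin that supplies the current year.
 *
 * @ViewsArgumentDefault(
 *   id = "current_year",
 *   title = @Translation("Current year")
 * )
 */
class CurrentYear extends ArgumentDefaultPluginBase {

  /**
   * {@inheritdoc}
   */
  public function getArgument() {
    // Views uses this value when no argument is supplied.
    return date('Y');
  }

}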

Adding and configuring the contextual argument

We will add a contextual filter to our view, using the somewhat special field types that expose aggregate values of Date fields. Click the button to add a contextual filter.

Add contextual filter

Then look for the Date field (field_date) of the bulletin content type in its per-year aggregated form (Date in the form YYYY).

After validating our choice, we get the control panel of our contextual filter.

settings contextual filter

Since we want to filter our content from a query parameter, we will configure the panel When the filter value is not in the URL. Note that if we had wanted to filter directly from the URL path (for example, a URL like /bulletins/2017), there would be almost nothing to configure.

We choose the Provide default value option and select Query parameter as the default value type. We give our parameter a name (year) and a fallback value to use when the query parameter is absent (all). We also set an exception value (all) which, when received by our query parameter, makes Views ignore the argument and therefore show all results.

All we have to do is save and test our view.

view filtered by year

And we simply add to the URL the parameter ?year=2017 to filter the contents of the year 2017.

Adding a Custom Exposed Filter

All we have to do now is add a select list in the exposed filters of the view to offer the visitor an interface to select the desired year.

You may have noticed in the general view configuration that the Tags field was added as a filter criterion and exposed. That was deliberate: it gives us an exposed filter form that is already operational, to which we only need to add our custom option for the years.

To do this, we create a small module, which we call my_module, and we will alter this form.

use Drupal\Core\Form\FormStateInterface;
use Drupal\Core\StringTranslation\TranslatableMarkup;

/**
 * Implements hook_form_FORM_ID_alter().
 */
function my_module_form_views_exposed_form_alter(&$form, FormStateInterface $form_state, $form_id) {
  if (isset($form['#id']) && $form['#id'] == 'views-exposed-form-bulletins-page') {
    $options = [
      'all' => t('- All -'),
      '2015' => '2015',
      '2016' => '2016',
      '2017' => '2017',
    ];

    $form['year'] = [
      '#title' => new TranslatableMarkup('By year'),
      '#type' => 'select',
      '#options' => $options,
      '#size' => NULL,
      '#default_value' => 'all',
    ];
  }
}

This adds, to the exposed form of the view whose ID is bulletins and display ID is page, a simple select element whose name, year, matches the query parameter configured in the view's contextual filter. We give it a few options, hardcoded here, covering the available years, plus of course our exception value all so that all results can be returned unfiltered.

Filter year added

There you go. With a minimum of code, we have a simple view that filters content by date at a yearly granularity, a granularity we can change at will by editing the view's contextual filter if needed.

Make filter options dynamic

The options we hardcoded are unlikely to hold up over time. What happens in 2018, do we change the code? Let's improve our view a bit and make the years available in the select list dynamic.

From the same exposed-form alter, we will simply query all bulletin content, retrieve the dates, and offer the distinct available years. Because this type of query can be costly for a simple select list, we will use the Drupal Cache API to cache the results rather than recalculating them on every page load.

Let's improve the snippet seen above.

use Drupal\Core\Cache\CacheBackendInterface;
use Drupal\Core\Datetime\DrupalDateTime;
use Drupal\node\Entity\Node;

/**
 * Implements hook_form_FORM_ID_alter().
 */
function my_module_form_views_exposed_form_alter(&$form, FormStateInterface $form_state, $form_id) {
  if (isset($form['#id']) && $form['#id'] == 'views-exposed-form-bulletins-page') {

    $options = &drupal_static(__FUNCTION__);
    if (is_null($options)) {
      $cid = 'my_module:bulletin:year';
      $data = \Drupal::cache()->get($cid);
      if (!$data) {
        $options = [];
        $options['all'] = new TranslatableMarkup('- All -');
        $query = \Drupal::entityQuery('node');
        $query->condition('type', 'bulletin')
          ->condition('status', 1)
          ->sort('field_date', 'ASC');
        $result = $query->execute();
        if ($result) {
          $nodes = Node::loadMultiple($result);
          foreach ($nodes as $node) {
            $date = $node->field_date->value;
            if ($date) {
              $date = new DrupalDateTime($date, new DateTimeZone('UTC'));
              $year = $date->format('Y');
              if (!isset($options[$year])) {
                $options[$year] = $year;
              }
            }
          }
        }

        $cache_tags = ['node:bulletin:year'];
        \Drupal::cache()->set($cid, $options, CacheBackendInterface::CACHE_PERMANENT, $cache_tags);
      }
      else {
        $options = $data->data;
      }

    }
    
    $form['year'] = [
      '#title' => new TranslatableMarkup('By year'),
      '#type' => 'select',
      '#options' => $options,
      '#size' => NULL,
      '#default_value' => 'all',
    ];
    
  }
}

The available-year options are thus computed once, then fetched directly from Drupal's cache. We took care to attach a custom cache tag, node:bulletin:year, to the cached data so we can invalidate it exactly when necessary. Indeed, we cache the options permanently, and we only need to invalidate them if a bulletin is created or modified with a date whose year is not yet present in the cached options.

Cache Invalidation

With cache tags, nothing could be simpler for a Drupal 8 developer. Let's look at the snippet that takes care of invalidating the cached options.

use Drupal\Core\Entity\EntityInterface;
use Drupal\Core\Datetime\DrupalDateTime;
use Drupal\Core\Cache\Cache;

/**
 * Implements hook_ENTITY_TYPE_presave().
 */
function my_module_node_presave(EntityInterface $entity) {
  $bundle = $entity->bundle();
  if ($bundle == 'bulletin') {
    // Check whether an updated bulletin introduces a new year, and invalidate
    // the cached options used by the custom views filter for filtering by year.
    $cid = 'my_module:bulletin:year';
    $data = \Drupal::cache()->get($cid);
    if ($data) {
      $options = $data->data;
      $date = $entity->field_date->value;
      if ($date) {
        $date = new DrupalDateTime($date, new DateTimeZone('UTC'));
        $year = $date->format('Y');
        if (!isset($options[$year])) {
          Cache::invalidateTags(['node:bulletin:year']);
        }
      }
    }
  }
}

Each time a bulletin node is saved, we retrieve the cached options ($data) and compare the year of the date being saved against the cached values. If the year is missing, we simply invalidate the custom cache tag (node:bulletin:year) associated with the cached options. The next time the bulletins view is loaded, the options are recalculated with the new year and cached again for an indefinite period, in practice probably about a year.

As a conclusion

The contextual filters of the Views module, available in Drupal 8 core, offer a considerable range of possibilities covering a wide variety of needs. And when they are not enough, a simple plugin implementation of a few lines lets us insert our own business logic into the Views machinery in a robust and maintainable way, combining the solidity of Views with the specifics of a project. Finally, for a need that looks simple (filtering by year) but is less obvious than it appears (working with dates is always a source of surprises), these tools let us handle the subject simply, without resorting to heavier machinery. And often, the simplest solutions are the most effective. Why make it complicated when it can be simple?

Oct 18 2017

At the 2017 Acquia Engage Conference, our enterprise web project for Pinnacol Assurance won the Financial Services category. As an Acquia Preferred Partner and sponsor of the Acquia Engage conference, we are thrilled to have been ranked beside the world's top Drupal websites. 

Pinnacol.com launched in December of 2016. As Colorado’s leading workers’ compensation insurer, Pinnacol Assurance needed a Drupal website design that reflected the company’s commitment to first-class service.

Built on Drupal 8, the new site delivered a brand new enterprise content management system, a one-of-a-kind knowledge hub, and a revamped user experience.

“The Pinnacol project was lengthy and complex, we had very specific problems that required creative solutions. Elevated Third developed a high performing, enterprise-level website that continues to exceed our expectations.”

- Hilary Miller, Brand and Marketing Director, Pinnacol Assurance

This year, five of our Drupal websites were finalists in the annual Acquia Engage competition. Partners and customers submitted more than 200 nominations across 17 categories to the program. Our Drupal work ranked in the following categories. 

Powdr Corporation, Digital Experience Finalist

Denver Botanic Gardens, Nonprofit Finalist

Comcast Technology Solutions, Brand Experience Finalist

Firewise USA, Community Finalist

Pinnacol Assurance, Financial Services Winner

“Winning sites set themselves apart in how they grabbed our attention and made us want to learn more,” said CMSWire’s Dom Nicastro, one of the award program jurors. “The first thing I looked for were search and commerce capabilities. It's a Google and Amazon world that we live in. No one comes to a website just for a pretty design, and no one remembers a red call-to-action button versus a blue one. Sites that deliver excellent search and easy transactional experiences won for me.”

Congratulations to all the 2017 Acquia Engage winners!

Oct 18 2017

As part of my new role as Agile Consultant with Amazee Labs Zurich, I'm running a global survey to assess agile practices in our industry. Anyone working in an agency environment is welcome to fill out the survey!

Do you, or does your agency, work using defined agile methodologies such as Scrum and/or Kanban? How do you fit theory into practice when working for different clients with different levels of understanding of Agile practices at the same time?

Thank you for taking the survey before October 31. I’m looking forward to reporting the findings in an upcoming blog post.

Oct 18 2017

We've stepped into the last quarter of the year, but there is still much going on in the Drupal community. We've made a list of DrupalCamps and summits that you can still attend. Drupal events bring together Drupal developers, themers, end users and those interested in learning more about Drupal for talks, sessions and collaborative discussions.

Drupal Summit Tokyo 

Fukurasia Shinagawa Crystal Square, Tokyo, Japan
19. October 2017 9:00-19:00
The largest Drupal event in Japan will host more than 15 strategy and technical sessions, starting with a session by a former digital director of the White House.
http://drupal-summit.tokyo/

Cornell DrupalCamp 2017

ILR Conference Centre, Ithaca, NY, United States
20. October 2017
The day begins with the keynote speech "Accessibility in Drupal: Using Your Powers for Good," with Kelsey Hall and Joshua Pearson from UMass Amherst. 
After the keynote, you can choose from over thirty sessions in tracks for beginners and advanced users.
https://camp.drupal.cornell.edu/

Drupal Camp Dublin 2017

Guinness Enterprise Center, Dublin, Ireland
20-21. October 2017
There will be around 20 sessions about Drupal commerce, headless Drupal, performance, project management and others. Dublin is known for great social life so Drupal camp organizers are preparing something for evenings too!
http://2017.drupal.ie/


DrupalCamp Schwerin 2017

Das Haus am See, Schwerin, Germany
21-22. October 2017
Six years after the Drupal-Day in Rostock, the Drupal Usergroup MV is organizing a DrupalCamp in Schwerin, the tranquil and beautiful capital of the German state of Mecklenburg-Vorpommern.
The main focus of the program is Drupal in government and government-related organizations. Other hot topics will be website performance and user experience.
https://dcsn17.drupalcamp.de/

DrupalCamp Baltics 2017 Vilnius

Crowne Plaza, Vilnius, Lithuania
27. October 2017 9:00-23:55
More than 20 sessions and workshops will engage attendees from the Baltics and beyond. The conference will offer three parallel session tracks: one dedicated to Drupal technical novelties, another to practical knowledge, and the last to case studies and business development trends.
http://drupalcampbaltics.com/

Drupalcamp Lannion 2017

Espace Saint-Anne, Lannion, France
27-29. October 2017
Last year in Nantes, this year in Lannion.
Friday and Saturday are dedicated to sessions (you can expect around 20 of them) and discussions (also known as BoFs). Sunday is devoted to workshops and sprints.
https://lannion2017.drupalcamp.bzh/

Drupal Picchu 2017

Epis-Fia - Andean University of Cusco, Cusco, Peru
30. October - 3. November 2017
DrupalPicchu 2017 will be an international Drupal community event taking place in Latin America. There are 16 selected sessions, some of which will be in English.
http://picchu2017.drupal.lat/

Drupal Camp Atlanta 2017

Doubletree - Buckhead, Atlanta, GA, United States
3-4. November 2017
This conference comprises multiple concurrent sessions, with tracks ranging from beginner, design/theming and usability, development and performance, site building, business leadership, and education. In addition, there are impromptu birds-of-a-feather (BoF) sessions and code sprints, all concluding with a networking reception.
https://www.drupalcampatlanta.com/

NWDUG Uncon 2017

MadLab, Manchester, United Kingdom
4. November 2017
The focus of this unconference is the Drupal Content Management Framework; sessions on supporting subjects including databases, frameworks, security, UX, front-end development and design are also encouraged.
https://uncon.nwdrupal.org.uk/

Drupal Day 2017

El Garaje 2.0, Caceres, Spain
11. November 2017
The headquarters of Drupal Day will be in Garage 2.0, and the event will take place on Friday and Sunday. With more than 20 sessions, attendees will be able to find what they would like to learn more about.
https://2017.drupalday.es

Drupalcamp Oslo 2017

Oslo, Norway
11-12. November 2017
The Drupal community in Norway organizes a DrupalCamp annually, and for 2017 they are preparing the event on 11 and 12 November. More information to follow, but do reserve the date!
https://oslo2017.drupalcamp.no/

DrupalCamp Tunis 2017

AUTechnopark Elgazala, Tunis, Tunisia
11-12. November 2017
Drupalcamp Tunis 2017 is an event organized by the company Emerya in collaboration with the association Drupal Tunisia.
http://drupalcamptunis2017.drupal.tn/

DrupalSouth Auckland

Q Theatre, Auckland, New Zealand, AUK
16-17. November 2017
The biggest Drupal conference for Australia and New Zealand comes to Auckland for the first time. This event will deliver 2 days of classic DrupalSouth action, spiced up with a dash of Auckland-style fun and social collaboration. The conference will take place at Q Theatre, located near Auckland's Aotea Square and in the heart of Auckland city. 
https://drupalsouth2017.drupal.org.nz/

NEDCAMP - New England Drupal Camp 2017

College of the Holy Cross, Worcester, MA, United States
17-18. November 2017
Friday trainings include Drupal 8 multilingual websites, Drupal 8 theming quickstart, Drupal basics, and writing your first Drupal 8 module. On Saturday there will be 20 sessions.
https://nedcamp.org/

Drupalcamp Sao Paulo 2017

Mackenzie, Sao Paulo, Brazil
24-25. November 2017
This will be the third edition of the largest Drupal event in Brazil, whose main mission is to promote and disseminate the use of the technology in the country and stimulate the exchange of knowledge and experience among participants. The Drupal community in Brazil is made up of developers and companies working with the technology, along with supporters and sympathizers of free software. For this year's event, around 300 attendees are expected, with about 30 sessions on various themes and levels, and 2 keynote speakers.
https://groups.drupal.org/node/517451

Drupalcamp Cebu 2017

CIT University, Cebu City, Philippines
25. November 2017 09:00 - 18:00
This is a one-day conference about Drupal and web development. This year we are hosted by CIT University. Sessions will cover Drupal, general web development, design, DevOps, and business.
https://groups.drupal.org/node/517132

Oct 18 2017
Import data from a CSV file into Drupal 8 with the Content Import module

Sometimes you would like to import a huge volume of data from a CSV file into Drupal. Maybe from another CMS. Maybe from a spreadsheet. But there is no such functionality in Drupal 8 core. 

To import your data from a CSV file, you need to install and enable the contributed module "Content Import". In this tutorial, you are going to import five content items of the content type Customer.

The Customer content type will have the following five fields:

  • Title (Plain text): The name of the customer; this is the default Drupal title field for each content type
  • Body (Long text): Customer information
  • Contract Date (Date, in the mm/dd/yy format): The date when the contract with the customer was signed
  • Customer Picture (Image): The picture of the customer
  • Discount (Boolean, Yes/No): This field indicates if the customer qualifies for an extra discount at the end of the year


Step #1. Create the Customer content type

After creating the Customer content type and adding fields, you will have the following starting point:

Starting point

There are some details that you have to take into account to run this process without complications:

  • The date field has to be set as Date and time (this is related to the Unix timestamp. You can read more about this here).

Setting the Date Type

  • When creating the Customer Picture image field, configure the file directory for the images as [MACHINENAMEOFYOURCONTENTTYPE/images]. You’ll upload your images to this directory with the help of the IMCE module or some kind of FTP software.

Configure the file directory for the images

  • Set the On and Off labels in your Discount boolean field to Yes and No respectively

Set the On and Off labels

Step #2. Prepare your spreadsheet for import

You can use the spreadsheet application of your liking for this. I’m using Google Spreadsheets.

Google Spreadsheets

  • Save your spreadsheet as a Comma Separated Values file. Once you do, your file will have a .csv file extension.

Save file in CSV format
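For illustration, the saved file might contain rows like the ones below. These column headers are an assumption for this example; the Content Import module expects headers that match your content type's field machine names, so check the module's documentation against your actual fields.

title,body,field_contract_date,field_customer_picture,field_discount
"Customer One","Customer One information","05/12/17","customer1.jpg","Yes"
"Customer Two","Customer Two information","08/30/17","customer2.jpg","No"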

Step #3. Upload the profile pictures to the specified directory

With the IMCE module:

  • Go to yoursite/imce in order to open the IMCE browser
  • Create the /customer folder inside the public directory
  • Create the /images sub-folder inside the /customer folder
  • Upload the profile pictures to this folder by clicking the Upload button on the top (/customer/images folder in our case)

New folder

Upload button

Uploaded profile pictures

Step #4. Import your content from the CSV file

  • Click Configuration > Content Authoring > Content Import

You will see a screen with two options.

  • Select the Customer option for the Select Content Type
  • Click on the Upload File button
  • Select the CSV file on your hard drive
  • Click Import

Click Import

Congratulations! If you followed along with my instructions, you should now see the Content screen of your Drupal installation with your newly imported content. 

Troubleshooting

While importing content, you may run into the following error on the white page:

"The website encountered an unexpected error. Please try again later."

To deal with this error, please do the following:

  • In your Drupal site root go to modules > contentimport > src > form
  • Open the contentImport.php file in your text or code editor
  • Find the following two lines (around line 275):

$dateTime = \DateTime::createFromFormat('Y-m-d h:i:s', $data[$keyIndex[$fieldNames[$f]]]);
$newDateString = $dateTime->format('Y-m-d\Th:i:s');

Replace them with the following two lines:

// Parse the source date with strtotime() so common formats work, and use an
// uppercase 'H' so the time is formatted in 24-hour notation.
$dateTimeStamp = strtotime($data[$keyIndex[$fieldNames[$f]]]);
$newDateString = date('Y-m-d\TH:i:s', $dateTimeStamp);

You’ll find more information about this error here.

I hope you enjoyed this tutorial.


About the author

Jorge has lived in Ecuador and Germany. Now he is back in his homeland, Colombia. He spends his time translating from English and German into Spanish. He enjoys playing with Drupal and other open source content management systems and technologies.
Oct 18 2017

A checkout is a pretty fundamental part of a commerce system. So the fact that Commerce 2.x has a checkout is not really news. But it’s what you can do with the checkout that makes 2.x special.

You can now configure the checkout workflow. You can opt to ask for billing information, shipping information, certificates, registration details, etc. There’s lots of different data that can change depending on the type of product you sell. If you sell digital products, for instance, you don’t need shipping information. If you sell course registrations, you might require pre-existing certificates. Maybe you do both, so you need to configure multiple types of checkouts.

And that’s easy to do. For the most part, it’s a matter of dragging and dropping options. You can add or remove pieces pretty easily. If you need something really custom, like if you need to validate a safety certificate against a third party, you might need a developer to build that functionality. But otherwise it’s a fairly simple process.

You can also integrate into any part of the checkout. Maybe you do something when you add to cart, or when you complete the order. Maybe you even go off-site to pay through PayPal or register through Eventbrite and then come back. You can hook into any step you need in order to get those things done.
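As a rough sketch of what that extension point can look like in code, here is a hypothetical custom checkout pane built on Commerce 2.x's @CommerceCheckoutPane plugin type. The plugin ID, class and field names are invented for illustration, and the base-class method signatures should be confirmed against your Commerce version.

namespace Drupal\my_module\Plugin\Commerce\CheckoutPane;

use Drupal\commerce_checkout\Plugin\Commerce\CheckoutPane\CheckoutPaneBase;
use Drupal\Core\Form\FormStateInterface;

/**
 * Collects a certificate number during checkout (hypothetical example).
 *
 * @CommerceCheckoutPane(
 *   id = "certificate_number",
 *   label = @Translation("Certificate number"),
 *   default_step = "order_information",
 * )
 */
class CertificateNumber extends CheckoutPaneBase {

  /**
   * {@inheritdoc}
   */
  public function buildPaneForm(array $pane_form, FormStateInterface $form_state, array &$complete_form) {
    $pane_form['number'] = [
      '#type' => 'textfield',
      '#title' => $this->t('Safety certificate number'),
      '#required' => TRUE,
    ];
    return $pane_form;
  }

  /**
   * {@inheritdoc}
   */
  public function submitPaneForm(array &$pane_form, FormStateInterface $form_state, array &$complete_form) {
    // Store the submitted value on the order for later validation/processing.
    $values = $form_state->getValue($pane_form['#parents']);
    $this->order->setData('certificate_number', $values['number']);
  }

}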

Commerce 2.x also has a more modern checkout out of the box than Commerce 1 had, with billing information on the side, and a floating cart rather than a series of pages you go through, and all those sorts of best practices. It’s a nice update from Commerce 1.

In the end, it’s all the customizations that make the checkout in Commerce 2.x new and cool.

Oct 17 2017

In about a month, it'll be 2 years since Drupal 8.0.0 was released. Drupal 8 has come a long way since then, especially with Drupal 8.4.0 released two weeks ago, which is the most feature-packed release yet.

Drupal 8 is the future of Drupal. It's awesome.

However, looking at all the blogs and articles and podcasts in the Drupalsphere, we're sending a message that you should only build new sites on Drupal 8.

The common wisdom is that starting a new project on Drupal 7 is a dumb idea.

While I'm sure there's lots of people who are OK with that or even think that's the right message...

I strongly believe that we are hurting the Drupal project by sending that message.

Read more to find out why!

Drupal 8 isn't ready for everyone... and might not be for a while

The root of the problem is that, while Drupal 8 can do so much, it isn't a complete replacement for Drupal 7 yet.

I'm not just talking about features, and what modules are available, and community knowledge/ability, and tooling... Well, actually, I'm talking about all those things together. :-)

What made Drupal 7 (and earlier versions) great wasn't that core was awesome. In fact, while core was always made of well-written code, it was decidedly UNawesome all on its own.

The magic of Drupal was being able to quickly build an application with loads of features, without needing to write much code.

In fact, with Drupal 7, a site builder — even one who is incapable of writing code — could create a pretty advanced website (say, a Facebook clone?) just by combining modules (with an FTP client) and clicking their way through the admin screens in Drupal.

In Drupal 8, the bar is currently much higher. Working on a Drupal 8 site today, you'll need to write custom code or help port modules, because the contrib ecosystem is less mature.

But even if all the contrib modules were ported and had stable versions, you're now forced to do things that are good ideas, but previously were totally optional, for example: use composer, maintain a staging site, probably use Git and specialized hosting.

Don't force out the "non-techies"

Should the Drupal community "level up" and learn how to use composer, git, etc? Yeah, that'd be good. :-) But does it really make sense to require that?

Many very successful Drupal users are not developers or even technologists. They're librarians, or volunteers, or scientists, or government employees, etc., who do need the power of Drupal (i.e. Wix wouldn't work for them), but don't have the time or incentive to become "real techies."

Traditionally, Drupal democratized the ability to create advanced websites by empowering non-techies.

And they are valuable members of our community! They make many non-code contributions, not the least of which is making sure we create software usable by non-developers. And some do eventually make the leap to being developers, but they probably wouldn't have if the initial bar was too high and they couldn't make the transition gradually.

Let's not force these people out of our community.

There are some issues on Drupal.org looking at ways to eliminate or reduce these barriers. For example, looking at ways for site builders to use composer without having to learn composer. But I don't see those problems being fixed particularly soon.

The software adoption curve

Software goes through a predictable cycle where different groups of people adopt it at different times. At first, it's only picked up by innovators and early adopters, but eventually it's ready for mainstream usage. A lot happens in the process, and not all of it is related to code or the core product.

Many of the tools and processes that we use day-to-day as Drupalists didn't come from Drupal core, but the ecosystem around it (drush, Features, etc) and shared knowledge and best practices (tens of thousands of hours of experience gained by many different people and groups, and remixed via meetups, Drupalcamps, Drupalcons, blog posts).

Drupal went through this process as it was figuring out what it was, and as the community learned how to build sites with it. I'd say the transition to mainstream happened somewhere between Drupal 4.7 and Drupal 6, and continued to mature with Drupal 7.

While all major Drupal releases up until now made big changes, they were more iterative, and much of the way that Drupal worked and how our community used it to build sites remained the same.

Since Drupal 8 is nearly a complete rewrite which shook up much of what we know about developing Drupal sites, I'd argue that Drupal 8 is starting the software adoption curve all over again.

While there are some things that have remained the same, there is a huge amount of new stuff, much of it untested over the long term. Drupal 8 is ready for innovators and early adopters, and maybe some people in the middle, but it's not ready for all the same groups of people that can successfully adopt Drupal 7.

We're creating a false choice

By pushing Drupal 8 so hard, we're creating a false choice:

  1. Use Drupal 8 for your new site, or
  2. Wait for when Drupal 8 is capable of supporting you (... and while you're waiting, consider moving to other platforms!)

We do Drupal 6 Long-Term Support, and we've seen many people who loved their Drupal 6 sites move to non-Drupal platforms because they didn't see themselves capable of building a new Drupal 8 site, or paying for someone else to do it, or both.

But Drupal 7 is still awesome and isn't going anywhere any time soon!

Since Drupal 7 is still used by a million or so sites (the majority of Drupal sites), we're going to need to continue supporting Drupal 7 for a long time, whether that's in the form of official support from the Drupal project or as a Drupal 7 Long-Term Support effort.

Drupal 6 continues to be with us (via Drupal 6 Long-Term Support) almost 10 years after its initial release!

Honestly, I think Drupal 7 is going to be with us in a real way for another 8-10 years.

So, why not say it's OK to build new sites on Drupal 7?

By not saying it, we're pushing people to other platforms

I mentioned this in the last section, but it's worth reiterating:

When people see Drupal 8 as the only way they should build new sites, and it doesn't seem to work for them, they begin to consider other platforms.

I really, honestly believe that Drupal 8 will get to a point where it can support the same groups of people (including the non-techies) that Drupal 7 supported.

But to keep those people in our community until we get there, we need to be saying, as a community: "It's OK to build new sites on Drupal 7!"

Ok, I've said my piece. :-)

I'm sure this will be a controversial opinion and that many people will disagree. I look forward to discussing further in the comments below!

Do you agree with my argument? Or think I'm totally wrong? Please leave a comment below!

Oct 17 2017

The TWG coding standards committee is announcing two issues for final discussion. Feedback will be reviewed on 10/31/2017.

New issues for discussion:

Pending ratification

Provisionally approved issues

Interested in helping out?

You can get started quickly by helping us to update an issue summary or two or dive in and check out the full list of open proposals and see if there's anything you'd like to champion!

Oct 17 2017

In this edition of 3 Takeaways, our Business Development Strategist, Nelson Harris, reviews Drupal 8 and how the latest improvements help get more out of the box, leverage mobile, and upgrade smoothly.

 

[embedded content]

 

Hi, I’m Nelson Harris, Business Development Strategist at Elevated Third. A question I get a lot from people is “what’s new and interesting about Drupal 8, and why might I upgrade.” There are a lot of reasons why you might want to upgrade to Drupal 8 but I’m just going to list three of them.

Takeaway #1: First, you get more out of the box.

There are a lot of useful modules that have been built into Drupal 8 core: things like Views, multilingual support, a WYSIWYG editor, and more types of fields. This means you can spend less time configuring and installing modules, and more time working on your site.

Takeaway #2: Second of all, mobile is in its DNA.

Built-in themes are all responsive and adapt well to different screen sizes. Tables will scale, and the new admin toolbar is really good on mobile devices. Chances are, you’re probably watching this video on the screen of your mobile device right now, so you can imagine why mobile might be important.

Takeaway #3: Finally, it’s built to be more future proof.

Where an upgrade from 6 to 7 or 7 to 8 requires scrapping your codebase and starting all over from scratch, Drupal 8 is designed to go from 8 to 9 and 9 to 10 more seamlessly, more like an update patch as opposed to starting over. An investment in Drupal 8 really means that you're investing in your website, because it's going to be easier to upgrade in the future.

Oct 17 2017

Drupal Modules: The One Percent —Timelogin (video tutorial)

[embedded content]

Episode 40

Here is where we seek to bring awareness to Drupal modules running on less than 1% of reporting sites. Today we'll look at Timelogin, a module which restricts users, based on role, from logging in during certain times of the day.

Oct 17 2017

When going live with a big project, it is all about reassuring the client that the site will be able to handle all those excited visitors. To achieve that state of zen, it is paramount that you do a load test. The benefits of load tests go beyond peace of mind, however. For example, a load test enables you to spot issues that only happen under high load, or lets you find bottlenecks in the infrastructure setup. The added bonus is that you can bask in the glory of your high-performance code, on the condition the test doesn't fail, of course.

Need help with your load and performance testing?
Contact us 

When doing a load test it is important to do the following steps:

  • Analyse existing data
  • Prepare tests
  • Set up tools
  • Run the tests
  • Analyse the results

Analyse existing data

If you are in luck, you will already have historic data available from Google Analytics. If this isn't the case, you'll have to get in touch with your client and ask a few to-the-point questions to help you estimate all the important metrics that I'll be covering in this post.

A couple of tips I can give if you lack the historic data:

  • Ask if the client has a mailing list (digital or old-school) and how many people are on it
  • If you have made comparable sites in the past, look at their Google Analytics data
  • Ask the client how they are going to announce their new website
  • When you are working on an estimate, it is always better to add an extra 15% to it. Better safe than sorry!

The first thing you need to do is set a reference frame. Pick a date range that has low activity as well as the highest activity you can find. Then start putting that data into a spreadsheet, as pictured below:

An example spreadsheet for load testing. You can download an example copy of the file here.

The most important metrics we are going to calculate are:

  • Peak concurrent users (hourly sessions x average session duration / 3600)
  • Peak page views per second
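For example, with 3,600 peak hourly sessions and an average session duration of 180 seconds, you would size the test for 3600 x 180 / 3600 = 180 peak concurrent users.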

The values you need to find or estimate are:

  • Peak daily page views
  • Peak hourly page views
  • Total page views for the period
  • Peak hourly sessions
  • Total number of sessions
  • Average session duration in seconds

As you can see, we mainly focus on the peak activity, because you test with the worst-case scenario in mind, which is, funnily enough, usually the best-case scenario for your client.

Before we start preparing our test, it is also handy to check which pages receive the most traffic. This benefits the validity of your test scenario.

Prepare the tests

For our tests we are going to start out with Apache JMeter, which you can grab here.

With JMeter you can test many different applications/server/protocol types, but we’re going to use it to make a whole lot of HTTP requests.

Make sure you have the required Java runtime installed, then boot up the ApacheJMeter.jar file.

Adding and configuring a Thread Group

Start by adding a Thread Group to your test plan by right clicking your Test plan and selecting Add > Threads (Users) > Thread Group

Thread group

Eventually you will need to fill in the number of (concurrent) users and ramp-up period based on your analysis, but for now keep it low for debugging your test.

Adding and configuring User-Defined Variables

Then right click the thread group to add User Defined Variables (Add > Config Element > User Defined Variables).

Add two variables named url and protocol and assign them a value.

User-defined variables - examples

Using these user-defined variables makes it easy to choose another environment to test on. It avoids the painstaking and error-prone work of finding all references and changing them manually.

You can use these variables in input fields in your test by doing this: ${url} or ${protocol}

Adding and configuring HTTP config elements

 Next up, you need to add the following HTTP config elements to your thread group:

  • HTTP Request Defaults
  • HTTP Header Manager
  • HTTP Cookie Manager

On the first one, you use your variables to fill in the protocol and the server name.

On the second one, you can set default headers for each one of your requests. See the screenshot below for what I’ve put in default.

HTTP config elements

For the third one, you only select cookie policy: standard.

A simple page request sampler

Right-click your test again and add the HTTP request sampler (Add > Sampler > HTTP Request).

Here we are going to call the home page. The only things you need to set here are:

  • Method: GET
  • Path: /

We don’t fill in the protocol or server name because this is already covered by our HTTP Request Defaults.

Posting the contact form

In this one we are going to submit the contact form (which is located at www.example.com/contact), so add another HTTP Request like we did before. Now only fill in the following values:

  • Method: POST
  • Path: /contact
  • Follow redirects: True
  • Use KeepAlive: True

In order for Drupal to accept the submit, we need to add some parameters to our post, like this:

Contact form parameters

The important ones here are form_build_id and form_id. You can look up the form ID manually because it always stays the same. The form build ID can vary, so we need to extract it from the page. We'll do this using the CSS/JQuery Extractor (right-click your HTTP Request sampler: Add > Post Processors > CSS/JQuery Extractor).

Configure it like the screenshot below:

JQuery extractor example

It will now get that form_build_id from the page and put it into a variable the sampler can use.

Posting some Ajax on the form

Imagine our contact form has some Ajax functionality and we also want to test this. The way we go about it is identical to posting the regular form like we did before. The only difference is the post parameters, the path and an extra HTTP Header Manager.

You should set the path in your sampler to: /system/ajax

Then right click your sampler to add your new HTTP Header Manager (Add > Config Element > HTTP Header Manager). Configure it like shown in the screenshot:

adding Ajax - example
 
Saving the results of your test

Now that we've configured the samplers, we need to add some listeners. You can add listeners anywhere, but in our example we've added them to the test as a whole.

We’ll add three listeners:

  • View Results in Table:
    • Show every request in a table format
    • Handy for getting some metrics like latency and connect time
  • Simple Data Writer:
    • Writes test data to a file
    • Handy for debugging when using Blazemeter (check out this link)
    • Just load the file into the View Results Tree
  • View Results Tree:
    • It shows you the actual response and request.
    • Uses a lot of resources (so only good for debugging)

There is a lot more you can do with JMeter. You can read all about it here.


Test-run the test

Now that we've configured our test, it is time to try it out. So make sure not to put too many concurrent users in there. Just run the test by pressing the green 'Play' icon.

Test run

If you get errors, debug them using the feedback you got from your listeners.

As this wise man once said: "Improvise. Adapt. Overcome."

After you’ve validated your test, it’s always handy to turn up the concurrent users until your local site breaks. It’ll give you a quick idea of where a possible bottleneck could be.

Just a small warning: doing that load test on your local machine (running the test and the webserver) will take up a lot of resources and can give you skewed results.

You can download an example here.

Set up tools

Load testing with Blazemeter

When you have a project that will have a lot of concurrent users, your computer is most likely not able to handle making all those calls on its own, which is why it is good to test from a distributed setup like Blazemeter's.

You can have multiple computers running the same test with only a part of the concurrent users or you can pay for a service like Blazemeter.

The downside of using multiple computers is that they still share the same corporate Wi-Fi or Ethernet, possibly throttling you to an unknown lowest common denominator that could skew your test. On top of that, you will also have to aggregate all those results yourself, costing you precious time.

For us the major benefits of Blazemeter are the following:

  • Simulate a massive amount of concurrent users with little hassle
  • Persistence of test results and comparison between tests
  • Executive report to deliver to a technically savvy client
  • Sandbox mode tests that don’t count against your monthly testing quota

Adding your JMeter test in Blazemeter is very easy and straightforward. Just click ‘Create Test’ in the menu and select JMeter Test.

Blazemeter screenshot

Upload the file and you can start configuring your test to reflect the scenario from the analysis chapter. We suggest choosing to ‘Originate a load’ from a location closest to your target population.

Blazemeter - load test set-up screenshot

Before you run your test, it is important to have monitoring set up for the environment you want to test.

Monitoring performance

At Dropsolid, we like to use New Relic to monitor performance of our environments but you could also use open source tools like munin (http://munin-monitoring.org/).

The most important factors in your decision of monitoring tool should be:

  • Persistence of monitoring data
  • Detail of monitoring data
  • Ease of use

If you are using New Relic, we recommend installing both APM and Server. The added value of having APM is that you can quickly get an overview of possible bottlenecks in PHP and MySQL.

Run the test

Now that everything is set up, it is important to have an environment that is a perfect copy of your production environment. That way you can easily optimize your environment without having to wait for a good moment to restart your server.

Run your test, sit back and relax.

Analyse the results

If everything has gone according to plan, you should now have reports from both Blazemeter and New Relic.

Blazemeter report of a test with 854 concurrent users. New Relic monitoring during the same test.

If your server was able to handle the peak amount of users, then your job is done and you can inform the client that they can rest assured that it won’t go down.

If your server couldn’t handle it, it is time to compare the results from Blazemeter and New Relic to find out where your bottleneck is.

Common issues are the following:

  • Incorrect memory allocation between parts of the stack
  • Misconfiguration of your stack; for example, MySQL ships multiple example configuration files for different scenarios
  • Not using extra performance-enhancing services like Varnish, Memcache, Redis, ...
  • Horrible code

If the issue is horrible code, then use tools like xhprof or blackfire.io to profile your code.

Need expert help with your performance tests? Just get in touch!

Contact us for performance testing 


Final note

As Colin Powell once said: "There are no secrets to success. It is the result of preparation, hard work and learning from failure." That is exactly what we did here: we prepared our test thoroughly, we tested our script multiple times and adapted when it failed.

Oct 16 2017

Last week at DrupalCamp Quito, I presented an updated, Spanish-language version of my DrupalCon session. If you would like to view the presentation in English, you can find it on my DrupalCon blog post.

Object-oriented structures have replaced our beloved "hooks," which let us extend Drupal with new functionality without hacking core (or other contrib modules). But how does this work? In this talk we review how to extend a module to implement single sign-on (SSO), and in doing so we dive into how object-oriented programming works its magic in our modules, making them easier to write, understand, and debug. We also cover some of Drupal's design patterns, how to use event listeners, override routes, and other tools.

[embedded content]

Slides: https://goo.gl/QgskXw
Example code: https://github.com/arlina-espinoza/openid_fb

Oct 16 2017

This is part 3 of our series on developing a Decoupled Drupal Client Application with Ember. If you haven't yet read the previous articles, it would be best to review Part 1 first. In this article, we are going to clean up the code to remove the hardcoded URL for the host, move the login form to a separate page, and add a basic header and styling.

We currently define the host URL in both the adapter (app/adapters/application.js) used for the Ember Data REST calls and the AJAX service that we use for authentication (app/services/ajax.js). This is clearly not a good idea, but it helped us focus on the initial goal and our simple working app.

Oct 16 2017

Secure sites. HTTPS and SSL. A topic more and more site owners and maintainers are having to work with. For some, this is a great thing; for others it is either nerve-wracking or confusing. Luckily for us all, getting an SSL certificate and implementing full-site HTTPS is becoming easier.

Why does an SSL matter?

First, let's talk about why having an SSL certificate and wrapping your site in HTTPS matters. For instance, there are people who think it is fine to leave their e-commerce site on plain HTTP because their payment gateway is PayPal.

Beyond security, consider the fact that Google announced in 2014 that HTTPS would be used as a ranking signal. Three years ago, Google made this choice to push for a more secure web. According to the Internet, Bing has stayed away from this sort of decision. But if you (or your customer) care about SEO, I hope this helps make the case.

Google's search rankings are not your only worry. Chrome and Firefox are starting to alert users that a site is not secure when they fill in sensitive form data: passwords and credit card fields. The Google Security Blog announced the move last year, and Firefox did the same in early 2017.

Isn't HTTPS slow?

Years ago it was thought that SSL was slow due to the handshakes involved. The fact is that an HTTPS site can actually be faster in practice, since browsers only support HTTP/2 over HTTPS. If you do not believe me, go to http://www.httpvshttps.com/.

Getting an SSL certificate is easier now.

I remember when having to purchase and then install an SSL certificate was a drag. It cost extra money, even if a paltry amount when broken down into monthly costs (~$5 a month), and it took time to install. Thanks to the Let's Encrypt service, it has become easier to get an SSL certificate. Let's Encrypt is a free and open certificate authority (CA), which means it can provide signed and authorized SSL certificates. You won't get Organization Validation (OV) or Extended Validation (EV) certificates; but, generally, you do not need those.

Let's Encrypt logo

Let's Encrypt is great, but it requires you to run some tools to help automate certificate renewal and installation. Luckily, there are more hosting platforms and CDNs providing free or bundled SSL certificates.

Let's roll through some options. Please comment and share corrections or services that I have missed! The following items are services I use or found to be great on cost and ease of use.

Content Delivery Networks (CDN)

Putting a CDN in front of your website is one of the simplest ways to get your site wrapped in HTTPS without having to change your server or hosting setup. This is your best option if you run your own servers and don't want to mess with certificates directly, or if your current host does not provide free/cheap SSL certificate support. It also improves your website's performance for visitors.

CloudFlare is the CDN solution I use for this site in order to provide fully wrapped HTTPS support. CloudFlare has a great free plan that provides DDoS mitigation, CDN, SSL certificate and some other goodies. This is my go-to solution.

Hosting providers

More and more hosting providers are providing free SSL certificates. I've done some basic research, but these are based on services I have used or are familiar with.

Pantheon is a managed hosting service for Drupal and WordPress. Starting at $25 a month you get a managed hosting service, three environments (development, test, production), and a CDN with free SSL. If you want to install a custom SSL certificate, though, you will need to jump up to the Professional offering at $100 a month. Before Pantheon announced their global CDN and free SSL, I had never considered them due to the price of the monthly service once you add an SSL. Next to using CloudFlare, it's your best bet for the hands-off, peace-of-mind approach.

Platform.sh is my favorite and general go-to for price and value. You can host your PHP, Node.js, Python, and Ruby projects.  Plans start at $50 for a production site, which seems a bit expensive. But that gets you an automatic free SSL and you can still install custom SSL certificates without additional charge. You also get other goodies such as the ability to use Solr, Redis caching and more.

Gandi.net is a hosting provider that was brought to my attention when finding homes for ContribKanban. For $7.50 a month you can get their Simple Hosting with free SSL. You can run your PHP, Node.js, Ruby or Python apps on their hosting, managed through a web administrative interface.

Using Let's Encrypt itself

You can of course use Let's Encrypt yourself on your own hosting, for example by running certbot on your DigitalOcean droplet, or by generating certificates and adding them to your existing host.
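As a hedged example (assuming the certbot client is installed and the made-up example.com is served by Nginx), obtaining and installing a certificate can be a single command:

sudo certbot --nginx -d example.com -d www.example.com

After that, running certbot renew periodically (via cron or a systemd timer) keeps the certificate from expiring.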

Oct 16 2017
Oct 16

Drupal 8.4 is stable! With 8.3 coming to end of life, it's important to update your projects to the latest and greatest. This blog will guide you through upgrading from Drupal core 8.3 to 8.4 while avoiding those nasty and confusing composer dependency errors.

The main issues with the upgrade to Drupal core 8.4 are dependency conflicts between Drush and Drupal core. The main conflict is that both Drush 8.1.x and Drupal 8.3 use the 2.x version of the Symfony libraries, while Drupal 8.4 has been updated to use Symfony 3.x. This means that when you use Composer to update Drupal core alone, Composer will complain about conflicts in dependencies, since Drush depends on Symfony 2.x.

Updating your libraries

Note: If you are using Drush 8.1.15, you will not have these issues, as it is compatible with both Symfony 2.x and 3.x.

However, if you are using Drush < 8.1.15 (which a lot of people will be on), running the following command will give you a dependency conflict:

composer update drupal/core --with-dependencies

Resulting in an error message, followed by a composer trace:

Your requirements could not be resolved to an installable set of packages.

The best way to fix this is to update both Drupal core and Drush at the same time. Older Drush 8.x releases are not compatible with Drupal 8.4, so you will need to update to Drush 9.x.

composer update drupal/core drush/drush --with-dependencies
composer require "drush/drush:~9.0"

Some people have reported success with simply running a require on updated versions of both Drupal and Drush at the same time, but this did not work for me:

composer require "drupal/core:~8.4" "drush/drush:~9.0"

What next?

Great, you're on the latest versions of both core and drush, but what's next? Well, that depends on a lot of things like what contributed and custom modules your project is running, how you're deploying your site, and what automated tests you are running. As I can't possibly cover all bases, I'll go through the main issues we encountered.

First things first, you'll need to get your site's database and configuration updated. I highly recommend running your database update hooks and exporting your site's configuration before proceeding any further.

Next, you'll want to ensure that all of your deployment tools are still working. Here at PreviousNext our CI/CD tools call Make commands which are essentially just wrappers around one or more Drush commands.

For the most part, the core Drush commands (that is, the commands that ship with drush) continued working as expected, with a couple of small caveats:

1. You can no longer pipe a SQL dump into the drush sql-cli (sqlc) command.

Previously, we had:
drush sqlc < /path/to/db.sql
Now we have:
`eval drush sql-connect` < /path/to/db.sql

Note: As of Drush 9.0-beta7 this has now been fixed, meaning the old version will work again!

2. The drush --root option no longer works with relative paths

Previously, our make commands all ran Drush with the --root (or -r) option relative to the repository root:
./bin/drush -r ./app some-command
Now it must be an absolute path, or Drush will complain about not being able to find the Drupal settings:
./bin/drush -r /path/to/app some-command

3. Custom Drush commands

For custom Drush commands, you will need to port them to use the new object oriented style approach and put the command into a dedicated module. Since version 9.0-beta5, Drush has dropped support for the old drush.inc style approach that could be used to add commands to a site without adding a new module.

For an example on this, take a look at our drush_cmi_tools library which provides some great extensions for importing and exporting config. This PR shows how we ported these commands to the new Drush 9 format.

For more information on porting commands to Drush 9, check out Moshe Weitzman's blog on it.

Other gotchas

Following the Drush upgrades, your project will need various other updates based on the modules and libraries it uses. I'll detail some issues I faced when updating the Transport for NSW site below.

1. Stale bundles in the bundle field map key value collection

Added as part of this issue, Views now throws warnings similar to "A non-existent config entity name returned by FieldStorageConfigInterface::getBundles(): field name: field_dates, bundle: page" for fields that remain in the entity bundle field map but no longer exist on the site. We had a handful of these fields, which threw warnings on every cache clear. To fix this, simply add an update hook which clears the stale fields out of the entity.definitions.bundle_field_map key value collection:

/**
 * Fix entity.definitions.bundle_field_map key store with old bundles.
 */
function my_module_update_8001() {
  /** @var \Drupal\Core\KeyValueStore\KeyValueFactoryInterface $key_value_factory */
  $key_value_factory = \Drupal::service('keyvalue');
  $field_map_kv_store = $key_value_factory->get('entity.definitions.bundle_field_map');
  $node_map = $field_map_kv_store->get('node');
  // Remove the field_dates field from the bundle field map for the page bundle.
  unset($node_map['field_dates']['bundles']['page']);
  $field_map_kv_store->set('node', $node_map);
}

2. Custom entities with external URI relationships throw fatal errors when deleted while menu_link_content is installed

The menu_link_content module now has an entity predelete hook that looks through an entity's URI relationships, tries to find any menu links that point to that specific route, and deletes them if found. When the URI is external, an error is thrown when it tries to get the route name: "External URLs do not have an internal route name." See this issue for more information.
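To illustrate the failure mode (a sketch only, not the core fix; the URI is made up), calling getRouteName() on an external Url object is exactly what throws, so any code mapping links to routes needs a guard along these lines:

<?php

use Drupal\Core\Url;

// An external URI has no internal route, so calling getRouteName() on it
// throws an \UnexpectedValueException ("External URLs do not have an
// internal route name.").
$url = Url::fromUri('https://example.com/some-page');

if (!$url->isExternal() && $url->isRouted()) {
  // Safe: only routed, internal URLs reach this point.
  $route_name = $url->getRouteName();
}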

3. Tests that submit a modal dialog window will need to be altered

This is a very edge case issue, but will hopefully help someone! In older versions of jQuery UI, the buttons that were added to the bottom of the modal form for submission had an inner span tag which could be clicked as part of a test. For example, in Linkit's LinkitDialogTest. This span no longer exists, and attempting to "click" any other part of that button in a similar way will throw an error in PhantomJS. To get around that simply change your test to do something similar to the following:

$this->click('.ui-dialog button:contains("Save")');

Kudos to jhedstrom for finding this one. See this issue for more information.

Conclusion

Personally, I found the upgrade to be quite tedious for a minor version upgrade. Thankfully, our project has a large suite of functional/end-to-end tests which really helped tease out the issues and gave us greater confidence that the site was still functioning well post-upgrade. Let me know in the comments what issues you're facing!

Finally, take a look at Lee's blog on some of the major changes in 8.4 for some more insight into what you might need to fix.


Posted by Adam Bramley
Senior Drupal Developer

Dated 16 October 2017

Add new comment

Oct 16 2017
Oct 16

Chris Teitzel of Cellar Door Media gives us a preview of Security Saturday at BADCamp 2017 and provides some great tips for securing your website. He tells us why we should always say yes to the community; you never know where it's going to lead.

Chris also shares some amazing stories about bringing a Drupal-based communications tool, developed at the DrupalCon Denver Tropo Hackathon, to Haiti in 2012 to help with relief efforts after the devastating 2010 earthquake.

Oct 16 2017
Oct 16
BADCamp 2017 starts this Wednesday

BADCamp kicks off this Wednesday! We are looking forward to seeing you and are excited to share some logistical details and tips for making the most of your time at BADCamp.

Where do I register and pick up my badge?

Central BADCamp registration opens at 8:15 am each morning. It’s located in the Martin Luther King (MLK) Student Union, on the 3rd Floor in the Kerr Lobby.

Map to Martin Luther King Student Union

2495 Bancroft Way, at Telegraph Avenue

University of California

Berkeley CA 94720

 

If you are attending a summit at the Marsh Art Center, badges will be available for pick up when you arrive.

Map to Marsh Art Center

2120 Allston Way

Berkeley, CA 94704

 

Be sure to come back to BADCamp Expo Hall at MLK Pauley West during breaks. We’ll have coffee, pinball, 15-min relaxation massages and a chance to thank our generous sponsors ... many are hiring!


Here is an overview of what is happening at each venue.

 

Where is everything? Where do I go?

  • Take a look at our Event Timeline to find out what is happening when.

  • Check out the Venues to see what is happening where.

  • Be sure to log in and make your session schedule in advance and then follow along on your mobile device.

 

What’s the 411 on food and beverage?

As always, BADCamp will provide an endless supply of coffee, tea, and water.

 

Wednesday & Thursday

  • All Training & Summits will have light snacks in the morning.

  • For lunch, head outside to discover some of Berkeley’s best food!

  • Stop by the Sponsor Expo on Thursday for specialty coffees.

 

Friday & Saturday

 

Parking

Parking at Berkeley can be extremely challenging. Consider taking public transportation whenever possible.  

 

Anything else to know?

  • Wear good shoes! You will do a lot of walking.

  • Bring layers. Or donate at the $100 level and get an awesome 2017 t-shirt, a solar charger, and a cozy BADCamp hoodie!

  • The fires. We are keeping an eye on things and will provide updates if the air quality or anything else impacts the event. Stay in touch with BADCamp on Twitter.

  • The BADCamp Contribution Lounge is open 24 hours, beginning at 9 am on Wednesday and going until 10 pm on Saturday. We welcome and encourage you to participate!

 

Sponsors

Our sponsors make the magic of BADCamp possible! Stop by to thank them at the event. As an added bonus, many of them are hiring! We’re also sending an extra big virtual hug to Platform.sh, Pantheon & Acquia for sponsoring at the Core level and helping to keep BADCamp AWESOME!

Stay Connected

Oct 15 2017
Oct 15

Your hosting account was found to be causing an overload of MySQL resources. What can you do? Upgrade your Drupal 8 website to Drupal 8.4 or higher.

One of my goals in rebranding my website from CMS Report to socPub was to write diverse articles beyond the topic of content management systems. Yet, here we go again with another CMS related article. The Drupal open source project recently made available Drupal 8.4 and for me this version has been a long time coming as it addresses some long standing frustrations I've had with Drupal 8 from the perspective of a site administrator. While Drupal 8.4 adds some nice new features, I'm just as excited about the bug fixes and performance improvements delivered in this new version of Drupal.

When Drupal 8 was introduced, it made significant improvements in how it caches and renders pages. That's great news for websites that use Drupal's built-in caching to speed up delivery of pages or page elements. But there was one unwanted side effect of the cache enhancements: excessive growth of cache tables, with tens or hundreds of thousands of entries and gigabytes in size. For my own website it is not too uncommon to see the database reach 4 GB. Let's put it this way: it was no fun to receive a letter from my hosting provider saying they weren't too happy about my resource usage. Worse, they threatened to shut down my website if I didn't manage the database size better. Just in the nick of time, Drupal 8.4 delivers a fix to the cache growth by introducing a new default limit of 5000 rows per cache bin.

I'm still playing with this change and I haven't found a lot of documentation, but you can override the default row limit in Drupal's settings.php via the "database_cache_max_rows" setting. For my site, the following settings have helped me keep my MySQL database under half a gigabyte:

$settings['database_cache_max_rows']['default'] = 5000;
$settings['database_cache_max_rows']['bins']['page'] = 500;
$settings['database_cache_max_rows']['bins']['dynamic_page_cache'] = 500;
$settings['database_cache_max_rows']['bins']['render'] = 1000;

For those of you that may not be ready to upgrade to Drupal 8.4 but still need to handle the oversized caching tables today, I had some luck with the Slushi cache module. An additional good summary of similar solutions for Drupal 8 versions prior to 8.4 can be found on Jeff Geerling's blog.

Notable New Features in Drupal 8.4

Of course the purpose of Drupal 8.4 isn't just to address my pet peeve about Drupal caching but also to bring Drupal users a number of new features and improvements. Some of the more significant additions and changes in Drupal that affect me and possibly you include:

Datetime Range

For non-Drupal users I know this is going to sound odd, but despite a number of community approaches, there has never really been a standard field type for expressing a date or time range, something commonly used in event and planning calendars. Drupal 8.4 addresses this missing field type with the new core Datetime Range module, which supports contributed modules like Calendar and shares a consistent API with other Datetime fields. Future releases may improve Views support, usability, Datetime Range field validation, and REST support.

Datetime Range User Interface

Content Moderation and Workflow

Although I've been a longtime user of Drupal, for a two-year period I managed my website on the Agility CMS. One of the benefits of Agility over Drupal was the workflow and moderation tools delivered out of the box. The ability to moderate content becomes especially important on websites where multiple authors and editors collaborate and need to mark whether content is a draft, ready for review, in need of revision, ready to publish, etc. With Drupal 8.4 the Workflows module is now stable and provides the framework for additional modules such as the much anticipated Content Moderation module. Currently, the new core Content Moderation module is considered experimental and beta stable, so additional changes should be expected. Content moderation workflows can now apply to any entity type that supports revisions, and numerous usability issues and critical bugs are resolved in this release.

Media Handling

Another long-standing issue for me has been how Drupal handles, displays, and lets you reuse images (it doesn't, without outside help). Over the years there have been a host of solutions via contributed modules, but I've often found myself frustrated that support for these modules varies, and compatible versions are often not available until weeks or months after a new major version of Drupal has been released. The new core Media module aims to change this by providing an API for reusable media entities and references. It is based on the contributed Media Entity module, which has become popular among Drupal users in recent years.

Unfortunately, the core Media module still needs work and is currently marked hidden. In other words, Media will not appear by default in Drupal 8.4's module administration page. The module will be displayed to site builders normally once related user experience issues are resolved in a future release. If you elect to use a contributed module under development that depends on the core Media module, however, it will enable Media automatically for you. Similarly, the REST API and normalizations for Media are not final, and support for decoupled applications will be improved in a future release. So while the Media API is available in this version of Drupal, most of us non-developers will need to wait for additional development to see the benefits of this module.

Additional Information on Drupal 8.4

An overview of Drupal 8.4 can be found at Drupal.org but for a better list of the changes and fixes you'll want to check out the release notes. As always, links to the latest version of Drupal can be found on the project page. I've seen a few strange errors in the logs since updating my site from Drupal 8.3 to 8.4 but nothing significant for me to recommend waiting to install Drupal 8.4. For those that are more cautious, the next bugfix release (8.4.1) is scheduled for November 1, 2017.

Article originally published at socPub.

Oct 14 2017
Oct 14

Metatag cannot directly extract an image url from a media field referenced by another entity.

I upgraded my site from Drupal 7 to Drupal 8 this week (yes, that's why it's running on Bartik - a PatternLab developed theme will be installed in time).

This morning I enabled the Metatag module and set some defaults for page title, description, image, etc. The help notes on the image metatag field say "An image associated with this page, for use as a thumbnail in social networks and other services. This will be able to extract the URL from an image field." This is true, except in my case all the image fields on the site use the Media Entity module, so they are entity reference fields rather than image fields.

When I put in a token of [node:field_main_image], the result in the outputted metatags was:

<meta property="og:image:url" content="Mark Conroy | DrupalCon Dublin 2017" />

In that case, "Mark Conroy | DrupalCon Dublin 2017" is the name of the referenced media. I needed to output the image field of the referenced media.

After a little trial and error, I came up with this:

[node:field_main_image:entity:field_m_image_image]

which outputs:

<meta property="og:image:secure_url" content="https://mark.ie/sites/default/files/media/images/2017-10/mark_presenting_1.jpg" />

In this case, "field_main_image" is the name of the image field on my content type, and "field_m_image_image" is the name of the image field on my image media bundle.

I hope that helps!

Oct 13 2017
Oct 13

Upgrading to Drush 9

Drush should be installed and updated through Composer. There is no stable Drush 9 version yet, so the development version must be used. Updating to the development version of Drush 9 is as simple as typing:

composer require drush/drush:dev-master

Porting your Drush commands to Drush 9

Porting the commands is a semi-automatic process: There is a command that will generate the required files and class structure for you. To start the wizard, just type:

drush generate drush-command-file -l dev

Drush will ask you for the module's machine name and for the optional path to the legacy Drush command file (the one that has your commands, ending with .drush.inc). You will have to provide the absolute path.

drush.services.yml

This is the file your Drush command definition goes into. Do not use your module's regular services.yml as you might have done for Drush 8, or else you will confuse the legacy Drush, which will lead to a PHP error like this:

Fatal error: Class 'Drush\Commands\DrushCommands' not found in MyModuleCommands.

Use the dedicated drush.services.yml file in your module's root directory instead.

The file should look like this:


services:
  mymodule.commands:
    class: \Drupal\mymodule\Commands\MyModuleCommands
    tags:
      - { name: drush.command }

As in other Symfony service definitions, you can (and should) inject other services as constructor arguments, DI style, and do all the other crazy stuff.

MyModuleCommands.php


<?php

namespace Drupal\mymodule\Commands;

use Drush\Commands\DrushCommands;

/**
 * In addition to a commandfile like this one, you need a drush.services.yml
 * in the root of your module.
 *
 * See these files for an example of injecting Drupal services:
 * - http://cgit.drupalcode.org/devel/tree/src/Commands/DevelCommands.php
 * - http://cgit.drupalcode.org/devel/tree/drush.services.yml
 */
class MyModuleCommands extends DrushCommands {

  /**
   * @command mymodule:do-something
   * @param array $options An associative array of options whose values come from cli, aliases, config, etc.
   * @validate-module-enabled mymodule
   * @aliases mm:do-something, mm:ds, mymodule-do-something
   */
  public function generate() {
    // See the bottom of https://weitzman.github.io/blog/port-to-drush9 for
    // details on what to change when porting a legacy command.
  }

}

As seen above, the generate() method needs to be implemented manually. Other manual changes may include creating a constructor in case other services are injected.

Drush 9 mimics Symfony's module:command naming structure, and this should be respected. I don't see any reason not to include the legacy command as an alias, however: if your command used to be my_module-do-something, use my-module:do-something in @command, but also keep the old my_module-do-something as an @aliases entry, as presented in the example above. This way scripts calling the old Drush will continue working.

Maintaining Drush 8, Drush 9 and Drupal Console commands side by side

Having three standards for managing Drupal through a shell should not be an excuse for bad practice. To avoid code duplication, make sure your module defines a service which holds all the business logic, which can then be run by any of the above tools.

Simple XML Sitemap (project page) now supports Drush 9 and is a good example of this principle:

simple_sitemap.drush.inc (Drush 8)


<?php

/**
 * @file
 * Drush (< 9) integration.
 */

/**
 * Implements hook_drush_command().
 */
function simple_sitemap_drush_command() {
  $items['simple_sitemap-generate'] = [
    'description' => 'Regenerate the XML sitemap according to the module settings.',
    'callback' => 'drush_simple_sitemap_generate',
    'drupal dependencies' => ['simple_sitemap'],
  ];
  return $items;
}

/**
 * Callback function for hook_drush_command().
 *
 * Regenerate the XML sitemap.
 */
function drush_simple_sitemap_generate() {
  \Drupal::service('simple_sitemap.generator')->generateSitemap('drush');
}

SimplesitemapCommands (Drush 9)


<?php

namespace Drupal\simple_sitemap\Commands;

use Drupal\simple_sitemap\Simplesitemap;
use Drush\Commands\DrushCommands;

/**
 * Class SimplesitemapCommands.
 *
 * @package Drupal\simple_sitemap\Commands
 */
class SimplesitemapCommands extends DrushCommands {

  /**
   * @var \Drupal\simple_sitemap\Simplesitemap
   */
  protected $generator;

  /**
   * SimplesitemapCommands constructor.
   *
   * @param \Drupal\simple_sitemap\Simplesitemap $generator
   *   The sitemap generator service.
   */
  public function __construct(Simplesitemap $generator) {
    $this->generator = $generator;
  }

  /**
   * Regenerate the XML sitemap according to the module settings.
   *
   * @command simple-sitemap:generate
   * @validate-module-enabled simple_sitemap
   * @aliases ss:generate, ssg, simple_sitemap:generate, simple_sitemap-generate
   */
  public function generate() {
    $this->generator->generateSitemap('drush');
  }

}

All of the business logic of this command is inside of the method generateSitemap() of the simple_sitemap service.

Downgrading back to Drush 8

Not a fan of changing APIs? Downgrading is a composer command away:

composer require drush/drush:^8.0

Conclusion

It is good to see the Drush project keeping up with the times and publishing Drush 9 in parallel with the appearance of Drupal 8.4.0. The API changes are the necessary price we pay for a modern and continuously evolving framework like Drupal.

Feel free to leave a comment below in case of questions or new Drupal 8.4 / Drush 9 insights.

Oct 13 2017
Rob
Oct 13

A week ago, we released Mannequin, which we’re calling a “Component theming tool for the web.” I wanted to explain what we mean by “Component theming,” and to explain why we’re (belatedly) switching to this approach for our Drupal development.

Our Story

We used to be terrible at theming. Five or six years ago, the sites we built were consistent in one way - they all had inconsistent page margins, big cross-browser issues, and enough CSS to theme a dozen sites. As a developer, I used to hate jumping between our projects, because each one had its own rat's nest of CSS, and every time I made a change, it broke parts of the site that nobody even knew existed.

That changed when we discovered Foundation. Foundation solved the vast majority of our consistency and cross-browser problems right away. It was a night-and-day shift for us. We’d just apply a few simple classes to our markup, and our theming was “roughed in”. Some custom styles would be written to cover the rest, but in general we were writing much less code, which made the bugs easier to find and fix. There was still a pretty major problem though - small changes still had a tendency to break things in unexpected ways.

These days, we’re starting a new chapter in our journey toward front-end excellence… the shift to Component theming. Component theming is theming based on chunks of markup (“Components”) rather than pages. If you haven’t read Brad Frost’s excellent Atomic Design, you should. It’s a great intro to the topic, although the terminology is a little different from what we’ll use here… Atomic Design is as much a specification for design as it is for development, and what we’re primarily interested in here is the development portion (theming).

What we’re changing

Long story short, for many of our newer projects, we’ve been shifting away from using globally available utility classes (such as Foundation’s .row and .column), and toward theming specific classes that are only used by the templates under our control. To use a very simple example, let’s consider how we might theme something like a grid of images:

Foundation Markup:

See the Pen Foundation - Card by Rob Bayliss (@rbayliss) on CodePen.

Component Markup:

See the Pen Foundation - Card by Rob Bayliss (@rbayliss) on CodePen.

The thing that’s immediately obvious is that we’ve gotten rid of the Foundation layout classes. This forces us to handle the layout ourselves, and to do that, we’re targeting the markup we know this component uses directly. What’s more, all of our CSS we’re using for this component is scoped to the ImageGrid class, so there’s no chance it will leak out into the global styling. We might say that this component is very isolated from the rest of the system as compared to the Foundation version, which is going to depend on the unscoped .row and .column selectors. As a result, when the client adds feedback at the end of the sprint that the first image in the grid is 1px off, we can make that fix without touching anything but the ImageGrid CSS. That is to say - refactoring just got a whole lot easier.  For example, imagine that we’re shifting toward a CSS Grid layout, and we want to shift this component over without breaking the rest of the site.

Component Markup with CSS Grid

See the Pen Foundation - Card by Rob Bayliss (@rbayliss) on CodePen.

This is a pretty significant change on the component scale (swapping layout systems), but as long as we’ve properly scoped our CSS, there’s no danger of this change leaking out to the rest of the system.

But what about shared styles?

There will always be shared styles. Even if we added no unscoped CSS or utility classes, browsers have their own stylesheet that would change the look of your components. The key is to keep these as minimal as possible. We've still been applying two Foundation mixins globally on the projects where we've been working with components:

This gives us a good global baseline that gets shared by all components, but you need to be extremely judicious about what gets included in unscoped CSS, since changing it later on will affect every single component (exactly what we want to avoid).

How we’re changing

Last week, we released Mannequin, a tool for isolating your templates as components and rendering them with sample data. Going forward, we plan to use Mannequin on all of our new projects from the get-go. Rather than writing extremely Drupal-specific templates, our front end developers will be writing a much cleaner version in Mannequin, then we’ll be wiring it into Drupal using what we’re calling “glue templates.”

Mannequin does not dictate that we use a glue template here - we could be writing a single node.html.twig and use it with Mannequin just fine. But glue templates give us two important benefits. First, we’re free to reuse the component template just by writing another glue template that includes it, keeping our project DRY while making things nice and discoverable by leaving the Drupal defined templates in place. Second, writing our templates without considering Drupal’s funky data structures means we can work with developers who don’t know their nodes from their assets (non-Drupal developers). As much as I poke fun, we’re excited to be leaving a lot of the Drupalisms behind.

That’s all for now!

Next week looks like a busy one! If you're going to BADCamp and are interested in component-based theming, please find us to talk, or come to our session on Mannequin! Also, look for a blog post next week with a step-by-step guide to setting up Mannequin for Drupal.

Oct 13 2017
Oct 13

BADCamp is almost here! Just five more sleeps to go. We’d like to share some details about event logistics and making the most of your time at BADCamp.

Make Sure You Are Registered!

While BADCamp is both awesome and free, signing up for BADCamp helps us plan and ensures you receive event specific information.

 

Want to be Trained? You Need to Sign Up for Free Training

A few last-minute cancellations mean we have a few seats still available. Sign up soon to reserve your spot!

 

Want to Summit? You Need to Sign up for Summits

Wednesday and Thursday, we'll be hosting great summits that facilitate conversations and connections with people in specific industries or with specific skills. Come dive deep into the issues that matter and collaborate freely. Registration is open and while attendance is free, signing up will ensure you receive summit specific information for the event.

 

Want to Make A Session Schedule?

Are you a super planner? Make your session schedule in advance and then follow along on your mobile device! Take a look at the final session schedule. There are BADCamp sessions spanning the worlds of development, design, strategy, project management, technology communities and everything in between.

 

Join us at the Contribution Lounge for coffee, community, and code!

This is a great chance to help make Drupal 8 bigger and better. The BADCamp Contribution Lounge is at the Hotel Shattuck. The Lounge has Internet access and an ample supply of coffee and water. We're open around the clock from Wednesday, October 18 at 9 am to Saturday, October 21 at 10 pm. Come participate!

 

Parties

BADCamp Party on Friday Night

Come to the official BADCamp party at The Marsh Theatre on Friday night.  Doors open earlier than planned at 6:30 to maximize our time on the ROOF TOP!

Please find our generous sponsors, Platform.sh, Hook42 and Lullabot. Give them a BIG thanks as this party would not have been possible without their generous support.

We will have drink tickets burning a hole in our pocket, so come early and be prepared for a good time. There will be great music, and ample space on the Dance Floor. There will also be tables and quiet areas to chat. For more info...

 

We need your help: Volunteer for BADCamp!

BADCamp is 100% volunteer driven and we need your hands! We need stout hearts to volunteer and help set up, tear down, give directions and so much more!  If you are local and can help us, please contact Manish at [email protected] or sign up on our Volunteer Form.

 

Would you have been willing to pay for your ticket?  

If so, then you can give back to the camp by purchasing an individual sponsorship at the level most comfortable for you. As our thanks, we will be handing out some awesome BADCamp swag including a 2017 edition t-shirt, hoodie and stellar solar charger.

 

Sponsors

A BIG thanks to our sponsors! Without them this magical event wouldn’t be possible. An extra big thanks to Platform.sh, Pantheon & Acquia for sponsoring at the Core level to help keep BADCamp free and awesome.

Oct 13 2017
Oct 13

Recently I've been thinking a lot about what is missing that could help the Drupal project achieve greater success. This was partly in preparation for the Drupal Strategy Summit but also a continuation of research I was already working on.

Many Drupal friendships are created in issue queues, over IRC, Twitter, or in Google Hangouts, often across continents. I can think of many personal examples; maybe you can too. So it seems incongruous to me that Drupal.org has no community search feature. We are one of the world's biggest communities, but we have no way to find one another.

There is a helpful "Where is the Drupal Community?" page, but it lacks the ability to search for people like me: people with shared interests and shared motivations to contribute, people I can collaborate with. I feel like this is a massive missed opportunity to connect like minds. If such a tool existed, newcomers would be far more likely to have a positive experience and find an outlet for their passion.

I have written a proposal in the Issue Queue for Drupal.org content. If you have thoughts around this feature request, I'd appreciate you joining the conversation.

Oct 13 2017
Oct 13

Welcome to the second episode in our new video series for Emulsify. Emulsify 2.x is a new release that embodies our commitment to component-driven design within Drupal. We’ve added Composer and Drush support, as well as open-source Twig functions and many other changes to increase ease-of-use.

In this video, we’re going to teach you how to create an Emulsify 2.0 starter kit with Drush. This blog post follows the video closely, so you can skip ahead or repeat sections in the video by referring to the timestamps for each section.

PURPOSE [00:15]

This screencast will specifically cover the Emulsify Drush command. The command's purpose is to set up a new copy of the Emulsify theme.

Note: I used the word "copy" here and not "subtheme" intentionally. This is because the base theme of your new copy is Drupal Core's Stable theme, NOT Emulsify.

This new copy of Emulsify will use the human-readable name that you provide and will build the structure necessary to get you on your way to developing a custom theme.

REQUIREMENTS [00:45]

Before we dig in too deep I recommend that you have the following installed first:

  • a Drupal 8 Core installation
  • the Drush CLI, at least major version 8
  • Node.js, preferably the latest stable version
  • a working copy of the Emulsify demo theme, version 2.x or greater

If you haven’t already watched the Emulsify 2.0 composer install presentation, please stop this video and go watch that one.

Note: If you aren't already using Drush 9, you should consider upgrading as soon as possible, because the next minor release of Drupal Core (8.4.0) is only going to work with Drush 9 or greater.

RECOMMENDATIONS [01:33]

We recommend that you use PHP 7 or greater, as you get some massive performance improvements for very little work.

We also recommend that you use composer to install Drupal and Emulsify. In fact, if you didn’t use Composer to install Emulsify—or at least run Composer install inside of Emulsify—you will get errors. You will also notice errors if npm install failed on the Emulsify demo theme installation.

AGENDA [02:06]

Now that we have everything set up and ready to go, this presentation will first discuss the theory behind the Drush script. Then we will show what you should expect if the installation was successful. After that, I will give you some links to additional resources.

BACKGROUND [02:25]

The general idea of the command is that it creates a new theme from Emulsify’s files but is actually based on Drupal Core’s Stable theme. Once you have run the command, the demo Emulsify theme is no longer required and you can uninstall it from your Drupal codebase.

WHEN, WHERE, and WHY? [02:44]

WHEN: You should run this command before writing any custom code but after your Drupal 8 site is working and Emulsify has been installed (via Composer).

WHERE: You should run the command from the Drupal root or use a Drush alias.

WHY: You should NOT edit the Emulsify theme's files directly. If you installed Emulsify the recommended way (via Composer), the next time you run composer update ALL of your custom code changes will be wiped out. If this happens, I really hope you are using version control.

HOW TO USE THE COMMAND? [03:24]

Arguments:

Well, first, it requires a single argument: the human-readable name. This name can contain spaces and capital letters.

Options:

The command has defaults set for options that you can override.

The first is the theme description, which will appear within Drupal and in your .info file.

The second is the machine name; this option lets you pick the directory name and the machine name as it appears within Drupal.

The third option is the path; this is the path your theme will be installed to. It defaults to "themes/custom", but if you don't like that you can change it to any directory relative to your web root.

The fourth and final option is the slim option. It is for advanced users who don't need demo content and don't want anything but the bare minimum required to create a new theme.

Note:

Only the human_readable_name argument is required. Options can appear in any order, can be omitted entirely, and you can pass just one if you only want to change one of the defaults.
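Putting that together, a typical invocation might look like the sketch below. The option spellings follow the descriptions above, and the theme name, machine name, and path are made-up examples, so check drush help emulsify for the exact signature of your installed version:

drush emulsify "My Awesome Theme" --machine-name=my_awesome_theme --path=themes/custom --slim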

SUCCESS [04:52]

If your new theme was successfully created, you should see the success output message. In the example below I used the slim option because it is a bit faster to run, but again, this is optional and NOT required.

The success message contains information you may find helpful, including the name of the theme that was created, the path where it was installed, and the next required step for setup.

THEME SETUP [05:25]

Setting up your custom theme: navigate to your custom theme on the command line, type yarn, and watch as Pattern Lab is downloaded and installed. If the installation was successful, you should see a Pattern Lab success message, and your theme should now be visible within Drupal.

COMPILING YOUR STYLE GUIDE [05:51]

Now that you have Pattern Lab successfully installed and committed to your version control system, you are probably eager to use it. Emulsify uses npm scripts to set up a local Pattern Lab instance for displaying your style guide.

The script you are interested in is yarn start. Run this command for all of your local development. You do NOT have to have a working Drupal installation at this point to do development on your components.

If you need a designer who isn't familiar with Drupal to make some tweaks, you will only have to give them your code base and have them run yarn to install and yarn start to see your style guide.

It is, however, recommended that the initial setup of your components be done by someone with background knowledge of Drupal templates and themes, as the variables passed to each component will be different for each Drupal template.

For more information on components and templates, keep an eye out for our soon-to-come demo components and screencasts on building components.

VIEWING YOUR STYLE GUIDE [07:05]

Now that you have run yarn start, you can open your browser and navigate to the localhost URL that appears in your console. If you get an error here, you might already have something running on port 3000. If you need to cancel this script, hit Ctrl+C.

ADDITIONAL RESOURCES [07:24]

Thank you for watching today’s screencast, we hope you found this presentation informative and enjoy working with Emulsify 2.0. If you would like to search for some additional resources you can go to emulsify.info or github.com/fourkitchens/emulsify.

[embedded content]

Chris Martin

Chris Martin is a support engineer at Four Kitchens. When not maintaining websites he can be found building drones, computers, robots, and occasionally traveling to China.

Oct 13 2017
Oct 13

Have you ever wondered how the text, email, or entity reference field is extended in Drupal 8? Or how to create a custom field/widget/formatter so that it matches the rest of the fields in your Drupal application? This blog will cover everything required to extend existing field widgets in Drupal 8 using an annotation plugin.

Many developers who recently started working on Drupal 8 may not be aware of the entire process, so let's take a closer look at everything step by step: key comparisons between Drupal 7 and Drupal 8, what an annotation is, why annotations, and a sample use case from Inline Entity Form. After completing this post, you will be able to extend a field with your own methods/functions without disturbing contributed module code.

Note: The below-mentioned functions are used to access properties of available widgets, which are moved to methods on the Widget Plugin Manager.
Comparisons

In Drupal 7

  • hook_field_widget_info() is replaced by an annotation-based plugin
  • hook_field_widget_settings_form(), hook_field_widget_form(), hook_field_widget_error(): replaced by WidgetInterface methods

In Drupal 8

  • The info hook is replaced by annotation-based plugin discovery
  • public function settingsForm(array $form, FormStateInterface $form_state)
  • public function formElement(FieldItemListInterface $items, $delta, array $element, array &$form, FormStateInterface $form_state)
  • public function errorElement(array $element, ConstraintViolationInterface $violation, array $form, FormStateInterface $form_state)

Let's take a brief look at annotations, as they play a significant role in extending field widgets.

Annotations are one of the new concepts in Drupal 8. They are written as PHP docblock comments above a class or function and contain metadata about that function or class.

Note: In Drupal, we can add metadata about a function or class inside its comment block so that the annotations are picked up automatically when field widgets are rendered on the page. This avoids needing multiple files to declare the metadata.

Why Annotations?

The annotation meta-data lives in the same file and is an integral part of the class that implements the plugin. 

This makes it easier to find and to create a new custom plugin by simply copying an existing one.

In addition, the implementation Drupal uses to parse annotations simply tokenizes the text of the file without including it as a PHP file, so memory use is minimized.

The annotation syntax comes from the Doctrine project; it is nothing but docblock annotations.

Sample use case from an Inline Entity Form in Drupal 8

The real entity (node) was being removed from the backend when contacts field values were removed. The contacts field is an entity reference field with unlimited cardinality.

The contacts are displayed with the "Inline entity form - Complex" field widget on the node edit page.

To avoid this, we need to extend the field widget from inline entity form - complex. Check out how to do it programmatically below.

Note: Based on the settings, we can control whether the original entity is deleted from the backend. For this, we can use public function settingsForm(array $form, FormStateInterface $form_state) to add a "Delete referenced node on remove" checkbox.

Create a module folder extend_inline_entity_form

Folder Structure: src/Plugin/Field/FieldWidget/ExtendedInlineEntityForm.php
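A minimal sketch of what ExtendedInlineEntityForm.php can contain follows. The class and checkbox wording match this post, but the plugin ID and settings key are illustrative, and the widget-state handling in massageFormValues() mirrors inline_entity_form 8.x-1.x's complex widget (which deletes every entity queued under the widget state's 'delete' key), so verify it against the module version you are running:

<?php

namespace Drupal\extend_inline_entity_form\Plugin\Field\FieldWidget;

use Drupal\Core\Form\FormStateInterface;
use Drupal\inline_entity_form\Plugin\Field\FieldWidget\InlineEntityFormComplex;

/**
 * Extended version of the "Inline entity form - Complex" widget.
 *
 * @FieldWidget(
 *   id = "extended_inline_entity_form_complex",
 *   label = @Translation("Extended inline entity form - Complex"),
 *   field_types = {
 *     "entity_reference"
 *   },
 *   multiple_values = true
 * )
 */
class ExtendedInlineEntityForm extends InlineEntityFormComplex {

  /**
   * {@inheritdoc}
   */
  public static function defaultSettings() {
    $settings = parent::defaultSettings();
    // Illustrative settings key: whether removing a reference also deletes
    // the referenced entity from the backend.
    $settings['delete_references'] = FALSE;
    return $settings;
  }

  /**
   * {@inheritdoc}
   */
  public function settingsForm(array $form, FormStateInterface $form_state) {
    $element = parent::settingsForm($form, $form_state);
    $element['delete_references'] = [
      '#type' => 'checkbox',
      '#title' => $this->t('Delete referenced node on remove'),
      '#default_value' => $this->getSetting('delete_references'),
    ];
    return $element;
  }

  /**
   * {@inheritdoc}
   */
  public function massageFormValues(array $values, array $form, FormStateInterface $form_state) {
    if (!$this->getSetting('delete_references')) {
      // Empty the deletion queue before the parent runs so the referenced
      // entities stay in the backend. The key construction below mirrors
      // inline_entity_form's complex widget.
      $parents = array_merge($form['#parents'], [$this->fieldDefinition->getName(), 'form']);
      $ief_id = sha1(implode('-', $parents));
      $widget_state = &$form_state->get(['inline_entity_form', $ief_id]);
      if (!empty($widget_state['delete'])) {
        $widget_state['delete'] = [];
      }
    }
    return parent::massageFormValues($values, $form, $form_state);
  }

}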

After enabling the module, you will find a similar interface. Here you need to uncheck the "Delete referenced node on remove" checkbox to prevent the original node's deletion in the backend. Check out the screenshot below for an example.

Field Widgets

Extending inline entity form - complex is nothing but preventing the deletion of the original node from the backend when we remove the entity. With the above-mentioned code, we can unset widget_state['delete'] to stop the deletion of the original node from the backend.

For the above code to take effect, the "Delete referenced node on remove" checkbox must be configured accordingly; the behavior is configurable now! We can add this as a patch on drupal.org, or we can contribute it as a sub-module to inline entity form in Drupal.

That's it. Hopefully you can now extend field widgets using an annotation plugin. If you have any suggestions or queries, please comment below.

Below is a presentation on extending field widgets in Drupal 8.
 

Oct 13 2017
Oct 13

Config Split treats all the CLIs the same.

Drupal 8.4 and its upgrade to Symfony 3 have made compatibility with the global Drush 8 a bit more challenging. Drush 9 works with Drupal 8.4, but it is not stable yet, and the format of third-party Drush commands has changed significantly.

While Drush 9 comes with a command that helps port Drush 8 commands, you will still end up maintaining very similar code in two places: one with calls to drush_confirm('...') and one with $this->io()->confirm('...'). If you decide to also provide your commands for Drupal Console, you now have three times the burden.

Because we tried to provide the Config Split commands for both Drush and Drupal Console early on, we faced this problem more than a year ago. And now it has paid off, because porting the commands to Drush 9 was very quick.

The solution is actually really simple and brings the added benefit of being able to test the business logic of the commands in the absence of Drush or Drupal Console. It is all about separating the command discovery from the command logic. Drush 8, Drush 9, and Drupal Console all have slightly different ways to discover and invoke commands, but the business logic you want to implement is the same, so all we have to do is extract a common "interface" our custom service can implement, and then make the command definitions wrap it, keeping things DRY.

The CliService

Config Split defines a config_split.cli service with the class ConfigSplitCliService with all its dependencies injected. It has the methods \Drupal\config_split\ConfigSplitCliService::ioExport and \Drupal\config_split\ConfigSplitCliService::ioImport that implement all the commands logic and delegate the actual importing and exporting to specific methods.

The method signatures for the export and the import methods are more or less the same: CliService::ioMethod($arguments, $io, callable $t).

  • $arguments: The arguments passed to the command.
  • $io: An object that interacts with the command line. In Drush 9 and Drupal Console this comes from the Symfony Console component; for Drush 8 we created a custom wrapper around drush_confirm and drush_log called ConfigSplitDrush8Io (sketched below).
  • $t: This is essentially a t function, akin to how Drupal translates strings. Because Drupal Console translates things differently, we had to be a bit creative with that by adding a t method to the command.
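To give an idea of what such a wrapper involves (a minimal sketch only; the real ConfigSplitDrush8Io in the module may expose more methods), it just forwards Symfony-style $io calls to Drush 8's procedural functions:

<?php

namespace Drupal\config_split;

/**
 * Adapter that lets Drush 8 act like a Symfony-style $io object.
 *
 * Sketch only: the method names mirror what the CliService expects.
 */
class ConfigSplitDrush8Io {

  /**
   * Asks the user for confirmation via drush_confirm().
   */
  public function confirm($text) {
    return drush_confirm($text);
  }

  /**
   * Prints a success message via drush_log().
   */
  public function success($text) {
    drush_log($text, 'success');
  }

  /**
   * Prints an error message via drush_log().
   */
  public function error($text) {
    drush_log($text, 'error');
  }

}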

Commands wrap the service

The Drush 8 command is essentially:

<?php

function drush_config_split_export($split = NULL) {
  // Make the magic happen.
  \Drupal::service('config_split.cli')->ioExport($split, new ConfigSplitDrush8Io(), 'dt');
}
?>

For Drush 9 we can use dependency injection and the Drush 9 command becomes essentially:

<?php

class ConfigSplitCommands extends DrushCommands {

  public function splitExport($split = NULL) {
    $this->cliService->ioExport($split, $this->io(), 'dt');
  }

}
?>

And the Drupal Console command is very similar:

<?php

class ExportCommand extends SplitCommandBase {

  protected function execute(InputInterface $input, OutputInterface $output) {
    $this->setupIo($input, $output);
    // Make the magic happen.
    $this->cliService->ioExport($input->getOption('split'), $this->getIo(), [$this, 't']);
  }

}
?>

Testing

The ConfigSplitCliServiceTest is a kernel test which asserts that the export works as expected by exporting to a virtual file system. The test coverage is not 100% (patches welcome), but the most important aspects of complete and conditional splitting (blacklist/graylist) are thoroughly tested. There are no limitations on what or how you can test your CliService, since it is self-contained in your module and does not depend on Drush or Drupal Console. For example, one could write a unit test with a mocked $io object that asserts that the messages printed to the CLI are correct.
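A rough sketch of what such a unit test could look like (the test class name is hypothetical, and the construction of the service is elided, since its dependencies would also need to be mocked):

<?php

namespace Drupal\Tests\config_split\Unit;

use Drupal\Tests\UnitTestCase;
use Symfony\Component\Console\Style\StyleInterface;

/**
 * Sketch of a unit test exercising the CliService with a mocked $io.
 */
class CliServiceIoTest extends UnitTestCase {

  public function testExportAsksForConfirmation() {
    // Mock the $io object and assert the service asks for confirmation
    // exactly once.
    $io = $this->createMock(StyleInterface::class);
    $io->expects($this->once())
      ->method('confirm')
      ->willReturn(TRUE);

    // The identity function stands in for the $t callable.
    $t = function ($string) {
      return $string;
    };

    // In a real test, build the service with mocked dependencies, then:
    // $service->ioExport('my_split', $io, $t);
  }

}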

Oct 13 2017
Oct 13

Here is another recipe for success. You can have a whole team of websites playing for you, and they don’t have to be created from scratch or managed separately. The secret lies in Drupal’s well-developed multisite functionality. Thanks to this, Drupal will not only let you leave your competitors behind, but also multiply this effect by many times. Let’s see how the multisite feature works, when to choose it, and what benefits it brings. In addition, pick up a couple of useful multisite modules and tips.

The essence of Drupal multisite functionality

According to Drupal multisite architecture, you can have several sites running on the same codebase but having separate databases and configuration settings. They will share the same Drupal core, modules, and themes, but at the same time will be customizable to your liking.

For example, you can enable and disable the shared modules as you please. You are also free to decide whether you want to add some site-specific modules and themes. Welcoming the same users with single sign-on or not, sharing content across websites or not, creating a similar or different website’s look and feel — these are all options you have.

Like siblings in real families, multisites can look and behave alike or each have their own original styles. However, they definitely share their family DNA.

Hence the main principle to stick to when deciding on multisite architecture: it's best to choose it if your website family is going to have similar functionality, share important Drupal modules, or use the same Drupal distribution. In these cases, it will be especially beneficial and easy to implement.

At least some of Drupal multisite benefits

No extra development costs

By setting up multisite, you save on development costs by not creating related sites from scratch.

Quick launch

Just like you don’t have to pay for creating websites from scratch, you don’t have to wait for it.

Brand consistency

With multisite architecture, it’s much easier to stick to brand guidelines on the design across websites.

Less efforts for website management

You will need less time, effort, and money for website administration, because you can perform certain actions once instead of repeating them on every site.

This includes:

  • website upgrades 
  • sharing content
  • making changes

and much more.

A couple of useful multisite tools and tips

Drush

Drush command-line interface is very useful for managing multisite deployments and upgrades. According to drupal.org, it lets you cope with 100 to 1000 websites, and this opportunity is widely used by university websites on Drupal. Multisite is a real salvation for them!

Sites.php

Using the sites/sites.php file (created by copying the example.sites.php file that ships with Drupal 7 and Drupal 8) to map domains to site directories will help you migrate more easily between development, testing, and production environments.
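For example, a minimal sites.php mapping (the hostnames here are made up) points each domain, plus an optional local alias, at a directory under sites/:

<?php

// Map incoming hostnames to directories under sites/.
// sites/first.example.com/settings.php serves first.example.com,
// and the same codebase serves second.example.com from its own directory.
$sites['first.example.com'] = 'first.example.com';
$sites['second.example.com'] = 'second.example.com';

// A local development alias for the first site.
$sites['first.localhost'] = 'first.example.com';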

Contributed modules

Consider some Drupal modules that are extra helpful. Thanks to the Domain Access module, you can easily share configuration, users, and content between your affiliated sites. Menu Domain Access lets administrators decide which menu items to hide and which to show to users on selected domains. Apache Solr Multisite Search provides Apache Solr search across multiple sites. RobotsTxt dynamically generates a different robots.txt for each member of your website family, and there are also lots of others.

Wrap-up

Enjoy the multiple benefits of Drupal multisite! Our team is always ready to advise you and, of course, perform a smooth multisite setup.

Always stay up-to-date by subscribing to our newsletter!

Oct 12 2017
Oct 12

Platform.sh aims to be a complete solution for web development and hosting, while at the same time offering the flexibility to slot into your own development tools and methodologies. That's a tricky balance to strike at times: Providing a complete solution while offering maximum flexibility at the same time.

One area where we generally favor flexibility is in your local development environment. You can use whatever local development tools you're most comfortable with: MAMP, WAMP, VirtualBox, Docker, or just a native local install of your needed tools.

For those who fear analysis-paralysis from so many choices, though, we've decided to start reviewing and green-lighting recommended tools that we've found work well. And the first local development tool we can recommend is Lando.

Lando is a Docker-based local development environment that grew out of Kalabox, a VirtualBox-based local dev tool for Drupal. Lando is much more flexible and lighter-weight than a virtual machine-based solution, and has direct support for a variety of systems including Drupal, Laravel, Backdrop, and WordPress. It even goes beyond PHP with support for Node.js, Python, and Ruby as well, just as we do.

Like Platform.sh, Lando is controlled by a YAML configuration file. Although, being Docker-based, it cannot directly mimic how a Platform.sh project works, it can approximate it reasonably well.

We've included a recommended Lando configuration file in our documentation. It's fairly straightforward and easy to adapt for your particular application. It's also possible to synchronize data from a Platform.sh environment to your local Lando instance in just a few short commands. Lando's own documentation provides more details on how to trick out your local system with whatever you may need.
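
As a rough sketch (hypothetical project name; see our documentation for the recommended file), a minimal .lando.yml for a Drupal 8 project could look like:

# .lando.yml — assumes the Drupal 8 recipe and a "web" docroot.
name: my-drupal8-project
recipe: drupal8
config:
  webroot: web
  php: '7.1'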

We still believe in allowing you to pick your own development workflow, so you don't have to change anything if you already have a workflow that works for you; if you want our advice, though, Lando is a solid option that should get you up and running locally in minutes, while Platform.sh handles all of your staging and production needs.

Oct 12 2017
Oct 12

An Account Manager writing a blog on something web related… it sounds scary, I know. But technically, I speak client, and I also really enjoy the web/digital side of my job. So, what I want to translate into non-technical speak for you is why your Content Management System (CMS in agency lingo) choice matters.

There are so many CMS options to choose from today: WordPress, Joomla, TYPO3, ExpressionEngine, Drupal… I could keep going… but not all CMSs are created equal. In the process of building or rebuilding your website, the design is important, but it’s the CMS that’s going to either make your life easy or cause you to scream and pull your hair out.

As you may know from some of our previous blogs, Drupal is our CMS of choice here at Texas Creative. The question I get asked the most when meeting with clients about a new website is, “What is Drupal going to do for me that something like *insert CMS of choice here* can’t?” Good question. Here are 4 things I have found to be true when it comes to using Drupal as your Content Management System.

1. It’s going to be easy for you and your team to update.

Working at an agency, sometimes we forget that not everyone is as comfortable as we are with technology. A lot of clients think that they need coding experience to make content updates or swap out an image on their website, but Drupal makes it easy for even the most tech-challenged of people to make updates without any coding knowledge.

The backend of Drupal is organized by content types and contains text fields/image upload fields for each content type, which makes it extremely simple to update. So while you may have a page with over 100 different pieces of information listed, editing one of those pieces is one click away and as simple as filling out a form.

2. It’s going to be easy for me to update.

I know what you’re saying at this point, “I don’t have the time nor do I want to make updates myself.” That’s OK, not all of our clients want to do it either. The great thing about Drupal is that it is easy for your Account Manager to make content, image and other minor updates for you. That means if you call with an edit, I can easily complete it without having to send it back to the web team, which makes for a shorter turnaround time. But if I can’t solve the problem for you, I can easily walk to the other side of the office and ask someone who can.

3. It can literally do anything.

You name it; we’ve probably built a site for it in Drupal. From selling high-end luggage products to cattle, Drupal has the flexibility to accommodate all of your website needs. But no matter how complex your site is, our team builds the CMS with the end user in mind, which allows our clients to make updates without the fear of breaking their site.

Here are some links to a few of the projects Drupal has helped us build:

Take Care of Texas

The University of Texas at Austin

The San Antonio Medical Foundation

4. It’s open source.

I’m slipping into technical speak here (don’t leave, come back), but basically all this means is that Drupal is one big group project. Because Drupal has no license fees, there are thousands of Drupal developers working to build new and better technology all the time. This matters to you as a client because your site is not only portable (aka you’re not handcuffed to one agency) but it is constantly being worked on, which leads to improvements and helps increase your site’s security.  

So, to sum all of this up for you (I told you I’d make this simple): it may seem like an intimidating subject, but YOUR CMS IS REALLY IMPORTANT. Whether you go with Drupal or not, you should feel comfortable working in the backend of your site. Your website should be working hard for your brand; you should not be working hard to maintain your website.

Oct 12 2017
Oct 12

Search Engine Optimization (SEO) might not be the first thing you think of when designing a new website, but building an optimized framework from the start will help you drive traffic to your site and keep it there. With our Drupal SEO checklist in hand, you can build an excellent website that draws customers from launch day. In brief, here is a bullet list of what to check before launch day; below, we discuss each point in more detail.

  • Check that all web pages have unique titles using the Page Title module

  • Check if XML Sitemap and Google News Sitemap are configured properly

  • Check if Redirect module is enabled and configured

  • Check if Global Redirect module is enabled and configured

  • Check that .htaccess redirects to the site with/without www

  • Check that the homepage title includes a slogan and is descriptive of the function of the site

  • Check if Meta Tags is filled with descriptive information

  • Check that OG tags are filled correctly and with descriptive information.

  • Check if site's information appears well when shared on Facebook

  • Check if Path aliases patterns are meaningful

  • Check if Google Analytics is enabled and configured

  • Check if Page Title module is enabled and configured

  • Check if Google News Sitemap is enabled and configured

  • Check if Site verification is enabled and configured

  • Check if Search 404 module is enabled and configured

Drupal SEO: 12 Things that Will Improve Your Site's Ranking

Check that all web pages have unique titles...

...and make sure to write them correctly. All of your pages should be easily identifiable to the end user. Not only should they have unique titles, they should have meaningful ones. Having multiple pages with the same title (like “Get in touch”, “Contact us” and “Make a booking”) will simply confuse your end users and search engine crawlers.

Not only do good page titles help customers who are already on your site, but they help with social sharing, and picking your site out of search engine results. Titles are the first element that any user will see, whether they come directly to your site, find it in a search engine, or see it shared on social media.

Writing good titles is extremely important, and having keywords in your title that match a user's search greatly improves the chances of them clicking on your page.

Ensuring all your pages have a unique name will help users navigate, boost your SEO ratings, and increase the chances that someone will type the right keywords into a search engine to bring them to your site.

You can set up unique page titles much more easily if you install the Drupal Page Title module.

Check if XML Sitemap and Google News Sitemap are configured properly

The XML Sitemap module creates a robot-friendly map of your site that Google and other search engines can crawl to categorise your website. There are a few settings you can alter at admin/config/search/xmlsitemap, and you can view the sitemap at http://yoursite.com/sitemap.xml.

You should configure XML Sitemap early in your site build for the best effect, but you can also alter the settings later on if needed.

Google News Sitemap offers a similar but distinct service that creates a Google-specific map, as the name suggests. These two modules work nicely side by side to make your site easy for search engines to crawl and index.

Please note that if your site contains AMPs, there is no need to create sitemaps for them. The rel=amphtml link is enough for Google to pick up on the accelerated mobile page version, which means you can easily gain traffic from Top Stories carousels and mobile search. Creating AMP pages on your Drupal site is easy with our step-by-step guide.

Check if Redirect module is enabled and configured

Redirect is a handy module for making sure users always make it to your site. It uses case-insensitive matching to help catch broken links with redirects and tracks how often users are hitting those redirects. You can use redirects to capture any broken links, set up promotional links, or simply capture typos users are entering when trying to access your site.

Check if Global Redirect module is enabled and configured

If you’re using Drupal 8 you can skip this one, because the functionality has been rolled into the Redirect module. Otherwise, install Global Redirect to work in tandem with Redirect to catch any broken links. Global Redirect will test all links with and without a trailing slash, ensure links are case-insensitive, and, if a link is truly broken, return the user to your home page rather than an ugly 404 page that decreases your site’s position in SERPs.

Check that .htaccess redirects to site with/without www

Some users attempting to visit your site will navigate to www.yoursite.com, while others will simply type yoursite.com. By setting up your site to handle either request you can be sure you won’t miss any visitors.
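
As a minimal sketch (assuming www.yoursite.com is your canonical domain), Drupal’s stock .htaccess ships with commented-out mod_rewrite rules you can adapt for this redirect:

# Redirect yoursite.com to www.yoursite.com (hypothetical domain).
RewriteEngine on
RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
RewriteRule ^ https://www.yoursite.com%{REQUEST_URI} [L,R=301]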

Check that the homepage includes a headline, logo and primary image, and that its title is descriptive of the function of the site

The headline as well as the slogan represent who you are as a business. Make your first impression a good one, as it will also be visible on search engines. This is a good opportunity to stack your website with SEO-friendly keywords, but don’t go overboard and sacrifice your image for it - keyword stuffing may not only decrease the trust index of your site, but also its conversion rates.

Ensure Metatags are filled with descriptive information

Writing SEO-optimized metatags is highly important, because they remain one of the top on-page ranking factors. Make sure to install the Metatag module on your site to have an easy, user-friendly interface for updating metadata. With the module installed, you can easily populate metadata with keywords, page descriptions, and more.

The Metatag module will also give you extra control over how your site appears when shared on Twitter or Facebook.

Check that OG tags are filled correctly and with descriptive information.

OG tags are metatags specifically designed to ensure your site communicates nicely with Facebook. By setting these tags correctly you will be able to control exactly how your site appears on Facebook, including what images and what taglines are used.
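
As a quick illustration (hypothetical values), correctly filled OG tags render as meta elements in the head of your pages:

<meta property="og:title" content="Your Page Title" />
<meta property="og:description" content="A short description shown when the page is shared." />
<meta property="og:image" content="https://yoursite.com/images/share.jpg" />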

Check if site's information appears well when shared on Facebook and Twitter

After configuring the Metatag module and OG tags, pop over to Facebook and make sure that your site shares the way you would like it to. It’s important to test this now, before users start sharing your site around.

Similarly try tweeting a couple of your pages to see how well your Twitter Cards come through. If you don’t want to show your site to your audience until you are sure it is set up properly, you can check Twitter Cards using the Card Validator.

For more information on configuring Twitter cards, check out the Twitter user guides.

Check if Path aliases patterns are meaningful

By default Drupal will set your URLs to node/123 - while this works great for the database back end, it doesn’t work well for your end users, or for search engines.

You can use the Pathauto module to create rules and patterns for your URLs that will significantly cut down on your maintenance times and simplify your site navigation.
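
For example, a hypothetical Pathauto pattern for article nodes using core tokens:

blog/[node:title]

With this pattern, node/123 is aliased to a readable URL such as /blog/drupal-seo-checklist.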

Check if Google Analytics is enabled and configured

While having Google Analytics configured won’t improve your SEO, it will give you all the data you need to understand where your users are coming from and how they behave once they hit your site.

Installing the Google Analytics module makes setting up and configuring Google Analytics a breeze.

Check if Site verification is enabled and configured

The Site verification module makes it easy to check the boxes that tell search engines your site is truly yours. Having your site verified will improve how search engines crawl your site and, for Google, will allow you to access private search data. With site verification you will receive better data and better search engine rankings for just a few minutes’ work.

Check if Search 404 module is enabled and configured

The Search 404 module is a saving grace for reducing your bounce rate and improving both your SEO and your customer experience. Instead of finding an “Error: Page not Found” in place of the content they were hoping for, users will be offered a search of your site based on the URL string. For example, if “www.yoursite.com/great-seo-tips” doesn’t exist, this module will automatically search your site for “Great SEO tips” and show users the results.

Bottom line

While SEO may seem like a tricky subject to wrap your head around, the basics are easy with the right modules and the right guidance. Drupal is a great content management system for building search engine optimized websites.

With our SEO checklist you can get off on the right foot, and here at Vardot we love educating our customers to build top quality websites. If you’re looking for even more ways to improve your site’s SEO, have a look at the SEO articles in our blog or get in touch with us.

Oct 12 2017
Oct 12

In my previous post, Modern Decoupling is More Performant, we discussed how saving HTTP round-trips has a very positive impact on performance. In particular, we demonstrated how the JSON API module could help your application by returning multiple entities in a single request. Doing so eliminates the need for making an individual request per entity. However, this is only possible when fetching entities, not when writing data and only if those entities are related to the entry point (a particular entity or collection).

Sometimes you can solve this problem by writing a custom resource in the back-end every time, but that can lead to many custom resources, which impacts maintainability and is tiresome. If your API is public and you don’t have prior knowledge of what the consumers are going to do with it, it’s not even possible to write these custom endpoints.

The Subrequests module completes that idea by allowing ANY set of requests to be aggregated together. It can aggregate them even when one of them depends on a previous response. The module works with any request; it's not limited to REST or any other constraint. For simplicity, all the examples here will make requests to JSON API.

Why Do We Need It?

The main concept of the Subrequests module is that instead of sending multiple requests to your Drupal instance, we send only a single request. In this master request, we provide the information about the requests we need to make in a JSON document. We call this document a blueprint.

A blueprint is a JSON document containing the instructions for Drupal to make all those requests in our name. The blueprint document contains a list of subrequest objects. Each subrequest object contains the information about a single request being aggregated in the blueprint.

Imagine that our consumer application has a decoupled editorial interface. This editorial interface contains a form to create an article. As part of the editorial experience, we want the form to create the article and a set of tags in the Drupal back-end.

Without using Subrequests, the consumer application should execute the following requests when the form is submitted:

  • Query Drupal to find the UUID for the tags vocabulary.
  • Query Drupal to find the UUID of the user, based on the username present in the editorial app.
  • Create the first tag in the form using the vocabulary UUID.
  • Create the second tag in the form using the vocabulary UUID.
  • Create the article in the form using the user UUID and the newly created tags.

We can query for the user and the vocabulary in parallel. Once that is done, and using the information in the vocabulary response, we can create the tag entities. Once those are created, we can finally create the article. In total, we would be making five requests at three sequential levels. And, this is not even a complex example!

Sequential requests introduce a performance penalty.

A JavaScript pseudo-code for the form submission handler could look like:

console.log('Article creation started…');
// Level 1: look up the vocabulary and the user in parallel.
Promise.all([
  httpRequest('GET', 'https://cms.contentacms.io/api/vocabularies?filter[vid-filter][condition][path]=vid&filter[vid-filter][condition][value]=tags'),
  httpRequest('GET', 'https://cms.contentacms.io/api/users?filter[admin][condition][path]=name&filter[admin][condition][value]=admin'),
])
  .then(res => {
    const [vocab, user] = res;
    // Level 2: create both tags using the vocabulary UUID, passing the
    // user through to the next step.
    return Promise.all([
      Promise.resolve(user),
      httpRequest('POST', 'https://cms.contentacms.io/api/tags', bodyForTag1, headers),
      httpRequest('POST', 'https://cms.contentacms.io/api/tags', bodyForTag2, headers),
    ]);
  })
  .then(res => {
    const [user, tag1, tag2] = res;
    // Level 3: create the article referencing the user and the new tags.
    const body = buildBodyForArticle(formData, user, tag1, tag2);
    return httpRequest('POST', 'https://cms.contentacms.io/api/articles', body, headers);
  })
  .then(() => {
    console.log('Article creation finished!');
  });

Using Subrequests

Our goal is to have JavaScript pseudo-code that looks like:

console.log('Article creation started…');
const blueprint = buildBlueprint(formData);
httpRequest('POST', 'https://cms.contentacms.io/api/subrequests?_format=json', blueprint, headers)
  .then(() => {
    console.log('Article creation finished!');
  });

We've reduced our application code to a single POST request that contains a blueprint in the request body. We have reduced the problem to the blueprint creation. That is a big improvement in the developer experience of consumer applications.

Sequential requests as processed by Subrequests avoid unnecessary round trips.

Parallel Requests

In our current task we need to perform two initial HTTP requests that can be run in parallel:

  • Query Drupal to find the UUID for the tags vocabulary.
  • Query Drupal to find the UUID of the user based on the username in the editorial app.

That translates to the following blueprint:

[
  {
    "requestId": "vocabulary",
    "action": "view",
    "uri": "/api/vocabularies?filter[vid-filter][condition][path]=vid&filter[vid-filter][condition][value]=tags",
    "headers": {"Accept": "application/vnd.api+json"}
  },
  {
    "requestId": "user",
    "action": "view",
    "uri": "/api/users?filter[admin][condition][path]=name&filter[admin][condition][value]=admin",
    "headers": {"Accept": "application/vnd.api+json"}
  }
]

For each subrequest, we can observe that we are providing four keys:

  • requestId: A string used to identify the subrequest. This is an arbitrary value generated by the consumer application.
  • action: Identifies the action being performed. A "view" action will generate a GET request, a "create" action will generate a POST request, etc.
  • uri: The URL where the subrequest will be sent.
  • headers: An object containing the headers specific to this subrequest.

The response to this blueprint (after adjusting the permissions in Drupal to view users and vocabularies) will return the response to both subrequests:

{
    "vocabulary": {
        "headers": {
            "content-id": ["<vocabulary>"],
            "status": [200]
        },
        "body": "{\"data\":[{\"type\":\"vocabularies\",\"id\":\"47ce8895-0df6-44a4-af43-9ef3b2a924dd\",\"attributes\":{\"status\":true,\"dependencies\":{\"module\":[\"recipes_magazin\"]},\"_core\":\"HJlsFfKP4PFHK1ub6QCSNFmzAnGiBG7tnx53eLK1lnE\",\"name\":\"Tags\",\"vid\":\"tags\",\"description\":\"Use tags to group articles on similar topics into categories.\",\"hierarchy\":0,\"weight\":0},\"links\":{\"self\":\"http:\\/\\/localhost\\/api\\/vocabularies\\/47ce8895-0df6-44a4-af43-9ef3b2a924dd\"}}],\"links\":{\"self\":\"http:\\/\\/localhost\\/api\\/vocabularies?filter%5Bvid-filter%5D%5Bcondition%5D%5Bpath%5D=vid\\u0026filter%5Bvid-filter%5D%5Bcondition%5D%5Bvalue%5D=tags\"}}"
    },
    "user": {
        "headers": {
            "content-id": ["<user>"],
            "status": [200]
        },
        "body": "{\"data\":[{\"type\":\"users\",\"id\":\"a0b7af80-e319-4271-899f-f151d3fbfc8e\",\"attributes\":{\"internalId\":1,\"name\":\"admin\",\"mail\":\"[email protected]\",\"timezone\":\"Europe\\/Madrid\",\"isActive\":true,\"createdAt\":\"2017-09-15T15:47:26+0200\",\"updatedAt\":\"2017-09-15T20:06:15+0200\",\"access\":1505565434,\"lastLogin\":\"2017-09-15T20:06:07+0200\"},\"relationships\":{\"roles\":{\"data\":[]}},\"links\":{\"self\":\"http:\\/\\/localhost\\/api\\/users\\/a0b7af80-e319-4271-899f-f151d3fbfc8e\"}}],\"links\":{\"self\":\"http:\\/\\/localhost\\/api\\/users?filter%5Badmin%5D%5Bcondition%5D%5Bpath%5D=name\\u0026filter%5Badmin%5D%5Bcondition%5D%5Bvalue%5D=admin\"}}"
    }
}

In the (simplified) response above we can see that for each subrequest, we have one key in the response object. That key is the same as our requestId in the blueprint. Each one of the subresponses contains the information about the response headers and the response body. Note how the response body is an escaped JSON object.
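
For instance, a consumer-side sketch (hypothetical variable names) of extracting the vocabulary UUID from the master response:

// The master response body is a JSON object keyed by requestId.
const subresponses = JSON.parse(masterResponseBody);
// Each subresponse body is itself an escaped JSON string, so parse it again.
const vocabularyDoc = JSON.parse(subresponses.vocabulary.body);
const vocabularyUuid = vocabularyDoc.data[0].id;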

This blueprint is not sufficient to create an article with two tags, but it's a great start. Let's build on top of that to create the tags and the article.

Dependent Requests

The next task we need to execute is the creation of the two tag entities:

  • Create the first tag in the form using the vocabulary UUID.
  • Create the second tag in the form using the vocabulary UUID.

To do this, we will need to expand the blueprint. However, we don't know the vocabulary UUID at the time we are writing the blueprint. What we do know is that the vocabulary UUID will be in the subresponse to the vocabulary subrequest. In particular, we can find the UUID in data[0].id.

We will use that information to create a blueprint that can create tags. Since we don't know the actual value of the vocabulary UUID, we will use a replacement token. At some point, during the blueprint processing by Drupal, the token will be resolved to the actual UUID value.

Replacement Tokens

We can use replacement tokens anywhere in the body or the URI of our subrequests. For those to be resolved, a token needs to be formatted in the following way:

{{<requestId>.<"body"|"headers">@<json-path-expression>}}

In particular, the replacement token for our vocabulary UUID will be:

{{vocabulary.body@$.data[0].id}}

What this replacement says is:

  1. Use the subresponse for the vocabulary subrequest.
  2. Take the body from that subresponse.
  3. Extract the string under data[0].id, by executing the JSON Path expression $.data[0].id. You can execute any JSON Path expression as long as it returns a string. JSON Path is a very powerful way to extract data from an arbitrary JSON object, in our case the body in subresponse to the vocabulary subrequest.

This is what our blueprint looks like after adding the subrequests to create the tag entities. Note the presence of the replacement tokens:

[
  {
    "requestId": "vocabulary",
    "action": "view",
    "uri": "/api/vocabularies?filter[vid-filter][condition][path]=vid&filter[vid-filter][condition][value]=tags",
    "headers": {"Accept": "application/vnd.api+json"}
  },
  {
    "requestId": "user",
    "action": "view",
    "uri": "/api/users?filter[admin][condition][path]=name&filter[admin][condition][value]=admin",
    "headers": {"Accept": "application/vnd.api+json"}
  },
  {
    "action": "create",
    "requestId": "tags-1",
    "body": "{\"data\":{\"type\":\"tags\",\"attributes\":{\"name\":\"My First Tag\"},\"relationships\":{\"vocabulary\":{\"data\":{\"type\":\"vocabularies\",\"id\":\"{{vocabulary.body@$.data[0].id}}\"}}}}}",
    "uri": "/api/tags",
    "headers": {"Content-Type": "application/vnd.api+json"},
    "waitFor": ["vocabulary"]
  },
  {
    "action": "create",
    "requestId": "tags-2",
    "body": "{\"data\":{\"type\":\"tags\",\"attributes\":{\"name\":\"My Second Tag\",\"description\":null},\"relationships\":{\"vocabulary\":{\"data\":{\"type\":\"vocabularies\",\"id\":\"{{vocabulary.body@$.data[0].id}}\"}}}}}",
    "uri": "/api/tags",
    "headers": {"Content-Type": "application/vnd.api+json"},
    "waitFor": ["vocabulary"]
  }
]

Note that to use a replacement token in a subrequest, we need to add a dependency on the subresponse that contains the information. That's why we added the waitFor key in our tag subrequests.

Finishing the Blueprint

Subrequests process

Using the same principles that we used for the tags we can add the subrequest for:

  • Create the article in the form using the user UUID and the newly created tags.

That will leave our completed blueprint as:

[
  {
    "requestId": "vocabulary",
    "action": "view",
    "uri": "/api/vocabularies?filter[vid-filter][condition][path]=vid&filter[vid-filter][condition][value]=tags",
    "headers": {"Accept": "application/vnd.api+json"}
  },
  {
    "requestId": "user",
    "action": "view",
    "uri": "/api/users?filter[admin][condition][path]=name&filter[admin][condition][value]=admin",
    "headers": {"Accept": "application/vnd.api+json"}
  },
  {
    "action": "create",
    "requestId": "tags-1",
    "body": "{\"data\":{\"type\":\"tags\",\"attributes\":{\"name\":\"My First Tag\"},\"relationships\":{\"vocabulary\":{\"data\":{\"type\":\"vocabularies\",\"id\":\"{{vocabulary.body@$.data[0].id}}\"}}}}}",
    "uri": "/api/tags",
    "headers": {"Content-Type": "application/vnd.api+json"},
    "waitFor": ["vocabulary"]
  },
  {
    "action": "create",
    "requestId": "tags-2",
    "body": "{\"data\":{\"type\":\"tags\",\"attributes\":{\"name\":\"My Second Tag\",\"description\":null},\"relationships\":{\"vocabulary\":{\"data\":{\"type\":\"vocabularies\",\"id\":\"{{vocabulary.body@$.data[0].id}}\"}}}}}",
    "uri": "/api/tags",
    "headers": {"Content-Type": "application/vnd.api+json"},
    "waitFor": ["vocabulary"]
  },
  {
    "action": "create",
    "requestId": "article",
    "headers": {"Content-Type": "application/vnd.api+json"},
    "body": "{\"data\":{\"type\":\"articles\",\"attributes\":{\"body\":\"Custom value\",\"default_langcode\":\"1\",\"langcode\":\"en\",\"promote\":\"1\",\"status\":\"1\",\"sticky\":\"0\",\"title\":\"Article Created via Subrequests!\"},\"relationships\":{\"tags\":{\"data\":[{\"id\":\"{{tags-1.body@$.data.id}}\",\"type\":\"tags\"},{\"id\":\"{{tags-2.body@$.data.id}}\",\"type\":\"tags\"}]},\"type\":{\"data\":{\"id\":\"article\",\"type\":\"contentTypes\"}},\"owner\":{\"data\":{\"id\":\"{{user.body@$.data[0].id}}\",\"type\":\"users\"}}}}}",
    "uri": "/api/articles",
    "waitFor": ["user", "tags-1", "tags-2"]
  }
]

More Powerful Replacements

Imagine that instead of creating an article for a single user, we wanted to create an article for each one of the users on the site. We cannot write a simple blueprint, like the one above, since we don't know how many users there are in the Drupal site. Hence, we cannot write an article creation subrequest for each user.

To solve this problem we can tweak the user subrequest, so instead of returning a single user it returns all the users in the site:

[
  …
  {
    "requestId": "user",
    "action": "view",
    "uri": "/api/users",
    "headers": {"Accept": "application/vnd.api+json"}
  },
  …
]

Then in our replacement tokens, we can write a JSON Path expression that will return a list of user UUIDs, instead of a single string. Subrequests will accept JSON Path expressions that return either strings or an array of strings for the replacement tokens.

In our article creation subrequest we will need to change {{user.body@$.data[0].id}} to {{user.body@$.data[*].id}}. The Subrequests module will create a duplicate of the article subrequest for each replacement item. In our case, this has the effect of creating a copy of the article creation subrequest for each available user in the user subresponse.

The Final Response

The modified blueprint that generates one article per user will have a response like:

Six articles are returned from a single subrequest.

We can see how a single subrequest can generate n subresponses, and we can use each one of those to generate n other subresponses, etc. This highlights how powerful this technique is. In addition, we have seen that we can combine different types of operations. In our example, we mixed GET and POST in a single blueprint (to get the vocabulary and create the new tags).

Conclusion

Subrequests is a great way to fetch or write many resources in a single HTTP request. This allows us to improve performance significantly while maintaining almost the same flexibility that custom code provides.

Further Your Understanding

If you want to know more about the blueprint format you can read the specification. The Subrequests module comes with a JSON schema that you can use to validate your blueprint. You can find the schema here.

The hero image was downloaded from Frankenphotos and used without modification under a CC BY 3.0 license.

Oct 12 2017
Oct 12

With cybercrime on the rise, securing data in Drupal has become a hot topic for developers and project stakeholders alike.

In our latest webinar, we were joined by three Drupal security experts from Townsend Security, Lockr and Mediacurrent who shared their approach for building a secure groundwork to protect site data in Drupal.

Top 4 Takeaways 

1. An introduction to "security by design" and how businesses should be thinking about security.

2. The right tools to conduct a site security audit. 

3. A deep dive into data encryption and key management.

4. Resources to improve security, and how modules like Guardr can help protect private data in Drupal.
View the Webinar

To learn more about this topic, check out their presentation slides below or watch the webinar recording.

About the Speakers

Mark Shropshire, Open Source Security Lead at Mediacurrent
Mark brings 20 years of experience leading technical teams to his role. He is also the maintainer of the Guardr Drupal security module suite.

Chris Teitzel, Founder/CEO at Lockr 
Chris has been working in Drupal for almost 8 years. During that time he has worked on projects spanning the globe in front-end design, e-commerce and security. His passion is for making technology accessible to all skill sets. 

Luke Probasco, Drupal GM at Townsend Security 
Luke manages Drupal business for Townsend Security. He's a DrupalCon, Camp, and Summit speaker, security professional and music enthusiast.

Additional Resources
Mediacurrent Dropcast Episode 34: Security! 
Security Dropcast Transcript | Townsend Security Blog 
Raising the Security Bar with Drupal | Drupalcon Presentation Recording 
Guardr for Drupal 8: Meeting Enterprise Security Requirements | Mediacurrent Blog

Oct 11 2017
Oct 11

Four months ago, I shared that Acquia was on the verge of a shift equivalent to the decision to launch Acquia Fields and Drupal Gardens in 2008. As we entered Acquia's second decade, we outlined a goal to move from content management to data-driven customer journeys. Today, Acquia announced two new products that support this mission: Acquia Journey and Acquia Digital Asset Manager (DAM).

Last year on my blog, I shared a video that demonstrated what is possible with cross-channel user experiences and Drupal. We showed a sample supermarket chain called Gourmet Market. Gourmet Market wants its customers to not only shop online using its website, but to also use Amazon Echo or push notifications to do business with them. The Gourmet Market prototype showed an omnichannel customer experience that is both online and offline, in store and at home, and across multiple digital touchpoints. The Gourmet Market demo video was real, but required manual development and lacked easy customization. Today, the launch of Acquia Journey and Acquia DAM makes building these kind of customer experiences a lot easier. It marks an important milestone in Acquia's history, as it will accelerate our transition from content management to data-driven customer journeys.

A continuous journey across multiple digital touch points and devices

Introducing Acquia Journey

I've written a great deal about the Big Reverse of the Web, which describes the transition from "pull-based" delivery of the web, meaning we visit websites, to a "push-based" delivery, meaning the web comes to us. The Big Reverse forces a major re-architecture of the web to bring the right information, to the right person, at the right time, in the right context.

The Big Reverse also ushers in the shift from B2C to B2One, where organizations develop a one-to-one relationship with their customers, and contextual and personalized interactions are the norm. In the future, every organization will have to rethink how it interacts with customers.

Successfully delivering a B2One experience requires an understanding of your user's journey and matching the right information or service to the user's context. This alone is no easy feat, and many marketers and other digital experience builders often get frustrated with the challenge of rebuilding customer experiences. For example, although organizations can create brilliant campaigns and high-value content, it's difficult to effectively disseminate marketing efforts across multiple channels. When channels, data and marketing software act in different silos, it's nearly impossible to build a seamless customer experience. The inability to connect customer profiles and journey maps with various marketing tools can result in unsatisfied customers, failed conversion rates, and unrealized growth.

An animation showing Acquia's journey building solution

Acquia Journey delivers on this challenge by enabling marketers to build data-driven customer journeys. It allows marketers to easily map, assemble, orchestrate and manage customer experiences like the one we showed in our Gourmet Market prototype.

It's somewhat difficult to explain Acquia Journey in words — probably similar to trying to explain what a content management system does to someone who has never used one before. Acquia Journey provides a single interface to define and evaluate customer journeys across multiple interaction points. It combines a flowchart-style journey mapping tool with unified customer profiles and an automated decision engine. Rules-based triggers and logic select and deliver the best-next action for engaging customers.

One of the strengths of Acquia Journey is that it integrates many different technologies, from marketing and advertising technologies to CRM tools and commerce platforms. This makes it possible to quickly assemble powerful and complex customer journeys.

Implementing getBestNextExperience() creates both customer and business value

Acquia Journey will simplify how organizations deliver the "best next experience" for the customer. Providing users with the experience they not only want, but expect will increase conversion rates, grow brand awareness, and accelerate revenue. The ability for organizations to build more relevant user experiences not only aligns with our customers' needs but will enable them to make the biggest impact possible for their customers.

Acquia's evolving product offering also puts control of user data and experience back in the hands of the organization, instead of walled gardens. This is a step toward uniting the Open Web.

Introducing Acquia Digital Asset Manager (DAM)

Digital asset management systems have been around for a long time, and were originally hosted through on-premise servers. Today, most organizations have abandoned on-premise or do-it-yourself DAM solutions. After listening to our customers, it became clear that large organizations are seeking a digital asset management solution that centralizes control of creative assets for the entire company.

Many organizations lack a single source of truth when it comes to managing digital assets. This challenge has been amplified as the number of assets has rapidly increased in a world with more devices, more channels, more campaigns, and more personalized and contextualized experiences. Acquia DAM provides a centralized repository for managing all rich media assets, including photos, videos, PDFs, and other corporate documents. Creative and marketing teams can upload and manage files in Acquia DAM, which can then be shared across the organization. Graphic designers, marketers and web managers all have a hand in translating creative concepts into experiences for their customers. With Acquia DAM, every team can rely on one dedicated application to gather requirements, share drafts, consolidate feedback and collect approvals for high-value marketing assets.

On top of Drupal's asset and media management capabilities, Acquia DAM provides various specialized functionality, such as automatic transcoding of assets upon download, image and video mark-up during approval workflows, and automated tagging for images using machine learning and image recognition.

A screenshot of Acquia's Digital Asset Management solution. By using a drag-and-drop interface on Acquia DAM, employees can easily publish approved assets in addition to searching the repository for what they need.

Acquia DAM seamlessly integrates with both Drupal 7 and Drupal 8 (using Drupal's "media entities"). In addition to Drupal, Acquia DAM is built to integrate with the entirety of the Acquia Platform. This includes Acquia Lift and Acquia Journey, which means that any asset managed in the Acquia DAM repository can be utilized to create personalized experiences across multiple Drupal sites. Additionally, through a REST API, Acquia DAM can also be integrated with other marketing technologies. For example, Acquia DAM supports designers with a plug-in to Adobe Creative Cloud, which integrates with Photoshop, InDesign and Illustrator.

Acquia's roadmap to data-driven customer journeys

Some of the most important market trends in digital for 2017

Throughout Acquia's first decade, we've been primarily focused on providing our customers with the tools and services necessary to scale and succeed with content management. We've been very successful in helping our customers scale and manage Drupal and cloud solutions. Drupal will remain a critical component of our customers' success, and we will continue to honor our history as committed supporters of open source, in addition to investing in Drupal's future.

However, many of our customers need more than content management to be digital winners. The ability to orchestrate customer experiences using content, user data, decisioning systems, analytics and more will be essential to an organization's success in the future. Acquia Journey and Acquia DAM will remove the complexity from how organizations build modern digital experiences and customer journeys. We believe that expanding our platform will be good not only for Acquia, but for our partners, the Drupal community, and our customers.

Acquia's product strategy for 2017 and beyond
Oct 11 2017
Oct 11

It seems like just yesterday that we held DrupalCon in Dublin, and now we're back with our annual Drupal Camp Dublin.

This year's Drupal Camp Dublin has a great line-up of speakers from Ireland and abroad, covering such topics as:

  • Building multi-lingual, multi-region websites (Stella Power)
  • Working as a developer with attention-deficit disorder (ADD) (Levi Govaerts)
  • Planning for disruptions (Jochen Lillich)
  • Migrating from Drupal 4 to 5 to 6 to 7 to 8 (Alan Burke)
  • Automating deployments (Luis Rodriguez)
  • Working with Webform, Commerce, Paragraphs, Display Suite and more (Chandeep Khosa)
  • Live debugging a site that's giving issues (Anthony Lindsay)
  • Deploy with Fabric, and test driven development (Oliver Davies)
  • Design in the Browser (yours truly, me, Mark Conroy)
  • Teaching web development at third level (Ruairi O'Reilly)
  • The QA process (Daniel Shaw)
  • Getting started with Docker (Ed Crompton)
  • The new theme coming to Drupal core (Mark Conroy)

And then there are some socials, our Drupal Ireland AGM, at least one other talk not yet announced, and ... you get the idea.

The full schedule is available on our website. There are some tickets left (only €20); get them before they are all gone.

Oct 11 2017
Oct 11

by David Snopek on October 11, 2017 - 1:37pm

Today, there was a Moderately Critical security advisory for an Access Bypass vulnerability in the netFORUM Authentication module for Drupal 7:

netFORUM Authentication - Moderately critical - Access Bypass - SA-CONTRIB-2017-077

The module bypassed the protections on the Drupal 7 user login form that deter brute-force attempts to log in to the site, and so constituted an Access Bypass vulnerability by making login less secure when this module is used.

However, Drupal 6 (including Pressflow 6) doesn't have these same protections for the user login form, so using this module is no less secure than using vanilla Drupal 6. Of course, these protections could be added to the module, and while that would be great security hardening, their absence doesn't represent a vulnerability - only a weakness which is also present (and widely known) in Drupal 6 core.

If you'd like all your Drupal 6 modules to receive security updates and have the fixes deployed the same day they're released, please check out our D6LTS plans.

Note: if you use the myDropWizard module (totally free!), you'll be alerted to these and any future security updates, and will be able to use drush to install them (even though they won't necessarily have a release on Drupal.org).

Oct 11 2017
Oct 11

Mediacurrent has been selected as finalists for the 2017 Acquia Engage Awards in the categories of Financial Services, Travel and Tourism, and Digital Experience. These awards recognize the amazing sites and digital experiences that leading digital agencies are building with the Acquia Platform.

Check out our three finalists!
Category: Travel and Tourism

Finalist: Brand USA
URL: visittheusa.com

Leveraging the powerful personalization features of Acquia Lift, Mediacurrent partnered with the organization behind the official travel site of the USA to launch a new multilingual site on Drupal 8. Read more about the project in our case study.

Category: Financial Services

Finalist: MagMutual Insurance
URL: magmutual.com

After a complete transformation of their digital experience, MagMutual saw an increase in acquisition, engagement, and conversions. "I really like that Mediacurrent has a broad skill set - including SEO, UX/UI and Strategy,” said MagMutual CIO Sallie Graves. Read more about the project in our case study.

Category: Best Digital Experience

Finalist: The Commonwealth of Massachusetts
URL: mass.gov

Mediacurrent is among the four leading digital agencies that are working to transform Mass.gov, improving the user experience for residents of the Commonwealth of Massachusetts to interact with state government online.

“Building ambitious digital experiences has been a central focus in the Drupal community this year — a vision that we at Mediacurrent have fully embraced. These industry awards are a great validation of our team’s creativity, strategy, and technical strengths that drive some of the world’s most notable Drupal sites.”

- Dave Terry,  Mediacurrent co-founder
Acquia Awards History

In 2015, Weather.com won for best partner site of the year, and in 2016, our work on Curtiss-Wright earned a finalist nod in the High Technology category.

For more on the 2017 finalists, see our press release.
Meet us at Engage

Winners will be announced next at Acquia Engage in Boston (October 16-18, 2017). As a sponsor of the Engage event, several Mediacurrent team members - Dave Terry, Josh Linard, Megan Hofmeyer and Kevin Basarab - will be in attendance. We’re looking forward to learning about the challenges and opportunities that Drupal users will face in the coming year, and anticipate some lively discussion around decoupling, personalization, user experience, and more.

Will you be at Engage? Contact us to set up a time to network. See you in Boston!

Oct 11 2017
Oct 11

This blog has been re-posted with permission from Dries Buytaert's blog. Please leave your comments on the original post.

Drupal looking to adopt React

Last week at DrupalCon Vienna, I proposed adding a modern JavaScript framework to Drupal core. After the keynote, I met with core committers, framework managers, JavaScript subsystem maintainers, and JavaScript experts in the Drupal community to discuss next steps. In this blog post, I look back on how things have evolved since the last time we explored adding a new JavaScript framework to Drupal core two years ago, and on what we believe are the next steps after DrupalCon Vienna.

As a group, we agreed that we had learned a lot from watching the JavaScript community grow and change since our initial exploration. We agreed that today, React would be the most promising option given its expansive adoption by developers, its unopinionated and component-based nature, and its suitability for building new Drupal interfaces in an incremental way. Today, I'm formally proposing that the Drupal community adopt React, after discussion and experimentation has taken place.

Two years ago, it was premature to pick a JavaScript framework

Three years ago, I developed several convictions related to "headless Drupal" or "decoupled Drupal". I believed that:

  1. More and more organizations wanted a headless Drupal so they can use a modern JavaScript framework to build application-like experiences.
  2. Drupal's authoring and site building experience could be improved by using a more modern JavaScript framework.
  3. JavaScript and Node.js were going to take the world by storm and that we would be smart to increase the amount of JavaScript expertise in our community.

(For the purposes of this blog post, I use the term "framework" to include both full MV* frameworks such as Angular, and also view-only libraries such as React combined piecemeal with additional libraries for managing routing, states, etc.)

By September 2015, I had built up enough conviction to write several long blog posts about these views (post 1, post 2, post 3). I felt we could accomplish all three things by adding a JavaScript framework to Drupal core. After careful analysis, I recommended that we consider React, Ember and Angular. My first choice was Ember, because I had concerns about a patent clause in Facebook's open-source license (since removed) and because Angular 2 was not yet in a stable release.

At the time, the Drupal community didn't like the idea of picking a JavaScript framework. The overwhelming reactions were these: it's too early to tell which JavaScript framework is going to win, the risk of picking the wrong JavaScript framework is too big, picking a single framework would cause us to lose users that favor other frameworks, etc. In addition, there were a lot of different preferences for a wide variety of JavaScript frameworks. While I'd have preferred to make a bold move, the community's concerns were valid.

Focusing on Drupal's web services instead

By May of 2016, after listening to the community, I changed my approach; instead of adding a specific JavaScript framework to Drupal, I decided we should double down on improving Drupal's web service APIs. Instead of being opinionated about what JavaScript framework to use, we would allow people to use their JavaScript framework of choice.

I did a deep dive on the state of Drupal's web services in early 2016 and helped define various next steps (post 1, post 2, post 3). I asked a few of the OCTO team members to focus on improving Drupal 8's web services APIs; funded improvements to Drupal core's REST API, as well as JSON API, GraphQL and OpenAPI; supported the creation of Waterwheel projects to help bootstrap an ecosystem of JavaScript front-end integrations; and most recently supported the development of Reservoir, a Drupal distribution for headless Drupal. There is also a lot of innovation coming from the community with lots of work on the Contenta distribution, JSON API, GraphQL, and more.

The end result? Drupal's web service APIs have progressed significantly the past year. Ed Faulkner of Ember told us: "I'm impressed by how fast Drupal made lots of progress with its REST API and the JSON API contrib module!". It's a good sign when a core maintainer of one of the leading JavaScript frameworks acknowledges Drupal's progress.

The current state of JavaScript in Drupal

Looking back, I'm glad we decided to focus first on improving Drupal's web services APIs; we discovered that there was a lot of work left to stabilize them. Cleanly integrating a JavaScript framework with Drupal would have been challenging 18 months ago. While there is still more work to be done, Drupal 8's available web service APIs have matured significantly.

Furthermore, by not committing to a specific framework, we are seeing Drupal developers explore a range of JavaScript frameworks and members of multiple JavaScript framework communities consuming Drupal's web services. I've seen Drupal 8 used as a content repository behind Angular, Ember, React, Vue, and other JavaScript frameworks. Very cool!

There is a lot to like about how Drupal's web service APIs matured and how we've seen Drupal integrated with a variety of different frameworks. But there is also no denying that not having a JavaScript framework in core came with certain tradeoffs:

  1. It created a barrier for significantly leveling up the Drupal community's JavaScript skills. In my opinion, we still lack sufficient JavaScript expertise among Drupal core contributors. While we do have JavaScript experts working hard to maintain and improve our existing JavaScript code, I would love to see more experts join that team.
  2. It made it harder to accelerate certain improvements to Drupal's authoring and site building experience.
  3. It made it harder to demonstrate how new best practices and certain JavaScript approaches could be leveraged and extended by core and contributed modules to create new Drupal features.

One trend we are now seeing is that traditional MV* frameworks are giving way to component libraries; most people seem to want a way to compose interfaces and interactions with reusable components (e.g. libraries like React, Vue, Polymer, and Glimmer) rather than use a framework with a heavy focus on MV* workflows (e.g. frameworks like Angular and Ember). This means that my original recommendation of Ember needs to be revisited.

Several years later, we still don't know what JavaScript framework will win, if any, and I'm willing to bet that waiting two more years won't give us any more clarity. JavaScript frameworks will continue to evolve and take new shapes. Picking a single one will always be difficult and to some degree "premature". That said, I see React having the most momentum today.

My recommendations at DrupalCon Vienna

Given that it's been almost two years since I last suggested adding a JavaScript framework to core, I decided to bring the topic back in my DrupalCon Vienna keynote presentation. Prior to my keynote, there had been some renewed excitement and momentum behind the idea. Two years later, here is what I recommended we do next:

  • Invest more in Drupal's API-first initiative. In 2017, there is no denying that decoupled architectures and headless Drupal will be a big part of our future. We need to keep investing in Drupal's web service APIs. At a minimum, we should expand Drupal's web service APIs and standardize on JSON API. Separately, we need to examine how to give API consumers more access to and control over Drupal's capabilities.
  • Embrace all JavaScript frameworks for building Drupal-powered applications. We should give developers the flexibility to use their JavaScript framework of choice when building front-end applications on top of Drupal — so they can use the right tool for the job. The fact that you can front Drupal with Ember, Angular, Vue, React, and others is a great feature. We should also invest in expanding the Waterwheel ecosystem so we have SDKs and references for all these frameworks.
  • Pick a framework for Drupal's own administrative user interfaces. Drupal should pick a JavaScript framework for its own administrative interface. I'm not suggesting we abandon our stable base of PHP code; I'm just suggesting that we leverage JavaScript for the things that JavaScript is great at by moving relevant parts of our code from PHP to JavaScript. Specifically, Drupal's authoring and site building experience could benefit from user experience improvements. A JavaScript framework could make our content modeling, content listing, and configuration tools faster and more application-like by using instantaneous feedback rather than submitting form after form. Furthermore, using a decoupled administrative interface would allow us to dogfood our own web service APIs.
  • Let's start small by redesigning and rebuilding one or two features. Instead of rewriting the entirety of Drupal's administrative user interfaces, let's pick one or two features, and rewrite their UIs using a preselected JavaScript framework. This allows us to learn more about the pros and cons, allows us to dogfood some of our own APIs, and if we ultimately need to switch to another JavaScript framework or approach, it won't be very painful to rewrite or roll the changes back.

Selecting a JavaScript framework for Drupal's administrative UIs

In my keynote, I proposed a new strategic initiative to test and research how Drupal's administrative UX could be improved by using a JavaScript framework. The feedback was very positive.

As a first step, we have to choose which JavaScript framework will be used as part of the research. Following the keynote, we had several meetings at DrupalCon Vienna to discuss the proposed initiative with core committers, all of the JavaScript subsystem maintainers, as well as developers with real-world experience building decoupled applications using Drupal's APIs.

There was unanimous agreement that:

  1. Adding a JavaScript framework to Drupal core is a good idea.
  2. We want to have sufficient real-use experience to make a final decision prior to 8.6.0's development period (Q1 2018). To start, the Watchdog page would be the least intrusive interface to rebuild and would give us important insights before kicking off work on more complex interfaces.
  3. While a few people named alternative options, React was our preferred option, by far, due to its high degree of adoption, component-based and unopinionated nature, and its potential to make Drupal developers' skills more future-proof.
  4. This adoption should be carried out in a limited and incremental way so that the decision is easily reversible if better approaches come later on.

We created an issue on the Drupal core queue to discuss this more.

Conclusion

Drupal supporting different JavaScript front ends

Drupal should support a variety of JavaScript libraries on the user-facing front end while relying on a single shared framework as a standard across Drupal administrative interfaces.

In short, I continue to believe that adopting more JavaScript is important for the future of Drupal. My original recommendation to include a modern JavaScript framework (or JavaScript libraries) for Drupal's administrative user interfaces still stands. I believe we should allow developers to use their JavaScript framework of choice to build front-end applications on top of Drupal and that we can start small with one or two administrative user interfaces.

After meeting with core maintainers, JavaScript subsystem maintainers, and framework managers at DrupalCon Vienna, I believe that React is the right direction to move for Drupal's administrative interfaces, but we encourage everyone in the community to discuss our recommendation. Doing so would allow us to make Drupal easier to use for site builders and content creators in an incremental and reversible way, keep Drupal developers' skills relevant in an increasingly JavaScript-driven world, and move us ahead with modern tools for building user interfaces.

Special thanks to Preston So for contributions to this blog post and to Matt Grill, Wim Leers, Jason Enter, Gábor Hojtsy, and Alex Bronstein for their feedback during the writing process.
