Oct 25 2019


We're sad to miss DrupalCon Europe in Amsterdam next week (October 28-31, 2019). But which talks would we attend if we were going? Amber and I combed through the Interactive Program and created a list of what looks intriguing at the next DrupalCon. Will you be there? You might want to check out our picks.

Joe's picks

I'm not going to be at DrupalCon Amsterdam. First time I've missed a DrupalCon since 2011! And I'm bummed about missing the chance to catch up with friends, meet new people, and keep up with everything the Drupal community is doing. If, however, I were in Amsterdam, these are some of the sessions that would be on my calendar.

Amber's picks

Amber: A big plus one from me on Joe's picks, and here are a few more I would check out if I were there.

  • Autosave and Concurrent editing (conflict resolution) in Drupal 8

    Training around content editing can be tricky because each site has a different configuration and internal process for creating, editing, publishing, and archiving content. But there are some universal, well-known problems with editing content in Drupal, and "losing changes before saving" and "concurrent editing conflicts" are two of them. If you're at the frustration stage with these problems and are looking for potential solutions, check out this session, which introduces two modules that address them.

  • Configuration Management Initiative 2.0 updates

    Now that Configuration Management in Drupal 8 has been used on real sites for a while, some limitations and challenges have emerged. In this session, you'll get an overview of these issues, how the Configuration Management Initiative 2.0 seeks to address them, and how you can structure your sites today for minimal disruption in the future. I'll definitely be checking out the recording of this one to make sure we're making the best recommendations possible in our tutorials on Configuration Management.

  • Initiative Leads Keynote

    Attend this keynote to get updates from initiative leads and learn how you can get involved with core contribution for these coordinated efforts. I'll be cheering from the internet sidelines for my fellow core contributors!

  • (Paid) Training: Drupal + Gatsby

    Our training friend Suzanne Dergacheva is offering a training on Drupal + Gatsby. If I could, I would totally register for this training workshop. Suzanne is a great instructor, and the topic is very hot right now and will, I think, continue to be.

Oct 15 2019

I began my DrupalEasy journey with the greatest of intentions. Jumping in head first, I upgraded to Windows 10 Pro, set up a new local development environment — I highly recommend DDEV for its power and flexibility, and because it allows development teams to use Docker in their workflow — and reacquainted myself with Composer and the command line. If there was a roll, I was on it.

Then week 2 happened. What I learned then is that unfortunately, having a teacher doesn’t automatically make the path to Drupal proficiency a smooth, easy ascent to greatness — at least not for me. The greatest challenge that I encountered, and totally underestimated, was the whole concept of time.

Now, if you’re anything like me, you’re learning Drupal while also working a full-time job. This was fine when I was teaching myself on my own time. But with an actual course like DrupalEasy, I totally underestimated the time commitment of scheduled class times and assignments. While the homework is optional, I have to at least attempt it to get the most out of the course.

In week 2, I had a vacation, a wedding, and a team retreat on my calendar. To say I fell behind in the class is an understatement. On top of catching up with email and work tasks, I now had to find time to watch hours of video lecture and complete the homework assignments. The class was learning without me and I felt totally frazzled.

I realized I had to get focused — to get really intentional with my time and plan, plan, plan. It was the only way to balance Drupal, work, and life. Thankfully, both Michael (my instructor) and Addi (my boss) were extremely supportive. I also knew there was a gap week scheduled that would allow me time to catch up. (Hello gap week!) Soon, I’ll be right back in line with all of my classmates as if I had been there all along.

So if your Drupal journey is anything like mine, know there’ll be bumps along the way. Mine was time. Just don’t let a bump on your path become a deterrent. It’s okay to fall behind or get a bit lost. Just don’t stop. There’s hope. Your “gap week” is approaching.

Sep 25 2019

One of our members recently asked this question in support:

Wonder if you have, or can suggest, a resource to learn how to access, authenticate (via OAuth preferably) and process JSON data from an external API?

In trying to answer the question I realized that I first needed to know more about what they are trying to accomplish. Like with most things Drupal, there's more than one right way to accomplish a task. Choosing a solution requires understanding what options are available and the pros and cons of each. This got me thinking about the various ways one could consume data from an API and display it using Drupal 8.

The problem at a high level

You've got data in an external service, available via a REST API, that you need to display on one or more pages in a Drupal site. Perhaps accessing that data requires authentication via OAuth2 or an API token. There are numerous ways to go about it. Which one should you choose? And how should you get started?

Some questions to ask yourself before you start:

  • How much data are we talking about?
  • How frequently does the data you're consuming change, and how important is it that it's up-to-date? Are real-time updates required? Or is a short lag acceptable?
  • Does the data being consumed from the API need to be incorporated into the Drupal-generated pages' HTML output? How does it impact SEO?
  • How much control does a Drupal site administrator need to have over how the data is displayed?

While I'm certain this list is not exhaustive, here are some of the approaches I'm aware of:

  • Use the Migrate API
  • Create a Views Query Plugin
  • Write a custom service that uses Guzzle or a similar PHP SDK installed via Composer
  • Use JavaScript

I'll explain each one a little more, and provide some ideas about what you'll need to learn in order to implement them.

Option 1: Use the Migrate API

Use the Migrate API combined with the HTTP Fetchers in the Migrate Plus module to ingest data from an API and turn it into Drupal nodes (or any entity type).

In this scenario you're dealing with a data set that doesn't change frequently (a few times per day, maybe), and/or it's okay for the data displayed on the site to lag a little behind what's in the external service. This approach is somewhat analogous to using a static site generator like Gatsby or Sculpin, which requires a build to occur in order for the site to be updated.

In this case that build step is running your migration(s). The result is a Drupal entity for each imported record, no different than if a user had created a new node by filling out a form on your site. In addition, you get the complete extract, transform, load pipeline of the Migrate API to manipulate the ingested data as necessary.
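If you need transformation logic beyond what the built-in process plugins offer, a small custom process plugin is usually enough. Here's a minimal sketch, assuming a hypothetical example_migrate module and plugin ID; in a real migration you would reference the plugin from the process section of your migration YAML:

    <?php

    namespace Drupal\example_migrate\Plugin\migrate\process;

    use Drupal\migrate\MigrateExecutableInterface;
    use Drupal\migrate\ProcessPluginBase;
    use Drupal\migrate\Row;

    /**
     * Cleans up a title string coming from the external API.
     *
     * @MigrateProcessPlugin(
     *   id = "example_clean_title"
     * )
     */
    class ExampleCleanTitle extends ProcessPluginBase {

      /**
       * {@inheritdoc}
       */
      public function transform($value, MigrateExecutableInterface $migrate_executable, Row $row, $destination_property) {
        // Strip any markup the API includes and collapse extra whitespace.
        return trim(preg_replace('/\s+/', ' ', strip_tags((string) $value)));
      }

    }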

Pros:

  • If you've worked with Migrate API before, this path likely provides the least friction
  • Data is persisted into Drupal entities, which opens up the ability to use Views, Layout Builder, Field Formatters, and all the other powerful features of Drupal's Entity & Field APIs
  • You can use Migrate API process plugins to transform data before it's used by Drupal
  • Migrate Plus can handle common forms of authentication like OAuth 2 and HTTP Basic Auth

Cons:

  • Requires a build step to make new or updated data available
  • Data duplication; you've now got an entity in Drupal that is a clone of some other existing data
  • Probably not the best approach for really large data sets

Learn more about this approach:

Option 2: Create a Views Query Plugin

Write a Views Query Plugin that teaches Views how to access data from a remote API. Then use Views to create various displays of that data on your site.

The biggest advantage of this approach is that you get the power of Views for building displays without the need to persist the data into Drupal as entities. This approach is also well suited to scenarios where an existing module already integrates with the third-party API and provides a service you can use to communicate with it.
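To give a sense of the shape of this approach, here's a minimal sketch of a query plugin class. The module name, plugin ID, and the example_remote.client service are all hypothetical, and a complete implementation also needs a hook_views_data() implementation that tells Views to use this query plugin for your "base table":

    <?php

    namespace Drupal\example_remote\Plugin\views\query;

    use Drupal\views\Plugin\views\query\QueryPluginBase;
    use Drupal\views\ResultRow;
    use Drupal\views\ViewExecutable;

    /**
     * Views query plugin that reads rows from a remote API.
     *
     * @ViewsQuery(
     *   id = "example_remote_api",
     *   title = @Translation("Remote API"),
     *   help = @Translation("Query a remote REST API instead of the database.")
     * )
     */
    class RemoteApi extends QueryPluginBase {

      /**
       * {@inheritdoc}
       */
      public function execute(ViewExecutable $view) {
        // A hypothetical service that wraps the HTTP calls to the external API.
        $records = \Drupal::service('example_remote.client')->fetchAll();
        $index = 0;
        foreach ($records as $record) {
          $row = new ResultRow((array) $record);
          $row->index = $index++;
          $view->result[] = $row;
        }
      }

      // Views calls these while building a SQL query; a remote query has no
      // tables or fields to register, so they can be simple pass-throughs.
      public function ensureTable($table, $relationship = NULL) {
        return '';
      }

      public function addField($table, $field, $alias = '', $params = []) {
        return $field;
      }

    }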

Pros:

  • You, or perhaps more importantly your editorial team, can use Views to build a UI for displaying and filtering the data
  • Displays built with Views integrate well with Drupal's Layout Builder and Blocks systems
  • Data is not persisted in Drupal and is queried fresh for each page view
  • Can use Views caching to help improve performance and reduce the need to make API calls for every page load

Cons:

  • Requires a lot of custom code that is very specific to this one use-case
  • Requires in-depth understanding of the underpinnings of the Views API
  • Doesn't allow you to take advantage of other tools that interact with the Entity API

Learn more about this approach:

Option 3: Write a Service using Guzzle (or similar)

Write a Guzzle client, or use an existing PHP SDK to consume API data.

Guzzle is included in Drupal 8 as a dependency, which makes it an attractive and accessible utility for module developers. But you could also use another similar low-level PHP HTTP client library, and add it to your project as a dependency via Composer.

Guzzle is a PHP HTTP client that makes it easy to send HTTP requests and trivial to integrate with web services. --Guzzle Documentation

If you want the most control over how the data is consumed, and how it's displayed, you can use Guzzle to consume data from an API and then write one or more Controllers or Plugins for displaying that data in Drupal. Perhaps a page controller that provides a full page view of the data, and a block plugin that provides a summary view.

This approach could be combined with the Views Query Plugin approach above, especially if there's not an existing module that provides a means to communicate with the API. In this scenario, you could create a service that is a wrapper around Guzzle for accessing the API, then use that service to retrieve the data you expose to Views.

If you need to do anything other than GET from the API in question (POST, PUT, etc.), you'll almost certainly need to use this approach. The two methods above deal only with consuming data from an API.
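As a rough sketch of what such a wrapper service might look like (the class, endpoint URL, and token handling are all hypothetical; Drupal's http_client service is a ready-to-use Guzzle client you can inject):

    <?php

    namespace Drupal\example_remote;

    use Drupal\Component\Serialization\Json;
    use GuzzleHttp\ClientInterface;

    /**
     * Thin wrapper around Drupal's Guzzle client for one external API.
     *
     * Registered in example_remote.services.yml with '@http_client' as its only
     * argument, so controllers, block plugins, or a Views query plugin can
     * inject it.
     */
    class ExampleApiClient {

      /**
       * The HTTP client.
       *
       * @var \GuzzleHttp\ClientInterface
       */
      protected $httpClient;

      public function __construct(ClientInterface $http_client) {
        $this->httpClient = $http_client;
      }

      /**
       * Fetches records from the (hypothetical) /items endpoint.
       */
      public function fetchItems($api_token) {
        $response = $this->httpClient->request('GET', 'https://api.example.com/items', [
          'headers' => ['Authorization' => 'Bearer ' . $api_token],
        ]);
        return Json::decode((string) $response->getBody());
      }

    }

A page controller or block plugin could then call fetchItems() and hand the result to a render array or Twig template.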

Pros:

  • Able to leverage any existing PHP SDK available for the external API
  • Some of the custom code you write could be reused outside of Drupal
  • Greatest level of control over what is consumed, and how the consumed data is handled
  • Large ecosystem of Guzzle middleware for handling common tasks like OAuth authentication

Cons:

  • Little to no integration with Drupal's existing tools like Views and others that are tailored to work with Entities

Learn more about this approach:

Option 4: JavaScript

Use client-side JavaScript to query the API and display the returned data.

Another approach would be to write JavaScript that does the work of obtaining and displaying data from the API, then integrate that JavaScript into Drupal as an asset library. A common example of something like this is a weather widget that displays the current weather for a user, or a Twitter widget that displays a list of the most recent Tweets for a specific hashtag.

You could also create a corresponding Drupal module with an admin settings form that allows a user to configure various aspects of the JavaScript application, then expose those configuration values using Drupal's JavaScript settings API.
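For example, a small module could attach the widget's asset library and expose the saved configuration to the client via drupalSettings. A hedged sketch, assuming a hypothetical example_widget module, library, and config object:

    <?php

    /**
     * Implements hook_page_attachments().
     */
    function example_widget_page_attachments(array &$attachments) {
      // Attach the asset library that contains the widget's JavaScript.
      $attachments['#attached']['library'][] = 'example_widget/widget';

      // Expose values saved by the (hypothetical) admin settings form so the
      // JavaScript can read them from drupalSettings.exampleWidget.
      $config = \Drupal::config('example_widget.settings');
      $attachments['#attached']['drupalSettings']['exampleWidget'] = [
        'endpoint' => $config->get('endpoint'),
        'refreshInterval' => $config->get('refresh_interval'),
      ];
    }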

While it's the least Drupal-y way of solving this problem, in many cases this might also be the easiest -- especially if the content you're consuming from the API is for display purposes only and there is no reason that Drupal needs to be aware of it.

Pros:

  • Data is consumed and displayed entirely by the client, making it easier to keep up-to-date in real time.
  • Existing services often provide JavaScript widgets for displaying data from their system in real time that are virtually plug-and-play.
  • Code can be used independent of Drupal.

Cons:

  • No server-side rendering, so any part of the page populated with data from the external API will not be visible to clients that don't support JavaScript. This also has potential SEO ramifications.
  • You can't query the API directly if it requires an API key that you need to keep secret (e.g., because the key has access to POST/PUT/DELETE resources). In that case, you would need server-side code to act as a proxy between the API and the JavaScript frontend.
  • Drupal has no knowledge of the data that's being consumed.
  • Drupal has little control over how the data is consumed, or how it's displayed.

Learn more about this approach:

Honorary mention: Feeds module

The Feeds module is another popular method for consuming data from an API, and serves as an alternative to the Migrate API approach outlined above. I've not personally used it with Drupal 8 yet, and would likely reach for the Migrate API because I have much more experience with it. Feeds is probably worth at least taking a look at, though.

Conclusion

There are a lot of different ways to approach the problem of consuming data from an API with Drupal. Picking the right one requires first understanding your specific use case, your data, and the level of control site administrators will need over how it's consumed and displayed. Keep in mind that turning the data into Drupal entities can open up a whole bunch of possibilities for integration with other aspects of the Drupal ecosystem.

What other ways can you think of that someone might go about solving the problem of consuming data from an API with Drupal?

May 04 2018

Twin Cities DrupalCamp is always my favorite Drupal event of the year, both because it's close to home, and because I get to be involved in planning and organizing. I have the opportunity to attend quite a few Drupal events every year, and there's just something extra special in doing so not only as a participant but as a volunteer. You get to see a different side of things, and engage with people in different ways. It's a great chance to spend some time working on something you care about with friends.

This year the camp is taking place from June 7th-10th in Minneapolis, and it's shaping up to be a good one. As is tradition, the Thursday before the official camp starts is training day. This year, we've got three great options for you to choose from, including a workshop by Amber and me about theming Drupal 8.

We actually did the same workshop at Twin Cities DrupalCamp last year, and it sold out. So we thought we would offer it again this year to give a new set of people the chance to learn to make beautiful things with Drupal 8.

Learn Drupal 8 Theming

You can sign up for our Drupal 8 Theming workshop to get the low-down on all of the goodies that come with the powerful theme system. The workshop is free for anyone who's registered for the camp, but seats are limited, so don't wait.

This workshop will familiarize front-end developers with Drupal 8's theme system. Whether your goal is to theme your personal site, pass the Acquia front-end developer certification, or upgrade your skills for a job, we provide students with a solid foundation on which to start, and enough knowledge to continue to practice and learn on their own. Whether you're creating an entirely new theme from scratch or making nips and tucks to an existing design, understanding how Drupal's theme system works -- or having someone on your team who does -- is essential.

Here's what we're planning to cover in this all-day workshop:

  • How the Drupal theme layer relates to the rest of the system
  • Common theming terminology and processes
  • How to override any of Drupal's HTML output
  • The relationship between base themes and sub themes
  • Everything you need to know about Twig when working with Drupal templates
  • How to add both custom and third-party CSS and JavaScript libraries
  • Tools for introspecting and debugging a theme
  • Tips and tricks for using common front-end development tools like CSS preprocessors and task runners in conjunction with Drupal

We do have a limited number of seats for this workshop, so sign up soon to reserve your spot.

Sign up for Theming Drupal 8!

Additional trainings

If that's not what you're after, there are two additional trainings to choose from, put on by our friends at Agaric and Backdrop CMS.

Drupal 8 Content Migrations, presented by Agaric: This training is aimed at site builders, who will learn to combine various core and contributed modules and write YAML files to accomplish content migrations. No prior experience with the Migrate module is required.

Intro to Backdrop CMS, presented by Backdrop CMS: This introductory training will cover the basics of creating and administering a website with Backdrop CMS.

Come say hi

Even if you're not attending one of the trainings above, we would still love to say hi if you're going to be at Twin Cities DrupalCamp this year. Amber and I will be around throughout the event, we'll have stickers, and we would love to catch up. So if you see us, say hello.

Feb 27 2018

In November 2017, Symfony released its latest major version: Symfony 4! That's great! Right? But what does this mean for Drupal? And more importantly, what does it mean for your Drupal projects?

The Symfony release cycle

First, a little bit of backstory. Symfony has a predictable release cycle: a new minor version (e.g. 3.3) is released every 6 months, and a new major version is released every 2 years. The result is that each major version consists of 5 minor versions: 3.0, 3.1, 3.2, 3.3, 3.4, and then 4.0 is released. This repeats over and over again. The last minor version of a major (e.g. 3.4, or 4.4 in the future) is a long-term support release. You can even see when any version of Symfony will be released on its roadmap.

But, there's something very interesting about the last minor version (3.4) and the next major version (4.0)... they're identical! What?! Yes, the only difference between Symfony 3.4 and Symfony 4.0 is that any deprecated features have been removed. This means that all the features found in 4.0 also exist in Symfony 3.4. That's important for Drupal.

Drupal and Symfony versions

So, what version of Symfony does Drupal use? Actually, that's not quite the correct question to ask. Instead, ask: what version of Symfony does Drupal require and allow?

The current version of Drupal, 8.4, requires Symfony 3.2.8 or higher, but less than 4.0. In practice, this probably means that your Drupal 8.4 project is using the latest Symfony 3.2.* version. But you could technically upgrade your Symfony dependencies to version 3.4.

The upcoming version, Drupal 8.5, will require Symfony 3.4 or higher, but still less than 4.0. This means that a Drupal 8.5 project will have all the features of Symfony 4.0, because those features are the same as in Symfony 3.4.

So what does the Symfony 4.0 release mean for Drupal? In the short term, nothing! Drupal 8.5 will use Symfony 3.4, which means that you get all of the new features and optimizations.

Allowing Symfony 4.x in Drupal

But, eventually, Drupal will want to upgrade to Symfony 4. When will this happen? It's still being debated. However, Drupal has adopted Symfony's awesome deprecation policy, which includes a backwards-compatibility promise: only changes that are backwards-compatible with previous minor versions (e.g. 8.4 to 8.5) are allowed. This means that the first version of Drupal to require Symfony 4 will likely be Drupal 9.0. When will that come out? I don't think that's known yet. But thanks to the new deprecation policy, the upgrade will be much easier than in the past... practically boring by comparison.
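In practice, the deprecation policy means an old code path keeps working for the rest of the major version while emitting a notice that points to its replacement. A small, hypothetical sketch of the pattern both Symfony and Drupal use:

    <?php

    /**
     * Returns a greeting.
     *
     * @deprecated Use mymodule_new_greeting() instead. This wrapper will be
     *   removed in the next major release.
     */
    function mymodule_old_greeting($name) {
      // Emit the standard deprecation notice without breaking existing callers.
      @trigger_error('mymodule_old_greeting() is deprecated. Use mymodule_new_greeting() instead.', E_USER_DEPRECATED);
      return mymodule_new_greeting($name);
    }

    /**
     * The replacement implementation.
     */
    function mymodule_new_greeting($name) {
      return 'Hello, ' . $name . '!';
    }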

So what happens when Symfony 4.1 comes out in May 2018 with new features you want? Will a Drupal 8 project ever be able to use Symfony 4? Maybe. Nothing is guaranteed, but it's possible that, in Drupal 8.6, Drupal will use Symfony 3.4, but allow version 4. If that happens, you could choose to upgrade one or all of the Symfony libraries to 4.

Oh, and also remember one important thing: Symfony is actually a set of about 40 independent libraries (called components), and Drupal core relies on only some of them. If you want to use another Symfony component (e.g. symfony/workflow), you're free to install it at any version.

The tl;dr: new features, less breakage

The tl;dr is this: Drupal is actively upgrading Symfony in core to bring in new features, optimizations and bug fixes. But, unlike in the past (say, Drupal 7 to Drupal 8), this is being done in a way that prevents backwards compatibility breaks. In other words, it's a lot of goodness.

Oh, and if you want to learn Symfony 4 (it's a lot of fun), you can check out our new, free screencasts here on Drupalize.Me or over at KnpUniversity:

Happy Symfonying! Happy Drupaling!

Feb 27 2018

Before reading this, check out these other posts:

After reading all of these I have lots of thoughts and feelings, and I don't yet know what I think the best path forward is. But I'm happy to see this conversation being highlighted and the continued discussion of how we can make Drupal's documentation even better. To that end, I think we can look to the User Guide project as an example we can learn from.

Learn from past successes

I want to point to the Drupal 8 User Guide project as an example of what can be accomplished through a documentation initiative. It wasn't ever an "official" initiative, but there was a lot of community involvement and excitement around it nonetheless, as well as more coordination and collective effort towards a goal than we typically see with regards to work on Drupal's documentation. While this particular guide only covers one aspect of Drupal, I believe it can serve as a good example for possible future documentation initiatives.

For reference, the readable version of the guide is here: https://www.drupal.org/docs/user_guide/en/index.html, and the project is located here: https://www.drupal.org/project/user_guide.

You can get some history of how this project was started, and how it evolved over time, by watching these two presentations:

We set out to solve a bunch of existing issues:

  • No overall plan
  • Limited scope
  • Lack of peer review
  • Lack of copy editing
  • Tools that don't facilitate the governance model we wanted to impose
  • The desire to have Drupal's documentation translated
  • Guidelines, and conventions, for writers, and a process for enforcing them
  • etc.

Things we learned that could be applied elsewhere:

Start with a plan

Once we knew we wanted to write the guide, and felt like we had some buy-in from the community, the first thing we did was make a plan. Jennifer and I met in person, and drafted an initial outline covering exactly what we felt the guide should contain. Then we shared that for feedback. Having this outline allowed us to know what was in scope, track the progress of the initiative, and get other people involved in a meaningful way (running a documentation sprint is a lot easier if you know what you want people to write).

In addition to the outline, we also defined a process that every page in the guide would go through.

  1. Initial draft
  2. Review: does it match our scope and guidelines?
  3. Technical review: do the instructions work?
  4. Copy editing
  5. Finalize screenshots
  6. Ready for publication

Spreadsheet with rows for user guide topics and columns representing phases in the review process. Cells indicate the progress of each topic as it moves through each phase.

Again, this allowed us to track progress, and helped with recruiting people because we had clearly-defined tasks for people to work on. We could also define that certain steps (copy editing, publication) could only be undertaken by specific people in order to allow for consistent quality control.

Use version control

While maybe not as friendly for someone who just wants to make a drive-by edit to a documentation page, using Git as the canonical source for the content of the guide has proven extremely valuable.

  • Limit who can commit, ensuring that all content is vetted by the same procedure prior to being added
  • Allow for better collaboration between multiple people working on the same content
  • Facilitates a standardized review process
  • Having a patch + review process helped attract contributors who might not otherwise participate. Many people are hesitant to edit a free-for-all page because they're concerned that they're not "doing it right." When they know there is a friendly review process in place, they are emboldened.
  • Allows for opening issues and discussing changes before they're made (this is huge in comparison to the wiki-like nature of drupal.org)
  • Allows for maintaining translations. Once a page has been translated, we have a way to track that the English version has changed and thus that the translated copies require an update.
  • You can have a version of the documentation that matches a version of Drupal.

It does raise the barrier to entry for contribution. However, in my experience I generally feel the trade-offs are worth it.

Give people credit

A nice side effect of the use of version control in this case is that we can give people a commit credit for the work they've done. For better or worse, this is an important aspect of the contribution process. Writing documentation is often a thankless task, and we want to help elevate those that are contributing.

In addition to commit credits we also maintain an ATTRIBUTIONS.txt file for the guide, and individual attributions with each page.

Limit scope and define your audience

The user guide has a defined scope and audience.

From the guide:

This guide was written mainly for people with minimal knowledge of the Drupal content management system. The topics will help them become skilled at installing, administering, site building, and/or maintaining the content of a Drupal-based website.

This allowed us to make critical decisions about what to include, what was too much, and where we maybe needed more information.

Additionally, writing documentation is one thing. Keeping it up-to-date is a whole other beast. Knowing the scope of what your documentation covers makes this easier.

Keeping the scope of what the guide covers limited allowed us to, well... finish the guide.

Oversight and governance are important

As Dries said in his post:

It's hard to write world-class documentation by committee without good governance...

While the idea of a free-for-all wiki where anyone can come along and help with updates to the documentation certainly has its merits, it is also a promoter of sprawl and can lead to vastly inconsistent quality. In the end, Jennifer and I didn't write all that much of the content. Instead, we helped others figure out where and how they could get involved, ensured that processes were followed by acting as gatekeepers, and worked hard to identify issues and adapt our plan in order to address them. I believe this allowed the guide to be completed.

Going forward we can help to identify areas that need improvement, facilitate translation teams who are doing the hard work of translating the content, and throughout all of this ensure consistent quality. But this only works because from the beginning it was made clear that this isn't a free-for-all, and there is a process.

The combination of a clear definition of scope and governance gives us the authority to say, "Thanks, but no thanks." With the existing wiki-like documentation there's no clear authority, which I believe leads to duplication, unnecessary content, and people feeling like they can't update or fix content written by someone else for fear of breaking some unwritten rules. Anyone wanting to improve things by deleting or rewriting is left to contend with rebuttals from the original author, with no real recourse other than to get into a battle or back off and leave things as-is. Clear scope and governance help solve these issues.

Automate!

Because we have a defined scope and a set of strict formatting guidelines, we are able to automate the process of creating screenshots for the guide. Almost every page in the guide contains multiple screenshots. Keeping those up-to-date with changes to Drupal core would be arduous -- which really means it wouldn't get done. Having an automated way to generate the majority of these ensures that when a new minor version of Drupal 8 is released, we can just update all the screenshots. It also means we can generate screenshots for translated versions of the guide that show the Drupal UI with alternative languages installed.

Without being able to do this I'm not sure we would have decided to add screenshots to the whole guide. It's impossible to overstate how much harder it is to maintain things like images and videos in comparison to text.

And, as a side-effect, this process serves as a sort of functional test suite for the guide. It often catches changes in Drupal core that require us to upgrade the step-by-step instructions in the guide.

Be prepared to explain, again and again, why we're taking this approach to writing documentation. After many years of wiki-style free-for-all on Drupal.org, there are a lot of people who push back against imposing more oversight and control. Taking the time to explain the benefits of an approach like this, and helping people see how they can still contribute, can be tedious, but it's worth it.

Videos

Additionally, Drupalize.Me is currently working on creating videos to complement the content of the user guide -- a task that we couldn't possibly take on without all of the above structures being in place. You can read more about the effort in Adding Free Videos to the Drupal 8 User Guide.

Recap

I hope that we can use some of the experience gained writing the Drupal 8 User Guide to help inform future decisions about how we create and maintain documentation for all of Drupal, including:

  • Defining a clear plan and scope for what should be included
  • Implementing a process that facilitates peer review and oversight
  • Evolving the governance of documentation in order to empower people with the authority to make decisions
  • Creating tooling that helps us better maintain existing content

That, and of course, empowering more people to get involved in creating high-quality documentation that matches people's expectations and makes Drupal look good.

I would love to hear your thoughts about adopting lessons learned from the user guide either in the comments here, or on one of the posts linked above.

Feb 19 2018

The interest in creating decoupled/headless/API-first Drupal sites has been growing for a while. As the interest grows, and more sites are implementing this architecture, there is a growing list of articles and discussions about it. Essentially, with a decoupled site you have a backend that provides an editorial interface for creating and storing content, and an API for retrieving it. This is combined with one or more frontend clients that can display that content where and how you would like. Drupal is a great backend for these kinds of sites, and React is the most popular JavaScript framework being used for frontends. While Drupal and React are both well-established technologies with a wealth of documentation, the interaction between the two, and best practices around that interaction, are still developing.

There has also been a push for Drupal core to adopt a modern JavaScript framework with the end goal of improving Drupal's user experience (UX). React is the current leader in this initiative. So it's not a bad idea for any Drupal developer to at least be familiar with what React is and why it has been taking the web by storm.

Lullabot Education (the company that runs Drupalize.Me) has been exploring the world of JavaScript in general, and React specifically, and we've been pulling together resources. As part of this process, we've put together a new site, React for Drupal, that curates the best of our findings. If decoupled Drupal and/or React are topics you want to learn more about and keep current with, this is a great place to start. We'll also be keeping the site up-to-date as best practices emerge or change and we'll be working to help fill gaps in the current knowledge and documentation.

We hope this is a useful resource to both the Drupal and React communities. If you have any suggestions or comments about what we're covering and what's missing, please let us know! Check out React for Drupal to sign up for regular updates in the future.

Feb 14 2018

Drupal 8 User Guide

We've just released a new free guide on our site, the Drupal 8 User Guide, to help our members--and anyone else--with minimal existing knowledge of Drupal get started building something quickly. It's a re-publication of the one already available on Drupal.org, with the addition of embedded videos.

I want to share a little bit about why we chose to republish existing content instead of creating new materials, why we've opted to add video to the user guide, and why we're giving it all away for free.

What is the User Guide project?

The Drupal 8 User Guide project consists of about 100 pages of written content that provides an introduction to Drupal for newcomers.

From the guide's preface:

This guide was written mainly for people with minimal knowledge of the Drupal content management system. The topics will help them become skilled at installing, administering, site building, and/or maintaining the content of a Drupal-based website. The guide is also aimed at people who already have some experience with a current or past version of Drupal, and want to expand the range of their skills and knowledge or update them to the current version.

Its content is written and maintained by the Drupal community. It is published and freely available on Drupal.org, and it's licensed under Creative Commons CC BY-SA 2.0.

When it came time for us to start planning the intro-level content we wanted to include on our site, we opted to make use of this existing resource. Drupalize.Me has a long history of involvement with the project. I put forth the initial proposal at DrupalCon LA, helped to subsequently refine it into the current version, and am one of the current maintainers. Amber Matz helped with some of the editorial process, and we created the graphics used in the example site, licensed under Creative Commons for use in the guide.

Why republish the user guide?

  • "A good introduction to Drupal 8" is one of the more common requests we get from our members.
  • The text is already written and licensed Creative Commons. So it's a great head start for us and allows us to complete things faster without also essentially duplicating quality content that is already available elsewhere.
  • It's really high quality. Given our involvement with the project since the beginning, we already know that it's as good or better than anything we might write ourselves.
  • We can do double-duty with our time, and benefit both our site and help improve the official user guide project at the same time.
  • The content of the guide is already organized in a way that is similar to how we like to break things up: short concept tutorials that introduce a new idea, followed by one or more task tutorials that demonstrate the new concepts in use. So it fits well into our existing architecture.
  • Our site has some unique features that our members appreciate that Drupal.org doesn't currently have. For example, tracking which tutorials you've already read so you can more easily pick up where you left off last time or adding things to your queue for future watching or reading.

We're super excited about this and feel like it's a big win for Drupalize.Me, our members, and the Drupal community as a whole.

Adding Video to the User Guide

One thing that we feel the current iteration of the user guide project is missing is video, and we want to help fix that. So we recorded videos for all of the task tutorials in the guide, are making them available under the Creative Commons license, and are publishing them all for free on both our site and our YouTube channel.

Why video?

  • Different people learn in different ways. Some people are more visual learners, and for them, being able to watch as someone else navigates the steps required to complete a task is more helpful than either reading instructions or looking at screenshots.
  • Video can also be beneficial for auditory learners.
  • Video allows the user to see important elements of the UI that may not be covered by screenshots.
  • Some people prefer watching a video over reading a page of text.

The downside of video is that it's harder than text to produce, and it requires some specialized knowledge that can make it harder for volunteers to create and maintain. Producing video is something we've gotten really good at over the years, and we know first-hand the difficulty of producing and updating high-quality video content. When talking about our own content and the work we do on a daily basis, we often state that, "It's easier to patch a text file than a video."

Additionally, we've learned from experience that when it comes to video, people tend to expect a highly polished and consistent format. It can be jarring to switch frequently from one presenter to another, or distracting from the learning experience when different screencasters use different browsers or different terminal configurations. This is by no means impossible for a volunteer team to accomplish, but it's absolutely easier for a team with experience and the relevant resources.

For these reasons, and because we're firm believers that when Drupal does better we all do better, we're working to contribute all the videos we created back to the original Drupal 8 User Guide project. Our hope is that by contributing them back to the community more people can get the chance to learn from them. And, that by also using them on Drupalize.Me we can continue to help keep them up-to-date and accurate for future versions of Drupal.

If you've got thoughts about how, or if, these videos should be included in the guide see this issue on Drupal.org.

What's next?

Going forward, we would like to create and contribute videos to accompany the concept tutorials in the user guide, although we don't yet have a timeline for that work. Additionally, with this baseline information in place, we'll begin working on expanding the Drupalize.Me tutorial library to go more in-depth into topics like content types, views, and user management that the user guide introduces.

Get started learning Drupal 8 today with the Drupal 8 User Guide, now with videos!

Dec 14 2017

Here at Drupalize.Me we spend a lot of time looking at our company, poking at our policies and processes, and deliberately working to be a better company in the world. For the last 2 years, since we became our own company, we've talked about wanting to make our employee handbook public, but we hadn't taken the time to get it done. We review our handbook at least annually, and as part of the 2017 review and update process we decided to also make sure it was in shape to share with anyone who wants to take a look. We've finally completed the review and editing, and we have made the handbook available on GitHub. (Note that our company name is actually Lullabot Education; Drupalize.Me is just the well-known brand of our main offering.) Anyone can read the handbook, fork it for their own use, or provide comments and suggestions through issues and pull requests.

Why make it public?

We certainly aren't the first company to share our employee handbook. There are examples from many other companies out there, like Basecamp and GitLab. We decided to share our handbook for 2 main reasons. The first is so that people who interact with us--customers, partners, communities, and potential employees--can have a more complete picture of who we are and what we are working towards. The values and culture that we cultivate are important to every aspect of the company, not just internal employee processes. We feel our handbook reflects the things we care about and bring to our work every day, and it makes sense to allow everyone who is a part of our work to see where we're coming from.

The second reason is simply that we love to share. Our main job is to share as we learn in the web tech world. We're also learning as we grow in the business world, and see no reason to not share our experience there as well. We hope that others may find something useful in our handbook as they work on their own company and we'd love to have conversations with other people who think about these things.

Please check out the handbook, let us know your thoughts, and feel free to ask questions, either in a GitHub issue or here in this blog's comments.

Dec 11 2017

Thank you for casting your votes! The poll is now closed.

One of our favorite things to do at Drupal community events is in-person training. There's just something special—and motivating—about getting to teach people face-to-face rather than from the other side of a computer screen. In 2017 we developed and offered a Theming Drupal 8 workshop at DrupalCon, Twin Cities DrupalCamp, MidCamp, and BADCamp. It was super popular and filled up every time we did it. So this year we're considering developing another workshop that we can offer at events we're fortunate enough to be able to attend. But we're torn on what to cover! What are you most interested in learning?

Our current top contenders are:

  • Creating Modern Web Services APIs with Drupal: Building on our Web Services in Drupal 8 series, we would teach you how to use Drupal as the backend for your API; popular modules and how to configure them; and best practices for important topics regarding the architecture of your API, like presentation versus content, security, and documentation. In addition, we would look at how consumers can interact with the API that we build. We won’t have time to get into any specific frameworks, but would instead demonstrate how any HTTP client could retrieve information from Drupal, leaving the specific implementation up to you. This would be a great first step for anyone wanting to get started with decoupled Drupal.
  • JavaScript for Drupal Developers: This workshop would build on the JavaScript portions of our Drupal 8 Theming Guide, and integrate additional content from experts in the community regarding current developments in the Drupal+JavaScript ecosystem. The intent would be to teach people who are already familiar with JavaScript basics about the various ways in which JavaScript can be used with Drupal themes and modules. We'd cover Drupal's existing JavaScript API, using ES6 with Drupal, and integrating modern JavaScript toolchains and frameworks like React into an existing Drupal module or theme. This would be a great first step for developers who want to level up their JavaScript game, as well as help prepare for the eventual inclusion of more and more JavaScript in Drupal.

Both of the workshops would follow the same pattern as the Theming Drupal 8 workshop we've been doing for the last year. They would include a combination of short lectures, hands-on exercises, and discussion time to help keep attendees engaged and maximize the amount of knowledge we can share in a single class.

As much as we wish we had the time and resources to run both, it's not realistic. So if you were to participate in a Drupal community event in 2018 and could choose between attending these two workshops, which would you choose? Cast your vote in the poll below!

Thank you for casting your votes! The poll is now closed.

Sep 07 2017

Today the release candidate (RC) for Drupal 8.4 came out. This is the final review stage for the next version and a good opportunity to see what changes are coming when the new release goes public in less than 4 weeks, on October 4th. (See the full release schedule.) There are definitely some important things to note in this release. You can read the full list of goodies in the release notes for the original alpha release and the newest 8.4.0-rc1 release notes for this RC. Here we highlight some of the more interesting changes. We'll get into more details about the new version in October, but you can start getting excited now.

  • Five(!) modules moved from experimental status to full, stable status in 8.4. That's a bunch of cool new tools in core! (Note that Content Moderation is still experimental, but it is very close to stable. The core dev team is still working very hard to get it moved to stable for the 8.4 release.) Our new stable modules are:
    • Workflows
    • Layout Discovery
    • DateTime Range
    • Inline Form Errors
    • Media (which is stable, but hidden in the UI for now)
  • Symfony has been updated to 3.2 (learn more about Symfony)
  • jQuery updated to version 3 and jQuery UI to 1.12 (learn more about JavaScript in Drupal)
  • Adoption of new, stricter coding standards for JavaScript (learn more about Drupal Coding Standards)
  • Internet Explorer 9 and 10 support dropped
  • More improvements to the Migrate Drupal and Migrate Drupal UI experimental modules, but they are still at alpha stability.

For Drush users, note that versions earlier than 8.1.12 will not work with Drupal 8.4. To avoid fatal errors you will need to update Drush to 8.1.12 before using it to update to Drupal 8.4.

We are in our final tutorial review and update process, so you can expect our Drupal 8 tutorials to be correct once the new release comes out. You can always double-check the status of our tutorial compatibility by looking for the Drupal version check mark.

Aug 23 2017

Today, we're happy to announce a new free series about Drupal Coding Standards that has been made possible by our friends at Chromatic. This series was originally written by Alanna Burke, who is a developer at Chromatic, and they have generously released the work under a Creative Commons Attribution-ShareAlike 4.0 International License so that we can share it here, with some modifications, and keep the information up to date as it changes.

Chromatic logo

Chromatic is a well-recognized agency in the Drupal space, providing design and development services to some of the biggest brands in publishing, media, and e-commerce. Chromatic has 10+ years of experience working on large, enterprise Drupal sites. This series is a testament not only to Chromatic's commitment to building Drupal sites the right way, but also to sharing that information with the open source community. We're very grateful for their enthusiasm to share their knowledge with the Drupal community.

Learn about coding standards

The Drupal community has defined a set of coding standards and best practices that you should adhere to whenever you're writing code for Drupal. These standards provide a set of rules for how your code should be formatted, and best practice guidelines for naming conventions and the location of files. This ensures consistency in code throughout the project and makes it easy for developers to move around from one subsection to another without having to relearn how to read the code.
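As a tiny, hypothetical illustration of the kinds of rules the standards cover -- two-space indentation, a docblock on every function, and module-prefixed, lowercase_with_underscores names for procedural functions:

    <?php

    /**
     * Returns the site slogan with surrounding whitespace removed.
     */
    function mymodule_trimmed_slogan() {
      // Comments are full sentences that end with a period, per the standards.
      $slogan = \Drupal::config('system.site')->get('slogan');
      return trim((string) $slogan);
    }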

The tutorials in this series will walk you through the following topics:

For more resources about Drupal standards, also check out our Drupal Coding Standards topic page.

Jun 29 2017

The call for sessions for DrupalCon Vienna just closed. Now all of us who submitted a session get to play the waiting game, eager to find out whether our session was accepted. Ever wonder what goes on during the session selection process? Here's some insight.

Amber Matz and I (Joe Shindelar) had the awesome opportunity to be part of the programming team for DrupalCon Baltimore. Both of us served as local track chairs. Amber worked on the Horizons track and I worked on the Being Human track. We thought it would be fun to share some of our experience as track chairs helping with session selection.

The official session selection process is well-documented. It's designed to reduce bias and provide as much transparency as possible. Hopefully this post provides some additional insight into how sessions are chosen at DrupalCon.

Who chooses the sessions?

The DrupalCon programming team is responsible for soliciting session proposals, selecting sessions from those submissions, and laying out the schedule for the week of DrupalCon. The programming team is made up of Drupal Association staff and community volunteers. The exact composition of the team changes for each DrupalCon, but always includes a mix of people new to the process as well as those with previous experience.

Sessions are primarily selected on a per track basis, with input from the entire programming team. Each "track team" is composed of one local track chair and two global track chairs. (It was like this for DrupalCon Baltimore when I was involved, but I imagine this can vary.) These are the people who give your session the "thumbs-up" or add it to the "thanks, but no thanks" list.

Local track chair

For DrupalCon Baltimore, Amber was the Horizons local track chair and Joe was the Being Human local track chair.

The local track chair has these responsibilities:

  • Write a description for their track along with specific topic suggestions
  • Respond to inquiries about the track or requests for submission review from the community
  • Actively participate in weekly meetings
  • Determine the method and criteria your track will use to evaluate sessions
  • Read and evaluate all session proposals
  • Determine if session proposals would be a better fit in a different track and communicate accordingly with other track teams
  • Select session proposals for inclusion in the final program, along with alternates
  • Communicate with speakers and offer support to review slides or practice presentations

In many of the above tasks, especially session proposal selection, the local track chair works closely together with their global track chairs.

Global track chairs

There are one or two global track chairs for each track team. These are people who have previously acted as a track chair for DrupalCon. In many cases, the global track chairs were previously the local track chair for the track in question. This allows for things like transfer of knowledge, cohesion between events, and the ability to provide some insight into what did or did not work last time. For Amber and me, as local track chairs, the global track chairs were our first stop for questions.

Global track chairs have these responsibilities:

  • Provide support and guidance to the local track chair and help them complete all their responsibilities.
  • Support the transfer of knowledge and experience from previous events.

Each track team was responsible for figuring out a process and plan that worked for them, with the local track chair taking the lead. Local track chairs (and optionally globals) met with Amanda, the Program Manager for the Drupal Association, once a week in the months leading up to DrupalCon. Besides directing the programming-related tasks for DrupalCon, Amanda coordinates all the volunteers on the track teams. Our meetings generally involved locals giving an update about the current status of things. One such update might be: "My track has 25 submissions so far, I've been in contact with a few people to get clarification, and I've been working with my globals to finalize the draft of our track blog post that's due next Monday. One of the people who submitted a session asked me about how much time they should plan to leave for Q&A. What should I tell them?"

Meetings ended with locals having a list of action items to complete.

  • Joe: For our team, I was always at the meetings, and the other two attended when they could, but not every time. I would generally immediately follow up with the rest of my team, let them know what was discussed, and what our next tasks were. For actionable tasks, I would usually try and take a first pass at accomplishing it and then work with the rest of the track team to make sure we all agreed. For example, I wrote the track description and then they helped by providing feedback.
  • Amber: My experience was similar to Joe's. I usually time-blocked the time right before and right after our weekly meeting to work on tasks and communicate with my globals. I would always ask my globals to review any copy I had written or responses to inquiries from the community.

Submit early

Here's our number one bit of advice: submit your session well before the deadline. Submitting early is about more than just garnering favor with an over-zealous track chair. It gives them an opportunity to ask clarifying questions, provide feedback, and give your submission the attention it deserves.

Towards the end of the session submission process, this becomes impossible to do. The volume of new submissions is simply too high. Check out the graph below. It shows the total number of sessions submitted over time for the DrupalCon Baltimore CFP (Call For Presentations). The CFP was open for 2 months, but 75% of the submissions came in during the last 5 days. About 38% were submitted in the last 24 hours.

Graph showing exponential rise in number of submissions relative to close of CFP.

  • Joe: As a track chair it is really exciting when people submit a proposal to your track. And I read many of them as they came in. I was just thrilled that people were submitting things. I also started to follow up with people after reading their session descriptions if something wasn't clear, or if I needed more info. Since they submitted early, they also had time to revise. Every track chair on the program team had stories about reading early submissions and taking the time to reach out to get clarification and improve the submission.
  • Joe: I wish I could give the same amount of feedback for every session, but the reality is this can only be done if you submit early. If you wait until just before the deadline I simply can't read all the submissions as quickly as they come in. By the time I do get around to reading it, it'll be too late for you to make any changes.
  • Joe: A couple of people reached out to me specifically after submitting their proposal asking for feedback. I was happy to provide feedback in order to help someone refine their proposal and make sure that it fits with my vision for the track.
  • Amber: As a track chair, I was trying to envision the final program and imagine how each presentation would fit into it. This sort of exercise of imagination becomes difficult with the overflow of submissions at the end.
  • Amber: I really appreciated early submissions that were thorough. What I didn't like were early submissions from veteran speakers who didn't include at least one link to a recording of a previous talk. Just one is all I needed. Don't make me hunt down your previous talks just because you've spoken many times before at DrupalCon. My advice for veteran speakers: don't assume the track chairs know who you are and what your reputation is.

Read the track description

Each track team has put together both a description of the track and a blog post explaining the vision for the track and the things we are hoping to cover. The list reflects our ideas about what we think will be valuable for the community, and DrupalCon's audience specifically, as well as topics of personal interest. You can bet we're going to keep that list in mind when we're picking sessions. You don't have to follow the list, but know that we're likely trying to make sure we cover the things on it.

  • Joe: Every time I sat down to start reviewing sessions I would start by first rereading the description I had written for the track. This helped me to get into the right mindset and to remember what it is we're trying to accomplish. That being said, I was also trying to be aware of the fact that other people are going to have some great ideas that I didn't think of when I wrote the track description. This was especially true when I saw that there were multiple sessions submitted about the same topic -- a good indicator that there is community interest. (Amber: I totally agree!)
  • Joe: Sometimes I found it difficult to judge whether something was a good fit for my track or not. As someone who has submitted to DrupalCon in the past, I realize that sometimes you have an idea for a great talk that just doesn't fit perfectly into any of the predetermined tracks. There was a lot of discussion about this amongst the overall program team, and my suspicion is that in the future this is likely to change a bit. Possible ideas include allowing the selection of multiple possible tracks, or even moving to more of a tagging system and away from a one-to-one relationship between sessions and tracks.
  • Amber: I depended heavily on my global track chairs for input on the track description. The nature of the Horizons track is such that you want to make sure you're not rehashing "the new hotness" from last year or the year before. I also tried, with a tiny bit of success, to solicit the community for ideas through a survey. I had better luck soliciting folks from our sister company Lullabot for ideas about current trends in tech.

Write a good description

After having read a few hundred submissions you start to get a sense of who put some thought and effort into their descriptions and who didn't. Things like spelling and proper grammar aren't necessarily going to make or break your chances of getting accepted. However, attention to detail in your description indicates you're likely to put the same effort into preparing an awesome session.

  • Joe: When reviewing sessions I pretty much immediately discarded anything that I could tell right away didn't have much effort put into it. There were a few with only one or two sentences in the description. There was even one where someone had left the default "copy goes here" values in the form fields. I suspect they meant to come back and fill in more later but then never did. Either way, if I can tell you didn't put much effort into your submission it's going to be hard to convince me that you're going to put in the time and effort required to prepare a quality session.
  • Amber: In my mind, a thorough track description describes the topic and what problem it addresses, describes what the presentation will cover, and at what level, and describes learning outcomes--what the audience member can expect to learn by attending this session. An absence of any of these elements earned a lower ranking. No matter your level of English proficiency in writing, spelling, or grammar, have at least one person read it over and give you feedback. But please do them the courtesy of running your copy through a spelling and grammar checker tool first.

25-minute sessions

Prior to DrupalCon Baltimore, all sessions were 60 minutes long (or rather, 45-50 minutes with time for Q&A at the end). In Baltimore, the program team decided to introduce the option of 25-minute sessions. When you submitted your proposal, you could choose whether you intended it to be 25 minutes or 60 minutes, or indicate that you were willing to do either.

  • Joe: I found it tricky whenever someone would select the "either" option. It forced me to try and imagine what the talk might look like in a 25-minute version vs. a 60-minute version. And, I had to assume the description was written to cover the 60-minute version. What would they cut? Would they cut or just compress? A few people left extra notes indicating their intent. But most did not. I don't think it swayed my opinion of the session much one way or another, but I tended to assume that any session indicated as suitable for both was probably intended to be 60 minutes.
  • Joe: Another thing I thought about a lot related to 25-minute sessions was the fact that when we scheduled them we put 2 sessions in the same room back-to-back. So the audience was likely to be mostly the same group of people for both. And thus it made a lot of sense to try and pair them based on a similar theme.
  • Amber: After sessions were selected and I reached out to the speakers in my track, I did sense a bit of nervousness (maybe even annoyance?) from speakers chosen for 25-minute slots, especially when they had chosen the "either" option. It is quite challenging to reformat a talk to fit in a shorter time-slot, especially when it has been developed as a 50-minute session.

View of spreadsheet used to schedule sessions showing time slot divided in half with two 25-minute sessions in each slot.

Note: We learned our lesson here. When submitting a session for the upcoming DrupalCon Vienna, you can choose 30 minutes or 60 minutes -- not both.

Reviewing sessions

All of the sessions were placed into a spreadsheet with a tab for each track. Each track team (one local chair and two global chairs) was responsible for picking a set number of hours' worth of content for their track, plus alternates in case someone backed out. Beyond that, the requirements were somewhat minimal.

  • Joe: The programming team as a whole had at least 2 meetings where we talked about ways to approach ranking/rating sessions. I found that this was an area where it really helped to have the chance to work alongside other people who had done this before. You could really see how experience and knowledge gained over the years was carried along. For the Being Human track we chose to go through and individually rank each session from -2 to 2, tag them with info about how they mapped to our track description and the big ideas they included. For example, "Imposter syndrome" was something we knew we wanted to cover, and there were multiple sessions about it. Tagging each one allowed us to more easily identify duplication, etc. I also left a LOT of notes for myself. Much of the initial work of selection was assigning a rating and making sure I could figure out later why I assigned something that rating.
  • Joe: I reviewed things in multiple passes. Sometimes I just went through and tagged based on topics covered. Another pass through I was just thinking about the speaker(s) and their experience speaking. In order to keep a clear head and to give each submission the quality review it deserved, I would limit myself to spending no more than an hour or two at a time reviewing sessions -- which basically meant I spent an hour every day for about 2 weeks just reading and ranking. I initially tried reading a bunch all at once but quickly found that they were becoming jumbled in my head and I was having a hard time distinguishing one "Imposter syndrome" session from another. And that didn't seem fair. I was worried that sessions I reviewed most recently would just sort of win by virtue of the fact that I could remember them. This approach -- spending a shorter time each day -- ended up working really well for me.
  • Joe: I really wanted to make sure that the Being Human track was for the community and not just for me. For every session I accepted or declined, I would ask myself the question, "How will I justify this decision to the community?"
  • Amber: The Horizons track got lots of interesting submissions. For us, the main question was: how well does this session fit into the vision we had for our track, as described in our track description and topic suggestions list? I found it helpful to assign subtopics to each session so that we could see where there was more than one submission on a certain topic and so that we could compare submissions that were covering pretty much the same ground. We wanted to have complementary sessions or even two competing approaches to the same problem, but avoid duplication. We also wanted to have as many "subtopics" in our track represented as possible. This is where it was nice to have the option to choose 25-minute sessions. We were able to cover a lot more interesting topics in our track that way.
  • Amber: Ranking was difficult. I broke the rating up into multiple categories and then took the average. We also calculated a standard deviation so that we could see which submissions we needed to talk about. Where there was a debate, it really came down to, "How well does this session fit into the vision for our track?" and "Do we think this is a good speaker, based on what we know about them?" The bottom line was always, "Can we justify this decision to the community?" Most of the work of ranking we were able to accomplish asynchronously on the spreadsheet. For matters of debate, we took to Slack or Google Hangouts to discuss. It was hard to find time for all 3 of us to sync up, so we did as much as we could asynchronously.

Inclusivity

When you submit a session to DrupalCon there are a couple of optional questions related to inclusivity. As part of our effort to increase diversity amongst speakers, you're asked to self-identify as to whether or not you've presented at a prior DrupalCon and whether or not you identify with any of the Big 8 social identifiers. This information isn't intended to be used as part of the session rating/ranking process per se, but we do use it to make sure we've got a diverse group of speakers.

Form showing questions about diversity when submitting a session.

  • Joe: In our spreadsheet we just had a yes/no for "first-time speaker" and "identifies with one or more social identifier". It came up a bit when thinking about a speaker's prior experience. There was one case where we had two very similar sounding talks, one by a new speaker and one by someone who had previously talked about the same topic. Both were ranked equally and as a tie-breaker we opted to go for the new perspective, especially since a recorded version of the other talk was still available from a previous conference.
  • Joe: When ranking sessions I intentionally didn't look at this too much. I was initially trying to rate sessions based on the session and the speaker's experience. After we had completed our selection for the Being Human track, I took some time to go back through and generate some basic stats, such as percent of new speakers, percent non-male speakers, and percent identifying with one or more social identifiers. Our thinking was that if any of those numbers needed to be improved, we could re-evaluate our choices to make sure we were representing the whole community. But we were quite happy and didn't change anything. The Being Human track had a really diverse pool of speakers to choose from, which made our job pretty easy in this respect.
  • Amber: While the Horizons track did have around 20-25% session submitters self-identify with one of the Big 8 underrepresented communities, I was the only woman to submit to and speak in my track. (It is standard to recuse oneself from rating one's own talk or others where being objective would be difficult.) I thought this was a bummer. Although I did make an effort to reach out to women in the PHP and VR communities in particular, I was not successful in recruiting any other women to submit to Horizons. So, there's plenty of room for community growth, in that particular respect.
  • Amber: I was pleased with the mix of new-to-DrupalCon and veteran DrupalCon speakers in the Horizons track. I have always been a bit wary of tracks dominated by the same speakers, and I was glad to be able to include a number of folks new to DrupalCon who were still excellent, experienced speakers. It wasn't a criterion for selection, just a reflection of the quality of submissions from both new and experienced DrupalCon speakers.

One speaker, one session

You can submit as many sessions as you like. Submitting more than one might help increase your chances of being selected. But, with a few exceptions, we were committed to choosing one session per speaker across all tracks. We did allow a speaker to have a solo session and participate in a panel. In two cases we allowed a single speaker to have two solo sessions, but only after contacting them to chat about it.

Because speakers can submit multiple sessions to multiple tracks, this led to some initial speaker duplication between tracks, which we had to sort out.

  • Joe: For the Being Human track we ended up replacing two of our initially selected sessions because of speaker duplication. In one case it was a relatively easy decision; we had another session on the same topic that we liked equally well and had already had a hard time choosing between the two. The other one was kind of a bummer. I was personally really excited about the session so it was hard to give it up. But, the other track team made a stronger case for why the speaker should present the session in their track. It was kind of surprising to me how I could get sort of emotionally attached to a particular session! The program team worked really well together and I think did a good job of creating a schedule that was good for the community and not just for me, the track chair.
  • Amber: This was a bit of a challenge for us in Horizons. We had a number of excellent submissions, all on topics that we were interested in, but by the same speaker. In addition, the Horizons track topics have some overlap with PHP and Front-End, and so even when we thought we were out of the woods as far as overcommitting a speaker, we ended up making some compromises with other track teams in order to solve this problem of not allowing one speaker to speak too many times. I suppose it does increase your chances of being selected if you submit many times and are in general an awesome speaker and ridiculously intelligent, but it was a challenge to get that all sorted out. A good challenge, though, because I think it's good not to have a track dominated by a single person.

In conclusion

Being a local track chair took a lot of time and effort, but overall it was a positive experience. It was really interesting to see this side of DrupalCon planning and to gain insight into the process of session selection.

May 30 2017
May 30

Google Summer of Code

Today marks the start of another Google Summer of Code, which is the 12th year that Drupal has been selected to work with a terrific group of bright and eager students. In this great program, Google pays students to work on various open source projects for 10 weeks. Our 8 students are working on some pretty cool projects with 15 community mentors. They have spent the last few weeks getting oriented in our community, and the actual coding work begins today. The deadline for their projects is August 29th. As in prior years, Drupalize.Me proudly supports these students by providing them with free memberships for the duration of the projects so they can dive right into Drupal 8.

Drupal has been involved with GSoC since its beginning in 2005. A number of long-term contributors started out with Drupal through GSoC projects. The program gives students real, paid, experience, and open source projects benefit not only through their development work, but also by the chance to grow the community. If you'd like to follow along with the students' work over the coming months, check out the Google Summer of Code group.

There is a LOT of work that goes into running GSoC. Aside from the work for the students themselves, the volunteers who manage the project have to solicit, vet, and choose the projects, find and pair up mentors, and then of course, help the students get oriented and digging into the Drupal community. The mentors also devote their time and enthusiasm to help the students get off on the right foot, provide resources, and give them development guidance as they learn. I'd like to say a HUGE thank you to Google and the Drupal community volunteers who make this amazing program possible.

May 04 2017
May 04

Drupal 8 was many years in the making and is gaining in popularity as it matures. One of the aspects of Drupal 8 development that keeps many people from jumping in is that there are still a fair number of contributed projects (modules and themes) that have not been upgraded yet. To recognize the people who do the hard work of creating and maintaining contributed projects, and to encourage more people to upgrade projects to Drupal 8, we will give free Drupalize.Me memberships to Drupal.org project maintainers and the folks listed in Drupal 8's MAINTAINERS.txt file. This will provide a top Drupal 8 training resource for the people on the front lines, making Drupal 8 available to the world.

In order to qualify for a free membership you will need to be listed as a committer on a Drupal.org project page (module, themes, etc.), with at least 1 commit in the last year. Your project should also have a Drupal 8 branch or an issue in the queue where you state that you are committed to upgrading the project to Drupal 8 soon. Just get in touch with us through our contact form or email us at [email protected], and give us a link to your project page on Drupal.org along with your username.

Thank you for all your hard work to make Drupal better!

Apr 06 2017
Apr 06

Last week Blake and I attended MidCamp 2017 in Chicago, and it was awesome. I always enjoy attending regional camps, and especially those that are relatively close to home for me. It's fun to get to geek out with some of my Drupal neighbors. I also like the pace of these smaller events sometimes. I feel that I'm able to actually spend a bit of quality time with the people I meet vs. DrupalCon where I often feel like I'm being pulled in 3 or 4 different directions all at the same time.

I encourage you to take the time to attend your local camps as well if you get the opportunity. Not sure where/when they are happening? Check out http://drupical.com, and/or locate your regional group on https://groups.drupal.org. I've said this before, and I'll probably keep saying it for as long as I'm involved with teaching Drupal. There is no better way to improve your Drupal knowledge than through mentors and interaction with other members in the community. Regional events like this are a great opportunity to make those connections.

Drupal 8 theming workshop

On Thursday Blake and I presented an Introduction to Drupal 8 Theming workshop. It was an 8-hour long whirlwind tour of all the components that make up a Drupal 8 theme. Really, it's the in-person version of the Drupal 8 Theming Guide on our site. There's always something different--and energizing--about getting to teach these things in person. I love being able to get real-time feedback, and to answer people's questions. Creating training videos can be kind of isolating sometimes. I keep talking to my screen, but no one ever responds.

We'll be doing this training again at DrupalCon Baltimore, and we're also interested in presenting it at some other local camps. If you're helping to organize a camp this year and might be interested, let us know and we can see how it works with our schedules.

Sessions

I attended a bunch of sessions during the camp, and as per usual I've got a bunch of pages in my notebook full of notes and ideas I now need to follow up on. All of the sessions were recorded and are now available online. I recommend checking out:

  • Understanding Drupal by Mauricio Dinarte. You might think it's kind of silly that I'm attending an "Understanding Drupal" session, but I always love to hear the different ways people explain it, and Mauricio brings a really unique perspective and a good story.
  • Building Great Teams by Drew Gorton. I came away with a bunch of notes about things I'm now going to try and get the Drupalize.Me team to do with me. This session is mostly about finding a common purpose and then going out and tackling it together.
  • Whitewashed - Drupal's Diversity Problem and How to Solve It by Chris Rooney. This session left me thinking about how many people lack a solid safety net, and how, when that's missing, it can be hard even to do things like decide to learn Drupal and switch careers. I wrote down some notes about how I think Drupalize.Me might be able to improve our offering by being aware of this barrier to entry, and I'm already excited for our next team retreat so we can talk about it more.
  • Drupal 8 Caching: A Developer's Guide by Peter Sawczynec. A 10,000 foot overview of all the pieces that come into play when caching content served by Drupal. There are a lot of them, and Peter does a great job of breaking it down and providing information about each layer of the stack.
  • I also attended Tim Erickson's Drupal as a Political Act? session which was more of a discussion. We talked about the reasons that we all adopt and advocate for free open source software, barriers to entry, and more. I enjoy these conversations about using Drupal to make the world a better place, and find them inspiring. And Tim is a great person to chat with about it as he's got a lot of opinions and ideas.

Sprints

Finally, to cap it all off, on Sunday, Blake and I helped to facilitate a documentation sprint. As you can probably guess, having high-quality documentation is important to us at Drupalize.Me, and sprints like this are a great way for us to contribute back to the community.

So we brought a box of donuts along to help power the sprinters. And also to try and entice people to join the documentation table.

We ended up with a table full of people helping update various aspects of the Drupal.org documentation, including documentation for the Drupal 8 Migrate API, a new payment gateway for Drupal Commerce, and a bunch of work on ensuring content exists for all the modules in Drupal 8 core. There's an open issue to ensure that there is a known URL with good documentation for each of the Drupal 8 core modules, and we made progress on that by adding and cleaning up documentation for 5 different modules during the sprint.

Thanks to everyone who joined us to help out: some long-time contributors like Mike and Benjamin, and some first-time documentation contributors like purplenwu and David. You all rock. Thanks for helping make Drupal better.

I'm already looking forward to getting to attend some more regional camps this summer. Hope to see some of you there.

Mar 06 2017
Mar 06

Drupal Dev Days has been a recurring event since 2010, when it got started in Munich. Since then it has changed location within Europe every year. This year it is being hosted in Seville, Spain from March 21-25. Dev Days is a special event, and I have my own very fond memories from previous years. Instead of being a general DrupalCamp, this is very focused on developers getting together and working on Drupal itself. There are definitely plenty of sessions to attend, where you can learn and grow. This year there is also a really great lineup of keynote speakers.

One thing that tends to set Dev Days apart is a heavy emphasis on sprinting, which makes for a great environment to dig into Drupal with a host of supportive mentors to help guide you, or bounce ideas off of. And you don't need to be a developer to take part in the sprints. There are a ton of fun tasks to help with. The sprints run every day, all the days of the event, so you can use it to fill some free time, or make a day of it. The camaraderie and learning keep hundreds of people coming back every year. If you've never attended a sprint, I highly recommend it as a way to not only learn Drupal, but to connect with people who can turn into lifelong friends.

We're excited to support Dev Days this year, and they will be handing out 5 free months of Drupalize.Me memberships. So go meet great people, enjoy the beautiful city of Seville, and get your Drupal on!

Mar 02 2017
Mar 02

We've added several new tutorials to our free Hands-on Exercises: Movie Project, which dive into module development. The first 16 exercises covered site building and theming. These latest additions continue to develop the same movie review site. They require you to start building your own custom modules to add a custom form for adding movie information to the site and use an external API to find and grab the information you need. This is a special series in that these are exercises to test your skills instead of tutorials that teach you, and they are based on the openly available Drupal projects from Damian Robinson.

We've received good feedback about the exercises we've published so far, and we've made some improvements to make it clearer what the expectations and prerequisites are for each exercise. You should already know how to build a Drupal site before wading into these practical exercises. You are provided with the wireframes for the site you are building, and each exercise details requirements for some aspect of building and theming the site. We also provide resources that you can use to refresh your memory about how to accomplish a particular task.

The new exercises we released are:

If you work through this project, let us know how it went for you!

Jan 24 2017
Jan 24

DrupalCon is a conference and global community gathering that takes place usually twice a year in North America and Europe. Past DrupalCons have spanned the globe and also taken place in India, Colombia, and Australia. DrupalCons are organized by the Drupal Association, a non-profit organization that supports the Drupal community through funding, infrastructure, education, promotion, distribution, and online collaboration on Drupal.org.

DrupalCon North America attracts thousands of users, frontend and backend developers, software engineers, system administrators, designers, UX specialists, content strategists, site owners, site builders, content managers, community leaders, site administrators, business owners, project managers, documentation specialists, trainers, and others that interact with Drupal as a website, as code, as a community, or as an integral part of their organization.

This year, DrupalCon North America is taking place in Baltimore, Maryland, USA, from April 24-28, 2017.

DrupalCon Baltimore will feature presentations in 13 session tracks, informal Birds of a Feather gatherings (BoFs), code sprints, social events, pre-conference training workshops, topical summits, and events for first-time attendees and code contributors alike.

Session tracks

The DrupalCon Baltimore session program will include presentations in 13 tracks:

Each track has a track committee consisting of a local chairperson and two global chairpersons. These three people craft the track description and decide which session submissions will be chosen for their track.

Drupalize.Me happens to have two team members serving as local track chairs at DrupalCon Baltimore. Joe Shindelar is the local track chair for the Being Human track and Amber Matz is the local track chair for the Horizons track.

On being a track chair

Here are some thoughts from Joe and Amber on being a track chair:

  • Joe: Local track chairs define the theme for their track with input from their global chairs and the other track chairs. As the chair for the Being Human track, the definition of the track as a category for "content about the humans that create the software" was already set. My job was to take that very broad definition and refine it for this specific DrupalCon. What are issues that are important in our community right now? What about the larger tech industry as a whole? What am I personally passionate about? So, we tried to take that broad definition and provide people with a bit more guidance about what we are looking for in session proposals.
  • Amber: The Horizons track has a personal place in my heart because I got to speak in that track last year. This year I sent out a survey to the general public and also polled our sister company Lullabot for some ideas about what they considered "emerging technology" in the Internet world. So our topic suggestions and track description were heavily influenced by the responses to those surveys.
  • Joe: Our process so far has been mostly weekly meetings in which we've worked to define the tracks, proof each other's descriptions, discuss ideas for increasing the diversity of people submitting sessions, and make decisions regarding the various questions on the session submission form, scholarship funds, and other things related to: 1) getting people to submit sessions so that we can have a diverse pool of speakers and topics to choose from, and 2) putting things in place to make decisions about which of the submitted sessions will be accepted.
  • Amber: We've also written some blog posts: Built by Humans for Humans and Wanted: Explorers of Tech Horizons
  • Joe: Regarding communicating with people interested in submitting a session -- I've had a few people reach out to me asking me to review their submission and provide feedback, and a couple others who wanted to chat about either picking a topic, or narrowing the focus of an idea. Pro tip: The number of session submissions increases exponentially during the last week, so submit your session and/or contact the track chairs early if you want feedback. After a certain point there's too little time and too many new submissions for us to provide feedback to everyone.
  • Joe: Another thing we've done is reach out to people in and out of the community in an attempt to solicit session proposals. I've done this mostly via Twitter, email, and a few from the local community in person.
  • Amber: Same here. While it's easy to Tweet messages out, it's challenging to know what the impact is. Nevertheless, I have made some interesting new connections with folks in my attempt to reach out.
  • Joe: I've been working to read and consider sessions as they are submitted. Knowing that there will be a surge of them coming in at the end, I feel that it's important to stay on top of things so that I'm not overwhelmed trying to read everything in just a few days. I'm a little nervous about reading too many submissions in rapid succession and having them start to blur together. So I would rather tackle a few at a time. That's another good reason to get your session in early -- there's less chance of it getting lost in the torrent of new submissions in the last 24 hours.
  • Amber: I'm excited about the sessions we've received so far in the Horizons track. It's definitely advantageous to get your session in as early as possible because many track chairs, like myself and Joe, are reading them as they come in and inevitably the later ones will get compared to what we've already read. So get yours in early and set the bar high!

Anyone with previous speaking experience (not necessarily at DrupalCon) is invited to submit a session proposal by February 1, 2017. Be sure to read each of the session track descriptions and topic suggestion lists to find the right track for your session proposal.

Tips for submitting a presentation proposal

Here are some suggestions for submitting a proposal:

  • Pick a topic you're passionate about. That passion (or lack thereof) will show in your submission, your slides, and your presentation. If accepted, you're going to put a lot of time into this, so you'd better pick something that's interesting to you.
  • Not being "The Expert™" on a particular topic doesn't mean you don't have good things to say. Everyone is at a different place along the spectrum of Drupal expertise, and I promise you've got good information that others will benefit from.
  • Browse session submissions from previous DrupalCon speakers and compare yours. Is the writing and explanation on the same level?
  • Pick a pain point: something you have personal experience with and some ideas about how to resolve. Explain that this is what you're going to cover.
  • Proofread your submission -- better yet, have someone else proofread it for you.

Resources for people submitting session proposals

  • Here's a good article from Gus Childs at Chromatic about his experience getting a session selected at DrupalCon New Orleans: The Road to Speaking at DrupalCon.
  • You can get feedback from veteran speakers before you submit from the group of volunteers at HelpMeAbstract.com
  • Contact the DrupalCon Program Manager who will put you in touch with a Session Submission Mentor.
  • Contact a Session Track team chair. Track teams are listed here and you can contact track team chairs via their Drupal.org contact form.
  • Find out more about resources and support for speakers here.
  • Don't neglect proofreading. Have someone else read it through.
  • Do you identify with an underrepresented group? The people behind the DrupalCon program are dedicated to increasing the diversity of speakers at DrupalCon. One of the ways they are doing this is through the Inclusion Fund. You can request up to $350 for travel and lodging expense assistance. Find out more on this page.

We hope to see lots of great session proposals in all the tracks. Your contributions as a speaker will help make DrupalCon awesome.

Jan 13 2017
Jan 13

Drupal 8.3 is still a few months away, coming April 5, 2017, but there are already some changes we can look at, most notably in the experimental modules. In December, 2 new experimental modules were added to core, and BigPipe was officially changed from a beta module to stable. The 2 new modules you'll find in 8.3 are Workflows and Layout Discovery. Let's examine them.

Workflows

As part of the Workflow initiative, the Workflows module has been added to core to provide a workflow configuration entity type, which other modules can use. You can see it in action immediately with another core experimental module, Content Moderation, which was added in 8.2. As a matter of fact, Content Moderation now requires Workflows because this feature used to be an integral part of moderation.

If you are familiar with the Workbench contributed module, then you will recognize how the Content Moderation and Workflows modules work. The Content Moderation module in core is based on Workbench Moderation. Workflows establishes the concept of having workflow states and transitions between them. For example, you might have a content workflow that needs to have states, or "stops" along the way, of Draft, Review, Published, and a way to restrict how and when a piece of content changes between those states. These transitions would be something like, "a node in a Draft state can only move to Review and not directly to Published, while a Published node can move to either Review or Draft."

Screenshot of Workflows states and transitions with Content Moderation module

The great thing about Workflows being a separate module in core is that the ability to provide workflow states and transitions is not just tied to Content Moderation, but it provides a generic API that any module developer can use to create custom states and defaults for whatever workflow is needed. If you'd like to get a little more information about the Workflow initiative as a whole, you should read Dries' recent blog post about Moving the Drupal 8 workflow initiative along.

One important note about this new addition is that to separate the workflow states out, many changes were made to Content Moderation as well. If you are using Content Moderation in Drupal 8.2, it is a different module now in 8.3. When you upgrade to 8.3 you will need to uninstall Content Moderation before updating the codebase and re-install it after the upgrade. (Experimental alpha modules do not get upgrade paths built for them.)
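If you manage your site with Drush, the order of operations might look something like the rough sketch below. The Drush 8 commands and module machine names shown are assumptions based on the modules described above, and uninstalling will discard your existing moderation configuration, so take a backup first.

# On Drupal 8.2, before pulling in the new code: uninstall the experimental module.
drush pm-uninstall content_moderation -y

# Update the codebase to Drupal 8.3 however you normally do, then run database updates.
drush updatedb -y

# Re-enable Content Moderation; in 8.3 this also brings in its new Workflows dependency.
drush en content_moderation -y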

Layout Discovery

The new Layout Discovery module is an exciting step forward for the Block and Layout initiative, though you won't actually see anything in the user interface for this module, other than the ability to enable it. That is because this is an API module that provides tools for other modules to use. Layout Discovery is adapted from the contributed Layout Plugin module, which is used by other modules like Panels and Display Suite to keep track of page layouts. You'll need to be a developer to play with this one. You can get started by reading the Layout Plugin documentation about interacting with the API. The idea is that with this main piece of the puzzle in place, the initiative can move forward with adding a layout system to core with a user interface, along the lines of Panels and Display Suite.

BigPipe

The BigPipe module considerably improves the perceived front-end performance of your website. The BigPipe technique was created by Facebook, and was then brought to Drupal through a contributed module before it became an experimental module in core as of Drupal 8.1 in February 2016. In 8.3, it will be listed as a regular core module without the caveats of the experimental modules. This means it is now ready for prime time and recommended for production sites.

Screenshot showing BigPipe module in the main Core section of modules to enable

This is one of those modules that every site should use by default since it will improve your site performance without you having to configure anything. If you want to understand more of how BigPipe works, you can check out this 10-minute video podcast by Acro Media or listen to a longer audio podcast by Lullabot with Wim Leers, who has been the main architect behind getting BigPipe into Drupal 8.
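If you manage modules with Drush, turning it on is a one-liner. A minimal sketch, assuming Drush 8 and the module's big_pipe machine name:

# Enable BigPipe; no configuration is needed, it starts streaming page content immediately.
drush en big_pipe -y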

If you'd like to keep up with the changes coming in the next version of Drupal core, take a look at the change records which are published as things are added or changed. You can even filter the changes by the role they will impact.

Nov 23 2016
Nov 23

Hands-on Exercises: Movie Project

We're happy to announce a new kind of series on Drupalize.Me which provides a project for you to practice the skills you've learned in Drupal site building, theming, and module development. Our free Hands-On Exercises: Movie Project provides you with wireframes and customer requirements for a movie review site. Each exercise in the series has you progressively build the site according to the specifications. To help you along the way, each lesson also lists some tutorials and learning resources that will show you what kind of knowledge you need to have to accomplish the given tasks. Keep in mind that these are not typical tutorials that take you by the hand, but are instead exercises for you to apply the knowledge you have to a real-world project, to help you really understand how to build a site without having all of the step-by-step instructions.

Today we are releasing the first 16 of 36 exercises, which cover most of the site building and theming sections. In the coming weeks we'll be releasing the rest of the exercises, which dive into module development tasks. We're also working on providing Drupal 8 versions for all of these before the end of the year.

movie project homepage

This entirely free series is made possible by Damian Robinson, who developed the movie review site, the wireframes, and all of the exercises. Damian has been providing this kind of training to clients for a while and has come to a point where he wants to share his work with the wider Drupal community. He approached us about being able to provide these free to the community, pairing up his exercises with the tutorials we have on Drupalize.Me. We are happy to partner with Damian; it's a great match. We're publishing these exercises here, while you can also get the original files, and some additional project exercises, on Damian's website. Both Damian's original files and our Hands-on Exercises: Movie Project are published under the Creative Commons Attribution-ShareAlike 4.0 International license (CC BY-SA 4.0), which means you can copy, modify, and redistribute these exercises as long as you share them openly with others and give attribution to Damian.

We're excited to provide our members and the wider Drupal community with some new tools to help you on your Drupal adventure. Let us know what you think!

Nov 16 2016
Nov 16

Drupal 8 birthday cupcake

It’s hard to believe, but it has been 1 year since Drupal 8.0 was released to the world. We’re celebrating Drupal 8’s first birthday on November 19th by giving FREE access to our full Drupal 8 Migration Guide over the celebration weekend! This Saturday, November 19th through Monday, November 21st you can learn how to use the new core migration system to upgrade your Drupal site or import content from external sources. It’s time to get using Drupal 8!

A lot has changed in Drupal 8 and we’ve been working hard to bring you the most accurate and up-to-date tutorials. We have extensive guides for Drupal 8 Theming and Upgrading Drupal with Migrate (as we noted above). You can also find a lot of really great goodies through our partners at KnpUniversity, who have provided Drupal 8 Module Development Essentials and covered the new PHP and Symfony concepts in Drupal 8. You can see our full guide for Preparing for Drupal 8 to dig into more background concepts relevant to Drupal 8.

We also have a regular review process to make sure we keep current with the 6-month core feature release cycle. Every few months we review our tutorials for accuracy, check what is changing in Drupal core, and make sure you have the latest and greatest information.

You can see our entire to-do list on our Drupal 8 page, and review our topic-publishing roadmap each quarter through our blog.

Join us in celebrating a year of Drupal 8. Check out the Drupal 8 Migration Guide for FREE, only this weekend!

Nov 08 2016
Nov 08

When we were considering switching to the Pantheon hosting platform, one of the features that made us confident in our decision is what they call Multidev. In order to understand why Multidev is so important for us, and Pantheon's other customers, it's important to understand the best practices that go into developing and hosting a website.

In the earliest days of the web, a site consisted of a bunch of HTML files sitting on a server. Even today it's still possible to ssh into a machine, edit files by hand, and have those changes (or typos) immediately served by a web server. It didn't take too long, or too many typos, for people to set up a second site (or web server) to use for testing and proofreading changes before public consumption.

Now (hopefully) every developer knows they should at least have one testing environment alongside their live site where they can work and test new code. Even for small teams there are often more than just two sites. A configuration of a development site, a test site, and the live site is quite common (and just happens to be Pantheon's default configuration). While managing multiple environments adds a bit of overhead, they come in quite handy when you need a pristine site to demo new functionality without blocking your development team from continuing to work.

Let's say you're interested in trying out a new version of PHP, or you are thinking about swapping out a new theme, or adding a new major piece of functionality that isn't quite ready for prime time yet. Wouldn't it be nice if you could spin up a new environment for each of these changes in parallel? If that was possible you'd have an actual interactive website to share with stakeholders and test against before affecting the rest of the development team. Thankfully, that's exactly the problem the Pantheon workflow and Multidev helps us solve.

Drupal deployment best practices and pain points

Drupal deployment has traditionally been somewhat difficult. This is especially true of older versions of Drupal (prior to Drupal 8). Since Drupal can store configuration in the database alongside the code powering the site, figuring out how to capture all of that configuration in code (and hence version control) was not always an easy task. Tools like update hooks, CTools exportables, and the Features module have all evolved to help us solve this problem. By establishing a policy that keeps as much configuration in code as possible, we can confidently say that code must be deployed through the Dev -> Test -> Live environments and can only be promoted in one direction. Meanwhile, the data and content that makes up our site flows in the opposite direction. As needed, the database can be copied from the Live -> Test -> Dev environments at any time. Thankfully the Pantheon dashboard makes this process incredibly easy. It can even be scripted using an automation tool like Jenkins and Pantheon's command line tool Terminus.
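For example, a scripted job (run by hand or from Jenkins) that refreshes the Test environment's database from Live might look something like the sketch below. The site name is a placeholder, and the subcommand and flag names follow Terminus 1.x from memory, so double-check them against terminus list for your version:

# Take a fresh backup of the Live database before copying anything.
terminus backup:create my-site.live --element=db

# Copy the Live database "backwards" to Test, leaving code and files untouched.
terminus env:clone-content my-site.live test --db-only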

Pantheon dashboard environment sync

If you're new to administering Drupal sites, or this kind of deployment infrastructure, I'd recommend you check out a few resources before continuing on.

Now that we've got a few deployment and infrastructure best practices established, let's get back to understanding what Multidev offers and why it's a game changer.

Multiwhat?

At its core, Multidev is just a method of spinning up complete environments for code that hasn't yet been merged into the main development branch. The main benefit to this is that it makes it incredibly easy to build a complete website environment that parallels your live site where any team member can functionally and visually test changes before they're fully merged. From a practical standpoint this means that doing functional testing while reviewing a pull request (especially for simple fixes like typos) can be done without disrupting a developer's local environment. Multidev can quickly spin up new environments based on git branches, allowing developers to share their progress with the team and get feedback without disrupting the main shared Development or Testing environment.

As a part of Lullabot I'd been using something similar for quite a while, which we then brought over to Drupalize.Me after I joined the team. About 3 years ago James Sansbury wrote the original version of something we called the Pull Request Builder. Other Lullabots became excited about the project and we eventually renamed it to Tugboat. Since then Tugboat.qa has evolved quite a bit, but at its heart are the same principles as Pantheon's Multidev. An added bonus with Multidev is that it's integrated with, and configured identically to, the infrastructure that hosts our live website.

The Pantheon workflow and Multidev

Now that we're running on Pantheon here is a sneak peek at what our development workflow and infrastructure looks like.

The code for our site is hosted in a private GitHub repository. When working on code we roughly follow the Git Flow model. When we start working on a particular issue we create a feature branch and push the work to GitHub. This then triggers a webhook which pushes the new branch to our Pantheon git repository. One thing to note is that Multidev environments created from git branches on Pantheon have certain restrictions on branch names. In particular,

Multidev branch names must be all lowercase and less than 11 characters. Environments cannot be created with the following reserved names: master, settings, team, support, multidev, debug, files, tags, and billing.

This means we also need to run a bit of additional code, prior to synchronizing our GitHub and Pantheon repositories, to truncate branch names.
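Here's a rough sketch of what that glue code could look like as a shell script. The pantheon remote name is a placeholder, and the exact character rules are worth checking against Pantheon's current documentation; this just illustrates the lowercase-and-truncate idea:

# Derive a Multidev-safe environment name from the current git branch.
BRANCH=$(git rev-parse --abbrev-ref HEAD)
# Lowercase, strip anything that isn't a letter or digit, and keep it under 11 characters.
ENV_NAME=$(echo "$BRANCH" | tr '[:upper:]' '[:lower:]' | tr -cd 'a-z0-9' | cut -c1-10)
# A real script would also skip reserved names like "master" or "settings" (check omitted for brevity).
git push pantheon "HEAD:refs/heads/$ENV_NAME"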

With our git branch pushed to our Pantheon repository, we can create a new Multidev environment either via the dashboard or from the command line using terminus (terminus multidev:create <site-name>.<environment> <new-environment-name>).
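To make that concrete, here's a hedged example using the shortened branch name from the sketch above (my-site is a placeholder, and the delete subcommand and its --delete-branch flag follow Terminus 1.x, so they may differ in other versions):

# Spin up a Multidev environment whose database and files are cloned from dev.
terminus multidev:create my-site.dev "$ENV_NAME"

# Once the branch has been merged and the review is done, tear the environment back down.
terminus multidev:delete my-site."$ENV_NAME" --delete-branch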

Pantheon Multidev dashboard create environment

It's also worth noting that you can create a new Multidev environment without an associated git branch. This can come in really handy for quick, disposable experimentation.

Pantheon create Multidev environment

A newly created Multidev environment contains the new code from the related feature branch, along with the filesystem and database from the environment it was cloned from (Development by default in our case). Now we have a URL we can share with the rest of the team to see the new work in action. While we still do full code reviews on GitHub, for less technical tickets, e.g. typos or small CSS tweaks, this allows us to double check our work on an actual copy of our site. We've noticed how empowering this can be for less technical team members. With a small amount of training using GitHub's editing and pull request interfaces everyone on our team is able to submit pull requests and help with immediately testing their work. While we typically do our code review and perform merges via GitHub, Pantheon's dashboard also provides buttons for merging the commits from a Multidev environment. Once the code has been merged, we're off to choose another issue to work on.

It's hard to overstate the amount of time this workflow saves everyone on our development team. Peer review can be done faster, more easily, and with a lower barrier to entry. Local development environments don't need to be constantly reset when dealing with context switching between tasks. Being able to preview work on a fully functional site alongside a code review is incredibly valuable. Honestly, it's hard to imagine how teams large and small are able to work together efficiently without something like Pantheon's Multidev.

If you have any other workflow tips or tricks we'd love to hear from you! If you're interested in learning more about Multidev in particular try these resources:

Oct 25 2016
Oct 25

You may have heard of a variety of Drupal 8 initiatives during the development cycle leading up to Drupal 8.0.0 being released in 2015. These were officially recognized efforts to get a variety of big changes into Drupal 8, and included projects such as configuration management, Views in core, and multilingual improvements. A lot of work from those initiatives is now part of Drupal 8. Not everything got in though, and as time moves on some priorities for new work will always shift.

With Drupal 8 we also changed how we approach future changes in the code base by switching to a new semantic versioning system and revising our development policies to embrace new features in future minor versions of the software. With a formal structure for new features to be rolled out in Drupal 8, many people have continued to push various projects forward. At DrupalCon New Orleans in Dries' keynote, he recognized the continuing efforts for major changes by carrying forward the concept of official initiatives that have been reviewed and approved by the core team. The idea of initiatives appears to be a solid concept that has been successful for our community, so we're going to keep using the same idea and word for post-launch projects as we did for the pre-launch projects.

The new initiatives

During that keynote Dries presented his list of existing and proposed official initiatives to the community. At that time the list he presented consisted of:

  • Migrate (already active)
  • Workflow (planned)
  • Media (proposed)
  • Data Modeling (proposed)
  • Block and Layout (proposed)
  • API-First (proposed)
  • Theme Component Library (proposed)

You'll notice that many of those were categorized as only proposed. That means that either folks in the community had started conversations around these things, or Dries thought these would be good ideas for someone to tackle. A proposed initiative isn't necessarily moving forward, and it doesn't yet have a formal team or a proposed plan for accomplishing the goal. A lot has happened in the 5 months since New Orleans, so I want to take a look at where things are now, just after DrupalCon Dublin. (Note that much of this list is derived from Angie Byron's DrupalCorn keynote and several sessions from DrupalCon Dublin.)

Current active and planned initiatives

Here is a short overview of the current official initiatives for future versions of Drupal 8 (i.e. Drupal 8.3.0 and higher), along with some information about how you can follow the projects or even jump in to help.

  • The Migrate module shipped as an experimental module with Drupal 8 core when it was released almost a year ago. Dries noted in his keynote that this initiative was already in progress, and there have been significant improvements to the system in each minor version so far (both Drupal 8.1.0 and 8.2.0). There is still more work to do to get it completely stable and feature-rich. You can keep track of the work in the meta issue, Stabilize the older Drupal to newer Drupal migration system.
  • The Content Workflow initiative has moved from planning to fully active. The most significant change so far is that we now have a new experimental module in 8.2, Content Moderation. You can track it in the meta issue linked above, and there is a very nice post by Yoroy, Content workflow initiative, the concept map that defines the scope and areas to be addressed.
  • The API-first initiative is another one that has moved from proposed into fully active. This is currently focused on REST improvements for core. 25% of change records for 8.2 were for the REST module. The next target in sight is to get JSON API in as an experimental module. To learn more about what their plans are, you can see their DrupalCon core conversation.
  • There has been a team of people working on Media in the Drupal contributed space for the last 2 years, including work on a media library, using remote media, WYSIWYG embedding, and re-usability of media. They are now moving out of a proposed initiative with a plan to move these best-of-breed features into Drupal core. You can see the issues they are hammering on in their kanban board or filtering for issues using the D8Media tag. For a complete overview, check out the DrupalCon Dublin core conversation session.
  • The Blocks and Layouts initiative is one that was also a pre-launch project (formerly called the Scotch initiative). There is still work happening on that front in the contributed space, with strategic core issues to pave the way for moving this work into core. The issues are widespread and you can follow them by looking for the Blocks-Layouts tag in the issue queue.
  • The Usability initiative was not part of Dries' initial list, but there has been a strong UX team working throughout Drupal 8's development. We saw 2 new experimental modules in 8.2, Place block and Settings tray, and there is more coming for 8.3, most notably Quick edit improvements. This team also had a session at DrupalCon Dublin.

Proposed initiatives

These are some new initiatives that have been proposed within the community, and some of Dries' initial list that have not yet been adopted by or planned by anyone yet.

  • The New core theme initiative is moving along quickly, with a team formed and a plan outlined. It is just waiting on a green light from the core committers to become an official initiative. You can find out more about the plan from the DrupalCon Dublin core conversation.
  • The Theme component library was on Dries' list, but this one has been put on hold while the new core theme is worked out. To see some of the ideas being sketched out, you can see what's going on in the Zen theme as a proof-of-concept.
  • Neither the Data modeling nor the Cross-channel organization idea proposed by Dries has gathered momentum or a team to carry it forward yet.

A Note about experimental modules

You'll notice that in these initiatives "experimental" modules are mentioned a lot. The idea is to get new features into core in various non-stable states (alpha, beta, etc.) and see how they work out. If they work out well then they will eventually become full, stable modules in core. If not, they can always be removed. Because they could be removed, experimental modules are generally not recommended for production sites, but of course getting people to test them out on real sites gives the feedback needed to evaluate and improve them.

The initiative pipeline

Given the wide variety of projects that are always going on in the Drupal community, how exactly do new initiatives come along, outside of having Dries announce them at a DrupalCon? Dries outlined some basic guidelines in New Orleans, but to make things clearer there is now a new Ideas project and issue queue on Drupal.org. You can see everyone's proposals there, and anyone can work out an idea with feedback from the community and core committers. Once a prototype or game plan is approved, it gets moved into the core issue queue for full implementation.

For a really good look at the changes in Drupal 8 development and the way new features are moving into core, Gábor Hojtsy has a great presentation, Checking on Drupal 8's Rapid Innovation Promises.

Oct 17 2016
Oct 17

Back in August we announced that we were moving our site to Pantheon hosting. Last month we completed the migration and Blake wrote a post about the process. This month I'm going to take a look at some performance comparisons between our previous infrastructure and our shiny new home.

Background

Prior to moving our site to Pantheon it was hosted on Linode, using a couple of different VPS servers that we managed ourselves with a bit of help from Lullabot. Our old Linode infrastructure consisted of a single web server running Varnish, Solr, Memcache, and Apache, along with a few other servers for testing and DevOps. It was always plenty fast. The choice to move to Pantheon wasn't because we hoped for a performance improvement, but still, we thought it would be a fun exercise to see how the change affected the performance of our site.

My hypothesis

They say that if you're going to measure something you should know what questions you want to answer before you start. Because if you go in saying, "to see what happens", that's what you'll do. See what happens. So I wanted to answer this question: How did moving our site from Linode to Pantheon affect the performance of our site, measured in response time, for both members and non-members?

Going into this, I expect that Pantheon will perform better than our previous setup, though I don't really have a sense of how much better. Hosting Drupal sites is, after all, what they do. I don't think our site was slow on Linode, but I also know that there are a lot of infrastructure and performance tweaks we never got around to making because they were never a top priority.

What should I test?

I want to see what response time looks like for various important pages on our site, as well as a few pages that are good samples of common page variants. So I came up with the following list of pages:

  • / : Our home page: most people's first impression of Drupalize.Me, and the content dashboard for authenticated users.
  • /tutorials : The main listing of tutorials on our site; the 2nd most popular page on our site.
  • /pricing : This page is important when it comes to converting users to paid members, so we want to set a good impression.
  • /user : Returning users go here to sign in, a common task. This is also the account dashboard for authenticated users.
  • /tutorial/core-migration-modules?p=2578 : Example of a written tutorial with an embedded video.
  • /videos/build-your-first-page-symfony-3?p=2603 : Example of a standalone video tutorial.
  • /series/drupal-8-theming-guide : Example of a series, or guide, landing page.
  • /blog/201607/why-learning-drupal-hard : Example of a blog post with a few comments.
  • /search?query=pantheon : Example of a search query.

In the future we might want to test things like navigational scenarios. For example: an anonymous user navigating to a blog post, leaving a comment, and then navigating to the contact page. For now though, we're after some basic response time comparisons. So this feels like a good list.

Set up

Before running the tests I did a bit of configuration on our site to facilitate testing. First, I created a dummy user on both environments and configured it as if it was a normal monthly personal membership. This way I have an account I can use for testing the authenticated user experience.

I also made sure I could answer these two questions in advance:

  • Are your tests going to be performed against the live site? If so, do you have a way to quickly abort them?
  • Do your tests create dummy content? How are you going to make sure that content gets cleaned up afterwards?

Establish a baseline

I started by gathering some basic information using cURL. We'll use curl to request HTTP headers from the environments, and time to see how long our curl command takes. This will give us some information about the current environment, and a rough idea of what we can expect for a single page request.

Linode

time /usr/bin/curl -I https://drupalize.me/tutorials
HTTP/1.1 200 OK
Date: Fri, 16 Sep 2016 16:24:40 GMT
Server: Apache
Strict-Transport-Security: max-age=15552000
X-Drupal-Cache: MISS
Expires: Sun, 19 Nov 1978 05:00:00 GMT
X-Content-Type-Options: nosniff
Content-Language: en
X-Generator: Drupal 7 (http://drupal.org)
Link: <https://drupalize.me/tutorials>; rel="canonical",<https://drupalize.me/tutorials>; rel="shortlink"
Last-Modified: Fri, 16 Sep 2016 16:24:40 GMT
Vary: Accept-Encoding
Content-Type: text/html; charset=utf-8
X-Varnish: 2623806 2725564
Age: 23
Via: 1.1 varnish-v4
ETag: W/"1474043080-0-gzip"
Cache-Control: no-cache, must-revalidate, post-check=0, pre-check=0
>> 0.57 real         0.02 user         0.00 sys

Pantheon

time /usr/bin/curl -I https://drupalize.me/tutorials
HTTP/1.1 200 OK
Date: Fri, 16 Sep 2016 16:25:37 GMT
Content-Type: text/html; charset=utf-8
Connection: keep-alive
Set-Cookie: __cfduid=db4a4fc18bf748493351d2d6ae784af911474043137; expires=Sat, 16-Sep-17 16:25:37 GMT; path=/; domain=.drupalize.me; HttpOnly
Cache-Control: public, max-age=900
Content-Language: en
Expires: Sun, 19 Nov 1978 05:00:00 GMT
Last-Modified: Fri, 16 Sep 2016 16:25:24 GMT
Link: <https://drupalize.me/tutorials>; rel="canonical",<https://drupalize.me/tutorials>; rel="shortlink"
X-Content-Type-Options: nosniff
X-Drupal-Cache: MISS
X-Frame-Options: SAMEORIGIN
X-Generator: Drupal 7 (http://drupal.org)
X-Pantheon-Endpoint: 089c557c-2188-434f-b435-827816b210ba
X-Pantheon-Styx-Hostname: styx480365c9
X-Styx-Req-Id: styx-1bee92be066d604d0c8eb52711752b8a
X-Styx-Version: StyxGo
X-Varnish: 51631136 64695802
Age: 12
Via: 1.1 varnish-v4
Vary: Accept-Encoding, Cookie, Cookie
Strict-Transport-Security: max-age=15552000
Server: cloudflare-nginx
CF-RAY: 2e35ace6dc5a555e-ORD
>> 0.24 real         0.11 user         0.01 sys

The "real" value from the time command is probably the most interesting thing in this output. It gives you a rough idea of how long it takes for the site to respond to a single request. Which basically amounts to: how long does it take Drupal (and all the layers in front of it) to service my request? Shorter is better. In both of these examples you can see the X-Varnish: 51631136 64695802 header, which indicates to me that these anonymous requests are actually being serviced by Varnish, and aren't even making it to Drupal. It's also why they're so fast. In this instance we're really testing the speed at which Varnish can return a page.

Cache busting

What about if we force our requests to bypass the Varnish cache by adding a NO_CACHE cookie?

Linode

time /usr/bin/curl -I -H "Cookie: NO_CACHE=1;" https://drupalize.me/tutorials
HTTP/1.1 200 OK
Date: Fri, 16 Sep 2016 17:15:11 GMT
Server: Apache
Strict-Transport-Security: max-age=15552000
X-Drupal-Cache: HIT
Etag: "1474046080-0"
Content-Language: en
X-Generator: Drupal 7 (http://drupal.org)
Link: <https://drupalize.me/tutorials>; rel="canonical",<https://drupalize.me/tutorials>; rel="shortlink"
Cache-Control: public, max-age=900
Last-Modified: Fri, 16 Sep 2016 17:14:40 GMT
Expires: Sun, 19 Nov 1978 05:00:00 GMT
Content-Type: text/html; charset=utf-8
>> 0.58 real         0.02 user         0.00 sys

Pantheon

time /usr/bin/curl -I -H "Cookie: NO_CACHE=1;" https://drupalize.me/tutorials
HTTP/1.1 200 OK
Date: Fri, 16 Sep 2016 17:14:23 GMT
Content-Type: text/html; charset=utf-8
Connection: keep-alive
Set-Cookie: __cfduid=d2a7c943f0a16e2620050e4ffe8fd29cf1474046063; expires=Sat, 16-Sep-17 17:14:23 GMT; path=/; domain=.drupalize.me; HttpOnly
Cache-Control: public, max-age=900
Content-Language: en
Expires: Sun, 19 Nov 1978 05:00:00 GMT
Last-Modified: Fri, 16 Sep 2016 17:04:59 GMT
Link: <https://drupalize.me/tutorials>; rel="canonical",<https://drupalize.me/tutorials>; rel="shortlink"
X-Drupal-Cache: HIT
X-Frame-Options: SAMEORIGIN
X-Generator: Drupal 7 (http://drupal.org)
X-Pantheon-Endpoint: 089c557c-2188-434f-b435-827816b210ba
X-Pantheon-Styx-Hostname: styx480365c9
X-Styx-Req-Id: styx-2efd3f16e56111a76a349f5b3ab8e14b
X-Styx-Version: StyxGo
X-Varnish: 77632744
Age: 0
Via: 1.1 varnish-v4
Vary: Accept-Encoding, Cookie, Cookie
Strict-Transport-Security: max-age=15552000
X-Content-Type-Options: nosniff
Server: cloudflare-nginx
CF-RAY: 2e35f45652fc256d-ORD
>> 0.29 real         0.01 user         0.02 sys

Notice that the X-Varnish: 77632744 header only contains a single ID this time instead of the 2 numbers it showed before. This indicates that Varnish was not able to service the request, and thus passed it along to Drupal. We are still getting cached results from Drupal though: the X-Drupal-Cache: HIT indicates that the content was served from Drupal 7's anonymous page cache.

Authenticated users

So far all the data we've looked at is for anonymous users. That is, people who are browsing our site but are not signed in to their account. As a business that sells membership subscriptions, our goal is to convert anonymous users to subscribers, and subscribers always navigate our site while signed in. So we want to make sure that the experience is a good one for them as well.

Before doing any testing I fully anticipated that the experience would be slower for authenticated users. When you're signed in to our site we customize the experience in a lot of different and unique-per-user ways that make doing things such as caching the HTML of an entire page difficult. The page is unique for each person. So we already know that building the page for an authenticated user is going to be more expensive.

In order to generate authenticated requests using curl we can use the session cookie from a session in our browser. Here's how to find that. Sign in to your site in your favorite browser. Then find the cookie whose name starts with either SESS or SSESS followed by a random string. Copy the cookie name and value, and then use them as arguments to curl using the --cookie flag like so:

curl --cookie "{cookie.name}={cookie.value}" https://drupalize.me/tutorials

Linode

time /usr/bin/curl -I --cookie "SSESS77386d408b0660b92f2dbc30c5675085=Xawrv1CllbUwC6ksX3qq7Ya2cbwitQv7xF33baJ2644" https://drupalize.me/tutorials
HTTP/1.1 200 OK
Date: Fri, 16 Sep 2016 17:22:33 GMT
Server: Apache
Strict-Transport-Security: max-age=15552000
Expires: Sun, 19 Nov 1978 05:00:00 GMT
Cache-Control: no-cache, must-revalidate, post-check=0, pre-check=0
X-Content-Type-Options: nosniff
Content-Language: en
X-Generator: Drupal 7 (http://drupal.org)
Link: <https://drupalize.me/tutorials>; rel="canonical",<https://drupalize.me/tutorials>; rel="shortlink"
Content-Type: text/html; charset=utf-8
>> 0.75 real         0.02 user         0.00 sys

Pantheon

time /usr/bin/curl -I --cookie "SSESS77386d408b0660b92f2dbc30c5675085=Ifrv29Rrk3RZ2DdUWhZDUhmCYzdFw_J0n0p217GXMTY" https://drupalize.me/tutorials
HTTP/1.1 200 OK
Date: Fri, 16 Sep 2016 17:26:45 GMT
Content-Type: text/html; charset=utf-8
Connection: keep-alive
Set-Cookie: __cfduid=d32d0c09a57b7ae447b943ebee6427dc81474046805; expires=Sat, 16-Sep-17 17:26:45 GMT; path=/; domain=.drupalize.me; HttpOnly
Cache-Control: no-cache, must-revalidate
Content-Language: en
Expires: Sun, 19 Nov 1978 05:00:00 GMT
Link: <https://drupalize.me/tutorials>; rel="canonical",<https://drupalize.me/tutorials>; rel="shortlink"
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN
X-Generator: Drupal 7 (http://drupal.org)
X-Pantheon-Endpoint: 089c557c-2188-434f-b435-827816b210ba
X-Pantheon-Styx-Hostname: styx480365c9
X-Styx-Req-Id: styx-22105de3110b076241cad5d6b9e44e61
X-Styx-Version: StyxGo
X-Varnish: 83009225
Age: 0
Via: 1.1 varnish-v4
Vary: Accept-Encoding, Cookie
Strict-Transport-Security: max-age=15552000
Server: cloudflare-nginx
CF-RAY: 2e3606762f2f2597-ORD
>> 0.81 real         0.01 user         0.01 sys

Response time in seconds (the "real" value from time):

                      Linode    Pantheon
Anon. + Varnish/CDN   0.57      0.24
Anon. + No Cache      0.58      0.29
Authenticated         0.75      0.81

This shows that for a single request Pantheon significantly outperforms our Linode setup, but that Linode handles authenticated requests slightly better.

Calculating concurrent users

The above tests really only measure the performance of a page without accounting for load. We've just learned how fast a page from our site can theoretically be served, but this doesn't really tell us much about the underlying infrastructure's ability to handle multiple concurrent users.

Individual page time is the thing we can affect the most as developers, but the underlying infrastructure impacts concurrency. When load testing we're not necessarily testing how fast Drupal or any of our custom code is. We are actually testing how well the given infrastructure can handle Drupal and our custom code while serving multiple users at the same time. In order to gather some more data I performed a load test, simulating normal load on our site.

So, what is normal load?

One way to approach this is to determine the average number of concurrent users you expect to be using your site and then run your test with that many users. I did this by looking at our Google Analytics stats for the last month and doing some quick math in order to calculate the average number of people actively using our site at any given time.

Total sessions for the last 30 days: 48,585
Average length of each session: 6 minutes 32 seconds (392 seconds)

concurrent_users = (total_session_for_month * average_time_on_site) / (3600 * 24 * 30)
7.34 = (48585 * 392) / (3600 * 24 * 30)
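If you want to sanity check that arithmetic from the command line, bc will do it; the numbers are the same ones pulled from our analytics above:

# (sessions * average session length in seconds) / seconds in 30 days
echo "scale=2; (48585 * 392) / (3600 * 24 * 30)" | bc
# prints 7.34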

Another, and perhaps more common, use for load testing is to try and get a sense of whether or not your application is going to hold up when you get a traffic spike on awards night. A site like Grammy.com for instance sees relatively little traffic 364 days out of the year, but on awards night, that traffic spikes to extremely high levels. In order to ensure that the site remains available during that traffic spike you might try and calculate the number of users you think will use the site in the given period and run that simulation instead. The end result is still X concurrent users.

For good measure, when load testing I would usually add 10% to this number.

Use Siege

I'm not going to cover Siege in depth here, but it is another technique for getting an idea of how well a page performs. The difference is that tools like Siege make multiple concurrent requests and average the results, so you get a more accurate picture. Our single-request example above is susceptible to network latency and other variations that can skew the results, so an average is likely to be a bit more accurate. A single command gets you surprisingly far, as sketched below.

Read more about using Siege to test the performance of your site in this blog post from earlier this year.
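Just to give a flavor of what a basic run looks like, here is the sort of command I have in mind; the urls.txt file (one full URL per line) and the numbers are purely illustrative:

# Hit the URLs in urls.txt with 8 concurrent simulated users for 2 minutes.
siege --concurrent=8 --time=2M --file=urls.txt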

Using JMeter

Instead, for this test I'm going to use Apache JMeter to configure a test suite, and then run those tests via BlazeMeter.

Apache JMeter is a Java application that can be used to load test web applications. It is highly configurable, and can be used to simulate virtually any scenario you can imagine. In addition it can be used to simulate any number of concurrent users. It comes with a complete set of plugins for graphing and analyzing test results.

At a certain point you're going to want to simulate more users than your laptop has the resources for. JMeter has the ability to perform distributed testing by setting up a master instance that delegates to any number of slave machines to do the heavy lifting. Thus, you can scale your tests to any size. BlazeMeter is a service that understands how to read a JMX test file, and do this autoscaling for us. Bonus!

So here's what I did.

I started by installing the BlazeMeter Chrome plugin, which effectively allows you to record your active browser session, turn it into a JMX file, and upload it to BlazeMeter. This was a great way to perform some quick/simple tests.

I then downloaded those tests and opened them in JMeter so I could further tweak the scenarios and learn a bit more about how JMeter works. This ended up being great because I could run/debug my scenarios locally, and even do some initial testing for lower levels of concurrent users. I actually had a lot of fun playing around with JMeter once I got the hang of it.
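Once you have a test plan saved locally, you can also run it headless, which makes repeat runs much easier. The file names here are hypothetical:

# Run the plan without the GUI (-n), reading the test file (-t) and logging results (-l).
jmeter -n -t drupalizeme-anon.jmx -l results-anon.jtl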

Screenshot of JMeter showing list of summary results

Remember that list of URLs above that I wanted to test? I configured JMeter to read in a list of URLs from a CSV file, and then set up scenarios to test the set of URLs both as an anonymous user, and as an authenticated user. Finally, I generated lots of graphs because I love graphs.

Screenshot of JMeter load testing application

I then ran those scenarios from my localhost a couple of times, both on the Linode instance of the site, and on the Pantheon instance. In both cases, I had 7 concurrent users, and just for a few minutes, mostly as a litmus test. This still produced some useful information. JMeter allowed me to export a summary of response times from the tested URLs to CSV files, which I then imported into Numbers to make even more graphs.

This simple comparison allowed me to get a sense of how both Linode and Pantheon perform for each URL and for both anonymous and authenticated users.

This data represents the response time that you could expect as a user when our site is under normal load.

Graph showing summary results of JMeter tests. Pantheon generally outperforming Linode in response time.

Overall, this shows positive gains for almost every scenario on Pantheon. In most cases the gains are in the range of 30 to 50ms. In some cases, like /user for authenticated users (viewing your account dashboard), the gains are actually quite substantial: Linode 731ms vs. Pantheon 343ms.

Check out the JMX files for the above tests (which are also used below). Perhaps they will be useful as a starting point for your own load test suite.

BlazeMeter

Of course, those numbers are reflective of what you can expect if you're the only person using the server at a given time. What about the more likely scenario where you're sharing resources with a number of other users? Remember how we calculated concurrent users earlier?

To test this, I uploaded the JMX files from my JMeter tests above to BlazeMeter. I then used their free plan, and maxed out all available resources. 50 concurrent users for 20 minutes with a ramp up time of 900 seconds. So start with 1 user, and gradually increase to 50 over the course of 15 minutes and then continue to stress test with 50 concurrent users for an additional 5 minutes.

I ran this test once for Linode, and once for Pantheon. Because my JMeter tests contain 2 thread groups (one for authenticated users, one for anonymous users) and BlazeMeter runs each group separately the resulting graphs show two distinct scenarios. The first 20 minutes is anonymous traffic, and the second 20 minutes is authenticated.

Here's a comparison of average response times from all scenarios for the two. Linode in blue. Pantheon in yellow.

Comparison of Linode and Pantheon response times over time relative to concurrent users.

The following graphs show response time relative to number of concurrent users. In both cases you can see that adding more anonymous users has very little impact on overall response time. This is to be expected, as this should essentially all be cached by Varnish. On both environments I would anticipate that you could continue to increase the number of users (blue line) with little to no real effect on the response time (purple line).

Where it gets interesting is the second part of each graph, which shows how adding more authenticated traffic impacts the response time. My analysis of these graphs is that for just a couple of authenticated users Linode performed marginally better than Pantheon. However, as the load increased, response times degraded more rapidly on Linode than on Pantheon.

Linode

Graph of response time vs. concurrent users on Linode

Pantheon

Graph of response time vs. concurrent users on Pantheon

Summary and conclusions

I don't have a whole lot of experience doing load testing so this was a fun experience for me. I got to learn some new tools, and look at a lot of pretty graphs.

I tested response time, using various methods, for both anonymous and authenticated traffic on the Drupalize.Me site in order to get a sense of how the move to Pantheon for hosting impacted performance. Verdict? It was a good choice. Pantheon performs better in almost every case. Although the differences are often only on the order of 50 milliseconds, those milliseconds add up, and perceived speed matters a lot to the users of our site.

As I said at the start, this is basically the outcome I expected, though I was prepared for the differences to be a bit bigger; any win is a big win when it comes to performance. In addition, these are wins that we gained by allowing someone else to manage our hosting infrastructure for us, which is an important win itself. As we've pointed out in previous posts in this series, this change allows us to focus more on producing the best Drupal training material. Pantheon can help us make sure you get it super fast.

Next steps

In addition to the already faster response times, I'm super excited about some of the tools that Pantheon provides us that will help us make this even better in the future. For example, we now have access to application profiling data from New Relic. I've barely started digging in yet, but I've already noticed a couple of SQL queries we could either cache or eliminate to shave off quite a lot of time on the front page and pricing page when loading from a stale cache.

Graph from new relic showing application response time increasing during load testing.

Pantheon also supports PHP7. Combine that with their MultiDev tools and we can pretty easily test our site on PHP7, see if everything works, then easily apply those same changes to our live environment. I anticipate that will bring yet further speed increases.

Resources

Want to do some load testing yourself? Here are some resources I found useful when figuring this all out:

Oct 10 2016
Oct 10

As a Drupal user, developer, and trainer, I have seen and experienced the spirit of a growing and thriving tech community. And while far from a utopian society, the Drupal community has been a personal source of good friends, personal and professional development, career opportunities, and many (many!) teaching and learning opportunities. Part of what brings this community together is the shared appreciation of and opportunities that arise from using Drupal and other Open Source Software.

We have some like-minded neighbors in the Open Hardware community. Last Friday I had the opportunity to attend the Open Hardware Summit 2016, an annual event of the Open Hardware Association, and catch the spirit of Open Source in a fresh way: from the perspective of the Open Hardware community.

What is Open Hardware? As defined by the Open Hardware Association, which just announced a new Open Hardware Certification:

Open source hardware is hardware whose design is made publicly available so that anyone can study, modify, distribute, make, and sell the design or hardware based on that design. The hardware's source, the design from which it is made, is available in the preferred format for making modifications to it. Ideally, open source hardware uses readily-available components and materials, standard processes, open infrastructure, unrestricted content, and open-source design tools to maximize the ability of individuals to make and use hardware. Open source hardware gives people the freedom to control their technology while sharing knowledge and encouraging commerce through the open exchange of designs.

--http://www.oshwa.org/definition/

You probably have heard of some examples of open hardware, such as the Arduino Uno/Genuino. If you go to the project page for Arduino Uno/Genuino, you'll find schematics, reference design, and board size information, and anyone is welcome and invited to download this information and design their own versions and sell them.

Open Hardware Summit 2016

This year's Open Hardware Summit took place in Portland, Oregon at the Crystal Ballroom (the same venue for past Write the Docs events in Portland, OR, another most excellent event). The summit primarily took place in the main ballroom, where speakers were each given 15 minutes for their presentations, broken up by coffee and lunch breaks. Lola's Room on the floor below hosted a dozen or so vendor booths, featuring various electronic learning kits, Internet of Things platforms, live 3D printing displays, and other delightful things.

Presentations varied and largely consisted of inspiring project reports, lessons learned, failures and success, live demos, invitations to collaborate and contribute (for both hardware and software developers), and even a personal invitation to the Shenzhen region in China, a renowned location for electronics manufacturing.

My personal interaction with open hardware is from a hobbyist, learner, and teacher's perspective. I've interacted with a variety of boards from various platforms and am utilizing them for personal projects and also for teaching my 10, 12, and 14-year-old nieces programming and physical computing applications such as robotics, wearable electronics, and other embedded systems.

While all of the presentations were interesting in one way or another, here were some of my favorites.

Steve Hodges: micro:bit Open Source Physical Computing Platform for CS Education

In this presentation, Steve presented and demonstrated the micro:bit, an initiative brought to life in the UK by a partnership with the BBC. The micro:bit is designed for use by kids in the classroom. He demo'd a live Scratch-like block programming interface using Blockly, a Javascript framework developed by Google. Kids use the web interface to connect logic, preview their programs, and program their boards. Both the hardware and software components of this program were impressive. I walked away inspired to look into Blockly, and found several examples of Blockly to program an Arduino Uno (both paid and free) that I want to try out and possibly fork to create a Blockly interface to program a Circuit Playground from Adafruit Industries.

Rianne Trujillo: Open Source Hardware in our National Parks

Rianne presented a couple of projects she's been leading about embedded and interactive displays at national parks. One of the displays she works on is a bird/egg matching game. Inside the fabricated birds' eggs was an RFID tag. When placed in the correct bird's dish, the bird's song was played. (If incorrect, a voice said, "Sorry!") I loved hearing about the challenges and goals of the project, including making sure the display was durable, accessible to all, didn't fail if one egg went bad, and was relatively easy for park staff to maintain and replace components.

Jason Kridner: Open vs. Collaborative: Lessons from Linux and Google

This presentation was really a plea for others to contribute to Linux, but what I took away from it was a better understanding of what open hardware is and isn't. I also learned more about a new platform and board that I had only heard of by name, but didn't really know what it was. Jason Kridner is a co-founder of the BeagleBoard.org Foundation, a "US-based non-profit corporation existing to provide education in and promotion of the design and use of open-source software and hardware in embedded computing." I learned that the Raspberry Pi isn't built on open hardware and that there is an open hardware alternative to the Raspberry Pi, the BeagleBoard. Through my subsequent research, I learned that the BeagleBone Black is a low-cost board for developers and hobbyists that runs Linux. There is a whole community and library of resources around the BeagleBoards and you can find out more at https://beagleboard.org/.

Final impressions

There was so much to inspire and learn from at the Open Hardware Summit. From 3D printed prosthetic hands, to touch sensors for surgeons in India performing surgery on the pancreas, to nurturing project communities, to makerspaces and entrepreneurship in refugee communities in Beirut, Lebanon, I walked away with so many ideas (and tabs open on my phone and computer). I think it will take all year to process all the things I learned and want to learn more about!

The event was live-streamed and you can watch all the presentations on this UStream channel.

Getting involved in open hardware communities has renewed my sense of appreciation for tech communities like the Drupal community. It is so wonderful to see the amazing things that people can make when they build things together.

Oct 04 2016
Oct 04

It's that time again. October 5th brings the second minor version of Drupal 8 since moving to a semantic versioning release schedule. We've taken the time to dig through the change records and release notes (in order to make sure our tutorials stay up to date) and we want to share some of the new features and functionality you can look forward to when you upgrade to version 8.2.

For any particular release you can see the release notes with complete change records via the releases page for Drupal core. Each one of these will contain brief descriptions of the issues that have been committed since the last release. This is a great place to get a quick overview of new experimental module additions. For example, you can see the experimental modules in Drupal 8.1.0 (Migrate, Migrate Drupal, Migrate Drupal UI, BigPipe, and Inline Form Errors), and in the first beta release of 8.2 a new experimental module was added: Place Block. The release notes will also call out particular stable features or functionality that may be of interest to a wide audience. Things like enabling revisions by default fall into this category.

So, what's new in Drupal 8.2?

The most visible additions to this release of Drupal affect the content administration experience. Dries has written about the outside-in user interface pattern, which could help improve the editing experience for users new to Drupal. Several new experimental modules have been added in this release to help explore this interaction pattern. Let's take a look at Settings Tray, Place Block, Content Moderation, and Date Range.

Settings Tray

The Settings Tray experimental module allows a site administrator to edit elements of a page without visiting an administration page. After enabling the module you'll notice a new edit button in the main toolbar. This launches edit mode, where page elements become clickable targets. From here you can click on a portion of the page, for example the site name. After clicking on the element you'd like to change, a tray pops out on the side of the page with an editing interface. From here you can make and save changes to the element. Here is the module in action on a demo site:

Settings Tray module in action

The official documentation on drupal.org is still a bit light. As an emerging experimental module this is a great opportunity to give the functionality a try on your site and share your thoughts in the issue queue.

Place Block

Another new experimental module that provides an outside-in editing experience is Place Block. It probably won't be much of a surprise that the Place Block module helps site administrators, wait for it, place blocks on their site. Much like with the Settings Tray module, it's probably easier to see this in action.

Place Block module in action

After clicking the Place block link in the toolbar the theme regions become visible. From here you can click the plus button in any region to add a block. After selecting and configuring the block the page refreshes with your new addition already in place.

Content Moderation

Drupal 8.2 also adds an experimental module to help with Content Moderation. This module allows an administrator to define a variety of additional states in a publishing workflow, and to control which roles are allowed to make state transitions. This lets Drupal core support organizations with more complex publishing workflows. By default, 3 states are created after enabling the module: Draft, Published, and Archived.

Content Moderation configuration screen

Once the states have been configured to your liking you can then set up the permissions for which roles are allowed to make which transitions.

Content Moderation transitions

With your states and transitions in place you can then configure each content type's moderation settings independently.

Content Moderation content type configuration

Date Range

This release also includes a new Date Range module, which will add a new date field with a beginning and end.

Date range configuration

Many more smaller changes

Much of the rest of the work that's gone into this release consists of cleanup and bug fixes. The migration suite of modules retains its experimental status, so it sees lots of bug fixes in this version. A migration from Drupal 6 to Drupal 8 is now supported, but because the Migrate API itself is still alpha, stability changes may be required in a future release. The BigPipe module, which helps improve perceived page rendering performance, is nearing beta stability. There have also been dozens of bug fixes to improve Drupal's REST web services. You can reference the release notes, as mentioned above, for the full details.

Sep 21 2016
Sep 21

If you're following along at home, you may have seen that we recently made the move to Pantheon hosting. Last week during our maintenance window, Joe and I worked through our migration checklist, and officially moved the site over to our new host. The process had a few hiccups, but we thought it would be interesting to take a look at what went into our migration process. Hopefully, sharing what went into our planning process, as well as what is in our pipeline for improvements now that we're on Pantheon, will help you if you ever find yourself facing a similar project.

One of the most difficult parts of this project was figuring out where to start. Pantheon has several helpful guides to get you started on their platform. In fact, in my initial "proof of concept" migration it only took me about 90 minutes to get a version of our site running on Pantheon. And, truth be told, a lot of that time was spent reorganizing our git repository, importing a database backup file, and running rsync to copy our files over. Doing an initial proof of concept like this increased our confidence in the actual migration process, but it also brought up a checklist of things we would need to figure out to make the transition as smooth as possible.

Our old Linode infrastructure consisted of several servers running Varnish, Solr, Memcache, Apache, and Jenkins. We needed to see where those fit in the new system for us:

  • Varnish: Moving to Pantheon we'd lose the ability to customize our Varnish configuration file. We ultimately found that our Varnish configuration customizations were either no longer needed, or we could replicate similar functionality by using the Context HTTP Headers module.
  • Solr: Adding Solr to our Pantheon site was, for the most part, a simple click of the button. We've since discovered a few differences between the version of Solr we were running and what is provided by Pantheon but it looks like a few configuration tweaks on the Drupal side will be able to account for those changes.
  • Memcache: Instead of Memcache, Pantheon supports Redis as a faster drop-in replacement for Drupal's database caching layer. Beyond tweaking a few lines of our settings.php file this was an easy win. In fact, some of our early performance tests seem to suggest that the site's performance improved in part due to this particular caching change. (Stay tuned for more details about that in a future blog post.)
  • Apache: Pantheon also uses the Nginx web server instead of Apache. This meant that any additional customizations we had made in our .htaccess file also had to be accounted for using a different approach. While quite a few of the pieces that make up our overall technology stack changed during the migration there was very little change required on our part to get things working.

The part of the project that was the most time consuming actually had very little to do with Pantheon and more to do with our internal tools and processes. Our old infrastructure contained setup instructions and a Vagrantfile that anyone on the team could use to get a development environment that mirrored our infrastructure up and running relatively quickly. We've also been enthusiastic users of Tugboat.QA to automatically build a full site for each pull request pushed to GitHub. A Jenkins server is responsible for periodically creating database backups, copying and sanitizing the database from production to our QA and test sites, as well as doing the actual deployments of new code. Figuring out how to accommodate changes to our workflow, and where the differences (and preferences) with Pantheon came up took a bit of time. All in all, we made small modifications to our existing workflow to adopt the very similar Pantheon workflow. This allowed us to decommission several Jenkins jobs, overall reducing the amount of critical infrastructure code we need to maintain.
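For what it's worth, the heart of those backup-and-sanitize jobs boils down to a couple of Drush commands along these lines; the site aliases here are made up for the example:

# Dump the production database to a compressed backup file...
drush @drupalizeme.live sql-dump --gzip --result-file=/backups/drupalizeme.sql
# ...then, after loading it into a QA/test environment, scrub emails and passwords there.
drush @drupalizeme.test sql-sanitize --yes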

Pantheon provides several really useful tools for working with sites on their platform. The site dashboard itself allowed me to enable our Solr and Redis servers, schedule regular backups for each environment (Dev, Test, Live), download drush aliases for our Pantheon sites, and merge code with just a couple of clicks.

Things really started to take off after installing Terminus. Terminus is a command line tool, like Drush, that enables interaction with sites hosted on Pantheon. We're only scratching the surface of what's possible with Terminus, but we've used it to do things like return connection information in order to copy our database from Linode to Pantheon (during testing and the actual final migration), and to clear caches and deploy code on various environments while trying to debug unrelated bugs that popped up last week.
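To give you a taste, here is roughly what the commands we lean on look like with a recent Terminus release; the site and environment names are placeholders, and older Terminus versions use a slightly different syntax:

# Show connection details (MySQL, SFTP, Git) for an environment.
terminus connection:info drupalizeme.live
# Run a Drush command against a Pantheon environment, e.g. clear all caches on a Drupal 7 site.
terminus drush drupalizeme.dev -- cache-clear all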

Another tool Pantheon provides that we're just getting started with is Quicksilver. Quicksilver hooks allow us to configure our site to react to particular workflows (in Pantheon terminology). This enables us to do things like revert Features every time code is deployed or pushed to an environment. We'll also use these triggers to run our test suite to ensure that new functionality doesn't break the site in unexpected ways. So far we've been able to replicate much of our old infrastructure with a much smaller number of Jenkins jobs, and fewer moving pieces ultimately means less maintenance work for our small team.

While there are still a few rough edges we're working on smoothing over, we haven't even really had time to take advantage of everything Pantheon provides. Just via the dashboard alone we have optimizations to make based on suggestions from the integrated Site Audit tool, watchdog errors to clean up to reduce the size of our error logs, caching optimizations that will speed things up for users all across the site, as well as New Relic metrics which will allow us to profile our site in ways that weren't previously possible. We're also excited to give Kalabox a try, as a replacement for our Vagrant setup for simple localhost development environments.

Like any migration project, things didn't go perfectly smoothly for us. Here's a brief look at some of the hang-ups we encountered along the way, and the solutions we found.

When we originally considered the possibility of moving to Pantheon, they required a very particular configuration for the site's git repository. Specifically, the root of the repository also had to coincide with Drupal's root. The repository for our site contains our test suite, some helper scripts, patch files, and other miscellaneous documents. Thankfully we didn't have to figure out how to split off all of the non-Drupal material in our repository. Pantheon recently rolled out support for using a nested docroot. All that we needed to do to solve this problem was add a symlink. This meant that importing our code into our new Pantheon account was simply a matter of adding a pantheon.yml file, and a symlink from a directory called web to the docroot directory containing Drupal.
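For the curious, the whole change amounted to something like the following, assuming (as in our case) that Drupal lives in a directory named docroot:

# pantheon.yml at the repository root opts in to the nested docroot:
#   api_version: 1
#   web_docroot: true
# Pantheon expects that nested docroot to be a directory named "web", so point it at ours:
ln -s docroot web
git add pantheon.yml web
git commit -m "Serve the site from a nested docroot on Pantheon"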

We were definitely thankful for the addition of this nested docroot feature, but it's likely to cause some additional work and testing for us for a while. Pantheon provides their own version of Drupal core which we've added to our repository as an upstream remote. Any time we pull in changes from this upstream, we will have to manually move the files to our nested docroot. While it's not a difficult task it means that we're unable to use the dashboard button to automatically upgrade. We also ran into a small hiccup, where the location of a certificate file used to communicate with the Solr server was hard coded into a module. This caused problems for us when we tried to launch the Solr service. Their support team quickly diagnosed the issue and pointed us towards a fix. Within about a day their platform team even had a pull request ready to incorporate into the upstream repository. Not only was the response time great, but seeing progress towards a solution happening in the open on GitHub gave us a lot of confidence.

During testing and QA of the new Pantheon environment we also found a small bug. Pantheon as a platform has limitations on storing large files within the Drupal file system. Even with our previous host we have been storing our video files on Amazon's S3 service. We use the Filefield Sources module so that we can reference files in S3 on the video nodes on our site. On our previous host we had the max upload size set quite high, to account for a bug in Filefield Sources: during the upload validation process the max file size value is checked even when the file is being stored remotely. In our case, especially since we transfer videos to S3 manually and don't rely on uploading through Drupal forms, this caused issues with our video asset production process. Fortunately there was a very simple workaround that allowed us to solve the problem: unsetting the validation function that checks for file size if the upload location is remote.

You may have also experienced some frustrating behavior from the site last week, where periodically most of the links on the page were unexpectedly redirecting to one of our tutorial listing pages. Changes in the configuration of our caching infrastructure allowed a race condition bug to slip through our testing. The bug already existed on our site but hadn't reared its ugly head, and was unrelated to Pantheon, but this illustrates the importance and difficulty of fully testing a migration like this.

Now that we're getting settled into our new home, we are looking forward to continue optimizing, tweaking, and improving our site. With the help of the tools Pantheon provides, we will continue to work towards our goal of continuous deployment.

Aug 18 2016
Aug 18

DrupalCon is a great opportunity to learn all kinds of new skills and grow professionally. For the 3 days of the main conference in Dublin (September 27–29) there will be sessions on just about everything related to Drupal that you could want. One amazing opportunity that you may not be aware of though is the Mentored Sprint on Friday, September 30th. This is a great place for new folks to learn the ropes of our community and how to contribute back. What may be less talked about is the chance to be a mentor.

Why mentor at DrupalCon?

Mentoring provides you the opportunity to positively impact others while also gaining a lot for yourself. Through helping others one-on-one, you'll find that your own confidence grows from sharing your knowledge. You'll also probably pick up a trick or two in the process, and you'll have great opportunities to learn from the other mentors in the group. In addition to helping people directly, you'll make new connections with others in the community and walk away inspired and refueled to do more. Even if you've never mentored before, the mentor group has put a lot of effort into making the mentoring experience just as awesome and fun for mentors as it is for the learners.

What does mentoring involve?

Most mentors help new contributors set up their development environments, find tasks, and work on issues. You will go through mentor training that explains the tasks in detail and shows you how to help others find good issues to dig into. You should have experience setting up your own local environment and using the Drupal.org issue queue. There are a number of other tasks to help with as well though, like doing ticket triage ahead of time to make issue-finding easier during the sprints, doing issue reviews, or even just helping organize the day and make sure people know where to be. You’ll even get a special DrupalCon shirt just for mentors.

Sprint and Workshop

In addition to the main Mentored Core Sprint, there is also a First-time Sprinter Workshop for people who are completely brand new. This workshop runs from 9am–12pm and makes sure everyone is set up properly and understands how the issue queue and community process work. This is a more structured environment, and it's where you really lay the foundation for a successful day. You may find yourself involved in both of these events throughout the day, and they are both great fun.

We highly recommend getting involved in the sprint day, whether as a mentor or a mentee. It’s been a very rich experience for us on the Drupalize.Me team, as Blake said after he volunteered as a mentor at DrupalCon New Orleans:

I was incredibly impressed by the organization of the mentored core sprint on Friday. I came into the week feeling pretty jaded, and left feeling like I had fun and helped empower a few new contributors. It was a pretty great way to end the week. Kudos to the folks that have developed the mentor prep materials, the sprint on-boarding process is so much better than it used to be.

At the #DrupalCon sprints and @blakehall is helping some new contributors learn how to navigate the issue queue. pic.twitter.com/GecKMGgBOs

— Drupalize.Me (@drupalizeme) May 13, 2016

Jul 21 2016
Jul 21

I'm super excited to have been invited to be a keynote speaker for this year's DrupalCamp WI (July 29/30). If you're in the area you should attend. The camp is free. The schedule is shaping up and includes some great presentations. Spending time with other Drupal developers is by and large the most effective way to learn Drupal. So sign up, and come say hi to Blake and me.

Why is Drupal hard?

The title of my presentation is "Why is Drupal Hard?" It is my belief that if we want to continue to make it easier for people to learn Drupal we first need to understand why it is perceived as difficult in the first place. In my presentation I'm going to talk about what makes Drupal hard to learn, why it's not necessarily accurate to label difficult as "bad", and what we as individuals and as a community can do about it.

As part of the process of preparing for this talk I've been working on forming a framework within which we can discuss the process of learning Drupal. And I've got a couple of related questions that I would love to get other people's opinions on.

But before I can ask the question I need to set the stage. Close your eyes, take a deep breath, and imagine yourself in the shoes of someone setting out to be a "Drupal developer."

Falling off the Drupal learning cliff

Illustration showing the scope of required knowledge across the 4 phases of learning Drupal. Small at phase 1, widens quickly at phase 2, slowly narrows again in phase 3 and through phase 4.

When it comes to learning Drupal, I have a theory that there's an inverse relationship between the scope of knowledge that you need to understand during each phase of the learning process and the density of available resources that can teach it to you. Accepting this, and understanding how to get through the dip, is an important part of learning Drupal. This is a commonly referenced idea when it comes to learning technical things in general, and I'm trying to see how it applies to Drupal.

Phase 1

Graph showing Drupal learning curve, showing exponential growth at phase 1

When you set out to start, there's a plethora of highly-polished resources teaching you things that seem tricky but are totally doable with their hand holding. Drupalize.Me is a classic example: polished tutorials that guide you step-by-step through accomplishing a pre-determined goal. During this stage you might learn how to use fields and views to construct pages. Or how to implement the hook pattern in your modules. You don't have a whole lot of questions yet because you're still formulating an understanding of the basics, and the scope of things you need to know is relatively limited. For now. As you work through hand-holding tutorials, your confidence increases rapidly.

Phase 2

Graph of Drupal learning curve showing exponential decay of confidence relative to time at phase 2, the cliff

Now that you're done with "Hello World!", it's time to try and solve some of your own problems. As you proceed you'll eventually realize that it's a lot harder when the hand-holding ends. It feels like you can't actually do anything on your own just yet. You can find tutorials but they don't answer your exact question. The earlier tutorials will have pointed you down different paths that you want to explore further but the resources are less polished, and harder to find. You don't know what you don't know. Which also means you don't know what to Google for.

It's a much shorter period than the initial phase, and you might not even know you're in it. Your confidence is still bolstered based on your earlier successes, but frustration is mounting as you're unable to complete what you thought would be simple goals. This is the formulation of the cliff, and, like it or not, you're about to jump right off.

Phase 3

Graph of Drupal learning curve showing relatively flat and low confidence over time at phase 3

Eventually you'll get overwhelmed and step off the cliff, smash yourself on the rocks at the bottom, and wander aimlessly. Every new direction seems correct but you're frequently going in circles and you're starving for the resources to help. Seth Godin refers to this as "the dip", and Erik Trautman calls it the "Desert of Despair". Whatever label you give it, you've just fallen off the Drupal learning cliff. For many people this is a huge confidence loss. Although you're still gaining competence, it's hard to feel like you're making progress when you're flailing so much.

In this phase you know how to implement a hook but not which hook is the right one. You know how to use fields but not the implications of the choice of field type. Most of your questions will start with why, or which. Tutorials like those on Drupalize.Me can go a long way toward teaching you how to operate in a pristine lab environment, but only years of experience can teach you how to do it in the real world. As much as we might like to, it's unrealistic to expect that we can create a guide that answers every possible permutation of every question. Instead, you need to learn to find the answers to the questions on your own by piecing together many resources.

The scope of knowledge required to get through this phase is huge. And yet the availability of resources that can help you do it is limited. Because, as mentioned before, you're now into solving your own unique problems and no longer just copying someone else's example.

Phase 4

Graph of Drupal learning curve showing upswing of confidence, linear growth, at phase 4

If you persevere long enough you'll eventually find a path through the darkness. You have enough knowledge to formulate good questions, and the ability to do so increases your ability to get them answered. You gain confidence because you appear to be able to solve real problems. Your task now is to learn best practices, and the tangential things that take you from, "I can build a website", to "I can launch a production ready project." You still need to get through this phase before you'll be confident in your skills as a Drupal developer, but at this point it's mostly just putting in time and getting experience.

During this phase, resources that were previously inaccessible to you are now made readily available. Your ability to understand the content and concepts of technical presentations at conferences, industry blog posts, and even to participate in a conversation with your peers is bolstered by the knowledge you gained while wandering around the desert for a few months. You're once again gaining confidence in your own skills, and your confidence is validated by your ability to continue to attain loftier goals.

And then some morning you'll wake up, and nothing will have changed, but through continually increasing confidence and competence you'll say to yourself, "Self, I'm a Drupal developer. I'm ready for a job."

What resources can help you get through phase 3?

So here's my questions:

  • What resources do you think are currently available, and useful, for aspiring Drupal developers who are currently stuck in phase 3, wandering around the desert without a map asking themselves, "Panels or Context?"?
  • What resources do you think would help if they existed?
  • If you're on the other side, how did you personally get through this dip?

Responses from Lullabot

I asked this same question internally at Lullabot a few days ago, and here are some of the answers I received (paraphrased). Hopefully this helps jog your own memory of what it was like for yourself. Or even better, if you're stuck in the desert now, here's some anecdotal evidence that it's all going to be okay. You're going to make it out alive.

For me, it was trial and error. I would choose a solution that could solve the particular problem at hand most efficiently, and then I would overuse it to the extreme. The deeper lessons came months later when changes had to be made and I realized the mistakes I had made... Learning usually came also from working with others more experienced. Getting the confidence to just read others' code and step through it is also a big plus.

building something useful++. That's the absolute best way. Can't believe I forgot to mention it. Preferably something that interests you or fulfills your own need. You still fall off the cliff, but you at least see the fall coming, and your ability to bounce back is better.

At this stage I find that the best resources are people, not books or tutorials. A mentor. Someone that can patiently listen to your whines and frustrations and suggest the proper questions to ask, and who can give you the projects and assignments that help you grow and stretch.

Everything I know about Drupal I know through years of painful trial and error and shameless begging for help in IRC.

I spent a lot of time desperately reading Stack Overflow, or trying to figure a bug out from looking at an issue where the patch was never merged, or reading through a drupal.org forum where somebody tries to solve something but then just ends with "nevermind, solved this" without saying why.

I'd agree that people is what gets you through that. I learned IRC and how to write patches and get help from individuals and that is when the doors opened.

Another approach that really boosted me to the next level, especially early on in my career as a developer, was to work with someone that you can just bounce ideas off of. I'll never forget all the hacking sessions Jerad and I had back in the day. Coding at times can be boring, or the excitement of doing something awesome is self-contained. Being able to share ideas, concepts, and example code with someone that appreciates the effort or awesomeness of something you've done and at the same time challenges you to take it to the next level is priceless.

Printing out the parts of Drupal code I wanted to learn: node, taxonomy and reading comments and code like a gazillion times.

Try and code something useful so I could ask others for help. That's how I wrote the path aliasing module for core.

I often find that as you get into more complicated, undocumented territory, being able to read code is super valuable. You can often get lost in disparate blog posts, tutorials and forums that can lead you all sorts of ways. The code is the ultimate source of truth. Sometimes it takes firing up a debugger, stepping through the parts that matter to see how things are connected and why.

Jul 13 2016
Jul 13

On Tuesday, July 12th, the Drupal security team issued a Public Service Announcement (PSA) about a highly critical security release that happened today, Wednesday, July 13th, for 3 Drupal contributed modules. This security release gets the extra push of being ranked highly and a PSA because this very dangerous vulnerability will allow an attacker to execute their own PHP code on your site. Here are a few important things to know:

  • This is not a problem in Drupal core.
  • This is only for the contributed projects RESTWS, Coder (even if it is disabled!), and Webform Multiple File Upload.
  • If you do not apply the security updates to these modules on a site connected in any way to the internet, your site can, and very likely will, be hacked.

How to Secure Your Site

In order to make sure your site is secure, review all of your Drupal 7 sites to see if they are using any of the affected modules. If so, you need to either upgrade the module to the latest release that came out today, or find and apply the patch to the code directly.

You can find instructions on Drupal.org to update a module. We’ve also made our video lessons Updating Drupal Contributed Modules and Drush Commands for Site Administrators free for the next few weeks. The first will show you how to use Drupal’s built-in Update Manager to update the modules directly through the admin interface of your site, and the second will show you how to use Drush for the same process.

Bevan Rudge wrote a good article explaining how to prepare for this security release. While the release is already out now, it is still a good list of things you should go through. He also walks through the steps for applying the security fixes by manually applying the security patch.

Stay Informed

If you're just learning about this critical update from this blog post, you should make sure that you can be informed more quickly in the future. Even if none of these modules affect you, the next time it might be something that does. Drupal always does security releases on Wednesdays, and there are numerous channels for staying informed.

Start by making sure you have the core Update Manager module enabled, which provides you with an Available updates report on your site. This report shows you all new updates ready for your site. Most people are familiar with this report. If you go to the Settings tab in there, you can also configure the site to notify you by email whenever there are new releases — either generally, or just for security updates. Every site should have someone being notified of available security updates through Update Manager.

Speaking of emails, you can also put yourself on the security newsletter email list through Drupal.org. To subscribe, log in to Drupal.org, go to your user profile page and subscribe to the security newsletter by going to the Edit tab and then the My newsletters tab.

Beyond email, you can find all security releases on Drupal.org’s security advisories page. There are RSS feeds available for core, contrib, and public service announcements, like this one, which are used when there is a particularly critical situation. You can also follow @drupalsecurity on Twitter.

Jun 20 2016
Jun 20

One of the big changes in Drupal 8 is that Views, the most popular contributed module in Drupal 7, is now included as part of core. Learning Views is a key component of building Drupal sites. Aside from having this tool built into core now, the beauty of this change is that Views is almost identical to Views in Drupal 7. You can get started site-building with Views in Drupal 8 without waiting for any fancy version-specific instructions.

For most people who are new to Drupal, the first step after creating content is to learn how to list that content in various ways on your site. This is where Views is your tool of choice. The Views interface has changed very little between versions, as can be seen in these 2 screenshots of the default Views interface. As a matter of fact, there are only a few minor tweaks in less-used features. Many people won’t notice the differences.

Drupal 7
Drupal 7 Views interface

Drupal 8
Drupal 8 Views interface

You can easily follow a Drupal 7 views interface tutorial, like our Introduction to Views or Using Drupal series tutorials, in Drupal 8 and achieve the same results.

One thing you’ll note is that Drupal core is now using Views to power its listing pages, so a lot of default views are already enabled for you on your new site. In Drupal 7, you had to enable the default views before you could actually use them. Drupal 8 still comes with 2 that are not enabled out of the box, Archive and Glossary, but otherwise you’ll see that you have more default views in Drupal 8 — and almost all of them are enabled.

Differences

There are a few minor differences in the user interface. For instance, the “No Results Behavior” setting has been moved out of the Advanced area into the center settings column, and the word “clone” has been changed to “duplicate” — but these are minor changes that shouldn’t break your stride.

There is also a very small list of items from Drupal 7's Views that are absent from Drupal 8:

Theme information link is gone.
In Drupal 7, you could get a handy list of the theme template files and suggestions for naming overrides right in the user interface, under the Advanced section. In Drupal 8, you should instead use Twig debugging to show you where the template file in use is located, and then override that file as you normally would in Drupal theming. (Also note that, of course, Views theme files are now in a core directory, core/modules/views/templates.) For some examples of template files and the naming conventions to use for overrides, take a look at api.drupal.org.

Field Language setting is gone.
There used to be another field under Advanced that let you set a language at the field level for your view. The underlying multilingual system received a lot of improvements in Drupal 8 and this setting was removed because “rows in entity views should have a language associated with them, and we should filter/sort based on language at the entity/row level, not at the field level.” (From the "Field language filter/sort/etc. for Views do not work and are not needed" issue.)

Advanced Help is no longer used.
Since Views is part of core, instead of using the Advanced Help module it now uses the core help system and online documentation.

Something New

Views isn't the only newcomer to core. Now that various web services modules have been added to core, when you enable them you will have access to a new "REST export" display type. This display type enables you to set a path and output your view's results as JSON (among other formats), which can be handy for integrating segments of your content with JavaScript.

Gotchas

Like most software, there are still bugs for Views in core, so if you find something that isn't working the same way as it did in Drupal 7, you might want to check out the open Views issues. Now that Views is part of core, you need to search within the Core project instead of the contributed Views project, and select the views.module component in the core issue queue.

Drupal Core Views issue queue

One issue worth mentioning, and which is likely to cause some furious cursing, is “When deleting a content type field the related View also is deleted”. In Drupal 7, when you deleted a field that was also used in a view, you would get a missing handler error in your view. With this bug in Drupal 8, when you delete a field from a content type, Drupal's configuration system will list all of the resulting configuration changes. If you see a view listed under deletions and you confirm the field deletion, that view will be deleted as well.

As you can see, there aren’t really many differences, so you should feel confident forging ahead with Views in Drupal 8 because you have an entire world of Drupal 7 Views tutorials to guide you on your way.

May 26 2016
May 26

Drupal 8.1.0 was released on April 20th. (Read the full release notes.) There are a few exciting things about this release. Most notably, this is the first time Drupal has done a scheduled feature release using the new semantic versioning and pre-set release schedule. Instead of 8.1, we have 8.1.0, and we got the release out on schedule! (Learn more about the Drupal 8 release schedule and semantic versioning.) This feature release added CKEditor WYSIWYG enhancements, some new APIs, an improved help page, and two new experimental modules: Migrate Drupal UI and BigPipe.

One of the big advantages to these 6-month feature releases is that we can add new things to Drupal core over time instead of having to wait for the next big version. This means Drupal core will change more over its life than previous versions. One of the downsides of this is that documentation and learning materials may get out of date more quickly. We’re excited about the possibilities of Drupal 8 releases and we’re dedicated to having the most up-to-date and accurate tutorials you can find. To keep on top of things, we are reviewing all of the Drupal 8 releases to see where our tutorials need to be updated, and we’re making sure we stay involved in the core issues and processes so we are aware of big changes that may be coming. This is the most exciting release cycle Drupal has ever had, and we’re loving the energy and the challenge of keeping up with it toe-to-toe. So without further ado, here is a brief summary of the big changes in 8.1.0 and a look ahead at some things that might be coming down the road.

BigPipe

The Drupal 8 BigPipe module provides an advanced implementation of Facebook's BigPipe page rendering strategy, leading to greatly improved perceived performance for pages with dynamic, personalized, or un-cacheable content. This is a huge win for Drupal 8 and can make your sites feel dramatically faster. You can see the difference it makes in a short demo video on the BigPipe documentation page. The work on this has been happening for over a year, and while it didn’t get into the initial 8.0.0 release, it’s very exciting to see it in Drupal core now. For an in-depth look at BigPipe in Drupal, watch the DrupalCon New Orleans BigPipe session video.

Migrate

The other big feature in Drupal 8.1.0 is a set of improvements to the migration system, which is how we handle upgrades now. There is now a user interface for running migrations, which was a sorely missing feature. (Though do keep in mind that more migration improvements still need to be made.) There has been a lot of work in this space, and we’ve been working on upgrade/migration tutorials for a few months. Part of our delay was the state of flux that migration in core was in, which made it difficult to ensure we had the most accurate tutorials for coming versions of Drupal core. With the 8.1.0 release approaching, we decided to push things along by funding Mike Ryan (creator of the Migrate module) to make migrations more solid for 8.1.0, and by dedicating some of our own Will Hetherington’s time to assist in the effort.

Specifically, we had Mike spend his time shepherding the final core Migrate issues for 8.1.0, and working to get the Migrate Tools and Migrate Plus contributed modules working well with the new release. This is particularly important since not having these two projects functional would severely limit the usefulness of the Migrate work in core.

Aside from the actual code work that Mike cranked out, we also got 2 great blog posts: one by Mike, Migration update for Drupal 8.1, and the other by Will, Custom Drupal-to-Drupal Migrations with Migrate Tools.

You can also see the fruits of our labor on the Migrate front with the first of several parts for our new Drupal 8 Migration Guide, led by Will.

Drupal 8.2.0

We had our first successful feature release, and 8.2.0 is scheduled for October 2016. It remains to be seen what lands in that release, but we are keeping an eye on important developments. Of course, there is also still room for more improvements to Migrate, and so we plan to keep working in the migration space over the coming months. One of the new, interesting things to watch is a project to simplify the theme and render system by using a component library. It's still very much in the planning phase, so we have no idea if it can even make it into the next version yet. (Here is the main issue on Drupal.org if you want to follow along.) We’re particularly interested here at Drupalize.Me because we have an extensive Theming Guide published, and this would definitely be something where we’d need to update our previously published tutorials. Rest assured we’re keeping our finger on the pulse.

May 02 2016
May 02

DrupalCon in New Orleans, Louisiana, May 9-13, 2016

DrupalCon is almost here and it’s time to start filling out your schedule. There’s a lot to do and see (not to mention eating lots of great New Orleans food!), so we definitely recommend having at least a rough game plan for how to use your time. Here’s a look at things you should be considering, especially if you are looking to take away a lot of Drupal 8 knowledge.

My Schedule

Sessions and BoFs

One of the great things about DrupalCon sessions is that they are recorded and uploaded to YouTube pretty quickly. This means that you can skip the live session (unless you think you’ll have questions you want to ask) and catch up later. Why would you do that? Well, the Birds of a Feather (BoF) sessions are not recorded. These are informal sessions that get organized and happen during the 'Con. If you want to be part of these interesting conversations, then you need to be there at the time. So, our first bit of advice is to check out the BoF schedule each day and skip any formal sessions that conflict for you.

That said, it also makes sense to have a good outline of which sessions you do want to see ahead of time and that makes checking for conflicts much easier. On the DrupalCon site, you can click the “Add to my schedule” link in the sidebar of any session, and then you’ll have a nice, handy reference when you go to “My Schedule”. Of course, there are a lot of sessions about Drupal 8 in the schedule. In particular, you can find many of them in these main tracks: Coding and Development, Front End, and Site Building. Make sure you check out Joe’s session, Altering, Extending, and Enhancing Drupal 8. Last year’s DrupalCons introduced a new track just for Symfony, and this one is no different. The Symfony track definitely has some goodies for digging under the hood of Drupal 8 as well. For a different perspective on what you can do with Drupal 8, you should see Amber’s session Beyond the Blink: Add Drupal to Your IoT Playground where you’ll see how to use Drupal with "Internet of Things" projects.

If you really want to dive into Drupal 8, there are also day-long workshops available on the Monday before the main conference starts. In particular, for developers, we’re happy that our partners over at KnpLabs have a great workshop on D8 & Symfony: Dive Into the Core Concepts that Make Each Fly where you’ll spend the day really getting to understand the underlying structure by working with things like routes, controllers, events, and services.

For those who are not necessarily building, developing, or theming Drupal 8, there are a number of sessions that take a different look at Drupal 8:

Attend the Friday Sprints

Drupal mentoring

In addition to the regular sessions at DrupalCon, Friday presents a lot of opportunities for people who want to dive in and get some real work done. There’s no better way to understand what is going on with Drupal 8 than to actually work on it! Friday is the classic sprint day, where everyone comes together to work on Drupal. There is a huge range of work going on, from documentation to code, plus good people and fun. Blake, Will, Amber, and Joe from our team will be there taking part in the fun. Blake and Will are mentoring as part of the Mentored Core Sprint, which also has a morning workshop to get you up and running even if you’ve never contributed before. It’s a great place to get oriented to contribution, with lots of helpful people there to answer questions and help you find good projects to work on. Joe is going to help lead a sprint on the new Drupal 8 User Guide for Drupal.org. You can learn more about that project, and the kinds of things that need work at the sprint, by attending Joe’s session earlier in the week in the Drupal.org track, Documentation Is Getting An Overhaul.

We’re excited that we’ll be at DrupalCon, and this is shaping up to be an amazing event as always. We’d love to hear from you, so please don’t feel shy about walking up to us (find pictures of us on our About page) and introducing yourself. You’re also sure to run into us at the Lullabot party on Wednesday. See you there!

Apr 26 2016
Apr 26

On the Drupal Migration Trail

Editor's note: To learn how to upgrade/migrate to Drupal 8, follow our full Drupal 8 Migration Guide.

Drupal 8 core provides support for Drupal-to-Drupal migrations. Since there is no direct upgrade path for Drupal 6 or 7 to 8, you should become familiar with the migration system in Drupal, as it will allow you to migrate your content from previous versions to Drupal 8.

In this post, which is aimed at advanced Drupal users, we will discuss how you can conduct a custom Drupal-to-Drupal migration using core and contributed modules, along with Drush.

If you have not yet read Mike Ryan's excellent blog post about the changes to the Migrate System in Drupal 8.1, I would highly recommend it.

Preparation

To be able to run custom Drupal-to-Drupal migrations in Drupal 8, you will need the following:

  • Drupal 8.1 (or greater) installed
  • A database backup from your Drupal 6 or 7 site, and optionally, your sites/default/files directory from your Drupal 6 or 7 site
  • A database backup of your freshly installed Drupal 8.1

Enable the following modules in your Drupal 8.1 site:

Core:

  • Migrate
  • Migrate Drupal

Contributed:

  • Migrate Upgrade (provides the drush migrate-upgrade command)
  • Migrate Plus (defines the migration and migration group configuration entities)
  • Migrate Tools (provides the drush migrate-status and migrate-import commands)

After enabling the required modules, add two additional database definitions to the settings.php of your Drupal 8.1 site. (See Drupal\migrate\Plugin\migrate\source\SqlBase for how the source connection is looked up.)

// Database entry for `drush migrate-upgrade --configure-only`
$databases['upgrade']['default'] = array (
  'database' => 'dbname',
  'username' => 'dbuser',
  'password' => 'dbpass',
  'prefix' => '',
  'host' => 'localhost',
  'port' => '3306',
  'namespace' => 'Drupal\\Core\\Database\\Driver\\mysql',
  'driver' => 'mysql',
);
// Database entry for `drush migrate-import --all`
$databases['migrate']['default'] = array (
  'database' => 'dbname',
  'username' => 'dbuser',
  'password' => 'dbpass',
  'prefix' => '',
  'host' => 'localhost',
  'port' => '3306',
  'namespace' => 'Drupal\\Core\\Database\\Driver\\mysql',
  'driver' => 'mysql',
);

Generate a migration

With these steps complete, it's time to generate a migration. Open up a terminal, and issue the following command, from your Drupal 8.1 root directory:

drush migrate-upgrade --configure-only

If you didn’t create an entry in your settings.php you can pass your database credentials for your Drupal 6 or 7 site, and optionally, the path to the files directory like so:

drush migrate-upgrade --configure-only --legacy-db-url=mysql://dbuser:dbpass@localhost/dbname --legacy-root=/path/to/sites/default/files

This command will generate migration configuration entities in the active configuration store, using migration plugins in Drupal core. Check out the tutorial, Configuration Data Storage, to learn more about the configuration system in Drupal 8.

Create a custom migration module

At this stage, we need to create a custom module for our migration in our Drupal 8.1 site. You can do this manually, or by using Drupal Console, like so:



drupal generate:module

// Welcome to the Drupal module generator
 
Enter the new module name:
> custom_migration
    
Enter the module machine name [custom_migration]:
>
    
Enter the module Path [/modules/custom]:
>
    
Enter module description [My Awesome Module]:
> A custom Drupal-to-Drupal migration
    
Enter package name [Custom]:
>
    
Enter Drupal Core version [8.x]:
>
    
Do you want to generate a .module file (yes/no) [no]:
> no
    
Define module as feature (yes/no) [no]:
> no
    
Do you want to add a composer.json file to your module (yes/no) [yes]:
> yes

Would you like to add module dependencies (yes/no) [no]:
 > yes

 Module dependencies separated by commas (i.e. context, panels):
 > migrate_drupal, migrate_plus
    
Do you confirm generation? (yes/no) [yes]:
> yes
    
Generated or updated files
Site path: /Users/willwh/Sites/drupal
1 - modules/custom/custom_migration/custom_migration.info.yml
2 - modules/custom/custom_migration/composer.json

Export site configuration

Create the directory custom_migration/config/install, which is where we will store our custom migration configuration.

We can now export our site configuration, which will include our generated migration configuration entities. If you’re not familiar with the configuration system in Drupal 8, you should check out our Configuration Management tutorials, specifically Manage Configuration with Command Line Tools.


drush config-export --destination=/tmp/migrate

Copy migration configuration to custom module

Next, we need to copy the migration configuration generated by drush migrate-upgrade --configure-only into our custom_migration/config/install directory.

These files will be at /tmp/migrate and begin with migrate_plus.

Caution

Warning! Make sure you do not copy the default configuration group that is defined by Migrate Plus, i.e. migrate_plus.migration_group.default.yml.

Use the following command, and replace the last argument with the correct path to your custom module’s config/install location:

cp /tmp/migrate/migrate_plus.migration.* /tmp/migrate/migrate_plus.migration_group.migrate_*.yml /path/to/your/module/config/install/

Edit your module’s migrations

At this point, you can simply remove any of the migrations you don’t need, along with any dependencies on them. You can also now edit the migrations contained in your module to your liking.

For example, if you don’t want to migrate blocks from your previous site, you would delete the following files at custom_migration/config/install:

  • migrate_plus.migration.upgrade_block_content_body_field.yml
  • migrate_plus.migration.upgrade_block_content_type.yml
  • migrate_plus.migration.upgrade_d7_block.yml
  • migrate_plus.migration.upgrade_d7_custom_block.yml

Customize migrations with process plugins

Migrations may also be customized with process plugins.

Let’s say you’re migrating from a Drupal 7 site. If you wanted to map a node type from a previous Drupal version to a different node type in Drupal 8, you could accomplish this with the default_value process plugin.

For example, given this migration template:

  • migrate_plus.migration.upgrade_d7_node_blog_post.yml

In the process: section of the migration, take note of the following:

process:
  type: type
  name: name
  description: description
...

Instead of mapping the node type in Drupal 7 to one of the same name in Drupal 8 (which, in my example, would import the blog_post content from Drupal 7 into a blog_post content type in Drupal 8), we can use the default_value plugin and specify a node type with a different name.

Under the process: key, change the type mapping to use the default_value plugin, and set value to the machine name of your desired node type.

process:
  type:
    plugin: default_value
    value: desired_node_type
  name: name
  description: description
  help: help
  title_label: title_label
  preview_mode: constants/preview
  display_submitted: display_submitted
  new_revision: options/revision
  create_body: create_body
  create_body_label: body_label

Run the customized migration

To run your migration, restore your clean backup of the Drupal 8.1 database, enable the required modules listed above, and then enable your new custom_migration module.

You can check that you have everything set up correctly by running drush migrate-status. You should see a list of all of the migrations you have defined in your module.

A portion of my custom migration status looks like this:



Group: default                                  Status  Total  Imported  Unprocessed  Last imported
 upgrade_block_content_type                      Idle    1      0         1
 upgrade_d7_dblog_settings                       Idle    0      0         0
 upgrade_d7_image_settings                       Idle    0      0         0
 upgrade_d7_node_settings                        Idle    1      0         1
 upgrade_d7_search_settings                      Idle    1      0         1
 upgrade_d7_url_alias                            Idle    4953   0         4953
 upgrade_d7_user_flood                           Idle    0      0         0
 upgrade_d7_user_mail                            Idle    1      0         1
 upgrade_menu_settings                           Idle    0      0         0
 upgrade_search_page                             Idle    1      0         1
 upgrade_taxonomy_settings                       Idle    0      0         0
 upgrade_text_settings                           Idle    0      0         0

To execute your migration, run the following:

drush migrate-import --all

Stay tuned…

We will cover this process in much greater detail in our upcoming Migrate to Drupal 8 tutorials, but I hope this will help you get started in migrating to Drupal 8!

Update (May 6, 2016): Our Drupal 8 Migration Guide is underway. Check out the latest tutorials on Drupal-to-Drupal migrations here.

Apr 07 2016
Apr 07

So far, 2016 has been a great year for Drupal and Drupalize.Me. We started off as our own company, and we’ve been heads-down on lots of new Drupal 8 tutorials. We have a deep love for Drupal and the community around here—so in addition to working hard at our business, we encourage each other to roll up our sleeves in the wider community as well. Here’s a little rundown of what’s been on our team’s radar recently.

Migrate

Upgrading to Drupal 8 is on a lot of people’s minds, especially as the Drupal 8.1 release nears. With Drupal 8 we no longer have the old upgrade script method, but we’ve moved to a whole new framework for migrating your content. The basic pieces made it into Drupal 8.0, but there is still a lot of work to make this sweet new process function well for everyone. We’ve been working on creating tutorials for how to migrate to Drupal 8 and we realize there is more work than documentation needed—not everything is working smoothly yet. Upgrading is a big sticking point for many people in the Drupal world, so we decided to go a step further and bring someone in who can help improve the migration process itself. Will has been helping with tickets for Migrate-related issues (core and contributed), but we know that there is someone out there with a lot more expertise in this area: Mike Ryan, the creator and maintainer of the original Migrate module, and one of the maintainers of the core migration system in Drupal 8.

While core Drupal 8.1 is largely in good shape, there are some contributed tools which really make the core functionality more useful, notably Migrate Tools and Migrate Plus. Volunteer time is a hard thing to protect, so we’re paying Mike to focus his time to polish off some of the highest priority issues with these modules, as well as move a few remaining core patches that we’d all love to see go in. We’re also making sure that Will has protected time at work to help Mike by working on patches and doing reviews. The faster we can get Drupal 8 migrations working well, the better it will be for everyone in the Drupal world. We’re excited that we have the opportunity to work with Mike and give a strong push on this.

Drupal 8 User Guide

On the documentation side of things, Joe’s been plugging away on the Drupal 8 User Guide project, which he helped get off the ground about a year ago at DrupalCon Los Angeles, and has been working on intermittently ever since. They’re super close to completing the first draft of the content and will be moving into testing and editing after that. Last we checked, 95 out of 101 pages had been completed! This project is a great fit for us because of how much it overlaps with the work Joe's doing at Drupalize.Me to create tutorials, especially given our focus on improving Drupal by developing quality educational materials, both free and paid. Joe’s been able to take ideas from Drupalize.Me and incorporate them into the user guide, and vice versa, and that’s been beneficial to both projects.

Events

Outside of code and documentation, we also love some quality time with people! There are a ton of events happening all over the world (check out http://www.drupical.com for ones near you). Here are the ones we’re involved with right now.

DrupalCon New Orleans

Coming up in May, the North American DrupalCon is in New Orleans, Louisiana. What an amazing city! We’re super excited that our whole team will be there. We have a few things cooking there:

Amber and Joe will be presenting on a few different topics. Amber has a cool session called “Beyond the Blink: Add Drupal to Your IoT Playground” where you can find out all about using Drupal with physical things, with blinking lights and more! Joe will be running around with 2 sessions. The first dives into “Altering, Extending, and Enhancing Drupal 8” where you’ll explore hooks, plugins, events, and services. He will also be talking more about the user guide we mentioned above, and other changes to how the Drupal community maintains our documentation on Drupal.org in “Documentation Is Getting An Overhaul”.

Will and Blake have signed up to be mentors for the Friday sprint day, so you can run into them helping people out at the Mentored Core Sprint and the First Time Sprinter Workshop.

While we’re talking about DrupalCon, you should also check out the workshop “D8 & Symfony: Dive Into the Core Concepts that Make Each Fly”, with our partners at KnpUniversity. If you want some hands-on training with the internals of Drupal 8, this is the way to go!

Of course we’ll be around all week, meeting old friends, making new friends and taking in the atmosphere of NOLA. If you see us around, please stop and say “Hi!”

DrupalCamp Guadalajara

Joe is a pretty busy guy and was honored to be asked to keynote at DrupalCamp Guadalajara in Mexico. He’ll be there this coming weekend (April 7-9) to talk more about the Drupal community, chat with the local community, and eat lots of yummy food.

Twin Cities DrupalCamp

Joe and Andrew are both on the camp planning team, helping to organize the 2016 Twin Cities DrupalCamp. Joe has helped with the camp in some capacity or another every year that it’s been held, and this is Andrew’s second time diving in. This time around Joe is helping with the programming committee: responsible for setting a schedule, a theme, recruiting speakers, setting up pre-camp training and more. This year the committee has been focusing on including a new Higher Education summit on the first day of camp since it’s going to be held at the University of MN, and also thinking about ways we can mix things up a bit and intersperse some more un-conference style activities into the traditional session tracks.

Andrew is leading the sponsor committee, which focuses on wrangling former/new sponsors. (Shameless plug: If you want to sponsor the camp, please contact [email protected], and he’ll follow up.) We might be a bit biased, but Twin Cities DrupalCamp is the best camp in the entire world, and you should attend!

Portland Meetups

Not all events we jump into are camps. As part of her general involvement with the Portland Drupal User Group, Amber has been helping organize the local meetups—yes, there is not just one, but 3 different groups currently running on a monthly basis. If you’re in the Portland area, you will likely run into Amber.

Well, that wraps up a whole lot of fun stuff that we’re excited about. I hope that by sharing about our experiences, you can see that there are many ways to be involved, and meet new friends, in our community. What kinds of things are you involved in? If you need help getting started with the community in some way, please feel free to reach out to us and we’ll help you figure out a good way to dig in.

Mar 08 2016
Mar 08

On the Drupal Migration Trail

Drupal 6 was released in February of 2008, and on February 24th, 2016, after 8 years, Drupal 6 was retired, in accordance with the Drupal community’s policy of only providing active support for two major versions of Drupal at any given time. You can read more about the Drupal 6 End-of-Life (EOL) here. While it is possible to migrate a Drupal 6 (or 7) site to Drupal 8, the tools are still in flux. Simple sites make for simple migrations, but most sites are not simple, and migrating them requires considerable research, planning, and effort. Continue reading to find out more about how Drupal 6’s end-of-life impacts Drupal site owners and what options you have if you still run a Drupal 6 site.

What does this mean for you, if you’re running a Drupal 6 site? This is a tricky question to answer, and there’s no concrete schedule declaring what you should do and when you should do it. But here are some things you should consider:

  • Your existing site will continue to function exactly as it does right now, and will continue to do so indefinitely. So there’s no need to panic. But it certainly accelerates the need to update your site to a supported version of Drupal.
  • There will no longer be security updates for Drupal 6 core provided by the community. Contributed module maintainers may choose to provide patches or new releases for their own modules, but Drupal 6 is no longer officially supported.
  • The Drupal Security Team is working with a few vendors who are willing to provide paid support for Drupal 6 sites beyond February 24th, 2016. That list of vendors can be found here. There is no guarantee as to how long this program will continue. However, if you continue to operate a Drupal 6 site it would be wise to familiarize yourself with the list of vendors providing continuing support or to confirm with your current team, or vendors, that they understand the implications of supporting Drupal 6.
  • Start planning—and executing—your migration to Drupal 8.

Planning a Migration? Awesome. We’ve got your back.

For many people, at this point the best course of action is to start the process of upgrading your site to Drupal 8. Depending on the complexity of your site, this can be a daunting task: there is a lot to learn, and the set of best practices is still shifting.

At the time of this writing (beginning of March 2016), the Migration support in Drupal 8.0.x is still experimental—and things are changing rapidly. This makes planning and executing a migration slightly more challenging. The tools required to do any successful migration are spread across core and contributed modules. People contributing to Migrate continue to refine the core APIs and move more of the essential functionality into core and out of contributed modules. The good news is that it’s getting better—and more powerful—quickly. The bad news is that at any given time you can expect to put a bit of up-front legwork into figuring out the current state of the Migrate tools and how to use them right now. This will settle down eventually and best practices will emerge, but it is hard to say how long that will take. So knowing how the pieces fit together continues to be important.

For now, here are some issues to keep an eye on:

Additionally, you’ll need to consider the contributed module landscape before performing a migration. Many contributed modules from Drupal 6 and 7 have been moved into core for Drupal 8, and tons of contributed modules are in the process of being ported. Knowing whether a port is complete and supports migrations, or whether there’s a new and better way to accomplish the same functionality in Drupal 8, is another thing you’ll need to learn to assess on your own.

As the Migrate API and associated modules mature, and as more contributed modules are ported to Drupal 8, performing a migration will get simpler. But no matter what the tools look like, data migrations are never as easy as we want them to be.

If you’re ready to jump in and start performing a migration now here are a few good resources to help get you started:

Over the coming months we’ll be releasing a comprehensive guide to using the Drupal 8 Migrate API and related contributed modules to plan and execute a successful migration from Drupal 6 or 7 to Drupal 8.

Right now we’re both figuring out what to cover and writing some of the content we already know we’ll need. We’re not quite ready to publish our proposed outline for the guide, but here are some of the things we’re planning to cover:

  • The current state of the Drupal 8 Migrate API, and how you can evaluate it yourself
  • Terminology related to the Migrate API and the Extract, Transform, Load (ETL) process it uses
  • The current contributed module landscape
  • Evaluating your Drupal 6, or Drupal 7 site, with an eye towards planning a migration
  • How to prepare a Drupal 8 site to serve as the target for a migration
  • Running, rolling back, and debugging migrations via the Drupal UI, Drush, or Drupal console
  • Writing migration plugins
  • Special considerations for handling user uploaded files during migrations
  • And more

In addition to looking at Drupal-to-Drupal migrations we’ll also be expanding the guide in the future with information about performing migrations and data-imports from non-Drupal sources.

Are there any burning questions or concerns you’ve got about migrating your site to Drupal 8? Let us know what you need to learn in the comments and we’ll do our best to make sure that it’s covered in our Drupal 8 Migration Guide.

Dec 09 2015
Dec 09

Drupal 8

With the release of Drupal 8 comes a new way of making web requests, available via \Drupal::httpClient(). This is simply a wrapper for the wonderful Guzzle HTTP client. In this post, we'll take a look at how we can use Drupal::httpClient for making HTTP requests in a module. This is particularly useful when you wish to communicate with external websites or web services.

In Drupal 7, you would have used the drupal_http_request function for sending HTTP requests. This functionality now exists in Drupal::httpClient for Drupal 8.

Drupal and Guzzle (in short)

According to the Guzzle project page, "Guzzle is a PHP HTTP client and framework for building RESTful web service clients."

Guzzle utilizes PSR-7 as the HTTP message interface. PSR-7 describes common interfaces for representing HTTP messages. This allows Guzzle to work with any other library that utilizes PSR-7 message interfaces.
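
To make that concrete, here is a small, hedged sketch of what the PSR-7 interface buys you: every response you get back from the client exposes the same handful of methods defined by Psr\Http\Message\ResponseInterface, regardless of which handler actually sent the request. (The CKAN URL is the same demo endpoint used in the examples later in this post.)

  $client = \Drupal::httpClient();
  /** @var \Psr\Http\Message\ResponseInterface $response */
  $response = $client->get('http://demo.ckan.org/api/3/action/package_list');

  // Standard PSR-7 response methods.
  $status = $response->getStatusCode();               // e.g. 200
  $type = $response->getHeaderLine('Content-Type');   // e.g. "application/json"
  $body = (string) $response->getBody();              // the raw response body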

You can check the version of Guzzle that you’re using by taking a look at the composer.lock file in your Drupal project directory.

Drupal 8.0.1 comes with Guzzle 6.1.0:


  {
    "name": "guzzlehttp/guzzle",
    "version": "6.1.0",
    "source": {
      "type": "git",
      "url": "https://github.com/guzzle/guzzle.git",
      "reference": "66fd14b4d0b8f2389eaf37c5458608c7cb793a81"
    },
  // ...
  },

The Guzzle documentation is available here.

Drupal::httpClient in a module

Data.gov provides a catalog of data via CKAN, a powerful open source data platform that includes a robust API. We're going to take a look at some examples using the CKAN API; full documentation is available here.

First, let's take a quick look at how we make requests in Drupal. You can initialize a client like so:


  $client = \Drupal::httpClient();

Pass the full URL in your request:


  $client->request('GET', 'http://demo.ckan.org/api/3/action/package_list');

Guzzle also provides convenience methods for making synchronous requests; a full list is available here.

You can make GET requests as follows:


  $response = $client->get('http://demo.ckan.org/api/3/action/package_list');
  $body = $response->getBody();

Next, let's POST some JSON to a remote API:


  $client = \Drupal::httpClient();
  $response = $client->post('http://demo.ckan.org/api/3/action/group_list', [
    'json' => [
      'id' => 'data-explorer',
    ],
  ]);
  $data = json_decode($response->getBody());

In the $client->post() call above, we pass in a URL string and an array of request options. In this case, the 'json' option contains an array of the properties we'd like to send as JSON. Guzzle takes care of adding a Content-Type: application/json header, as well as JSON-encoding the 'json' array. We then call json_decode() to decode the body of the response.

A full list of request options is available on the project's website: Guzzle Request Options.
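
As a rough sketch of how a few commonly used options combine (the option names come straight from the Guzzle request options documentation; the limit query parameter is just an illustration):

  $client = \Drupal::httpClient();

  $response = $client->get('http://demo.ckan.org/api/3/action/package_list', [
    // Appended to the URL as a query string, i.e. ?limit=10.
    'query' => ['limit' => 10],
    // Extra request headers.
    'headers' => ['Accept' => 'application/json'],
    // Give up if the server takes longer than 5 seconds to respond.
    'timeout' => 5,
  ]);

  $data = json_decode($response->getBody());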

Example: HTTP basic authentication

What about handling HTTP basic authentication with GitHub's API, for example?


  $client = \Drupal::httpClient();
  $response = $client->get('https://api.github.com/user', [
    'auth' => ['username', 'password'],
  ]);
  $body = $response->getBody();

Exception handling

When using Drupal::httpClient, you should always wrap your requests in a try/catch block, to handle any exceptions. Here is an example of logging Drupal::httpClient request exceptions via watchdog_exception.


  // At the top of your file:
  use GuzzleHttp\Exception\RequestException;

  $client = \Drupal::httpClient();

  try {
    $response = $client->get('http://demo.ckan.org/api/3/action/package_list');
    $data = $response->getBody();
  }
  catch (RequestException $e) {
    // Pass the exception object itself; watchdog_exception() extracts and
    // logs the message (and backtrace) for you.
    watchdog_exception('my_module', $e);
  }

You can get a full list of Exception types simply by listing the contents of <drupal_root>/vendor/guzzlehttp/guzzle/src/Exception. Utilizing this list allows you to provide different behavior based on exception type.

At the time of writing, the contents of that directory are as follows:


  BadResponseException.php
  ClientException.php
  ConnectException.php
  GuzzleException.php
  RequestException.php
  SeekException.php
  ServerException.php
  TooManyRedirectsException.php
  TransferException.php
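
For example, here is a minimal sketch that reacts differently to client errors, server errors, and connection problems. (The module name my_module is the same placeholder used above; which exceptions you catch, and what you do with them, will depend on your use case.)

  // At the top of your file:
  use GuzzleHttp\Exception\ClientException;
  use GuzzleHttp\Exception\ConnectException;
  use GuzzleHttp\Exception\ServerException;

  $client = \Drupal::httpClient();

  try {
    $response = $client->get('http://demo.ckan.org/api/3/action/package_list');
    $data = json_decode($response->getBody());
  }
  catch (ClientException $e) {
    // 4xx responses, e.g. a bad API path or missing credentials.
    watchdog_exception('my_module', $e);
  }
  catch (ServerException $e) {
    // 5xx responses: the remote service is having problems.
    watchdog_exception('my_module', $e);
  }
  catch (ConnectException $e) {
    // Networking failures: DNS errors, timeouts, refused connections.
    watchdog_exception('my_module', $e);
  }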

Guzzle clients use a handler and middleware system to send HTTP requests. You can refer to the Guzzle documentation for more information about creating your own handlers and middleware to allow for more fine-grained control of your HTTP workflow.
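
As a rough sketch of what a middleware looks like in Guzzle 6 (the header name and value here are arbitrary, purely for illustration):

  use GuzzleHttp\Client;
  use GuzzleHttp\HandlerStack;
  use GuzzleHttp\Middleware;
  use Psr\Http\Message\RequestInterface;

  // Start from Guzzle's default handler stack.
  $stack = HandlerStack::create();

  // Push a middleware that adds a header to every outgoing request.
  $stack->push(Middleware::mapRequest(function (RequestInterface $request) {
    return $request->withHeader('X-Powered-By', 'custom_http_client');
  }));

  $client = new Client(['handler' => $stack]);
  $response = $client->get('http://demo.ckan.org/api/3/action/package_list');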

Custom Http Client

Changing the default properties of an HTTP client, like base_uri, can be done by creating your own client factory class and registering the client it builds as a service.

We do this by creating a module. Let's call our module custom_http_client.

Create a custom_http_client.info.yml:


name: Custom Http Client
type: module
description: A custom HTTP client
core: 8.x
package: Custom

Create a custom_http_client.services.yml and add the following content:


# Service definition in YAML.
services:
  custom_http_client.client:
    class: GuzzleHttp\Client
    factory: custom_http_client.client.factory:get
  custom_http_client.client.factory:
    class: Drupal\custom_http_client\ClientFactory

Then we create the following class at 'custom_http_client/src/ClientFactory.php':


namespace Drupal\custom_http_client;

use GuzzleHttp\Client;

class ClientFactory {

    /**
     * Return a configured Client object.
     */
    public function get() {
        $config = [
            'base_uri' => 'https://example.com',
        ];

        $client = new Client($config);

        return $client;
    }
}

You can then load this service to use your custom http client anywhere you need to.

Ideally, you should inject the service via the container rather than loading it statically. For example, in a controller (any class that is instantiated through the container works the same way):


// A hypothetical controller that receives the custom client via dependency
// injection; the same pattern works for forms, plugins, and services.
namespace Drupal\custom_http_client\Controller;

use Drupal\Core\Controller\ControllerBase;
use GuzzleHttp\Client;
use Symfony\Component\DependencyInjection\ContainerInterface;

class ExampleController extends ControllerBase {

  /**
   * GuzzleHttp\Client definition.
   *
   * @var \GuzzleHttp\Client
   */
  protected $httpClient;

  public function __construct(Client $http_client) {
    $this->httpClient = $http_client;
  }

  /**
   * {@inheritdoc}
   */
  public static function create(ContainerInterface $container) {
    return new static(
      $container->get('custom_http_client.client')
    );
  }

}

Then you can access it via:


$this->httpClient
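
Because the factory sets a base_uri, requests made with the injected client can use relative paths; they are resolved against that base URI. (The path below is made up for illustration.)

  // Resolved against the base_uri configured in ClientFactory, so this
  // requests https://example.com/api/status.
  $response = $this->httpClient->get('/api/status');
  $data = json_decode((string) $response->getBody());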

The generation of a module and service can be completed using Drupal Console; check out the documentation links below.

Resources

Dec 01 2015
Dec 01

Two weeks ago I read an article that came up in my Twitter stream (thanks to Bob Kepford @kepford), titled Canary in the Code Mine. It is the story of a company, BitSource, located in Pikeville, Kentucky, that is trying to find a new way forward in the tech world for coal miners who are being laid off in droves. The article is well written, and the story is inspiring. It's worth the "long" read (about 15–20 minutes). Don't worry, I can wait while you go check it out.

Working for a distributed company, and being fortunate enough to be able to live where my life and love have taken me, I completely get the idea of people wanting to provide for their families without leaving the home and community they love. I also can't imagine what it is like to have an entire industry, which I've built my career on, simply vanish from under my feet. I hope I never do experience it. I think the way that BitSource has approached the problem is inspiring and I wish them much success.

In reading the article, one particular detail stood out, which is that of the variety of technologies the BitSource team are learning, one of them is Drupal. I decided to reach out to them to find out more about what work they are doing and whether Drupalize.Me would be of use to them. I hopped on the phone with Justin Hall, the President and team lead, and come to find out that they are in fact a Drupal shop now. They explored various platforms and decided, as a team, that Drupal was their tool of choice. That certainly made me happy and proud. During our brief chat we also discussed open source in general, and Justin pointed out that without open source they probably wouldn't be able to succeed in this journey. He was incredibly thankful for the opportunities they have. I'm impressed with the bravery of this team to step into a whole new world. We finished the call with me offering his team access to Drupalize.Me in order to get more advanced with their dev chops. I also encouraged his team to get directly involved in the community, as that is the best way I know to accelerate learning. It can be a hard leap of confidence to step into the open source world, so I also wanted to introduce them and their story to our community. Let's welcome them warmly into our family.

This is a clear reminder that open source matters. Drupal matters in many more ways than being able to build websites. It is easy to get into the routine of working in open source, using the tools we are familiar with and focusing on the project or deadline in front of us, while the larger impact of our work fades into the background. We, the open source community, are changing individual lives every. single. day. Every single person who has done even the smallest thing* to help an open source project has helped the lives of this team of former miners tucked in the mountains of eastern Kentucky, and uncountable numbers of other lives as well, including my own. I just want to take a moment to appreciate that, and say "Thank You" to everyone in open source, regardless of the project. You are heroes, and you are amazing.

* When I say "the smallest thing" I mean to point out that many people feel you have to be some master coder to contribute to open source projects. That's simply not true. If you have reported a bug, tested a fix, answered a forum question, donated money, written a blog post, explained what open source is to someone, etc. then you have helped build what we all are today. Remember that, and acknowledge it. You are awesome.

Nov 30 2015
Nov 30

Meet Drupal Developers from Four Kitchens

This interview is part of an ongoing series where we talk with a variety of people in the Drupal community about the work they do. Each interview focuses on a particular Drupal role, and asks the individuals about their work, tools they use, and advice for others starting in that role. You can read all of these interviews under this list of Drupal roles posts.

Interested in learning how to become a Drupal developer, too? Check out our role-based learning pathway: Become a Drupal Developer.

Jon Peck

Jon Peck is the Senior Engineer at Four Kitchens. He's also a systems administrator and educator. He loves working on the backend of big enterprise sites with a focus on architecture and optimization, as well as playing keyboard in a progressive rock band.

Where to find Jon

David Diers

David Diers is an Engineer at Four Kitchens. Prior to his current position, he worked for many years in academic and administrative IT at the University of Texas at Austin, where he earned a master’s degree in music composition.

Where to find David

How do you define the Drupal developer role?

  • JP - Someone who analyzes and interprets needs, determines the best solution, implements, and then reviews results. Within Drupal specifically, they have a broad understanding of how Drupal interacts with itself (request handling, hooks, theming, and so forth) and they know how to seek out deeper knowledge.

  • DD - Someone who is a Drupal developer is well versed in site building, and custom code, and knows when it's best to build or configure. They have an understanding of how Drupal works and approach problem solving in native Drupal fashion — all the while ensuring an extensible and flexible approach.

What do you currently do for work? What does your daily routine and work process look like? What kind of tasks do you do on a daily basis?

  • JP - Right now, I'm the architect of two publications that will be implemented in Drupal, including the migration from multiple legacy systems. I'm also consulting on performance and site auditing. I work from home; my day consists of occasional meetings (mostly via Zoom), development and documentation, and discussions via Slack. Projects are managed using JIRA, and code is in GitHub or Stash (depending on the client).

  • DD - Currently, I am working with a major media company to unify a large number of disparate Drupal sites and find ways of abstracting the approach in Drupal 8 so that truly diverse approaches can be accommodated within a single definitive content model. In recent years I've been doing a lot of strategic work, analysis and architecture, but depending on the project I could be doing a lot of gnarly development, deep in the code — plugins, migrations and database work.

What do others look to you to do on a project?

  • JP - I provide experience and historical perspective, along with recommendations about how to resolve difficult issues and continue to grow. I present myself as a collaborative resource both within the team and to whomever I'm working with. Also, I have the ability to translate non-technical requests into actionable development.

  • DD - I tend to get called in on tough SQL problems, migrations, and custom plug-in work on the technical side. On the more holistic side, I think my teams trust that I am going to bring a balanced viewpoint, and a deep investment and understanding of the business from the client perspective. I tend to fall in love with people and missions instead of the tech and tools and that's a good balance to have on a team of technology forward folks.

What would you say is your strongest skill? How have you honed that skill over the years?

  • JP - I can collaborate across groups in such a way that the conversation is centered around the common goal, not an “us vs. them” conflagration. It's something that I've had to consciously work at; finger pointing is easy, but swallowing pride and saying “we messed up and here's how we're going to fix it” is easier to write than to do :-)

  • DD - I listen well. I used to call it intuition or gut, but over time I realized it wasn't about a feeling with mysterious origins, it was all based on things I heard or didn't, essentially, it was about listening. Listening is about what is said but it is also hearing the silence. A sentence with a lot of gaps has just as much to say as one with a lot of words — you just have to know how to interpret it.

How did you get started on this career path?

  • JP - I'd been a PHP developer for many years, working with several open-source platforms and frameworks before focusing on Drupal. I have found success specializing in specific areas and having a broad knowledge in many others, and the Drupal community drew me in ways that are unique and refreshing.

  • DD - I went to school for music but had developed an interest for technology. It seemed like a good path to support my artistic career. I caught a break with a company in Houston who saw my potential and my willingness to put in a ton of effort to making a good product. Over the years, the one thing I keep coming back to is that this career really provides a lot of opportunities to learn something new almost constantly. For someone who loves learning, it's been a great choice.

What is most challenging about being a Drupal developer?

  • JP - Adapting to the Drupal mindset, some of which (prior to Drupal 8) was entrenched in “it's always been done that way”. Finding consistent quality documentation in edge cases is typically a challenge.

  • DD - The community is really broad and egalitarian. There are a lot of perspectives and approaches which tend to get pretty equal treatment and time in the sun. That is cool. However, solutions aren't truly equal in most cases, and as a beginning developer and even a senior one — drawbacks aren't always brought to the surface — so it really takes some efforts to identify the merits of a particular approach over another and to discuss that, outside of how these solutions have been facilitated in code or community.

What are your favorite tools and resources to help you do your work?

  • JP - The people I work with, both within 4K and across the open-source community. On my workstation itself, PhpStorm, Drush, Drupal Console, gulp, and site_audit.

  • DD - My team members are my biggest resource. I've learned a lot and hopefully helped a lot — both are valuable to your growth as a developer. For tools, I probably couldn't do what I do without a debugger and access to Drupal docs.

If you were starting out as a Drupal developer all over again, is there anything you would do differently?

  • JP - [I would have] registered on Drupal.org when I first started using Drupal and contributed more back to the community. Playing catch-up now!

  • DD - Start writing earlier and more consistently. It's a wonderful way to take that dynamic of teaching and learning out into the world. People are pretty free with their opinions on the internet — it can be nice or not, but you will always learn something.

What advice do you have for someone just starting out as a Drupal developer?

  • JP - Present at Drupal User Groups, Camps, and Cons! My personal Drupal "break-out" moment was a direct result of presenting at DrupalCamp Western NY 2011; it made many introductions and opened many doors.

  • DD - Try not to make your first Drupal project the big one. Find something small, with very modest needs and do that first; it will pay off when you finally do get to the big one. Go to events, camps, and DrupalCon. Take some training and talk with folks. Most importantly, dig into core and the main modules source code — it's really all happening there. The sooner you understand what's going on, the better.
