Dec 07 2018

In case you missed the news from September, Dries Buytaert announced the end-of-life dates for both Drupal 7 and 8. Both are slated for November of 2021, which may seem strange given the differences between 7 and 8 and the widespread usage of 7, but it really makes sense. Drupal 8 is ending alongside Symfony 3, which powers much of the underlying framework of Drupal 8, so that date makes perfect sense.

But why is Drupal 7 sticking around? Drupal 7 is the point in Drupal history where many large organizations bought into the CMS. It found a large user base with complicated government, education, and non-profit sites. Drupal was a web revelation for many of these large organizations, and they invested in the concept with time, money, and staff. In turn, the Drupal community benefited from these organizations investing in the Drupal ecosystem. More developers learned Drupal and more agencies took on Drupal in order to service these organizations. This enriched the Drupal community with tons of contributed modules and core contributions.

An unfortunate side of large organizations is that they move at a glacial pace. Making a move from Drupal 7 to Drupal 8 is not something they can just plan and complete in a matter of months. Sometimes funding needs to be procured in a specific way, and other times non-web-savvy board members need to be educated on why their perfectly good website needs to go through a costly overhaul. Not to mention the gargantuan planning task that comes with a migration of a complicated site. These things take time, often many months. This is why I believe the Drupal 7 end of life has been extended so far out: it's Drupal giving some slack to the large entities that helped grow it into the community and platform it is today.


Start planning now

Starting with Drupal 8, full-rebuild major version upgrades are supposed to be a thing of the past. More on that here. Once the move to 8 is complete, major version updates should become much smoother, as long as regular minor updates are kept up with. The important takeaway is that there is now a target date by which everybody needs to be moved over. It’s three-ish years away, and that is really not a ton of time if you are one of these organizations moving at turtle speed. Now is the time to start securing funds, interviewing vendors, and making a plan to get over to Drupal 8.

It’s true that Drupal 8 is also going to hit its end of life on the same day as Drupal 7, but don’t let that stop you from making the move to 8. As I mentioned already, major version updates after 8 are going to be streamlined and will not require complete re-architecting, so a move to 8 now is an easy pipeline to Drupal 9. If you wait for Drupal 9 to upgrade your Drupal 7 site, you may find yourself racing against the clock, and we all know that can be costly for a large web launch. There is no way around it: large sites take a lot of time to plan and migrate, and it’s not unheard of for a rebuild to be estimated at 13 or 14 months of development. During that time you are also going to want new features and upgrades. Make sure you give your organization the time needed to plan and build the next version of your site.

Do things better

A move from Drupal 7 to 8 shouldn’t just be thought of as a migration of the same old site over to a new CMS. This is the time to update and make your site better, stronger, and faster. Take all of the things you learned from the last version of your site and build a better system for both end users and administrators. Starting the move now will give you the time to analyze data and usage; you can send surveys and interview users to make intentional updates that they will be excited about.

Making a plan early will allow you the time to figure out a new infrastructure that is faster and more secure. Maybe integrating a new CDN is the right move for faster page loads; perhaps something like Solr will get your customers to the information or products they are seeking faster. This might be the time to explore moving your front end to React for greater performance. Take the time to research the latest and greatest in security options for your site. Starting the plan now gives you the time to review the options. This will not only make the next version of your site so much better, it will also save on costs in both time and money.

The Drupal updates keep coming

One key reason to get over to Drupal 8 as soon as possible is to take advantage of all of the latest development of Drupal core. Drupal 7 is only getting patch fixes at this point, and all new feature development is happening on Drupal 8. The same goes for contributed modules: the majority of new module development is for Drupal 8 only, and many popular modules are only doing new development for 8. The past six minor updates for Drupal core have been delivered on time and have pushed the platform to new heights with each release. The next minor release for Drupal 8 (8.7) comes out on May 1st, 2019, and the one after that (8.8) will be released on December 4th, 2019, so now is a good time to get on board to take advantage of these upcoming updates. Check out the development roadmap for more details.

The end (future) is near

Now that there is a planned end date for Drupal 7, there is no reason not to start preparing the move to Drupal 8 now. If you are waiting for Drupal 9, you are just shortchanging your site the proper time to plan a thoughtful rebuild and migration. All sites still running on Drupal 7 are missing out on the current development efforts of Drupal. You have a date, and you know when your site is going to fall out of support; now is the time to give it a new lease on life.


Oct 24 2018

Drupal Module Spotlight: Paragraphs

I really don’t like WYSIWYG editors. I know that I’m not alone; most developers and site builders feel this way too. Content creators always request a WYSIWYG, but I am convinced that it is more of a necessary evil and that they secretly dislike WYSIWYGs too. You all know what WYSIWYGs (What You See Is What You Get) are, right? They are those nifty fields that allow you to format text with links, bolding, alignment, and other neat things. They can also add tables, iframes, Flash embeds, and other problematic HTML elements. With Drupal we have been able to move things out of a single WYSIWYG body field into more discrete, purpose-built fields that match the shape of the content being created. This has helped solve a lot of issues, but it still didn’t cancel out the need for the versatile body field that a WYSIWYG provides.

Content creators yearn for the flexibility to create a page that matches their vision of the layout. Front-end developers and designers want to be able to rigidly style a site that always looks good, and that clashes with the unexpected markup that a WYSIWYG can produce. These differing priorities can result in a real mess of a page. It’s a struggle as old as the CMS itself.

Solutions of the Past

As I mentioned above, the Drupal solution to runaway WYSIWYG markup is to carve out purpose-built fields for every specific section of content that you can. Instead of allowing the whole page to be created in one WYSIWYG field, we would have a field for an image, a summary, some various highlight sections, etc. One problem still remains: content creators want to be creative and use different patterns of content depending on what they are writing about. This includes images floating every which way, videos in the middle of an article, quote blocks, content in columns, and the list goes on. You can’t just build a bunch of field types for this problem. In the past, I tried using Field Collections and custom form alters to build a user interface that let editors select and see only the relevant fields, but this still proved too rigid.

The Modern Solution

Enter Paragraphs, the Drupal contributed module that can ultimately solve this problem. I started using Paragraphs with Drupal 7 back around 2014 and it really changed the way I thought about editing and creating content on a Drupal site.

With Paragraphs you can define little entities of fields, much like the Field Collection module but less static. The difference is that you can access all or some of the different paragraph entity types from a single field on your content type. This single Paragraph content field replaces your WYSIWYG body field entirely. It lets you endlessly add in different paragraph types and provides a way to reorder them after creation. Working with your content creators, you can define any number of content patterns that they would like to use. For example, you can create paragraph types with the fields required to make a slideshow, an embedded video block, or accordion content, another that is an entity reference field to related articles, and one more that is simply a text area field. With this toolkit of content patterns, your content creators can build content in any arrangement they wish.

Paragraphs will satisfy the creative control that the editors of your site desire and will make them finally admit that they too hate that weird little wysiwyg field. When the situation arises that they want to do something new, it’s just a small revision to create a new paragraph type. Your frontend developers and designers will also be happy that they can strictly control the visual aspects of the individual Paragraphs with specific template files and targeted SASS styling.

Extending the Concept

An overlooked but very powerful use of Paragraphs is using it without any fields at all. Let’s say that you want to add a newsletter signup form in the middle of your article. Simply define a paragraph type called Newsletter signup, then write a signup form with a bit of custom code; you could even have it automatically pass the article as the signup source in a hidden field. Next, all the article writer needs to do is drop the Newsletter signup paragraph wherever they want it to appear within the article content. Another example of a field-less paragraph that we have used in the past is an About the Author section. There are no fields needed: just add the paragraph, write code to look up the article’s author, and write a template file to display a bit of author info with a headshot. Simple as that.

I could write on and on about the possibilities and use cases for paragraphs, but you should really just download it and give it a test drive. It’s like introducing a mini block system into an individual article of content. If you are planning a new build or have an existing Drupal site, don’t hesitate to add Paragraphs. Your content creators and editors will love you for it.

Sep 19 2018


Webform in Drupal 7 was always one of the top 10 must-haves on any Drupal site. Then Drupal 8 came along, and Webform wasn’t in the picture at first. Luckily, Drupal 8 came with the contact module in core that took care of most form needs, and we lived without the Webform module.

In the meantime, Drupal contributor Jacob Rockowitz had been working on the YAML Form module, which was a module used to define webforms in YAML files. At some point towards the end of 2017 YAML Form switched to the Webform namespace and Webform in D8 was born.

The Drupal 8 version of Webform will feel familiar to those with experience in the Drupal 7 version, but this is a completely new module and code base. It’s a really great tool for content creators, giving them the ability to create complex forms without having to worry about code. However, as a developer, I found it tedious to create long forms through the Webform UI. Those days are in the past now: since Webform is a descendant of the YAML Form module, you still have the ability to define a form using YAML in the source tab of the build screen.
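For example, a small contact form typed into the source tab might be defined like this. The element names here are illustrative; the quoted '#type' and '#title' keys follow Webform's YAML element syntax:

```yaml
# Hypothetical Webform source for a simple contact form.
name:
  '#type': textfield
  '#title': 'Your Name'
  '#required': true
email:
  '#type': email
  '#title': 'Email Address'
  '#required': true
message:
  '#type': textarea
  '#title': 'Message'
```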

Screenshot showing direct editing in YAML

Direct editing in YAML really speeds things up for developers, and content creators can still use the UI to create and edit. It’s the best of both worlds.

The Webform module in Drupal 8 is such a powerful and fully featured form builder that it could be reason alone for some site owners to switch to Drupal 8. If you run a site that needs new or dynamic forms on at least a monthly basis, then Drupal 8 + Webform would be a lifesaver and a budget saver. Personally, I find that Drupal 8 with Webform rivals any of the standalone form building tools available. There are dozens of features and settings that you get out of the box, and it’s easily extensible if you need to integrate with an outside service.

Like most things in Drupal 8, webforms created with Webform can be exported and synced as config. Webform also plays nice with Features if that’s your thing.

We still custom build many forms in Drupal 8 using the Form API, but Webform is a great tool to keep in mind when things need to be editable by administrators or when you just need a simple solution. It already deals with storage, email, and so many other things out of the box. If you have any webform needs, don’t hesitate to give Webform a shot.

Sep 14 2018


Drupal has a bunch of great SEO tools. Here are several tips and suggested modules for fine-tuning SEO within Drupal. Easy SEO wins can be achieved by configuring metatags and URLs. Don’t forget to set up an XML sitemap of your site and submit it to the major search engines. SEO isn’t a once-and-done effort; make sure to constantly research and keep up with search trends.

Yes, but is it good for SEO? This is a question we hear all the time when we mention all of the wonderful capabilities of a Drupal site. First off, let’s dispel the myth that there is a CMS that automatically does magical SEO and makes all of your pages rank higher in search. If you want good SEO, the most important thing you can do is write good, unique content that humans actually want to read. The CMS or web software has nothing to do with it. So, assuming you already have great content and semantically perfect markup, there are still tons of other little things you can do to further boost your content in the eyes of search engines, and Drupal is a great tool for implementing them.

To get the most out of Drupal SEO, you are going to want to download a couple of contributed modules from drupal.org.

The following is a list of our go-to modules for SEO that I am going to talk about in this post:

Metatag (along with Token)

Schema.org Metatag

Pathauto

Redirect

XML Sitemap

Google Analytics

Using Metatags to Help Search Engines

Metatags are important for search engines to index, categorize, and understand the content of your webpage. There are a lot of different metatags, but a couple that really matter for SEO are the meta description and the title tags. Meta descriptions can be used as a summary in search results, so it is important to write some compelling content here. It’s the first pitch to a potential site visitor, so you may want to put a little thought into it. What if you have hundreds or thousands of pages on your site? The contributed Metatag module provides a way to dynamically set metatags based on content type or on other content rules you may have. Working with the Token module, you can have all of your metatags, including the description, generated from your content. For those pages that you are really trying to squeeze the SEO juice out of, Metatag allows you to override the defaults when you need to fine-tune things.
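As an illustration, a trimmed Drupal 8 Metatag default export might look like the sketch below. The id and token choices are hypothetical, and a real export carries a few more keys (such as dependencies):

```yaml
# Hypothetical slice of metatag.metatag_defaults.node__article.yml.
# Tokens are replaced per node, so every article gets its own
# title and description without manual editing.
id: node__article
label: 'Content: Article'
tags:
  title: '[node:title] | [site:name]'
  description: '[node:summary]'
```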


If you want to take things further with metadata, you can also install the Schema.org Metatag module, which extends the base Metatag module. You can read more about the Schema.org project here. In a nutshell, Schema.org is making a push for people to further define what kind of content they are writing by breaking things into categories of content that a search bot can read. The types of content they have defined are very granular, with deep subcategories. Check them all out here. Having Schema.org metadata can give you a leg up on your competition as more devices and services start reading and prioritizing websites based on this content categorization data.

URLs Should be Informative and Human Readable

Having URL paths that make sense can go a long way for SEO. Search engines are able to parse relevance from the words in a URL. For example, a URL like “example.com/node/34” gives no information as to the topic of the page. Alternatively, the URL “example.com/store/shirts/blue-shirt” tells a search bot a lot more. The Pathauto module enables you to make descriptive URLs. Similar to the Metatag module, you can use it with the Token module to automatically build new URLs based on the type of content you are creating. These are referred to as “patterns” in the Pathauto module. Paths generated by Pathauto can also be changed individually, so you can override a generated path in favor of a custom one.
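A trimmed Drupal 8 Pathauto pattern export might look like the sketch below. The field name is hypothetical, and a real export also carries keys like selection_criteria and weight:

```yaml
# Hypothetical pathauto.pattern.products.yml -- would generate paths
# like /store/shirts/blue-shirt from a product's category and title.
id: products
label: 'Product pages'
type: 'canonical_entities:node'
pattern: 'store/[node:field_category]/[node:title]'
```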

Another useful module related to URLs is the Redirect module. The Redirect module does a handful of useful things. Every time you change a URL, it creates a 301 redirect from the previous URL to the new one. This is helpful when you are updating a page that may have been bookmarked or linked to elsewhere. 301 is the status code a web server sends when a requested page has been moved, and it tells the browser to redirect to the new page. 301 redirects are essential to good SEO: search bots give a poor evaluation to sites with a bunch of URLs that go nowhere, and humans don’t really like it either.

The Redirect module also comes with a really handy utility page called Fix 404 pages. 404 is the status code a web server sends when a requested page doesn’t exist. We have all seen these annoying messages from time to time. Sometimes the page has been deleted, and sometimes it has simply been renamed and moved; with a 404 message there is no way to know where the page you are looking for went, and it will leave you thinking that maybe you just visited the page in a dream. The Fix 404 pages utility gives you a report of all of the 404 URLs that your site is sending; it also tells you how many times a URL has been tried and the last time someone tried to go there. This report gives you the opportunity to add a redirect for those 404 URLs and send those users to a relevant page. This is a big boost for SEO, because 301 > 404 in the eyes of a search bot.

Provide a Sitemap to Better Direct Search Bots

Search bots work hard; 24/7, 365 days a year, they are out there crawling the web to make your search experience better. So make it easy on them and provide a roadmap for traversing your site. This is done with an XML sitemap, which is simply an outline of your site’s pages with priority and update frequency, all wrapped up in the XML format. We typically build a custom solution for generating an XML sitemap, but there is a contributed Drupal module out there to help you do it without needing to know how to write XML. The XML Sitemap module allows you to configure your sitemap based on things like content types or menu structure. Once you have a sitemap, go ahead and give those search bots a jump start by submitting it to the search engines. Each search engine has a different process for this, so make sure you submit to more engines than just Google; the XML Sitemap module has a built-in tool for submitting to search engines as well.
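For context, the format the module generates follows the sitemaps.org protocol; a single-entry sitemap looks roughly like this (the URL and values are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/store/shirts/blue-shirt</loc>
    <lastmod>2018-09-14</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```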

Research and Testing

Lastly, let’s talk about Google Analytics. So I don’t have to write that out a bunch of times, I’m going to refer to it as GA. GA won’t do anything to help your SEO directly, but it is a crucial tool for analyzing how effective your SEO work has been. The contributed Google Analytics module makes it easy to set up on your Drupal site. A good SEO strategy is all about testing and adjusting. Make some assumptions about what topics or keywords you think will drive traffic to your site, but don’t stop there: turn your assumptions into tested data with GA. Your website should be a testing ground for new search terms; as you see traffic spike around a term, adjust the rest of your content to cater to it. GA is currently the best tool available for tracking visitors on your site. With testing and study, you will eventually land on the right terms and words so that the right people find their way to your amazing content.


Aug 15 2018


You may have heard of Drupal in passing, but never been given a straight answer on what it is and why you should care. The truth is that even if you have worked with Drupal, you might not actually know what to say when asked what it is. Looking around, there don’t seem to be a lot of great answers to this question out there either. It would be difficult to tell whether you need Drupal as a solution for your website if you aren’t even sure what it really is to begin with.

The Basics

To say we talk about Drupal a lot is an understatement. It’s non-stop Drupal all day long here at Ashday, but what is Drupal? Simply put, Drupal is an open source content management system. It’s primarily built in the PHP programming language and designed to create websites that run on a variety of different web servers.

Okay, but what is a content management system? A typical definition of a content management system, or CMS, is an application used for creating websites that focuses on publishing workflows, user management, and content delivery. These dynamic, database-driven websites usually allow for multiple editors and use templating for streamlined content development. This is different from a plain old web page written in HTML and styled with CSS. A content management system allows you to compose and edit just the content without having to adjust the HTML. It makes it easier for non-technical users to publish text and images on the web and to use that content in more than one place. With a CMS you can display your blog post on the homepage and in your blog without needing to write the same content twice.


What does it mean to be open source? Open source means that all of the core code that makes the software tick is free and open to view. Open source programs are typically free to download and use. They aren’t exclusively developed by one person or corporation and they rely on an open community of peers to maintain and improve the core code. This means that the software can be rapidly developed with a large pool of contributors. Another benefit of open source is the adaptability of custom code, since everything is open we can extend the core code to do more and behave based on the business needs of the project.

Other Popular Content Management Systems

There are many open source CMS tools available on the internet. Drupal is often mentioned in the company of other popular CMSs such as WordPress. It even gets lumped together with SaaS (Software as a Service) site builders like Squarespace and Wix. While it’s true that Drupal serves a similar role as a means of building websites, it is far different from these other systems. WordPress, for example, is primarily a blogging platform, but people have stretched it far beyond its intended use. Because of this, complex sites built with WordPress often consist of a lot of custom code of varying quality. Drupal’s real contemporaries are more along the lines of a framework like Laravel. These frameworks are much more customizable and robust, but often lack the pre-built setup of users, content types, and management features. This results in a much longer time to market for projects built on a bare framework.

What are the use cases for Drupal?

Drupal is best suited for building websites that are medium to large in size with a focus on future scalability. That isn’t to say that Drupal can’t be used for smaller sites, it does fine for this sort of thing all of the time, but it is built to handle much more and can incur more overhead than is necessary. It is a bit subjective to use terms like medium and large for a website, but when we think of large we typically mean websites used for enterprise applications.

Drupal is great at storing, managing and delivering huge amounts of content. If you have a lot to publish, then Drupal can’t be beat. The admin interface allows for creation of custom management tools that make the whole publishing workflow tailored to how your business runs. Drupal is built around being able to have many levels of users with defined roles. Permissions can be fine-tuned to create a system with a publishing workflow that won’t slow down content creators and will save time for editors and curators.

In the world of web apps, Drupal is king. Drupal 8 is a very extensible framework capable of integrating with the vast ecosystem of services offered across the internet. If you need to build a product for others to connect to, Drupal is a great choice with its core RESTful API. The object-oriented framework within Drupal makes creating large-scale applications inexpensive with a reasonable timetable.

Why do organizations choose Drupal?

Large organizations, both corporate and non-profit, trust Drupal to run their sites and applications. Drupal has earned this trust through its open source community, which provides contributed modules and timely core updates. The world-class security team keeps Drupal safer by finding vulnerabilities and writing patches for them before they can become a problem. Part of the Drupal community is the Drupal Association, which pushes cutting-edge initiatives to keep Drupal modern, innovative, and thoughtful.

The large open source community behind Drupal provides thousands of modules to extend the core functionality of Drupal, all for free. Contributed modules are reviewed by the community and given a stamp of approval if they are stable and safe to use. This is very different than many other open source communities where contributed code can be malicious or come at a cost. When you use Drupal and its contributed modules, you are benefiting from the hundreds of thousands of development hours from a giant group of developers across the globe.

How do I start a Drupal project?

Drupal can be used to run just about anything you would want on the web. Because of this flexibility, Drupal doesn’t do a whole lot on initial install without more configuration and setup. This is not a simple task for the amateur site builder; Drupal is not known as the easiest of the frameworks to learn. If you are building a medium to large site or a web application, you may want to hire professional web developers with the right technical skills. This can be accomplished by an internal team of Drupal developers or outsourced to a Drupal development agency. Check out this article we wrote to help you determine whether you should build with an in-house team or outsource.

Popular examples of Drupal websites

You have most likely seen or used a site built on Drupal and didn’t even realize it. There are a lot of very influential sites out on the web that leverage Drupal to deliver content. Here is a small list of those:

ncaa.com

harvard.edu

taboola.com

ed.gov

economist.com

billboard.com

Drupal is very popular in higher education, with many universities and colleges running their sites on Drupal 7 & 8. Drupal also has a large share of local government sites and the publishing industry. Drupal is everywhere you look, but its flexible structure allows it to power a variety of types of sites while staying invisible, letting the content it serves be what’s visible.


Jul 27 2018

Five Drupal Features essential to the publishing industry.

If you are in the publishing industry, you already know that Drupal 8 is by far the most useful CMS for publishers. It was great in Drupal 6 & 7, and it keeps getting better with each Drupal 8 release. Combined with the community’s contributed modules, Drupal 8 is the best platform for publishers yet. Here are five features in Drupal 8 that are essential to publishing.

1. Workflows

Moderation of content is crucial for quality when you have a lot of authors across multiple platforms. Drupal 8 has that solved out of the box with the core Workflows and Content Moderation modules. Check out this short post we did on Workflows in Drupal 8 core for more info. Your CMS workflow should match your internal publishing workflow. Drupal 8 can automate most manual processes that you currently have. This is a prime example of getting your CMS to do some heavy lifting within your organization.

2. Integrations

Drupal is a fantastic framework for integrations. In the publishing business, you are going to require a decent amount of integrations. Google Tag Manager, Doubleclick for Publishers, HubSpot, and SalesForce just to name a few. Drupal’s object-oriented framework allows for integrations to be built quickly and efficiently. Plus, they can be custom tailored to your business needs. Many other popular CMS and site-builders are limited to the available integrations and options. Drupal is an open framework that can be custom built to do just about anything you need.

3. Content Management

Organizing, managing, and scheduling content is key to a successful publishing site. Drupal 8 builds on top of the fantastic content management system started in Drupal 7. Custom admin tools tailored to your business can be built on the fly using the core Views module. With custom tools you can build admin pages for any type of content you wish. You can even make an admin screen for niche needs. For example, if you want an admin screen for articles that were published in 2012 that were about cats, you can very easily make one. Content admin tools have the ability to be built around the permissions of the different roles in your organization. You can fine tune what access and permissions your editors, authors, admins, or others can have. This is all out of the box functionality in Drupal 8 and requires no custom code.

4. Asset Management

Like content management, asset management is fantastic in Drupal 8. Assets are treated much in the same way that content is. When I say assets I’m loosely defining them as any reusable element. Images, video, audio clips and other files are obvious assets, but you can also create some more obscure assets like tweets or Instagram posts. Assets are fieldable, so you can add additional data for display and categorization. Using the core Media module, you can manage all of your assets. Custom libraries can be created as well as custom selection widgets that are restricted to specific types of assets. This can be a huge timesaver for assets that are commonly used across many authors.

5. Web Services

Drupal 8 comes with a RESTful Web Services API in core. To put it simply, a RESTful web service can be a conduit for sending your content from Drupal to another system, like a native app. This can jump-start a “publish once, distribute everywhere” content model. For publishers, this can be essential for content control and efficiency. There is an active API-first initiative in Drupal 8 development that is pushing new features into each release. Besides the out-of-the-box RESTful Web Services API, there are several contrib modules, including JSON API and GraphQL. These web services are also what give Drupal the ability to go headless and communicate with a JavaScript front end such as React.
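As a quick illustration, pulling a node as JSON from core REST can be a single request. The URL here is hypothetical and assumes the GET resource has been enabled with the json format:

```shell
# Hypothetical request against a Drupal 8 site with core REST enabled.
# The _format query parameter tells Drupal which serialization to return.
curl -H 'Accept: application/json' \
  'https://example.com/node/42?_format=json'
```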

Drupal as the hub of a publishing ecosystem

We will cover more on this in the near future, but these 5 things really show why Drupal 8 is a first-class CMS for the publishing industry. Drupal 8, combined with great content, is a killer recipe for a successful publishing site. The out-of-the-box features of Drupal 8 will save you time and money. If you are a publisher who is constantly struggling and fighting with your website, it's time to give Drupal 8 a look. 


Jul 21 2018

Unicode characters encoded as UTF-8 can use 1 to 4 bytes to represent a single character. However, older versions of MySQL only supported storing UTF-8 encoded characters that used 1 to 3 bytes. This was enough to cover the most commonly used characters, but it is not suitable for applications that accept user input where any character can be submitted (like emoji, which use 4 bytes). Newer versions of MySQL provide a character set called utf8mb4 to fix this issue. Drupal 7 supports it but requires some special configuration; Drupal 8 is configured this way by default.
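You can see that 1-to-4-byte range for yourself from the command line. Counting the UTF-8 bytes of a few characters shows exactly why a 3-byte limit breaks emoji (this assumes a UTF-8 terminal and locale):

```shell
# Count the UTF-8 bytes of a few characters.
printf 'a' | wc -c    # 1 byte: plain ASCII
printf 'é' | wc -c    # 2 bytes
printf '€' | wc -c    # 3 bytes: the most MySQL's old utf8 could store
printf '😀' | wc -c   # 4 bytes: mangled or rejected by 3-byte utf8
```

Anything in that last category is what utf8mb4 exists to store.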

Existing Drupal 7 sites that were set up with MySQL's old 3-byte-max utf8 encoding must undergo a conversion process to change the character set on tables and text columns from utf8 to utf8mb4. The collation (what MySQL uses to determine how text fields are sorted) also needs to be changed to a utf8mb4 variant. Thankfully, there's already a drush command you can download that does this conversion for you on a single database. Before running it, you should ensure that your MySQL server is properly set up to use the utf8mb4 character set. There's a helpful guide on this available on Drupal.org. After the conversion is run, you still must configure Drupal to communicate with MySQL using the new character set, as described in the guide I linked to.
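For reference, the Drupal side of that configuration lives in settings.php. This is a sketch of a Drupal 7 database connection array with the relevant keys; your driver, host, and credentials will of course differ, and the exact values should be checked against the Drupal.org guide:

```php
// settings.php (Drupal 7): tell Drupal to talk to MySQL using utf8mb4.
$databases['default']['default'] = array(
  'driver' => 'mysql',
  'database' => 'drupal',
  'username' => 'drupal',
  'password' => 'secret',
  'host' => 'localhost',
  'charset' => 'utf8mb4',
  'collation' => 'utf8mb4_general_ci',
);
```

The 'charset' and 'collation' keys are the important part; the guide also covers the matching MySQL server settings.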

Part of my job is to help maintain hundreds of sites running as a multi-site installation in a single codebase. So, same codebase, but hundreds of databases, each of which needed its tables converted to the new encoding. Converting a single database is not such a big deal, because it only takes a few minutes to run, but since I was dealing with hundreds, I wanted to make sure I had a good process laid out with plenty of logging. I created the bash script below, which placed each site in maintenance mode (if it wasn't already), ran the drush command to convert the database, then took the site out of maintenance mode.

All in all, it took about 10 hours to do this for ~250 websites. While the script was running, I was monitoring for errors or other issues, ready to kill the script off if needed. I added a 3 second sleep at the end of each conversion to allow me time to cleanly kill the script.

After the script was completed, I pushed up new code for the common settings.php file (each site is configured to load a common settings file that they all share) which configured Drupal to connect to MySQL using the proper character set. Between the time a database was converted and the time settings.php was updated for that site, there still should not have been any issues, because MySQL's utf8mb4 character set is backward compatible with the original encoding that only supports 3-byte characters.

Here's the script for any that may be interested:

#!/usr/bin/env bash

#
# Usage:
# Alter this script to specify the proper Drupal docroot.
# 
# Run this command and pass to it a filename which contains a list of
# multisite directory names, one per line.
#
# For each site listed in the file, this script will first put the site in
# maintenance mode (if it's not already in that state), then run the
# utf8mb4 conversion script. Afterwards it will disable maintenance mode if
# it was previously disabled.
#

### Set to Drupal docroot
docroot="/var/www/html/"

script_begin=$(date +"%s")

count=0
total="$(wc -l < "$1")"
while read -r site || [[ -n "$site" ]]; do
    start_time=$(date +"%s")
    count=$((count+1))
    echo "--- Processing site #${count}/${total}: $site ---"
    mm="$(drush --root=${docroot} -l ${site} vget --exact maintenance_mode)"
    if [ $? -ne 0 ]; then
        echo "Drush command to check maintenance mode failed, skipping site"
        continue
    fi

    # If maintenance mode is not enabled, enable it.
    if [ -z "$mm" ] || [ "$mm" = '0' ]; then
        echo "Enabling maintenance mode."
        drush --root=${docroot} -l ${site} vset maintenance_mode 1
    else
        echo "Maintenance mode already enabled."
    fi

    drush --root=${docroot} -l ${site} utf8mb4-convert-databases -y $site

    # Now disable maintenance mode, as long as it was already disabled before.
    if [ -z "$mm" ] || [ "$mm" = '0' ]; then
        echo "Disabling maintenance mode."
        drush --root=${docroot} -l ${site} vset maintenance_mode 0
    else
        echo "Maintenance mode will remain on, it was already on before update."
    fi

    echo "Clearing cache"
    drush --root=${docroot} -l ${site} cc all

    end_time=$(date +"%s")
    echo "Completed in $(($end_time - $start_time)) seconds"
    echo "Done, sleeping 3 seconds before next site"
    sleep 3
done < "$1"

script_end=$(date +"%s")

echo "Ended: $script_end ; Total of $(($script_end - $script_begin)) seconds."

Jul 20 2018

Illustration of workflow concept

Did you know that setting up a content workflow is included in Drupal 8 core? It can be done by simply turning on the Workflows and Content Moderation modules. The Workflows module gives you the ability to define a workflow, and the Content Moderation module sets up a simple workflow for drafts along with the ability to create more content moderation workflows.

Introduction to Content Moderation

If you aren't familiar with content moderation in Drupal, let's fix that with a quick overview. Without this module, Drupal content can only be in one of two states: published or unpublished. The content is either available to the public or it isn't. Different users with different permissions can manage the publishing status of certain types of content. All of this combines into a pretty useful experience, and one that should feel familiar to anyone who has ever used a content management system.

The Content Moderation module comes in when this workflow needs to be more than just on or off, allowing content to be placed into different states. With built-in states like Draft, content can be staged by one editor and then approved by a user with different permissions. In the publishing world, this matches workflows that already exist for print content, so this is an attractive feature for those in that vertical.

Set up in 5 minutes

Content Moderation in core provides a very simple but useful workflow. Simply turning on the Workflows and Content Moderation modules adds the Editorial workflow. The Editorial workflow adds the Draft state, sets up content to move from Draft to Published, and provides an admin view for drafts that need to be moderated. Using permissions, you can restrict authors to only creating and editing drafts. Then grant the "Transition drafts to published" permission to your editors and boom! You have content moderation set up in a matter of minutes.
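If you prefer the command line, enabling the two modules is a one-liner with drush (a sketch; the machine names are workflows and content_moderation):

```shell
# Enable the two core modules that provide the Editorial workflow.
drush en workflows content_moderation -y
```

From there, the Editorial workflow and its permissions are configured in the admin UI as described above.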

View of content moderation in Drupal core

 

If you are running a Drupal 8 site and have multiple content contributors that need feedback, there is little reason not to use moderation. The out-of-the-box Content Moderation module should be able to handle most situations. If it doesn't quite fit the needs of your content workflow, there is still good news: you can create a custom workflow.

Custom workflows

If you need a more complex workflow, you probably still don't need to write any custom code. If it's just content types, blocks, or paragraph-based entities that you are building a workflow for, you can create a new workflow based on the "Content" moderation type. Different states can be defined; for example, let's say you have the states Draft, Ready for Review, Ready for Second Review, Reviewed, and Published. Next you define the transitions; for example, one transition for the above states would be "Move Ready for Review to Ready for Second Review". These transitions are what you give different user roles permission to perform. After that is set up, you and your team are ready to roll with your new workflow.
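Exported, a workflow like that is just a config object. Here's an abridged, hypothetical sketch of what the YAML might look like; the IDs, labels, and exact schema keys should be checked against a real export from your own site:

```yaml
# workflows.workflow.editorial_two_step.yml (hypothetical, abridged)
id: editorial_two_step
label: 'Editorial (two reviews)'
type: content_moderation
type_settings:
  states:
    draft:
      label: Draft
      published: false
      default_revision: false
    ready_for_review:
      label: 'Ready for review'
      published: false
      default_revision: false
    published:
      label: Published
      published: true
      default_revision: true
  transitions:
    send_to_review:
      label: 'Send to review'
      from: [draft]
      to: ready_for_review
    publish:
      label: Publish
      from: [ready_for_review]
      to: published
```

Each transition then maps to a permission that you grant to the appropriate roles.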

Setting up a complex workflow in Drupal 8 Core

 

Another example of a custom moderation workflow could be for a site that publishes to multiple platforms. A workflow could be set up to allow the editors of different platforms to approve content onto the system they manage. Let’s say you have a front-facing site, a native app, and an internal portal. You can create a workflow that goes through moderation for the front-facing site and then adds content to a queue to be published or declined for each of the other outlets. This is just one of the many possible use cases for a custom workflow.

Illustration of workflow in Drupal 8 Core for publishing to multiple platforms

If you need to extend Workflows further, maybe for a custom entity, you can write a WorkflowType plugin that covers your needs. This can be used for any entity that needs to change states, so think beyond the idea of content moderation. It could be used for steps in a manufacturing process or steps for ordering in a restaurant app; the possibilities are limitless.

Do you need it?

Workflows are super powerful, and moderation comes mostly ready to go with Drupal core, but does that mean you should always use them? On sites with only a handful of admins and not a lot of roles, a workflow may be more cumbersome than useful. Just because workflows are an option doesn't mean they should be implemented, unless your users understand the human element of a workflow: moderators should know their role in content moderation. If most of your authors have admin privileges and can push content straight through the flow, then your workflow is mostly standing in the way of efficiency. Every good workflow should start with a meeting of your entire editorial team, with things worked out on a whiteboard or in a Slack channel first. Workflows are amazing as long as everyone understands their role in the publishing chain. 


Mar 23 2018

Why Squarespace Will Replace WordPress, WordPress Will Replace Joomla, and Drupal Will Replace Drupal

When the open source WordPress blogging platform first came out, it opened up the world of internet publishing to the masses. Sure, there were website builders out there like GeoCities and Angelfire, but they lacked many features and were very ugly. When WordPress came along, it gave a voice to those willing to overcome the barriers of setting up web hosting and installing the software. These days there are much better website builders for the average person; Squarespace, a standout of the group, even has an easy-to-use e-commerce option. Because of this, the roles of many popular content management systems (CMSs) are shifting.

Squarespace is the new WordPress

While WordPress usage is higher than ever, it seems that a large portion of the DIY and personal site market is shifting to services like Squarespace. It makes sense, since Squarespace and its ilk (Wix, Weebly, etc.) are often easier and cheaper in the long run; with hosting and support rolled into one cost, everything is simplified. People building personal sites and even small business sites have been migrating to these kinds of site builders more and more every year. The trend is sure to keep rising as these services start to offer e-commerce and other business tools.


WordPress doing more than ever

WordPress really hasn't changed all that quickly over the years, but what has changed is the ways in which people are using it. It is still, deep down, a blogging platform, but people have extended it to be much more of a full CMS. The sheer number of developers familiar with WordPress development is what has pushed it to be the top open source CMS on the web today. WordPress is now the CMS of choice for most small to medium sized enterprise sites, and it is taking on more and more complexity, potentially pushing middling tools like Joomla further into the fringes of the market and perhaps eventual irrelevance. WordPress will probably continue to be the top CMS for at least the next few years as more web design and marketing agencies make it a cornerstone of their services. Yet WordPress at its core doesn't seem to be doing much to accommodate this new kind of usage beyond the blog. We are already starting to see the bubble burst as these sites require more functionality than WordPress can handle, and many end up being more custom code than WordPress. It will be interesting to see whether WordPress adapts or loses share in the enterprise world over the next few years.

Drupal is more of a framework

Drupal has always been viewed as more of a framework than a CMS. With the release of Drupal 8, Drupal doubled down on the framework concept, incorporating the Symfony PHP ecosystem into its core. Drupal 8 has become the perfect option for large Drupal 7 sites that have begun to outgrow what Drupal 7 can do. Drupal 8 has positioned itself as a viable option for many web-based apps and can easily beat out non-CMS frameworks such as Laravel in terms of development speed and scalability. Drupal 8 is filling the functionality gap that WordPress just can't fill. I predict that enterprise migrations from WordPress to Drupal 8 will be on the rise over the next couple of years as businesses require more of what the internet has to offer. 

Which platform you go with will depend on your website's needs. Small brochure-type sites will easily find a home on one of the instant site builders, and those with a strong understanding of WordPress might continue to use WordPress. We at Ashday strongly believe that Drupal will be able to serve a wide range of needs for a long time to come. Its stability and scalability have only improved with the latest iteration, and in the hands of the right team it can be made to do just about anything.


Mar 23 2018

I'm working on creating a Drupal 8 installation profile and learning how installation profiles can override the default configuration that their modules provide at install time.

All Drupal 8 modules can provide a set of configuration that should be installed to the site when the module is installed. This configuration is placed in the module's config/install or config/optional directory. The only difference between the two is that configuration objects placed in the config/optional directory will only be installed if all of their dependencies are met. For example, the core Media module has a config file config/optional/views.view.media.yml which will install the standard media listings view, but only if the Views module is available on your site at the time of install.
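The dependency check works because each config object declares its dependencies in the YAML itself. Abridged from the media listing example above, the relevant part of such a file would look something like this (a sketch, not the full file):

```yaml
# config/optional/views.view.media.yml (abridged sketch)
langcode: en
status: true
dependencies:
  module:
    - media
    - views
id: media
label: Media
```

If the Views module isn't installed when the Media module is, this object is simply skipped.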

The power of installation profiles is that they can provide overrides for any configuration objects that a module would normally provide during its installation. This is accomplished simply by placing the config object file in the installation profile's config/install or config/optional directory. This works because when Drupal's ConfigInstaller is installing any configuration object, it checks to see if that config object exists in your installation profile, and uses that version of it if it exists.

However, overriding the default configuration that a module would normally provide is a double-edged sword and brings up some interesting challenges.

If you dramatically alter a configuration object that a module provides, what happens when that module releases a new version that includes an update hook to modify that config? The module maintainers may write the update hook assuming that the config object installed on your site is identical to the one the module provided out of the box at install time. I think this falls on the module maintainer: update hooks should first check that the config object is mostly what they expect before modifying it. If not, fatal errors could be thrown.
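As a sketch of what that defensive style could look like in an update hook (the module name, config name, and values here are entirely hypothetical):

```php
/**
 * Hypothetical update hook that checks config before changing it.
 */
function mymodule_update_8101() {
  $config = \Drupal::configFactory()->getEditable('mymodule.settings');
  // Only apply the change if the value still looks like the default we
  // shipped; a site (or install profile) may have overridden it.
  if ($config->get('items_per_page') === 10) {
    $config->set('items_per_page', 25)->save();
  }
}
```

If the check fails, the hook leaves the overridden config alone instead of clobbering it or crashing.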

Another challenge I ran into recently is more complicated. My installation profile was overriding an entity browser config object provided by the Lightning Media module. Entity browsers use views to display lists of entities on your site that an editor can choose from. My override changed this config object to point to a custom view that my installation profile provided (placed in its config/install directory), but it didn't work. When installing a site with the profile, I was met with an UnmetDependenciesException claiming that the entity browser override I provided depended on a view that didn't exist. Well, it did exist; it's right there in the install folder for the profile! After some debugging, I found this happens because Drupal doesn't install config from the installation profile until all of the modules the install profile depends on are installed first. To summarize: it's not possible for a module's default config objects to depend on config that is provided by an install profile.

Feb 14 2018

Sometimes you need to make custom modifications to a composer package. Assuming that your modification is a bug fix, the best approach is to file an issue with the package's issue queue and submit the fix as a pull request (or a patch file when dealing with Drupal projects). Then you can use the composer-patches plugin to include the change in your project.

However, this doesn't always work. I needed to modify the composer.json file of a package that my project used. I tried creating a patch for it as mentioned above, but composer didn't use the patched changes to composer.json. I imagine this is because a package's composer.json file is parsed before composer-patches has a chance to modify it.

So the next best thing is to fork the package you need to modify to make the changes you need. The package I was modifying was already hosted on GitHub, so I forked it, made my change in a new branch, and pushed it up to my fork.

From there, I just had to change my project's composer.json file to add my fork to the list of package repositories to scan when looking for project dependencies. This is described in composer's documentation. I changed the version constraint to "dev-my-branch-name" as instructed.
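Putting that together, the relevant part of the project's composer.json looks something like this (the fork URL, package name, and branch name are placeholders):

```json
{
  "repositories": [
    {
      "type": "vcs",
      "url": "https://github.com/my-github-user/forked-package"
    }
  ],
  "require": {
    "somevendor/some-package": "dev-my-branch-name"
  }
}
```

With a vcs repository, composer scans the fork's branches and tags to find installable versions, and the dev-my-branch-name constraint tells it to use that branch directly.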

But for some reason, composer was still refusing to use my version of the repo. After more digging, it turns out that's because composer looks at the default branch of the forked repo to "discover" what package it is. My fork was really old, and the default branch was an older branch. This old branch of code used a different name for the package in its composer.json file! The package name needs to match exactly what you have in your project's requirements list. To fix this, all I had to do was sync the default branch of my fork with the upstream.

Feb 14 2018

Yes, a blog post about Drupal 7!

I recently worked on an enhancement for a large multi-site Drupal 7 platform to allow its users to import news articles from RSS feeds. Pretty simple request, and given the maturity of the Drupal 7 contrib module ecosystem, it wasn't too difficult to implement.

One somewhat interesting requirement was that images from the RSS feed be imported to an image field on the news article content type. RSS doesn't have direct support for an image element, but it has indirect support via the enclosure element. According to the RSS spec:

It has three required attributes. url says where the enclosure is located, length says how big it is in bytes, and type says what its type is, a standard MIME type.

RSS feeds will often use the enclosure element to provide an image for each item in the feed.
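In practice, an item in such a feed looks something like this (the URLs and values are made up; note the three required attributes from the spec):

```xml
<item>
  <title>Example news article</title>
  <link>https://example.com/news/example-article</link>
  <enclosure url="https://example.com/images/example.jpg"
             length="84312"
             type="image/jpeg"/>
</item>
```

The url attribute is what Feeds needs to get at in order to import the image.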

Despite still being in a beta release, the Drupal 7 Feeds module is considered quite stable and mature, with its most recent release in September 2017. It has a robust interface that suited my use case well, allowing me to map RSS elements to fields on the news article content type. However, it doesn't support pulling data out of enclosure elements in the source. But alas, there's an 8-year-old issue containing a very small patch that adds the ability.

With that patch installed, the final step is to find the proper "target" to map its data to. It's not immediately clear how this should work. Feeds needs to be smart enough to accept the URL of the image, download it, create a file entity from it, and assign the appropriate data to the image field on the node. Feeds exposes 4 different targets for an image field:

Feeds image field targets

Selecting the "URI" target is the proper choice. Feeds will recognize that you're trying to import a remote image and download it.

Oct 23 2017

Drupal sometimes gets a bad rap for being overly complex, but aren't your site needs also complex? If they aren't, you can stop reading here, skip over WordPress and go straight to Squarespace. If Squarespace can’t cover all your needs, keep reading.

Okay, if you are still reading, then you have complex web needs. The good news is that Drupal has you covered. But simply saying your web needs are complex is too generic, so let's break it down and see how Drupal deals with different kinds of complex needs.

Do you have a lot of content?

Drupal doesn’t care how much content you have. With its out-of-the-box tools, Drupal gives you a nigh-infinite number of ways to organize and manage your content. 

The built-in, customizable admin tools let you handle large amounts of content in an easy-to-use management system. On the frontend, a well-executed content-driven design will give your users a great experience even if you have hundreds or thousands of pages to navigate. Having a good Drupal partner is the key to establishing a content plan on both the backend and frontend. 


What does your content look like?

People tend to think of content as being pages on a website, but this isn't quite accurate. Most websites have pages that are part static and part dynamic. Consider, for example, Amazon.com. Very few of the "pages" on Amazon.com are set in stone; instead, most of them are the result of a search combined with filtering to show the user lots of individual pieces of content in a single view.

Instead of thinking of entire pages as pieces of content, let's break it down further. Content is:

  • Text
  • Images
  • Files
  • Products
  • Items
  • Locations
  • Data Points
  • People

The list goes on. Drupal doesn’t care how you define content, it simply gives you the tools to define it in any way you need to. Drupal then takes all of your different types of content and gives you the power to manage and display them together in countless configurations and displays.

For example, let's say you have products and locations, each defined as individual pieces of content. Using Drupal, we can display the products that a location stocks on the location's page, and we can also display the locations where a product is available on the product's page.

Drupal allows you to publish the details of each type of content in one system and then use and remix that content across multiple pages. You can't do that with a single WYSIWYG (what-you-see-is-what-you-get) field on a page, like you would get with an out-of-the-box WordPress site.

A good Drupal partner can help you define the shape, size, and types of content that your website needs.

Do you have users on your site?

Having users on your site is always going to crank up the complexity of the site. Fortunately, Drupal supports user accounts right out of the box. You can have many types of users by assigning them different roles, and you can have hundreds or thousands of individual users. Drupal handles users much like it does content, with near-infinite flexibility. Each user role has its own set of permissions and rules to follow. This allows you to do things like turning user memberships into products, relating users to content, and configuring what your internal team members can do to edit the site content.

Your Drupal partner can help determine what kind of users you need.

Do you want to grow?

Even if your needs aren’t quite as complex as described above, Drupal may still be the right fit, especially if you intend on growing. Drupal is built to scale. Defining types of content and users in your system now will save you from needing to do so in the future.

Your site may be just an idea and a few pages at the moment, but when it's time to grow, don't let an inflexible website stand in your way. Today you may be selling your goods on a simple website, but tomorrow you may get that big wholesale order that needs to connect with a large data system to get product details and specs and even automate reorders.

A good Drupal partner will help build your small site with the foundations to become a big site.

You need Drupal

A good Drupal site is an indispensable business tool, but I would not recommend taking on building an enterprise or complex Drupal site on your own. Having a good Drupal partner is key to a successful Drupal site.

There are a lot of articles out on the web that make the case that Drupal is too hard or complicated. Drupal isn’t hard or complicated for Drupal people... and we are Drupal people. Drupal is complex but that complexity allows us to tailor your Drupal site to match your business.

We will handle the complexity of Drupal, and you can handle the complexity of your business.


Oct 17 2017


My instinct is to say never... but if you are still wondering "Should I use Drupal?", read on and we will take a deeper look to see if we can find some cases where Drupal may be a bad fit.

Brochure and small websites

Drupal can be overkill if your site needs consist of a page or two and maybe a webform. I would agree that the overhead of Drupal may be a bit much here, but you should also consider your future needs. Your simple site may need to grow into something that requires users, e-commerce, or more complex data handling. If so, you would save money in the future by investing in a solid web framework early on.

Legacy Systems

If years of your data are tied up in a legacy system, it may seem too risky to switch the site over to Drupal. That is understandable, but you should also calculate the costs of maintaining the old system, the added time it takes to add new features, and the vulnerabilities that come up in older, deprecated software. It takes a lot of planning and can be a bit tedious, but a migration to Drupal 8 may actually cost less in the long run.

Small Budget

Drupal development costs do seem to be a bit higher than development on other popular content management systems. But I would say that the difference in cost is negligible, especially when you have a lot of custom needs. Drupal will actually save you money once it comes time to build custom features.

See our article on Drupal vs. WordPress for more details when comparing the true costs of a WordPress site.

If you truly have a tiny budget and many development needs, it is probably time to face reality and scale back to what you can truly afford, regardless of which CMS you choose. If you have a small budget and you can't accomplish everything you need with a Wix website, then it may be time to re-think your web presence entirely.

No Drupal Experience

If your internal development team or development partner has no previous experience with Drupal, then it really might not be the best choice for your project. It is definitely worth the effort to learn, but be prepared for big increases to your project timeline. However, this can easily be solved by supplementing your existing team with an experienced Drupal shop like Ashday.

It turns out that there are a few instances where Drupal may not be the best fit, but most of them can be overcome with some planning and evaluation of future needs and requirements.

If you want to build in Drupal but don’t have the right team for the job, you are in luck! Ashday can help with that. 


May 18 2017

Imagine you have a view that lists upcoming events on your Drupal 8 site. There's a date filter that filters out any event whose start date is less than the current date. This works great until you realize that the output of the view will be cached in one or many places (dynamic page cache, internal page cache, Varnish, etc.). Once it's cached, Views doesn't execute the query and can't compare the date to the current time, so you may get older events sticking around.

One way of fixing this is to assign a custom cache tag to your view, and then run a cron task that purges that cache tag at least once a day, like so:

/**
 * Implements hook_cron().
 */
function YOUR_MODULE_cron() {
  // Invalidate the events view cache tag if we haven't done so today.
  // This is done so that the events list always shows the proper "start"
  // date of today when it's rendered. If we didn't do this, then it's possible
  // that events from previous days could be shown.
  // This relies on us setting a custom cache tag "public_events_block" on the
  // view that lists the events via the views_custom_cache_tag module.
  $state_key = 'events_view_last_cleared';
  $last_cleared = \Drupal::state()->get($state_key);
  $today = date('Y-m-d');
  if ($last_cleared != $today) {
    \Drupal::state()->set($state_key, $today);
    \Drupal::service('cache_tags.invalidator')->invalidateTags(['public_events_block']);
  }
}

Assuming you have cron running just after midnight, this will refresh the cache of the view's block and the page at an appropriate time so that events from the previous day are not shown.
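For completeness, "cron running just after midnight" can be as simple as a system crontab entry that triggers Drupal cron via drush (the paths and URI here are hypothetical):

```
# Run Drupal cron at 12:05 AM every day.
5 0 * * * /usr/local/bin/drush --root=/var/www/html --uri=https://example.com cron
```

That way the invalidation in hook_cron() fires shortly after the date rolls over.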

Jan 13 2017

I'm releasing a new series today!!!!!

Over the last year, I've given a talk at DrupalCon, DrupalCorn Camp, and Drupal Camp Colorado all about using Composer and Configuration Management in Drupal 8. Those sessions were around 45 minutes, which is much too short to go in depth, and explain everything thoroughly.

This series is the answer to that issue.

There is just over 1hr 15min of content in 26 videos that fall pretty well into seven parts. Here's the outline:

Part 1: Intro

Part 2: Installing Drupal 8 Locally

  • Creating a New Drupal 8 Project Using the Composer Template
  • Setting Up MAMP to Serve Your Site Locally
  • Using xip.io for Local Device Testing
  • Creating a Drush Alias
  • Installing Drupal with Console
  • Configuring settings.php and settings.local.php
  • Committing Your Project to Git

Part 3: Using Composer

  • Installing and Uninstalling Modules with Composer
  • Installing the Dev Version of Modules
  • Updating and Downgrading Modules
  • Skipping Specific Module Versions
  • Specifying Acceptable Version Ranges
  • Enabling Modules with Drush and Deciding What Version Pattern to Use

Part 4: Configuration Management

  • Setting the Config Directory in settings.php
  • Exporting Config Locally
  • Using the Configuration Installer Install Profile

Part 5: Installing Drupal 8 on a Remote Server

  • Installing the Site on a Production Server with Composer

Part 6: Overriding Settings in Code with settings.local.php

  • Setting up settings.local.php
  • Changing the Site Name and Disabling CSS Aggregation in settings.local.php
  • How to Enable Theme Debugging on Development Sites
  • Overriding Module Configuration (Like Google Analytics) in settings.local.php

Part 7: Putting it all Together

  • Configuring a Local Site and Exporting its Configuration with Git
  • Pulling Changes to a Remote Site (And some gotchas)
  • Using Drush Shell Aliases to Make Development Easier
  • Verifying the Changes Made on Local and Reflected on Live

I'm pretty excited to finally have this series out, and hope you enjoy it! Oh... did I mention that it's ABSOLUTELY FREE?!?! Well, it is! So, check it out now, and let me know what you think.

Dec 09 2016

I'm working on a site where the editorial staff may occasionally produce animated GIFs and place them in an article. Image styles and animated GIFs in Drupal don't play nice out of the box. Drupal's standard image processing library, GD, does not preserve GIF animation when it processes images, so any image style applied to a GIF will strip the animation. The ImageMagick image processing library is capable of preserving animation, but I believe the only way is to first coalesce the GIF, which dramatically increases the output size and is unacceptable for this project (my sample 200 KB GIF ballooned to nearly 2 MB). For anyone interested in that approach anyway, the Drupal ImageMagick contrib module has a seemingly stable alpha release, but it would require a minor patch to retain animation.

I'm mostly interested in somehow getting Drupal to just display the original image when it's a GIF to prevent this problem. On this site, images are stored in an image field that's part of an Image Media Bundle. This media bundle supports JPEGs and PNGs as well, and those are typically uploaded in high resolution and need to have image styles applied to them. So the challenge is to use the same media bundle and display mode for GIFs, JPEGs, and PNGs, but always display the original image when rendering a GIF.

After some digging and xdebugging, I created an implementation of hook_entity_display_build_alter which lets you alter the render array used for displaying an entity in all view modes. I use this hook to remove the image style of the image being rendered.

/**
 * Implements hook_entity_display_build_alter().
 */
function my_module_entity_display_build_alter(&$build, $context) {
  $entity = $context['entity'];

  // Checks if the entity being displayed is an image media entity in the "full" display mode.
  // For other display modes it's OK for us to process the GIF and lose the animation.
  if ($entity->getEntityTypeId() == 'media' && $entity->bundle() == 'image' && $context['view_mode'] == 'full') {
    /** @var \Drupal\media_entity\Entity\Media $entity */
    if (isset($build['image'][0])) {
      $mimetype = $build['image'][0]['#item']->entity->filemime->value;
      $image_style = $build['image'][0]['#image_style'];
      if ($mimetype == 'image/gif' && !empty($image_style)) {
        $build['image'][0]['#image_style'] = '';
      }
    }
  }
}

So now whatever image style I have configured for this display mode will still be applied to JPEGs and PNGs but will not be applied for GIFs.

However, as a commenter pointed out, this would be better served as an image field formatter so you can configure it to be applied to any image field and display mode. I've created a sandbox module that does just that. The code is even simpler than what I've added above.

Oct 12 2016
163 Easy Local Development Using Kalabox with Mike Pirog - Modules Unraveled Podcast

Kalabox

  • What is Kalabox?
  • A brief history of Kalabox
  • Is there a plan to use the “official” Docker for Mac backend instead of VirtualBox?
  • Current update on state of Kalabox
  • How does Kalabox compare with other local dev tools like MAMP, DrupalVM, etc.?
    • Specifically: Speed, flexibility
  • Is Kalabox, or will it be usable with server environments other than Pantheon? Ie: Acquia, VPS, PlatformSH

Use Cases

  • Team standardization
  • Fast local dev
  • Automated repeatable tasks
  • Github workflow?
  • Composer based workflow?
  • Our three month roadmap

Tandem

  • You mentioned Tandem in the intro, and you gave me a brief description before the show, but can you expand a little bit on what that is?
Apr 01 2016

I’m really excited about today’s show because this is one of those topics that I know that I need to know, but don’t know where to start. So, I’m super excited that you agreed to come on and teach me what you know about regression testing!

Regression Testing

  • What do you mean when you say “Regression Testing”?
  • Why is regression testing important?
  • Is there anything to install on your computer? (Testing server)
  • What types of regression testing are there?
    • Unit/Functional (Mocha/Chai)
    • Behavioral (WebdriverIO, Behat)
      • Authenticated/Anonymous
    • Qualitative (Sasslint, JS lint, JSON lint)
    • Performance (Pagespeed, Perfbudget)
    • Visual Regression (wraith, phantomcss, webdrivercss)
      • How does this work on pages that have dynamic content. Homepage photos, comments etc.
  • Where would you start?
    • Front-end
    • Back-end
    • Site-builder

Where to learn

Mar 23 2016
158 Using the Group module as an Alternative to Organic Groups in Drupal 7 and 8 with Kristiaan Van den Eynde - Modules Unraveled Podcast

Group Module

  • What is the Group module?
    • A really awesome tool to basically create subsites within one site, private content, manage groups of people or all that combined.
  • Why did you create it instead of just using OG?
    • OG DX experience...
  • There are versions for D7 and D8. Which are you more focused on?
  • How is the Group module different from Organic Groups?
    • Good question! The key difference is how the modules decide to structure their data, how that affects the user flow and how the configuration model is built.
  • What is the underlying architecture? OG uses entityreferences heavily. How does Group work?
    • Dedicated Group Entity
  • What’s the status? Is it usable now? (D7 and D8)
    • Available for Drupal 7 and 8, with the 8 version being a large improvement over D7. There’s a few minor things I need to add to the D8 version, but it looks and works great already!
  • Is there much difference between the Drupal 7 and 8 version?
    • Yes and no: the key concept remains the same, but the UX and data model was improved even further. I would really recommend going forward with D8 from now on if you have the chance.

Use Cases

  • Why should I use Group instead of OG?
    • UX, UX, UX
    • data structure
    • DX
    • But most of all: dedicated functionality, it will all make sense!
  • Users that get something special that you can’t do with Roles
  • Groups of people
  • Groups of content
  • There are so many!!!

Questions from Twitter

  • Michelle Lauer @bymiche
    How many subgroups can you nest? How are permissions inherited?
  • Michelle Lauer @bymiche
    Can you easily categorize roles within a group?
  • Michelle Lauer @bymiche
    If you have many roles and want to expose them to "managers" - for UX purposes, roles in categories would be easier to look at
  • Erich Beyrent @ebeyrent
    Sounds like a great UX, what about DX - is there a well-defined API as well?
  • Damien McKenna @DamienMcKenna
    Any plans to port some OG modules, e.g. og_menu or og_menu_single?
  • Damien McKenna @DamienMcKenna
    Would it be possible to create og_forum's functionality out of the box or will it require custom work?
  • Damien McKenna @DamienMcKenna
    Are the join forms configurable/fieldable?
  • Ted Bowman @tedbow
    Interested in this. Can you add fields to a membership?
  • Erich Beyrent @ebeyrent
    What doesn't Groups do? Are there features that you feel need to be added before 1.0 release?
Mar 09 2016
157 The Drupal 8 Port of Advagg with Nick Wilde - Modules Unraveled Podcast

Advagg

  • First, can you give us an overview of what Advagg is?

The Drupal 8 Port

  • Are you the main maintainer of the D8 version?
  • What was the porting process like?
  • What features are in the Drupal 8 version right now?
  • What's the status of the Drupal 8 version?
  • Do you know of any compatibility issues with other modules?
  • What do you have planned for the future?
Feb 18 2016

Here's the issue

If you're using MAMP, you might have experienced an issue where a website loads just fine in the browser, but when you try to use a Drush command, you get an error like the following:

exception 'PDOException' with message 'SQLSTATE[HY000] [2002] No such file or directory' in core/lib/Drupal/Core/Database/Driver/mysql/Connection.php:146

To be completely honest, I'm not sure what causes this issue. I think it has to do with the way Drush accesses MySQL. As far as I can tell, Drush is trying to access the system MySQL, instead of the one that comes with MAMP.

Here's the fix!

Luckily, this can be easily fixed by changing:

'host' => 'localhost',

to

'host' => '127.0.0.1',

and/or by adding the following line to the database credentials in your settings.php (or settings.local.php) file.

'unix_socket' => '/Applications/MAMP/tmp/mysql/mysql.sock',

Example

So, for example, if your database credentials look like this in settings.php:

$databases['default']['default'] = array (
  'database' => 'drupal',
  'username' => 'root',
  'password' => 'root',
  'prefix' => '',
  'host' => 'localhost',
  'port' => '3306',
  'namespace' => 'Drupal\\Core\\Database\\Driver\\mysql',
  'driver' => 'mysql',
);

Change it to this (the host value changes, and the unix_socket line is added):

$databases['default']['default'] = array (
  'database' => 'drupal',
  'username' => 'root',
  'password' => 'root',
  'prefix' => '',
  'host' => '127.0.0.1',
  'port' => '3306',
  'namespace' => 'Drupal\\Core\\Database\\Driver\\mysql',
  'driver' => 'mysql',
  'unix_socket' => '/Applications/MAMP/tmp/mysql/mysql.sock',
);

With those updates in the database settings, Drush should work as expected!
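
If Drush still can't connect, it's worth confirming the socket file actually exists before blaming settings.php. This is a small sketch assuming MAMP's default socket path; the MAMP_SOCKET override variable is my own invention for convenience, not a MAMP feature.

```shell
#!/bin/sh
# Sanity check: verify the MySQL socket that MAMP creates actually exists.
# The path below is MAMP's default; override it with the (hypothetical)
# MAMP_SOCKET environment variable if you've moved it.
SOCKET="${MAMP_SOCKET:-/Applications/MAMP/tmp/mysql/mysql.sock}"
if [ -S "$SOCKET" ]; then
  echo "Socket found: $SOCKET"
else
  echo "No socket at $SOCKET - is MAMP's MySQL server running?"
fi
```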

Hope that helps!

Feb 17 2016
156 Using BigPipe to Achieve Incredible Site Speed in Drupal 8 with Wim Leers - Modules Unraveled Podcast

Big Pipe

  • What is Big Pipe?
  • This isn’t a Drupal specific thing right? Where did the idea of BigPipe come from?
  • How does it work?
    • Builds on Drupal 8’s caching system, specifically cacheability metadata (tags, contexts, max-age)
    • Rendering system (bubbling, placeholders)
    • page cache / dynamic page cache
    • BigPipe is built ON TOP OF ALL OF THE ABOVE
  • Does it work for anonymous and authenticated users?
  • Is this compatible with reverse proxies like Varnish?
  • Does BigPipe affect the need for something like redis or memchache?
  • How does BigPipe relate to authcache?

Use Cases

  • How can we start using it? What’s the install process?
  • What do we need to do to configure it for our site?
  • Is BigPipe ready to be used?
  • Is there anything like this for Drupal 7?

Questions from Twitter

  • Daniel Noyola @danielnv18
    What can I do to make my site compatible with BigPipe? O what shouldn't do?
  • Daniel Noyola @danielnv18
    Is it compatible with contrib modules like panels or display suite?
  • Ryan Gibson @ryanissamson
    I may have missed it, when BigPipe is included in D8.1, will it be enabled by default?
  • TheodorosPloumis @theoploumis
    Is the bigpipe caching working with the RESTful responses?
Feb 10 2016
155 Using the Block Visibility Groups Module as a Lightweight Replacement for Context and Panels in Drupal 8 with Ted Bowman - Modules Unraveled Podcast

Scheduled Updates

  • What is the Scheduled Updates module?
  • How does it differ from the Scheduler module?
  • What are some of the use cases? What types of updates can be scheduled?
  • Is it ready to be used?

Block Visibility Groups

  • What is Block Visibility Groups?
  • How does it differ from Context and Panels?
  • Do you intend it to be a replacement for context?
  • When would you use it instead of Panels?
  • What are some example use cases?
  • How does it work with other D8 block-related modules?
  • Is it ready to be used?
Feb 03 2016
154 Commerce 2.x for Drupal 8 with Bojan Zivanovic - Modules Unraveled Podcast

Commerce 2.x for Drupal 8

  • What has been the biggest success of Commerce on D7?
    • By starting from scratch on D7 technologies we created a solution that is intuitive to Drupal developers and easier to extend. And with 60k installs, we’ve set a record for ecommerce on Drupal in general.
  • And what do you think have been its biggest weaknesses?
    • Not prioritizing UX from the start. Took us a year after the 1.0 release to create Inline Entity Form and recreate the admin screens as a part of the Kickstart. At that point many people already had the impression that Commerce was hard to use.
    • Not providing enough direction to developers. Flexibility is important, as is having unopinionated code. But developers also need to have a clear and obvious path forward. Having an opinionated layer on top, with sane defaults, can save a lot of development time and prevent frustration.
    • Not prioritizing certain features, leaving them to contrib instead. Modules that make up the checkout ux (checkout progress, checkout redirect, addressbook), discounts. Of course, all generals are smart after the battle.
  • How has that influenced the development of Commerce 2.x?
    • With Commerce 2.x we once again started from scratch, evaluating all feedback received in the 1.x cycle. We decided to address all three of these major points.
    • Better UX means paying more attention to the product and order admin experience, as well as providing better checkout out of the box.
    • Better APIs means doing more work for the developer, especially around pricing and taxes.
    • And finally, we’re growing the core functionality. We’re expecting a dozen contrib modules to be no longer needed, as we address edge cases and add functionality.
  • What are some of the biggest new features of Commerce 2.x?
    • Multi-store will allow people to bill customers from different branches (US and FR offices, for example), or create marketplaces like Etsy.
    • Improved support for international markets means better address forms, better currency management, and significantly better tax support, the kind that will reduce the need for people to use cloud-based tax solutions, at least in Europe.
    • Support for multiple order types, each with its own checkout and workflows will allow developers to create tailored experiences for different kinds of products, such as events, ebooks, t-shirts.
    • An integrated discounts UI means more power to the store admin.
    • And this is just the beginning. Under the hood there are many small features and improvements, over both 1.x and Kickstart.
  • What has Commerce done to integrate better with the PHP and Drupal communities?
    • We’ve created several independent ecommerce libraries, attacking currency formatting, address management and taxes. These libraries are now being adopted by the wider PHP community, bringing us additional contributors.
    • On the Drupal side we’ve joined forces with the Profile2 team, creating the D8 Profile module that we’ll use for customer profiles. We’re also depending on Inline Entity Form, which is now shared with the Media team. We’re also moving some of our generic entity code into a new Entity API module, maintained together with Daniel Wehner and other community members.
    • Finally, we have been champions of Composer, the replacement for Drush Make, and required for any module that depends on external libraries.

The Future of Commerce 2.x

  • Commerce 2.x is now in alpha2. What’s included? What’s next?
    • Alpha2 includes stores and products, as well as initial order and cart implementations.
    • It also has functional currency management and formatting, address and profile management.
    • Alpha3, to be released in the next two weeks, is focusing on completing the order and cart implementations, and adding the initial checkout implementation.
    • Post-alpha3 our focus will be on discounts, taxes, and finally, payments.
    • The best way to learn more about this is to read the drupalcommerce.org blog, where I post “Commerce 2.x stories” detailing work done so far. We have several new posts planned for February.
  • When can we expect Commerce 2.x to be production ready?
    • Our current goal is to release a production-ready beta by the end of March. We should also have Search API and Rules by then. Leading up to DrupalCon New Orleans we’ll be helping the community implement shipping and licensing and port payment modules. At the same time, we’ll be focusing on reaching RC status.
  • What’s the status of commerce contrib? Like PayPal, Authorize.net, etc.
  • How can the community help?
    • Each new alpha welcomes more manual testing and feedback.
    • We also have office hours every wednesday at 3PM GMT+1 on #drupal-commerce where people can discuss code and help out on individual issues.
  • Do you feel that requiring Commerce to be installed via Composer will impact adoption?
    • The average developer is already familiar with Composer and will benefit greatly from it, just like D7 developers benefited from Drush Make. Getting Drupal, Commerce, and all dependencies is a single Composer command, as is keeping it all up to date.
    • People unwilling to run Composer on their servers can run it locally and commit the result.
    • I’m also hoping we’ll be able to offer distribution-like tarballs on either drupal.org or drupalcommerce.org as we get closer to a release candidate.

Questions from Twitter

  • howdytom ‏@howdytom
    Commerce Kickstart provides a great toolset with basic configuration. Is there a plan to do a Commerce Kickstart for Drupal 8? If not, will Commerce provide more out of the box solutions for a full featured shop?
    • Commerce Kickstart had several parts.
    • The first one was about providing better admin and checkout UX, as well as discounts. That’s now handled by Commerce out of the box.
    • The second was about providing a demo store with a developed set of frontend pages. That’s going to stay in contrib and will greatly benefit from the flexibility introduced by Drupal 8 and CMI.
    • It’s too early to plan a distribution yet. Drupal 8 has almost no contrib, and drupal.org doesn’t support using Composer to build distributions yet.
    • However, we are using Composer to provide single-command site templates, the kind that gives you Drupal core, Commerce and other modules. This will allow us to provide good starting points for different use cases, similar in nature to Commerce Kickstart 1.x.
    • Once 2017 comes around, we’ll investigate next steps.
  • Jimmy Henderickx ‏@StryKaizer
    In commerce d8, will it be possible to alter a product name dynamicly (either by hook or other solution)?
  • Czövek András ‏@czovekandras
    Any plans making iframe payment methods 1st class citizens? Thinking of running checkout form callbacks.
  • Marc van Gend ‏@marcvangend
    How did D8 architecture change the way you code your modules?
Jan 25 2016

Have you started working with Drupal 8 yet? If so, you might have noticed that Drush 7 doesn't play nice with Drupal 8. And if you install Drush 8, that won't work with your Drupal 7 sites. Yikes!

Have no fear!

Here's how to install BOTH Drush 7 and Drush 8 AND have each project automatically use the version that corresponds to that install. It's stinkin' awesome!

(The following is a combination and adaptation of techniques I learned from two blog posts. Both were a bit outdated when I came across them, so I've updated the techniques here. The first is a Lullabot article, and the second is on the Triquanta blog.)

Uninstall existing Drush instances

Okay, the first thing you'll want to do is uninstall every version of Drush that you already have installed. This process varies depending on how you installed it, but for example, if you installed with homebrew, the command would be something like brew remove --force drush.

Install Composer

We're going to install multiple versions of Drush using Composer, so we need to make sure you have that installed first. Detailed instructions on how to install Composer globally are on their website, but here's the gist.

curl -sS https://getcomposer.org/installer | php
mv composer.phar /usr/local/bin/composer

Note: If this fails due to permissions, run the mv line again with sudo. You may also have to create the /usr/local/bin directory first, depending on your existing system.

  • To confirm composer was successfully installed, type composer --version and you should see something like "Composer version 1.0-dev (...) 2016-01-20 11:17:40"

Install Drush 8

Okay, let's install Drush 8!

cd /usr/local/bin
mkdir drush-8
cd drush-8
composer require drush/drush:8.0.x-dev
ln -s /usr/local/bin/drush-8/vendor/bin/drush /usr/local/bin/drush8

  • The "composer require..." line will download the latest dev release, you could replace "8.0.x-dev" with "8.0.2", for example, to download that specific version.
  • The "ln -s..." line creates a "symbolic link" called "drush8" in the /usr/local/bin directory to the location where Drush 8 is installed. This means that we can call it from anywhere on the system by typing "drush8 --version", for example.

Easy!

Install Drush 7

Now, we'll install Drush 7!

cd /usr/local/bin
mkdir drush-7
cd drush-7
composer require drush/drush:7.x-dev
ln -s /usr/local/bin/drush-7/vendor/bin/drush /usr/local/bin/drush7

  • The "composer require..." line will download the latest dev release, you could replace "7.x-dev" with "7.1.0", for example, to download that specific version.
  • The "ln -s..." line creates a "symbolic link" called "drush7" in the /usr/local/bin directory to the location where Drush 7 is installed. This means that we can call it from anywhere on the system by typing "drush7 --version", for example.

Create Shell Script to Automatically Select Version Based on Git Config

Now, if you're already used to typing something like "drush --version" (without the specific version number), remembering to use it can be a little cumbersome, so now, we're going to create a little shell script that will automatically use the correct one for each project based on a git config variable that we set.

cd /usr/local/bin
vi drush

  • Press the "i" key to enter "insert" mode
  • Paste the following

#!/bin/sh
# Use Drush 7 when the project's git config sets drush.version=7; default to Drush 8.
version=$(git config --get drush.version)
if [ "$version" = '7' ]; then
  drush7 "$@"
else
  drush8 "$@"
fi

  • Press "esc", then type ":wq" and press "enter" to save and quit this file
  • Type chmod +x drush (This makes the "drush" script we just created executable.)

Now, when we type a command like "drush --version" it will use Drush 8 by default. In order to use Drush 7, we need to set a configuration variable in the git repo of the project that should use it.

Set Drush 7 as the Required Version for a Project

cd /path/to/project
drush --version
git config drush.version 7
drush --version

  • The first time you run "drush --version" it should return something like "Drush Version : 8.0.0-rc3" showing that you're using Drush 8
  • The "git config..." line declares that you want to use Drush 7 for this project
  • The second time you run "drush --version" it should show something like "Drush Version : 7.1.0". If so, you're all set!

YAY!!!

You might want/need to close and re-open all terminal windows to make sure it takes effect.

If you have any questions about, or issues with, this setup, let me know in the comments!

Jan 20 2016
152 What to Do About Drupal 6 End of Life on Feb 24th 2016 with David Snopek - Modules Unraveled Podcast

Drupal 6 End of Life

  • What does Drupal 6 EOL mean?
  • When is Drupal 6’s End-Of-Life (EOL)?
    • February 24th
  • Why is support for Drupal 6 being dropped by the Drupal project in the first place? (ie. why does our community even do this?)
    • What makes Drupal 6’s End-of-Life (EOL) different than previous ones (ie. Drupal 5)?
  • What, specifically, will happen after February 24th?
    • All D6 modules will be marked as “unsupported” on Drupal.org - which will mean the ‘update’ module will start telling you that ALL your modules are out-of-date
    • Also, the status information that the ‘update’ module uses could go away at any time - so, you’ll no longer be able to rely on that in general (myDropWizard or another vendor MAY create a replacement for the ‘update’ module…)
    • The Drupal security team will no longer be making Security Advisories (or coordinating security releases)
    • In general, most module maintainers will no longer pay attention to Drupal 6 issues and will stop making new releases
  • What should people with Drupal 6 sites do?
    • Archive the site, or
    • Plan upgrade, and
    • If you can’t upgrade by February 24th, buy Drupal 6 Long-Term Support from one of the “official” vendors:
      • https://www.drupal.org/node/2646980
  • What makes the “official” vendors special (vs. any other vendor)?
    • Get confidential information from Drupal security team
    • Agree to follow security team processes and release all security patches publicly
    • Were vetted by the Drupal security team
  • How will the Drupal 6 LTS work?
    • Same process as security team - but work done by vendors rather than security team
    • Will publish patches on the D6LTS project:
      • https://www.drupal.org/project/d6lts
    • Likely, but not 100% decided:
      • Announce new patches on the D6LTS issue queue
      • Make new Pressflow 6 releases with the Drupal core patches
  • So, can the community get this without working with a vendor?
    • Yes!
    • But each vendor only supporting those modules their customers depend on
    • And what about security issues that hackers find first?
  • What does myDropWizard.com do? And how is your offer different than the other vendors?
    • “myDropWizard.com provides 24/7 support and maintenance from Drupal experts for a fixed monthly fee. We keep your site online, up-to-date and secure!”
    • Our Drupal 6 Long-Term Support offer:
      • http://www.mydropwizard.com/drupal-6-lts
      • making security fixes
      • fixing bugs
      • performing one-off maintenance and support tasks on request
      • getting your site back online in the case of an outage, and
      • remediation if your site gets hacked.
    • Basically, keep your site online and secure until you’re ready to upgrade - and we can help with a D7 or D8 upgrade as well
  • Technical questions about how we do what we do?
  • Your offering includes a whole bunch of stuff! Why don’t you have a “security updates only” offering?
Jan 13 2016
151 Using Composer to Build Drupal Sites Fast - Modules Unraveled Podcast

Composer

  • What is composer?
    • Dependency Manager for PHP
  • How does it relate to Drush Make?
    • Surprisingly similar
    • “Getting off the island”
  • What can you do with composer on a Drupal site?
    • download dependencies, drupal modules, themes, profiles, drush, external libraries etc
  • How do you download contrib modules?
    • https://packagist.drupal-composer.org/
  • How do you download contrib themes?
  • Can you specify where projects get downloaded to? e.g. modules/contrib and modules/custom?
    • composer/installers & davidbarratt/custom-installer
  • Where do you store custom modules/themes?
    • Two methods: Committing to your repo or creating separate repos
    • Reference the repo in your composer.json
    • Toran Proxy
    • davidbarratt/custom-installer
  • How can you specify and download a library to go with a module?
    • module should specify it in composer.json
    • if not submit a patch, and add to your composer.json for now
  • Patches
    • cweagans/composer-patches

Use Cases

  • drupal/drupal vs drupal/core
    • http://cgit.drupalcode.org/drupal/tree/composer.json
    • If you want to use the project-repo/web
    • composer create-project drupal/drupal
  • Composer template for Drupal Projects
    • https://github.com/drupal-composer/drupal-project
  • Build sites from scratch with composer install
  • Update drupal/modules/themes/etc. with composer update
Oct 12 2015

In order to produce training and informational material on Drupal 8, I'd like to know what's keeping you from using Drupal 8 right now.

Please answer this ONE QUESTION survey to explain the biggest issue, concern or other reason you're not using it already.

Thanks!

-Brian

Jul 13 2015

Hey everyone,

Super exciting news this week! I've secured sponsors for the next two months to provide EVERY video on ModulesUnraveled.com absolutely FREE for you!

It's a bit of a long story, so if you don't care, just go learn some Drupal. But, I thought you might be interested in the back story, so here goes...

(P.S. If you're not a reader, I recorded this as an 8 minute podcast just for you.)

Why Modules Unraveled is now FREE for everyone

Late last year, there was a moment when I realized that it had been a long time since I had thought about why I started Modules Unraveled in the first place, and whether or not I was still on course with that original vision. I decided that I wasn't.

For reference, Modules Unraveled started with a video demonstrating how to set up Organic Groups in Drupal 7, back in 2011, when Amitai took over development and did a complete re-write. Nothing was the same, and there were support requests all over the internet from people who didn't know how to use the new version.

I wanted to use it in a project I was starting at the time, so I went through the twisted and difficult process of figuring out how the new module was supposed to be used. When I finally did, I recorded a video showing exactly what I had learned, so that others could skip the hours (and let's be honest, days!) that I had spent figuring it out, and just use the module the way it was intended.

Basically, I wanted to help everyone use Drupal to do amazing things.

At the time, I was teaching in the public schools full-time, but had the desire to do more web development and create videos like the one for Organic Groups. So, I started getting up at 3am... yes. 3:00 in the morning... and during that time I worked on videos until 6am when I would get ready to go to my full-time job.

Then, I started to charge for some videos. I figured, if I was saving people time, it was worth the $29 investment to learn how to use something like the Simplenews module, which has its own complexities. People apparently agreed with me, because the orders started coming in. Then I created more series and put those up for sale, and everything was on an upward trajectory.

In the spring of 2012, I looked at what I was making between the hours of 3:00 and 6:00 in the morning, and extrapolated that to see what I could make full time. Based on that, I decided that it would make sense to quit my full-time teaching position, and focus on developing sites, and videos full time. (Too early as it turned out, but I'll explain that later.)

So, I was working for myself, from home, and everything seemed to be going great. The income was incredibly irregular though, so eventually I switched the site from a pay-per-series model to a subscription model. That was nice because it provided more predictable income, but even though there were good months, on average it still hadn't ramped up to what I was making as a full-time teacher.

Because of this, I started to focus more and more on the money that I needed to make, instead of providing amazing quality videos that could help thousands of Drupal developers create their sites more quickly, and with less headache, like the original Organic Groups series did.

So, in December of 2014, when I re-evaluated whether or not I was fulfilling my original mission, the answer was a resounding "No." I saw the thousands and thousands of site visitors in Google Analytics, and the meager handful that were actually signing up as paying subscribers to access the videos. And the interesting thing is that it wasn't the lack of subscribers that hurt the most; it was that such a broad audience was coming to my site because I had something they needed, but then immediately leaving because of the subscription requirement.

These were the exact people I had set out to help. And I was turning them away. I had reached a wide audience of people who didn't know what I was saying, because I was charging them to listen.

So, I wasn't making enough on the subscriptions alone to provide for my family, but I kept churning out new videos in the hope that eventually the snowball would roll in my direction. It wasn't happening.

I eventually came to grips with that fact, and started trying to figure out other ways to produce an income but still help the people that I had originally set out to help. I knew if I just got another job, I'd stop making videos. Because they take time. A LOT of time. You see this evidenced all over the web. YouTube is littered with Drupal tutorials teaching you how to do this or that, but there are rarely more than a handful done by any one individual. And because video production and teaching are not generally their areas of expertise, the quality varies widely (to put it nicely).

The obvious monetization strategy was advertising. I had never had third party ads on my site, and really didn't like the idea, but I had to try something new, because what I was doing wasn't working. (And one definition of insanity after all, is doing the same thing over and over, expecting a different result. I didn't want to be insane.)

So, I immediately started to contact businesses that I had some form of personal connection with. Some were ones that had products I used, some were ones where I knew someone(s) personally within the business, and others were ones that close friends of mine had personally recommended.

I wanted to see if anyone would be interested in advertising on my site. Some said yes, and some said no (one even suggested buying me outright, but after thorough discussion with my wife, and time in prayer, it didn't align with my life goals.)

The ones that said yes had a hard time nailing down a dollar amount (which is understandable, since I didn't have metrics to give them, having always been subscriber only), and for four or five months, I felt strung along, and didn't know if it was ever going to work out.

Then, I went to DrupalCon in LA. I wasn't planning to go, but then I won a conference ticket from the amazing people at Four Kitchens (thank you again!) and then just had to figure out a way to pay for my room and flight. The only way I could convince my wife (who would be left alone with a two-year-old and a five-month-old) to let me go was to promise that I would make something happen while I was there, and that it would be a good investment.

I followed through. To give you an idea, I went to one (1) session the entire week, and spent the rest of my time in the expo hall, and hanging out with people who might be interested in becoming a sponsor (as well as a few old friends, that I don't get to see except at DrupalCons).

This was MUCH more fruitful than my previous, online-only, attempts. The net result was that six sponsors agreed to advertise, and combined, they would replace the existing membership income, and a little more. It wasn't much more, but was enough to let me run a two month trial to see how it would go, since I have absolutely no idea if it will be beneficial to anyone. My hope though, is that it will be beneficial for everyone, and I can sign additional sponsors, and/or charge higher rates.

One of those sponsors let me know that, after reviewing their budget, they would not be able to advertise, but the other five are still on board. You can see them, and (please!) thank them on the "Sponsors" page.

So, here we are! The site is completely free for anyone that wants to learn what I teach, and I get to find out if advertising will work on Modules Unraveled.

I'm happy!

If you want to learn Organic Groups, Search API, Simplenews, Git Basics, or anything else, check out the videos on this site. They're all FREE!

Want to become a sponsor?

I'd be a fool to leave it at that, and not mention that if you're interested in becoming a sponsor of the site, or the podcast, contact me, and I'll get you the information you need.

You MUST be able to legitimately serve my audience, and prove your worth in order to be a sponsor. But, if you meet that criteria, I'd be delighted to talk to you!

Thanks for taking the time to read this. It means a lot to me! And if you have any further questions, please don't hesitate to ask!

-Brian

May 08 2015
May 08

The Installing Git series is a free series that walks you through the process of installing Git for the first time and/or updating Git to the latest release on both Windows and Macs.

Installing Git on Windows is pretty straightforward: you just download the "Git for Windows" installer and step through the process.

On a Mac though:
* You might not have any version of Git installed
* You might have the version that comes with Xcode or the Command Line Developer Tools
* You might have the version from Apple as well as an official install
* Or, you might just need to update an existing official install

This series covers all of those scenarios to get you updated to and running on the latest release.
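If you're not sure which of those scenarios applies to you, a few quick commands will tell you what's currently on your machine (the paths and version numbers in the comments are just examples):

```shell
# Which git binary comes first on your PATH?
which git             # e.g. /usr/bin/git (Apple's) or /usr/local/bin/git (official installer)
git --version         # e.g. "git version 2.3.5"
# On a Mac, this prints the Xcode / Command Line Tools path, if installed:
xcode-select -p 2>/dev/null || true
```

If `which git` points at `/usr/bin/git`, you're most likely using the Apple-supplied version rather than an official install.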

It also covers the basic Git configuration options you need to set in order to use Git effectively.
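As a taste of what that configuration step looks like, here's a minimal first-time setup (the name, email, and color option are just example values; the series covers the full list):

```shell
# The basic identity every new Git install needs (values are examples):
git config --global user.name "Your Name"
git config --global user.email "you@example.com"
# Optional but handy: colored output in the terminal.
git config --global color.ui auto
# Confirm everything is set:
git config --list
```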

So, what are you waiting for? Watch them now! They're free!

Mar 03 2015
Mar 03
Command Line for Beginners - Introduction

When you're just getting started with a new operating system, you have to learn how to get around and perform basic actions like navigating the file structure, reading and writing files, and creating and deleting files and directories. The command line (once you're familiar with its commands) can be used to navigate a computer's file structure much faster than the GUI (Graphical User Interface). It is also very helpful when working with a remote machine, like a web server.

One stumbling block for many first-time command line users is that when you first start up your command line, you're just given a blank screen with a prompt where you can type in your command. Since there isn't anything to indicate what you should do next, you have to already know the commands before you get started.

This series lays out some of the most basic command line commands - the ones you'll likely use every time you start up the command line. We'll take a look at the pwd, ls and cd commands to see where you are in your file structure, what files and folders are in your current directory, and how to move to other directories. We'll create and edit text files with the VI application and learn the basics of utilizing VI. We'll learn how to move, rename, copy and delete files and folders with the mv, cp and rm commands. There are plenty of other commands you might want to know, but this is an introductory class, and a quick Google search will give you all of the information you need.
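To give you a taste of those commands, here's a short practice session in a scratch directory (the paths and file names are just examples, so nothing important gets touched):

```shell
# A safe sandbox to practice the commands covered in the series:
mkdir -p /tmp/cli-practice && cd /tmp/cli-practice
pwd                           # where am I?
printf 'hello\n' > notes.txt  # create a small file
ls                            # list the current directory
cp notes.txt backup.txt       # copy a file
mv backup.txt renamed.txt     # move / rename a file
rm renamed.txt                # delete a file
ls                            # notes.txt is all that's left
```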

TL;DR: Watch the full series here for FREE!

If you're serious about building your Drupal site right, you'll find yourself looking into command line tools like Git and Drush.

If you don't work in the command line on a regular basis, that might be intimidating. Well, not any more!

This series is designed to be a primer on the command line basics, to get you comfortable enough with the command line that you can utilize command line tools without hesitation.

Once you've watched the series, you can move on to the Drush series to improve your Drupal-fu, and will be ready for the upcoming Git series (which will be awesome, by the way!)

When you're ready, you can watch the entire series, for free.

When you do, let me know if you have any questions or comments!

Tags: Command Line, Basics, planet-drupal
Nov 01 2014
Nov 01

Let me start out by stating that I don't know the technical implications of an auto update feature. Okay? I don't have the answer. I'm just looking for information. Best case, I can help get something started that will benefit the entire Drupal community in the future.

With that out of the way, I firmly believe that anything is possible with Drupal. And with the "Drupageddon" of late, an auto update feature would be greatly appreciated by many, I'm sure. (I certainly would have benefited from one.)

I was recently discussing the security update with some friends, and one of them asked "Does Drupal have an auto update?"

And I was like "...no."

Immediately, I thought about all of the updates I don't immediately apply to contrib projects because they change a configuration option, or otherwise modify the way I've set up the site.

So I thought "I don't really want an auto update because it might break things."

That said, why can't Drupal apply security fixes - at least in core - automatically? If it did, Drupageddon could never have been a widespread issue.

It's easy to think, "Well, it's fixed now, so there's nothing to worry about."

But I think that's shortsighted.

The particular vulnerability that caused "Drupageddon" has been around since the inception of Drupal 7, which was officially released in 2011. So, for at least 3 years, every time we've fixed a security flaw, we've thought, "It's fixed now, so there's nothing to worry about." ... until the next issue was found, and this last one was a pretty gigantic one!

WordPress introduced auto updates in version 3.7 on October 24, 2013. They also included options in their configuration file that can be set to disable auto updates, as well as to choose which types of updates should be performed automatically: none, major and minor, or minor only.

You can read more about it on the Configuring Automatic Background Updates page:
http://codex.wordpress.org/Configuring_Automatic_Background_Updates
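For reference, those WordPress options are set as constants in wp-config.php. This is a sketch based on the Codex page linked above; in a real config you'd pick just one of the WP_AUTO_UPDATE_CORE values:

```php
// In wp-config.php -- pick ONE of the WP_AUTO_UPDATE_CORE values:
define( 'WP_AUTO_UPDATE_CORE', true );    // auto-apply major and minor core updates
define( 'WP_AUTO_UPDATE_CORE', 'minor' ); // minor (security) releases only -- the default
define( 'WP_AUTO_UPDATE_CORE', false );   // no automatic core updates at all

// Or switch off the automatic updater entirely:
define( 'AUTOMATIC_UPDATER_DISABLED', true );
```

Something along these lines - an opt-out config flag with a sensible default - is what I'd love to see in Drupal.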

I'm just curious if this is something that can be added in a point release of D8 (like 8.5 or something).

Also, I've read a few posts saying that auto updates would not fit their workflow. They use Drush, Git, etc. to manage their development workflow. And if that's you, I'd say that turning the auto update setting off would mean you can continue to work the way you currently do.

However, small business owners, churches, non-profits and the like that have volunteers (with little to no development background) managing their sites don't have the luxury of utilizing Git, Drush, etc. In those scenarios, I think the case could be made that an auto update feature (as long as the updates are tested before release) could be a much more stable way of maintaining a site than having a volunteer FTP files to a server without really knowing what they are doing.

If you have thoughts, please add them below. I'd love to hear them!

Updates

  1. After doing some more research, I've found that some people tried to do this in D7, but postponed to D8. However, there hasn't been any movement since April 28, 2013. https://www.drupal.org/node/606592
  2. There's also a post explaining why auto updates would be a very bad idea from September 1, 2011. http://www.freelock.com/blog/john-locke/2011-09/why-auto-updates-are-very-bad-idea I'm not sure that I agree with everything he says though.
  3. It looks like the current conversation about this is happening here: https://www.drupal.org/node/2367319 (And I agree with comment #4, I'd rather have a broken site than an exposed site... A multi-million dollar ecommerce site might disagree, but that's not me.)
Oct 22 2014
Oct 22

tl;dr

Rollback a server backup (files and database) from before October 15th 2014.

No server backup?

  1. Run "git status" to find new and modified files.
    • Delete new files
    • Checkout modified files
  2. Thoroughly check files directory for anything unusual.
  3. Make sure the .htaccess file in the files directory restricts code execution
  4. Restore database from pre Oct. 15th backup
  5. Update Drupal Core to latest release

... Read on for details...

I think I might have been hacked. What do I do?

Hi, this is Brian Lewis with Modules Unraveled.

As you probably already know, there was a huge security fix released for Drupal 7 on October 15th (SA-CORE-2014-005). The patch to update Drupal is actually quite small, but the implications of not updating your site are massive. As a matter of fact, if you haven't already updated your site, chances are you have already been hacked. There were automated programs systematically attacking Drupal sites hours after the fix was released. In this video I'm going to show you how to find out whether or not your site has been hacked. And if so, I'll walk you through what you need to do now, to reduce the damage done.

There are two ways to find out whether your site has been hacked. With "git status" and by searching the database.

  • Run "git status" inside Drupal root
    • This will show us any files that have been modified since our last commit. On the live server, there shouldn't be any, so anything listed here, I know is a result of being hacked.
    • This is a huge reason you should be using version control on your site. If you're not, you can try to re-download every module, theme and library you have and download a fresh copy of the version of Drupal core that you had before the attack and replace all of those on your server. I'm hesitant to recommend this as a full fix though, because there may be hidden files, or files in places you don't think to look. Really, my recommendation is a full re-install. If you're in this situation, I'm sure you don't want to hear that, but I hope this gives you a reason to look into Git.
  • Search for "file_put_contents" in database
    • If there is a result, you've been hacked.
    • Click "Browse".
    • Click the "BLOB" link under "access_arguments". This should download a file to your local machine.
    • Open that file with a text editor.
    • Notice that only one file is listed. There may be others that need to be deleted.
  • If there are no extra files in your Git repo and no results in the database search, you're not hacked. Update Drupal Core now! Or at least apply the hotfix mentioned here as a temporary measure.
  • Delete/checkout all files listed by "git status". (Also check your files directory. The files directory should not be in Git, which means there's no easy way to view new and modified files there, but malicious files could still have been placed in it. By default, the .htaccess file in that directory prevents PHP code from being executed, but Michael said he has seen an attack that modified that .htaccess file. So, you need to check your site.)
  • Restore Database (Otherwise thoroughly check Users, Node, etc.)
  • Install latest Drupal Core update
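The database search above boils down to a query along these lines. It assumes a stock Drupal 7 schema, and targets the menu_router table because that's where the known SA-CORE-2014-005 attacks injected their payload into the access_arguments column:

```sql
-- Any rows returned by this query are a strong sign you've been hacked.
SELECT path, access_callback, access_arguments
FROM menu_router
WHERE access_arguments LIKE '%file_put_contents%';
```

If you can't run raw SQL, the same search can be done from phpMyAdmin's search tab, as shown in the video.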

Recap:

  1. Run "git status" to find new and modified files.
    • Delete new files
    • Checkout modified files
  2. Thoroughly check files directory for anything unusual.
  3. Make sure the .htaccess file in the files directory restricts code execution
  4. Restore database from pre Oct. 15th backup
  5. Update Drupal Core to latest release

Updates:

  1. Drupal security team member Greg Knaddison (greggles) wrote up a great guide on what to do when you get hacked. He includes things I didn't mention like making a forensic copy of your site to inspect later, and notifying site stakeholders. You can read that here.
Feb 07 2014
Feb 07

I recently worked on porting over a website to Drupal that had several dynamic elements throughout the site depending on the IP address of the user. Different content could be shown depending on if the user was within a local network, a larger local network, or completely outside the network.

When porting the site over, I realized that it wouldn't be possible to enable page caching for any page that had this dynamic content on it. In Drupal, standard page caching is all or nothing. If you have it enabled and a page is "eligible" to be cached, Drupal saves the entire output of the page and uses it for future requests for the same page (I go into much more detail about page caching in a previous blog post). In my case, if I enabled it, users who hit from within one of the local intranets could trigger a page cache set, and any users outside the intranet would then view that same content.

I wanted a solution that let me either differentiate cache entries per visitor "type" (but not role), or at least prevent Drupal from serving cached pages to some of the visitors when a cached page already existed. I found a solution for the latter that I'll describe below. But first...

Why this is a hard problem

I already knew I could prevent Drupal from generating a page cache entry using drupal_page_is_cacheable(FALSE);. In fact, there's a popular yet very simple module called Cache Exclude that uses this function and provides an admin interface to specify which pages you want to prevent from being cached.

But what if you wanted to cache the pages, but force some visitors to view the un-cached version? This is what I needed, but Drupal has no API functions to do this. Many Drupal developers know that hook_boot is run on every page request, even for cache hits. So why can't you implement the hook and tell Drupal you don't want to serve a cached page? The reason lies in the way Drupal bootstraps, and when it determines whether it should return a cached page or not.

There's a whole bootstrap "phase" dedicated to serving a cached page called _drupal_bootstrap_page_cache. If you take a close look, you can see that Drupal doesn't invoke the boot hook until after it already determined it's going to serve a cached page. In other words, there's no going back at this point.
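A paraphrased sketch of that phase makes the ordering problem visible. This is simplified from Drupal 7's includes/bootstrap.inc, not the verbatim core code, so treat it as illustrative:

```php
// Simplified sketch of _drupal_bootstrap_page_cache() in Drupal 7.
if ($cache = drupal_page_get_cache()) {
  // At this point, the decision to serve the cached copy has already
  // been made...
  if (variable_get('page_cache_invoke_hooks', TRUE)) {
    // ...and only now do hook_boot() implementations run, so it's too
    // late for them to call it off.
    bootstrap_invoke_all('boot');
  }
  drupal_serve_page_from_cache($cache);
  exit;
}
```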

Enter the "Dynamic Cache" module

I came across the Dynamic Cache module that seemed to solve this problem. Once enabled, this module lets you disable serving a cached page by setting $GLOBALS['conf']['cache'] = FALSE; within your own module's hook_boot implementation - exactly what I suggested was not possible above!

So how was Dynamic Cache doing this? In summary, Dynamic Cache implements hook_boot, checks if you tried to disable serving the cached page, and if so "hijacks" the bootstrap process to render the whole page and ignore the page cache entry that may exist. It then makes sure to "finish up" the request by completing the bootstrap process itself and calling menu_execute_active_handler(); which is normally done in index.php (but no longer gets executed because of the hijack).

I want to note that what Dynamic Cache is doing is pretty scary in that it's almost hacking core without actually modifying any core functions. This fear is actually what triggered me to explore how the Drupal bootstrap process works under the hood so I could understand if there'd be any potential issues.

It's not an easy concept to understand initially, especially since for Drupal 7 you have to enable a second module called "Dynamic Cache Bootfix" that hijacks the bootstrap process a second time to properly finish up the request! I don't want to go into much more detail, but the module's code is pretty slim and I encourage developers to take a look. It will help you get a greater understanding of the bootstrap process and the obstacles this module tries to overcome.

There's also a core issue that is trying to address this problem of not being able to easily disable a cached page from being served. I also encourage you to read through that to get a better understanding of what the problems are.

How I implemented it

In my case, I found that the majority of traffic to the site was from users outside any of the intranets, so I decided to allow them to both trigger cache entries being generated and to be served those cached page entries. For everyone else (a small % of traffic), Drupal would always ignore whatever was in the cache for that page and would also not generate a cache entry:

function my_module_boot() {
  $location = _my_module_visitor_network();
  if ($location != 'world') {
    // Prevent Drupal from serving a cached page, with help from the Dynamic Cache module.
    $GLOBALS['conf']['cache'] = FALSE;
    // Prevent Drupal from generating a cached page (standard Drupal function).
    drupal_page_is_cacheable(FALSE);
  }
}

Note that Dynamic Cache relies on having a heavy module weight so it runs last - which allows me to disable the cache in my own hook_boot. Make sure you read the README that comes with the module so you set everything up properly.

Also note that I still called drupal_page_is_cacheable(FALSE);. Without this, Drupal may still generate a cached page based on what this user saw. With my code in place, anonymous users outside the networks I was checking would both generate page cache entries and be served page cache entries. Anonymous users within the networks/intranets would never trigger a cache generation and would never be served a cached page.

Final Thoughts

Ideally, I would be able to generate separate page caches for each "type" of visitor I had. I think this is possible by creating your own cache store (which is not that difficult in Drupal 7) and changing the cache ID for the page to include the visitor type. I think the Boost module may also allow for this sort of thing.

For really high traffic sites, you're probably going to be using something like Varnish anyway - and completely disable Drupal's page caching mechanism. I don't know much about Varnish but I imagine you could put this similar type of logic in the Varnish layer and selectively let some users through and hit Drupal directly to get the dynamically generated page (especially since my check for visitor network is just based on IP address).

There you have it. Dynamic Cache is by no means an elegant module, but it gets the job done! If you're better informed than I and I made a mistake somewhere in this writeup, please let me know in the comments. I certainly don't want to spread misinformation!

Feb 05 2014
Feb 05

I just finished up a small project at work to create a basic resource management calendar to visualize and manage room and other asset reservations. The idea was to have a calendar that displayed reservations for various resources and allowed privileged users the ability to add reservations themselves. The existing system that was being used was a pain to work with and very time consuming - and I knew this could be done easily in Drupal 7. The solution could be extended to create a more general resource booking / room booking system.

I wanted to share the general setup I used to get this done. I won't go into fine detail, and this is not meant to be a complete step by step guide. I'm happy to answer any questions in the comments.

Step 1: The "Reservation" content type

I quickly created a new content type "Resource Reservation" and added a required date field. Due to a bug in a module I used below, I had to use a normal date field and not ISO or Unix (I usually prefer Unix timestamps). These three different types of date fields are explained here. Aside from that, I also made the date field have a required end date and support repeating dates using the Date Repeat Field module (part of the main Date module). I then needed to decide how I would manage the resources and link them to a reservation.

I created another content type "Resource" and linked it to a reservation using the Entity Reference module. Another option I considered was using a Taxonomy vocabulary with terms for each resource, and adding a term reference field to the reservation content type. I decided to go for a full-blown entity reference to allow greater flexibility in the future for the actual resource node.

In my case, I created the 6 "Resource" nodes (all rooms in a building) that would be used in my department.

Step 2: The Calendar

Years ago at the 2011 DrupalCamp NJ, I attended Tim Plunkett's session "Calendaring in Drupal." Tim provided a great introduction to a new Drupal module called Full Calendar that utilized an existing JavaScript plugin with the same name. I was very impressed with the capability of the module and wrote about it after the camp was over.

I immediately knew I wanted to use the module and was happy to see it has been well maintained since I last checked it out in 2012. The setup was incredibly simple:

  • Create a new "Page" view displaying node content
  • Set the style to "Full Calendar"
  • Add a filter to only show published "Resource Reservation" nodes
  • Add the date field that is attached to "Resource Reservation" nodes

The style plugin for Full Calendar has a good set of options that let you customize the look and functionality of the calendar. I was quickly able to shorten the time format quite a bit to display the start and end times as "7:30a - 2:00p".

One thing to note is that while you can add any fields you want to the view, the style plugin only utilizes two: A date field and a title field. Both are displayed on the calendar cell - and nothing else. If you add a date field, the style plugin automatically uses it as "the" date field to use, but if you have multiple date fields for whatever reason you can manually specify it in the settings. Similarly, for the title field, you can add any field and tell the plugin which one to use as "the" title for the event. In my case the node title was suitable. If you wanted to display more than one field, try adding them and then add a global field that combines them, then assign that as the title field.

I loaded up some reservation nodes and viewed them in the calendar and everything was looking great so far. Next I wanted to provide some filtering capability based on the resource of the reservation "events".

Step 3: Filtering the Calendar by Resource

In my case there was a desire to be able to display the reservations for select resources at a time instead of all of them at once. This would be a heavily used calendar with lots of events each day, and it would become a mess without some filtering capability. This was easy enough by creating an exposed filter for the calendar view.

Ideally I would have a filter that exposed all of the possible resources as checkboxes - allowing the user to control what reservations for what resource they are viewing. I'm sure I could have done that by writing my own views filter plugin or doing some form altering, but I settled for this approach:

  • Added a new input filter for my "Resource" entity reference field.
  • Exposed it
  • Made it optional
  • Changed it to "grouped filter" instead of "single filter". This let me specify each Resource individually since there's no out-of-the-box way of listing all available.
  • Used the "radio" widget
  • Allowed multiple selections - this actually changed the radio buttons to checkboxes instead - exactly what I want.
  • Added 6 options for the filter - one for each resource. I looked up the node IDs for each resource and put them in with their appropriate label. The downside is that each time a new resource is added, I have to manually update the filter.
  • Changed the "filter identifier" to the letter "r", so that the query string params when filters are used aren't so awful looking

There are two major gotchas here. The first is that if you have more than 4 options to choose from, Views changes the checkboxes to a multi select field (bleh). This is an easy fix:

function YOUR_MODULE_form_views_exposed_form_alter(&$form, &$form_state) {
  if ($form['#id'] == 'views-exposed-form-calendar-page') { // Find your own view's ID.
    $options =& $form['r']; // My exposed field is called "r" (see last step above).
    if ($options['#type'] == 'select') {
      $options['#type'] = 'checkboxes';
      unset($options['#size']);
      unset($options['#multiple']);
    }
  }
}

This ensures that the exposed filter is ALWAYS going to be checkboxes.

The second gotcha is how Views handles the multiple selections. By default, Views will "AND" all of the selections together. So if you select "Room 5" and "Room 6", you get reservations that have both selected - which is not possible in my case, since I purposely limited the entity reference field on the reservation to only reference one resource. Instead I want Views to "OR" them, so it shows any reservations for either "Room 5" or "Room 6". The fix for this is simple, but not obvious:
  • In the filter criteria section in the Views UI, I went to "And/Or, Rearrange", which is a link in the dropdown next to the "Add" button.
  • I created a new filter group and dragged my exposed filter into it.
  • The top group has the published filter and the content type filter, and the operator is set to AND.
  • The bottom group has my single exposed filter for the resource, and the operator is set to OR.
  • The two groups are joined together with an AND operator.

Setting the second group to use OR is the key here. Even though there is just one item in the filter group, it's a special filter because it allows multiple selections. Views recognizes this and will apply the OR operator to each selection that was made within that filter. By default I had everything checked (which is actually the same as having nothing checked, at least in terms of the end result). This makes it obvious to calendar viewers that they can uncheck resources.

Step 4: Adding Colors for each Resource

Since the default calendar view includes 6 resources, I wanted each reservation to be displayed with a color that corresponded to the resource it was reserving. The Full Calendar module can sort of do this for you with the help of the Colors module. This module lets you arbitrarily assign colors to taxonomy terms, content types, and users. Colors then exposes an API for other modules to utilize those color assignments however they want. Full Calendar ships with a sub-module called "Full Calendar Colors" that does just this by letting you color the background of the event cells in the calendar based on any of those three types of assignments that may apply.

In my case, since I wasn't using Taxonomy terms, I couldn't use the Colors module to color my reservations. Someone opened an issue to get Colors working with entity references like in my case, but it's not an easy addition and I couldn't come up with a practical way of adding it to the Colors module myself.

Instead, I examined the API for Full Calendar and found I could add my own basic implementation in a custom module. Here's the basics of what I did:

  • Add my own color assignment form element to each "Resource" node using form alters and variable set/get.
  • Implement hook_fullcalendar_classes to add a custom class unique to each "Resource" for the calendar cell. Like ".resource-reservation-[nid]".
  • Implement hook_preprocess_fullcalendar to attach my own custom CSS file (created using ctools API functions) to the calendar that has the CSS selectors for each resource reservation with the proper color.

Finally I added a "legend" block that lists each Resource (with a link to that Resource node) displaying the color as the background, so users can quickly see what the colors in the calendar meant. You could also avoid some of this complexity by removing the ability to assign colors via the node form and just hardcode the color assignments in your theme CSS file. You'd still need to implement hook_fullcalendar_classes.
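For the curious, the hook_fullcalendar_classes implementation looks roughly like this. The field name field_resource and the way the color classes map to resources are assumptions for illustration; check fullcalendar.api.php in the module for the exact hook documentation:

```php
/**
 * Implements hook_fullcalendar_classes().
 *
 * Sketch: add a per-resource CSS class to each reservation event so a
 * custom stylesheet can color it. "field_resource" is a made-up field
 * name for this example.
 */
function my_module_fullcalendar_classes($entity) {
  $classes = array();
  $items = field_get_items('node', $entity, 'field_resource');
  if (!empty($items[0]['target_id'])) {
    // Matches selectors like ".resource-reservation-42" in the custom CSS.
    $classes[] = 'resource-reservation-' . $items[0]['target_id'];
  }
  return $classes;
}
```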

Step 5: Reservation Conflicts

With the basic calendar view completed and displaying the reservations, I shifted focus to the management aspect of the feature. Specifically, I needed to prevent reservations for the same resource from overlapping with one another.

A little bit of digging led me to a great module called Resource Conflict. This module "simply detects conflicts/overlaps between two date-enabled nodes, and lets you respond with Rules". It requires Rules, which is used to set up reaction rules when a conflict is detected, allowing you to set a form validation error. Note the module integrates with Rules Forms as well, but I've found it's not actually required. Resource Conflict is a very slim but capable module - I was very impressed and happy with its capabilities.

The module provides a Rules event "A resource conflict node form is validated". To get this event to trigger, I had to enable "conflict detection" for the Resource Reservation content type (part of the Resource Conflict module). To do this, I edited the Resource Reservation type, went to the new "Resource Conflict" vertical tab, and enabled it by selecting the date field to perform conflict checking on.

The Resource Conflict module provides a default rule that prevents form submissions if there are any other nodes of the same type with an overlapping date. This is too general, because I wanted the rule to only throw a validation error if the conflicting reservation is for the same resource I'm trying to reserve. I disabled that default rule and worked to create a rule that also takes the resource into consideration. This part was somewhat complicated, and I was happy to find some guidance in the issue queue. EDIT: I've since taken maintainership of the module and updated the real documentation page with details on how to perform the following steps.

First, I needed to create a Rule Component that encapsulates the logic to compare two Reservation nodes, check if they have the same Resource entity reference, and if so set a form error. Here's how I did that:

2 variables:

  • "Node" data type, "Unsaved Reservation" label, "unsaved_reservation" machine name, usage as a "parameter"
  • "Node" data type, "Conflicting Reservation" label, "conflicting_reservation" machine name, usage as a "parameter"

3 conditions:

  • "Entity has field" on the "unsaved-reservation" data selector, checking it has the resource entity reference field
  • "Entity has field" on the "conflicting-reservation" data selector, checking it has the resource entity reference field
  • "Data comparison" to make sure that the values of the two entity reference fields are equal

1 action:

  • Set a form validation error. I wrote in a message including a link to the conflicting resource using the available tokens.

Rule Component

Now, with this rule component in place, I could incorporate it into a normal rule that reacts on node submission, loading all the conflicting reservations (based on date alone) and looping through each one to execute the component's more detailed comparison. Here's how I did that:

  • React on event "A resource conflict node form is validated"
  • Added condition for "Contains a resource conflict" - this relies on the "node" param that is made available from the event
  • Added action for "Load a list of conflicting nodes". This is provided by the Resource Conflict module and is where all the conflict detection is done, comparing other nodes of the same type for conflicting dates. This action is added as a Loop.
  • Added the Rule Component we just created within the action loop.

Since I set up the component with two variables, I needed to pass them in as arguments to the component after adding it into the loop. For the "unsaved-reservation" variable, I filled in "node". For the "conflicting-reservation" variable, I supplied the "list-item" variable provided by the loop.

Main Rule

Testing the rule proved that I was not able to overlap any dates for the same resource when creating a reservation. Perfect!

Final Thoughts

The basic functionality of the resource management was there. Users could add new reservations for existing resources and were alerted if the reservation conflicted with others. Reservations were displayed in a calendar for the department to see, and users could filter out specific resources to provide a cleaner view. Here are some additional notes and considerations:

  • To allow the calendar to scale, you'll want to enable AJAX on the calendar view which will only display events for a given month (+/- two weeks). There's a bug in the stable release of the module related to AJAX but I provided a patch.
  • If you're using repeating date fields, make sure you uncheck "Display all values in the same row" on the date field settings in the view. If you don't, any exposed filters for the date range (which is how the AJAX feature works for Full Calendar) will not apply to dates with multiple values. If you do this properly, only the repeating dates for the given date range will be loaded.
  • There's a bug in the Resource Conflict module that only allows you to use the standard "Date" database storage type for a date field. I'm working on a patch.
  • Remember that you could also implement a "resource" using taxonomy terms instead of entity references. If you do, you'll have a much better time getting the Colors stuff working.

And that's pretty much it! Let me know if you have any questions in the comments below.

Dec 16 2013
Dec 16

I've been away from full time Drupal development for a couple of years and have recently returned, this time making a commitment to improve my understanding of core. There's a lot of information out there on Drupal caching, but I found much of it to be fragmented and outdated (Drupal 6). I wanted to provide a more comprehensive look at Drupal 7's core caching, explaining how some of this stuff is actually working under the hood.

Measuring Performance

Before we get started, it's worth discussing how you can measure the performance of your site so you can see for yourself the impact caching will have. The easiest way to do this is to use the devel module, which most Drupal developers should already be familiar with. Among other useful features, this module allows you to print out the time it took to render the page and the total memory usage of PHP to serve a page request.

Devel will reveal that a Drupal 7 site with a couple dozen contrib modules and no caching enabled will use about 30-50 MB of memory to serve each page request. It will also show that page execution time (the time it took to render the HTML) is around 400-500ms.

Generally, those numbers are not performant and you won't be serving a lot of simultaneous page requests before bringing your server down. You should always be concerned with optimizing your site to increase page response time and reduce memory usage. Even if you're not developing for high traffic sites, you want every visitor to have the best experience possible.

Another useful and easy to use tool is your browser's developer tools. Years ago you had to use Firebug with Firefox, but most of the Firebug features are now built into the dev tools native to all the popular browsers, including Internet Explorer (which isn't so bad these days!).

I use Chrome, and the rapid release cycle for the browser means the packaged dev tools suite is very robust and constantly improving. For looking at performance of your site, dev tools is useful in showing you the number of HTTP requests made (the fewer the better), the time it took the server to respond to these requests, and the HTTP headers sent and received for each request. I encourage you to explore the dev tools and discover their usefulness.

How Drupal sets cached pages

Page caching is when Drupal takes the entire rendered output of a page and stores it in the database (or another cache store; defaults to the database). Pages will only be cached for anonymous traffic and for users that don't have session data, like items in a shopping cart. This is because if dynamic data like a shopping cart or a "Welcome Brian" message was cached, it would screw things up when that cached page was delivered for other anonymous traffic.

We need to look into how Drupal loads up every time a page is requested. It's not as complicated as you may think, and it's fairly straightforward to follow. This process is called "bootstrapping" and is split into many different phases. Each phase loads a different part of Drupal, progressively loading more core API functions, theme code, and module code.

If we look at index.php, you can see a call to drupal_bootstrap(DRUPAL_BOOTSTRAP_FULL);. If you take a look at the code for drupal_bootstrap, you can see each of the 8 phases and get an idea of what each is doing. Passing in DRUPAL_BOOTSTRAP_FULL indicates that every single phase should be executed, in succession, to load the entire environment. Also of note are the comments for this function, which indicate how you could call drupal_bootstrap yourself to load the Drupal environment for a custom script (very useful!).
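As the comments suggest, a standalone script dropped into the Drupal root can bootstrap the full environment itself; this is a minimal sketch assuming the default directory layout:

```php
<?php
// Load the full Drupal 7 environment from a custom script.
define('DRUPAL_ROOT', getcwd());
require_once DRUPAL_ROOT . '/includes/bootstrap.inc';
drupal_bootstrap(DRUPAL_BOOTSTRAP_FULL);

// All core and contrib APIs are now available.
$node = node_load(1);
```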

So how does page caching tie into this? Well, more time, memory, and CPU are consumed for each bootstrap phase that is loaded. Under normal circumstances, each phase of the process is needed so that the page can be properly rendered. However, when page caching is turned on and the page is eligible to be cached, Drupal will store the rendered page output via the drupal_page_set_cache function. This function is called right before the rendered output is flushed and delivered to the browser.

I mentioned above that the page must be eligible to be cached. Even with page caching enabled, Drupal may prevent some pages from being cached. An example is a page that displays a dynamic message, like when a user doesn't fill out a form properly and validation errors are displayed. You wouldn't want that message to be part of the cached page result.

Also of note, there's a useful function drupal_page_is_cacheable that, when passed FALSE, instructs Drupal NOT to cache the page it was called on.
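For example, a module could opt the current request out of the page cache from hook_init() or a page callback:

```php
// Prevent Drupal from caching the page being built for this request.
drupal_page_is_cacheable(FALSE);
```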

How Drupal serves cached pages

Now let's say the user reloads that page that was just generated and cached. Drupal again kicks off the full bootstrap process, but this time things are different because of the second phase of the bootstrap: DRUPAL_BOOTSTRAP_PAGE_CACHE. This phase is used to determine if a cached page can be delivered to the user, and if so, output it directly. The code is simple to follow. Checks are made to see that:

  • The user has no Drupal session cookie (therefore, user is "anonymous" with no dynamic data)
  • Page caching is actually enabled
  • A page cache entry exists for the requested page

If all three conditions are met, then Drupal loads the cached data out of the cache store using drupal_page_get_cache, outputs it, and exits out of the bootstrap process early.

It's worth noting that under normal circumstances, two additional phases are loaded before Drupal can serve the page cache (thanks Mark Pavlitski for the heads up): DRUPAL_BOOTSTRAP_DATABASE and DRUPAL_BOOTSTRAP_VARIABLES. The database phase is needed so Drupal knows how to access the cache, which by default is stored in the database. The variables phase will load all the settings in the variables table and load the "bootstrap" modules (see next section for more info on that).

Alternative cache implementations (like memcache) don't typically need the database for anything when serving a cached page. You can explicitly tell Drupal to skip loading the database and system variables by setting page_cache_without_database to TRUE in the settings.php file, making responses even faster. Note that since the "bootstrap" modules are not loaded when this setting is enabled, you can't use the hooks discussed below. Everything has a trade-off when it comes to performance.
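In settings.php, that override looks like this; the two settings are typically enabled together (the memcache module's documented configuration pairs them, for example), since the "bootstrap" modules live in the database:

```php
// settings.php: serve cached pages without bootstrapping the database.
$conf['page_cache_without_database'] = TRUE;
// Bootstrap modules can't be loaded without the database, so hook
// invocation on cached pages must be disabled as well.
$conf['page_cache_invoke_hooks'] = FALSE;
```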

Two hooks you can count on

Since the cache delivery happens almost immediately and early in the bootstrap process, most of Drupal's core API and modules are not loaded at all. That means you cannot run any hooks that affect page output. However, there are two hooks that Drupal will execute even on cached page delivery: hook_boot and hook_exit.

How are any hooks executed if Drupal doesn't load the hooks system and the modules that implement them (that happens at a later bootstrap phase)? Well, when a module implements hook_boot or hook_exit, Drupal makes note of it in the "system" database table when the module is enabled. These modules are loaded on demand when the hooks are invoked in DRUPAL_BOOTSTRAP_PAGE_CACHE. However, the more modules that implement these hooks, the slower it is for Drupal to actually serve a cached page entry (more code = more time).

Modules can use hook_boot to execute any code that must run on every page, whereas its companion hook_init is invoked only when a page is first rendered (meaning not on cached pages).

Almost all of the time you'll want to use hook_init, typically for things like adding specific CSS or JS files to a page. hook_exit is used to execute any code after a page has already been sent to the browser and right before the php process exits.
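Skeleton implementations of these hooks (mymodule is a placeholder name) look roughly like this:

```php
/**
 * Implements hook_boot().
 *
 * Runs on every request, including cached page deliveries, so keep it
 * lightweight: most of the API is not loaded yet.
 */
function mymodule_boot() {
}

/**
 * Implements hook_init().
 *
 * Runs only on non-cached page builds; a safe place for things like
 * drupal_add_css() or drupal_add_js().
 */
function mymodule_init() {
  drupal_add_css(drupal_get_path('module', 'mymodule') . '/mymodule.css');
}

/**
 * Implements hook_exit().
 *
 * Runs after the response has been sent, right before PHP exits.
 */
function mymodule_exit($destination = NULL) {
}
```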

The popular devel module uses hook_boot so it can ensure its profiling code is run even for cached pages. Note I previously wrote that the redirect module implemented hook_boot, but that is incorrect. Must have been a late night when I wrote that!

Both hook_boot and hook_exit can actually be disabled on cached pages as well to provide even further performance gains for cached pages. This can be done by setting page_cache_invoke_hooks to false in your settings.php file. A lot of modules rely on those hooks though, so you'd really need to understand the repercussions of turning those hooks off. In Drupal 6 you could control this on the performance settings page, but now it's just an override in your settings.php file.

Page compression

Once you enable page caching, Drupal will reveal an additional option on the performance page called "Compress cached pages." When enabled, Drupal first compresses the rendered content using PHP's gzencode function (see drupal_page_set_cache) before saving it. This dramatically reduces the size of the data stored in the cache, and it offers an additional benefit: web browsers can accept this compressed content directly and decompress it themselves.

Browsers that support this (just about all of them) add a header indicating as such, and Drupal will deliver the gzipped content directly to the browser. The heavy lifting of decompressing the data is left up to the resources on the user's machine - which is a good thing. It reduces the load on your server (except for that first "hit" that must be compressed) and decreases the transfer time and bandwidth.

If page caching is disabled, Drupal won't compress pages before delivering them, but your web server can do that if you wish. Apache and Nginx both support this. There's some debate about whether you should use server-level compression in conjunction with Drupal's or not. For the most part you should be okay just having Drupal handle it for you. If you are working on a site where performance is a huge concern, this is something you'll need to look into more yourself.

Performance gains

The benefits of page caching are immediately clear. Above I mentioned that a Drupal site could use around 30-50 MB of RAM just to serve one page request. While that RAM is used for only a half second or so, it severely limits the amount of traffic you can serve. If a cached page is delivered instead, you're looking at around 2-4 MB of RAM paired with a dramatic improvement in page response time.

You won't be able to use the Devel module to print out the memory usage and execution time for cached page results. That's because Devel has no opportunity to alter the output of a cached page (nor does any other module, as I discussed above). I wrote a blog post a while back explaining how you can determine PHP memory usage for cached page results. Check it out if you're interested.

Your browser's dev tools will also show the dramatic improvement in response time. To test it out, clear your page cache (in performance settings, or using drush) and then load a page with dev tools open. Note the time it took to get the page from the server. Now reload the page and look at the time again (make sure you're logged out). On the second request, Drupal is returning the cached page that was stored from the first request.

Of all the caching methods available in Drupal core, page caching is by far the most effective and performant. Of course it's of no use unless you're serving to "anonymous" logged out users, but the majority of Drupal sites are probably aimed toward static content delivery.

How and when the page cache is cleared

There's a Drupal function called cache_clear_all that is used all over the place to wipe out cache entries in various "bins". Here are some of the actions that trigger a call to cache_clear_all, clearing (among other cache bins) the page cache:

  • A node is created/edited/deleted
  • A block is created/edited/deleted
  • A comment is created/edited/deleted
  • A vote is registered in a poll
  • User profile fields are manipulated
  • System theme settings are changed
  • Taxonomy terms/vocabularies are manipulated
  • Permissions for roles are changed
  • Cron is run

That's quite a list! Why do so many actions trigger a cache clear? For the most part, it's because Drupal doesn't know where your content is displayed on the site. It's not quite intelligent enough (but it will be in Drupal 8). It makes the assumption that any one of your cached pages may include a poll, a node, a comment, a taxonomy term, etc. So any time those are changed or added, Drupal clears the entire page cache!

Here's a common example: Let's say you have a View that displays the 5 most recent news articles on your homepage. When you submit a new news article, you'd want that list to be updated. The only way that list is updated is if you clear the page cache entry for the homepage, or else it will display stale content.

All those cache clears can be problematic for a site that sees even a small amount of updates. Whenever a cached page is wiped out Drupal has to regenerate it on the next hit. That unlucky visitor will have to wait a few seconds while the whole thing is rendered instead of the snappy cached version. To combat this, Drupal allows you to enforce a minimum amount of time a cache must be valid.

Minimum cache lifetime

This is a setting on the performance page that has been confusing users for years. The minimum cache lifetime determines the minimum amount of time that must pass between entire cache clears. Many users misunderstand this setting as applying to the lifetime of individual cache entries, but it has nothing to do with individual entries. If you have the minimum set to 10 minutes, you could create a new page and have it be cached for only 1 minute before it is cleared from the cache. It doesn't mean that a page will be cached for 10 minutes at the minimum, or automatically cleared out after 10 minutes. Nothing is broken; this is how the system is designed, for better or worse.

If you don't have the min lifetime set (which is the default), the page cache will clear no matter what on any of those above actions (including cron!). If you do set a minimum, anytime cache_clear_all is called to clear the page cache, it will first set a system variable indicating the timestamp of the request. On a subsequent request to clear the page cache, Drupal compares the current time to that previous time that was recorded. If it exceeds the minimum you set, it will then clear the cache.

No matter what you do, the process is very inefficient. What this usually means is that a lot of your visitors will be hitting non-cached pages and having a bad experience. One solution is to "warm" the cache after it has been cleared. You can do that by using a crawler that hits all pages on your site. You can also use boost, which has a built-in crawler and more advanced cache logic. Sites with serious traffic will probably use a reverse HTTP proxy like Varnish instead of Drupal's page caching. There's also the Alternative Database Cache module that aims to correct some of these core shortcomings (thanks to Eric Peterson for authoring it and bringing it to my attention).

Expiration of Cached Pages

This other option on the performance page is more straightforward and hopefully shouldn't confuse people, thanks to the helpful comment alongside it. At first you may think this is a way to control the maximum amount of time an individual page will be cached before Drupal forces a new rendering of it. However, the comment reads "The maximum time an external cache can use an old version of a page."

This will control the HTTP response header Cache-Control, setting the parameter "max-age" to whatever value you indicated. HTTP reverse proxies like Varnish or Nginx (or a CDN like Akamai), which can provide an extra caching layer in front of Drupal, use this important header to expire cached pages in their cache. The setting has nothing to do with Drupal's internal caching mechanisms.

Conclusion

Page caching is a no-brainer for most websites. It dramatically reduces the system resources consumed when serving pages and allows you to serve much more traffic at once. A lot of the page cache settings may seem counterintuitive, and it can take hours to really dig through the code, see what's going on, and figure out why. Hopefully this blog post clears up some of the confusion and gives you a better understanding of what's happening under the hood.

I plan on writing up more on the other forms of caching in Drupal, like Views, Block, and Form caches. Stay tuned, and please comment below.

Dec 06 2013
Dec 06

I recently started on a project that involves migrating some data from a legacy app & database into Drupal. The old application is a collection of PHP scripts that basically just generate forms, accept data, insert said data into the database, and output it on a website. Pretty simple stuff - there's not a whole lot going on. It was developed long before many of the popular CMS's and frameworks came to be, and probably before people really started paying attention to the character encoding of their data.

I initially began the data migration using Drupal's Migrate module, setting up my new content types and fields, and running through some test imports. Things seemed to go well, until I started scanning the imported data and seeing some really strange characters like ’ and é.

For the most part, it was pretty clear what these characters were supposed to represent, based on the context they were placed in. I knew right off the bat that it had something to do with character encodings, something that I've never taken the time to truly understand. I made the mistake of thinking "oh, this shouldn't take too long to fix", later unraveling my very own "character encoding hell". I'll talk more about that towards the bottom of the post, but let's first get an intro to character encodings.

So what are character encodings exactly?

When textual data is stored and transmitted, it has to first be converted to binary just like everything else. So how is text converted? There needs to be a lookup table matching characters with binary representations. That's determined by the character encoding that is chosen - which can be one of many.

Whenever data is decoded, the character encoding must be known beforehand. Web browsers do this every time you view a website. How do they know what encoding the text is in? The web server tells it, or the browser has to make an educated guess (not what you want!).

ASCII is a very basic character encoding that many are familiar with. It covers the English language along with common symbols and control characters, using only 7 bits to provide a maximum of 128 characters in the set. ASCII isn't really used anymore on the web because of the small number of characters it supports. There are many encodings that do use a full 8-bit byte for each character, re-claiming that wasted 8th bit from ASCII and bringing the total to 256 characters. In America, the most common single byte encodings are probably Windows-1252 and ISO-8859-1. However, no single-byte encoding can possibly hold all of the characters necessary to create one "universal" encoding that supports all known languages and symbols.

UTF-8 is a Unicode compliant character encoding that has become the dominant encoding on the web. One of the strongest properties of UTF-8 is that it's a variable width byte encoding - meaning a single character can be represented with one or more bytes (more advanced, less used characters take up more bytes). Most importantly, UTF-8 supports just about every character in every language you can think of. This is very important for the web. It makes multilingual sites easier to manage since you don't have to worry about any localized character sets for each language. Everything uses the same character set.
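The variable-width property is easy to see by encoding a few characters yourself; here's a quick check in Python, used here purely as a convenient tool for byte-level inspection:

```python
# ASCII-range characters occupy one byte in UTF-8, while accented
# characters take two and symbols like the euro sign take three.
ascii_len = len("R".encode("utf-8"))
accent_len = len("é".encode("utf-8"))
euro_len = len("€".encode("utf-8"))

# The same accented character is a single byte in Windows-1252.
legacy_len = len("é".encode("cp1252"))

print(ascii_len, accent_len, euro_len, legacy_len)  # 1 2 3 1
```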

Most developers should only be dealing with UTF-8 at this point (or another Unicode encoding) and should understand how character encodings are involved in every part of your website or application.

Where you need to worry about them

Remember that any time textual data is transmitted, it needs to be encoded in a specific encoding, and decoded on the other end. The other end needs to know what encoding was used. There are at least 4 major areas where a web developer needs to be concerned with character encodings:

Web pages

When a response containing text is delivered from a server to a client, the server needs to tell the client about the encoding.

There are two opportunities to do this - one is the "Content-Type" HTTP response header which is typically set to text/html; charset=utf-8 for standard HTML pages. Your application should set this before delivering the response to the browser. It also includes the MIME type of the response which tells the browser what type of document is being delivered (image, video, document, etc).

The other is a meta tag header <meta charset="utf-8"> or <meta http-equiv="Content-Type" content="text/html; charset=utf-8">. The former is the newer HTML5 version. It's a bit confusing to indicate the character encoding within the data that needs to be decoded, but this is allowed in HTML. Parsers will interpret everything as ASCII until it hits that header (which will work, since the HTML syntax is within ASCII and can be parsed that way) then re-parse the document with the new encoding. The reason it's supported in HTML is to account for any inability to set the HTTP response header which would otherwise provide the same info.

You should be using both methods. Without setting this data, your browser will have to guess, and it may display "garbage" text.

You can actually see what character encoding your browser chose to render the page in, and even force it to render it using a different encoding. It's a handy tool that can help diagnose if a page was meant to be rendered in some other encoding. Chrome, FireFox, and Safari all support this ability (IE probably does as well) in the "View" menu.

Form submissions

When data is input into text boxes or text areas in an HTML form, the browser has to encode it before sending it to the server. What encoding does it use to do this? Again, this is up to you to decide. By default, most browsers will just use the same encoding that the page was rendered with. However, you can specify it in the <form> tag: <form action="/process.php" accept-charset="UTF-8">. So, while explicitly indicating the character set to encode the data with is not strictly necessary, you should do it anyway just in case. If this is not set, you risk the browser encoding the data in some encoding that your back-end is not anticipating.

MySQL connections

Something that is often overlooked is that when you communicate with a database server and you send textual data, you need to indicate the character encodings when sending and receiving data between your back-end and the MySQL server. This makes sense once you understand that any time text is transferred from one place to another, you need to indicate what encoding the text is in. The text does not automatically have a way of indicating what encoding it's in (not universally anyway).

It gets pretty tricky here, at least with MySQL. There are specific settings in MySQL that you can set after a connection has been established that dictate how characters are treated between the client and server. In particular, there are three things that are important to look at:

  • The encoding of data you send to the server from the client
  • What encoding the server should convert to after receiving the data
  • The encoding the server should return to the client when queries are run (a conversion if necessary).

Assuming the data you're sending MySQL actually IS in UTF-8 and the data in MySQL is stored as UTF-8, you'll want all three of those things to be UTF-8. No conversions will actually take place, and MySQL will just pass everything along as UTF-8. To set these values, you can simply execute the statement SET NAMES utf8 after making a connection.

If you DON'T set these values, then MySQL will more than likely default to latin1 (Windows-1252), which is just asking for trouble! A knowledgeable developer may recognize that they need to set the character encoding for their web page and forms, and even their database fields (see below). However, if they have a backend script that accepts UTF-8 data from a form submission but doesn't tell MySQL it's UTF-8, then MySQL will think it's latin1 when it's actually UTF-8.

MySQL text field storage

Text fields in MySQL require you to indicate their character encoding. Defaults are set up at the server, database, and table level (each inheriting from the former). Since you're telling MySQL a field is a text field, it needs to know how to interpret the raw data it's storing as textual data. Otherwise, you wouldn't be able to query for text or have MySQL compare text fields with one another. You could alternatively just use BLOB fields, where the data is stored as binary and not interpreted in any way. Just keep in mind you won't be able to search on those fields.

From what I can tell (as well as another blog author I reference below), MySQL doesn't change how the data is stored based on the encoding of a field. It just interprets the data differently when reading it. I could be wrong here (if I am, please let me know in the comments). However, you can alter the encoding of an existing field, which will re-encode it (actually changing the data), as long as you recognize the limitations outlined in that article.

For instance, if the string é was encoded as latin1 and stored in MySQL in a latin1 field, it will be stored as hex value E9. You can verify this yourself by running something like SELECT hex(text_field_name) FROM table_name. If you then run ALTER TABLE `table_name` CHANGE `text_field_name` `text` MEDIUMTEXT CHARACTER SET utf8 NULL; MySQL will convert that data from latin1 to UTF-8 for you. If you run the hex query again, you'll get back C3 A9.
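You can reproduce those hex values outside of MySQL too; here's a quick sanity check in Python (again, used only as a handy inspection tool):

```python
# "é" as a single latin1 byte versus two UTF-8 bytes, matching the
# values the MySQL hex() queries report.
latin1_hex = "é".encode("latin-1").hex()
utf8_hex = "é".encode("utf-8").hex()

print(latin1_hex, utf8_hex)  # e9 c3a9
```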

In MySQL 5, all the text storage defaults are set to latin1 (Windows-1252) just like they are for the connection settings discussed above. While it's well known at this point that UTF-8 is preferred, I believe that the consensus was that the team behind MySQL didn't want to make the dramatic change to UTF-8 quite yet (though I think this is how it will be in MySQL 6).

Note: the collation of a field has nothing to do with how the data is stored; it instead affects how the data is compared and sorted.

What can go wrong

One of the important things to understand is that UTF-8 is a multibyte encoding: many characters are represented with more than one byte (typically two or three), while a traditional single-byte character set uses exactly one byte per character. To really understand this, let's look at a practical example. Here's the word "Résumé" in two common encodings:

Encoded in     Bytes                     Interpreted in Windows-1252   Interpreted in UTF-8
Windows-1252   52 E9 73 75 6D E9         Résumé                        R�sum�
UTF-8          52 C3 A9 73 75 6D C3 A9   RÃ©sumÃ©                      Résumé

The normal letters in Résumé are encoded identically in Windows-1252 and UTF-8 (that was an intentional design choice when creating UTF-8: ASCII characters keep their single-byte values). However, the accented é is represented with two bytes in UTF-8 and just one in Windows-1252.
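Both mismatches in the table can be reproduced in a few lines of Python (a standalone sketch; cp1252 is Python's name for Windows-1252):

```python
word = "Résumé"

# Windows-1252 bytes read as UTF-8: E9 is an invalid sequence,
# so each é becomes the U+FFFD replacement character.
cp1252_bytes = word.encode("cp1252")                   # 52 E9 73 75 6D E9
print(cp1252_bytes.decode("utf-8", errors="replace"))  # R�sum�

# UTF-8 bytes read as Windows-1252: C3 A9 becomes the two
# characters Ã and ©.
utf8_bytes = word.encode("utf-8")                      # 52 C3 A9 73 75 6D C3 A9
print(utf8_bytes.decode("cp1252"))                     # RÃ©sumÃ©
```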

If your data is sent from the browser to the server as UTF-8 (something you as a web developer have control over), is stored in your database as UTF-8, and is finally pulled out of the database and displayed on your website as UTF-8, then all will be dandy. This is how things should be, and you have little to worry about.

But let's say that you accidentally removed the charset meta tag from your page template, and your web server and app don't set the charset in the "Content-Type" HTTP header. Your app will still be delivering UTF-8 encoded text to the browser, but the browser no longer knows it's UTF-8. So the browser guesses. It may correctly guess UTF-8, but it may also guess Windows-1252. If that happens, Résumé will look like RÃ©sumÃ©. Why? Because the é was sent to the browser as C3 A9, which in Windows-1252 translates to the two characters Ã and ©. If it were correctly interpreted as UTF-8, the browser would translate C3 A9 back to é.

Whoops! You catch the error a few hours later and make sure the character set is properly set to UTF-8. No harm done, right? Since the data didn't actually change at all, there's no data corruption to worry about, and the browser now interprets the text properly as UTF-8. Well, not so fast. Let's say you have a form on your site that accepts comment submissions, and someone submitted it while the browser was rendering the page as Windows-1252. Since most browsers encode form submissions using the same encoding they used to render the page, the text may have been encoded as Windows-1252 and sent along to your backend.

Now you have a problem. Your app expects the data to come in as UTF-8 when it's actually encoded as Windows-1252! If the comment contains the word "Résumé", it will be encoded as 52 E9 73 75 6D E9 instead of 52 C3 A9 73 75 6D C3 A9 like it would be in UTF-8. You may have your database driver set up to indicate that the data you're sending is UTF-8, and the database field may be set to store UTF-8. It's possible your DB abstraction code throws an exception when it sees that 52 E9 73 75 6D E9 is not valid UTF-8 (which is correct; it's not), but it may let it slide and insert it anyway.
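This is why it's worth validating input at the application boundary instead of trusting the driver to reject it. Here's a minimal Python sketch of that check (the helper name is my own, not from any particular framework):

```python
def is_valid_utf8(raw: bytes) -> bool:
    """Return True if raw decodes cleanly as UTF-8."""
    try:
        raw.decode("utf-8")
        return True
    except UnicodeDecodeError:
        return False

print(is_valid_utf8(bytes.fromhex("52c3a973756dc3a9")))  # True  (UTF-8 "Résumé")
print(is_valid_utf8(bytes.fromhex("52e973756de9")))      # False (Windows-1252 "Résumé")
```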

After you've fixed the website to tell browsers you're sending UTF-8, let's say someone goes to view the comment that was submitted earlier. Instead of displaying Résumé, the browser will actually display R�sum�. That funny question-mark box is the Unicode "replacement character", which is used when a byte sequence in the text is not valid UTF-8. E9, the byte Windows-1252 uses for é, is NOT a valid UTF-8 sequence on its own. If you're curious why, read up on how UTF-8 encodes text.

You have data corruption now. To fix this, you'd need to find the entries that were submitted with the wrong encoding and manually convert them to UTF-8. Not a fun task. You can't even assume that all text submitted after your bad commit was encoded as Windows-1252 and stored incorrectly. Some browsers may have properly sent UTF-8 data that was stored correctly. So if you tried to re-encode all the data after that commit, you might end up double-encoding the valid UTF-8 characters.
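That double-encoding trap is easy to demonstrate. In this Python sketch, the same "fix" that repairs genuine Windows-1252 data mangles data that was already valid UTF-8:

```python
word = "Résumé"

# Data that really was stored as Windows-1252: re-encoding fixes it.
broken = word.encode("cp1252")
fixed = broken.decode("cp1252").encode("utf-8")
print(fixed.decode("utf-8"))    # Résumé

# Data that was already valid UTF-8: the same "fix" double-encodes it.
ok = word.encode("utf-8")
mangled = ok.decode("cp1252").encode("utf-8")
print(mangled.decode("utf-8"))  # RÃ©sumÃ©
```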

Conclusion and further reading

Character encodings are not trivial to deal with, and it's not terribly fun trying to figure out why funny characters are being displayed on your app or website. Hopefully after reading this you'll take the subject seriously and really begin to understand what character encodings mean to your website and application. Generally, you should be using UTF-8 everywhere! If you're using Drupal, you don't need to worry much, since Drupal already creates database fields as UTF-8 and sets the correct database connection settings for UTF-8. You should still take the time to understand as much as you can about encodings, though.

The application I'm working on has been operating for years without a character encoding set on the HTML pages or form submissions. I've determined that form submission data has been encoded in UTF-8, Windows-1252, and ISO-8859-1, with no definitive way to figure out which data is which. The application does not set the character encoding for the MySQL connection at all, so it defaults to latin1 (Windows-1252). To top it all off, the database text fields are a mix of UTF-8 and latin1 (and the UTF-8 fields may have been "converted" from latin1). It's been frustrating trying to determine what is right and wrong in the database and how to properly fix it, because there's a mix of encodings both at the field storage level and in the input coming into the app. There's no sure fix for any of it.
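One partial mitigation for a mess like this (a heuristic, not a guaranteed fix) is to try decodings in order of strictness. UTF-8 rejects most Windows-1252 byte sequences, so a successful UTF-8 decode is a reasonably strong signal; Windows-1252 accepts almost any byte, so it makes a sensible fallback. A Python sketch:

```python
def guess_decode(raw: bytes) -> tuple[str, str]:
    """Try UTF-8 first; fall back to Windows-1252.

    Heuristic only: short Windows-1252 strings can occasionally
    form valid UTF-8, and cp1252 leaves a few bytes undefined
    (hence errors="replace" on the fallback).
    """
    try:
        return raw.decode("utf-8"), "utf-8"
    except UnicodeDecodeError:
        return raw.decode("cp1252", errors="replace"), "cp1252"

print(guess_decode(bytes.fromhex("52c3a973756dc3a9")))  # ('Résumé', 'utf-8')
print(guess_decode(bytes.fromhex("52e973756de9")))      # ('Résumé', 'cp1252')
```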

If you're interested in learning more, these are some great starting points:
