Helping content creators make data-driven decisions with custom data dashboards

Greg Desrosiers, Massachusetts Digital Service

Aug 20, 2018

Our analytics dashboards help Mass.gov content authors make data-driven decisions to improve their content. All content has a purpose, and these tools help make sure each page on Mass.gov fulfills its purpose.

Before the dashboards were developed, performance data was scattered among multiple tools and databases, including Google Analytics, Siteimprove, and Superset. These required additional logins, permissions, and advanced understanding of how to interpret what you were seeing. Our dashboards take all of this data and compile it into something that’s focused and easy to understand.
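Conceptually, the compilation step is a join on page URL. Here's a minimal sketch of that idea, assuming each tool's export has already been reduced to a dict of per-URL metrics; the field names and shapes are illustrative, not the actual Mass.gov pipeline:

```python
def merge_page_metrics(analytics: dict, siteimprove: dict, superset: dict) -> dict:
    """Combine per-URL metrics from three hypothetical source exports
    into a single record per page."""
    merged = {}
    for url in set(analytics) | set(siteimprove) | set(superset):
        merged[url] = {
            **analytics.get(url, {}),    # e.g. pageviews, exit rate
            **siteimprove.get(url, {}),  # e.g. broken links, readability
            **superset.get(url, {}),     # e.g. user feedback counts
        }
    return merged
```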

We decided to embed dashboards directly into our content management system (CMS), so authors can simply click a tab while they’re editing content.

GIF showing how a content author navigates to the analytics dashboard in the Mass.gov CMS.

The content performance team spent more than 8 months diving into web data and analytics to develop and test data-driven indicators. Over the testing period, we looked at a dozen different indicators, from pageviews and exit rates to scroll depth and reading grade levels. We tested as many potential indicators as we could to see which were most useful. Fortunately, our data team helped us content folks through the process and provided valuable insight.
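The post doesn’t say which readability formula backs the reading-grade-level indicator. As an illustration only, here’s a rough Flesch–Kincaid grade estimate, with a deliberately crude syllable counter:

```python
import re

def _count_syllables(word: str) -> int:
    # Crude heuristic: count runs of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    """Approximate U.S. reading grade level of a passage."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    syllables = sum(_count_syllables(w) for w in words)
    return 0.39 * (n_words / sentences) + 11.8 * (syllables / n_words) - 15.59
```

Production readability tools handle syllables, abbreviations, and edge cases far more carefully; this only shows the shape of the indicator.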

Love data? Check out our 2017 data and machine learning recap.

We chose a sample set of more than 100 of the most visited pages on Mass.gov. We made predictions about what certain indicators said about performance, and then made content changes to see how they affected each indicator.

We reached out to 5 partner agencies to help us validate the indicators we thought would be effective. These partners worked to implement our suggestions, and we monitored how the changes affected the indicators. This led us to discover the nuances of creating a custom, yet scalable, scoring system.

Line chart showing test results validating user feedback data as a performance indicator.

For example, we learned that a number of indicators we were testing behaved differently depending on the type of page we were analyzing. It’s easy to tell if somebody completed the desired action on a transactional page by tracking their click to an off-site application. It’s much more difficult to know if a user got the information they were looking for when there’s no action to take. This is why we’re planning to continually explore, iterate on, and test indicators until we find the right recipe.
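For a transactional page, the outcome indicator reduces to simple arithmetic. A minimal sketch, assuming the analytics export provides a count of tracked clicks to the off-site application and total pageviews (both field names are hypothetical):

```python
def outcome_rate(offsite_clicks: int, pageviews: int) -> float:
    """Share of pageviews that ended in the desired off-site action."""
    # Guard against pages with no traffic in the reporting window.
    return offsite_clicks / pageviews if pageviews else 0.0
```

Informational pages have no equivalent event to count, which is exactly the difficulty described above.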

Using the strategies developed with our partners, we watched the metrics and, over time, saw them move. At that point, we knew we had a formula that would work.

We rolled indicators up into 4 simple categories:

  • Findability — Is it easy for users to find a page?
  • Outcomes — If the page is transactional, are users taking the intended action? If the page is focused on directing users to other pages, are they following the right links?
  • Content quality — Does the page have any broken links? Is the content written at an appropriate reading level?
  • User satisfaction — How many people didn’t find what they were looking for?

Screenshot of dashboard results as they appear in the Mass.gov CMS.

Each category receives a score on a scale of 0–4. These scores are then averaged to produce an overall score. Scoring a 4 means a page is checking all the boxes and performing as expected, while a 0 means there’s significant room to improve the page’s performance.
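To make the arithmetic concrete, here’s a minimal sketch of the averaging step. The 0–4 scale, the four categories, and the simple average come straight from the description above; the key names and the clamping of out-of-range inputs are assumptions for illustration:

```python
from statistics import mean

CATEGORIES = ("findability", "outcomes", "content_quality", "user_satisfaction")

def overall_score(scores: dict[str, float]) -> float:
    """Average the four category scores into one 0-4 overall score."""
    missing = set(CATEGORIES) - scores.keys()
    if missing:
        raise ValueError(f"missing category scores: {missing}")
    # Clamp each category to the 0-4 range before averaging (assumed behavior).
    return mean(min(4.0, max(0.0, scores[k])) for k in CATEGORIES)

# Example: a page that is easy to find but weak on outcomes.
print(overall_score({
    "findability": 4, "outcomes": 1,
    "content_quality": 3, "user_satisfaction": 2,
}))  # -> 2.5
```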

All dashboards include general recommendations on how authors can improve pages by category. If these suggestions aren’t enough to produce the boost they were looking for, authors can meet with a content strategist from Digital Services to dive deeper into their content and create a more nuanced strategy.

GIF showing how a user navigates to the “Improve Your Content” tab in a Mass.gov analytics dashboard.

We realize we can’t measure everything through quantitative data, so these scores aren’t the be-all and end-all of content performance. We’re a long way off from automating the work a good editor or content strategist can do.

Also, it’s important to note these dashboards are still in the beta phase. We’re fortunate to work with partner organizations who understand the bumps in the proverbial development road. There are bugs to work out and usability enhancements to make. As we learn more, we’ll continue to refine them. We plan to add dashboards to more content types each quarter, eventually offering a dashboard and specific recommendations for the 20+ content types in our CMS.
