
Jul 28 2016

Last week in higher education was all about technology impacting student outcomes and teaching methodology. This week, the buzz is around the almighty dollar. We’re seeing higher education become one of the focal points of the Democratic Party’s 2016 U.S. presidential election platform. We also found an interesting study that highlighted the usage of “open” textbooks and the impact they can have on student costs.

Did we mention higher education pays? Read on to learn more.

Open Textbooks Gaining Momentum, but Still Need Broader Adoption Strategy

As someone who can still remember the sting of purchasing textbooks every semester, the thought of open textbooks gets me excited. Semester after semester I would scour eBay, hunt down classmates on Facebook who had taken the courses I had registered for, and grudgingly make my way to the bookstore to pay for the books I couldn’t find used. It was an incredibly painful experience, as even the used books were $75+. And the most frustrating part? It seemed as if every other semester the textbooks would be updated to a new edition, meaning all of the used books available prior to that were useless, putting high upward pressure on the prices of the used books that remained in short supply. I often thought, “There has to be a better way than this.”

In a time when tuition prices and student debt levels are getting near-constant attention in the media (more on that below), textbooks are the forgotten cost of doing business for students. It wasn’t uncommon for my books to cost $500 per semester, which was roughly 18% of my tuition.

A new report, “Opening the Textbook: Educational Resources in U.S. Higher Education 2015-16”, shed some light on the usage of free or inexpensive “open educational resources” (OER). The study found that 58.1% of faculty members surveyed were not aware of the concept, or of where to find alternatives to traditional textbooks. As one might expect, introductory courses had the highest adoption rate of OER, with rates dropping as courses become increasingly specialized. Although a broader adoption strategy is still required to significantly increase usage rates, it’s exciting to see a more student-focused solution being discussed.

Student Debt: A Hot Topic at the Democratic National Convention

It was hard to miss the prevalence of higher education topics at the DNC this week. The stark contrast between the Democratic Party’s view on student debt and Donald Trump’s failed “Trump University” initiative was highlighted early and often throughout the week.

We analyzed Hillary Clinton’s “Technology & Innovation Agenda” a few weeks ago, where she promised to lobby for free tuition for kids from households with annual incomes below the $125,000 threshold. At the time, I openly questioned whether the proposal was a direct response to the fierce socialist values that Bernie Sanders’ supporters were so vocal about, and whether, with her formal nomination as the Democratic Party’s candidate pending, it was perhaps Clinton’s way of bridging the gap with Sanders’ supporters and ensuring their votes come November. It would appear I was thinking small with this hypothesis; it was apparent from watching the DNC that the Democrats are going to use higher education, both the candidates’ past experience and proposed policy, as a way of showcasing their value alignment with the middle class. Trump, on the other hand, is using fear to connect with America’s middle class, capitalizing on the civil unrest and international incidents that have occurred of late. Looking at his website, no mention of higher education appears on his positions page. Could higher education be a linchpin of this election? We’ll soon find out.

Research Shows Higher Education Still Worth the Investment

The University of Ottawa published a study that reviewed income data for 620,000 graduates from 14 Canadian universities and colleges between 2005 and 2013. The results are what you would expect: a strong correlation between higher education and both the starting income of graduates and their earning potential over time.

The study found that university graduates from the class of 2005 made an average of $45,200 in inflation-adjusted income in their first year out of school, with college graduates earning $33,900. By 2013, those salaries had increased to $74,900 and $54,000, respectively. The study shows that attending any form of higher education has a positive impact on the income of Canadians.

The study did, however, highlight the ongoing wage gap between men and women. Starting wages in 2005 averaged $46,800 for men and $44,000 for women; by 2013, men were earning $89,800 while women were earning only $62,500. The study only reports the wage discrepancy and does not attempt to explain its causes.

Tuition Discounts: They’re on the Rise

U.S. News posted a very interesting read that aggregated several studies on the topic of tuition discounts at both private and public institutions. The average discount on tuition for first-time, full-time students at private, nonprofit colleges is 48.6%. That means that roughly $0.49 of every $1.00 charged in tuition is returned to the student in the form of a discount, scholarship, or grant.

The article does a great job showcasing the impact of free-market concepts like supply and demand on higher education institutions, noting that the prevalence of discounts is becoming a self-fulfilling prophecy, with students who traditionally pay full tuition “rebelling” against paying the sticker price. Definitely worth the read.

Jul 28 2016
Matt & Mike talk with Eduardo "Enzo" García about his 18-country tour around the world! We talk about various far-flung Drupal communities, motivations, challenges, and more.
Jul 28 2016

Ever since Drupal 7, I've used Git to keep track of both my personal repos and the ones the company I work for manages. In short, I use Git quite a lot. A colleague of mine uses Git to keep track of his computer setup so he can easily pull down his settings and .profile whenever he chooses to reinstall his computer or gets a new one. I've heard of a guy (or girl, can't remember) who uses Git to keep track of all of his documents. That seems a bit much, but Git is wonderful for keeping track of the changes in, for example, your Drupal repo.

And even though I've used Git for so long, I'm still learning new things about it, which makes my daily work even better and more streamlined. Below I'll share some of my discoveries from working with Drupal and Git.

I've updated many modules, but I want to commit them one by one

Perhaps I've done a total update, or at least updated more than one module, but I want to make separate commits due to OCD or company guidelines. Well, as long as each module is located in its own folder (which they are), that's no problem.

Example: I've updated two modules, Token and Metatag, and now I have a list of 40+ files which are listed as either modified, deleted, or new. Instead of doing one big commit of them all, I simply add the folders one by one and do separate commits.

$ git add sites/modules/token
$ git commit -m "Updated Token module"
$ git add sites/modules/metatag
$ git commit -m "Updated Metatag module"

Two modules, nicely committed one by one. Now you can sleep without any OCD nightmares.

Add all files, but not all!

This is one of the latest tricks I've learned. Sometimes, when I update Drupal core, I want to add all files, but for some reason I want to exclude one or two of them.

Example: I've updated Drupal core, which led to 60+ modified, deleted, or new files, but among them are my settings.php and an updated module (let's choose Metatag again) that I want to exclude. I could do a separate git add for the module folder if I wanted, but perhaps I don't want to commit it at all, and that's where this comes in handy.

$ git add -A                                 # Stage all the changed files
$ git reset -- sites/modules/metatag         # Unstage the Metatag module folder
$ git reset -- sites/default/settings.php    # Unstage settings.php
$ git commit -m "Updated Drupal Core like a pro!"

I've made changes to a file, but want the original file back

This happens a lot. You try something out, go wild, and then suddenly want your original file back. No worries. Just use this to make the changes disappear and the original file will magically appear again.

git checkout -- [filename]

You can even make all the changes disappear, going back to square one in your repo.

git reset --hard

That command will not erase new files, though, so you might be left with a bunch of untracked files that Git won't touch. You can either delete them file by file, or you can use the following command to erase all untracked files (-d includes untracked directories, and -x also removes files that would normally be ignored via .gitignore):

git clean -d -x -f
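Because git clean is destructive, it's worth previewing what it would delete first by passing -n (dry run) instead of -f. A minimal sketch in a throwaway repository (the file names here are just for illustration):

```shell
# Set up a scratch repository with one tracked and one untracked file
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
echo "tracked" > kept.txt
git add kept.txt
git -c user.name=demo -c user.email=demo@example.com commit -q -m "Initial commit"
echo "scratch" > junk.txt

# Dry run: -n only lists what would be removed, nothing is deleted yet
git clean -n -d -x    # prints "Would remove junk.txt"

# When you're sure, run the real clean with -f
git clean -d -x -f
```

Only the untracked junk.txt is removed; the tracked kept.txt survives both commands.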

There are changes in my file? What changes?

Sometimes, when you upgrade a module or when you come back to a project and can't remember what you've done, there's a quick and easy way to see the differences in a file. If you do a git status and a file comes up as changed, you can type git diff followed by the filename (or the complete path to it) and you'll see the changes. If there are a lot of changes, you might want to check out other ways of analyzing them. If you're using Sublime Text, I can recommend Git Gutter.

$ git status           # Lists all changed files
$ git diff .htaccess   # Shows the changes in the .htaccess file
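One related gotcha: once a file has been staged with git add, plain git diff shows nothing for it, because git diff compares the working tree against the index. To inspect staged changes, add --staged (or its older synonym --cached). A small sketch in a throwaway repository (file contents are just for illustration):

```shell
# Scratch repository with one committed file
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
echo "RewriteEngine on" > .htaccess
git add .htaccess
git -c user.name=demo -c user.email=demo@example.com commit -q -m "Add .htaccess"

# Modify the file and stage the change
echo "RewriteBase /" >> .htaccess
git add .htaccess

git diff .htaccess             # empty: the working tree matches the index
git diff --staged .htaccess    # shows the staged change against HEAD
```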

Those are a few examples of what you can do with Git. As I discover more, I'll write them down here so we can spread the knowledge.

Jul 28 2016

At the end of May, I shared how The Drupal Association went through some hard staffing reductions so we could better align our expenses with revenue, making the organization much more sustainable. While this is a challenging transition, it allows The Association to serve our mission long into the future.

Part of this transition includes deciding where to focus our smaller team and, unfortunately, identifying work we can no longer take on. I want to give transparency into these decisions so we can best set expectations and invite the community to take over programs and work efforts we can no longer do. 

Our Focus

Our mission is to unite the community to build and promote Drupal. As I mentioned in my blog post, now that Drupal 8 is out, our focus is to put more energy into the “Promote Drupal” portion of the mission.

We will do this by improving the adoption journey within Drupal.org and DrupalCon. By adoption journey, I’m referring to the decision-making path someone takes to choose a new content management platform for their organization. These decision makers narrow their choice by talking to industry peers and service providers, reading analyst reports, and gathering information online and at conferences. When decision makers visit Drupal.org or attend DrupalCon as part of their fact-finding mission, we want to make it easier for them to see that Drupal is the right choice. We will curate content that highlights the power of Drupal solutions, amplifies success stories, and connects the decision makers with Drupal service providers and industry peers.

It’s important that we play our role in growing the number of organizations using Drupal because gaining more Drupal customers is good for the project. They employ Drupal developers, contribute back code, and provide financial support. Plus, highlighting Drupal successes on Drupal.org and at DrupalCon creates a rallying point for our community. Let’s celebrate the many amazing ways organizations are using our community-built software. 

I do want to point out that the community will still have the great resources they need to continue building and releasing the software. We will still level up developer skills and host large sprints at DrupalCon. And, Drupal.org will continue to provide the tools and resources the community needs to release new versions.

What we can’t do right now is invest in new ways to improve the contribution journey. By contribution journey, I am referring to the path a person takes to join the community online and in person, to collaborate with others and contribute code, documentation, camp organization, etc. So right now, we are not spending resources to improve this contribution journey. However, we are studying what improvements are needed so we can invest in them again as we become financially stronger.

As we serve our mission, we will also focus on strengthening The Association’s sustainability. Naturally, aligning expenses with resources is the biggest step in that direction. Now we will focus on strengthening our financial health by prioritizing revenue-generating initiatives as thoughtfully as we can. We’re starting this effort by finding out how to add value to each segment of our community so we can update our programs, making them even more attractive for people and companies to invest in.

There are many segments that make up our community, from individuals to the business community of Drupal Shops, hosting companies, and technology companies. Each segment funds our mission work by buying DrupalCon tickets and sponsorships, becoming Members and Supporting Partners, finding talent on Drupal Jobs, and buying Drupal.org digital opportunities. We will talk with members in each of these segments to see how we can make these programs better and more valuable to them.

Plus, our community has grown over the years and we need to welcome newcomers and find out how we can provide them with value that they are willing to pay for. Specifically, system integrators like Tata Consultancy Services and WIPRO and digital agencies like WPP and Digitas are now using Drupal to build ambitious digital experiences for their clients. By interviewing these organizations throughout the summer and fall, we will have a much better understanding of how to best support these kinds of organizations.

More Details

While the details I provided above are high level, The Drupal Association staff are operating from a 12-month execution plan that includes roadmaps with milestones and metrics. As we selected the work in our roadmap, we applied three imperatives:

  • Strengthen our financial health: Simply put, we will thoughtfully prioritize revenue-generating opportunities that rebuild our cash reserves so we are more stable, and we will do this in ways that add value to the community.
  • Execute well: We are picking a few areas to focus on so we are able to deliver results and make an impact. We are using good old-fashioned project management best practices to properly scope work and get stakeholder input, to make sure we set ourselves up for success.
  • Determine strategic direction for future planning: While we are heads down working on our execution plan for the next year, we need to know where to focus next to best serve the community. The Drupal Association board and staff will spend time this year determining that strategic direction.

We want to share details of the work we will do this year, but rather than make this blog post even longer with all of that detail, we will do a blog series from the Events, Engineering, MarComm, Revenue, and Operations departments.

Each department will explain its focus in more detail, along with the work we are not able to do given our smaller size. Plus, we will highlight where community members can get involved to take on the work we can no longer do. If you are interested in volunteering your time to work on community programs, please contact me. We would love to work with you!

About Megan

As the Executive Director of The Drupal Association, I am inspired by the community values of kindness, collaboration, learning, and doing our best. The Drupal community is a bright spot in a complex world and I am personally motivated to protect its health and longevity.

Outside of work, you can find me exploring nature with my family and friends or working out with my bootcamp squad doing 100 burpees in the 100℉ / 38℃ heat of Tucson, Arizona, USA.

Jul 28 2016
To improve SEO, we need to clean up our URLs. By default, Drupal offers a Clean URLs option in the configuration. In Drupal 7, we can also manage URL aliases. For instance, say you have a content type called Services and you want each service page to have a URL like services/page-name. To do that, we have the Pathauto module in Drupal 7. The Pathauto module allows us to manage the URLs for every content type, file, taxonomy term, and user, and it can also remove unnecessary words from the URL, like "an" and "the".

The Pathauto module can remove unnecessary words like "a", "an", and "the", as well as special characters like !, @, and $. Unfortunately, it doesn't include some other symbols, such as copyright (©), trademark (™), and registered (®). But it provides a hook, hook_pathauto_punctuation_chars_alter(), for adding new symbols to the punctuation settings. After creating a piece of content containing the symbols mentioned above, your page URL looks like the image below:

Drupal 7 - remove special characters from url using pathauto module

/**
 * Implements hook_pathauto_punctuation_chars_alter().
 */
function phponwebsites_pathauto_punctuation_chars_alter(array &$punctuation) {
  $punctuation['copyright']          = array('value' => '©', 'name' => t('Copyright'));
  $punctuation['registered']         = array('value' => '®', 'name' => t('Registered trademark'));
  $punctuation['trademark']          = array('value' => '™', 'name' => t('Trademark'));
}
After implementing the above code in your module, you should see the added symbols listed on the Pathauto module's settings page at /admin/config/search/path/settings. If you don't see these symbols, clear the cache and test again. It looks like the image below:

Drupal 7 - pathauto settings after hook_pathauto_punctuation_chars_alter

Now you can create content with those symbols. The Pathauto module will not add those symbols to the URL.

I hope you now know how to remove special characters from URL aliases using the Pathauto module in Drupal 7.

Jul 28 2016

Mobile users are not patient! More than 71% of mobile users immediately delete emails that don’t render well on a mobile device, and 74% will wait only 5 seconds for a web page to load on their mobile device before abandoning the site; nearly half won’t return.

Not only is mobile responsiveness important for your customers’ experience, it is important for Google search rankings as well. Google favors sites that are mobile-friendly and fast. In fact, they are backing the Accelerated Mobile Pages (AMP) project to help websites become optimized across all devices. Drupal’s AMP module delivers pages that comply with the AMP standard and drastically improves the performance of mobile content.

What Marketers Should Know about Drupal AMP to Sound Smart

The Accelerated Mobile Pages (AMP) project is an open source initiative that allows content to be optimized for mobile once and loaded instantly everywhere. AMP is a way to build web pages for static content that render quickly for mobile devices. It includes HTML standards, custom tags, and cache for building rich content with reliable performance and fast page loading speed.

The Drupal AMP module converts Drupal pages to comply with the AMP standard. It provides special AMP formatters for text, image, and video fields. It includes:

  • AMP Theme, which produces the specific markup that the AMP HTML standard requires. It works just like any other Drupal theme, with flexibility and customization of page displays, and it can place AMP ad blocks.
     
  • AMP PHP Library, which analyzes the HTML entered by users and makes corrections, where possible, to make it compliant with AMP HTML. It automatically converts images, iframes, Tweets, Instagram posts, and YouTube HTML into their AMP HTML equivalents.

Use Drupal AMP to speed up your site's mobile load times and improve overall SEO

What Marketers REALLY Need to Know about Drupal AMP

So why should a marketer care about Drupal AMP?

AMP web pages are fast. Using AMP HTML makes the web fast with smart caching, predictable performance, and modern, beautiful mobile content. When Pinterest tested publisher AMP pages in their iOS and Android apps, they found that AMP pages loaded four times faster and used eight times less data than traditional mobile-optimized pages. Because it increases page loading speed, it is likely to provide a ranking boost. While AMP isn’t the only way to improve page speed, it is one way that Google recognizes and supports.

Advertisements are more effective. Businesses using AMP ads are seeing greater revenue. According to ampproject.org:

  • 80% of publishers are realizing higher viewability rates
  • 90% of publishers are driving greater engagement with higher CTRs
  • The majority of publishers are seeing higher eCPMs

Google favors AMP pages. Google began integrating AMP pages into its search engine in February. For mobile results, Google holds the News carousel (with AMP content) above the fold, which means organic search results are pushed down, resulting in fewer clicks, fewer impressions, and a lower click-through rate, which in turn affects SEO rankings.

AMP pages will help improve SEO. Google has made it clear that mobile-responsiveness and page speed are important for high search engine rankings. Some speculate that AMP pages will automatically get a “fast” label designation. In the future, AMP may even become a ranking signal. If you have news-type content or blogs, you can expect to see better rankings when you use the Drupal AMP module.

The Drupal AMP module is an important addition if you are looking to improve your Drupal SEO efforts. If you would like to be sure your website is following all of the latest practices for high Google rankings, check out our Drupal SEO services and give us a call at (512) 989-2945.

Jul 28 2016


While developing a module or modifying a template in Drupal you'll often print variables, especially if you're in a preprocess hook.

You learn early on how to use the var_dump and print_r functions. But these functions can sometimes display too much information and make it hard to filter through the arrays or methods in the variable.

In Drupal 7, with the Devel module, you could use the dpm or dsm function. When used, these functions will print variables at the top of the page or in the message area using Krumo.

Now in Drupal 8, Devel has adopted a new library to print variables and it's called Kint.

Fig 1.0

Please note, Krumo has been removed from the Drupal 8 version of Devel. The dpm and dsm functions are still there but the variables are printed without formatting.

Fig 1.0

Getting Started

Kint ships as a sub-module in Devel and the library itself is also in the sub-module. So all you need to do is download and install it.

Below are the Drush and Drupal Console commands to download and install:

Drush:

$ drush dl devel
$ drush en kint

Drupal Console:

$ drupal module:download devel --latest
$ drupal module:install kint

Using Kint

As mentioned earlier, Kint comes as a sub-module, so it hasn't replaced the good old beloved dpm and dsm; those functions are still there.

Instead, you get two new ones: kint and ksm.

kint()

Fig 1.0

The kint function prints everything at the top of the page.

ksm()

Fig 1.0

The ksm function prints the Kint output in the message region of your theme.

Expanding and Collapsing

Once you have a variable printed on the screen you want to drill down and see what's in it.

Kint allows you to navigate around in two ways. First, if you click anywhere on the row, it'll expand just the next level.

But if you click on the + icon, it'll expand all child items below.

Have a look at the GIF below to see how it works.

Fig 1.0

Viewing Stack Trace

When viewing a printed variable, Kint also displays a stack trace just below the output. Click on the + icon below the variable and it'll expand a stack trace.

Fig 1.0

Using Kint in Twig

Kint can also be used in Twig templates. To print a variable, just add {{ kint() }} into the template and pass a variable in, for example, {{ kint(page) }}.

For this to work, you must turn on Twig debugging. I won't cover it in this tutorial, but read this documentation page on drupal.org to learn how to do it.

If you're using the 8.x-1.0-alpha1 version of Devel, then you'll get a PHP warning when using Kint in a Twig template. This has been fixed in the latest dev version.

Summary

Kint is a good upgrade for the Devel module. The only drawback is you'll have to retrain your muscle memory to type kint or ksm instead of the old functions.

FAQs

Q: Have dpm() and dsm() been removed?

No, these functions are still there, but they won't print variables using Krumo, as Krumo has been removed from Devel in Drupal 8.

Q: I'm trying to print a variable in a Twig template and nothing is happening.

You must turn on Twig debugging for the {{ kint() }} function to render. Read this page on drupal.org to learn how.

Jul 28 2016

DrupalEasy is proud to announce another cavalcade of training events in the coming months, both online and in-person. We have numerous opportunities for you to take advantage of our proven Drupal 8 module and theme development courses, as well as our flagship technical education program for those seeking comprehensive training to become developers: the 12-week Drupal Career Online.

The primary instructor for all courses is Mike Anello (ultimike), an expert instructor and experienced, practicing Drupal developer. All DrupalEasy training is taught only by expert developers to ensure that lessons are accurate, reflect best practices, and draw on how real-world Drupal sites are built. Mike has been active in the Drupal community for over 10 years; he is a core contributor, a Drupal Association member, and one of the leaders of the Florida Drupal Users' Group.

12 Weeks of Career Training
Drupal Career Online starts on September 26, and runs every Monday and Wednesday afternoon from 1-4:30pm EDT. In addition, there is a 4-hour co-working lab, which is scheduled by the students. Class is held online using GoToMeeting, and use of webcams and microphones keeps the classes highly interactive, with instructor and student-led demos and discussions. The live online classes are supplemented with reference and lesson-guide handouts, as well as screencasts that cover each and every lesson. Each student is assisted by a community mentor to help kickstart their personal Drupal network. Our goal is to provide the most holistic, sanely-paced, best-practice-focused, long-form Drupal course in the world.

Interested in learning more about the 12-week Drupal Career Online course? We are offering two free Taste of Drupal webinars that outline the entire course, set expectations, and give potential students the opportunity to ask questions. Past students have included Drupal newbies, hobbyists, and content admins looking to learn how to become full-fledged Drupal developers.

5 Options for D8 Theme and Module Development

First, if you're headed to DrupalCon Dublin, we're excited to announce that we've been selected to be one of the official training providers! We'll be offering our Introduction to Drupal 8 Module Development workshop live, and in-person. You'll learn through our stellar curriculum, and have follow-on access to all of the handouts and screencasts, with the added bonus of the synergy (yes, we just used that word) of a live classroom.

We're also offering the popular Introduction to D8 Module Development online, as well as our Introduction to D8 Theme Development workshops (https://www.drupaleasy.com/training/workshops/upcoming) multiple times in August and September. The module development workshop is 8 hours, split over two half-days, and teaches the fundamental concepts of Drupal 8 module development, including using Drupal Console as an aid for module development. Our theme development workshop is 12 hours long, split over three half-days. This super-sized workshop teaches the fundamental concepts of Drupal 8 theme development including building template files with Twig, creating custom subthemes (using Bootstrap as the base theme), and setting up a professional-level front-end development toolchain using Node.js and Gulp. Both of these courses include a live instructor, PDF handouts, and screencasts for all in-class examples.

We are committed to providing the highest-quality Drupal talent development, from beginner to advanced programs, which is why our trainings work. We've taught the 12-week DCO ten times (including twice as part of Acquia U), and our 1- and 1.5-day workshops always get great reviews; just see what our graduates have to say!

Jul 27 2016
TL;DR: Last week, I worked on and developed tests to ensure that similar images are grouped in accordance with the Image Properties feature of the Vision API. The code is under review by the mentors, and I will continue with it once the review is done. Meanwhile, they also reviewed the “Fill Alt Text” feature issue and approved it as good to go. This week, I have worked on developing tests for this issue.

An important feature that I have implemented in the Google Vision API module is filling the Alt Text field of an image file entity using any of four choices: Label Detection, Landmark Detection, Logo Detection, and Optical Character Detection. My mentor suggested that I check the availability of the response before filling the field, as we cannot fully rely on third-party responses. With this minor suggestion implemented, it is now time to develop tests to verify the functionality of this feature.

I started developing simple web tests for this feature to ensure that the Alt Text field is filled properly in accordance with the user's choice. This requires selecting each of the four choices in turn and verifying that the field is filled correctly, so we need four tests to cover the entire functionality. I added an extra test to ensure that if none of the options is selected, the field remains empty.

I created the image files using the images available in the simpletests, which can be accessed through drupalGetTestFiles(). Filling the field, however, requires a call to the Google Cloud Vision API, introducing a dependency on the API key. To remove that dependency, I mocked the function in the test module, returning custom data to exercise the feature.

The first test ensures that the Label Detection feature returns the correct response and that the Alt Text field is filled correctly. The simpletest framework provides a list of assertions to verify this; I found assertFieldByName(), which asserts the value of a field based on the field name, to be the most suitable for the purpose. The second test ensures that the Landmark Detection feature works correctly, and the third and fourth tests do the same for the Logo and Optical Character Detection features.

The fifth test covers the case when none of the options is selected. It ensures that in this case the Alt Text field remains empty and does not contain any unwanted values.

I have posted the patch covering the suggestions and tests on the issue queue, Fill the Alt Text of the Image File using Google Vision API, for my mentors to review. Once they review it, I will work on it further if required.
Jul 27 2016
Rick Donohoe, Account Manager

DrupalCamp Bristol 2016 was held over this last weekend, and I hope I'm not the only one to say it was an enjoyable and very useful weekend for many. Although there were a couple of the usual hiccups, feedback from attendees was very positive, which certainly made it all feel worthwhile. Let's hope that wasn't just the beer talking!

Last year I wrote up some tips and feedback after the event, and as the Chair of the DCB Committee I thought it would be great to share my thoughts again this year. After all, Drupal as a community is based upon being open, so behind the scenes camp organisation should be too!

What went well?

Let's start off with the positives:

  • Business Day - In general the Business Day was a fantastic success. We had an increased number of attendees this year which included a larger number of client types, the talks were well varied and lived up to the high expectations set last year, and our new venue - Colston Hall - was a fantastic choice. The weather was brilliant and attendees were able to chat with each other outside on the roof terrace which was a great feature.
  • Quality of Saturday talks - I'll give a bit more insight into the talks shortly, but this year we managed to put together a larger number of talks without sacrificing any quality. Last year almost all of the talks were of a technical nature, but this year we managed to get a better variety which is the first step to bringing in newer faces.
  • Sponsors - This year we completely changed up the Sponsor tiers, allowing us to reduce the number of agency types involved. We had 4 key Organisational Sponsors, a Recruitment Sponsor, and a couple of "Brunel tier" Sponsors, which was well received. I think I'm correct in saying that most of the Org Sponsors have an existing relationship with the Recruitment Sponsor, so we were able to pick somebody we trusted for the exclusive Sponsor role.
  • Quiz - This year we ended the Saturday talks at 3.30pm and finished with a Quiz instead for a lighter end to the day. Admittedly the questions were a bit hard and, in some cases, a little different, but the teams soon began to laugh along, and with a load of prizes to give away it was a win-win situation for everybody!
  • Social - We invested heavily in bar tabs both Friday and Saturday, with a total of £1K budgeted between both evenings, and this was definitely worthwhile. For the Business Day attendees the socials are where the networking is most prominent, and for developers the socials are where most people really seem to bond. Personally I think the relationships which are forged at the socials are key to ensuring the regular faces return each year, as this is where you get a real sense of community and team spirit.
  • Sprints - We held a day of Sprints for the first time and we had to stop ticket sales as this had reached maximum capacity. Thanks to Torchbox for letting us use their offices for this one.
  • Use of Slack - The committee was much more dispersed this year, but using Slack as a communication tool made things much simpler and ensured the remote factor didn't play against us.

What didn't go so well?

There were a few things that concerned me, and although some were event-specific, I think there are some wider problems becoming apparent in the Drupal community. I'd be really keen to hear how we as a community can go about improving these:

  • New talent is lacking - It was great to hear during Sheena Morris's Apprenticeships talk that agencies in the North and in London have adopted their Apprenticeship program so well, but Saturday's attendees showed that there is a huge lack of new Drupal talent. Emma Jane won "Learning Drupal 8" during the Quiz, which was kindly provided by Inviqa, and gifted it to the person in the room who was newest to Drupal. The person in the room with the least Drupal experience had just under 2 years!
  • Female speakers are sparse - Speaker diversity is a problem that isn't new to us, and this year we had only one female speaker per day. What's troubling about that is that we didn't have a single talk submission from a woman to choose from; it was committee members who sourced both of those speakers.
  • Last minute program changes - I guess this one is unavoidable, but 3 speakers dropped out of Saturday within a week of the event, and upon arriving at the venue on Saturday morning we were told the main lecture theatre was out of use and we had to direct talks to another lecture theatre. Typical! It would be interesting to know how other camps deal with this; maybe we shouldn't include the Saturday schedule in the printed programme, but instead ensure attendees use the website as the primary source of this info?

Camp Funds, DCB 2017, and the Committee

We've estimated an income of £11K this year, with expenditure of just short of £10K. We started the year with a cushion of approximately £4K from last year's camp, which gives us around £5K to rollover. I personally think it would be great to invest some of this into the local Drupal community, perhaps sponsoring more regular talk nights and Sprints? I'd be interested to hear of any suggestions - tweets to @DrupalCampBris.

The committee will be having a wash up meeting over the next week to discuss next year and see what we can do to improve the event, but I can guarantee you DrupalCamp Bristol 2017 will certainly be happening. Organisation will begin again later this year, and as the committee is always in need of new members we'd be interested to know if anybody new would like to join the team? Again, simply tweet @DrupalCampBris if you'd like to get involved.

Finally, I'd like to take this chance to offer up the Committee Chair position as I've decided to take a step down to a more back seat next year. I've thoroughly enjoyed leading the team over the first 2 years, but I think it would be good to see somebody else step into the role next year.

A massive thank you to everybody who attended, thank you to all the Speakers who volunteered their time over the weekend, thank you to all the Sponsors who made the event possible, and a huge thank you to all the committee members for pulling everything together and making the event a success.

See you all next year!


Jul 27 2016
Jul 27

This blog post summarizes week #10 of the Google Summer of Code 2016 project - Mailhandler.

In the last blog post, I wrote about the comment and demo module improvements. This blog post provides an overview of the work done in the past 7 days. Even though the plan was to work mostly on UI/UX issues, we ended up doing code refactoring.

During the last meeting with my mentors, we identified 3 key Inmail issues to work on: Lack of standard result in collaboration of analyzers, Support switching the user context, and Provide a hook_requirements fed by plugin instances. We agreed these issues will provide better overall value in the integration of the Mailhandler and Inmail modules, allowing Inmail to implement ideas from Mailhandler and make them generic in a way that benefits both modules.

Lack of standard result in collaboration of analyzers was identified as the main blocker to achieving analyzer collaboration. After a long discussion and 7 patches, it was finally committed. As a result, Inmail now has a default analyzer result which can be extended sequentially by all enabled analyzers. As this was a big change in the core Inmail module, several follow-ups were created.

Another Inmail issue, Support switching the user context, depended on the default analyzer issue. It uses the AccountSwitcher service, which completely switches the user context to the given user/account. If the account-switching mechanism was activated, we make sure it is switched back after the handlers have run. On the handler level, we can use \Drupal::currentUser() and be sure it represents the identified sender, or an anonymous user otherwise.
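The switching pattern described above can be sketched roughly as follows. This is a simplified illustration, not the actual Inmail processor code; the function name and parameters are assumptions.

```php
<?php

use Drupal\Core\Session\AccountSwitcherInterface;
use Drupal\Core\Session\UserSession;

/**
 * Sketch of the account-switching flow around handler execution.
 */
function process_message_as_sender(AccountSwitcherInterface $account_switcher, $uid, callable $run_handlers) {
  $switched = FALSE;
  if ($uid) {
    // Switch the global user context to the identified sender.
    $account_switcher->switchTo(new UserSession(['uid' => $uid]));
    $switched = TRUE;
  }
  try {
    // Inside the handlers, \Drupal::currentUser() now returns the sender.
    $run_handlers();
  }
  finally {
    // Always restore the original account after the handlers run.
    if ($switched) {
      $account_switcher->switchBack();
    }
  }
}
```

The try/finally wrapper is what guarantees that the context is restored even if a handler throws.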

Last but not least, I have been working on Provide a hook_requirements fed by plugin instances. The goal of this issue is to give each of the Inmail plugins (analyzers, deliverers, handlers) a way to report information about its runtime requirements, such as required PHP extensions or valid credentials. This information is displayed on the “Status report” page (admin/reports/status) or processed by contrib modules like Monitoring.
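Conceptually, the plugin-provided requirements feed into a standard hook_requirements() implementation. The following is a simplified sketch of that idea, not the actual Inmail code; the requirement key and messages are assumptions.

```php
<?php

/**
 * Implements hook_requirements(). Simplified sketch for illustration.
 */
function inmail_requirements($phase) {
  $requirements = [];
  if ($phase === 'runtime') {
    // Example check mirroring the PGP Analyzer: the gnupg PHP
    // extension is needed to verify signed messages.
    if (!extension_loaded('gnupg')) {
      $requirements['inmail_pgp'] = [
        'title' => t('PGP Analyzer'),
        'value' => t('The gnupg PHP extension is not available.'),
        'severity' => REQUIREMENT_WARNING,
      ];
    }
  }
  return $requirements;
}
```

In the actual issue, the idea is that each plugin instance contributes its own checks rather than hard-coding them in a single hook like this.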

PGP Analyzer requirements

Inmail plugins can report several kinds of unmet requirements. As the picture above shows, the PGP Analyzer needs the gnupg PHP extension in order to work with signed messages, while an IMAP deliverer needs valid credentials to be functional.

Since most of the Inmail issues mentioned above have been committed, the Mailhandler module will need adaptation, which is going to be my focus for this week. I will also analyze the module's code and try to simplify it. Besides that, I will work on Inmail UI/UX issues, which will be described in the next blog post.

Jul 27 2016
Jul 27

With this week over, we are getting closer to the GSoC final evaluations, so we keep working hard to develop the Social API project even further. This week I focused on Social Post Twitter, as mentioned in my last weekly summary. This module works as an implementer of Social Post.

Achievements

During this week, I made the social_post_twitter_user entity able to store new users along with their access_token and access_token_secret. To store this data, users with the right (Drupal) permission can authorize the Twitter app (associated with the Drupal site) to create tokens to post on their behalf.

These are screenshots of the module:

Social Post Twitter settings form

The settings form allows site builders to add information about their Twitter app. Twitter requires only a Consumer Key and a Consumer Secret.

Option to add Twitter accounts in the user edit form.

Once the Twitter app information is added, site builders can grant the permission "Perform Twitter autoposting" to a role (e.g. an editor role). The users with the specific role will be able to add as many accounts as they like.

The social_post_twitter_user entity list.

Users with the right permissions can access a full list of users who have associated their Twitter accounts with their Drupal user, allowing the site to tweet on their behalf.

Challenges

We have worked on the authorization layer and user data storage so far. However, there are many open questions about how this module could be used. For instance: can we ask anonymous users for permission to tweet on their behalf? In what situations would the site tweet for its users? And what about a company's general Twitter account that many users can access?

I believe these questions will be discussed in our next weekly meeting on Wednesday, July 27 at 3:00 pm CEST. So join us if you have ideas to contribute!

Next week

As I have mentioned, there are many open questions regarding functionality, so I will have a clearer idea of what to work on next week after the meeting with my mentors. What I can assure you of is that by my next weekly summary, social_post_twitter will provide functionality to tweet on behalf of a user.

As always, feel free to contact me if you have any questions. You can also collaborate on the Social Initiative projects (social_api, social_auth, social_post, and social_widgets). We also have weekly meetings, so follow the Social Initiative and join us on Wednesdays.

Stay tuned for the next weekly summary!

Jul 27 2016
Jul 27
DrupalCon New Orleans

Drupal 8 was released on the 19th of November, 2015. Just a touch under five years from the 5th of January, 2011 release of Drupal 7. In web terms, an epoch. To put things in context, in 2011 AngularJS had not reached 1.0, React was two years out from being released and the term “Devops” was only whispered in dark corridors. Now that Drupal 8 is finally here the front-end revolution is led by AngularJS and React and articles are proclaiming that “DevOps is dead”. In short, much has changed.

Understandably, this has created some concern within the Drupal community. Has the huge Drupal 8 release cycle hurt Drupal in some irrevocable way? Is Drupal still a relevant technology? Is it too little too late or, as some argue, too much too late?

At the New Orleans DrupalCon, roughly six months after the release of Drupal 8, many were trying to glean some answers from available data. One fact cited is that in the first three months after the Drupal 8 release there were roughly 60,000 sites, compared to 30,000 in the equivalent period for Drupal 7. However, others argued that this is not necessarily a positive number because (as is pointed out here) the Drupal community is also three times bigger now.

In addition, the reasons that are holding people back from moving to D8 were discussed. Dries Buytaert’s keynote at the New Orleans DrupalCon covers that very well, with the leading factor being the migration of existing modules from Drupal 7 to Drupal 8.

However, as Dries goes on to explain, the number of sites and the number of upgraded modules are not the only relevant metrics. He argues that this increase in the richness and reach of Drupal will ultimately mean that Drupal 8 will see much bigger numbers than any other Drupal release.

What I will attempt to do in this blog post is add to the discussion by offering some simple facts that are very revealing about the actual state of Drupal and are relevant to any organisation trying to decide whether Drupal should form a component of their web strategy for the next five years.

Millions in ongoing investment by the Drupal community

2016 marks the year where it is no longer a novelty for leading Drupal agencies to employ people with a very specific mandate of working directly on Drupal core issues, or large Drupal 8 contrib modules (here at Deeson this is exactly what we do with the Group module, Rainmaker, Warden and our monthly Coder Lounge). The amount of ongoing direct investment from Drupal agencies and other organisations easily runs into multiple millions.

$500,000 invested to accelerate Drupal 8 module development

Acquia alone is investing $500,000 to speed up the migration of popular Drupal 7 modules to Drupal 8. This means that popular modules will be ready faster and, more importantly, the quality of those modules will be higher as the maintainers will be able to dedicate focused time to get upgrades right. Using Drupal 8 means you are using an open-source CMS that is built to very high standards by some of the best developers around.

Big site wins are no longer big news

There was a time when a big site launching on Drupal would represent headline news in the community. It would receive applause at conferences and would be tweeted widely. This is no longer the case. Everyone is still happy to hear about big names joining Drupal but it is not news, it is the normal state of affairs.

These big launches span from media brands like ITV to multinationals like Johnson & Johnson; from pop singers to large cultural institutions. While the absolute number of sites running a CMS is important, the number of large complex sites using Drupal is arguably more significant to an organisation looking to set out its strategy for the next 5 years.

DrupalCon isn’t just about Drupal anymore

If you haven’t attended a DrupalCon yet, you should. One of the most interesting ways DrupalCons have evolved in the past few years is that they are no longer just about Drupal. There are dedicated tracks on project management, business development, user experience, PHP, front end technologies and much more. This is proof of a maturing community.

As the community has matured, interests have become more diverse and the breadth of shared practices has grown. Drupal has community sharing in its DNA which means that, by using Drupal, you are tapping into a very rich world that is willing to share knowledge about every aspect of a web strategy.

Drupal 8 is actually already at 8.1.7

The Drupal release cycle has changed drastically from 7 to 8. This is an often overlooked “feature” of Drupal 8. While with Drupal 7 we essentially had the sum of features at the start and new functionality could only really develop through contributed modules, Drupal 8 brings minor releases that can add completely new functionality.

To prove the point, we are already at Drupal 8.1.7, with Drupal 8.1.0 bringing exciting new features such as the BigPipe technique. This finally allows Drupal to adapt and adjust its course, and the days of worrying about Drupal becoming irrelevant as the rest of the web marches on are behind us.

There are 17 external JS Libraries and 27 external PHP Libraries in Drupal 8

Drupal 8 core contains at least 17 significant JavaScript libraries and 27 PHP libraries that are completely separate open-source projects. Drupal has well and truly built bridges to other technology islands and this also makes Drupal stronger, more relevant and more resilient to changes.

You can build a Drupal site with 0% Drupal on the frontend.

0 is a strange number to be touting as a success. But Drupal 8 now realistically allows you to build a front end that is 100% Drupal free. At Deeson we are currently building amazing front end experiences using React and taking advantage of Drupal’s powerful content model to store information and allow editors to quickly add and update information. The wider community is developing modules and best practices around this (for example, the great work happening around the Decoupled Blocks module).

Now, given the above, is Drupal a relevant technology for a modern, forward-looking web strategy? All indications point to an unequivocal yes. Drupal 8 is more relevant than ever. It is the most advanced open-source CMS and it is built in a way that allows it to embrace and enhance a range of other technologies. The best news is that we are just at the start of what Drupal 8 will be able to do and, as best practices and experience accrue, the possibilities and the efficiency with which projects can be delivered will improve.

Want to know more about this topic?

Join our webinar on the 4th August discussing the current state of Drupal and where it's headed. During the webinar you'll have the chance to ask Ronald Ashri, our Lead Solutions Architect and author of this post, any questions you like on this topic. To join, simply sign up using the form at the top of this page.

Jul 27 2016
Jul 27

I am porting the Search Configuration module to Drupal 8 as part of Google Summer of Code 2016, mentored by Karthik Kumar, Naveen Valecha and Neetu Morwani. If you would like a quick look at my work so far, please go through these posts.

The past week I worked on fixing some of the issues in the porting process, and I learned some important concepts along the way.

In the Drupal 7 module, the settings were implemented as helper functions; I ported these helper functions of the search configuration settings to the Form API. Forms are created and managed through four key methods, namely getFormId, buildForm, validateForm and submitForm, whose definitions live in classes.

A form class has the following basic structure:


use Drupal\Core\Form\FormBase;
use Drupal\Core\Form\FormStateInterface;

class SearchForm extends FormBase {

  public function getFormId() {
    return 'search_form';
  }

  public function buildForm(array $form, FormStateInterface $form_state) {
    // Create the form here.
    return $form;
  }

  public function validateForm(array &$form, FormStateInterface $form_state) {
    // Validate submitted form data.
  }

  public function submitForm(array &$form, FormStateInterface $form_state) {
    // Handle submitted form data.
  }
}

getFormId() returns the unique ID of the form. buildForm() defines the form's features: its structure, the fields it includes, and the types of values they accept. validateForm() validates the data entered into the form, which is definitely an important part of any form. Finally, submitForm() handles the work to be carried out once the user submits the form data. This arrangement keeps the form's functionality standardized and well organized.

Also, the deprecated functions were removed from the module's .admin.inc file. There were some underscore-prefixed functions in the Drupal 7 module that supported the configuration and form settings; these are to be added to the module's helper file.

These were some of the basic functionalities I could work on and explore. Stay tuned for the future updates on this port process.

Jul 27 2016
Jul 27
CKEditor Anchor Link for Drupal 8

We are all excited about Drupal 8: there are many articles explaining why we shouldn't be afraid to migrate to it. Briefly speaking, D8 is more mobile-friendly, multilingual and robust, and makes it easier to edit content by including several must-have modules in core. However, since the new Drupal version was released not long ago, there are still many things in the CMS that require more effort from developers using Drupal 8. Some CKEditor plugins, like CKEditor's AutoGrow, will be integrated into D8 in the near future, but some plugins still have no integration, or even a plan for it.

While working on a recent project with our Drupal team at Vardot, I realized that a basic feature of D7's CKEditor is missing in D8: Anchor Link. That, in short, is the background of how the new module was created.

CKEditor Anchor Link

CKEditor Anchor Link allows content editors to insert links and anchors using multiple protocols. The ability to link content was integrated into Drupal 8 core; however, as we can see from the screenshot, it didn't give users the option to create anchors (flags) within the document they are editing.

CKEditor Anchor Link in Drupal core

The ability to jump from one part of the page to another was critical for our client, and in Drupal 8 we had to go the extra mile and code it instead of just installing a module as in D7. CKEditor Anchor Link solves this problem and adds the new icon in just a few minutes. Moreover, it adds alternatives to the standard link icons that give site editors more options for setting up the URL.

Upgraded CKEditor Anchor Link installed with a module

Installing CKEditor Anchor Link on Drupal 8

Below you can find a quick guide of how to install this module:

  1. Download CKEditor Anchor Link from drupal.org.

  2. Enable the CKEditor Anchor Link module.

  3. Go to Configuration -> Text formats and editors.

  4. Select the text format you want to add the anchor button to, for example “Basic HTML”.

  5. Add the Link, Unlink, and Anchor buttons, which come with the CKEditor Anchor Link module.

    CKEditorAnchorLink.gif

  6. Manually remove the default link and unlink command buttons and add the new link and unlink buttons that come with CKEditor Anchor Link.

  7. Save the settings for the “Basic HTML” text format.

  8. Try to add new content, for example a Basic page or Blog post; in the body, select the “Basic HTML” text format, and you should see the flag icon in the CKEditor toolbar.
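Under the hood, a CKEditor button like this is exposed to Drupal 8 through a CKEditor plugin class. The following is a simplified sketch of what such an integration can look like; the namespace, class name, file path and button labels are illustrative assumptions, not the module's actual code.

```php
<?php

namespace Drupal\ckeditor_anchor_link\Plugin\CKEditorPlugin;

use Drupal\ckeditor\CKEditorPluginBase;
use Drupal\editor\Entity\Editor;

/**
 * Sketch of a CKEditor plugin exposing link, unlink and anchor buttons.
 *
 * @CKEditorPlugin(
 *   id = "link",
 *   label = @Translation("CKEditor Anchor Link")
 * )
 */
class AnchorLink extends CKEditorPluginBase {

  public function getFile() {
    // Path to the CKEditor plugin's JavaScript file (assumed location).
    return 'libraries/link/plugin.js';
  }

  public function getButtons() {
    // Buttons made available on the text format settings page.
    return [
      'Link' => ['label' => $this->t('Link')],
      'Unlink' => ['label' => $this->t('Unlink')],
      'Anchor' => ['label' => $this->t('Anchor')],
    ];
  }

  public function getConfig(Editor $editor) {
    return [];
  }

}
```

Once such a plugin is enabled, its buttons appear in the drag-and-drop toolbar configuration shown in step 5 above.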

Usage statistics for CKEditor Anchor Link 

The module is pretty new, and the number of users is not very big yet. The good news, however, is that it's constantly growing and promises to reach a good level in the future (this blog post is partly intended to increase the visibility of the CKEditor Anchor Link module).

CKEditor Anchor Link usage statistics from drupal.org

Bottom line

For many editors, the ability to quickly jump within the page they are editing (link, unlink, and flag it) is one of the top editing priorities. Since the goal of the Vardot distributions Varbase and Uber Publisher is to make Drupal as editor-friendly as possible, CKEditor Anchor Link is an important addition to our products. If you find this module valuable, please feel free to install it, share this article with others, and of course send me your feedback about the module.

Jul 26 2016
Jul 26

There is about a week left before Drupal 8.2 goes into beta! That means we will switch to finding and fixing issues with changes in the new version instead of making new changes. For core development, that means new features and API additions will move up to 8.3. I asked the leaders of both proposed and active initiatives for key things that could use help in the remaining time.

API-first initiative

Allowing user registrations via the REST API only needs some more tests, for which examples can be found elsewhere in core. Also, although it may sound a bit scary, the issue REST requests without X-CSRF-Token header: unhelpful response significantly hinders DX, should receive a 401 response is actually a nice, approachable novice issue to get involved with.

Ongoing, check out the proposed initiative roadmap and attend the API-first meetings on the third Monday of every month at 5pm GMT in Google Hangouts.

Media initiative

An amazing feature is in the works to Improve the UX of Quick Editing images and could use some frontend reviews. Help is also welcome to get HTML 5 video and audio playback functionality directly from file field formatters as well as to get camera capture functionality on image fields.

Larger media management goals in core are still to be defined. The team is meeting on that on July 27th. Follow @DrupalMedia on Twitter. Public meeting times are 2pm UTC every Wednesday on #drupal-media on IRC.

Migrate initiative

Help on any of the issues tagged Migrate critical is welcome, especially Refactoring EntityFile to use process plugins instead, which blocks several other issues.

Ongoing, check out the list of issues categorized in migrate's master spreadsheet, and follow @MigrateDrupal on Twitter. Public meeting times are alternating 1pm GMT Thursday and 1am GMT Friday every other week on Google Hangouts.

Workflow initiative

The Content Moderation module is proposed for core, building on the initiative's existing improvements to expand revision handling for content. Help with unblocking that issue is very welcome.

Other top issues are Allow removing a module's content entities prior to module uninstallation, Add archived base field to revisionable entities, and Upgrade path between revisionable / non-revisionable entities.

Ongoing, check out the high-level roadmap at https://www.drupal.org/node/2721129, and follow @drupaldeploy on Twitter. Public meeting times to be defined.

For a complete list of meeting times and places (links to Google Hangouts where needed), see the Drupal 8 core calendar.

Jul 26 2016
Jul 26

A detailed blog post by our Drupal developer about using
Drupal Composer template and Phing. It is written from
the point of view of the latest Drupal version — Drupal 8.

Every Drupal developer faces daily routine tasks, regardless of the development area, whether it is front-end, back-end, or QA. Most of us try to optimize every workflow, and there are plenty of tools to help us do that. Even simple tasks like a fresh Drupal installation or updating a local database from the database server can become a stimulus for creating tools to optimize them. With this in mind, I would like to look at two simple tools that solve the above-mentioned tasks (and many others).

Because we are mostly interested in Drupal 8, we will consider these tools from the point of view of the latest Drupal version, although both can work with Drupal 7 (for Phing, the Drupal version does not matter).

Drupal Composer template for Drupal projects

As the title suggests, we will discuss a tool built around Composer to optimize and speed up the installation and subsequent upgrades of Drupal projects, whether that involves upgrading core or installing/updating contributed modules.

The tool is available via this github link, and most of the work boils down to one command:

 composer create-project drupal-composer/drupal-project:8.x-dev some-dir --stability dev --no-interaction

where some-dir is the name of the folder that will be created for the project, and 8.x is the branch and, respectively, the Drupal version to be installed.

It is worth noting that in order to work with this tool you need Composer installed, and if it is already installed, you may need to update it to version 1.0.0 or higher (your terminal will notify you about this).

Having started your project, you can go and check out the tasks in your bug/issue tracker, as it will take some time.

When all the dependencies are loaded and the project is created, you can get to the folder from your browser and continue the standard Drupal installation process.

Let's consider the main benefits of using the Drupal Composer template.

The structure of the files:

As you can see in the image, Drupal itself (its core, modules, themes and profiles) lives in a folder one level below the root (the web folder), which makes the root files externally inaccessible. It is worth noting that your virtual host (Apache) or server block (nginx) should point at the web folder. All contributed modules, themes and profiles are by default installed in a contrib folder inside the appropriate directory (web/modules, web/themes, web/profiles), which separates them from custom code. Note that if for some reason you place custom code in the contrib folder (and you shouldn't), it will not fall under version control.
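For example, an nginx server block pointed at the web folder might look like the following sketch. The domain, paths and PHP-FPM socket are placeholders you would adapt to your environment.

```nginx
server {
    listen 80;
    server_name example.local;

    # Document root is the web folder, not the project root, so
    # composer.json, the vendor folder, etc. stay externally inaccessible.
    root /var/www/some-dir/web;
    index index.php;

    location / {
        try_files $uri /index.php?$query_string;
    }

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/var/run/php/php-fpm.sock;
    }
}
```

The Apache equivalent is simply a virtual host whose DocumentRoot points at the same web folder.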

Version control: As you can see from the .gitignore file in the project root, folders such as vendor, core, contrib and files are excluded from version control. So the repository includes only the files Composer needs, your custom code, and additional configuration files, depending on your project requirements.

Additional components by default: After installation, a developer has two very important tools, Drush and Drupal Console, which means they do not need to be installed separately on the system. They can be run from the web folder using the following commands:

 ../vendor/bin/drush some-command
../vendor/bin/drupal some-command

Pre/post install/update scripts: The scripts/composer folder contains a PHP file with a ScriptHandler class and standard methods. If your project requires additional checks, file creation, etc., you can add a method to this class and register a callback to it in the scripts section of the composer.json file. That section contains examples of method callbacks run during the pre/post install/update events. You can also register your own commands or shell scripts in the scripts section, which helps automate processes further.
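As a rough example, the scripts section of the template's composer.json looks something like this. The exact class names may differ between template versions, and the extra shell-script entry is a hypothetical custom addition of the kind described above.

```json
{
  "scripts": {
    "pre-install-cmd": [
      "DrupalProject\\composer\\ScriptHandler::checkComposerVersion"
    ],
    "pre-update-cmd": [
      "DrupalProject\\composer\\ScriptHandler::checkComposerVersion"
    ],
    "post-install-cmd": [
      "DrupalProject\\composer\\ScriptHandler::createRequiredFiles",
      "sh scripts/build-assets.sh"
    ],
    "post-update-cmd": [
      "DrupalProject\\composer\\ScriptHandler::createRequiredFiles"
    ]
  }
}
```

Any method you add to ScriptHandler can be wired into these event lists in exactly the same way.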

These are the main advantages of using the Drupal Composer template. It's hard to disagree that this approach cuts the time spent deploying a clean project and supporting it; building all of this on your own would take much more time.

From then on, installing modules, updating core and applying patches are Composer's job.

For example, the installation of the layout plugin contributed module looks like this:

 composer require drupal/layout_plugin:8.1.0-alpha22

And Drupal core update looks like this:

 composer update drupal/core

Please note that the modules will be installed from the package repository specified in the composer.json file (at the moment this is https://packagist.drupal-composer.org, but it is deprecated and will later be replaced by the official package repository on drupal.org). Note also that the module version specification differs slightly from the version shown on drupal.org: a version like 8.x-1.0-alpha22 will not be accepted by Composer, because it uses the more precise versioning form 8.1.0-alpha22.
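That package repository is declared in the repositories section of composer.json; at the time of writing it looks roughly like this:

```json
{
  "repositories": [
    {
      "type": "composer",
      "url": "https://packagist.drupal-composer.org"
    }
  ]
}
```

When the official drupal.org package endpoint replaces it, only this url entry needs to change.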

Phing as a build tool for Drupal projects

Now it's time to talk about the tasks we face much more frequently than downloading and installing a fresh Drupal project.

Suppose that we have a Drupal 8 build which uses the Drupal Composer template mentioned above, and that the project is currently under active development. How do you update the local site instance to the state of the repository (version control) where the project lives? With a set of special commands:

 git pull origin master // let’s take the simplest option without inventing more repositories and branches
composer install // perhaps someone has added a new module, and you need to install it
drush config-import // new configs to be imported into your database came with the pull
drush updb // run database updates
drush entup // entities updates
drush cr // keep calm and clear cache :)

We have described the simplest case, one that a Drupal developer faces several times a day during active development, but even it involves a set of six commands that you may get tired of typing. On top of that, you may also need to run Bower, migrations, unit tests, or any other tool used on the project.

Your assistant here is Phing, a build tool based on Apache Ant.

Since we are already using the Drupal Composer template and presume that everyone on the project will use Phing, let’s install it with Composer specifically for our project (the choice between a local and a global installation is up to you; this case is just an example):

 composer require --dev phing/phing:2.* // the version may vary at the time of reading

Next, the Phing folder will appear in the vendor folder of your project.

All we need to make our life easier (after all, it’s already complicated enough ;)) is an XML build file with three components:

  • Task — the code that calls a specific function (git pull, mkdir, etc.)
  • Target — a list of tasks, which may depend on another target
  • Project — the root element, consisting of targets

Based on all this, let’s write our first simple build file to optimize the six commands described above.

Let’s create a build.xml file in the project root and keep it under version control so that other developers can use it too (since Phing is installed locally for the project). Just remember that this file should not be deployed anywhere beyond the dev/stage environments, because it is not needed there.

Let’s describe the project element of the file:

 <?xml version="1.0" encoding="UTF-8"?>
<project name="Awesome project" default="build" basedir="." description="Build site">
  <!-- Targets will go here. -->
</project>

The project element has several attributes whose purpose is clear from their names, but I would like to draw your attention to the default attribute: the name of the target that will be used by default, unless a target name is specified when running Phing.

Let’s describe the target and its tasks for our case:

 <target name="build">
  <exec command="git pull" dir="." description="Fetch data from the git repository." logoutput="true"/>
  <exec command="composer install --no-interaction" dir="." description="Install missing composer packages." logoutput="true"/>
  <exec command="../vendor/bin/drush -y config-import" dir="web" description="Import drupal configuration." logoutput="true"/>
  <exec command="../vendor/bin/drush -y updb" dir="web" description="Run drupal update database hooks." logoutput="true"/>
  <exec command="../vendor/bin/drush -y entup" dir="web" description="Run drupal entity update hooks." logoutput="true"/>
  <exec command="../vendor/bin/drush -y cr" dir="web" description="Rebuild the cache." logoutput="true"/>
</target>

As you can see, the target and task syntax is fairly simple. Note that the created target has the name specified in the default attribute of the root element, which means this list of commands will run by default.

For the most part, these are the basics of working with Phing. The final build file looks like this:

 <?xml version="1.0" encoding="UTF-8"?>
<project name="Awesome project" default="build" basedir="." description="Build site">
  <target name="build">
    <exec command="git pull" dir="." description="Fetch data from the git repository." logoutput="true"/>
    <exec command="composer install --no-interaction" dir="." description="Install missing composer packages." logoutput="true"/>
    <exec command="../vendor/bin/drush -y config-import" dir="web" description="Import drupal configuration." logoutput="true"/>
    <exec command="../vendor/bin/drush -y updb" dir="web" description="Run drupal update database hooks." logoutput="true"/>
    <exec command="../vendor/bin/drush -y entup" dir="web" description="Run drupal entity update hooks." logoutput="true"/>
    <exec command="../vendor/bin/drush -y cr" dir="web" description="Rebuild the cache." logoutput="true"/>
  </target>
</project>

and here is how to run it:

 phing // from the root of the project, where the build file is

Of course, these are not all of Phing's features. The tool has its own documentation, which will help you create much more complex build files.
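For example, targets can be chained through the depends attribute. In this sketch (the target names other than build are made up), running phing full would execute the build target first and then a front-end step:

```xml
<target name="frontend">
  <exec command="bower install" dir="web/themes/custom/mytheme" logoutput="true"/>
</target>

<target name="full" depends="build,frontend" description="Full rebuild."/>
```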

As a bonus, I would like to share another target, which is useful for getting a database from dev/stage with just one command:

 <target name="sync">
  <exec command="../vendor/bin/drush -y sql-drop" dir="web" description="Drop the database." logoutput="true"/>
  <exec command="../vendor/bin/drush -y sql-sync @stage @self" dir="web" description="Sync database from stage." logoutput="true"/>
  <exec command="../vendor/bin/drush -y cr" dir="web" description="Rebuild the cache." logoutput="true"/>
  <exec command="../vendor/bin/drush -y cim" dir="web" description="Import drupal configuration." logoutput="true"/>
  <exec command="../vendor/bin/drush -y updb" dir="web" description="Run drupal update database hooks." logoutput="true"/>
  <exec command="../vendor/bin/drush -y entup" dir="web" description="Run drupal entity update hooks." logoutput="true"/>
</target>

By placing this target in your project's build file, you can easily synchronize with dev/stage using just one command:

 phing sync

where sync is the name of the created target.

Also, running Phing builds can be combined with the Composer pre/post install/update hooks described in the Drupal Composer section.
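For example, a hypothetical composer.json fragment (assuming Phing was installed with --dev as shown earlier) could run the default build target after every composer install:

```json
{
  "scripts": {
    "post-install-cmd": [
      "vendor/bin/phing"
    ]
  }
}
```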

I hope this information was helpful and will reduce the time you spend on some common tasks.

Jul 26 2016
Jul 26

In my previous blog post, I talked about six design alternatives to avoid slideshows. The response to that blog post was great - who knew there were so many kindred spirits who dislike slideshows? From the feedback I received, the number one question was why are slideshows so bad in the first place? Hopefully this companion blog post will give you that deeper understanding of some reasons not to use a slideshow and maybe help convince your next client that slideshows are a thing of the past.

Why do people still use slideshows?

The hero/banner section is arguably the most important region of real-estate on the homepage of your website. It is a place where your site goals are displayed - whether that is promoting a specific event, convincing users to buy your product, or listing your mission statement. It should be put to best use. So why do most sites clutter it with ineffective slideshows?

  • Slideshows are the norm - politics, marketing trends, etc. cater to the misconception
  • Clients still believe in the ‘above the fold’ mentality - insisting that the most important content belongs at the top of the page. This is true of newspapers, but not always the case in website design.
  • Slideshows display a lot of content - clients use a carousel to get as much content on the screen at one time.
  • Slideshows are “cool” - don’t underestimate the draw of flashy visual eye-candy.

So Why are Slideshows Bad?

My original blog post focused on the design/themer aspects of slideshows. The research that supported those ideas came from the many other blog posts on the subject. Try Googling “Are Website Slideshows Bad” and you will get at least 3,670,000 results. Obviously, I can’t read and give overviews of all the great blog posts about slideshows, but below are some of the main arguments I saw and links to specific blog posts that support each point, for further reading.

  • Slideshows are not effective - The blog post by Erik Runyon supports the idea that having more than one slide is pointless. Studies have shown that people look at and take action only on the first slide. If you do want them to look at more than one slide, make the first slide interesting or useful. The first slide has to sell the next slide to the user.

  • Slideshows can have poor accessibility - most slideshows are lacking in their support for users with accessibility issues, including users with language or motor skill issues. According to the w3.org, there are four main concepts to make a slideshow more accessible:
     
    • Structure: The carousel as a whole as well as individual slides should have structural markup (code) that enables users to establish where they are;
    • Controls: User interaction to change the display must be possible by both keyboard and mouse, as well as being identifiable, both visually and to people who can’t see them;
    • Action: When a control is activated the visually rendered effect should be replicated in actual content and functionality;
    • Scrolling: If the carousel automatically changes slides, a mechanism must be provided to pause or stop the movement.
       
  • Slideshows are a blindspot - multiple eye tracking tests show that slideshows get little attention by site users. Users just ‘gloss-over’ these very important sections of your site. James Royal-Lawson argues that “banner attention and retention is a secondary task for our brains, so even having a slider containing a series of branding images and messages might not be anywhere near as effective as you think.”

  • Slideshows can distract or induce user apathy - According to the blog post by Peep Laja, “Our brains have 3 layers, the oldest part is the one we share even with reptiles. It’s mostly concerned about survival. A sudden change on the horizon could be a matter of life and death. Hence human eye reacts to movement – including constantly moving image sliders and carousels.” Having constant stimulation from slideshows distracts a user from a site’s important content.
  • Slideshows will not increase conversion rates - In theory, a slideshow should entice a user to take an action or otherwise become informed about a site goal or mission, but studies show that slideshows can actually decrease conversion rates due to frustration of use. Fahad Muhammad argues that “Marketers put image sliders on their pages because they give them a chance to feature multiple offers at the same time. And this is a serious problem. They divide the most important real estate of their website between offers. So what happens? Nobody goes home happy. You don’t know how to persuade your customer, so they get decision fatigue and don’t make a decision. You failed to solve their problem.”

  • Slideshows can be bad for SEO/UX - improper header tags, slow page load due to high bandwidth images or videos, lack of alternative image tags, etc., can have a negative impact on your site’s SEO/UX . Harrison Jones’s blog post states that, “As with any website, the more you complicate and add things, the slower the page loading speed. I came across a few sites featuring full-width carousels packed with high resolution images, which greatly impacted the page load speed. Every second it takes to load a page past two seconds hurts the user experience, and has an impact on search performance.”

  • Slideshows on mobile devices can be tricky - slideshows do not always work well on mobile devices and they can even slow down your site due to the amount of bandwidth they use. This can result in lower SEO rankings and poor user experience. In the blog post, Kyle Peatt reminds us to think differently about slideshows - “Don’t use a carousel just to get additional content on the screen. Think of carousels for one particular use case: providing additional content within a specific context. Use a carousel when vertical space is limited — as it is on mobile — and when the content is directly related — especially if the content isn’t useful to the user.”


Don’t Believe the Research - Take the Slideshow Challenge

Although I will admit that, as a former scientist, I am dissatisfied with the lack of hard, recent empirical data to support the good vs. bad argument about website slideshows, the limited data that is out there is compelling. It would be amazing to find even more studies on the effectiveness of website slideshows. If you have any links, please add them to the comments below.

In the meantime, Brad Frost encourages you to Take the Slideshow Challenge and make your own conclusions about using a slideshow on your own site.

As a Reminder, If You Must Use a Slideshow...

If you simply cannot convince your client to use an alternative to a slideshow, at least use a slideshow that is accessibility/UX focused. Here are some ways to make your slideshow more accessible and user friendly:

  • show the first slide by default and allow a user to navigate through the rest of the slides manually (not auto-rotating)
  • limit the number of slides and make sure the load time is fast
  • create navigation buttons that are highly visible and large enough to be useful on all devices
  • include all the controls available (next, previous, stop/pause, play, etc.) and make sure you can use the controls with a mouse, keyboard, and by touch
  • provide alternative ways to access the content (ex. text transcripts)

By providing accessible and user focused slideshows, we enable more users to access the important content of the site, thereby enhancing the overall user experience.

Additional Resources
6 Design Alternatives to Avoid Slideshows | Blog Post
Friday 5: 5 Problem Areas in Accessibility | Video
Accessibility Best Practices for Content Editors | eBook

Jul 26 2016
Jul 26
Access Readme Files in Drupal 8

In this tutorial we will add a module that makes site maintainers' lives easier.

With Drupal 8 you are encouraged to set up sites using Composer, Drupal Console and Drush, because this is a faster and more effective way of adding components to your site. However, with this workflow you can't easily open the README file to read information about a module.

We will show you how to use the README module to access readme files directly from the Drupal 8 admin area.

  • Download, install and enable the README module.
  • Now return to the Extend page. Then expand the details of the module to see the Readme button next to Configure for the Readme module.


  • Selecting the Readme option will open up the readme file within Drupal.


If you want to allow external access to the file you can go into configuration and set a security token to enable this feature.

All modules should contain a README.txt or README.md file. If you find yourself using a module that does not have one, I am sure that if you put in a request, the maintainers will add it to the module.


About the author

Daniel is a web designer from the UK, and a friendly and helpful part of the support team here at OSTraining.


Jul 26 2016
Jul 26
TL;DR In the past two weeks I worked on using the Image Properties feature offered by the Google Cloud Vision API to group image files together on the basis of the dominant color components filling them. In addition, I worked on detecting the contents of image files and filling in the Alternate Text field based on the results of Label/Landmark/Logo/Optical Character Detection, depending on the demand of the end user. This week, I worked on developing tests to ensure that similar images are grouped in accordance with the Image Properties feature of the Vision API.

At present, the Google Vision API module supports the Label Detection feature to be used as taxonomy terms, the Safe Search Detection feature to avoid displaying any explicit contents or violence and the User Emotion detection to detect the emotions of the users in their profile pictures and notify them about it.

I had worked on grouping the images on the basis of the dominant color component(Red, Green or Blue) which they are comprised of. I got the code reviewed by my mentors, and they approved it with minor suggestions on injecting the constructors wherever possible. Following their suggestions, I injected the Connection object instead of accessing the database via \Drupal::database().

After making changes as per the suggestions, I started developing simple web tests for this feature, to ensure that similar images get displayed under the SimilarContents tab. This requires the creation of a new taxonomy vocabulary and the addition of an entity reference field to the image file entity. After creating the new vocabulary and adding the new field to the image file, I created image files using the images available in simpletest, which can be accessed through drupalGetTestFiles(). The first test ensures that if the vocabulary named ‘Dominant Color’ is selected, the similar images get displayed under the file/{file_id}/similarcontent link.

The grouping, however, requires call to the Google Cloud Vision API, thus inducing dependency on the API key. To remove the dependency, I mocked the function in the test module, returning the custom data to implement the grouping.

To cover the negative case, i.e. when the Dominant Color option is not selected, I developed another test which creates a demo vocabulary that simply stores the labels instead of the dominant color component. In this case, the file/{file_id}/similarcontent link displays the message “No items found”.

I have posted the patch covering the suggestions and tests in the issue queue to be reviewed by my mentors. Once they review it, I will work on it further if required.
Jul 26 2016
Jul 26

The content_type ctools plugin is the most used type of ctools plugin in Drupal 7. It allows us to quickly build complex (and configurable) components that can be used in the Panels interface. They are quick to set up, the easiest start being the definition of the $plugin array and the implementation of the plugin render function in a .inc file. Have you ever wondered though what the $subtype parameter of this render function is and what it serves?

Most of the time our content_type plugins only have one type so the $subtype argument is the same name as the plugin (and file name). However, it's possible to have multiple subtypes that have slight (but critical) differences while sharing common functionalities. Not many people are familiar with that. Intrigued? Let's see how they work.

When content_type plugins are processed (loaded, prepared for use and cached), ctools asks them whether they define any subtypes or are single. By default the latter is true, but in order to define variations we can either populate an array of subtype definitions in the main $plugin array or implement a function with a specific naming convention: module_name_plugin_name_content_type_content_types. This callback then needs to return the plugin information for all the subtypes of this plugin.

But since it's easier to show than explain, let's take a look at an example. Imagine you need a simple content_type plugin that outputs data which depends on a certain ctools context. You can define your plugin as such:

$plugin = array(
  'title' => t('My plugin'),
  'description' => t('My plugin description'),
  'category' => t('My category'),
  'required context' => new ctools_context_required(t('Node'), 'node'),
);

This is a simple example of a plugin that depends on the Node context. But what if you want it to depend on the Node context OR the current User context? In other words, it should work on the node_view page manager template or the user_view one. Or whatever page these contexts are on but nowhere else.

Instead of required context you could use 'all contexts' => TRUE. But this would then pass all the available contexts to your render function, which is neither elegant nor a statement of dependency on one of those two contexts. In other words, the plugin would be available on all Page Manager pages but might not do anything on most of them, and it would be up to the render function to handle the extra logic of checking the contexts.

This is where plugin subtypes come to help out. Since your render function does the exact same regardless of context (or very similar), you can have a subtype for each. So let's see how that's done.

First, we simplify the main plugin array:

$plugin = array(
  'title' => t('My plugin'),
  'description' => t('My plugin description'),
  'category' => t('My category'),
);

Then we implement the function that returns the subtypes (following this naming convention):

function my_module_my_plugin_content_type_content_types() {
  return array(
    'node' => array(
      'title' => 'My plugin for nodes',
      'required context' => new ctools_context_required(t('Node'), 'node'),
    ),
    'user' => array(
      'title' => 'My plugin for users',
      'required context' => new ctools_context_required(t('User'), 'user'),
    ),
  );
}

The subtype machine name is the key in the array and the rest is regular plugin definition as we are used to. In our case we define two, each for their respective dependencies. And with this in place we achieve a number of things.

First, we can add the My plugin for nodes content_type plugin whenever the Node context is available and the My plugin for users when the User context is present. They cannot be used in other cases. Second, we ensure that whatever context is passed to the render function is either a Node or a User (nothing else). This can come in really handy when your context is custom and wraps an object that implements a common interface. Third, the $subtype argument to the render function now will be either node or user which is helpful to maybe slightly fork the functionality depending on the subtype.
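To illustrate that third point, a render callback could fork on $subtype roughly like this (the function name follows the standard ctools naming convention for our example plugin; the body is a hypothetical sketch, not the only way to do it):

```php
<?php

/**
 * Render callback for the "my_plugin" content_type plugin.
 *
 * $subtype is either 'node' or 'user', matching the keys returned
 * by my_module_my_plugin_content_type_content_types().
 */
function my_module_my_plugin_content_type_render($subtype, $conf, $panel_args, $context) {
  $block = new stdClass();

  // $context->data is guaranteed to be a node or a user by the
  // required contexts declared on each subtype.
  if ($subtype === 'node') {
    $block->title = t('My plugin for nodes');
    $block->content = check_plain($context->data->title);
  }
  else {
    $block->title = t('My plugin for users');
    $block->content = check_plain($context->data->name);
  }

  return $block;
}
```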

Clear the caches and give it a go. Let me know how it works out.

Jul 26 2016
Jul 26

I started the week by testing the Field Encrypt module with my project, Pubkey Encrypt. Pubkey Encrypt provides support for encrypting data with users' login credentials by generating Encryption Profiles, and Field Encrypt provides support for encrypting field values using any specified Encryption Profile. In this way, the two modules are expected to work together in harmony. I tested a lot, and the modules seemed to get along perfectly fine with each other.

When we were in the planning phase for Pubkey Encrypt, Field Encrypt had an issue where decrypted field data sometimes got cached when it was not supposed to. Due to the presence of three cache systems in Drupal 8 (static cache, persistent cache and render cache), the issue needed to be handled with great care. So I committed in my GSoC proposal to dedicating the last two weeks to fixing this issue. Interestingly, though, it has already been fixed, and there is now a checkbox in the Field Encrypt settings via which a user can set a field as uncacheable.

So I spent quite some time studying the architecture of Field Encrypt and exploring its codebase, so to learn how the maintainers of the module got rid of this cache-related issue. Turns out, whenever a field is requested to be set as uncacheable, the module simply marks the corresponding entity type as uncacheable via this code block in hook_entity_type_alter():

foreach ($uncacheable_types as $uncacheable_type) {
  $entity_types[$uncacheable_type]->set('static_cache', FALSE);
  $entity_types[$uncacheable_type]->set('render_cache', FALSE);
  $entity_types[$uncacheable_type]->set('persistent_cache', FALSE);
}

After I had explored much of the Field Encrypt module, I thought I was in a good position to help in its issue queue, so I tried to fix these two issues:

In my weekly meeting with mentors Adam Bergstein (@nerdstein) and Colan Schwartz (@colan) , we discussed:

  • the GSoC project submission guidelines; we’ve decided to create a separate branch in Pubkey Encrypt github repo with all the commits made during the 3-months of GSoC coding period. We think a link to that branch would meet Google Work Product Submission Guidelines and would make it easy for anyone to figure out the work I’ve done as a GSoC participant.
  • the scenario of an unprivileged user trying to access the Role key value; we’ve decided to throw an error message instead of an exception so as to ensure a graceful shutdown of the encryption/decryption mechanism instead of a complete system halt.
  • the scenario of a user trying to change his login credentials without providing the existing ones; we’ve decided not to allow a user to do this, via a custom form validator for user_form. This means that the password-reset functionality, and other similar features, won’t work on a website with Pubkey Encrypt enabled.
  • the possibility of a feature to ask users, via email, to perform the one-time login when Pubkey Encrypt gets initialized. We think this feature would make it easy for others to use the module. I’d work on it in next week. For now, I’ve created an issue ticket to formally capture this need.

Next I worked on fine-tuning the module documentation; I updated the Architecture document to reflect the latest status of the module, I added a User-stories document to provide step-by-step instructions for using the module with Field Encrypt and I wrote a README file to get unfamiliar users acquainted with the module. I’ve also tried to use simple phrases and real-life examples instead of the technical jargon whenever possible but especially in the README file. One of the results of that effort is that the module’s description has now been changed from a much confusing phrase “Adds support for Credentials-based Public key Encryption support into Field Encrypt” to a relatively simple one “Provides support for encrypting data with users login-credentials”.

I then bundled the default plugins provided by the modules as submodules within the modules. So, for example, now we have pubkey_encrypt_openssl and pubkey_encrypt_phpseclib modules within the pubkey_encrypt module; the former provides an OpenSSL-based Asymmetric Keys Generator plugin implementation while the latter provides a PHPSecLib-based one. In both the modules, hook_requirements() ensures the presence of corresponding external dependencies.

I then worked on an overview page for Pubkey Encrypt. The module generates Encryption Profiles for each role on the website, but an Encryption Profile for any role should only be used once all users from the corresponding role have performed the one-time login; otherwise, the security mechanism provided by Pubkey Encrypt won’t work to its full potential. Previously, there was no way of knowing which Encryption Profiles generated by Pubkey Encrypt were ready to use and which were not. Now, there is this overview page:

Then I worked on an experimental feature for the module which involves using cookies for temporarily storing the Private key of any logged-in user. Actually whenever a user logs in, Pubkey Encrypt uses the user’s login-credentials to decrypt his Private key and then temporarily stores the decrypted Private key in a session. We are currently using sessions because ownCloud Data Encryption Model, on which this module is based, uses sessions. But there’s this idea of shifting to cookies, though I have yet to discuss it in detail with my mentors. Even though we have not made any final decision yet, I have still started working on this feature. Because if we do decide to use it, it’d be already there in an experimental branch. And if we decide not to use it, still I’d have learned how to use cookies in a Drupal 8 website.

Since cookies need to be set in HTTP headers, I cannot simply call the set_cookie() method in a user_login() hook. And because hook_init() isn’t present in Drupal 8, I cannot use it for setting response headers either. After much exploration, I finally did it by creating an event subscriber to KernelEvents::RESPONSE and calling $event->getResponse()->headers->setCookie($cookie) in the corresponding event callback function. As expected, the Pubkey Encrypt encryption mechanism works perfectly fine with cookies too, though the tests are breaking and I have yet to fix them.
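The subscriber described above might be sketched like this (the class name, cookie name and the getDecryptedPrivateKey() helper are all made up for illustration; the actual implementation in Pubkey Encrypt differs):

```php
<?php

use Symfony\Component\EventDispatcher\EventSubscriberInterface;
use Symfony\Component\HttpFoundation\Cookie;
use Symfony\Component\HttpKernel\Event\FilterResponseEvent;
use Symfony\Component\HttpKernel\KernelEvents;

class PrivateKeyCookieSubscriber implements EventSubscriberInterface {

  public static function getSubscribedEvents() {
    return [KernelEvents::RESPONSE => 'onResponse'];
  }

  public function onResponse(FilterResponseEvent $event) {
    // Hypothetical helper: fetch the already-decrypted private key
    // from wherever it is held for the current user.
    $key = $this->getDecryptedPrivateKey();

    // Cookie($name, $value, $expire, $path, $domain, $secure, $httpOnly).
    $cookie = new Cookie('private_key', $key, 0, '/', NULL, TRUE, TRUE);
    $event->getResponse()->headers->setCookie($cookie);
  }

}
```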

See all the code changes I did this week here: Pubkey Encrypt Week 9 Work.

At this point, I’m pleased to announce that all the work which I planned to do in my GSoC proposal three months ago, has been done. I’m super happy about the fact that I still have 3 weeks left and I’ll try to utilize them in the best way possible.

Jul 25 2016
Jul 25

DrupalCon Dublin is right around the corner and we're proud to say that 3 sessions submitted by Appnovators have been accepted!

This will be the first European DrupalCon since the release of the Drupal 8 project this previous November. Organizers are anticipating some of the best sessions, keynotes, and birds of a feather (BOF) roundtable discussions yet. In addition to sharing the latest Drupal knowledge, the planning team is also committed to sharing more and better sessions about getting off the island.

As a team, we submitted 7 sessions for this Con, but we knew that the Program Team had their work cut out for them, with a record-breaking 621 submissions for a European Con. Congrats to all 130 selections! Check out all the sessions here.

The Community Keynote was also announced, with Eduardo García presenting: Around the Drupal World in 120+ Days. Eduardo has spent the last few months travelling around the globe, meeting many of the diverse and dynamic members of the Drupalverse. His keynote will cover 3 topics from his journey:

  • Link with the community - community engagement now, and how we can do better
  • Language barriers - making introductions to Drupal multilingual and accessible for people around the world
  • Being a "knowmad" - creating connections for people who travel to different Drupal communities

Here are the 3 sessions we'll be presenting at DrupalCon Dublin:

Track: Coding and Development

Title: Composer Based Workflows for Drupal 8

Speaker: Kevin Moll 

One of the biggest changes in "Getting off the Island" with Drupal 8 is the adoption of Composer. Composer lets us easily manage dependencies and pull useful functionality from other parts of the PHP community into Drupal. It's now used not only to pull in outside libraries, but also to pull in all your Drupal modules.

This session isn't just for big companies or expert Composer users. Everyone from small to large companies, and from beginners to experts, will learn how to leverage Composer and build a Drupal 8 workflow to easily manage and deploy code, whether on a single site or hundreds of sites.

Track: Core Conversations

Title: Workflow Initiative

Speaker: Tim Millwood

Announced at DrupalCon New Orleans by Dries, the Workflow Initiative is a planned Drupal 8 initiative to bring better content workflow tools into Drupal 8.2.0 and beyond.

At DrupalCon Dublin we will be days away from the 8.2.0 core release. This session will look at what we've been able to get into that release and how it plays with the contrib modules in this space.

Track: Site Building

Title: Enterprise Level Content Staging with Deploy

Speaker: Tim Millwood

Through Drupal 6, Drupal 7, and now Drupal 8, the Deploy module has been the best way to stage content between different environments.

Many of the underlying elements of Deploy are moving into core as part of the Workflow Initiative. This session won't delve into that; instead, it will look, from a site builder's point of view, at how to configure and use these tools.


Appnovation is proud to be a Silver Sponsor of DrupalCon Europe, taking place September 26-30, 2016 at The Convention Centre Dublin, Dublin, Ireland.

For more information and to register, click here. 

Jul 25 2016
Jul 25

by David Snopek on July 25, 2016 - 12:02pm

Just finished a big project for client? Awesome!

Did you sell them a support and maintenance plan for their new site?

No? Well, I'm sorry to tell you: YOU'RE DOING IT WRONG!

But you wouldn't be the only one!

The vast majority of Drupal shops and freelancers build sites and move on without offering a support and maintenance plan, figuring that if the client has any problems they can just bill them at their hourly rate.

However, you're missing out on several advantages - read more to find out what they are!

Setting client expectations

We all know that a website is never finished. It requires constant maintenance: security updates, bug fixes, changes for the latest SEO and mobile trends (ex. responsive design, AMP), and so on.

Well, we all know that, but our clients might not!

For most clients, when the project is done - the website is "done."

When they find a bug or need an update, they're left wondering, "Didn't I already pay for this?"

Yes, they'll probably come back to you and pay your hourly rate, but estimating the scope of this mini-project and having to sell it to them is an unnecessary source of tension for both sides.

But if you sell them a support and maintenance plan along with the initial project, you help to set their expectations for life after the project is over. They'll understand from the very beginning that the website needs constant maintenance, and they'll have a framework for receiving it.

Get future work!

How many times have you gotten a project to build a new website for a client who hates their old, buggy website?

Have you ever wondered why they didn't go back to the person or company who built that old site and had those bugs fixed years ago?

Well, if you didn't sell your clients a support and maintenance plan, then YOU were probably the person who built the "old buggy site" in a couple of cases - and you didn't even know it!

Any unmaintained site is going to get buggy. Once people start to be annoyed by or even despise their site, they'll distrust the work that was done on it originally. So, when they reach their breaking point, they'll go to someone new to build a new site rather than return to you.

But if you have a history of fixing their problems and answering their support questions on a regular basis... First of all, the site won't become an unmaintained mess... But also, you'll maintain a long-term positive relationship with the client and when they need a new site or feature, you'll be at the top of their minds!

And it's easier and cheaper to get a new project from an old client than to chase new clients.

Recurring revenue

One of the hardest parts of doing project work is managing the feast and famine cycle.

You have times when all your potential clients finally sign the contract on the same day and you're struggling to figure out how you're going to do it all at once.

And you have the times (frequently just before that happens :-)), when you don't have enough work to keep you busy and you're hustling like crazy to find new clients!

Support and maintenance work can help to fill the gaps and stabilize your business.

Look out for your clients' best interests

There are a number of things that your clients need - I mean really NEED - but they might not know or understand that they need them.

For example: security updates.

Your clients care that their customers can buy their products or that the contact page works. Security is something abstract: sure, they know they need it, but they might not understand that security is an ongoing process and not a box you tick once and are done with.

Providing your client with a support and maintenance plan allows you to stay on top of things like security updates for your clients. While they might never fully understand the value in that, it is in their best interest, and they'll feel the effect (if only passively) by not having the stress of their site getting hacked.

This is something you need to do to look out for your client's best interests - and not just "sell and run" :-)

Don't have the extra time? Outsource it!

I know - you have enough to do already! You've got a big project in the works and you can't drop everything every time a previous client's site has a critical problem or even when it needs a minor update.

Well, you can get all the advantages discussed above by outsourcing your support and maintenance to a partner company (like myDropWizard)!

Contact us today about white-label Support and Maintenance for your clients!

... or learn more about how our white-label service works!

Jul 25 2016
Jul 25

Thank you to everyone who participated in our 2016 certificate campaign. We sent a personalized certificate to everyone who joined or renewed their Association membership. We also asked members to help boost our outreach by encouraging others to join or renew. These contributions matter. The funds you helped raise support the Association’s work, and your goodwill inspires us. Whether you became a member for the first time, renewed, or took the time to share, you made the campaign a success.

Success feels great

From May 1 - June 30, 335 people became new members and 476 members renewed. For comparison, those numbers were 233 and 378 during that period last year. This campaign brought our total membership to 3,670 individuals and organizations. That’s a 12% increase over this time last year (from 3,290). Our certificate goal was to deliver 675 by the end of the campaign. But you helped us crush it. We delivered 854, exceeding that goal by 27%.

Lessons learned

This year, we created a landing page on assoc.drupal.org and promoted it via blog post, social media, and newsletters. One month into the campaign, we launched a new banner on drupal.org, and sent an email to members, asking for help sharing the campaign. From the attribution provided by members on the sign-up form, most learned about membership via drupal.org or through a community member/organization. Therefore, the banner/landing page and direct message to our members were more effective than our social media channels and newsletters.

About campaign components

Last year, we used social sharing, a blog post, and newsletters. We also added new content to our existing membership and contribution pages. We didn’t create a landing page. The results? There were 885 tracked pageviews of the campaign-related blog post (during the campaign period) and we delivered 611 certificates.

This year, we did a little more to test whether a banner on drupal.org could make a difference. It definitely did.

We launched with a blog post on May 2, but this time we added a landing page. When the campaign ended, we’d had 1,025 tracked pageviews on the blog post (a 15.8% increase from last year). However, we didn't see a jump in membership sales (296 total) or much traffic to the landing page. On June 1, we added a banner to some drupal.org pages. That’s when it got interesting.

Traffic came from drupal.org, not assoc.drupal.org

The landing page we launched in May had 16,768 tracked pageviews during the full campaign period, but 98% of them (16,410) came after the banner was launched on drupal.org. June had 517 membership sales, and 50% of those were new members (up from 34% new members in June 2015).

This screenshot shows traffic to the landing page before and after the banner launch.

[Screenshot: Google Analytics shows a traffic spike and a sustained high level after the launch of the banner.]

Digging deeper into the data, we looked at what members wrote when asked how they found out about membership. New members told us it was via drupal.org (54.8%) or thanks to a community member (19.3%). These percentages were even higher than when looking at total members from the campaign period. If we want to increase overall membership, having the landing page and banner combination is the way to go.

[Pie chart: 44.6% of members from the campaign period report drupal.org, 16.4% report a community member, and 18% report DrupalCon as their source.]

[Pie chart: 54.8% of new members from the campaign period report drupal.org, 19.3% report a community member, and 10.7% report DrupalCon as their source.]

Compared to the 2015 campaign’s data, there were 123% more responses driven by evangelism, and 108% more mentions of drupal.org as the start of a member’s user journey.

You love selfies as much as we do

Thanks for getting in front of the camera! It came as no surprise that so many of you responded to our call for selfies. Our community is full of caring members who love to share. Not only did this make for a fun time, but it helped show the people behind Drupal.

What’s next?

A note about content: regrettably, we showed the same banner to all visitors, and its language caused some confusion about what members could do to help. We'll be mindful of that for future editions.

In the meantime, you can still help continue the momentum of this campaign. Reach out to us. Tell us why you’re a member. Share why you’re a member of the Drupal Association when you renew your membership—or anytime, really. No matter where you share, you help us help the community, and we all make a difference for each other and for Drupal.

Jul 25 2016
Jul 25

Yesterday all the accepted sessions for DrupalCon Dublin were announced, and we are delighted to report that 5 of our 8 session proposals were accepted! Only Acquia received more acceptances, so we are extremely proud of our achievement.

A testament to our high standing in the Drupal community: we are the only Irish company speaking at DrupalCon Dublin. Our accepted sessions this year span a number of different tracks, namely Business, Horizons, Site Building, Being Human and Core Conversations, and cover topics from accessibility to remote working to building mobile apps with the Ionic framework. Congratulations to all our speakers!

Here's a quick run down of each session.

Building a co-lingual website - lessons learned from ireland.ie

Speaker: Alan Burke
Track: Site Building

2016 marks the centenary of the 1916 Rising in Dublin, a pivotal event in Irish history, and is being commemorated with a series of high-profile events. ireland.ie is the official state website for the 1916 commemoration and runs on Drupal 7.

While English is the main language in Ireland, Irish is the first official language. A decision was taken to present both languages side by side wherever possible for the 1916 commemorations, including on the website. This session will focus on the unusual co-lingual (two languages side by side) approach, and how Drupal made it possible.

Choosing Drupal - insider advice from an Irish multinational

Speaker: Alan Burke & Aisling Furlong from Glanbia
Track: Business

Struggling to sell Drupal to clients? Ever wondered what goes into the decision making process when choosing a CMS?
In 2014, Glanbia selected Drupal as the CMS of choice for marketing sites. This session will outline the decision-making process used, and what Drupal agencies can learn when pitching Drupal. This is a joint session proposal between Annertech and Glanbia.

Bridging the PhoneGap: Getting Started Creating Hybrid Mobile Apps with Drupal and Ionic Framework

Speaker: Mark Conroy
Track: Horizons

With the advent of hybrid mobile apps, you can continue being a Drupal frontend developer and also build apps without needing to learn new technologies. The mobile web is quickly catching up with native apps. The mobile web is free, and open, and available to all of us right now and doesn't bind us to proprietary systems. With the many advances being made in this area, we can create great mobile experiences for users.

Future Directions for Drupal Accessibility

Speaker: Andrew Macpherson
Track: Core Conversations

Drupal has made great advances in accessibility over several major releases, and nowadays ranks as a leading implementation of web accessibility standards.  This session will encourage contributors to look ahead at future challenges and opportunities for accessibility during the faster 8.x (and 9.x) release cycle. 

Happiness is... remote working

Speaker: Anthony Lindsay
Track: Being Human

Many Drupal agencies have remote workers. Some are entirely distributed. Whilst remote working is beneficial to all concerned in so many ways, it does come with its own challenges. This talk will cover the journey I took when I moved from a typical 9-5 office job to Annertech, an entirely distributed Drupal agency. It will highlight the challenges I found (the good, the bad, the funny and the downright surprising) and offer my experiences as examples of how to stay happy and healthy in what has the potential to be an isolating environment.

Congratulations to Alan, Anthony, Andrew and Mark on their great achievement. We look forward to seeing these and all the other great sessions at DrupalCon Dublin in September. Hope to see you there!

Jul 25 2016
Jul 25

More often than not, the projects I work on with Appnovation involve some degree of updating, redesigning and modernizing an existing web presence, whether that means refreshing a dated website, remodelling a business's web presence in line with its evolution, or decommissioning legacy data services. The target result is something new and shiny. Additionally, with the dawn of Drupal 8, we now see a lot more requests to port sites onto the latest and greatest version of Drupal.

Ideally this task would be simple if the existing system had been built in a sensible way, with a clear separation between content and presentation, a clear information architecture and a minimal dependency tree. However, our ideals are never really reality, and in my experience redesigns are never just redesigns in the sense that a replacement theme or a few CSS updates would suffice.

When our clients consider a significant change to their website, like a re-branding refresh, it's often a good opportunity to think about their audience's needs, goals and motivations and reflect those back into their content and business models, so as to improve not just the look of the site but also its positive impact in whatever measure they wish to monitor. It's also a chance to consider some of the benefits the new version of Drupal brings that may have been difficult to implement, or never even considered, in previous versions.

With that in mind, a ‘redesign’ is now much more than just updating or replacing a theme and moving to Drupal 8. With a little luck (and for those non-Appnovation readers), your sales team will have correctly steered the client away from thinking it's just re-skinning: a quick and simple job with a fast turnaround. When inheriting a redesign, it's always wise to look under the hood as early as possible. This ensures that any nasty mess of hideous hacks and unstable code lurking beneath the new paint job is highlighted early in the client/vendor relationship, that the change of look and feel is properly understood, scoped and sized, and that all parties are crystal clear on the challenges ahead.

To add a little context on the rationale behind a redesign: these projects are often coupled to a wider re-branding effort, with timescales and expensive deadlines that are almost always agreed upon well in advance of any real knowledge of scale. So heed this warning: it's not unlikely that a redesign project will find itself behind schedule or over budget, usually because the schedule was guessed at, and agreed by non-technical sources, long before the size and scope of the redesign were truly known. In this situation, the perceived agile wisdom is that time and resources are fixed, so the scope will need to be changed or reduced. But what about an MVP (minimum viable product) for a redesign? When you’ve got an existing product, how much of it do you need to rework before you put the new design live?

Let's consider a few variables, thinking purely of a Drupal-oriented site: How dated is the existing site? Is it responsive? What’s the underlying platform version powering it? Are the contrib modules available for direct replacement? These all sit on top of the foreseen redesign scope, and assume the proper UX and visual design streams have both spun up and are at an advanced stage.

With interesting challenges, a lot of varying answers and paths present themselves. You could begin by sizing the impact of each request and weighting them by importance, so the most important tasks get focus first. For delivery to be efficient, you'll need a good feel for the current burn down or, in the case of a newly formed team, their potential burn down (this is where having experienced resources in place helps, as they can usually provide a pretty accurate estimate range). That way you can be confident about what can be achieved in a given time frame.

Next come braver decisions, once the fantasy deadline is known to be a distant dream. What features can be shelved or dropped? What value do the new features bring that makes them worth keeping? In other words, a redesign can (and should) be an opportunity for a business to look at its content strategy and consider rationalising the site. If you’ve got a section that isn’t adding any value or getting any traffic, and the development team will need to spend time making it work in the new design, perhaps that’s a candidate for the chop?

Let's also consider prioritizing the elements that will get us most of the way to completion. This fits with the Agile development principle that each sprint should deliver shippable, potentially production-ready features. I interpret this to mean that we should first make sure the site as a whole doesn’t look broken, and then layer on the bells and whistles afterwards, similar to the progressive enhancement approach for legacy browsers. If you're not sure whether you’ll have time to get everything done, don’t spend excessive time perfecting one section of the site to the detriment of the basic layout and styling that will make the whole site look reasonably good.

Try starting with a simple set of styles that covers all common HTML elements, then apply these to known common components (like forms, navigation, etc.). This becomes the foundation on which to build the rest of the site, and ensures that all the components making up a page are presentable and uniform. You can even test these components against their real contexts of use, user motivations and the persona-driven user flows arising from the UX sessions. By this point, the content audit should have highlighted the depth, variety and number of different layouts the site will accommodate, so you should be able to validate whether these look good and are presentable enough to ship.

There is another option, though; it's slightly more radical, but it can work for the largest of redesign projects. Ask: do you have to deliver all the changes at once? Can you (and should you) do a partial go-live with a redesign? (This depends on how significant the change is, attitudes towards it, the development team's capability for continuous delivery and, of course, the client’s appetite.) Also consider the technology stack involved; it may make sense to deliver changes incrementally. In other words, put new sections of the site live as they’re ready, and keep serving the more important old parts from the existing, soon-to-be-legacy system. This may introduce some inconsistency in look and feel, and a little awkwardness in user flows through the site, but if there are obvious standalone sections that users have been identified to stay exclusively within, chances are they won't even hit the new stuff anyway.

To close… We shouldn’t fall into the trap of thinking of redesigns as big-bang events that sit outside the day-to-day running of a site. Along with software upgrades, redesigns should be considered as part of a business’ long-term strategy, and they should be just one part of a plan to keep making improvements through continuous delivery.

Jul 25 2016
Jul 25

Our existing users may have already noticed a few changes and improvements in Drop Guard. However, not everything is visible enough, so we decided to make a short list of the recent updates.

Composer support

Drop Guard is now capable of managing your composer.json and composer.lock files, just as you would normally do via the CLI.

When executing an update task, Drop Guard modifies composer.json to accommodate the recommended module or core version and runs the "composer update" command to keep composer.lock in sync. Both files get pushed to the repository, and the only thing you need to take care of is running "composer install" to receive the updated packages.

Both repositories, the official Drupal.org one and Drupal Packagist, are supported.

We are providing personal assistance for those having trouble with the setup or configuration. Just drop us a note and we will arrange a personal onboarding and setup session.

We encourage everyone who's in love with Composer to test this feature and give us your generous feedback. As always, we're looking for both positive and negative thoughts - don't be shy!

The Dashboard

Not long ago the only way to manage update tasks and statuses across multiple projects was to inspect each project connected to Drop Guard separately. Things have changed now - you have a sleek dashboard-like interface where tasks from all projects are collected. You can adjust the Dashboard to fit your needs by rearranging widgets or adding new ones. 

The Activities page and widget give you a bird's-eye view of everything going on under the hood: all update attempts, patch applications and status changes, arranged in chronological order.

Latest release date

A smallish but long-awaited feature (we take responsibility for the long wait): you can now see the date of the recommended release next to the project version on all pages.

Project management systems support

We have started implementing project management systems support, starting with Jira (Redmine will follow shortly, as well as other popular services). Here's a little background on how it works:

On the project edit screen, you can provide your Jira/Redmine credentials (we recommend creating a separate user with the correct permissions for this), and map Drop Guard task statuses to the corresponding statuses in your system.

Once that's done, Drop Guard will start establishing connections between its own tasks and the tasks in the project management system. All you need to do is create an Action reacting to the "New updates are available" event.

Next, when Drop Guard creates an update task, it will create the corresponding task in your system, listing the modules, their versions and everything else you'd like included in such a task.

Both tasks will be kept in sync, so once all tests and checks for an update are performed and you close the project management task (or give it another status), the corresponding Drop Guard task will also be closed (or its status changed).

It allows you to manage Drop Guard tasks from the external system without ever visiting Drop Guard itself. How cool is that!
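The status-mapping idea above can be sketched in a few lines of Python. Everything here is an assumption for illustration: the status names, the mapping and the helper functions are invented, not Drop Guard's or Jira's actual API:

```python
# Illustrative sketch of mapping task statuses between a Drop Guard-like
# tool and Jira. Status names and logic are invented for illustration.

STATUS_MAP = {
    "new": "To Do",
    "in_progress": "In Progress",
    "ready_to_test": "In Review",
    "closed": "Done",
}

# Reverse map so a transition made in Jira can be reflected back.
REVERSE_MAP = {jira: dg for dg, jira in STATUS_MAP.items()}

def jira_status_for(task_status):
    """Return the Jira status a task with this internal status maps to."""
    return STATUS_MAP.get(task_status)

def internal_status_for(jira_status):
    """Return the internal status implied by a Jira transition."""
    return REVERSE_MAP.get(jira_status)
```

With a mapping like this, closing the Jira issue ("Done") is enough to resolve the linked update task ("closed"), which is the workflow the feature enables.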

And as always, thanks to Drop Guard's extensible nature, you may choose to fire up the external task only when there's an error or when the updates are ready to test; just make it fit your established workflow.

Do you use another system? Send us a note about your favourite one and it may well be added soon. Can't wait to hear from you!

A lot of speed and stability improvements were made as well, so don't forget to check your existing account (or create a new one) to see them for yourself.

We have more exciting news to come. Keep your ears open!

Jul 24 2016
Jul 24

Contributing to Drupal

Drupal has a fantastic community of contributors. People help with everything, from coding tasks to checking documentation and triaging issue queues on drupal.org. With a community as large and diverse as Drupal's, it should come as little surprise that contributors range from full-time, sponsored maintainers to casual hobbyists who choose to dedicate a few hours a week to improving something they use and care about.

Getting started

For people looking to get started with contributing, it can be a bit difficult to work out where to begin. A good place to start is drupal.org's own overview on contributions. This page links through to a really helpful contributor task page which explains the many ways that the technical, and non-technical, can get started.

A common misconception is that you need to be an experienced software developer to make improvements to Drupal. Whilst there is often no shortage of technically demanding issues, there are a large number of simpler, smaller tasks which are just as important.

A very quick way of accessing this information is to (ironically, perhaps?) use the advanced search on the core Drupal issue queue. All of the tasks listed here have been tagged with ‘novice’ and should provide easy pickings for most people. For example, to find Drupal core's issue queue you would go to https://drupal.org/project/drupal, then find the 'all issues' link in the sidebar and follow the 'advanced search' link near the page title.


In my opinion, the best way to get started with contributing to Drupal is to attend a sprint day or local developer event, and find a friendly mentor who can offer you some support to avoid the dreaded feeling of getting stuck, bored and disheartened. 

If you're a developer with a reasonable understanding of PHP frameworks, have a go at writing and submitting a patch. There are usually front end tasks and ticket triaging; almost anyone can check whether steps to reproduce a bug still work. Even if your first ever contribution is to read an issue's summary and leave a comment saying “it's actually far more complex than first estimated” - this is a very useful first step. Every little contribution makes a difference.

Contributions by Torchbox

Our team's contributions are summarised on our drupal.org organisation page. A lot of our developers' efforts go towards writing patches for existing and new bugs on contributed modules. We also think it's really important that the projects we're most proud of are written up as drupal.org case studies. Not only does this promote our best work, but it also raises the profile of Drupal as a product that can deliver success for a huge variety of organisations and individuals across the globe. Many of our developers are also active in the #drupal-uk IRC channel: a public chat room inhabited by friendly Drupal community members in the UK.

Jul 24 2016
Jul 24

Mike interviews Gregg Marshall, Enzo Garcia, and Daniel Schiavone live from Drupal GovCon 2016! Gregg discusses his new book, Enzo talks about his upcoming community keynote and the upcoming DrupalCamp Costa Rica, and Daniel previews Baltimore DrupalCamp and discusses preparations for Baltimore DrupalCon 2017.

DrupalEasy News

  • The Fall, 2016 session of Drupal Career Online begins September 26; applications are now open.
  • Online and in-person workshops; introductions to both module and theme development for Drupal 8. See the complete schedule.

Sponsors

Follow us on Twitter

Intro Music

Subscribe

Subscribe to our podcast on iTunes, Google Play or Miro. Listen to our podcast on Stitcher.

If you'd like to leave us a voicemail, call 321-396-2340. Please keep in mind that we might play your voicemail during one of our future podcasts. Feel free to call in with suggestions, rants, questions, or corrections. If you'd rather just send us an email, please use our contact page.

Jul 23 2016
Jul 23

Ever since Andrew joined Annertech, he's been a champion of accessible web design and has ensured that accessibility remains a key focus in everything we do. That, combined with his dedication to open source and contributing back to the community, meant we were not surprised when he was asked if he'd be interested in becoming a Drupal core accessibility maintainer.

Andrew is truly passionate about accessibility and has increased our whole team's knowledge and awareness of the issues encountered by people with disabilities. We cannot think of a better candidate for a new Drupal core accessibility maintainer.

His response when asked to be a Drupal Core maintainer?

I was really stoked when Mike asked if I'd consider becoming a core maintainer. I have barely stopped bouncing around my home.

Congratulations from everyone in Annertech, Andrew!

Jul 22 2016
Jul 22

In our weekly roundup of higher education notes and trends, you can usually count on three themes being discussed by the academic community: student demographics, budget constraints, and technology. In this post, we'll expand on these themes by sharing some of our own insights and covering a few unique and emerging technology trends across higher education.

Virtual Reality on the Horizon in Higher Education

As a web agency specializing in building high-end websites for colleges and universities, anything technology related that has the potential to impact the sector is sure to get our attention. As a VP at our agency, imagine my enthusiasm when I read Inside Higher Ed’s article on virtual reality in the classroom!

The technology is still in its infancy. As such, it’s oftentimes expensive to produce and procure, so it will likely be years before we see it make any kind of tangible impact in schools. That said, its potential future effect on learning outcomes is significant. Imagine complementing a history lesson with a virtual reality tour, or studying rock formations in a geology class by seeing them in augmented reality. Expensive field trips? No need! Plug into virtual reality and tour the world right from your seat! Another hypothesized benefit is the ability to create a truly global classroom, where virtual classes can meet “face to face” and work together on problem-solving.

Technology has come a long way; I remember how excited the classroom would get when the teacher rolled in a bulky tube TV, which meant we’d get to watch a grainy, severely outdated educational video. Kids today have no idea how lucky they are.

Machine Learning: Adapting Content to Complex Higher Education Websites

In a previous blog post, we reviewed an interesting new trend in higher education where learning management systems were starting to predict student outcomes from usage patterns. That article noted stats showing that the more a student logged into the system during the first week or two of classes, the higher the probability they would succeed in the class.

The concept of “machine learning” has been prevalent in the commercial sector for years now, with trendsetters like Amazon and Facebook serving up product and advertising suggestions based on your purchase history, your “likes,” and the websites you’ve visited. The application for higher education is just as promising.

As a specialized web agency that does most of our work for higher education institutions, we’ve been introducing machine learning or “personalization” concepts to our clients for some time now (if you want to read more about personalization in higher education, check out this blog post we published last year). Higher education websites are what we call “content complex,” meaning they have a large number of distinctly different visitor types (what the web industry calls “personas”) frequenting them. Prospective students, parents of prospective students, enrolled students, parents of enrolled students, faculty, and alumni are the best-case scenario; oftentimes our clients will have very different types of prospective students who require further segmentation (imagine international students vs. local). How does one landing page identify and speak to six unique types of visitors? Personalization technology, that’s how.

When used, web personalization technologies can log specific user criteria and attributes such as age, location, purchasing behaviors, social media and more. With user attributes logged and indexed, businesses can deploy unique, adaptive content (even web pages) that are custom tailored to individual users. A sophisticated personalization strategy will have a unique web experience for each type of persona where everything from the content, images, colors and messaging has been tailored for them. As a person engages with the website, the technology “learns” more about what they are looking for and can serve up relevant content to them (just like Amazon suggests products to you based on your search and purchase history). 
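To make the idea concrete, here is a minimal sketch (in Python for brevity) of the rule-based core of such a system. The persona names and content strings are hypothetical, and a real personalization engine would refine its persona mappings from behavior over time rather than relying on fixed rules:

```python
# Hypothetical sketch: map logged visitor attributes to a persona,
# then serve content tailored to that persona. All persona names and
# content strings below are illustrative, not from any real system.

def classify_persona(visitor: dict) -> str:
    """Pick the best-matching persona from logged visitor attributes."""
    if visitor.get("role") == "student":
        return "enrolled_student" if visitor.get("enrolled") else "prospective_student"
    if visitor.get("role") == "parent":
        return "parent"
    return "general"

# Content keyed by persona; in practice this would be whole page
# variants (copy, images, calls to action), not one-line strings.
CONTENT = {
    "prospective_student": "Explore our programs and campus life.",
    "enrolled_student": "Check your course schedule and deadlines.",
    "parent": "Learn about tuition, housing, and student support.",
    "general": "Welcome! Tell us a bit about yourself.",
}

def personalize(visitor: dict) -> str:
    """Return the content variant for this visitor's persona."""
    return CONTENT[classify_persona(visitor)]

print(personalize({"role": "student", "enrolled": False}))
# prints "Explore our programs and campus life."
```

The design point is simply that content is keyed by persona, so serving a new visitor type means adding a rule and a content entry rather than building another landing page.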

Of course, this level of personalization requires a complex and thorough content strategy that many institutions simply do not have (yet). As a starting point, we often recommend simpler forms of personalization, such as explicitly asking users who they are when they arrive. This basic framework can evolve as the content strategy of the website becomes more refined. ImageX believes that personalization will be the foundation of user experience on the web moving forward, much like responsiveness is for the mobile experience today. Getting started on this trend now will make future adaptations that much more efficient.

Native Mobile Apps

It seems like every organization has, or wants to have, a native app; in some cases for good reason, in others not so much. If an organization has a customer base that needs to interact frequently with a large and often complex set of data or tasks, a native app is likely a good idea. Mobile banking is a great example; the complexity of a banking website and the volume of content make the mobile experience cumbersome for specific interactions, such as paying bills or transferring money (what people generally refer to as “doing their banking”). A banking app immerses the user in that specific set of tasks, with a light set of complementary content and features. The business case for higher education is, in our opinion, just as strong as it is for mobile banking.

The obvious use case for higher education is current students managing their courses. Class schedules, assignment submissions, reminders for upcoming deadlines or events, test score notifications, paying tuition; you get the point. It wasn’t that long ago that I was in university, and the student portal my school had wasn’t even mobile responsive; if you couldn’t get to a desktop computer, you were in trouble. Another potential use case we parents here at ImageX often discuss is a parent app for those of us with children in post-secondary education. Imagine if we could stay up to speed on our children’s class schedules, assignments, and grades? Oh, the possibilities…

Like to stay on top of higher education notes and trends? Subscribe to our newsletter below!

Jul 22 2016
Jul 22

Of the many things that contribute to the success of a project, communication is the most important. While every project will differ in its requirements, team members, and plan, at the most basic level the goal should always be the same: to add value for the client. Open communication -- that is, the free exchange of ideas, collaboration, and clarity of direction -- is the linchpin that holds a project together in pursuit of that goal.

At ImageX, we believe in using the right tool for the job. And while “tool” usually means the specific software our staff uses to execute tasks, it also extends to the individuals themselves and how we bring teammates and project details together. Among the many benefits of being one of the top-ranked Drupal agencies in the world is that we attract some of the top-ranked talent in the world -- and just like we don’t confine ourselves to a specific geographic area when we’re choosing which clients to partner with, neither do we for the teams we build to serve them. That team is based in our office in Vancouver, but it also includes those of us who help expand the depth and breadth of our agency -- remote employees, or as we affectionately call them, our “remotees.”

For those of us who do work remotely, myself included, the benefits are vast:

  • We’re liberated from our desks;
  • Our morning commute is usually from our breakfast table to our den or workspace;
  • Or for that matter, our office is wherever we make it -- a café, library, or even on our travels; and,
  • We have the flexibility to set our own schedules and be more available for life’s demands (as long as we’re available for meetings, of course -- more on that below).

And for ImageX, the world becomes our talent pool, which allows us to hire the best people available for every position -- whether they’re in Vancouver, Toronto, Sweden, Ohio, Seattle, Taiwan, Florida, or Ukraine (ImageX has remotees in all of these locations).

Working remotely also has quantifiable benefits to a business’ bottom line:

  • Two-thirds of managers reported that employees are more productive when working remotely;
  • 54 percent of remote workers reported completing as much or more work in less time because of fewer distractions;
  • 82 percent of remote workers reported lower stress levels;
  • Attrition rates fall by as much as 50 percent;
  • 68 percent of younger workers said that the option to work remotely would “greatly increase” their interest in a specific employer; and,
  • Businesses can significantly lower their overall operating costs.

And that’s not to mention the environmental impact of fewer people commuting. When health insurance company Aetna measured the benefits of its remote working policies, it found that employees drove 65 million fewer miles, saved over two million gallons of gas, and reduced carbon dioxide emissions by over 23,000 tonnes per year.

A remotee (me on the laptop screen) joining our weekly #toughcoders push-up challenge by webcam, via Google Hangouts.

But working with distributed teams isn’t without its challenges. Communication problems can surface easily, whether because of logistics due to time zones or something simply being lost in translation online, and it’s easy to feel isolated at a home office and disconnected from your team. Like any project, overcoming these challenges and adding value to your team comes from having a strong plan in place to mitigate them. Building your team with the right mix of individuals, having a structured communication plan in place, and using the right tools for each job can help you realize the benefits of working with a distributed team.

Building Your Remote Team

Managing distributed teams introduces some additional considerations to make when you’re recruiting for new staff. Outside of the core competencies of each position, we’ve found emphasizing these four qualities to be a good predictor of success:

  • Is the candidate self-motivated? Working autonomously and independently requires a very high degree of self-motivation, rather than the constant encouragement and motivation that can be expected in a traditional office environment.
  • Does the candidate have strong communication skills? With limited face-to-face contact, above-average communication skills become even more important. Can the candidate communicate clearly and concisely, regardless of the medium, and accommodate for the subtlety and nuance that can often get lost?
  • Is the candidate results-driven? In the absence of more subjective evaluations, it’s important that your team members set clear objectives and that they’re measured against them.
  • Is the candidate open, honest, and transparent? This one is often the most important, because you’re relying on your team to proactively raise any problems or concerns that could otherwise slip by unnoticed if people confine themselves to communication silos. The more forthcoming and straightforward, the better.

Building teams of self-sufficient individuals who are empowered to work autonomously will encourage open communication and collaboration between members, rather than top-down (micro)management.

A SMART Communication Plan

With a distributed team, it’s essential that all members unite around a clearly defined and shared goal or purpose. A strong project and/or client manager can act as the advocate for this goal or purpose when any gaps occur and be the face of the team to the client. 

When defining the team’s goal or purpose, consider the SMART framework:

  • Specific
  • Measurable
  • Attainable
  • Relevant
  • Time-bound

This goal can be captured in a formal team or project charter, or expressed more informally in how the project manager oversees the team day-to-day.

Creating and fostering a results-driven culture is essential. Rather than tracking the team’s working hours (though we still track project hours for billing, efficiency, and accountability to the client), it’s more important that they’re able to produce results that drive the team towards their goals on a sustained basis. And it’s incumbent upon the project manager to ensure continued clarity on what those goals are.

Bringing your team together for regular meetings, either in person or digitally, is the best way to make certain of this. Daily stand-ups, where each project team member shares what they worked on yesterday, what their goals are today, and whether anything is blocking their progress, shouldn’t take more than 10-15 minutes each morning, but they will save far more time by keeping everyone focused.

Weekly team-wide stand-ups allow department leads and upper management to share higher-level progress and help bring any remote staff out of their project silos and into the “office”. We don’t use these meetings to discuss the specifics of any projects -- rather, they bring the team together so that we can hear each other’s voices and see each other’s faces (even if they’re just on a screen), and it reinforces the bigger picture that each individual project is working towards.

The Right Tool for the Job

Once you have the right team members in place and a plan to facilitate communication, you need the tools in place to keep them connected. Using the right software can make communication seamless and effortless, and gives teams the advantage of having every project discussion documented, archived, and searchable -- far from the risk of impromptu drive-bys in the office. 

Every project needs a central repository that captures the tasks, responsibilities, and dependencies involved. While physical backlogs with Post-It Notes are great for the office, they don’t help distributed teams. Trello is an easy-to-use Kanban-style board that lets you drag-and-drop cards between lists to show progress in real-time. Or for a more comprehensive and collaborative solution, we like Basecamp and Jira.

For conversations between team members and clients, in-person meetings allow the participants to communicate verbally as well as non-verbally. There is no substitute for this, but web-based tools like Google Hangouts, Skype, and Slack are the next best thing. Their video functions help approximate in-person meetings and allow the participants to see each other’s faces to better detect nuance. And as a bonus, they offer added functionality, such as screen sharing, that further simulates a meeting room setting.

Skype and Slack are particularly helpful for team members in different time zones who may be limited in the meetings they can attend. Because they archive the transcripts of any typed conversation, it’s easy for anyone to catch themselves up at the beginning of their day without the risk of anything being missed in a game of broken telephone. It also allows for easy searching anytime someone needs something confirmed.

Finally, document repositories and collaboration tools like Google Drive and Dropbox can centralize any templates, documentation, design artifacts, and project assets while providing versioning control as well as the ability for multiple team members to collaborate on the same file at the same time.

Final Thoughts

  • It can be difficult for a team to build a positive culture when its members are distributed -- it’s not as simple as grabbing a coffee or going out for lunch. But a strong team culture extends beyond being social. It’s also about “seeing a vision, aligning to a mission, creating a sense of community and belonging and having loyalty to a project that gets people excited about work.”
  • Get to know each other personally. Catch up before and after calls, take breaks and make time to chat, and build relationships that bridge time zones and cultural divides;
  • Take advantage of your communication tools and create spaces for team members to share off-topic, interesting, or funny content. We have Slack channels for #office, #kudos, #random and even #nhl; and,
  • Iterate. Like any project, test an idea and adapt based on what works and what doesn’t. Every team will have its own dynamic, and it’s essential that any plan adjusts to accommodate it. You won’t get everything right at first, but you can continually improve over time.

Does your organization have distributed teams? If so, what benefits have you realized, what challenges have you encountered, and what have you learned from the process? Get in touch below and let’s talk.

Jul 22 2016
Jul 22

Today marked the final day of this year's Drupal GovCon. It's been three days of insightful talks, swapping knowledge, and catching up with industry peers.

One of this week's most hands-on talks was this morning's overview of the structural differences between custom modules in Drupal 7 and Drupal 8. Unlike Drupal 7, Drupal 8 utilizes Symfony, autoloading, and Composer. Additionally, the use of YAML files for .info configuration takes some getting used to. While the minimum structure of a Drupal 8 module is at first glance more complex than in Drupal 7, it minimizes effort as the module grows in complexity, utilizing Drupal 8's object-oriented structure to its advantage.
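As a sketch of the YAML piece of that structure, here is a minimal .info.yml for a hypothetical module (the name example_module and all values are illustrative; in Drupal 7 the same metadata lived in a plain-text .info file):

```yaml
# example_module.info.yml -- replaces the Drupal 7 .info file.
name: Example Module
type: module
description: 'A minimal Drupal 8 module definition.'
core: 8.x
package: Custom
```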

Another fantastic talk today took an in-depth look at implementing living style guides within Drupal. With the ever-changing nature of the web, a living style guide pulls in real code from a website to gather all of the site's components and styles in one place. This is a valuable tool not only for designers and developers, but also for content editors to see their options. In this talk, Sarah Thrasher showed how her team implemented the popular style guide library KSS not only to pull in the site's CSS, but also to leverage the same Twig templates that Drupal used for the theme to minimize duplication of code.

Last but certainly not least, I attended a talk on using usability.gov, a resource provided by the Department of Health and Human Services (HHS) to promote better usability across both government and private-sector sites. This site provides a number of valuable tips and templates not only on development, but also on everything from design to content strategy to project management.

All in all, this has been a fantastic event. I look forward to implementing the new knowledge and ideas this week has provided, and to GovCon 2017! If you missed them, check out the recaps of day 1 and day 2.

Jul 22 2016
Jul 22

This is the second post in a series about coding standards. In our first post, we talked about code standards and why they are so important. In this post, we’ll talk about how to implement Drupal coding standards in your projects.

Other posts in this series:

  1. Code Standards: What Are They?
  2. Code Standards: How Do We Implement Them?
  3. Code Standards: Formatting
  4. Code Standards: Documentation
  5. Code Standards: The t() function
  6. Code Standards: Object Oriented Coding & Drupal 8

Read the coding standards and keep them handy.

It’s a good idea to read over the Drupal coding standards so you have an idea of what’s expected. Even if you’re familiar with them, we can always use a refresher. They’re also a living document, so there’s a good chance something may have been changed or added since the last time you gave them a go-over. Use this post as a reason to read them again! Make sure you have them bookmarked for reference, as well. https://www.drupal.org/coding-standards

Set up your editor for success

The easiest way to keep your code clean and up to par is by having your editor do the work! There are a lot of editors out there, and even the ones that don’t have many bells and whistles can be set up to help you keep standards in mind when you’re coding.

Sublime Text

This post from Chris is a couple years old, and geared towards front-end developers, but has lots of great Sublime Text setup tips and plugins for every developer.

There’s some great info on drupal.org as well: https://www.drupal.org/node/1346890. Here you can find the basic configuration for adhering to Drupal coding standards, a script to set it up on OSX and Linux, and great plugins to help with development. Now you don’t need to worry about line length, spaces, tabs, line endings, and more. It’ll all be handled for you!
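For a sense of what that configuration looks like, the user settings recommended on that drupal.org page amount to something like the following JSON excerpt (values reflect the Drupal standards of 2-space indentation, 80-character lines, and Unix line endings; check drupal.org for the current canonical file):

```json
{
  "rulers": [80],
  "tab_size": 2,
  "translate_tabs_to_spaces": true,
  "trim_trailing_white_space_on_save": true,
  "ensure_newline_at_eof_on_save": true,
  "default_line_ending": "unix"
}
```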

PhpStorm

If you’re using PhpStorm, their website has extensive instructions for getting set up with Drupal configuration here.

If you’re using another editor, you can see if it’s listed here: https://www.drupal.org/node/147789

If not, I’d suggest googling it, and if you don’t find instructions, create them and add them to the list!

Review your own code - Use coder

The easiest way to make sure you’re conforming to coding standards is to use a tool like PHP CodeSniffer. You can install Coder, a Drupal module that allows you to check your code from the command line using custom rules and PHP CodeSniffer. Here’s an example of what you might see:

Example Coder output
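If you can’t see the screenshot, the report looks something like this mock-up (the drupalcs alias, module, and file path match the walkthrough; the specific error messages, line numbers, and counts are invented for illustration):

```text
$ cd /Applications/MAMP/htdocs/countries
$ drupalcs tests/countries.test

FILE: /Applications/MAMP/htdocs/countries/tests/countries.test
----------------------------------------------------------------------
FOUND 3 ERRORS AND 1 WARNING AFFECTING 3 LINES
----------------------------------------------------------------------
  14 | ERROR   | Missing function doc comment
  27 | ERROR   | Line indented incorrectly; expected 2 spaces, found 4
  27 | ERROR   | Inline comments must end in full-stops, exclamation
     |         | marks, or question marks
  51 | WARNING | Line exceeds 80 characters
----------------------------------------------------------------------
```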

Let’s walk through this screenshot.

  1. I’ve navigated to a module directory - here, I’m checking the countries module.
  2. The alias I have set up for codesniffer, using custom Drupal rules, is drupalcs.
  3. I want to test the file at tests/countries.test.
  4. Sometimes this command can take a little while. If it seems like it’s hanging, especially if you’ve checked a directory, it may be too much, so try a single file at a time.
  5. The first thing you’ll see is which file you checked, and the full path. Here, it’s /Applications/MAMP/htdocs/countries/tests/countries.test
  6. Next, you’ll see how many errors and warnings, and how many lines they affect - there can be multiple errors per line, and coder will catch them all.
  7. Next, each error or warning will be listed line by line.

I find it’s easiest to go in order, because sometimes one error causes others - coder can only understand so much, so if you have, for example, an array that has one line indented improperly, it may also think the subsequent lines are indented improperly, even if they’re correct.

Christopher did a great post on PHP CodeSniffer last year; check it out here.

Generally, you want to run coder every time you make a change, and before you commit your code or submit a patch. This way, you’re always writing clean code, and anyone reviewing your code is reviewing it for content, and they don’t have to worry about style. Of course, everyone is human and we all make mistakes. Sometimes you’ll push up a tiny change without running coder, and not realize there was a style issue. That’s why team code reviews are so important!

Team code reviews - make the time

The most successful teams build in time to review one another’s code. There’s no substitute for code reviews by another person, so make sure that you view them as an essential part of your process; the same goes for reviews on drupal.org. When planning time and resources for a project, make sure that there is time set aside for code reviews. When you’re working on contrib projects, make sure you take a look at issues marked "Needs review," and test them. If you want a way to dive into a project, or just Drupal and contrib work in general, reviewing patches is a great way to get acclimated. You get exposed to other people’s code, and if you find something that needs to be corrected, it will stick with you and you’ll remember it.

Two things to remember when reviewing other people’s code, or when receiving reviews of your own:

  1. Treat others as you would like to be treated. Be kind, courteous, respectful, and constructive. Be aware of your tone. It’s easy to come off more harshly than you intended, especially when working quickly. Take just a second to re-read your comments, especially if you’re communicating with someone you’re not acquainted with.
  2. Take everything in stride, and don’t take it personally. Those reviewing your code want it to be good, and corrections aren’t a personal attack. This can be especially hard when you start out, and even after years, you can still get a comment that comes off in a way that hurts your feelings. Don’t dwell on it! Thank them, make the corrections, submit them, and chances are, they’ll thank you, too.

Now you know what code standards are, why they’re important, and how you can get started implementing them in your code. Set up your editor, install coder, and get ready for our next code standards post on formatting! We’ll talk about the nitty gritty of how you should format your Drupal code.

[1] Hero photo attribution: charlene mcbride

Jul 22 2016
Jul 22

The more I work with Drupal 8, the more I realize how much has changed for developers in the Drupal community. While the transition to a modern, object-oriented system is what's best for the longevity of the platform, it certainly doesn't come without challenges. As someone who doesn't come from an OOP background, I've found the transition difficult at times. In many cases, I know exactly what I want to do, just not how to do it the "Drupal 8 way". On top of this, tutorials and blog posts on D8 are all over the map in terms of accuracy. Many posts written during D8's development cycle are no longer applicable because of API changes, etc.

Below is a list of snippets that might be helpful to site builders or developers more familiar with D7 hooks and procedural code. It might also be useful to OOP folks who are new to Drupal in general. My goal is to add to and update these snippets over time.

Routes & Links

Determine the Current Drupal Route

Need to know what the current Drupal route is or need to run some logic against the current route? You can get the current route like so:

$route = \Drupal::routeMatch()->getRouteName();

To some, the \Drupal::routeMatch() syntax might look foreign (it did to me). Here's a rundown of what's happening here:

First, \Drupal. This is calling the global Drupal class, which, in Drupal 8, is a bridge between procedural and OO methods of writing Drupal code. The following comes from the documentation:

This class acts as a unified global accessor to arbitrary services within the system in order to ease the transition from procedural code to injected OO code.

Right. Moving on to ::routeMatch(). Here we're using the routeMatch() method which "Retrieves the currently active route match object." Simple enough. But what is "::" all about? This StackOverflow answer helped me to understand what that's all about.

From there, the getRouteName() method returns the current route name as a string. Here are some example routes: entity.node.canonical, view.frontpage and node.type_add.

Is this the Front Page Route?

Need to check if the current route is the front page route? There's a service and method for that:

// Is the current route/path the front page?
if ($is_front = \Drupal::service('path.matcher')->isFrontPage()) {}

Here we're calling the path.matcher service (defined in /core/core.services.yml) and using the isFrontPage() method. For more on services, check out the "Services and Dependency Injection Container" documentation on api.drupal.org which helped me understand how all of these bits work together and the why of their structure.

Get the Requested Path

Need to know what the current page's requested path was, as opposed to the route? You can do this:

$current_uri = \Drupal::request()->getRequestUri();

Redirect to a Specific Route

Need to redirect to a specific page? In Drupal 7, you would likely handle this with drupal_goto() in your page callback function. In Drupal 8, you can use RedirectResponse() for that. Here is the relevant changelog.

Here are some examples, borrowed heavily from said changelog. First, in procedural PHP:

use Symfony\Component\HttpFoundation\RedirectResponse;

function my_redirect() {
  return new RedirectResponse(\Drupal::url('user.page'));
}

Here is how you would use a Drupal 8 controller to accomplish the same thing:

use Drupal\Core\Controller\ControllerBase;

class MyControllerClass extends ControllerBase {

  public function foo() {
    //...
    return $this->redirect('user.page');
  }
}

Links on the Fly

Drupal 7 and prior relied heavily on the l() function. (In fact, I would wager this was my most-used function over the years.) In Drupal 8, if you need to create links on the fly, utilize the Link class:

$link = \Drupal\Core\Link::fromTextAndUrl($text, $url);
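
For a fuller picture of where $text and $url might come from, here's a sketch that builds a Url object from a route name and turns the link into a render array (the route parameters are just examples):

```php
use Drupal\Core\Link;
use Drupal\Core\Url;

// Build a Url object from a route name and its parameters.
$url = Url::fromRoute('entity.node.canonical', ['node' => 1]);

// Create the link object and get a render array from it.
$link = Link::fromTextAndUrl(t('Read more'), $url);
$build = $link->toRenderable();
```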

Working with Entities

Query Database for Entities

If you need to query the database for some nodes (or any other entity), you should use the entityQuery service. The syntax should be pretty familiar to most D7 developers who have used EntityFieldQuery:

// Query for some entities with the entity query service.
$query = \Drupal::entityQuery('node')
  ->condition('status', 1)
  ->condition('type', 'article')
  ->range(0, 10)
  ->sort('created', 'DESC');

$nids = $query->execute();

Loading Entities

If you need to load the actual entities, you can do so in a number of ways:

While the following will technically work in Drupal 8:

$node = entity_load_multiple('node', $nids);

This method has been deprecated in Drupal 8 and will be removed before Drupal 9, in favor of methods overriding Entity::loadMultiple(). To future-proof your code, you would do something like the following:

$nodes = \Drupal::entityTypeManager()->getStorage('node')->loadMultiple($nids);

Here's how you would do similar for a single node:

$node = \Drupal::entityTypeManager()->getStorage('node')->load($nid);

Here are a few other entity snippets that might be useful:

// Link to an entity using the entity's link method.
$author_link = $user->toLink();

// Do the same thing, but customize the link text.
$author_link = $user->toLink('Some Custom Text');

// Given a node object, here's how to determine its type:
$type = $node->getType();

// To get the full user entity of the node's author:
$author = $node->getOwner();

// To get the raw ID of the author of a node:
$author_id = $node->getOwnerId();
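
A couple more that I find myself reaching for, assuming the node has a field named field_example (the field name is hypothetical):

```php
// Get the raw value of a field on the node.
$value = $node->get('field_example')->value;

// Get the node's title and created timestamp.
$title = $node->getTitle();
$created = $node->getCreatedTime();
```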

Image Styles

Need to whip up an image using a particular image style on the fly? This will work for that:

// Create an instance of an image using a specific image style, given a path to a file.
$style = \Drupal\image\Entity\ImageStyle::load('yourStyle_image');
$img_path = $user->field_profile_some_image->entity->getFileUri();
$img_style_url = $style->buildUrl($img_path);

That's it for now. I intend to keep this post updated as we learn more and more about the new world of Drupal 8. If you have a snippet worth sharing, drop us a line via Twitter and we’ll add it to this post (with credit of course).

Jul 22 2016
Jul 22

Like it or not, sometimes you have to output HTML in javascript.

Recently, I ran across a line of code something like this while reviewing a pull-request for a client:

var inputMarkup = '<span><label data-val="' + inputText + '" for="checkbox-' +
  index + '" data-tid="' + tid + '">' + inputText +
  '</label><input type="checkbox" id="checkbox-' + index + '" data-tid="' +
  tid + '" data-val="' + inputText + '" /></span>';

Aside from the fact that this code was hard to read (and therefore would be more difficult to maintain), the same code was used with no significant modification in three separate locations in the pull-request.

In PHP, most developers familiar with Drupal would immediately reach for one of the well-known parts of Drupal's theme system, render arrays, theme(), or a *.tpl.php file. In javascript, however, I seldom see much use of Drupal 7's extensive javascript API (also made available in a nicely browseable--though not quite up-to-date--form by nod_).

In this case, the relatively difficult-to-read code, combined with the fact that it was repeated several times across more than one file were clear signs that it should be placed into a theme function.

The Drupal.theme() function in the javascript API works much like theme() in PHP. When using theming functions in PHP, we never call them directly, instead using the theme() function.

In javascript, it's similar; when output is required from a given theme function, we call Drupal.theme() with the name of the theme function required, and any variable(s) it requires.

For example, drupal.org shows the following usage:

Drupal.theme('myThemeFunction', 50, 100, 500);

The example uses Drupal.theme() to call the theme function, myThemeFunction(), and pass it the arguments it requires (50, 100, and 500 in this instance). A theme function can accept whatever number of arguments is necessary, but if your theme function requires more than one parameter, it's good practice to define the function to take a single javascript object containing the parameters required by the function.

So in the case of my code-review, I suggested we use a theme function like this:

/**
 * Provides a checkbox and label wrapped in a span.
 *
 * @param {object} settings
 *   Configuration object for function.
 * @param {int} settings.index
 *   A numeric index, used for creating an `id` attribute and corresponding
 *   `for` attribute.
 * @param {string} settings.inputText
 *   The text to display as the label text and in various attributes.
 * @param {int} settings.tid
 *   A Drupal term id.
 *
 * @return {string}
 *   A string of HTML with a checkbox and label enclosed by a span.
 */
Drupal.theme.checkboxMarkup = function(settings) {
  "use strict";

  var checkboxId = 'checkbox-' + settings.index;
  var inputText = Drupal.checkPlain(settings.inputText);
  var checkboxMarkup = '';

  // Assemble the markup--string manipulation is fast, but if this needs
  // to become more complex, we can switch to creating DOM elements.
  checkboxMarkup += '<span>';
  checkboxMarkup += '<label data-val="' + inputText + '" for="' + checkboxId + '" data-tid="' + settings.tid + '">';
  checkboxMarkup += inputText;
  checkboxMarkup += '</label>';
  checkboxMarkup += '<input type="checkbox" value="' + inputText + '" id="' + checkboxId + '" data-tid="' + settings.tid + '" data-val="' + inputText + '">';
  checkboxMarkup += '</span>';

  return checkboxMarkup;
};

This allowed the calling code to be much simpler:

// Creates themed checkbox.
checkboxMarkup = Drupal.theme('checkboxMarkup', {
  index: i,
  inputText: $('.inputText').val(),
  tid: $('.tid').val()
});

$container.append(checkboxMarkup);

The HTML generation is now also more loosely coupled, and more portable, meaning that we can easily use Drupal.theme.checkboxMarkup() elsewhere in this project--or in any other Drupal project.

Jul 22 2016
Jul 22

This webinar has passed. Keep an eye on our blog for future webinars.

You know how to get things done with git: pull, add, commit, push; but have you mastered it like a Jedi does the force? Nothing is a more lasting record of our work than our git commits. In a galaxy where companies ask you for your Github account in lieu of, or in addition to, a resume, we have one more reason to make sure that our commit history is as readable as our code itself.

In this one hour session, we will cover:

  • Rewriting commits
  • Reordering commits
  • Combining commits
  • The perfect commit message
  • Finding bugs using git
  • Avoiding common pitfalls

Join us for this session and you will leave a Jedi-level git master!

These Are Not the Commits You're Looking For

Jul 22 2016
Jul 22

Below is a site launch checklist, with details on individual areas of interest to follow in the appendix. While some are Drupal specific, the majority would apply to most any site.

Launch Checklist

  • Is the web server instance size large enough?
  • Is there a load balancer in front of your web head(s)?
  • Is Jenkins configured to automatically deploy your code, run cron, etc.?
  • Is Redis configured and enabled?
  • Is a CDN configured?
  • Is the CDN serving HITs?
  • Is Varnish serving HITs?
  • Is New Relic configured?
  • Is the VirtualHost configured to redirect from www to the base url (or vice-versa)?
  • Is HTTPS enabled?
  • Is Apache configured for HTTP/2?
  • Is Google Analytics (or your analytics tool of choice) configured?
  • Is robots.txt configured for production (i.e., did you remove any changes that were made for development)?
  • Is Drupal's internal page cache enabled?
  • Is the Security Review module installed and providing a clean report?
  • Do Drupal's settings.php & (if Drupal 8) services.yml files have the correct read-only permissions?
  • Are all of the checks on Drupal's status report page reporting green?
  • Are all development related modules disabled?
  • Are errors configured to be suppressed?

Appendix

Infrastructure

Though we use a number of different hosting providers in practice, our standard is Linode. Specific hardware recommendations follow:

Web Server

Use at least a 4GB cloud instance. If you or the client are price-sensitive and are considering a smaller instance size to save money, I would argue that the billable time spent troubleshooting an underperforming server is easily much more expensive than paying for more power.

Load Balancer

A load balancer is essential when configuring a site with multiple web servers, but using a load balancer is preferable even in situations with only one web server. Having DNS point to a load balancer, instead of to the web server directly, gives you immediate control over where your traffic is routed. For example, if you need to replace your web server hardware, you can redirect traffic instantly instead of waiting for DNS changes to propagate. Additionally, a load balancer simplifies HTTPS configuration, as you can terminate the appropriate certificates at the load balancer instead of on each of the relevant web servers.
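
As a rough illustration, a minimal reverse-proxy configuration with nginx acting as the balancer might look like the following (the hostnames and IPs are placeholders, and this sketch omits health checks and SSL termination):

```
upstream web_heads {
  server 192.0.2.10;
  server 192.0.2.11;
}

server {
  listen 80;
  server_name example.com;

  location / {
    proxy_pass http://web_heads;
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
  }
}
```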

Automation

At a minimum, the following jobs should be configured in Jenkins:

  • Automated deployments triggered from Github.
  • Cron to be run at least once every 24 hours.
  • A Drush cache clear job that can be run on-demand from the Jenkins UI.
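
For reference, the shell steps behind the cron and cache clear jobs are typically one-liners, something like the following (assuming a Drush site alias of @prod, which is hypothetical here):

```
# Run Drupal cron against the production site.
drush @prod cron -y

# Rebuild all caches on demand (Drupal 8; on Drupal 7 use "cache-clear all").
drush @prod cache-rebuild
```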

Performance

New Relic

Chromatic configures all web servers meant for production with New Relic. If the client does not already have a New Relic account, create one and obtain the license key. When configuring production boxes using Ansible, utilize the New Relic role in the playbook and provide the correct license key.

Redis

Redis should be installed and configured for all production Drupal sites. Moving Drupal's cache bins out of the SQL database and into Redis reduces database load and improves overall performance.

CDN

Putting a CDN in front of your site provides many performance and security benefits. With the many low-cost and free options available, there is rarely a reason not to put a CDN in front of every production site. We have had great success using CloudFlare.

Note: CloudFlare requires you to change your name servers and use them for DNS configuration. These changes should be made at least 24 hours in advance of launch.

If this is a Drupal 7 site, be sure to add the following line to your production settings.php file:

/**
 * Remove "cookie" from Vary header to allow HTML caching.
 */
$conf['omit_vary_cookie'] = TRUE;

Varnish

Many high traffic sites will benefit from an extra layer of caching between the web server and the CDN. In these instances, one or more Varnish reverse proxy servers are recommended.

HTTPS

HTTPS can be configured easily with Let's Encrypt. Its certificates expire every 90 days, but the renewal process can be automated.
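
For example, a cron entry along these lines (using the certbot client; the schedule and post-hook are just an illustration) will handle renewals automatically:

```
# Attempt renewal twice daily; certbot only renews certificates
# that are close to expiring.
17 3,15 * * * certbot renew --quiet --post-hook "service apache2 reload"
```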

HTTP/2

If you have configured HTTPS, you should go one step further and enable HTTP/2 to reap its additional performance benefits. While HTTP/2 does not technically require encryption, no browser currently supports it over HTTP, so for all intents and purposes HTTP/2 requires HTTPS.

Enabling HTTP/2 is a straightforward process:

  • Enable the Apache HTTP/2 module:
sudo a2enmod http2

  • Add the following line to the SSL vhost in question:
Protocols h2 http/1.1

  • Restart Apache:
sudo service apache2 restart

This content has been generated from the Chromatic Site Launch Guide repository. Fork it on Github!
