July 26th, 2018

Intro

In this post, I’m going to run through how I set up visual regression testing on sites. Visual regression testing is essentially the act of taking a screenshot of a web page (whether the whole page or just a specific element) and comparing that against an existing screenshot of the same page to see if there are any differences.

There’s nothing worse than adding a new component, tweaking styles, or pushing a config update, only to have the client tell you two months later that some other part of the site is now broken, and you discover it’s because of the change that you pushed… now it’s been two months, and reverting that change has significant implications.

That’s the worst. Literally the worst.

All kinds of testing can help improve the stability and integrity of a site. There’s Functional, Unit, Integration, Stress, Performance, Usability, and Regression, just to name a few. What’s most important to you will change depending on the project requirements, but in my experience, Functional and Regression are the most common, and in my opinion they’re a good baseline if you don’t have the capacity to write all the tests.

If you’re reading this, you probably fall into one of two categories:

  1. You’re already familiar with visual regression testing, and just want to know how to do it.
  2. You’re just trying to get info on why visual regression testing is important, and how it can help your project.

In either case, it makes the most sense to dive right in, so let’s do it.

Tools

I’m going to be using WebdriverIO to do the heavy lifting. According to the website:

WebdriverIO is an open source testing utility for nodejs. It makes it possible to write super easy selenium tests with Javascript in your favorite BDD or TDD test framework.

It basically sends requests to a Selenium server via the WebDriver Protocol and handles its response. These requests are wrapped in useful commands and can be used to test several aspects of your site in an automated way.

I’m also going to run my tests on BrowserStack so that I can test IE/Edge without having to install a VM or anything like that on my Mac.

Process

Let’s get everything set up. I’m going to start with a Drupal 8 site that I have running locally. I’ve already installed that, along with a custom theme with Pattern Lab integration based on Emulsify.

We’re going to install the visual regression tools with npm.

If you already have a project running that uses npm, you can skip this step. But, since this is a brand new project, I don’t have anything using npm, so I’ll create an initial package.json file using npm init.

  • npm init -y
    • Update the name, description, etc. and remove anything you don’t need.
    • My updated file looks like this:
{
  "name": "visreg",
  "version": "1.0.0",
  "description": "Website with visual regression testing",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  }
}

Now, we’ll install the npm packages we’ll use for visual regression testing.

  • npm install --save-dev webdriverio chai wdio-mocha-framework wdio-browserstack-service wdio-visual-regression-service node-notifier
    • This will install:
      • WebdriverIO: The main tool we’ll use
      • Chai syntax support: “Chai is an assertion library, similar to Node’s built-in assert. It makes testing much easier by giving you lots of assertions you can run against your code.”
      • Mocha syntax support: “Mocha is a feature-rich JavaScript test framework running on Node.js and in the browser, making asynchronous testing simple and fun.”
      • The BrowserStack wdio package, so that we can run our tests against BrowserStack instead of locally (where browser/OS differences across developers can cause false-negative failures)
      • The visual regression service, which provides the screenshot capturing and comparison functionality
      • Node notifier, which is totally optional but supports native notifications for Mac, Linux, and Windows. We’ll use these to be notified when a test fails.

Now that all of the tools are in place, we need to configure our visual regression preferences.

You can run the configuration wizard by typing ./node_modules/webdriverio/bin/wdio, but I’ve created a git repository with not only the webdriver config file but an entire set of files that scaffold a complete project. You can get them here.

Follow the instructions in the README of that repo to install them in your project.

These files will get you set up with a fairly sophisticated, but completely manageable visual regression testing configuration. There are some tweaks you’ll need to make to fit your project that are outlined in the README and the individual markdown files, but I’ll run through what each of the files does at a high level to acquaint you with each.

  • .gitignore
    • The lines in this file should be added to your existing .gitignore file. It’ll make sure your diffs and latest images are not committed to the repo, but allow your baselines to be committed so that everyone is comparing against the same baseline images.
  • VISREG-README.md
    • This is an example readme you can include to instruct other/future developers on how to run visual regression tests once you have it set up.
  • package.json
    • This just has the example test scripts: one for running the full suite of tests, and one for running a quick test, handy for active development. Add these to your existing package.json.
  • wdio.conf.js
    • This is the main configuration file for WebdriverIO and your visual regression tests.
    • You must update this file based on the documentation in wdio.conf.md.
  • wdio.conf.quick.js
    • This is a file you can use to run a quick test (e.g. against a single browser instead of the full suite defined in the main config file). It’s useful when you’re doing something like refactoring an existing component, and/or want to make sure changes in one place don’t affect other sections of the site.
  • tests/config/globalHides.js
    • This file defines elements that should be hidden in ALL screenshots by default. Individual tests can use this, or define their own set of elements to hide. Update these to fit your actual needs.
  • tests/config/viewports.js
    • This file defines what viewports your tests should run against by default. Individual tests can use these, or define their own set of viewports to test against. Update these to the screen sizes you want to check. (See the sketch below for both config files.)
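
To make those last two config files concrete, here’s a minimal sketch of what they might export (the selectors and sizes are hypothetical placeholders; the scaffold’s actual files may differ):

// tests/config/globalHides.js: elements hidden or removed in every screenshot.
// These selectors are examples; replace them with your own.
module.exports = {
  hide: ['.admin-toolbar', '.cookie-banner'], // kept in the layout, but visually hidden
  remove: ['.live-chat-widget'],              // removed from the layout entirely
};

// tests/config/viewports.js: default screen sizes to test against.
module.exports = [
  { width: 320, height: 568 },  // small phone
  { width: 768, height: 1024 }, // tablet
  { width: 1280, height: 800 }, // desktop
];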

Running the Test Suite

I’ll copy the example homepage test from the example-tests.md file into a new file /web/themes/custom/visual_regression_testing/components/_patterns/05-pages/home/home.test.js. (I’m putting it here because my wdio.conf.js file is looking for test files in the _patterns directory, and I like to keep test files next to the file they’re testing.)

The only thing you’ll need to update in this file is the relative path to the globalHides.js file; it should be relative to the current file. So, mine will be:

const visreg = require('../../../../../../../../tests/config/globalHides.js');

With that done, I can simply run npm test and the tests will run on BrowserStack against the three OS/browser configurations I’ve specified. While they’re running, we can head over to https://automate.browserstack.com/ to watch the tests being run against Chrome, Firefox, and IE 11.

Once tests are complete, we can view the screenshots in the /tests/screenshots directory. Right now, the baseline shots and the latest shots will be identical because we’ve only run the test once, and the first time you run a test, it creates the baseline from whatever it sees. Future tests will compare the most recent “latest” shot to the existing baseline, and will only update/create images in the latest directory.

At this point, I’ll commit the baselines to the git repo so that they can be shared around the team, and used as baselines by everyone running visual regression tests.

If I run npm test again, the tests will all pass because I haven’t changed anything. Next, I’ll make a small change to the button background color that might not be picked up by a human eye but will cause a regression our tests will catch with no problem.

In the _buttons.scss file, I’m going to change the default button background color from $black (#000) to $gray-darker (#333). I’ll run the style script to update the compiled css and then clear the site cache to make sure the change is implemented. (When actively developing, I suggest disabling cache and keeping the watch task running. It just makes things easier and more efficient.)
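
The change itself is a one-liner. Here’s a sketch of what it might look like, assuming the theme’s variable names (the selector is hypothetical):

// _buttons.scss (hypothetical selector)
.button {
  // background-color: $black;    // old value: #000
  background-color: $gray-darker; // new value: #333
}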

This time all the tests fail, and if we look at the images in the diff folder, we can clearly see that the “search” button is different as indicated by the bright pink/purple coloring.

If I open up one of the “baseline” images, and the associated “latest” image, I can view them side-by-side, or toggle back and forth. The change is so subtle that a human eye might not have noticed the difference, but the computer easily identifies a regression. This shows how useful visual regression testing can be!

Let’s pretend this is actually a desired change. The original component was created before the color was finalized, black was used as a temporary color, and now we want to capture the update as the official baseline. Simply move the “latest” image into the “baselines” folder, replacing the old baseline, and commit that to your repo. Easy peasy.

Running an Individual Test

If you’re creating a new component, or you run the suite and find a regression in just one image, it’s useful to be able to run a single test instead of the entire suite. This is especially true once you have a large suite of test files that cover dozens of aspects of your site. Let’s take a look at how this is done.

I’ll create a new test in the organisms folder of my theme at /search/search.test.js. There’s an example of an element test in the example-tests.md file, but I’m going to do a much more basic test, so I’ll actually start out by copying the homepage test and then modify that.

The first thing I’ll change is the describe section. This is used to group and name the screenshots, so I’ll update it to make sense for this test. I’ll just replace “Home Page” with “Search Block”.

Then, the only other thing I’m going to change is what is to be captured. I don’t want the entire page, in this case. I just want the search block. So, I’ll update checkDocument (used for full-page screenshots) to checkElement (used for single element shots). Then, I need to tell it what element to capture. This can be any css selector, like an id or a class. I’ll just inspect the element I want to capture, and I know that this is the only element with the search-block-form class, so I’ll just use that.

I’ll also remove the timeout; since we’re just taking a screenshot of a single element, we don’t need to worry about the page taking longer to load than the default of 60 seconds. (This really wasn’t necessary on the homepage either, but whatever.)

My final test file looks like this:

const visreg = require('../../../../../../../../tests/config/globalHides.js');

describe('Search Block', function () {
  it('should look good', function () {
    browser
      .url('./')
      .checkElement('.search-block-form', { hide: visreg.hide, remove: visreg.remove })
      .forEach((item) => {
        expect(item.isWithinMisMatchTolerance).to.be.true;
      });
  });
});

With that in place, this test will run when I use npm test because the suite is globbing: it runs every file that ends in .test.js anywhere in the _patterns directory. The problem is that this also runs the homepage test. If I just want to update the baselines of a single test, or I’m actively developing a component and don’t want to run the entire suite every time I make a locally scoped change, I want to be able to run just the relevant test so that I don’t waste time waiting for all of the irrelevant tests to pass.
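
For reference, the globbing happens in the specs setting of wdio.conf.js. A hypothetical excerpt (your theme path will differ):

exports.config = {
  // Run every file ending in .test.js anywhere under the theme's _patterns directory.
  specs: ['./web/themes/custom/**/_patterns/**/*.test.js'],
  // ...the rest of the configuration...
};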

We can do that by passing the --spec flag.

I’ll commit the new test file and baselines before I continue.

Now I’ll re-run just the search test, without the homepage test.

npm test -- --spec web/themes/custom/visual_regression_testing/components/_patterns/03-organisms/search/search.test.js

We have to add the first set of -- because we’re using custom npm scripts to make this work. Basically, npm passes anything that follows directly to the custom script (in our case, test is a custom script that calls ./node_modules/webdriverio/bin/wdio). There’s more info on the run-script documentation page.
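
In other words, assuming the test script from the scaffold files, the command above effectively expands to something like:

npm test -- --spec path/to/search.test.js
# is roughly equivalent to:
./node_modules/webdriverio/bin/wdio wdio.conf.js --spec path/to/search.test.js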

If I scroll up a bit, you’ll see that when I ran npm test there were six passing tests. That’s one test run for each browser for each test file: we have two tests, and we’re checking against three browsers, so that’s a total of six tests that were run.

This time, we have three passing tests because we’re only running one test against three browsers. That cut our test run time by more than half (from 106 seconds to 46 seconds). If you’re actively developing or refactoring something that already has test coverage, even that can seem like an eternity if you’re running it every few minutes. So let’s take this one step further and run a single test against a single browser. That’s where the wdio.conf.quick.js file comes into play.

Running Tests Against a Subset of Browsers

The wdio.conf.quick.js file will, by default, run test(s) against only Chrome. You can, of course, change this to whatever you want (for example if you’re only having an issue in a specific version of IE, you could set that here), but I’m just going to leave it alone and show you how to use it.

You can use this to run the entire suite of tests or just a single test. First, I’ll show you how to run the entire suite against only the browser defined here, then I’ll show you how to run a single test against this browser.

In the package.json file, you’ll see the test:quick script. You could pass the config file directly to the first script by typing npm test -- wdio.conf.quick.js, but that’s a lot more typing than npm run test:quick, and you (as well as the rest of your team) would have to remember the file name. Capturing the file name in a second custom script simplifies things.
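
For illustration, the relevant part of package.json might look something like this (the exact commands in the scaffold may differ):

{
  "scripts": {
    "test": "./node_modules/webdriverio/bin/wdio",
    "test:quick": "./node_modules/webdriverio/bin/wdio wdio.conf.quick.js"
  }
}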

When I run npm run test:quick, you’ll see that two tests were run. We have two tests, and they’re run against one browser, so that simplifies things quite a bit. And you can see it ran in only 31 seconds. That’s definitely better than the 100 seconds the full test suite takes.

Let’s go ahead and combine this with the technique for running a single test to cut that time down even further.

npm run test:quick -- --spec web/themes/custom/visual_regression_testing/components/_patterns/03-organisms/search/search.test.js

This time you’ll see that it only ran one test against one browser and took 28 seconds. There’s actually not a huge difference between this and the last run because we can run three tests in parallel. And since we only have two tests, we’re not hitting the queue which would add significantly to the entire test suite run time. If we had two dozen tests, and each ran against three browsers, that’s a lot of queue time, whereas even running the entire suite against one browser would be a significant savings. And obviously, one test against one browser will be faster than the full suite of tests and browsers.

So this is super useful for active development of a specific component or element that has issues in one browser, as well as when you’re refactoring code to make it more performant and want to make sure your changes don’t break anything significant (or if they do, alert you sooner rather than later). Once you’re done with your work, I’d still recommend running the full suite to make sure your changes didn’t inadvertently affect another random part of the site.

So, those are the basics of how to set up and run visual regression tests. In the next post, I’ll dive into our philosophy of what we test, when we test, and how it fits into our everyday development workflow.

Brian Lewis

Brian Lewis is a frontend engineer at Four Kitchens, and is passionate about sharing knowledge and learning new tools and techniques.

July 11th, 2018

Someone recently asked the following question in Slack. I didn’t want it to get lost in Slack’s history, so I thought I’d post it here:

Question: I’m setting a CSS background image inside my Pattern Lab footer template which displays correctly in Pattern Lab; however, Drupal isn’t locating the image. How is sharing images between PL and Drupal supposed to work?

My Answer: I’ve been using Pattern Lab’s built-in data.json files to handle this lately. e.g. you could do something like:

footer-component.twig:

...
{% set footer_background_image = footer_background_image|default('/path/relative/to/drupal/root/footer-image.png') %}
...

This makes the image load for Drupal, but fails for Pattern Lab.

At first, to fix that, we used the footer-component.yml file to set the path relative to PL. e.g.:

footer-component.yml:

footer_background_image: /path/relative/to/pattern-lab/footer-image.png

The problem with this is that on every Pattern Lab page where we included the footer component, we had to add that line to the page’s yml file. e.g.:

basic-page.twig:

...
{% include '/whatever/footer-component.twig' %}
...

basic-page.yml:

...
footer_background_image: /path/relative/to/pattern-lab/footer-image.png
...

Rinse and repeat for each example page… That’s annoying.

Then we realized we could take advantage of Pattern Lab’s global data files.

So with the same footer-component.twig file as above, we can skip the yml files, and just add the following to a data file.

theme/components/_data/paths.json: (* see P.S. below)

{
  "footer_background_image": "/path/relative/to/pattern-lab/footer-image.png"
}

Now, we can include the footer component in any example Pattern Lab pages we want, and the image is globally replaced in all of them. Also, Drupal doesn’t know about the json files, so it pulls the default value, which of course is relative to the Drupal root. So it works in both places.
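
Putting it all together, a hypothetical footer-component.twig using this pattern could look like the following (the paths and class names are illustrative, not from the original post):

{# The default path is what Drupal uses; Pattern Lab overrides the
   variable via components/_data/paths.json. #}
{% set footer_background_image = footer_background_image|default('/themes/custom/mytheme/images/footer-image.png') %}
<footer class="footer" style="background-image: url('{{ footer_background_image }}');">
  {# footer content #}
</footer>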

We did this with our icons in Emulsify:

_icon.twig

paths.json

End of the answer to your original question… Now for a little more info that might help:

P.S. You can create as many json files as you want here. Just be careful you don’t run into name-spacing issues. We accounted for this in the header.json file by namespacing everything under the “header” array. That way the footer nav doesn’t pull our header menu items, or vice versa.
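
A namespaced data file might look something like this (the keys and values are illustrative):

{
  "header": {
    "menu_items": [
      { "title": "About", "url": "/about" },
      { "title": "Contact", "url": "/contact" }
    ]
  }
}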

Example homepage home.twig that pulls menu items for the header and the footer from data.json files

header.json

footer.json

Brian Lewis

June 18th, 2018

Last month, Ithaca College introduced the first version of what will represent the biggest change to the college’s website technology, design, content, and structure in more than a decade—a redesigned and rebuilt site that’s more visually appealing and easier to use.

Over the past year, the college and its partners Four Kitchens and design firm Beyond have been hard at work in a Team Augmentation capacity to support the front-to-back overhaul of Ithaca.edu to better serve the educational and community goals of Ithaca’s students, faculty, and staff. The results of the team’s efforts can be viewed at https://www.ithaca.edu.

Founded in 1892, Ithaca College is a residential college dedicated to building knowledge and confidence through a continuous cycle of theory, practice, and performance. Home to some 6,500 students, the college offers more than 100 degree programs in its schools of Business, Communications, Humanities and Sciences, Health Sciences and Human Performance, and Music.

Students, faculty, and staff at Ithaca College create an active, inclusive community anchored in a keen desire to make a difference in the local community and the broader world. The college is consistently ranked as one of the nation’s top producers of Fulbright scholars, one of the most LGBTQ+ friendly schools in the country, and one of the top 10 colleges in the Northeast.

We emphasized applying automation and continuous integration to focus the team on the efficient development of creative and easy to test solutions.

On the backend, the team—including members of Ithaca’s dev org working alongside Four Kitchens—built a Drupal 8 site. The transition to Drupal 8 keeps the focus on moving the college to current technology for sustainable success. Four Kitchens emphasized applying automation and continuous integration to focus the team on the efficient development of creative and easy-to-test solutions. To achieve that, the team set up automation in CircleCI 2.0 as middleware between the GitHub repository and hosting on Pantheon. GitHub was used throughout the project to implement, automate, and optimize visual regression testing, communication between systems, and a solid workflow, ensuring fast and effective release cycles. Learn from the experiences gained from implementing the automation pipeline in the following posts:

The frontend focused heavily on the Atomic Design approach. The frontend team utilized Emulsify and Pattern Lab to facilitate component-based design and architecture. This again fostered long-term ease of use and success for Ithaca College.

The team worked magic with content migration. Using the brainchild of Web Chef David Diers, the team devised a plan to migrate portions of the site one by one. Subsites corresponding to schools or departments were moved from the legacy CMS to special Pantheon multidevs that were built off the live environment. Content managers then performed a moderated adaptation and curation process to ensure legacy content adhered to the new content model. A separate migration process then imported the content from the holding environment into the live site. This process allowed Ithaca College’s many content managers to thoroughly vet the content that would live on the new site and gave them a clear path to completion. Learn more about migrating using Paragraphs here: Migrating Paragraphs in Drupal 8

Steady scrum rhythm, staying agile, and consistently improving along the way.

In addition to the stellar dev work, a large contributor to the project’s success was establishing a steady scrum rhythm, staying agile, and consistently improving along the way. Each individual and unit solidified into a team through daily 15-minute standups, weekly backlog grooming meetings, weekly ‘Developer Showcase Friday’ meetings, regular sprint planning meetings, and biweekly retrospective meetings. This has been such a shining success that the internal Ithaca team plans to carry this rhythm forward even after the Web Chefs’ engagement is complete.

Engineering and Development Specifics

  • Drupal 8 site hosted on Pantheon Elite, with GitHub as the canonical source of code and CircleCI 2.0 as the continuous integration and delivery platform
  • Hierarchical and decoupled architecture based mainly on the use of group entities (Group module) and entity references that allowed the creation of subsite-like internal spaces.
  • Selective use of configuration files, using custom and contrib solutions like the Config Split and Config Ignore modules, to create different database projections of a shared codebase.
  • Migration process based on two migration groups with an intermediate holding environment for content moderation.
  • Additional migration groups support the indexing of not-yet-migrated, raw legacy content for Solr search, as well as the events thread brought in through our Localist integration.
  • Living style guide for site editors, created by integrating Twig components with Drupal templates
  • Automated visual regression testing
Aerial view of the Ithaca College campus, from the Ithaca College homepage.

A well-deserved round of kudos goes to the team. As a Team Augmentation project, its success was made possible by the dedicated work and commitment to excellence of the Ithaca College project team. The leadership provided by Dave Cameron as Ithaca Product Manager, Eric Woods as Ithaca Technical Lead and Architect, and John White as Ithaca Dev for all things legacy was crucial to that success. Ithaca College’s Katherine Malcuria, Senior Digital User Interface Designer, led the creation of design elements for the website.

Katherine Malcuria, Senior Digital User Interface Designer, works on design elements of the Ithaca.edu website

Ithaca Dev Michael Sprague and Web Chef David Diers, Architect, as well as former Web Chef Chris Ruppel, Frontend Engineer, also stepped in for various periods of time on the project. At the tail end of the project, Web Chef Brian Lewis introduced a new baby Web Chef to the world, so the amazing Randy Oest, Senior Designer and Frontend Engineer, stepped in to help push this to the finish line from a frontend dev perspective. James Todd, Engineer, pitched in as a ‘jack of all trades’ connoisseur, helping out where needed.

The Four Kitchens Team Augmentation team for the Ithaca College project was led by Brandy Jackson, Technical Project Manager, playing the roles of project manager, scrum master, and product owner interchangeably as needed. Joel Travieso, Senior Drupal Engineer, was the technical lead, backend developer, and technical architect. Brian Lewis, Frontend Engineer, meticulously worked magic in implementing intricate design elements that were provided by the Ithaca College design team, as well as a third-party design firm, Beyond, at different parts of the project.

A final round of kudos goes out to the larger Ithaca project team. From content to DevOps to quality assurance, there are too many to name. A successful project would not have been possible without their collective efforts as well.

The success of the Ithaca College Website is a great example of excellent team unity and collaboration across multiple avenues. These coordinated efforts are a true example of the phrase “teamwork makes the dream work.” Congratulations to all for a job well done!

Special thanks to Brandy Jackson for her contribution to this launch announcement. 

Four Kitchens

The place to read all about Four Kitchens news, announcements, sports, and weather.

June 6th, 2018

I recently had the privilege of helping PRI.org launch a new React Frontend for their Drupal 7 project. Although I was fairly new to using React, I was able to lean on Four Kitchens’ senior JavaScript engineering team for guidance. I thought I might take the opportunity to share some things I learned along the way in terms of organization, code structuring and packages.

Organization

As a lead maintainer of Emulsify, I’m no stranger to component-driven development and building a user interface from minimal, modular components. However, building a library of React components provided me with some new insights worth mentioning.

Component Variations

If a component’s purpose starts to diverge, it may be a good time to split the variations in your component into separate components. A perfect example of this can be found in a button component. On any project of scale, you will likely have a multitude of buttons ranging from actual <button> elements to links or inputs. While these will likely share a number of qualities (e.g., styling), they may also vary not only in the markup they use but interactions as well. For instance, here is a simple button component with a couple of variations:

const Button = props => {
  const { url, onClick } = props;
  if (url) {
    return (
      <a href={url}>
        ...
      </a>
    );
  }
  return (
    <button type="button" onClick={onClick}>
      ...
    </button>
  );
};

Even with the simplicity of this example, why not separate this into two separate components? You could even change this component to handle that fork:

function Button(props) {
  ...
  return url ? <LinkBtn {...props} /> : <ButtonBtn {...props} />;
}

React makes this separation so easy, it really is worth a few minutes to define components that are distinct in purpose. Testing against each one will become a lot easier as well.

Reuse Components

While the above might help with encapsulation, one of the main goals of component-driven development is reusability. When you’ve built and tested something well once, building something nearly identical is not only a waste of time and resources but also opens you up to new and unnecessary points of failure. A good example from our project is creating a couple of different types of toggles. For accessible, standardized dropdowns, we introduced the well-supported external library Downshift:

In a separate part of the UI, we needed to build an accordion menu:

Initially, this struck me as two different UI elements, and so we built it as such. But in reality, this was an opportunity I missed to reuse the well-built and tested Downshift library (and in fact, we have a ticket in the backlog to do that very thing). This is a simple example, but as the complexity of the component (or a project) increases, you can see where reuse becomes critical.

Flexibility

And speaking of dropdowns, React components lend themselves to a great deal of flexibility. We knew the “drawer” part of the dropdown would need to contain anything from an individual item to a list of items to a form element. Because of this, it made sense to make the drawer contents as flexible as possible. By using the open-ended children prop, the dropdown container could simply just concern itself with container level styling and the toggling of the drawer. See below for a simplified version of the container code (using Downshift):

export default class Dropdown extends Component {
  static propTypes = {
    children: PropTypes.node
  };
  static defaultProps = {
    children: []
  };
  render() {
    const { children } = this.props;
    return (
      <Downshift>
        {({ isOpen }) => (
          <div className="dropdown">
            <Button className="btn" aria-label="Open Dropdown" />
            {isOpen && <div className="drawer">{children}</div>}
          </div>
        )}
      </Downshift>
    );
  }
}

This means we can put anything we want inside of the container:

<Dropdown>
  <ComponentOne />
  <ComponentTwo />
  <span>Whatever</span>
</Dropdown>

This kind of maximum flexibility with minimal code is definitely a win in situations like this.

Code

The Right Component for the Job

Even though the React documentation spells it out, it is still easy to forget that sometimes you don’t need the whole React toolbox for a component. In fact, there’s more than simplicity at stake: writing stateless components may in some instances be more performant than stateful ones. Here’s an example of a hero component that doesn’t need state, following Airbnb’s React/JSX styleguide:

const Hero = ({ title, imgSrc, imgAlt }) => (
  <div className="hero">
    <img data-src={imgSrc} alt={imgAlt} />
    <h2>{title}</h2>
  </div>
);
export default Hero;

When you actually do need a class, there are some optimizations you can make to at least write cleaner (and less) code. Take this Header component example:

import React from 'react';

class Header extends React.Component {
  constructor(props) {
    super(props);
    this.state = { isMenuOpen: false };
    this.toggleOpen = this.toggleOpen.bind(this);
  }
  toggleOpen() {
    this.setState(prevState => ({
      isMenuOpen: !prevState.isMenuOpen
    }));
  }
  render() {
    // JSX
  }
}

export default Header;

In this snippet, we can start by simplifying the React.Component extension:

import React, { Component } from 'react';

class Header extends Component {
  constructor(props) {
    super(props);
    this.state = { isMenuOpen: false };
    this.toggleOpen = this.toggleOpen.bind(this);
  }
  toggleOpen() {
    this.setState(prevState => ({
      isMenuOpen: !prevState.isMenuOpen
    }));
  }
  render() {
    // JSX
  }
}

export default Header;

Next, we can export the component in the same line so we don’t have to at the end:

import React, { Component } from 'react';

export default class Header extends Component {
  constructor(props) {
    super(props);
    this.state = { isMenuOpen: false };
    this.toggleOpen = this.toggleOpen.bind(this);
  }
  toggleOpen() {
    this.setState(prevState => ({
      isMenuOpen: !prevState.isMenuOpen
    }));
  }
  render() {
    // JSX
  }
}

Finally, if we make the toggleOpen() function into an arrow function, we don’t need the binding in the constructor. And because our constructor was really only necessary for the binding, we can now get rid of it completely!

export default class Header extends Component {
  state = { isMenuOpen: false };
  toggleOpen = () => {
    this.setState(prevState => ({
      isMenuOpen: !prevState.isMenuOpen
    }));
  };
  render() {
    // JSX
  }
}

Proptypes

React has some quick wins for catching bugs with built-in typechecking abilities using React.propTypes. When using a Class component, you can also move your propTypes inside the component as static propTypes. So, instead of:

export default class DropdownItem extends Component { ... }

DropdownItem.propTypes = { .. propTypes };
DropdownItem.defaultProps = { .. default propTypes };

You can instead have:

export default class DropdownItem extends Component {
  static propTypes = { .. propTypes };
  static defaultProps = { .. default propTypes };
  render() { ... }
}

Also, if you want to limit the value or objects returned in a prop, you can use PropTypes.oneOf and PropTypes.oneOfType respectively (documentation).
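
For example, here’s a sketch with hypothetical prop names:

import React, { Component } from 'react';
import PropTypes from 'prop-types';

export default class DropdownItem extends Component {
  static propTypes = {
    // Only one of these three string values is allowed.
    size: PropTypes.oneOf(['small', 'medium', 'large']),
    // The label may be a plain string or any renderable node.
    label: PropTypes.oneOfType([PropTypes.string, PropTypes.node])
  };
  render() {
    // JSX
  }
}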

And finally, another place to simplify code: you can destructure the props object in the function parameter definition, like so. Here’s a component before this has been done:

const SvgLogo = props => {
  const { title, inline, height, width, version, viewBox } = props;
  return (
    // JSX
  );
};

And here’s the same component after:

const SvgLogo = ({ title, inline, height, width, version, viewBox }) => (
  // JSX
);

Packages

Finally, a word on packages. React’s popularity lends itself to a plethora of packages available. One of our senior JavaScript engineers passed on some sage advice to me that is worth mentioning here: every package you add to your project is another dependency to support. This doesn’t mean that you should never use packages, merely that it should be done judiciously, ideally with awareness of the package’s support, weight and dependencies. That said, here are a couple of packages (besides Downshift) that we found useful enough to include on this project:

Classnames

If you find yourself doing a lot of classname manipulation in your components, the classnames utility is a package that helps with readability. Here’s an example before we applied the classnames utility:

<div className={`element ${this.state.revealed === true ? 'revealed' : ''}`}>

With classnames you can make this much more readable by separating the logic:

import classNames from 'classnames/bind';

const elementClasses = classNames({
  element: true,
  revealed: this.state.revealed === true
});

<div className={elementClasses}>

React Intersection Observer (Lazy Loading)

IntersectionObserver is an API that provides a way for browsers to asynchronously detect changes of an element intersecting with the browser window. Support is gaining traction and a polyfill is available for fallback. This API could serve a number of purposes, not the least of which is the popular technique of lazy loading to defer loading of assets not visible to the user.  While we could have in theory written our own component using this API, we chose to use the React Intersection Observer package because it takes care of the bookkeeping and standardizes a React component that makes it simple to pass in options and detect events.
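
As a rough sketch of how that can look (the package’s API has changed across versions, so treat the import and props here as assumptions to verify against its docs):

import React from 'react';
import { InView } from 'react-intersection-observer';

// Swap in the real image source only once the element enters the viewport.
const LazyImage = ({ src, alt }) => (
  <InView triggerOnce>
    {({ inView, ref }) => (
      <img ref={ref} src={inView ? src : undefined} alt={alt} />
    )}
  </InView>
);

export default LazyImage;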

Conclusions

I hope passing on some of the knowledge I gained along the way is helpful for someone else. If nothing else, I learned that there are some great starting points out there in the community worth studying. The first is the excellent React documentation. Up to date and extensive, this documentation was my lifeline throughout the project. The next is Create React App, which is actually a great starting point for any size application and is also extremely well documented with best practices for a beginner to start writing code.

Evan Willhite

Evan Willhite is a frontend engineer at Four Kitchens who thrives on creating delightful digital experiences for users, clients, and fellow engineers. He enjoys running, hot chicken, playing music, and being a homebody with his family.

May 8th, 2018

Over the past few months, Four Kitchens has worked together with the Public Radio International (PRI) team to build a robust API in PRI’s Drupal 7 site, and a modern, fresh frontend that consumes that API. This project’s goal was to launch a new homepage in the new frontend. PRI intends to re-build their entire frontend in this new structure and Four Kitchens has laid the groundwork for this endeavor. The site went live successfully, with a noticeable improvement in load time and performance. Overall load time performance increased by 40% with first-byte time down to less than 0.5 seconds. The results of the PRI team’s efforts can be viewed at PRI.org.

PRI is a global non-profit media company focused on the intersection of journalism and engagement to effect positive change in people’s lives. PRI’s mission is to serve audiences as a distinctive content source for information, insights and cultural experiences essential to living in our diverse, interconnected world.

Overall load time performance increased by 40% with first-byte time down to less than 0.5 seconds.

Four Kitchens and PRI approached this project with two technical goals. The first was to design and build a full-featured REST API in PRI’s existing Drupal 7 application. We used RESTFul, a Drupal module for building APIs, to create a JSON-API compliant API.

Our second technical goal was to create a robust frontend backed by the new API. To achieve that goal, we used React to create component-based user interfaces and styled them using the CSS Modules pattern. This work was done in a library of components in which we used Storybook to demonstrate and test the components. We then pulled these components into a Next-based application, which communicates with the API, parses incoming data, and uses that data to populate component properties and generate full pages. Both the component library and the Next-based application used Jest and Enzyme heavily to create thorough, robust tests.

A round of well-deserved kudos to the PRI team: Technical Project Manager Suzie Nieman managed this project from start to finish, facilitating estimations that led the team to success. Senior JavaScript Engineer Patrick Coffey provided keen technical leadership as well as deep architectural knowledge to all facets of the project, keeping the team unblocked and motivated. Engineer James Todd brought his Drupal and JavaScript expertise to the table, architecting and building major portions of PRI’s new API. Senior Frontend Engineer Evan Willhite brought his wealth of frontend knowledge to build a robust collection of elegant components in React and JavaScript. Architect David Diers created mechanisms for managing PRI’s API documentation that can be used in future projects.

Special thanks to Patrick Coffey and Suzie Nieman for their contributions to this launch announcement. 


April 5th, 2018

DrupalCon Nashville has lifted the veil on sessions at this year’s event and we’re thrilled to be a part of it! Our Web Chefs will be giving talks, facilitating the Business Summit, and running BOFs, so keep an eye out for our green jackets. We’re always happy to have a conversation!


Michal Minecki
Director of Technology at Four Kitchens


Patrick Coffey
Senior JavaScript Engineer at Four Kitchens

Recently there have been strides in web-based VR which enable producers to publish VR experiences via the web. Four Kitchens has been keeping an eye on these technologies and we want to share our experiences building real WebVR applications.


Joel Travieso
Senior Drupal Engineer at Four Kitchens

Any amount of automation is worth it, as long as it is effective. From simple things like manipulating pull request labels and ticket statuses, or using your CI engine to build your changelogs, to strategic operations like removing obsolete Pantheon environments or ensuring you always pick the right database for your build, little chunks of automation can substantially improve your workflow.


Adam Erickson
Senior Drupal Engineer


Jeff Tomlinson
Architect

Drupal’s core search can only take you so far. In this session, we will talk about what it takes to ramp up the search functionality of your site by using Search API and Solr. We can achieve this with the addition of a few modules, configuration adjustments, and the set-up of a view. We will take you from getting a plan in place all the way through to monitoring your site’s search usage and looking for ways to make improvements.


Randy Oest
Senior Designer and Frontend Engineer

With the growing shift towards a decoupled future, a company’s presence is going to be represented by an ever-expanding collection of websites, apps, and talking speakers.

Maintaining design and tone consistency across those channels will be challenging but if done right, it can allow you to enter markets more quickly while keeping the style and tone of your company aligned.

Business Summit


Elia Albarran
Director of Operations

Elia will be co-leading the Business Summit, gathering and confirming speakers, giving feedback on the programming and schedule and emceeing the event.


Trasi Judd
Director of Support and Continuous Improvement

Trasi is speaking at the Summit with one of our South American partners, Alejandro Oses from Rootstack, on how to have a good partnership with near-shore vendors.


February 1st, 2018

Paragraphs is a powerful Drupal module that gives editors more flexibility in how they design and lay out the content of their pages. However, Paragraphs are special in that they make no sense without a host entity. If we talk about Paragraphs, it goes without saying that they are to be attached to other entities.
In Drupal 8, individual migrations are built around an entity type. That means we implement a single migration for each entity type. Sometimes we draw relationships between the element being imported and an already imported one of a different type, but we never handle the migration of both simultaneously.
Migrating Paragraphs needs to be done in at least two steps: 1) migrating entities of type Paragraph, and 2) migrating entities referencing imported Paragraph entities.

Migration of Paragraph entities

You can migrate Paragraph entities in a way very similar to the way of migrating every other entity type into Drupal 8. However, a very important caveat is making sure to use the right destination plugin, provided by the Entity Reference Revisions module:

destination:
  plugin: 'entity_reference_revisions:paragraph'
  default_bundle: paragraph_type

This is critical, because you might be tempted to use something more common like entity:paragraph, which would make sense given that Paragraphs are entities. However, you didn’t configure your Paragraph reference field as a conventional Entity Reference field, but as an Entity reference revisions field, so you need to use the appropriate plugin.

An example of the core of a migration of Paragraph entities:

source:
  plugin: url
  data_fetcher_plugin: http
  data_parser_plugin: json
  urls: 'feed.url/endpoint'
  ids:
    id:
      type: integer
  item_selector: '/elements'
  fields:
    -
      name: id
      label: Id
      selector: /element_id
    -
      name: content
      label: Content
      selector: /element_content
process:
  field_paragraph_type_content/value: content
destination:
  plugin: 'entity_reference_revisions:paragraph'
  default_bundle: paragraph_type
migration_dependencies: { }

To give some context, this assumes the feed being consumed has a root level with an elements array filled with content arrays with properties like element_id and element_content, and we want to convert those content arrays into Paragraphs of type paragraph_type in Drupal, with the field_paragraph_type_content field storing the text that came from the element_content property.

Migration of the host entity type

Having imported the Paragraph entities already, we then need to import the host entities, attaching the appropriate Paragraphs to each one’s field_paragraph_type_content field. Typically this is accomplished by using the migration_lookup process plugin (formerly migration).

Every time an entity is imported, a row is created in the mapping table for that migration, with both the ID the entity has in the external source and the internal one it got after being imported. This way the migration keeps a correlation between both states of the data, for updating and other purposes.

The migration_lookup plugin takes an ID from an external source and tries to find an internal entity whose ID is linked to the external one in the mapping table, returning its ID in that case. After that, the entity reference field will be populated with that ID, effectively establishing a link between the entities in the Drupal side.

In the example below, the migration_lookup returns entity IDs and creates references to other Drupal entities through the field_event_schools field:

field_event_schools:
  plugin: iterator
  source: event_school
  process:
    target_id:
      plugin: migration_lookup
      migration: schools
      source: school_id

However, while references to nodes or terms basically consist of the ID of the referenced entity, when using the entity_reference_revisions destination plugin (as we did to import the Paragraph entities), two IDs are stored per entity. One is the entity ID and the other is the entity revision ID. That means the return of the migration_lookup processor is not an integer, but an array of them.

process:
  field_paragraph_type_content:
    plugin: iterator
    source: elements
    process:
      temporary_ids:
        plugin: migration_lookup
        migration: paragraphs_migration
        source: element_id
      target_id:
        plugin: extract
        source: '@temporary_ids'
        index:
          - 0
      target_revision_id:
        plugin: extract
        source: '@temporary_ids'
        index:
          - 1

So, instead of just returning the array (which obviously wouldn’t work), we use the extract process plugin to pull out the integer IDs needed to create an effective reference.

Summary

In summary, it’s important to remember that migrating Paragraphs is a two-step process at minimum. First, you must migrate entities of type Paragraph. Then you must migrate entities referencing those imported Paragraph entities.

More on Drupal 8

Top 5 Reasons to Migrate Your Site to Drupal 8

Creating your Emulsify 2.0 Starter Kit with Drush

Joel Travieso

Joel focuses on the backend and architecture of web projects, seeking to constantly improve by keeping up with the latest developments in the field.

January 18th, 2018

What are Spectre and Meltdown?

Have you noticed your servers or desktops are running slower than usual? Spectre and Meltdown can affect most devices we use daily: cloud servers, desktops, laptops, and mobile devices. For more details, go to: https://meltdownattack.com/

How does this affect performance?

We finally have some answers about how this is going to affect us. After Pantheon patched their servers, they released an article showing the 10-30% negative performance impact their servers are going to have. For the whole article, visit: https://status.pantheon.io/incidents/x9dmhz368xfz

I can say that I personally have noticed my laptop’s CPU is running at much higher percentages than before the update for similar tasks.
Security patches are still being released for many operating systems, but traditional desktop OSs appear to have been covered now. If you haven’t already, make sure your OS is up to date. Don’t forget to update the OS on your phone.

Next Steps?

So what can we do in the Drupal world? First, you should follow up with your hosting provider and verify they have patched your servers. Then you need to find ways to counteract the performance loss. If you are interested in performance recommendations, Four Kitchens offers both frontend and backend performance audits.

As a quick win, if you haven’t already, upgrade to PHP 7, which should give you a performance boost of around 30-50% on PHP processes. Now that you are more informed about what Spectre and Meltdown are, help with the performance effort by volunteering or sponsoring a developer on January 27 and 28, 2018 for the Drupal Global Sprint Weekend 2018, specifically on performance-related issues: https://groups.drupal.org/node/517797

Chris Martin

Chris Martin is a support engineer at Four Kitchens. When not maintaining websites he can be found building drones, computers, robots, and occasionally traveling to China.

December 20th, 2017

One of the most common requests we get in regards to Emulsify is to show concrete examples of components. There is a lot of conceptual material out there on the benefits of component-driven development in Drupal 8—storing markup, CSS, and JavaScript together using some organizational pattern (à la Atomic Design), automating the creation of style guides (e.g., using Pattern Lab) and using Twig’s include, extends and embed functions to work those patterns into Drupal seamlessly. If you’re reading this article you’re likely already sold on the concept. It’s time for a concrete example!

In this tutorial, we’ll build a full site header containing a logo, a search form, and a menu – here’s the code if you’d like to follow along. We will use Emulsify, so pieces of this may be specific to Emulsify and we will try and note those where necessary. Otherwise, this example could, in theory, be extended to any Drupal 8 project using component-driven development.

Planning Your Component

The first step in component-driven development is planning. In fact, this may be the definitive phase in component-driven development. In order to build reusable systems, you have to break down the design into logical, reusable building blocks. In our case, we have 3 distinct components—what we would call in Atomic Design “molecules”—a logo, a search form, and a menu. In most component-driven development systems you would have a more granular level as well (“atoms” in Atomic Design). Emulsify ships with pre-built and highly flexible atoms for links, images, forms, and lists (and much more). This allows us to jump directly into project-specific molecules.

So, what is our plan? We are going to first create a molecule for each component, making use of the atoms listed above wherever possible. Then, we will build an organism for the larger site header component. On the Drupal side, we will map our logo component to the Site Branding block, the search form to the default Drupal search form block, the menu to the Main Navigation block and the site header to the header region template. Now that we have a plan, let’s get started on our first component—the logo.

The Logo Molecule

Emulsify automatically provides us with everything we need to print a logo – see components/_patterns/01-atoms/04-images/00-image/image.twig. Although it is an image atom, it has an optional img_url variable that will wrap the image in a link if present. So, in this case, we don’t even have to create the logo component. We merely need a variant of the image component, which is easy to do in Pattern Lab by duplicating components/_patterns/01-atoms/04-images/00-image/image.yml and renaming it as components/_patterns/01-atoms/04-images/00-image/image~logo.yml (see Pattern Lab documentation).

Next, we change the variables in image~logo.yml as needed and add a new image_link_base_class variable, naming it whatever we like for styling purposes. For those working in a new installation of Emulsify alongside this tutorial, you will notice this file already exists! Emulsify ships with a ready-made logo component. This means we can immediately jump into mapping our new logo component in Drupal.
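For reference, a minimal image~logo.yml might look like the following sketch; the variable names match those passed to the image atom later in this tutorial, while the values are placeholders for your own logo:

    img_url: "/"
    img_src: "/themes/emulsify/logo.svg"
    img_alt: "Home"
    image_blockname: "logo"
    image_link_base_class: "logo"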

Connecting the Logo Component to Drupal

Although you could just write static markup for the logo, let’s use the branding block in Drupal (the block that supplies the theme logo or one uploaded via the Appearance Settings page). These instructions assume you have a local Drupal development environment complete with Twig debugging enabled. Add the Site Branding block to your header region in the Drupal administrative UI to see your branding block on your page. Inspect the element to find the template file in play.

In our case there are two templates—the outer site branding block file and the inner image file. It is best to use the file that contains the most relevant information for your component. Seeing as we need variables like the image alt and image src to map to our component, the most relevant file is the image file itself. Since Emulsify uses Stable as a base theme, let’s check there first for a template file to use. Stable uses core/themes/stable/templates/field/image.html.twig to print images, so we copy that file down to its matching directory in Emulsify, creating templates/fields/image.html.twig (this is the template for all image fields, so you may have to be more specific with this filename). Any time you add a new template file, clear the cache registry to make sure Drupal recognizes the new file. The goal in component-driven development is to have markup in components that simply maps to Drupal templates, so let’s replace the default contents of the image.html.twig file above (<img{{ attributes }}>) with the following:

{% include "@atoms/04-images/00-image/image.twig" with { img_url: "/", img_src: attributes.src, img_alt: attributes.alt, image_blockname: "logo", image_link_base_class: "logo", } %} {%include"@atoms/04-images/00-image/image.twig"with{  img_url:"/",  img_src:attributes.src,  img_alt:attributes.alt,  image_blockname:"logo",  image_link_base_class:"logo",

We’re using the Twig include statement to reuse the markup from our original component and pass in a mixture of static (url, BEM classes) and dynamic (img alt and src) content. To figure out which Drupal variables to use for dynamic content, first see the “Available variables” section at the top of the Drupal Twig file you’re using, then use the Devel module and the kint function to debug the variables themselves. Also, if you’re new to the BEM class variables (Emulsify-specific), see our recent post on why/how we use these variables (and the BEM function) to pass BEM classes into Pattern Lab and the Drupal Attributes object. Basically, the include statement above will print out:

<a class="logo" href="https://www.fourkitchens.com/"> <img class="logo__img" src=”/themes/emulsify/logo.svg" alt="Home"> </a> <aclass="logo"href="/">    <imgclass="logo__img"src=/themes/emulsify/logo.svg" alt="Home">

We should now see our branding block using our custom component markup! Let’s move on to the next molecule—the search form.

The Search Form Molecule

Component-driven development, particularly the division of components into controlled, separate atomic units, is not always perfect. But the beauty of Pattern Lab (and Emulsify) is that there is a lot of flexibility in how you mark up a component. If the ideal approach of using a Twig function to include other smaller elements isn’t possible (or is too time consuming), simply write custom HTML for the component as the situation requires! One area where we lean into this flexibility is in dealing with Drupal’s form markup. Let’s take a look at how you could handle the search block. First, let’s create a form molecule in Pattern Lab.

Form Wrapper

Create a directory in components/_patterns/02-molecules entitled “search-form” with a search-form.twig file with the following contents (markup tweaked from core/themes/stable/templates/form/form.html.twig):

<form {{ bem('search') }}>
  {% if children %}
    {{ children }}
  {% else %}
    <div class="search__item">
      <input title="Enter the terms you wish to search for." size="15" maxlength="128" class="form-search">
    </div>
    <div class="form-actions">
      <input type="submit" value="Search" class="form-item__textfield button js-form-submit form-submit">
    </div>
  {% endif %}
</form>

In this file (code here) we’re doing a check for the Drupal-specific variable “children” in order to pass one thing to Drupal and another to Pattern Lab. We want the markup to be as similar as possible between the two, so I’ve copied the relevant parts of the markup by inspecting the default Drupal search form in the browser. As you can see, there are two classes we need on the Drupal side. The first is on the outer <form> wrapper, so we will need a matching Drupal template to inherit that. Many templates in Drupal will have suggestions by default, but the form template is a great example of one that doesn’t. However, adding a new template suggestion is a minor task, so let’s add the following code to emulsify.theme:

/**
 * Implements hook_theme_suggestions_HOOK_alter() for form templates.
 */
function emulsify_theme_suggestions_form_alter(array &$suggestions, array $variables) {
  if ($variables['element']['#form_id'] == 'search_block_form') {
    $suggestions[] = 'form__search_block_form';
  }
}

After clearing the cache registry, you should see the new suggestion, so we can now add the file templates/form/form--search-block-form.html.twig. In that file, let’s write:
{% include "@molecules/search-form/search-form.twig" %} {%include"@molecules/search-form/search-form.twig"%}

The Form Element

We have only the “search__item” class left, for which we follow a similar process. Let’s create the file components/_patterns/02-molecules/search-form/_search-form-element.twig, copying the contents from core/themes/stable/templates/form/form-element.html.twig and making small tweaks like so:

{%
  set classes = [
    'js-form-item',
    'search__item',
    'js-form-type-' ~ type|clean_class,
    'search__item--' ~ name|clean_class,
    'js-form-item-' ~ name|clean_class,
    title_display not in ['after', 'before'] ? 'form-no-label',
    disabled == 'disabled' ? 'form-disabled',
    errors ? 'form-item--error',
  ]
%}
{%
  set description_classes = [
    'description',
    description_display == 'invisible' ? 'visually-hidden',
  ]
%}
<div {{ attributes.addClass(classes) }}>
  {% if label_display in ['before', 'invisible'] %}
    {{ label }}
  {% endif %}
  {% if prefix is not empty %}
    <span class="field-prefix">{{ prefix }}</span>
  {% endif %}
  {% if description_display == 'before' and description.content %}
    <div{{ description.attributes }}>
      {{ description.content }}
    </div>
  {% endif %}
  {{ children }}
  {% if suffix is not empty %}
    <span class="field-suffix">{{ suffix }}</span>
  {% endif %}
  {% if label_display == 'after' %}
    {{ label }}
  {% endif %}
  {% if errors %}
    <div class="form-item--error-message">
      {{ errors }}
    </div>
  {% endif %}
  {% if description_display in ['after', 'invisible'] and description.content %}
    <div{{ description.attributes.addClass(description_classes) }}>
      {{ description.content }}
    </div>
  {% endif %}
</div>

This file will not be needed in Pattern Lab, which is why we’ve prefixed the name with an underscore. This tells Pattern Lab not to display the file in the style guide. Now we need this markup in Drupal, so let’s add a new template suggestion in emulsify.theme like so:

/**
 * Implements hook_theme_suggestions_HOOK_alter() for form element templates.
 */
function emulsify_theme_suggestions_form_element_alter(array &$suggestions, array $variables) {
  if ($variables['element']['#type'] == 'search') {
    $suggestions[] = 'form_element__search_block_form';
  }
}

And now let’s add the file templates/form/form-element--search-block-form.html.twig with the following code:

{% include "@molecules/search-form/_search-form-element.twig" %} {%include"@molecules/search-form/_search-form-element.twig"%}

We now have the basic pieces for styling our search form in Pattern Lab and Drupal. This was not the fastest element to theme in a component-driven way, but it is a good example of complex concepts that will help when necessary. We hope to make creating form components a little easier in future releases of Emulsify, similar to what we’ve done in v2 with menus. And speaking of menus…

The Main Menu

In Emulsify 2, we have made it a bit easier to work with another complex piece of Twig in Drupal 8: the menu system. The files that do the heavy lifting here are components/_patterns/02-molecules/menus/_menu.twig and components/_patterns/02-molecules/menus/_menu-item.twig (included in the first file). We also already have an example of a main menu component in the directory

themes/emulsify/components/_patterns/02-molecules/menus/main-menu

which is already connected in the Drupal template

templates/navigation/menu--main.html.twig

Obviously, you can use this as-is or tweak the code to fit your situation, but let’s break down the key pieces which could help you define your own menu.

Menu Markup

Ignoring the code for the menu toggle inside the file, the key piece from themes/emulsify/components/_patterns/02-molecules/menus/main-menu/main-menu.twig is the include statement:

<nav id="main-nav" class="main-nav"> {% include "@molecules/menus/_menu.twig" with { menu_class: 'main-menu' } %} </nav> <navid="main-nav"class="main-nav">  {%include"@molecules/menus/_menu.twig"with{    menu_class:'main-menu'

This will use all the code from the original heavy-lifting files while passing in the class we need for styling. For an example of how to stub out component data for Pattern Lab, see components/_patterns/02-molecules/menus/main-menu/main-menu.yml. This component also shows how your styling and JavaScript can live alongside your component markup in the same directory. Finally, you can see a simpler example of using a menu like this in the components/_patterns/02-molecules/menus/inline-menu component. For now, let’s move on to placing our components into a header organism.

The Header Organism

Now that we have our three molecule components built, let’s create a wrapper component for our site header. Emulsify ships with an empty component for this at components/_patterns/03-organisms/site/site-header. In our usage we want to change the markup in components/_patterns/03-organisms/site/site-header/site-header.twig to:

<header class="header"> <div class="header__logo"> {% block logo %} {% include "@atoms/04-images/00-image/image.twig" %} {% endblock %} </div> <div class="header__search"> {% block search %} {% include "@molecules/search-form/search-form.twig" %} {% endblock %} </div> <div class="header__menu"> {% block menu %} {% include "@molecules/menus/main-menu/main-menu.twig" %} {% endblock %} </div> </header> <headerclass="header">  <divclass="header__logo">    {%blocklogo%}      {%include"@atoms/04-images/00-image/image.twig"%}    {%endblock%}  </div>  <divclass="header__search">    {%blocksearch%}      {%include"@molecules/search-form/search-form.twig"%}    {%endblock%}  </div>  <divclass="header__menu">    {%blockmenu%}      {%include"@molecules/menus/main-menu/main-menu.twig"%}    {%endblock%}  </div>

Notice the use of Twig blocks. These will help us provide default data for Pattern Lab while giving us the flexibility to replace those with our component templates on the Drupal side. To populate the default data for Pattern Lab, simply create components/_patterns/03-organisms/site/site-header/site-header.yml and copy over the data from components/_patterns/01-atoms/04-images/00-image/image~logo.yml and components/_patterns/02-molecules/menus/main-menu/main-menu.yml. You should now see your component printed in Pattern Lab.
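As a sketch, the resulting site-header.yml simply aggregates that data. Assuming the logo values shown earlier, it might start like this (the menu entries depend on what your main-menu.yml defines):

    # Logo data copied from image~logo.yml.
    img_url: "/"
    img_src: "/themes/emulsify/logo.svg"
    img_alt: "Home"
    image_blockname: "logo"
    image_link_base_class: "logo"
    # Menu data copied from main-menu.yml goes here; its structure
    # depends on your menu component.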

Header in Drupal

To print the header organism in Drupal, let’s work with the templates/layout/region--header.html.twig file, replacing the default contents with:

{% extends "@organisms/site/site-header/site-header.twig" %} {% block logo %} {{ elements.emulsify_branding }} {% endblock %} {% block search %} {{ elements.emulsify_search }} {% endblock %} {% block menu %} {{ elements.emulsify_main_menu }} {% endblock %} {%extends"@organisms/site/site-header/site-header.twig"%}{%blocklogo%}  {{elements.emulsify_branding }}{%endblock%}{%blocksearch%}  {{elements.emulsify_search }}{%endblock%}{%blockmenu%}  {{elements.emulsify_main_menu }}{%endblock%}

Here, we’re using the Twig extends statement to be able to use the Twig blocks we created in the component. You can also use the more robust embed statement when you need to pass variables like so:

{% embed "@organisms/site/site-header/site-header.twig" with { variable: "something", } %} {% block logo %} {{ elements.emulsify_branding }} {% endblock %} {% block search %} {{ elements.emulsify_search }} {% endblock %} {% block menu %} {{ elements.emulsify_main_menu }} {% endblock %} {% endembed %} {%embed"@organisms/site/site-header/site-header.twig"with{  variable:"something",  {%blocklogo%}    {{elements.emulsify_branding }}  {%endblock%}  {%blocksearch%}    {{elements.emulsify_search }}  {%endblock%}  {%blockmenu%}    {{elements.emulsify_main_menu }}  {%endblock%}{%endembed%}

For our purposes, we can simply use the extends statement. You’ll notice that we are using the elements variable. This variable is not currently listed among the available variables at the top of the Stable region template, but it is extremely useful for printing the blocks currently in that region. Finally, if you’ve added the file, be sure to clear the cache registry; you should then see your full header in Drupal.

Final Thoughts

Component-driven development is not without trials, but I hope we have touched on some of the more difficult ones in this article to speed you on your journey. If you would like to view the branch of Emulsify where we built this site header component, you can see that here. Feel free to sift through and reverse-engineer the code to figure out how to build your own component-driven Drupal project!

This fifth episode concludes our five-part video-blog series for Emulsify 2.x. Thanks for following our Emulsify 2.x tutorials. Miss a post? Read the full series here.

Pt 1: Installing Emulsify | Pt 2: Creating your Emulsify 2.0 Starter Kit with Drush | Pt 3: BEM Twig Function | Pt 4: DRY Twig Approach

Just need the videos? Watch them all on our channel.

Download Emulsify

Web Chef Evan Willhite
Evan Willhite

Evan Willhite is a frontend engineer at Four Kitchens who thrives on creating delightful digital experiences for users, clients, and fellow engineers. He enjoys running, hot chicken, playing music, and being a homebody with his family.

Dec 14 2017
Dec 14
December 14th, 2017

When working with Pantheon, you’re presented with the three typical environments: Dev, Test and Live. This scheme is very common among major hosting providers, and not without reason: it allows you to plan and execute an effective and efficient development process that takes every client need into consideration. We can use CircleCI to manage that process.

CircleCI works based on the circle.yml file located at the root of the project. It is a script of everything the virtual machine will do for you in the cloud, including testing and delivery. The script is triggered by a commit to the repository, unless you have configured it to react only to commits on branches with open pull requests. It is divided into sections, each representing a phase of the Build-Test-Deploy process.
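Roughly, a circle.yml in CircleCI’s 1.0 format is organized like this; machine, dependencies, test, and deployment are the standard sections, while the two shell scripts are placeholders for your own:

    machine:
      php:
        version: 7.0
    dependencies:
      override:
        - composer install
    test:
      override:
        - ./run_tests.sh   # placeholder: your test runner
    deployment:
      dev:
        branch: master
        commands:
          - ./deploy.sh    # placeholder: your deploy script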

In the deployment section, you can put instructions in order to deploy your code to your web servers. A common deployment would look like the following:

deployment:
  dev:
    branch: master
    commands:
      - ./merge_to_master.sh

This literally means: perform the operations listed under commands every time a commit is merged into the master branch. You may think there is no real reason to use a deployment block like this to do an actual deployment, and it’s true: you can do whatever you want there. It’s ideal for performing deployments, but in essence, the deployment section lets you implement conditional post-build subscripts that react differently depending on the nature of the action that triggered the whole build.

The Drops 8 Pantheon upstream for Drupal 8 comes with a very handy circle.yml that can help you set up a basic CircleCI workflow in a matter of minutes. It relies heavily on Pantheon’s Terminus CLI and a couple of plugins like the excellent Terminus Build Tools Plugin, which provides probably the most important call of the whole script:

terminus build:env:create -n "$TERMINUS_SITE.dev" "$TERMINUS_ENV" --yes --clone-content --db-only --notify="$NOTIFY"

The line above creates a multidev environment in Pantheon, merging the code and generated assets in Circle’s VM into the code coming from the dev environment, and cloning the database from there. You can then use Drush to update that database with the configuration changes you just merged in.
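That follow-up step might look something like this; a sketch, since the exact Drush commands depend on how the site manages configuration:

    # Apply pending database updates on the new multidev environment.
    terminus drush "$TERMINUS_SITE.$TERMINUS_ENV" -- updatedb -y
    # Import the configuration changes that were just merged in.
    terminus drush "$TERMINUS_SITE.$TERMINUS_ENV" -- config-import -y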

Once you get to the deployment section, you already have a functional multidev environment. The deployment happens by merging the artifact back into dev:

deployment:
  build-assets:
    branch: master
    commands:
      - terminus build:env:merge -n "$TERMINUS_SITE.$TERMINUS_ENV" --yes

This workflow assumes a simple git workflow where you just create feature branches and merge them into master when they’re ready. It also takes the deployment process only to the point where code reaches dev. This is sometimes not enough.

Integrating release branches

When using a gitflow that involves a central release branch, the perfect environment to host a site that completely reflects the state of the release branch is the dev environment. After all, development is only being actively done in that branch. Assuming your release branch is statically named develop:

deployment:
  dev:
    branch: develop
    commands:
      # Deploy to DEV environment.
      - terminus build-env:merge -n "$TERMINUS_SITE.$TERMINUS_ENV" --yes --delete
      - ./rebuild_dev.sh
  test:
    branch: master
    commands:
      # Deploy to DEV environment.
      - terminus build-env:merge -n "$TERMINUS_SITE.$TERMINUS_ENV" --yes --delete
      - ./rebuild_dev.sh
      # Deploy to TEST environment.
      - terminus env:deploy $TERMINUS_SITE.test --sync-content
      - ./rebuild_test.sh
      - terminus env:clone-content "$TERMINUS_SITE.test" "dev" --yes

This way, when you merge into the release branch, the multidev environment associated with it will get merged into dev and deleted, and dev will be rebuilt.

The feature is available in Dev right after merging into the release branch.

The same happens on release day when the release branch is merged into master, but after dev is rebuilt, it is also deployed to the test environment:

The release branch goes all the way to the test environment.

A few things to notice about this process:

  • The --sync-content option brings database and files from the live environment to test at the same time code is coming there from dev. By rebuilding test, we’re now able to test the latest changes in code against the latest changes in content, assuming live is your primary content entry point.
  • The last Terminus command takes the database from test and sends it back to dev. So, to recap: the database originally came from live, was rebuilt in test using dev’s fresh code, and now goes to dev. At this moment, test and dev are identical, at least until the next commit lands on the release branch.
  • This process facilitates testing. While the next release is already in progress and transforming dev, the client can take all the time they need to give final approval for what’s in test. Once that happens, the deployment to live should occur in a semi-automatic way at most. But nothing really prevents you from using this same approach to automate the deployment to live as well. Well, nothing but good judgment.
  • By using circle.yml to handle the deployment process, you help keep workflow configuration centralized and accessible. With the appropriate system in place, you can trigger a complete and fully automated deployment just by pushing a commit to GitHub, and all you’ll ever need to know about the process is in that single file.
Web Chef Joel Travieso
Joel Travieso

Joel focuses on the backend and architecture of web projects, constantly seeking to improve by keeping up with the latest developments in the field.

Nov 13 2017
Nov 13
November 13th, 2017

Welcome to the fourth episode in our video series for Emulsify 2.x. Emulsify 2.x is a new release that embodies our commitment to component-driven design within Drupal. We’ve added Composer and Drush support, as well as open-source Twig functions and many other changes to increase ease-of-use.

In this video, we’re going to teach you how to best use a DRY Twig approach when working in Emulsify. This blog post accompanies a tutorial video, embedded at the end of this post.

DRYing Out Your Twigs

Although we’ve been using a DRY Twig approach in Emulsify since before the 2.x release, it’s a topic worth addressing because it is unique to Emulsify and provides great benefit to your workflow. After all, what drew you to component-driven development in the first place? Making things DRY, of course!

In component-driven development, we build components once and reuse them together in different combinations—like playing with Lego. In Emulsify, we use Sass mixins and BEM-style CSS to make our CSS as reusable and isolated as possible. DRY Twig simply extends these same benefits to the HTML itself. Let’s look at an example:

Non-DRY Twig:

<h2 class="title">
  <a class="title__link" href="/">Link Text</a>
</h2>

DRY Twig:

<h2 class="title">
  {% include "@atoms/01-links/link/link.twig" with {
    "link_content": "Link Text",
    "link_url": "/",
    "link_class": "title__link",
  } %}
</h2>

The code with DRY Twig is more verbose, but by switching to this method, we’ve now removed a point of failure in our HTML. We’re not repeating the same HTML everywhere! We write that HTML once and reuse it everywhere it is needed.

The concept is simple, and it is found everywhere in the components directory that ships in Emulsify. HTML gets written mostly as atoms and is simply reused in larger components using the default include, extends or embed functions built into Twig. We challenge you to try this in a project, and see what you think.

[embedded content]

Thanks for following our Emulsify 2.x tutorials. Miss a post? Read the full series here.

Pt 1: Installing Emulsify | Pt 2: Creating your Emulsify 2.0 Starter Kit with Drush | Pt 3: BEM Twig Function | Pt 4: DRY Twig Approach | Pt 5: Building a Full Site Header in Drupal

Just need the videos? Watch them all on our channel.

Download Emulsify

Web Chef Evan Willhite
Evan Willhite

Evan Willhite is a frontend engineer at Four Kitchens who thrives on creating delightful digital experiences for users, clients, and fellow engineers. He enjoys running, hot chicken, playing music, and being a homebody with his family.

Oct 26 2017
Oct 26
October 26th, 2017

Welcome to the third episode in our video series for Emulsify 2.x. Emulsify 2.x is a new release that embodies our commitment to component-driven design within Drupal. We’ve added Composer and Drush support, as well as open-source Twig functions and many other changes to increase ease-of-use.

In this video, we’re going to teach you how Emulsify works with the BEM Twig extension. This blog post accompanies a tutorial video, embedded at the end of this post.

Background

In Emulsify 2.x, we have enhanced our support for BEM in Drupal by creating the BEM Twig extension. The BEM Twig extension makes it easy to deliver classes to both Pattern Lab and Drupal while using Drupal’s Attributes object. It also has the benefit of simplifying our syntax greatly. See the code below.

Emulsify 1.x:

{% set paragraph_base_class_var = paragraph_base_class|default('paragraph') %}
{% set paragraph_modifiers = ['large', 'red'] %}

<p class="{{ paragraph_base_class_var }}{% for modifier in paragraph_modifiers %} {{ paragraph_base_class_var }}--{{ modifier }}{% endfor %}{% if paragraph_blockname %} {{ paragraph_blockname }}__{{ paragraph_base_class_var }}{% endif %}">
  {% block paragraph_content %}
    {{ paragraph_content }}
  {% endblock %}
</p>

Emulsify 2.x:

<p {{ bem('paragraph', ['large', 'red']) }}>
  {% block paragraph_content %}
    {{ paragraph_content }}
  {% endblock %}
</p>

In both Pattern Lab and Drupal, the function above will create <p class="paragraph paragraph--large paragraph--red">, but in Drupal it will use the equivalent of <p{{ attributes.addClass('paragraph paragraph--large paragraph--red') }}>, appending these classes to whatever classes core or other plugins provide as well. Simpler syntax + Drupal Attributes support!

We have released the BEM Twig function open source under the Drupal Pattern Lab initiative. It is in Emulsify 2.x by default, but we wanted other projects to be able to benefit from it as well.

Usage

The BEM Twig function accepts four arguments, only one of which is required.

Simple block name:
<h1 {{ bem('title') }}>

In Drupal and Pattern Lab, this will print:

h1 class="title"

Block with modifiers (optional array allowing multiple modifiers):

<h1 {{ bem('title', ['small', 'red']) }}>

This creates:

h1 class="title title--small title--red"

Element with modifiers and block name (optional):

<h1 {{ bem('title', ['small', 'red'], 'card') }}>

This creates:

h1 class="card__title card__title--small card__title--red"

Element with block name, but no modifiers (optional):

<h1 {{ bem('title', '', 'card') }}>

This creates:

h1 class="card__title"

Element with modifiers, block name and extra classes (optional, in case you need non-BEM classes):

<h1 {{ bem('title', ['small', 'red'], 'card', ['js-click', 'something-else']) }}>

This creates:

h1 class="card__title card__title--small card__title--red js-click something-else"

Element with extra classes only (optional):

<h1 {{ bem('title', '', '', ['js-click']) }}>

This creates:

h1 class="title js-click"

Ba da BEM, Ba da boom

With the new BEM Twig extension that we’ve added to Emulsify 2.x, you can easily deliver classes to Pattern Lab and Drupal, while keeping a nice, simple syntax. Thanks for following along! Make sure you check out the other posts in this series and their video tutorials as well!

[embedded content]

Thanks for following our Emulsify 2.x tutorials. Miss a post? Read the full series here.

Pt 1: Installing Emulsify | Pt 2: Creating your Emulsify 2.0 Starter Kit with Drush | Pt 3: BEM Twig Function | Pt 4: DRY Twig Approach | Pt 5: Building a Full Site Header in Drupal

Just need the videos? Watch them all on our channel.

Download Emulsify

Web Chef Evan Willhite
Evan Willhite

Evan Willhite is a frontend engineer at Four Kitchens who thrives on creating delightful digital experiences for users, clients, and fellow engineers. He enjoys running, hot chicken, playing music, and being a homebody with his family.

Oct 13 2017
Oct 13
October 13th, 2017

Welcome to the second episode in our new video series for Emulsify. Emulsify 2.x is a new release that embodies our commitment to component-driven design within Drupal. We’ve added Composer and Drush support, as well as open-source Twig functions and many other changes to increase ease-of-use.

In this video, we’re going to teach you how to create an Emulsify 2.0 starter kit with Drush. This blog post follows the video closely, so you can skip ahead or repeat sections in the video by referring to the timestamps for each section.

PURPOSE [00:15]

This screencast will specifically cover the Emulsify Drush command. The command’s purpose is to set up a new copy of the Emulsify theme.

Note: I used the word “copy” here, not “subtheme,” intentionally. This is because your new copy is a subtheme of Drupal Core’s Stable theme, NOT of Emulsify.

This new copy of Emulsify will use the human-readable name that you provide, and will build the necessary structure to get you on your way to developing a custom theme.

REQUIREMENTS [00:45]

Before we dig in too deep I recommend that you have the following installed first:

  • a Drupal 8 Core installation
  • the Drush CLI, at least major version 8
  • Node.js, preferably the latest stable version
  • a working copy of the Emulsify demo theme, 2.x or greater

If you haven’t already watched the Emulsify 2.0 composer install presentation, please stop this video and go watch that one.

Note: If you aren’t already using Drush 9, you should consider upgrading as soon as possible, because the next minor release of Drupal Core (8.4.0) is only going to work with Drush 9 or greater.

RECOMMENDATIONS [01:33]

We recommend that you use PHP7 or greater, as you get massive performance improvements for very little work.

We also recommend that you use Composer to install Drupal and Emulsify. In fact, if you didn’t use Composer to install Emulsify—or at least run composer install inside of Emulsify—you will get errors. You will also notice errors if npm install failed on the Emulsify demo theme installation.

AGENDA [02:06]

Now that we have everything set up and ready to go, this presentation will first discuss the theory behind the Drush script. Then we will show what you should expect if the installation was successful. After that, I will give you some links to additional resources.

BACKGROUND [02:25]

The general idea of the command is that it creates a new theme from Emulsify’s files but is actually based on Drupal Core’s Stable theme. Once you have run the command, the demo Emulsify theme is no longer required and you can uninstall it from your Drupal codebase.

WHEN, WHERE, and WHY? [02:44]

WHEN: You should run this command before writing any custom code but after your Drupal 8 site is working and Emulsify has been installed (via Composer).

WHERE: You should run the command from the Drupal root or use a Drush alias.

WHY: Why should you NOT edit the Emulsify theme’s files? If you installed Emulsify the recommended way (via Composer), the next time you run composer update, ALL of your custom code changes will be wiped out. If this happens, I really hope you are using version control.

HOW TO USE THE COMMAND? [03:24]

Arguments:

First, the command requires a single argument: the human-readable name. This name can contain spaces and capital letters.

Options:

The command has defaults set for options that you can override.

The first is the theme description, which will appear within Drupal and in your .info file.

The second is the machine name; this option lets you pick the directory name and the machine name as it appears within Drupal.

The third option is the path; this is the path your theme will be installed to. It defaults to “themes/custom”, but if you don’t like that, you can change it to any directory relative to your web root.

The fourth and final option is the slim option. This is for advanced users who don’t need demo content and don’t want anything but the bare minimum required to create a new theme.

Note:

Only the human-readable name is required. The options don’t have to appear in any particular order, don’t have to appear at all, and you can pass just one if you only want to change one of the defaults. An example invocation follows below.
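As a sketch, a full invocation using the options described above might look like this; the option names follow those descriptions, so run drush help emulsify to confirm the exact flags in your version:

    drush emulsify "My Awesome Theme" \
      --description="A custom theme built on Emulsify." \
      --machine-name=my_awesome_theme \
      --path=themes/custom \
      --slim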

SUCCESS [04:52]

If your new theme was successfully created, you should see the success output message. In the example below I used the slim option because it is a bit faster to run, but again, this option is NOT required.

The success message contains information you may find helpful, including the name of the theme that was created, the path where it was installed, and the next required step for setup.

THEME SETUP [05:25]

Setting up your custom theme: navigate to your custom theme on the command line, type yarn, and watch as Pattern Lab is downloaded and installed. If the installation was successful, you should see a Pattern Lab success message, and your theme should now be visible within Drupal.

COMPILING YOUR STYLE GUIDE [05:51]

Now that you have Pattern Lab successfully installed and committed to your version control system, you are probably eager to use it. Emulsify uses npm scripts to set up a local Pattern Lab instance for displaying your style guide.

The script you are interested in is yarn start. Run this command for all of your local development. You do NOT need a working Drupal installation at this point to develop your components.

If you need a designer who isn’t familiar with Drupal to make some tweaks, you only have to give them your codebase; they can run yarn to install and yarn start to see your style guide.
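In other words, the entire hand-off is two commands, assuming Node.js and Yarn are already installed:

    yarn          # install dependencies, including Pattern Lab
    yarn start    # compile assets, watch for changes, and serve the style guide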

It is, however, recommended that the initial setup of your components be done by someone with background knowledge of Drupal templates and themes, as the variables passed to each component will differ for each Drupal template.

For more information on components and templates, keep an eye out for our upcoming demo components and screencasts on building components.

VIEWING YOUR STYLE GUIDE [07:05]

Now that you have run yarn start, you can open your browser and navigate to the localhost URL that appears in your console. If you get an error here, you might already have something running on port 3000. If you need to cancel this script, hit Control + C.

ADDITIONAL RESOURCES [07:24]

Thank you for watching today’s screencast. We hope you found this presentation informative and that you enjoy working with Emulsify 2.0. For additional resources, visit emulsify.info or github.com/fourkitchens/emulsify.

[embedded content]

Thanks for following our Emulsify 2.x tutorials. Miss a post? Read the full series here.

Pt 1: Installing Emulsify | Pt 2: Creating your Emulsify 2.0 Starter Kit with Drush | Pt 3: BEM Twig Function | Pt 4: DRY Twig Approach | Pt 5: Building a Full Site Header in Drupal

Just need the videos? Watch them all on our channel.

Download Emulsify

Web Chef Chris Martin
Chris Martin

Chris Martin is a support engineer at Four Kitchens. When not maintaining websites he can be found building drones, computers, robots, and occasionally traveling to China.

Oct 05 2017
Oct 05
October 5th, 2017

Welcome to the first episode in our new video series for Emulsify. Emulsify 2.x is a new release that embodies our commitment to component-driven design within Drupal. We’ve added Composer and Drush support, as well as open-source Twig functions and many other changes to increase ease-of-use.

In this video, we’re going to get you up and running with Emulsify. This blog post accompanies a tutorial video, which you can find embedded at the end.

Emulsify is, at its core, a prototyping tool. At Four Kitchens we also use it as a Drupal 8 theme starter kit. Depending on how you want to use it, the installation steps will vary. I’ll quickly go over how to install and use Emulsify as a standalone prototyping tool, then I’ll show you how we use it to theme Drupal 8 sites.

Emulsify Standalone

Installing Emulsify core as a standalone tool is a simple process with Composer and NPM (or Yarn).

  1. composer create-project fourkitchens/emulsify --stability dev --no-interaction emulsify
  2. cd emulsify
  3. yarn install (or npm install, if you don’t have yarn installed)

Once the installation process is complete, you can start it with either npm start or yarn start:

  1. yarn start

Once it’s up, you can use either the Local or External link to view the Pattern Lab instance in the browser. (The External link is useful for physical device testing, like on your phone or tablet, but can vary per machine. So, if you’re using hosted fonts, you might have to add a bunch of IPs to your account to accommodate all of your developers.)

The start process runs all of the build and watch commands. So once it’s up, all of your changes are instantly reflected in the browser.

I can add additional colors to the _color-vars.scss file, edit the card.yml example data, or even update the 01-card.twig file to modify the structure of the card component.

That’s really all there is to using Emulsify as a prototyping tool. You can quickly build out your components using component-driven design without having a full web server and site up and running.

Emulsify in a Composer-Based Drupal 8 Installation

It’s generally best practice to install Drupal 8 via Composer, and that’s what we do at Four Kitchens. So, we’ve built Emulsify 2 to work great in that environment. I won’t cover the details of installing Drupal via Composer since that’s out of scope for this video, and there are videos that cover it already. Instead, I’ll quickly run through that process, and then come back and walk through the details of how to install Emulsify in a Composer-based Drupal 8 site.

Okay, I’ve got a fresh Drupal 8 site installed. Let’s install Emulsify alongside it.

From the project root, we’ll run the composer require command:

  • composer require fourkitchens/emulsify

Next, we’ll enable Emulsify and its dependencies:

  • cd web
  • drush en emulsify components unified_twig_ext -y

At this point, we highly recommend you use the Drush script that comes with Emulsify to create a custom clone of Emulsify for your actual production site. The reason is that any change you make to Emulsify core will be overwritten when you update Emulsify, and there’s currently no really good way to create a child theme of a component-based, Pattern Lab-powered Drupal theme. So, the Drush script simply creates a clone of Emulsify and turns the file renaming process into a simple script.

We have another video covering the Drush script, so definitely watch that for all of the details. For this video, though, I’ll just use Emulsify core, since I’m not going to make any customizations.

  • cd web/themes/contrib/emulsify/ (If you do create a clone with the drush script, you’ll cd web/themes/custom/THEME_NAME/)
  • yarn install

  • yarn start

Now we have our Pattern Lab instance up and running, accessible at the links provided.

We can also head over to the “Appearance” page on our site, and set our theme as the default. When we do that, and go back to the homepage, it looks all boring and gray, but that’s just because we haven’t started doing any actual theming yet.

At this point, the theme is installed, and you’re ready to create your components and make your site look beautiful!

[embedded content]

Thanks for following our Emulsify 2.x tutorials. Miss a post? Read the full series here.

Pt 1: Installing Emulsify | Pt 2: Creating your Emulsify 2.0 Starter Kit with Drush | Pt 3: BEM Twig Function | Pt 4: DRY Twig Approach | Pt 5: Building a Full Site Header in Drupal

Just need the videos? Watch them all on our channel.

Download Emulsify

Web Chef Brian Lewis
Brian Lewis

Brian Lewis is a frontend engineer at Four Kitchens, and is passionate about sharing knowledge and learning new tools and techniques.

Sep 28 2017
Sep 28
September 28th, 2017

If your site was built with Drupal within the last few years, you may be wondering what all the D8 fuss is about. How is Drupal 8 better than Drupal 6 or 7? Is it worth the investment to migrate? What do you need to know to make a decision? In this post we’ll share the top five reasons our customers—people like you—are taking the plunge. If you know you’re ready, tell us.

  1. Drupal 8 has a built-in services-based API architecture. That means you can build new apps to deliver experiences across lots of devices quickly, and your content only needs to live in one place. D8’s architecture means you don’t have to structure your data differently for each solution—we’ve helped clients build apps for mobile, Roku, and Amazon Alexa using this approach (read how we helped NBC). If you’re on Drupal 6 now, a migration to Drupal 8 will allow you to unleash the power of your content with API integration.
  2. You can skip Drupal 7 and migrate straight to D8. If you’re on Drupal 6, migrating directly to Drupal 8 is not just doable—it’s advisable. It will ensure every core and contributed module, security patch, and improvement is supported and compatible for your site for longer.
  3. The Drupal 8 ecosystem is ready. One of the reasons people love Drupal is for the amazing variety of modules available. Drupal 8 is mature enough now that most of the major Drupal modules you have already work for D8 sites.
  4. Drupal 8 is efficient. Custom development on Drupal 8 is more efficient than previous versions—we’ve already seen this with our D8 clients and others in the Drupal community are saying the same thing. When you add that to the fact that Drupal 8 is the final version to require migration—all future versions will be minor upgrades—you’ve got a solid business reason to move to Drupal 8 now.
  5. It’s a smart business decision. Drupal 6 is no longer supported—and eventually Drupal 7 will reach “end of life”—which means any improvements or bug fixes you’re making to your existing site will need to be re-done when you do make the move. Migrating to Drupal 8 now will ensure that any investments you make to improving or extending your digital presence are investments that last.

If you’re still not sure what you need, or if you would like to discuss a custom review and recommendation, get in touch. At Four Kitchens, we provide a range of services, including user experience research and design, full-stack development, and support services, each with a strategy tailored to your success.

LET’S TALK!

Read about American Craft Council’s move to Drupal 8
Your site should use component-based theming, here’s how
See what we’ve done for other clients >>
Read more about the services we provide >>
Meet the team >>

Web Chef Todd Ross Nienkerk
Todd Ross Nienkerk

Todd Ross Nienkerk is the CEO and co-founder of Four Kitchens. He was born in a subterranean cave in the future.

Aug 09 2017
Aug 09
August 9th, 2017

We are excited to announce the completion of the second major development phase of our engagement with Forcepoint: improving the authoring experience for editors and implementing a new design.

Reimagining the Editorial Experience

Four Kitchens originally launched Forcepoint’s spiffy new Drupal site in January 2016. Since then, Forcepoint’s marketing strategy has evolved, and they hired a marketing agency to perform some brand consulting, while Four Kitchens implemented their new approach in rebuilding the site. We also took the opportunity to revisit the editorial experience in Drupal’s administrative backend.

Four Kitchens has been using Paragraphs on some recent Drupal 8 projects and found it to be a compelling solution for clients that like to exert substantive editorial control at the individual page level—clients like Forcepoint. Providing content templates for markup that works hand in hand with the component-driven theming approach we favor is a primary benefit we get from using Paragraphs for body content.

Editorially, the introduction of Paragraphs gives Forcepoint a more flexible means of controlling content layout for individual pages without having to rely as heavily on Panels as we did for the initial launch. We’re still using Panels for boilerplate and some content-type-specific data rendering, but the reduced complexity required for editors to lay out body content will allow their content to evolve and scale more easily.

In addition to using Paragraphs for WYSIWYG content entry, Forcepoint editors are now also able to insert and rearrange related content, Views, Marketo forms, videos, and components that require more complex markup to render.

We’re big proponents of carefully crafted content models and structured data. Overusing Paragraphs runs the risk of removing some or even a lot of that structure. Used judiciously, however, it allows us to give clients like Forcepoint the flexibility they want while still enforcing desirable constraints inherent in the design.

Congratulations!

We’ve been working with Forcepoint for over a year now, and are incredibly proud of the solutions we’ve created with them. This kind of close relationship and collaboration is what we strive for with all of our partners. We thrive on understanding our partners’ underlying business challenges and goals, collaborating with their teams, and creating solutions that delight their customers.

The Forcepoint team was led by Chris Devidal as the project manager, working alongside Taylor Smith who acted as internal product owner. Jeff Tomlinson was technical lead and assisted Patrick Coffey who adeptly wrangled all the difficult backend issues. Significant frontend technical leadership was provided by Evan Willhite who worked with Brad Johnson to implement a challenging design. Props also go to Keith Halpin, Neela Joshi and Adam Bennett at Forcepoint for their many contributions.

Web Chef Jeff Tomlinson
Jeff Tomlinson

Jeff Tomlinson enjoys working with clients to provide them with smart solutions to realize their project’s goals. He loves riding his bicycle, too.

Jul 13 2017
Jul 13
July 13th, 2017

When creating the Global Academy for continuing Medical Education (GAME) site for Frontline, we had to tackle several complex content migration problems. The previous site had a lot of legacy content we had to bring over into the new system. By tackling each unique problem, we were able to migrate most of the content into the new Drupal 7 site.

Setting Up the New Site

The system Frontline used before the redesign was called Typo3, along with a suite of individual, internally-hosted ASP sites for conferences. Frontline had several kinds of content that displayed differently throughout the site. The complexity with handling the migration was that a lot of the content was in WYSIWYG fields that contained large amounts of custom HTML.

We decided to go with Drupal 7 for this project so we could more easily reuse code that was created for the MDEdge.com site.

“How are we going to extract the specific pieces of data and get them inserted into the correct fields in Drupal?”

The GAME website redesign greatly improved the flow of the content and how it was displayed on the frontend, and part of that improvement was displaying specific pieces of content in different sections of the page. The burning question that plagued us when tackling this problem was “How are we going to extract the specific pieces of data and get them inserted into the correct fields in Drupal?”

Before we could get deep into the code, we had to do some planning and setup to make sure we were clear in how to best handle the different types of content. This also included hammering out the content model. Once we got to a spot where we could start migrating content, we decided to use the Migrate module. We grabbed the current site files, images and database and put them into a central location outside of the current site that we could easily access. This would allow us to re-run these migrations even after the site launched (if we needed to)!

Migrating Articles

This content on the new site is connected to MDEdge.com via a REST API. One complication is that the content on GAME was added manually to Typo3 and wasn’t tagged for use with specific fields. The content type on the new Drupal site had a few fields for the data we were displaying, plus a field that stores the article ID from MDEdge.com. To get that ID for this migration, we mapped the title of each news article in Typo3 to the title of the article on MDEdge.com. It wasn’t a perfect solution, but it allowed us to do an initial migration of the data.

Conferences Migration

For GAME’s conferences, since there were not too many on the site, we decided to import the main conference data via a Google spreadsheet. The Google doc was a fairly simple spreadsheet that contained a column we used to identify each row in the migration, plus a column for each field that is in that conference’s content type. This worked out well because most of the content in the redesign was new for this content type. This approach allowed the client to start adding content before the content types or migrations were fully built.
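A skeletal Drupal 7 Migrate class for a spreadsheet-driven import might look like the following, assuming the Google doc is exported to CSV; the class name, file path, and field names here are illustrative, not from the actual project:

    class ConferenceMigration extends Migration {

      public function __construct($arguments) {
        parent::__construct($arguments);

        // First column is the unique row ID; one column per Drupal field.
        $columns = array(
          array('id', 'Unique row ID'),
          array('title', 'Conference title'),
          array('start_date', 'Start date'),
        );
        $this->source = new MigrateSourceCSV(
          '/path/to/conferences.csv', $columns, array('header_rows' => 1));
        $this->destination = new MigrateDestinationNode('conference');
        $this->map = new MigrateSQLMap($this->machineName,
          array('id' => array('type' => 'varchar', 'length' => 255, 'not null' => TRUE)),
          MigrateDestinationNode::getKeySchema()
        );

        $this->addFieldMapping('title', 'title');
        $this->addFieldMapping('field_start_date', 'start_date');
      }

    }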

Our spreadsheet handled the top-level conference data, but it did not handle the pages attached to each conference. Page content was either stored in the Typo3 data, or we needed to extract the HTML from the ASP sites.

Typo3 Categories to Drupal Taxonomies

To make sure we mapped the content in the migrations properly, we created another Google doc mapping file that connected the Typo3 categories to Drupal taxonomies. We set it up to support multiple taxonomy terms that could be mapped to one Typo3 category.
[NB: Here is some code that we used to help with the conversion: https://pastebin.com/aeUV81UX.]

Our mapping system worked out fantastically well. The only problem we encountered was that, since we were allowing three taxonomy terms to be mapped to one Typo3 category, the client noticed cases where too many taxonomy terms were assigned to content that had more than one Typo3 category. But this was a content-related issue and simply required them to revisit the mapping document and tweak it as necessary.

Slaying the Beast:
Extracting, Importing, and Redirecting

One of the larger problems we tackled was how to get the HTML from the Typo3 system and the ASP conference sites into the new Drupal 7 setup.

The ASP conference sites were handled by grabbing the HTML for each of those pages and extracting the page title, body, and photos. The migration of the conference sites was challenging because we were dealing with different HTML for different sites and trying to get all those differences matched up in Drupal.

Grabbing the data from the Typo3 sites presented another challenge because we had to figure out where the different data was stored in the database. This was a uniquely interesting process because we had to determine which tables were connected to which other tables in order to figure out the content relationships in the database.

The migration of the conference sites was challenging because we were dealing with different HTML for different sites and trying to get all those differences matched up in Drupal.

A few things we learned in this process:

  • We found all of the content on the current site was in these tables (which are connected to each other): pages, tt_content, tt_news, tt_news_cat_mm and link_cache.
  • After talking with the client, we were able to grab content based on certain Typo3 categories or the pages hierarchy relationship. This helped fill in some of the gaps where a direct relationship could not be made by looking at the database.
  • It was clear that getting 100% of the legacy content wasn’t going to be realistic, mainly because of the loose content relationships in Typo3. After talking to the client we agreed to not migrate content older than a certain date.
  • It was also clear that—given how much HTML was in the content—some manual cleanup was going to be required.

Once we were able to get to the main HTML for the content, we had to figure out how to extract the specific pieces we needed from that HTML.

Once we had access to the data we needed, it was a matter of getting it into Drupal. The migrate module made a lot of this fairly easy with how much functionality it provided out of the box. We ended up using the prepareRow() method a lot to grab specific pieces of content and assigning them to Drupal fields.

Handling Redirects

We wanted to handle as many of the redirects as we could automatically, so the client wouldn’t have to add thousands of redirects and to ensure existing links would continue to work after the new site launched. To do this we mapped the unique row in the Typo3 database to the unique ID we were storing in the custom migration.

As long as you are handling the unique IDs properly in your use of the Migration API, this is a great way to handle mapping what was migrated to the data in Drupal. You use the unique identifier stored for each migration row and grab the corresponding node ID to get the correct URL that should be loaded. Below are some sample queries we used to get access to the migrated nodes in the system. We used UNION queries because the content that was imported from the legacy system could be in any of these tables.

SELECT destid1 FROM migrate_map_cmeactivitynode WHERE sourceid1 IN (:sourceid)
UNION
SELECT destid1 FROM migrate_map_cmeactivitycontentnode WHERE sourceid1 IN (:sourceid)
UNION
SELECT destid1 FROM migrate_map_conferencepagetypo3node WHERE sourceid1 IN (:sourceid)
…
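
For illustration, resolving an incoming legacy URL in Drupal 7 could then look something like this (simplified; distinct placeholders keep the query portable):

// Look up the node that a legacy Typo3 identifier was migrated to.
$nid = db_query('SELECT destid1 FROM {migrate_map_cmeactivitynode} WHERE sourceid1 IN (:id1)
  UNION SELECT destid1 FROM {migrate_map_cmeactivitycontentnode} WHERE sourceid1 IN (:id2)
  UNION SELECT destid1 FROM {migrate_map_conferencepagetypo3node} WHERE sourceid1 IN (:id3)',
  array(':id1' => $legacy_id, ':id2' => $legacy_id, ':id3' => $legacy_id))->fetchField();

if ($nid) {
  // Issue a permanent redirect to the migrated node.
  drupal_goto('node/' . $nid, array(), 301);
}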

Wrap Up

Migrating complex websites is rarely simple. One thing we learned on this project is that it is best to jump deep into migrations early in the project lifecycle, so the big roadblocks can be identified as early as possible. It also is best to give the client as much time as possible to work through any content cleanup issues that may be required.

We used a lot of Google spreadsheets to get needed information from the client. This made things much simpler on all fronts and allowed the client to start gathering needed content much sooner in the development process.

In a perfect world, all content would be easily migrated over without any problems, but this usually doesn’t happen. It can be difficult to know when you have taken a migration “far enough” and you are better off jumping onto other things. This is where communication with the full team early is vital to not having migration issues take over a project.

Chris Roane

When not breaking down and solving complex problems as quickly as possible, Chris volunteers for a local theater called Arthouse Cinema & Pub.

July 5th, 2017

We’re happy to announce the new Global Academy for continuing Medical Education (GAME) site! GAME, by Frontline, provides doctors and medical professionals with the latest news and activities to sharpen their skills and keep abreast of the latest medical technologies and techniques.

As a follow-up to our launch of Frontline Medical Communications’ MDEdge portal last October, the new GAME site takes all of the strengths of MDEdge—strong continuing education materials, interactive video reviews, content focused on keeping medical professionals well-trained—and wraps it all in a fresh new package. The new GAME site is optimized for performance so that visitors can learn from their phones on the go, in the field on their tablets, or at their desktops in the office between meetings. Behind the scenes, site administrators have an interface that streamlines their workflow and allows them to focus on creating content.
[NB: Read our MDEdge launch announcement, here.]

The Project

Four Kitchens worked with the Frontline and GAME teams to…

  • migrate a bevy of static and dynamic content from their existing Typo3 CMS site and ten external ASP-based conference sites.
  • create a method to streamline canonical content sharing between the GAME site and the MDEdge portal through web standard APIs, and a mirror API for automated content creation from the portal to the GAME site.
  • create a single domain home for conferences originally resting on multiple source domains, redirecting as needed while keeping the source domains public for advertising use without requiring extra domain hosting.
  • provide functional test coverage across the platform for high-value functionality using Behat and CircleCI.
  • revise the design and UX of the site to help engage users directly with the content they were seeking.

Engineering and Development Specifics

Check out the new Global Academy for continuing Medical Education (GAME) site today!

  • built on Drupal 7
  • hosted on Pantheon Elite
  • code standards enforced with ESLint and PHP_CodeSniffer
  • site migration via custom migration module plugins and Google Docs mapping
  • custom MDEdge and other 3rd party integrations
  • style guide produced and reviewed using Emulsify

The Team

The Four Kitchens team of Web Chefs included James Todd as technical lead, Chris Roane as lead engineer, Randy Oest as the designer and frontend engineer, and Scott Riley as the project manager. Additional engineering work was completed by Diego Tejera, Justin Riddiough, and Web Chef Patrick Coffey.

James Todd

James tinkers with hardware, software, and everything in between.

June 29th, 2017

Recently I was working on a Drupal 8 project where we were using the improved Features module to create configuration container modules with some special purposes. Due to client architectural needs, we had to move the /features folder into a separate repository. We needed to make it available to many sites in a way that let us keep doing active development on it, and we did so by making the new repo a Composer dependency of all our projects.

One of the downsides of this new direction was its effect on CircleCI builds for individual projects, since installing and reverting features was an important part of those builds. For example, to make a new feature module available, we’d push it to this ‘shared’ repo, but to actually enable it we’d need to push the one-line change in the core.extension.yml config file to our project repo. Yes, we were using a mixed approach: both Features and conventional configuration management.
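
For context, that change is literally a single entry in core.extension.yml (my_shared_feature is a hypothetical module name):

# Excerpt from core.extension.yml. Enabling the new feature module adds
# one line to the module list:
module:
  my_shared_feature: 0
  node: 0
  user: 0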

So a new pull request would be created in both repositories. The problem for Circle builds—given the approach previously outlined—is that builds generated for the pull request in the project repository would pull in the master branch of the ‘shared’ one. So, for the pull request in the project repo, we’d try to build a site by importing configuration that says a particular feature module should be enabled, but that module wouldn’t exist (likely not yet in shared master, since it was still an open pull request), so the build would totally crash.

There is probably no straightforward way to solve this problem, but we came up with a solution that is half code, half strategy. Beyond technical details, there is no practical way to determine which branch of the shared repo should be required for a pull request in the project repo, unless we assume conventions. In our case, we assumed that the correct branch to pair with a project branch was one named the same way. So if a build was the result of a pull request from branch X, we could try to find a PR from branch X in the shared repo, and if it existed, that’d be our guy. Otherwise we’d keep pulling master.

So we created a script to do that:

<?php

$branch = $argv[1];
$github_token = $argv[2];
$github_user = $argv[3];
$project_user = $argv[4];

$shared_repos = array(
  'organization/shared',
);

foreach ($shared_repos as $repo) {
  print_r("Checking repo $repo for a pull request in a '$branch' branch...\n");
  $pr = getPRObjectFromBranch($branch, $github_token, $github_user, $project_user, $repo);
  if (!empty($pr)) {
    print_r("Found. Requiring...\n");
    exec("composer require $repo:dev-$branch");
    print_r("$repo:dev-$branch pulled.\n");
  }
  else {
    print_r("Nothing found.\n");
  }
}

function getPRObjectFromBranch($branch_name, $github_token, $github_user, $project_user, $repo) {
  $ch = curl_init();
  curl_setopt($ch, CURLOPT_URL, "https://api.github.com/repos/$repo/pulls?head=$project_user:$branch_name");
  curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
  curl_setopt($ch, CURLOPT_USERPWD, "$github_user:$github_token");
  curl_setopt($ch, CURLOPT_USERAGENT, "$github_user");
  $output = json_decode(curl_exec($ch), TRUE);
  curl_close($ch);
  return $output;
}

As you probably know, Circle builds are connected to the internet, so you can make remote requests. What we’re doing here is using the GitHub API in the middle of a build in the project repo to connect to our shared repo with cURL and try to find a pull request whose branch name matches the one we’re building over. If the request returns something, then we can safely say there is a branch named the same way as the current one with an open pull request in the shared repo, and we can require it.

What’s left for this to work is actually calling the script:

- php scripts/require_feature_branch.php "$CIRCLE_BRANCH" "$GITHUB_TOKEN" "$CIRCLE_USERNAME" "$CIRCLE_PROJECT_USERNAME"

We can do this at any point in circle.yml, since composer require will actually update the composer.json file, so any other Composer interaction after executing the script will take your requirement into consideration. Notice that the shared repo will be required twice if you keep the requirement in your composer.json file. You could safely remove it from there if you instruct the script to require the master branch when no matching branch is found, although this could have unintended effects in other types of environments, like local development.
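
For example, with CircleCI 1.0 syntax the script could run as a pre-dependency step (exact placement is flexible, as noted):

dependencies:
  pre:
    # Swap in the matching shared-repo branch before anything else runs.
    - php scripts/require_feature_branch.php "$CIRCLE_BRANCH" "$GITHUB_TOKEN" "$CIRCLE_USERNAME" "$CIRCLE_PROJECT_USERNAME"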

Note: A quick reference about the parameters passed to the script:

$GITHUB_TOKEN: # Generate from https://github.com/settings/tokens
$CIRCLE_*: # CircleCI vars, automatically available

[Editor’s Note: The post “Running CircleCI Builds Based on Many Repositories” was originally published on Joel Travieso’s Medium blog.]

Joel Travieso

Joel focuses on the backend and architecture of web projects seeking to constantly improve by considering the latest developments of the art.

Development

Blog posts about backend engineering, frontend code work, programming tricks and tips, systems architecture, apps, APIs, microservices, and the technical side of Four Kitchens.

Read more Development
May 31st, 2017

In the last post, we created a nested accordion component within Pattern Lab. In this post, we will walk through the basics of integrating this component into Drupal.

Requirements

Even though Emulsify is a ready-made Drupal 8 theme, there are some requirements and background to be aware of when using it.

Emulsify is currently meant to be used as a starterkit. In contrast to a base theme, a starterkit is simply enabled as-is, and tweaked to meet your needs. This is purposeful—your components should match your design requirements, so you should edit/delete example components as needed.

There is currently a dependency for Drupal theming, which is the Components module. This module allows one to define custom namespaces outside of the expected theme /templates directory. Emulsify comes with predefined namespaces for the atomic design directories in Pattern Lab (atoms, molecules, organisms, etc.). Even if you’re not 100% clear currently on what this module does, just know all you have to do is enable the Emulsify theme and the Components module and you’re off to the races.
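
For reference, those namespaces are declared in the theme’s .info.yml file via the Components module’s component-libraries key. A trimmed sketch (paths may differ slightly in your copy of Emulsify):

component-libraries:
  atoms:
    paths:
      - components/_patterns/01-atoms
  molecules:
    paths:
      - components/_patterns/02-molecules
  organisms:
    paths:
      - components/_patterns/03-organisms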

Components in Drupal

In our last post we built an accordion component. Let’s now integrate this component into our Drupal site. It’s important to understand which individual components you will be working with. For our purposes, we have two: an accordion item (<dt>, <dd>) and an accordion list (<dl>). Note that these will also correspond to two separate Drupal files. Although this can be built in Drupal in a variety of ways, in the example below each accordion item will be a node and the accordion list will be a view.

Accordion Item

You will first want to create an Accordion content type (machine name: accordion), and we will use the title as the <dt> and the body as the <dd>. Once you’ve done this (and added some Accordion content items), let’s add our node template Twig file for the accordion item by duplicating templates/content/node.html.twig into templates/content/node--accordion.html.twig. In place of the default include function in that file, place the following:

{% include "@molecules/accordion-item/accordion-item.twig"
   with {
      "accordion_term": label,
      "accordion_def": content.body,
   }
%}

As you can see, this is a direct copy of the include statement in our accordion component file except the variables have been replaced. Makes sense, right? We want Drupal to replace those static variables with its dynamic ones, in this case label (the node title) and content.body. If you visit your accordion node in the browser (note: you will need to rebuild cache when adding new template files), you will now see your styled accordion item!

But something’s missing, right? When you click on the title, the body field should collapse, which comes from our JavaScript functionality. While JavaScript in the Pattern Lab component will automatically work because Emulsify compiles it to a single file loaded for all components, we want to use Drupal’s built-in aggregation mechanisms for adding JavaScript responsibly. To do so, we need to add a library to the theme. This means adding the following code into emulsify.libraries.yml:

accordion:
  js:
    components/_patterns/02-molecules/accordion-item/accordion-item.js: {}

Once you’ve done that and rebuilt the cache, you can now use the following snippet in any Drupal Twig file to load that library [NB: read more about attach_library]:

{{ attach_library('emulsify/accordion') }}

So, once you’ve added that function to your node--accordion.html.twig file, you should have a working accordion item. Not only does this function load your accordion JavaScript, but it does so in a way that only loads it when that Twig file is used, and also takes advantage of Drupal’s JavaScript aggregation system. Win-win!

Accordion List

So, now that our individual accordion item works as it should, let’s build our accordion list. For this, I’ve created a view called Accordion (machine name: accordion) that shows “Content of Type: Accordion” and a page display that shows an unformatted list of full posts.

Now that the view has been created, let’s copy views-view-unformatted.html.twig from our parent Stable theme (/core/themes/stable/templates/views) and rename it views-view-unformatted--accordion.html.twig. Inside of that file, we will write our include statement for the accordion <dl> component. But before we do that, we need to make a key change to that component file. If you go back to the contents of that file, you’ll notice that it has a for loop built to pass in Pattern Lab data and nest the accordion items themselves:

<dl class="accordion-item">
  {% for listItem in listItems.four %}
    {% include "@molecules/accordion-item/accordion-item.twig"
      with {
        "accordion_item": listItem.headline.short,
        "accordion_def": listItem.excerpt.long
      }
    %}
  {% endfor %}
</dl>

In Drupal, we don’t want to iterate over this static list; all we need to do is provide a single variable for the Views rows to be passed into. Let’s tweak our code a bit to allow for that:

<dl class="accordion-item">
  {% if drupal == true %}
    {{ accordion_items }}
  {% else %}
    {% for listItem in listItems.four %}
      {% include "@molecules/accordion-item/accordion-item.twig"
        with {
          "accordion_term": listItem.headline.short,
          "accordion_def": listItem.excerpt.long
        }
      %}
    {% endfor %}
  {% endif %}
</dl>

You’ll notice that we’ve added an if statement to check whether “drupal” is true—this variable can actually be anything Pattern Lab doesn’t recognize (see the next code snippet). Finally, in views-view-unformatted--accordion.html.twig let’s put the following:

{% set drupal = true %}
{% include "@organisms/accordion/accordion.twig"
  with {
    "accordion_items": rows,
  }
%}

At the view level, all we need is this outer <dl> wrapper and to just pass in our Views rows (which will contain our already component-ized nodes). Rebuild the cache, visit your view page and voila! You now have a fully working accordion!

Conclusion

We have now not only created a more complex nested component that uses JavaScript… we have done it in Drupal! Your HTML, CSS and JavaScript are where they belong (in the components themselves), and you are merely passing Drupal’s dynamic data into those files.

There’s definitely a lot more to learn; below is a list of posts and webinars to continue your education and get involved in the future of component-driven development and our tool, Emulsify.

Recommended Posts

  • Shared Principles There is no question that the frontend space has exploded in the past decade, having gone from the seemingly novice aspect of web development to a first-class specialization.…
  • Webinar presented by Brian Lewis and Evan Willhite 15-March-2017, 1pm-2pm CDT Modern web applications are not built of pages, but are better thought of as a collection of components, assembled…
  • Welcome to Part Three of our frontend miniseries on style guides! In this installment, we cover the bits and pieces of atomic design using Pattern Lab.
Evan Willhite

Evan Willhite is a frontend engineer at Four Kitchens who thrives on creating delightful digital experiences for users, clients, and fellow engineers. He enjoys running, hot chicken, playing music, and being a homebody with his family.

Development

Read more Development
May 24th, 2017

In the last post, we introduced Emulsify and spoke a little about the history that went into its creation. In this post, we will walk through the basics of Emulsify to get you building lovely, organized components automatically added to Pattern Lab.

Prototyping

Emulsify is at its most basic level a prototyping tool. Assuming you’ve met the requirements and have installed Emulsify, running the tool is as simple as navigating to the directory and running `npm start`. This task takes care of building your Pattern Lab website, compiling Sass to minified CSS, linting and minifying JavaScript.

Also, this single command will start a watch task and open your Pattern Lab instance automatically in a browser. So now when you save a file, it will run the appropriate task and refresh the browser to show your latest changes. In other words, it is an end-to-end prototyping tool meant to allow a developer to start creating components quickly with a solid backbone of automation.

Component-Based Theming

Emulsify, like Pattern Lab, expects the developer to use a component-based building approach. This approach is elegantly simple: write your DRY components, including your Sass and JavaScript, in a single directory. Automation takes care of the Sass compilation to a single CSS file and JavaScript to a single JavaScript file for viewing functionality in Pattern Lab.

Because Emulsify leverages the Twig templating engine, you can build each component HTML(Twig) file and then use the Twig functions include, embed and extends to combine components into full-scale layouts. Sound confusing? No need to worry—there are multiple examples pre-built in Emulsify. Let’s take a look at one below.

Simple Accordion

Below is a simple but common user experience—the accordion. Let’s look at the markup for a single FAQ accordion item component:

<dt class="accordion-item__term">What is Emulsify?</dt>
<dd class="accordion-item__def">A Pattern Lab prototyping tool and Drupal 8 base theme.</dd>

If you look in the components/_patterns/02-molecules/accordion-item directory, you’ll find this Twig file as well as the CSS and JavaScript files that provide the default styling and open/close functionality respectively. (You’ll also see a YAML file, which is used to provide data for the component in Pattern Lab.)
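
That YAML file simply supplies demo values for the component’s variables, along these lines (illustrative; check the actual file for the exact variable names):

accordion_term: "What is Emulsify?"
accordion_def: "A Pattern Lab prototyping tool and Drupal 8 base theme."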

But an accordion typically has multiple items, and HTML definitions should have a dl wrapper, right? Let’s take a look at the emulsify/components/_patterns/03-organisms/accordion/accordion.twig markup:

<dl class="accordion-item">
  {% for listItem in listItems.four %}
    {% include "@molecules/accordion-item/accordion-item.twig"
      with {
        "accordion_item": listItem.headline.short,
        "accordion_def": listItem.excerpt.long
      }
    %}
  {% endfor %}
</dl>

Here you can see that the only HTML added is the dl wrapper. Inside of that, we have a Twig for loop that will loop through our list items and for each one include our single accordion item component above. The rest of the component syntax is Pattern Lab specific (e.g., listItems, headline.short, excerpt.long).

Conclusion

If you are following along in your own local Emulsify installation, you can view this accordion in action inside your Pattern Lab installation. With this example, we’ve introduced not only the basics of component-based theming, but we’ve also seen an example of inheriting templates using the Twig include function. Using this example as well as the other pre-built components in Emulsify, we have what we need to start prototyping!

In the next article, we’ll dive into how to implement Emulsify as a Drupal 8 theme and start building a component-based Drupal 8 project. You can also view a recording of a webinar we made in March. Until then, see you next week!

Recommended Posts

  • Webinar presented by Brian Lewis and Evan Willhite 15-March-2017, 1pm-2pm CDT Modern web applications are not built of pages, but are better thought of as a collection of components, assembled…
  • Welcome to the final post of our frontend miniseries on style guides! In this installment, the Web Chefs talk through how we chose Pattern Lab over KSS Node for Four…
  • Shared Principles There is no question that the frontend space has exploded in the past decade, having gone from the seemingly novice aspect of web development to a first-class specialization.…
Evan Willhite

Evan Willhite is a frontend engineer at Four Kitchens who thrives on creating delightful digital experiences for users, clients, and fellow engineers. He enjoys running, hot chicken, playing music, and being a homebody with his family.

Development

Read more Development
May 17th, 2017

Shared Principles

There is no question that the frontend space has exploded in the past decade, having gone from the seemingly novice aspect of web development to a first-class specialization. At the smaller agency level, being a frontend engineer typically involves a balancing act between a general knowledge of web development and keeping up with frontend best practices. This makes it all the more important for agency frontend teams to take a step back and determine some shared principles. We at Four Kitchens did this through late last summer and into fall, and here’s what we came up with. A system working from shared principles must be:

1. Backend Agnostic

Even within Four Kitchens, we build websites and applications using a variety of backend languages and database structures, and this is only a microcosm of the massive diversity in modern web development. Our frontend team strives to choose and build tools that are portable between backend systems. Not only is this a smart goal internally, it’s also an important deliverable for our clients.

2. Modular

It seems to me the frontend community has spent the past few years trying to find ways to incorporate best practices that have a rich history in backend programming languages. We’ve realized we, too, need to be able to build code structures that can scale without brittleness or bloat. For this reason, the Four Kitchens frontend team has rallied around component-based theming and approaches like BEM syntax. Put simply, we want the UI pieces we build to be as portable as the structure itself: flexible, removable, DRY.

3. Easy to Learn

Because we are aiming to build tools that aren’t married to backend systems and are modular, they should in turn be much more approachable. We want to build tools that help a frontend engineer working in any language to quickly build logically organized, component-based prototypes with little ramp-up.

4. Open Source

Four Kitchens has been devoted to the culture of open-source software from the beginning, and we as a frontend team want to continue that commitment by leveraging and building tools that do the same.

Introducing Emulsify

Knowing all this, we are proud to introduce Emulsify—a Pattern Lab prototyping tool and Drupal 8 starterkit theme. Wait… Drupal 8 starterkit you say? What happened to backend agnostic? Well, we still build a lot in Drupal, and the overhead of it being a starterkit theme is tiny and unintrusive to the prototyping process. More on this in the next post.
[NB: Check back next week for our next Emulsify post!]

With these shared values, we knew we had enough of a foundation to build a tool that would both hold us accountable to these values and help instill them as we grow and onboard new developers. We also are excited about the flexibility that this opens up in our process by having a prototyping tool that allows any frontend engineer with knowledge in any backend system (or none) to focus on building a great UI for a project.

Next in the series, we’ll go through the basics of Emulsify and explain its out-of-the-box strengths that will get you prototyping in Pattern Lab and/or creating a Drupal 8 theme quickly.

Evan Willhite

Evan Willhite is a frontend engineer at Four Kitchens who thrives on creating delightful digital experiences for users, clients, and fellow engineers. He enjoys running, hot chicken, playing music, and being a homebody with his family.

Development

Read more Development
May 9th, 2017

DrupalCon is many things to many people. For me, this year’s North America DrupalCon in Baltimore was a chance to connect with my remote co-workers in the same place, help share knowledge while learning things myself, and celebrate all the things that Drupal makes possible.

The Drupal 8 with React.js and Waterwheel Training

Our first big event was “API First Drupal 8 with React.js and Waterwheel Training”, where Web Chef Luke Herrington took a canonical JavaScript application—a todo list built with React—and hooked it up to Drupal 8 through a new JavaScript library called Waterwheel.js. Todos were stored in a headless Drupal site via the JSON API module, and we even provided a login page and a `like` button for todos. Although we had a small army of Web Chefs available to help, Luke had created such a great training that our extra support wasn’t needed, and the attendees were really able to dive deep into how everything worked.

Future of the CMS: Decoupled

“I’ve completely rewritten my talk,” said Todd, the Four Kitchens CEO, at the team dinner on Monday night. I’ve seen him give this talk before but this declaration really piqued my curiosity.

There were a lot of talks at DrupalCon about the “how” of decoupling, but Todd’s revised talk is a great summary of the “why.” In it, Todd talks about the differences between CMSes being “content management systems” versus “website management systems,” and about how that content can be managed so that it is reusable on all categories of devices. Because the technology is always changing, it’s a talk he rewrites at least once a year, and I’m glad I got to see this version of the 2017 talk when I did.

Supercharge Your Next Web App with Electron

To show off his work in Electron, Web Chef James Todd brought two drawing robots to DrupalCon that he set up in our booth. Each machine was powered by RoboPaint, a packaged-up web app. I’ve been curious about Electron for a while, and when I learned that James was giving a talk on the subject I immediately reached out to help him build his slide deck so that I could learn more. His presentation was thorough and entertaining, and he encouraged people to “experiment and play with it, it’ll be fun”.

Drinks with a Mission

The Drupal community believes that open source technology has the power to improve the lives of others, so instead of the usual DrupalCon party, this year, Four Kitchens teamed up with Kalamuna and Manatí to host “Drinks with a Mission”.

We started the night by asking, “If you had a magic wand that would fix a problem, what problems would you fix?” Answers were written down on post-it notes, which were then sorted into groupings, and finally assigned to teams. Each team took their topic, such as How to Better Connect with Nature, and had to come up with solutions to the topic problem. Great ideas can begin in unexpected places, and the ensuing solutions were as thoughtful as they were hilarious.

Watch the recorded stream of the event: Part 1, Part 2

Taking the Train Home

In the last few years I’ve started to become enamored with the concept of “taking the train”. So at the end of DrupalCon I got my wish, and instead of flying, I spent an entire day traveling by rail: from Baltimore, through Philadelphia’s gorgeous train station, and then on to home in the middle of Pennsylvania.

Recommended Posts

  • A mostly full report on what went down last week in the Big Easy, gonzo journalism -style.
  • Fun & Games DrupalCon Baltimore is next week and we’re so excited to get back together in Baltimore! As the official Drupal Games sponsors, we take fun very seriously and…
  • "API First" or, as some may call it, "Decoupled Drupal", remains a topic of much discussion among the Drupal community. Here are just a few sessions being presented at Drupalcon…
Randy Oest

Randy Oest is an avid Star Trek fan, plays too many board games, and bought his mother an iPad so that he wouldn't have to fix her computer anymore.

April 24th, 2017

Making Huge Strides Back to Desktop

So what is this Electron thing everyone keeps talking about? Even if you haven’t heard of it, you may have used it! Slack’s business-oriented chat system has over 4 million daily users, and its cross-platform desktop application helps Slack reach those users outside of browsers, but the desktop app and the website are in fact part of the same thing.

Back in May 2014, GitHub, prolific bastion of open source and a company valued at $2B, took the custom application wrapper it originally created for its Atom code editor and released it into the world—and Electron was born. Rebranded from “Atom Shell” in 2015, Electron began to take off almost immediately, giving regular web developers the ability to make native-like, high-performance desktop applications using the exact same HTML, CSS, and JavaScript technologies they use to make the rest of the web.

Piggybacking on the huge wave of API-first work in Drupal 8, utilized via the Waterwheel client wrapper, building with Electron allows you to create nearly native desktop experiences using frameworks like React, Redux, Angular, or anything else your team can get running in a web browser. Beyond even that, Electron gives JavaScript direct access to low-level Node.js and operating system APIs, allowing your application direct file access, running custom binaries for data processing, execution of alternative scripting languages, serial port or hardware access, and tons more.

Supercharge Your Next Web App

This year at DrupalCon Baltimore, we present “Supercharge Your Next Web App with Electron,” a session that digs deep and covers everything you need in order to dip into the waters of Electron. We’ll talk about which big companies have already taken the plunge and even provide a checklist for when not to move from the web to a desktop app.

Though an Electron app may not be the right choice for your next application, knowing what tools are available to you—and understanding their incredible possibilities—is going to serve you anytime you’re considering user-oriented frameworks. Don’t miss out on this interesting view into a future of low-energy/high-return desktop applications in the DrupalCon Horizons track this year.

And, during active exposition hours, make sure to come over to the Four Kitchens booth to see a live demo of an Electron app powered by JavaScript—we built a robot artist!

Four Kitchens: We make content go

Recommended Posts

  • In this issue: Launching the new EW.com, MeteorJS; plus Sane Stack, Herp Derpsum, and switching to Sublime Text 3.
  • Fun & Games DrupalCon Baltimore is next week and we’re so excited to get back together in Baltimore! As the official Drupal Games sponsors, we take fun very seriously and…
  • "API First" or, as some may call it, "Decoupled Drupal", remains a topic of much discussion among the Drupal community. Here are just a few sessions being presented at Drupalcon…
James Todd

James tinkers with hardware, software, and everything in between.

Events

Blog posts about ephemeral news, events, parties, conferences, talks—anything with a date attached to it.

Read more Events
April 18th, 2017

Fun & Games

DrupalCon Baltimore is next week and we’re so excited to get back together in Baltimore! As the official Drupal Games sponsors, we take fun very seriously and this year you can be sure to find some exciting things to do at our booth—we won’t spoil the surprise but let’s just say you’ll get to see some of us IRL and IVRL.

And if you visited us last year, you know we are all about that Free Throw game. Our undefeated Web Chef, Brian Lewis, will be there to take on any challenger. We’ve all been practicing and we are READY. Are you?

We’ll also have some of our widely-enjoyed Lightning Talks during lunch intervals right at our booth! Learn something new in just a few minutes, howbowdat? Stop by our booth to check out the schedule.

Web Chef Talks

It’s been an exciting year and the Web Chefs are ready to drop some knowledge, including:

Future of the CMS: Decoupled, Multichannel, and Content-as-a-Service, presented by Four Kitchens Co-Founder and CEO, Todd Ross Nienkerk.

Supercharge Your Next Web App with Electron, presented by Web Chef engineer, James Todd.

Why Klingon Matters for Content: The Secret Power of Language, presented by our content specialist, Douglas Bigham.

Training: API First Drupal 8 with React.js and Waterwheel, a training with JavaScript engineer, Luke Herrington.

Party with a Purpose

Last—but definitely not least—you’re cordially invited to our official DrupalCon gathering, Drinks with a Mission, hosted by Four Kitchens and our friends at Kalamuna and Manatí.

Join us on April 25th at Peter’s Pour House from 6-9pm for lively conversation, free-flowing libations, and a structured forum for hashing out ideas on how to use Drupal to overcome the challenges many of our communities face in today’s national and global political climate.

RSVP here!

See you in BMD!

Oh! The kittens are coming along to Baltimore as well—four of them to be exact—and we can’t wait to reveal this year’s DrupalCon t-shirt design. We’re not kitten around. We wish we could show you right meow.

P.S. Check out the 10-day Baltimore weather forecast.

Lucy Weinmeister

Lucy Weinmeister is the marketing coordinator at Four Kitchens. She loves to share all the new and exciting things the Web Chefs are cooking up at 4K. She is forever reading a book.

Events

Read more Events
March 2nd, 2017

You might have heard about high availability before but didn’t think your site was large enough to justify the extra architecture or overhead. I would like to encourage you to think again and be creative.

Background

DigitalOcean has a concept they call Floating IPs. A Floating IP is an IP address that can be instantly moved from one Droplet to another Droplet in the same data center. This idea is great: it allows you to keep your site running in the event of a failure.

Credit

I have to give credit to BlackMesh for handling this process quite well. The only thing I had to do was create the tickets to change the architecture and BlackMesh implemented it.

Exact Problem

One of our support clients had the need for a complete site relaunch due to a major overhaul in the underlying architecture of their code. Specifically, they had the following changes:

  1. Change in the site docroot
  2. Migration from a single site architecture to a multisite architecture based on domain access
  3. Upgrade of PHP version that required a server replacement/upgrade in linux distribution version

Any of these changes individually could have benefited from this approach. We just bundled all of them together to deliver minimal downtime to the site’s users.

Solution

So, what is the right solution for a data migration that takes well over three hours to run? Site downtime for hours during peak traffic is unacceptable. The answer we came up with was to use a Floating IP that can easily switch the backend server when we are ready to flip the switch. This allowed us to migrate our data on a new, separate server using its own database (essentially having two live servers at the same time).

Benefits

Notice that we didn’t need to change the DNS records here, which meant we didn’t have to wait for them to propagate all over the internet. The new site was live instantly.

Additional Details

Some other notes during the transition that may lead to separate blog posts:

  1. We created a shell script to handle the actual deployment and tested it before the actual “go live” date to minimize surprises.
  2. A private network was created to allow the servers to communicate to each other directly and behind the scenes.
  3. To complicate this process, during development (prelaunch) the user base grew so much that we had to offload the Solr server onto another machine to reduce server CPU usage. This meant that additional backend servers were also involved in this transition.

Go-Live (Migration Complete)

After you have completed your deployment process, you are ready to switch the Floating IP to the new server. In our case we were using keepalived, which responds to a health check on the server. Our health check was a simple PHP file that responded with the text true or false. So, when we were ready to switch, we just changed the health check’s response to false. Then we got an instant switch from the old server to the new server with minimal interruption.
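
The health check itself can be a tiny file; here is a sketch of the idea (ours differed in the details):

<?php

// healthcheck.php — keepalived polls this endpoint. Flipping $healthy to
// FALSE makes this server fail its check, so the Floating IP moves over.
$healthy = TRUE;

header('Content-Type: text/plain');
echo $healthy ? 'true' : 'false';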

Acceptable Losses

There were a few things we couldn’t get around:

  1. The need for a content freeze
  2. The need for a user registration freeze

The reason for this was that the database updates required the site to be in maintenance mode while they were performed.

A problem worth mentioning:

  1. The database did have a few tables where losses were deemed acceptable. The users’ sessions table and the cache_form table were both out of sync when we switched over, so any new sessions and saved forms were unfortunately lost during this process. The result was that users had to log in again and fill out forms that hadn’t been submitted. In the rare event that a user had changed their name or other fields on their preferences page, those changes were lost.

Additional Considerations

  1. Our mail preferences are handled by third parties
  2. Comments aren’t allowed on this site

Recommended Posts

  • Engineers find solving complex problems exciting, but as I’ve matured as an engineer, I’ve learned that complexity isn’t always as compelling as simplicity.
  • Cloudflare Bug May Have Created Security Leak Cloudflare, a major internet host, had some unusual circumstances that caused their servers to output information that contained private information such as HTTP…
  • When you already have a design and are working with finalized content, high fidelity wireframes might be just what the team needs to make decisions quickly.
Chris Martin

Chris Martin is a junior engineer at Four Kitchens. When not maintaining websites he can be found building drones, computers, robots, and occasionally traveling to China.

Development

Read more Development
February 27th, 2017

Drupal at the Beach.
(The Very Windy Beach)

Every year in February, Drupalers from across the country travel to San Diego to get away from the harsh winter and enjoy the perfect 72 degree California weather. Attendees visit Pacific Beach, walk down the boardwalk, and sometimes even go sailing.

Former Web Chefs Matt Grill and Dustin Younse sail through Mission Bay after a weekend at SANDCamp 2016.

This year, however, attendees were met with … a little weather.

San Diegans, like myself, always find weather surprising and novel to the point where any time it rains for more than 10 minutes, we describe it as “really coming down”. But this time it really was pouring. 75 mph gusts of wind, cloudy skies, and a strong atmospheric river causing record rainfall. Drupal was not at the beach this year.

SANDCamp 2017: A little weather.

Drupal Near the Beach

Falling in mid-February every year, SANDCamp affords many speakers the opportunity to field test trainings and sessions before they’re given at DrupalCon.

Drupal 8 with React.js and Waterwheel.js

With the help of my fellow Web Chefs, I presented the first iteration of my training, API First Drupal 8 with React.js and Waterwheel.js, which I’m happy to announce will also be given at DrupalCon Baltimore! In the training, we took the canonical JavaScript application, a todo list built with React, and hooked it up to Drupal 8 through a new JavaScript library called Waterwheel.js. Todos were stored in a headless Drupal site via the JSON API module, and we even provided a login page and a like button for todos. Overall, the feedback on the training was excellent. People enjoyed learning how to turn Drupal 8 into a world-class content API while also getting their feet wet with a frontend JavaScript framework like React. I’m looking forward to improving the training and giving it at DrupalCon Baltimore this year.

Every Project is a Story

One notable session was Dwayne McDaniel’s talk Every project is a story: Applying storytelling to your client interactions, in which he explained how the patterns that form good stories also form good projects, budgets, and discoveries. Dwayne explored these story structures and how they can help translate clients’ and stakeholders’ dreams into real plans.

Kalastatic

The session that caught my interest the most was From Prototype to Drupal Site with Kalastatic. Through a case study, Crispin explained the benefits of component-driven design and showed off Kalastatic, an open-source framework Kalamuna built. It’s a kss-node-driven static site framework for building prototypes and living style guides that integrate with Drupal. It’s a tool very similar to Emulsify, Four Kitchens’ component-driven prototyping tool and Drupal 8 theme. It is great to see the Drupal community converge on component-driven development as a solid pattern for building frontends.

Keynote Surprise!

Due to the inclement weather California was experiencing that week, the scheduled keynote speaker, Darin Andersens, had his flight cancelled and couldn’t be there. Luckily Todd, Four Kitchens’ CEO and co-founder, always has a keynote in his back pocket. He fired up his laptop and gave his talk on The Future of the CMS, pontificating on where the web is going and what CMSes like Drupal must do to stay relevant.

Always Be Keynoting. https://t.co/OIqmOBur3L

— Four Kitchens (@FourKitchens) February 17, 2017

Thanks, SANDCamp!

Maybe I’ll see you at SANDCamp next year! Also, if you’ll be at DrupalCon Baltimore, sign up for my training, API First Drupal 8 with React.js and Waterwheel.js, and check out the other Four Kitchens Web Chefs, too!

Luke Herrington

Luke Herrington writes JavaScript for work and for fun; he enjoys hacking on new technology and reading about the ethics of artificial intelligence.

January 30th, 2017

Welcome to the final episode of Season 2 of Sharp Ideas! On this episode, Randy and Doug talk to Four Kitchens directors Elia Albarran, Todd Nienkerk, and Aaron Stanush, about keeping your team happy, working ethically with clients, and how to prepare your people for the future of work.

Broadcasting directly to you from wherever the web meets business and design. You can listen to us on SoundCloud (on the site or download the app!) or find us with your other favorite podcasts on the Stitcher app.

Douglas Bigham

Doug is a writer and ex-academic with a background in digital publics and social language use. He likes dark beer, bright colors, and he speaks a little Klingon.

January 11th, 2017

American Craft Council and Four Kitchens Take the Best in Biz Gold!

The American Craft Council and Four Kitchens have been named gold winners for Website of the Year in the Best in Biz Awards, the only independent business awards program judged by members of the press and industry analysts.

The American Craft Council is a national, nonprofit educational organization that has celebrated and promoted American craft for more than 75 years through its award-winning magazine, American Craft, juried fine craft shows in Baltimore, Atlanta, Saint Paul, and San Francisco, an extensive library and archives (print and digital) of craft resources, and more.

“We’re so thrilled to share this honor with Four Kitchens,” said ACC’s executive director, Chris Amundsen. “They have been a fantastic partner to work with on our website redesign. With their guidance and expertise, our new site now better serves our members and the broader craft community, and it more effectively helps us fulfill our mission to champion craft. We’ve received such a positive reception from the craft community, and now it’s wonderful to be recognized with a Best in Biz award for all we’ve achieved together.”

“Our partnership with American Craft Council has been a wonderful experience,” said Todd Ross Nienkerk, CEO and co-founder of Four Kitchens. “We’re very happy to have earned this award for such an interesting project, and I applaud the hard work of everyone at ACC and the expertise of the Four Kitchens Web Chefs who led the way.”

Winners of Best in Biz Awards 2016 were determined based on scoring from an independent panel of 50 judges from widely known newspapers, business, consumer and technology publications, TV outlets, and analyst firms.

Read the full press release here.

Do you have a project you’d like to run by us? Give us a shout!
Would you like to join an award-winning team? We’re hiring!

Lucy Weinmeister

Lucy Weinmeister is the marketing coordinator at Four Kitchens. She loves to share all the new and exciting things the Web Chefs are cooking up at 4K. She is forever reading a book.

Client Stories

Launch announcements, blog posts about the project management process on a specific job, technical posts about the implementation of a feature for a specific client, announcements that a client’s website has won an award.

Read more Client Stories
January 6th, 2017

On this episode of Sharp Ideas, Randy and Doug talk to Jen Lampton, co-founder of Backdrop CMS, about changes in the Drupal community, the importance of open source, and how to make sure we’re hearing a diversity of voices in our projects.

Broadcasting directly to you from wherever the web meets business and design. You can listen to us on SoundCloud (on the site or download the app!) or find us with your other favorite podcasts on the Stitcher app.

Douglas Bigham

Doug is a writer and ex-academic with a background in digital publics and social language use. He likes dark beer, bright colors, and he speaks a little Klingon.

December 23rd, 2016

On this episode of Sharp Ideas, Doug and Randy are joined from the basement of BADCamp X by Jon Peck and Heather Rodriguez.

Recorded on-site at BADCamp 2016, we’re talking the history and principles of BADCamp (the Bay Area Drupal Camp), the importance of human diversity in the tech world, the values and ethics of the open source movement, and staying aware of imposter syndrome when you’re giving back to your community.

Broadcasting directly to you from wherever the web meets business and design. You can listen to us on SoundCloud (on the site or download the app!) or find us with your other favorite podcasts on the Stitcher app.

Douglas Bigham

Doug is a writer and ex-academic with a background in digital publics and social language use. He likes dark beer, bright colors, and he speaks a little Klingon.

December 13th, 2016

WordPress is growing. It currently runs more than one quarter of all websites on the internet, including Four Kitchens’ own website. I’ve been immersed in Drupal for the last five years or so, but I’m curious what is going on with WordPress and its community. And so I bought a ticket to WordCamp US, dusted off my WordPress skills (which I hadn’t used in over a quinquennium), and drove to Philadelphia.

What is WordCamp US?

WordCamp US (WCUS) is a conference that focuses on everything WordPress. People from around the world attend—from casual users to core developers—to participate, share ideas, and get to know each other.

Community

The first thing that I noticed about WCUS is that WordPress has a huge umbrella—international travelers were plentiful, there were a lot of women, and there was a wide range of diversity. There was even a 10-year-old boy in a hallway, face in his laptop, working on his WordPress blog for Minecraft.

The sessions were set up to be accessible to everyone. Each presenter’s slide deck had a space at the top for closed captioning that was done live at the event. And for those who couldn’t make it to the event, every session was recorded and live-streamed in real time.

Everyone was welcoming, questions were encouraged, and conversation flowed. I was upfront with everyone that I was a Drupal developer exploring a foreign land and I got a lot of good information about the WordPress ecosystem.

Comparing Modules and Plugins

Drupal and WordPress both share a love for being open source. Both communities strongly encourage contributing back to the project. But there is one place where Drupal and WordPress have very different opinions—paid modules and plugins.

Drupal modules generally provide building blocks for developers to use as they implement custom solutions for clients. In WordPress, this is sometimes the case, but usually WordPress plugins are complete solutions for a need. For example, to implement a custom intranet with user groups and a Facebook-style feed, a Drupal dev would install a few modules, build some views, and style the new theme elements—and that would all take time and expertise to put together. To accomplish the same thing in WordPress, a user (who doesn’t even have to be a developer) would simply install the BuddyPress plugin and fill out some administration choices.

I believe that because of this difference between modules and plugins, the WordPress community welcomes paid plugins. And just because they are paid doesn’t mean they get to be proprietary. The expectation for paid plugins is that they remain open source; what you are paying for is a license for upgrades and support. A lot of the people I talked to either have their own plugins that they sell as part of their own business or make generous use of paid plugins. Why not pay $100 for a full-featured calendar plugin that saves you hours (or weeks) of work?

Looking Forward to WordPress

I enjoyed my trip to WCUS and exploring WordPress. It is a great community and I’m looking forward to continuing to explore it more. Right now I’m looking into development workflows, so if you have any advice, I’d love to hear it in the comments.

Recommended Posts

  • In this issue: Launching the new EW.com, MeteorJS; plus Sane Stack, Herp Derpsum, and switching to Sublime Text 3.
  • The Drupal community is self-reflective enough to see the flaws in the project and brave enough to reinvent itself.
  • Halloween is over, but have one last batch of _spoopy_ links to kill off your Friday. Here's what we've been talking about this week…
Randy Oest

Randy Oest is an avid Star Trek fan, plays too many board games, and bought his mother an iPad so that he wouldn't have to fix her computer anymore.

November 17th, 2016

Background

Automated (or “living”) style guides are a popular topic in frontend development these days. And it’s no surprise, as they benefit the integrity of the project as well as ease design and development (as you can see in Part 1 of this miniseries!). Four Kitchens has a long history of using static style guides within projects, but as the frontend team re-evaluated our process this year, we wanted to standardize around an automated style guide for future projects. By automating this part of our process, we ensure the style guide always reflects the latest design and code during the entire life of the project.

We began by evaluating existing automated style guide tools and narrowed the selection down to a couple that made sense alongside our existing toolset: KSS and Pattern Lab. We then committed to testing these in real projects to expose strengths/weaknesses so that we could eventually choose the best tool for our needs. Randy discussed KSS in a previous article, and in this article we will explore Pattern Lab.

Pattern Lab & Atomic Design

Pattern Lab is one of the more established style guide tools out there and is the brainchild of Brad Frost, best known for his “atomic design” principles. When evaluating Pattern Lab, it’s best to start by understanding atomic design.

Atomic Design

Put simply, atomic design just asserts that like organic matter, design systems can be broken down into their smallest pieces and built up from there. In web development, this means we shift from the mentality of building “pages” to breaking down designs into patterns, organized smallest to largest, and use these building-block patterns to develop the project. Here are the categories commonly used in this approach:

  1. Atoms: simple HTML tags (e.g., <button>, <input type="text" />, </button>
    <h1>
    , <a>, </a></h1>)
  2. Molecules: small combinations of atoms (search form, menu, lists)
  3. Organisms: groups of molecules forming a distinct design section (header, masthead, footer)
  4. Templates: organisms combined to form contextualized designs
  5. Pages: fully actualized templates often with real content

There is a video from the Pattern Lab website that demonstrates this best. Some folks get distracted by the lingo (atoms, molecules, etc.), but you should see these naming conventions as only one way to break down components into a logical structure. Pattern Lab actually allows you to use any category names you want. Pattern Lab does, however, expect you to use atomic design in that it wants you to organize patterns smallest to largest regardless of the category names.

Pattern Lab

On the technical side, Pattern Lab is a static site generator powered by either PHP or Node that supports Mustache and Twig templating languages. The Node version has Grunt and gulp support as well. Importantly, Pattern Lab is open-source and actively maintained.

In terms of built-in perks, Pattern Lab not only ships with a nice stock UI, it allows you to populate pattern data using JSON or YAML and then annotate the patterns using Markdown. It also provides a way to easily create pattern variations as well as pattern states (e.g., in progress, needs review, approved). It also ships with a pattern searching tool and a viewport resizer in the toolbar to easily test/demo your components across any screen size.

Building Patterns in Pattern Lab

Patterns are small chunks of HTML that may also contain CSS and/or JavaScript. In other words, there are no technical hurdles for a current Frontend developer to build these components—only the mental shift in breaking a design down to its logical atomic design parts.

Let’s take a look at building a simple button component. We’ll be using Twig as our templating language.

The button component is comprised of a file with the button’s markup (button.twig):


<a href="https://www.fourkitchens.com/blog/article/frontend-style-guide-miniseries-part-three-pattern-lab//{{ url }}" class="button{% if variation %}--{{ variation }}{% endif %}">{{ text }}</a>

and a stylesheet containing the component styles (button.scss)


a.button {
  background-color:#35AA4E;
  border:none;
  color:#fff;
  cursor:pointer;
  font-size:100%;
  padding:1em 2em;
  text-transform:uppercase;

  &:hover {
    background-color:#eee;
    color:#35AA4E;
    text-decoration:underline;
  }

  &--alt {
    @extend .button;
    font-size: 80%;
    padding: .5em 1em;
  }
}

To take full advantage of Pattern Lab, let’s also create some default data (button text and URL) and some annotations to help describe the component. For the data, let’s create a simple button.yml file:


url:
  "/"
text:
  "Default Button"

This is what will populate the Twig variables in our markup above. And now let’s create an informative annotation that will display in the style guide. For this, we’ll create a Markdown file (button.md):


---
el: ".button"
title: "Default button"
---
Here we can see the default button styling in action.

This all shows up in Pattern Lab like this:

Screenshot one.

As you can see, we have our component name, followed by our annotations with code snippets in both Twig and HTML versions (another Pattern Lab perk) and then we have the design element itself, our button.

Let’s now add an alternative button option. It’s as simple as adding an alternative YML file (button~alternative.yml). The tilde character tells Pattern Lab this is a variation, and Pattern Lab uses the name after the tilde as the variation name. Let’s add this content to the file:


url:
  "/"
text:
  "Alternate Button"
variation:
  "alt"

You may have noticed that button.twig contained a check for a variation variable that added the variation as a modifier class (class="button{% if variation %}--{{ variation }}{% endif %}"). This alternate YML file supplies that variable, which means our template will change the class accordingly. This is what Pattern lab looks like now:

Screenshot two.

As you can see, Pattern Lab makes it quick and painless to create components with variations and metadata. From here, Pattern Lab also makes it easy to build nested patterns and to link patterns to one another to form a full, working prototype.

Final Thoughts

Adopting any new technology has its pain points, and it is of course no different with Pattern Lab. The latest version of Pattern Lab (v2) overcame our frontend team’s strongest critiques, namely that Twig is now natively supported and data can be specified in YAML format. I personally also like that annotations can now be written in Markdown, as it is a great format for this type of notation. My only remaining critique is that while writing components comes easily, debugging or tweaking core Pattern Lab does take some effort as the codebase is fairly large. This critique, for me, is far outweighed by all the built-in perks I mentioned above but I thought it worth mentioning.

Overall, I would argue Pattern Lab is one of the strongest contenders on the market for creating an automated styleguide. If you would like to learn more, consider reading through the documentation on their website or jumping into the codebase. Mostly, I would recommend downloading and installing Pattern Lab, as it’s the most rewarding way to learn atomic design while building an automated styleguide.

Stay tuned next week for the thrilling conclusion of our Frontend Style Guide Miniseries!

Recommended Posts

  • In this issue: come see us at DrupalCon Amsterdam, Instagram CSS3 filters, ontology is overrated, Douglas Crockford: the better parts, iPhone promo products from Japan, Go-powered robots, tree simulator MMO,…
  • In this issue: Come hang out with us at DrupalCon LA: we're speaking, training, playing Drupal Games, and bowling with Aten and Kalamuna! Plus, Markdown Here, more responsive media sites,…
  • In this issue: DrupalCamp Stanford 2015, party at DrupalCon Los Angeles, Headless Drupal and Frontend Performance training, and UX methods; plus WIRED's new multimedia stories, the look and feel and…
Evan Willhite
Evan Willhite

Evan Willhite is a frontend engineer at Four Kitchens who thrives on creating delightful digital experiences for users, clients, and fellow engineers. He enjoys running, hot chicken, playing music, and being a homebody with his family.

Design and UX

Posts about user experience: best practices, tools we use, methodologies we love. Posts about the design process: wireframes, colors, shapes, patterns. Frontend posts that focus more on the user interface aspects (visual, aural, etc.) than on coding practices.

Read more Design and UX
Nov 16 2016
Nov 16
November 16th, 2016

Speed Up Migration Development

One of the things that Drupal developers do for clients is content migration. This process uses hours of development time and often has one developer dedicated to it for the first half of the project. In the end, the completeness of the migration depends partly on how much time your client is willing to spend on building out migration for each piece of their content and settings. If you’ve come here, you probably want to learn how to speed up your migration development so you can move on to more fun aspects of a project.

The Challenge

Our client, the NYU Wagner Graduate School of Public Service was no exception when they decided to move to Drupal 8. Since our client had 65 content types and 84 vocabularies to weed through, our challenge was to build all those migrations into their budget and schedule.

The Proposed Solution

Since this was one of our first Drupal 8 sites, I was the first to dig my hands into the migration system. I was particularly stoked in the fact that everything in Drupal 8 is considered to be an entity. This opened up a bunch of possibilities. Also, the new automated migration system—Migrate Drupal—that came with core was particularly intriguing. In fact, our client had started down the path of using Migrate Drupal to upgrade their site to D8. Given they had field collections, entity references, and the fact that the Migrate Drupal module was still very much experimental for Drupal 7 upgrades, this didn’t pan out with a complete migration of their data.

The proposed solution was to use the --configure-only method on the drush tool migrate-upgrade. Doing so would build out templated upgrade configurations that would move all data from Drupal 7 or Drupal 6 to Drupal 8. The added bonus is that you can use that as a starting point and modify them from there.

Migration in Drupal 7 vs Drupal 8

Since we have the 100 mile high view of what the end game is, lets talk a little about why and how this works. In Drupal 7 Migrations are strictly class-based. You can see an example of a Drupal 7 migration in the Migrate Example module. The structure of the migration tends to be one big blob of logic (broken up by class functions of course) around a single migration. Here are the parts:

  • Class Constructor: where you define your source, destination, and field mapping
  • Prepare: a function where you do all your data processing

In Drupal 8, the concept of a migration has been abstracted out into the various parts that makes them reusable and feel more like “building with blocks” approach. You can find an example inside the Migrate Plus module. Here are the parts:

  • Source Plugins: a class defining the query, initial data alteration, keys, and fields provided by the source
  • Destination Plugins: a class defining how to store the data received in Drupal 8
  • Process Plugins: a class defining how to transform data from the source to something that can be used by the destination or other process plugins; you can find a full list of what comes with core in Migrate’s documenation
  • Migration Configuration: a configuration file that brings the configuration of all the source, destination, and process plugins to make a migration

Now yall might have noticed I left out hook_prepare_row. Granted, this is still available. It was also a key way many people used to manipulate data across several source fields that behaved the same. With the ideal of process plugins, you can now abstract out that functionality and use it in your field mapping.

How “Migrate Drupal” Makes the Process Better

There are tons of reasons to use Migrate Drupal to start your migration.

It builds a migration from your Drupal site

You might have seen above that I mentioned that Migrate Drupal provides a templated set of configurations. This is a product of some very elaborate migration detection classes. This means you will get all the configurations for:

  • content types
  • field setup
  • field configuration
  • various site settings
  • taxonomy vocabularies
  • custom blocks
  • content and their revisions
  • etc…

These will be built specifically for the site you are migrating from. This results in tons of configuration files—my first attempt created over 140 migration YAML files.

It’s hookable

Hookable means that it’s not just a part of core thing and that it’s expandable. That means that contributed modules can provide their own templates for their entities and field types, allowing Migrate Drupal to move over that data too. For example, it is completely possible (and in progress) for the Field Collection module to build in migration templates so that the migration will know how to import a field collection field. Not only that, the plugins provided by the contributed modules can be used in your custom migrations as well.

No initial need for redirection of content

Here’s an interesting one, everything comes over pretty much verbatim. Node IDs, term IDs, etc. are exactly the same. URL aliases come over, too, by default. Theoretically, you could have the same exact site from D7 on D8 if you ported over the theme.

More time to do the alterations the client wants

Since you aren’t spending your time building all the initial source plugins, process plugins, destination plugins, and configurations, you now have more time to alter the migrations to fit the new content model, or work with the new spiffy module like paragraphs.

How-To: Start a Migration with “Migrate Drupal”

Ok so here is the technical part. From here on is a quick How-To that gets you up and going. Things you will need are:

  • a Drupal 6 or 7 site
  • your brand new Drupal 8 site
  • a text editor
  • drush

1. Do a little research and install contrib modules.

We first need to find out if our contrib modules that are installed on our Drupal 6/7 site are available and have a migration component to them in Drupal 8. Once we identify the ones that we can use, go ahead and install them in Drupal 8 so they can help you do the migration. Here are a couple of thoughts:

Is the field structure the same as in Drupal 6/7? The entity destination plugin is a glorified way to say $entity->create($data); $entity->save();. Given this, if you know that on Drupal 6/7 that the field value was, for example…

[
  'value' => 'This is my value.',
  'format' => 'this is my format'
]

…and that it’s the same on Drupal 8, then you can rest easy. The custom field will be migrated over perfectly.

Is there a cckfield process plugin in the Drupal 8 Module for the custom field type? When porting fields, there is an automated process of detecting field types. If the field type you are pulling from equates to a known set of field types by the cckfield migration plugin, it will be used. You can find these in src/Plugin/migrate/cckfield of any given module. The Text core module has an example.

Is there a migration template for your entity or field in the Drupal 8 module? A migration template tells the Drupal Migrate module that there are other migrations that need to be created. In the case of the Text module. you will see one for the teaser length configuration. There can be multiple and look like migrations themselves, but are appended to in such a way to make them special for your site. You can find these in
migration_templates in the module.

Are there source, process, or destination plugins in the Drupal 8 module? These all help you (or the Migrate Drupal module) move content from your old site to your new one. It’s very possible that there are plugins not wired up to be used in an automated way yet, but that doesn’t keep you from using them! Look for them in src/plugin/migrate.

2. Install the contrib migrate modules.

First you must install all the various contributed modules that help you build these configurations and test your migrations. Using your favorite build method, add the following modules to your project:

NOTE: Keep in mind that you will need to be mindful of the version that goes with what version of Drupal Core. Example 8.x-1.x goes with Drupal 8.0.*, 8.x-2.x goes with Drupal 8.1.*, and 8.x-3.x goes with Drupal 8.2.*.

3. Set up the upgrade/migrate databases.

Be sure to give your database an key. The default is ‘upgrade’ for drush migrate-upgrade and ‘migrate’ for drush migrate-import. I personally stick with ‘migrate’ and just be sure to give the custom settings to migrate-upgrade. I use drush migrate-import a ton more than drush migrate-upgrade.

$databases = [
  'default' => [
    'default' => [
      'database' => 'drupal',
      'username' => 'user',
      'password' => 'pass',
      'host' => 'localhost',
      'port' => '',
      'driver' => 'mysql',
      'prefix' => '',
    ],
  ],
  'migrate' => [
    'default' => [
      'database' => 'migrate',
      'username' => 'user',
      'password' => 'pass',
      'host' => 'localhost',
      'port' => '',
      'driver' => 'mysql',
      'prefix' => '',
    ],
  ],
];

4. Export the migration configuration.

First I want to give credit to Mike Ryan for originally documenting this process. Without it, or his help in IRC, you wouldn’t have gotten this article today.

Go ahead and import your Drupal 6/7 database if you aren’t connecting to a live instance in your database settings with your preferred method. Take your pick:

  • drush sql-sync
  • drush sql-drop --database=migrate; gunzip -c /path/to/migrate.sql.gz | drush sqlc --database=migrate

Next run Migrate Upgrade to get your configuration built and stored in the Drupal 8 site.

drush migrate-upgrade --legacy-db-key=migrate --configure-only

Finally store your configuration. I prefer just to stick it in the sync directory created by Drupal 8 (or in my case configure for checking into Git).

drush config-export sync -y

I’m verbose about the directory because we usually have one for local development stored in the local key also. You can leave off the word sync if you only have a single sync directory.

5. Update your migration group with the info for the migration.

This is a quick and simple step. Find migrate_plus.migration_group.migrate_drupal_7.yml or migrate_plus.migration_group.migrate_drupal_6.yml and set the shared configuration. I usually make mine look like this:

langcode: en
status: true
dependencies: {  }
id: migrate_drupal_7
label: 'Import from Drupal 7'
description: 'Migrations originally generated from drush migrate-upgrade --configure-only'
source_type: 'Drupal 7'
module: null
shared_configuration:
  source:
    key: migrate

5. Alter the configuration.

Ok here comes the fun part. You should now have all the configurations to import everything. You could in fact now run drush mi --all and in theory get a complete migration of your old site to your new site in the data sense.

With that said, you will most likely need to make alterations. For example, in my migration we didn’t want all of the filters migrated over. Instead, we wanted to define the filters first, and then use a map to map filters from one type to another. So I did a global find across all the migration files for:

    plugin: migration
    migration: upgrade_d7_filter_format
    source: format

And replaced it with the following:

    plugin: static_map
    source: format
    map:
      php_code: filter_null
      filtered_html: basic_html

Another example of a change you can make is the change of the source plugin. This allows you to change the data you wanted. For example, I extended the node source plugin to add a where-clause so that I could only get data created after a certain time.

namespace Drupalwg_drupal7_migratePluginmigratesource;

use DrupalnodePluginmigratesourced7Node as MigrateD7Node;
use DrupalmigrateRow;

/**
 * Drupal 7 nodes source from database.
 *
 * @MigrateSource(
 *   id = "wg_d7_node",
 *   source_provider = "node"
 * )
 */
class Node extends MigrateD7Node {

  /**
   * {@inheritdoc}
   */
  public function query() {
    $query = parent::query();
    // If we pass in a timestamp... only get things created since then.
    if (isset($this->configuration['highwater'])) {
      $query->condition('n.created', $this->configuration['highwater'], '>=');
    }
    return $query;
  }

}

Lastly, you may want to change the destination configuration. By default, the configuration of the migration will go to a content type with the same name. It may be possible that you changed the name of the content type or are merging several content types together. Simply altering…

destination:
  plugin: 'entity:node'
  default_bundle: page

…to be…

destination:
  plugin: 'entity:node'
  default_bundle: landing_page

…may be something you need to do.

Once you are done altering the migration save the configuration files. You can use the sync directory or if you plan on distributing it in a module, you can use the
config/install folder of you module.

Rebuild your site with the new configuration via your preferred method, or simply run drush config-import sync -y.

6. Migrate the data.

This is the last step. When you are ready, migrate the data either by running each of the migrations individually using --force, run the migration even though other pieces haven’t, use the --execute-dependencies, or just go ahead and go for the gold drush migrate-import --all

Caveats

So finally after you go through all the good news, there are a few valid points that need to be made about the limitations of this method.

IDs are verbatim due to the complexity of dependencies

So this means that the migrations are currently expecting all the nids, tids, fids, and other IDs, to be exactly what they were on Drupal 6 or 7. This causes issues when your client is building new staged data. You have three options in this case:

  1. Alter the node, node_revision, file_managed, taxonomy_term_data, users, and probably some others I’m missing here that house the main entities that entity reference fields will need, so that their keys are something your client will not reach on their current production site while you are developing.
  2. Do not start adding or altering content on Drupal 8 until all migrations are done.
  3. Go through all the migrations and add migration process plugins where an entity is referenced, and then remove the main id from the migration of that entity.

In my case, I went with the first solution because this realization hit me kinda late. Our plan was to migrate now for data so our client would have something to show their stakeholders, and then migrate again later to get the latest data before going live.

There are superfluous migrations

You will always find out that you don’t want to keep the settings verbatim to the Drupal 6 or 7 site. This means you will have to remove that migration and remove it’s dependency from all the other migrations that depend on it. Afterwords, you will need to make sure that that case is covered. I shared an example in this article where we decided to go ahead and configure new filter formats. Another example may be that you don’t even give a crap about the dblog settings from your old Drupal site.

Final Thoughts

For NYU Wagner, we were able to save a ton of time having the migrations built out for us to start with. Just the hours spent on building the field configurations for the majority of the content types that were to stay the same was worth it. It was also a great bridge into “How Do Migrations Work?” We now have a more complete custom migration for our client in a fraction of the time once our feature set was nailed down, than if we were to go build out the migrations one at a time. Happy migrating.

Allan Chappell
Allan Chappell

Allan brings technological know-how and grounds it with some simple country living. His interests include DevOps, animal husbandry (raising rabbits and chickens), hiking, and automated testing.

Nov 10 2016
Nov 10
November 10th, 2016

Dynamic Style Guides in Drupal 8 with KSS

With a smile on my face I sat down at my desk and installed Drupal 8. I’ve been following along with the all of the feature announcements and now I had my first Drupal 8 project. The client was open-minded and receptive to ideas, so I decided to not only explore Drupal 8 but to see how far I could push it—I was going to create a living style guide.

The designer in me loves style guides. They allow me to talk through design choices and serve as a point of reference for discussions with marketing, design, and development. When you have new needs you add a component to your style guide to see how it fits in, and when a component retires you can remove it. The pain point with style guides is that they are their own artifact and frankly, once created, they rarely get tended to or updated.

Keeping It Living

Drupal 8 solves this problem with its “get off the island” thinking. This means that instead of needing to create something “in Drupal,” I was free to use tools from around the Web. A pair of my favorite style guide tools—Twig and KSS—work wonderfully with Drupal 8.

Every website has a wealth of components that need to be built and maintained. The header may have the site logo, navigation, search bar; the footer has utility navigation, social media, and a feed of the latest content; and the main content area has all the wonderful things that folks come to the site for, articles, image galleries, and forms to sign up for more information.

When we use a component-driven design approach, we are better able to maintain a site and provide consistency. We can predict changes across the whole site. By having a style guide, a client can see how the design comes together and understand how the whole will work.

What is KSS?

KSS (or Knyle Style Sheets) is documentation for humans. You enter some information into your CSS (or SASS, LESS, etc.) and when the CSS is processed by KSS, a style guide is created.

Let’s go through a common example: creating buttons. For our buttons, we’re going to create a call to action button and a disabled button. We’ll start by adding the following comment to the top of our buttons.scss file.


// Buttons
//
// .button--call-to-action - Call to action button.
// .button--disabled - Disabled button.
//
// Markup: buttons.twig
//
// Style guide: components.button
//

Let’s breakdown the content of this comment and how it relates to KSS.

The first line, “Buttons,” is the header for the section in the style guide.

The next few lines define our alternate classes for the buttons. This is because KSS will generate all the examples that we need to demonstrate the different states of the button.

For the markup, there are two ways to integrate it with KSS. The first, which isn’t shown here, is to actually write the code in the CSS comment, e.g. <a href="#" class="button {{ modifier_class }}">{{ label }}</a>. We aren’t doing that here because we want the markup to be in its own file so that it can be used by Drupal templates. Because of that, we point to a separate TWIG file.

The last line in the comment, “Style guide,” is how we keep the style guide organized. References can be either be numbers (1.2.1) or alpha (a.b.a or components.button). Use whatever makes sense for you and your team.

Using KSS to Build Components

Let’s build another component, a promo card, to show this process. This promo card is going to be a self-contained element on the site that will be used to promote content and entice users to click through.

Here is the HTML for the promo card as card--promo.twig:

<article 
  {% if attributes %}
    {{ attributes.addClass('card-promo') }}
  {% else %}
    class="{{ classes }} {{ modifier_class }}"
  {% endif %}
>
  <h2>{{ label }}</h2>
  {% if content.body %}
    {{ content.body }}
  {% else %}
    {{ body }}
  {% endif %}
</article>

Then we create a card--promo.json file with the information (title, content, etc.) that we’d like to have displayed in the template. KSS was built to recognize that when a .json file shares a name with a .twig file and is in the same folder, it will import the json data into the template.


{
  "classes": " card-promo ",
  "label": "This is the card title",
  "body": "JSON Lorem ipsum dolor sit amet, consectetur adipisicing elit. Aliquam atque doloremque eligendi, fugiat fugit laudantium mollitia obcaecati perspiciatis quasi rem rerum saepe sint voluptate? Ab dicta eius harum magnam praesentium.

” }

Here is the SCSS for the promo card as card--promo.scss:


.card-promo {
  border: 1px solid rebeccapurple;
  border-radius: 3px;
  max-width: 300px;
  padding: 20px;

  h2 {
    margin-top: 0;
    color: rebeccapurple;
    font-family: $font-family--garamond; 
    font-size: 1.5em;
    text-transform: none;
  }

  &:hover,
  &.pseudo-class-hover {
    border-color: red;
    color: red;

    h2 {
      color: red;
    }
  }
}

And now we add our KSS comment at the top of card--promo.scss:


// Promo Card
// This promo card is used to promote content on the site.
//
// :hover - Hover state for the promo card.
//
// Markup: card--promo.twig
//
// Style guide: components.card.promo
//

Bringing all this together in the style guide shows us what the card will look like.

Promo card

Making Components out of Components

Now that the single promo card has been built we can create an example of what a row of three promo cards might look like in the site.

Since we are using TWIG, we will simply include card--promo.twig into our new component, card--triptych.twig. (A triptych is a piece of artwork shown in three panels. This is a bit of my art history studies influencing my naming conventions.)

Here is card--triptych.twig:

<section class="card-triptych-wrapper">
  {% include 'card-promo.twig' %}
  {% include 'card-promo.twig' %}
  {% include 'card-promo.twig' %}
</section>

And here is the corresponding card--triptych.scss:


.card-triptych-wrapper {
  display: flex;
  flex-flow: row nowrap;

  .card-promo {
    margin: 0 10px;
  }
}

Now that we have the HTML and CSS written, let’s add the KSS comment at the top of card--triptych.scss:


// Promo Card Triptych
// This shows the cards in a row of three, just like on the home page.
//
// Markup: card--triptych.twig
//
// Style guide: components.card.triptych
//

And here you can see the triptych in action:

Promo card triptych

Adding KSS Components to Drupal Templates

Now that we’ve put together a few components and got the client’s acceptance, it is time to connect our templates with Drupal templates. We’re going to connect up the promo card with an article teaser view.

Now we will setup our Drupal template. We start by creating the file that we need, node--article--teaser.html.twig. Inside this file we will be extending our original card file, card--promo.twig, and overwriting generic style guide content with Drupal-specific content.

{#
/**
 * Extending the base template for Drupal.
 */
#}
{% extends "@styleguide/card-promo.twig" %}

With this in place every change or edit that we make to our promo cards will be reflected on the live Drupal 8 site!

Creating Dynamic Style Guides Benefits the Whole Project

This component-driven approach is a big win for everyone in the process. The client wins because they can always see what is going on in their site via the style guide and make decisions based on live comps. Backend developers win because connecting Drupal templates to other TWIG files is an easy process. Frontend developers win because they can use their own organization structure or methodology for building components. And designers (who code) win because they can use their own skills to create components.

If you’d like to play with KSS and Drupal yourself here is a repo for you.

Stay tuned next week for part three of our frontend style guide miniseries, when Evan Willhite will cover atomic design principles with Pattern Lab!

Recommended Posts

  • Welcome to Part Three of our frontend miniseries on style guides! In this installment, we cover the bits and pieces of atomic design using Pattern Lab.
  • In this issue: come see us at DrupalCon Amsterdam, Instagram CSS3 filters, ontology is overrated, Douglas Crockford: the better parts, iPhone promo products from Japan, Go-powered robots, tree simulator MMO,…
  • Welcome to the final post of our frontend miniseries on style guides! In this installment, the Web Chefs talk through how we chose Pattern Lab over KSS Node for Four…
Randy Oest
Randy Oest

Randy Oest is an avid Star Trek fan, plays too many board games, and bought his mother an iPad so that he wouldn't have to fix her computer anymore.

Design and UX

Posts about user experience: best practices, tools we use, methodologies we love. Posts about the design process: wireframes, colors, shapes, patterns. Frontend posts that focus more on the user interface aspects (visual, aural, etc.) than on coding practices.

Read more Design and UX
Nov 02 2016
Nov 02
November 2nd, 2016

We are pleased to introduce to you and the world to the new website for the New York University Robert F. Wagner Graduate School of Public Service . NYU Wagner is a public policy school that offers a comprehensive curriculum in public and nonprofit administration, policy, and management. Over the course of only five months the site has been updated front-to-back to better serve the educational and community goals of Wagner’s students, faculty, and staff.

On the backend we’ve upgraded the site from Drupal 7 into a fresh and modern Drupal 8 installation, continuing our focus on keeping our clients current on technology for a longer project lifetime. On the frontend the site has fresh visuals and updated content to aid visitors as they explore or search the site. For site admins we integrated a living style guide into the site that stays up-to-date when changes to the Drupal theme are made.

Migrating to Drupal 8

The NYU Wagner team was eager to take this opportunity while migrating from Drupal 7 to Drupal 8 to streamline their architecture. We reviewed the previous architecture and consolidated the myriad specialized content types into a consistent collection of reusable and scalable content types.

The migration process was tricky. As work progressed on the site the Migration path between Drupal 7 and Drupal 8 was evolving, which necessitated updates mid-stream. The process also brought a few migration bugs to light which were fixed for this project and then submitted to the Drupal community so that they could get resolved.

Creating a Living Style Guide

Drupal 8 opens up a lot of new options for frontend development, thanks to the new template engine, Twig, and we used this to create a solution for Wagner’s site editors. One of the concerns on this project was making sure that site editors were consistent in the application of styles across the site. Enter a living style guide, built with KSS and Twig. KSS was used to organize and assemble the style guide and Twig was used for the style templates which were in turn were directly used in Drupal templates. Now when styles are updated in the theme, the style guide is too.

Engineering and Development Specifics

  • Drupal 8 site extended with contrib and custom modules
  • Migration advancements for Drupal 7 to Drupal 8 migrations
  • Migration of videos hosted on Vimeo and controls to keep all video information up-to-date from Vimeo
  • Living style guide for site editors by integrating twig components with Drupal templates
  • Config-based build utilizing Drupal 8 improvements to Aquifer, GitHub, and CircleCI

The Team

The Four Kitchens team was headed by Allan Chappell (backend engineer and migration) and included Alex Hicks (project manager), Jon Peck (backend engineer), Donna Habersaat (UX), Randy Oest (frontend engineer) and Ben Teegarden (frontend engineer). Thanks also to the NYU Wagner internal team of Lawrence Mirsky, Hollis Calhoun, Rachel Szala Grant, Chun Fang and Haris Ahmed.

Randy Oest
Randy Oest

Randy Oest is an avid Star Trek fan, plays too many board games, and bought his mother an iPad so that he wouldn't have to fix her computer anymore.

Nov 01 2016
Nov 01

Trip Report: BADcamp 2016I've just returned from my first big Drupal camp and I'd like to tell you about my experience.

The post Trip Report: BADcamp 2016 — Teaching, Learning, and Bonding appeared first on Four Kitchens.

Oct 25 2016
Oct 25
October 25th, 2016

MDedge is a digital and print publisher providing resources for health industry professionals. The MDedge.com portal currently contains 35 separate properties including medical journals, specialized hubs, and news publications.

The Project

Four Kitchens worked with the MDedge team to…

  • migrate over 130,000 articles, quizzes, and other content from a variety of sources
  • create a two-way integration with their print publication tool, teambase, to enable the desired editorial workflow
  • create custom integrations with a number of 3rd-party vendors to provide powerful advertisement targeting and customer identity management
  • implement a content distribution system between all properties providing a deep level of customization alongside robust automation
  • provide functional test coverage for identified high-value functionality using behat and CircleCI

Timeline and Milestones.

The Discovery phase started in September 2015, with the Design and and Build phases in the following months. The new MDedge.com was deployed in three stages: The first launch included a single property in July 2016—the main MDedge.com site featuring OBG Management. This was fast-followed by a second launch phase that included fifteen properties in September 2016, while the remaining properties were launched in October 2016.

Engineering and development specifics

  • Drupal 7
  • hosted on Pantheon
  • functional test coverage with Behat
  • continuous integration builds on CircleCI
  • code standards enforced with ESLint and PHP_CodeSniffer
  • site migration
  • custom 3rd-party integrations

The Team

Peter Sieg – Technical Lead
Mike Klanac – Project Manager
Mike Minecki – Business Analyst
Caris Hurd – UX Designer
David Diers – Engineer
Randy Oest – Designer & Frontend Engineer
Brian Lewis – Frontend Engineer
Joe Tower – Frontend Engineer
with additional support from Alex Hicks and Suzy Bates

with
James Todd – Engineer
Roberto Cardenas – Engineer
Simon Mora – Engineer
Hans Riemenschneider – Frontend Engineer
Karl Kaufmann – Designer

Peter Sieg
Peter Sieg

Peter Sieg is an engineer who loves making things, learning, and solving problems.

Oct 06 2016
Oct 06
October 6th, 2016

Four Kitchens had previously worked with the NYU Rory Meyers College of Nursing, creating the first mobile-responsive version of their existing Drupal 6 website in 2014. While the previous site was more than capable “back in the day,” the needs of the site’s current visitors and administrators had outgrown the technology.

NYU wanted a new online experience that was as modern as their user base, admin friendly for a diverse range of content, and leveraged the speed and responsiveness of Drupal 8. Partnering with Four Kitchens this summer, both teams set out to develop the brand new Rory Meyers College of Nursing site. Throughout the project, I served as the Project Manager/Numbers Guy/Scrum Master/Group Therapy Facilitator/Product Owner/Bad News Guy.

Building A Stable Foundation for Future Growth

“First founded in 1932, the NYU Rory Meyers College of Nursing is the second-largest private university college of nursing in the US, and reflects the intellectual curiosity, dynamism, and quality characteristic of NYU.”

As the quote from their About page says, NYU College of Nursing understands change. Starting as a department, then growing to a division, and finally being established as an independent college, growth is in the DNA of the College of Nursing. Despite the relative newness of Drupal 8, it was the platform the college had been waiting for and they wanted D8 to serve as their foundation for future growth. Throughout the project, NYU staff and our team had three primary goals for the new site:

Modernize the site to reflect the current user base

From prospective students looking for admissions information on their iPad, to a professor keeping their research publications up to date on their desktop, to alumni finding an event on their phone, the new site had to offer the same delightful experience across a variety of platforms. Speed and efficiency were also top of mind as well, as the site had to perform on both wired and limited bandwidth platforms.

NYU Nursing website on desktop and mobile

Make the site administrator friendly, allowing a wide variety of content to be edited by non-technical administrators

In a field as diverse as medicine, making complicated content easy to find (and edit) was of critical importance. College of Nursing staff needed the ability for specific areas of the site to be managed independently, without lengthy trainings or complicated technical workflows. The team used Drupal modules such as Focal Point, Paragraphs, and Scheduled Updates in order to reduce the workload on both developers and content administrators.

Screen shot of NYU Nursing admin panel.

Leverage the built-in advantages of D8 and new modules to improve speed and reduce workload

Already sold on the advantages of a Drupal platform, NYU was looking to D8 to serve as its technology platform for years to come. Responsiveness, speed, and modular design and styling options were just a few of the features that were important to staff. Knowing that other programs or colleges may also come to benefit from the technological advancements we were building into the College of Nursing site , the Four Kitchens team developed the project with a component based theming* approach, and utilized PatternLab to create a living style guide. When appropriate, these components can be re-used or re-skinned for future site sections, or even for additional NYU global sites, saving both time and money on future projects.

[NB: check out Web Chef Evan Willhite’s short video series on component-based theming for Drupal 8—developed while on the NYU Nursing team!]

Team. Work.

As a project manager (and entry-level product owner), I could not have asked for a better team. It’s a clichéd cliché, but successful projects always come down to communication—whether the situation was lighthearted (“Let’s just assign that story to Brian, since he’s on vacation today.”) or serious (“Well, plans A through E didn’t work, but I have a great feeling about F…”), our entire group communicated regularly (every day through Slack and Zoom video conference calls) and honestly (the good, bad, and the ugly). At the end of the project, we all felt like we had truly been through the wringer together. We’re not just a customer with specifications, and developers cranking out orders, but an actual team, with diverse perspectives, skill sets, and roles that pulled together to accomplish a formidable task.

Not to toot our own horns too much, but in the words of Todd, “Honk-Honk!” The new site launched on time, on budget, and without any major technical issues, thanks to the hard work of the NYU College of Nursing team and the web chefs.

Honk-Honk!

Chris Devidal
Chris Devidal

Chris Devidal is a Digital Project Manager and Scrum Master from Austin, Texas. He enjoys Cabernet, long walks on the beach, and snarky blog signatures.

Oct 05 2016
Oct 05

We’re excited to announce the new NYU Rory Meyers College of Nursing website, which launched at the beginning of September, just in time for fall semester. The new site is a complete rethink and rework of their original Drupal 6 site—complete with a fresh design, improved content model, and Drupal 8 site build.

Four Kitchens began the site build in April 2016 reviewing wireframes and comps provided by NYU’s design partner Hello Pilgrim. Work began quickly, developing the content model and building out the content types. Given the end-of-summer deadline and the fact that there wouldn’t be a migration, a priority was placed on getting the site ready for content entry as early as possible for the client’s editors.

The experience of developing on Drupal 8 was pretty new to everyone on the the project team and provided some opportunities for retooling our development process. Work was done to get our Aquifer build tool working with D8’s Composer-based workflow. On the frontend, Drupal’s Twig templating allowed us to experiment with a component-based theming approach using Pattern Lab. We fell in love with Paragraphs and YAML Form to provide adaptable content entry patterns and data capture for the client. We also really enjoyed working with D8’s object-oriented architecture when creating custom modules for syncing user, academic program, and research publication data from various sources with the content in Drupal.

College of Nursing homepage screenshot.

Engineering and development specifics

  • Drupal 8
  • Pattern Lab for living style guide and component-based Drupal theme
  • Aquifer build tool
  • CircleCI for build testing, code sniffing, and deployment
  • custom data syncing with client faculty/staff database, PubMed and Crossref publication metadata, and NYU’s student information system
  • hosted on Acquia

The Four Kitchens team was led by Chris Devidal (Project Manager and Product Owner), with Jeff Tomlinson (Technical Lead and backend engineering), Evan Willhite (frontend engineering), and Brian Lewis (site building and frontend development). Also thanks to David Diers and Allan Chappell for additional code review support. The team from NYU included David Resto (NYU systems integration and general wrangling), Keith Olsen (content specialist), and Hershy Korik (backend development). Mike Kelly, from Hello Pilgrim, worked on the sitemap, wireframes, and graphic design.

Pages

About Drupal Sun

Drupal Sun is an Evolving Web project. It allows you to:

  • Do full-text search on all the articles in Drupal Planet (thanks to Apache Solr)
  • Facet based on tags, author, or feed
  • Flip through articles quickly (with j/k or arrow keys) to find what you're interested in
  • View the entire article text inline, or in the context of the site where it was created

See the blog post at Evolving Web

Evolving Web