Aug 08 2018

Security maintenance, including the ability to apply security updates quickly, is part and parcel of open source project success. 

Updating is typically done as part of the normal software release cycle; however, there are times when a security advisory needs to be released as soon as possible. A strong incident response plan forms the first line of defense for mitigating and patching vulnerabilities. 

But what does a successful security response look like in action?

On the heels of a recent Drupal security update on August 1, 2018, Mediacurrent’s Senior Project Manager Christine Flynn had the same question. To find out, she interviewed our Open Source Security Lead, Mark “shrop” Shropshire, to get a layperson’s perspective on the security team’s approach.

Christine and Shrop on a call

 

“An off-cycle Drupal security advisory dropped on August 1, 2018. What does that mean for folks who aren’t developers?”

Flynn: I was watching the Slack channel as our team fixed sites, and I got some idea of what was happening. I’m not going to jiggle anybody’s elbows while they’re applying a security update, but I’m really curious now that the fixes are all in. 

Shrop: The official Drupal Security Advisory came out late in the day, after Symfony published their announcement in the morning. There was also one from Zend.

Flynn: I read all of those links while the team was applying the security update, but I feel like I didn’t totally understand the implications. I’d love to get a better picture from you of what they mean.

Shrop: You bet! I hope you can hear me, I’m at a coffee shop right now.

Flynn: Are you on their unsecured WiFi?

Shrop: Nope! I’m on a hotspot and on VPN. It’s funny, the more you know about security, the more it changes what you do. Other people think you’re paranoid. But you’re not! You just understand the realities. 

Flynn: Ha! Why am I not surprised? All right, let’s dig in.

“What was the security update for?”

Shrop: Drupal Core was updated because there were some security releases for Symfony. We call those “upstream” in the biz, which means that Drupal depends on them, and they are actively worked on outside of Drupal. I understand the Symfony project worked closely with the Drupal Security Team to make sure Symfony and Drupal were both updated and ready to be announced publicly at the same time. Drupal version 8.5.6 pulls in the Symfony updates as part of the Drupal update process. 

Flynn: Was that the only update?

Shrop: No, at the same time, there was also an update to Zend Framework, but that was only an issue for users who were making use of modules or sites that used Zend Feed or Diactoros. There is a core issue to update the related Zend libraries for those who need the updates. 

“If not updated, what could a malicious user do to a site?”

Shrop: This is a hard one to answer this soon after the release of the security advisory. I’m going to do some checking to see if I can get more information on this for academic purposes, but the Drupal Security Team is not going to make any statements that could help someone attack a site. It is up to security teams and researchers to dig into the code and determine more about the risks involved.

Based on the Symfony project’s blog post, it appears that a specially crafted request could allow a user access to a URL they do not have access to, bypassing access control provided by web servers and caching mechanisms. That’s a fancy-pants way of saying that a website visitor could gain access to pages you don’t want them to see.

“When will we know more?”

Shrop: Within days - sometimes hours - we might start to see exploit methods posted on the Internet. Taking security seriously and responding quickly once a drupal.org security advisory is announced is a way to stay ahead of these concerns.

Mediacurrent doesn’t want to fearmonger, but it is better to be safe than sorry. That’s why I always push to update as soon as possible while weighing in on mitigating factors that may lessen the severity of the issue for a particular application. But I will keep digging. I’m curious! 

“If you had to tell a CEO or CFO the value that implementing this security update swiftly provided, what would you say? Let’s say this CEO does not have a strong background in technology or security.”

Flynn: I could see an executive with a strong public safety or physical security background being pretty understanding of why you want to apply a security update for a potential vulnerability quickly, but what if it’s someone who doesn’t have that experience, and isn’t a technologist?

Shrop: Check out this link from Acquia about the security update. This helped me so much. They published it shortly after the PSA came out, and although they’ve updated the text since then, at the time they said, “It is advised that customers set aside time for a core upgrade immediately following.” When I read “immediately,” I knew that we had to get the update out within hours. If I were asked to get on a call with the executives of any company at that point, I’d be confident: if Acquia is saying it, we need to do it. That’s enough to stand on with anybody. I’m not saying that the Acquia team has more information, but they have a very robust security team. They always dig in quickly. They have to, to know whether they can mitigate the issue by adding web application firewall rules.

Flynn: Firewall rules? How does that work? 

Shrop: For the last few core updates, Pantheon and Acquia put mitigations into their WAF - that’s Web Application Firewall. Pantheon confirmed the night of the security advisory release that they were blocking exploit attempts on their platform, and Acquia did the same thing. So if someone tried to exploit a site hosted there before Drupal was updated, they were there, helping to prevent that site from being attacked successfully. It’s a great extra layer of protection. Now, Acquia and Pantheon and I will always still want to update Core on each site, because WAF-level mitigation might not catch everything. But I am super happy when I see it because there’s a good chance that it will catch anything that happens while a team is still implementing a security update.

Security is all risk assessment and mitigation. You want to layer defenses. And with something like this, we are going to make sure we deal with the problem. That’s why Acquia, Pantheon, Platform.sh, and others in the community immediately add those extra mitigations to their firewalls. It’s to buy time so that people can get their updates in. That’s not where mitigation ends, but it helps. 

“What type of sites were affected by this? Does everyone use Symfony?”

Flynn: When I first read about the upcoming security advisory, I saw that it affected “third party libraries.” That made me think that some of our clients might not be affected because it would only affect certain modules. Can you tell me what types of sites were affected?

Shrop: Got a link for you, but basically, anything on Drupal 8 was affected. Drupal 8 uses components from the Symfony project. The Drupal community made the decision to use Symfony so that we didn’t have to maintain everything ourselves. So this is a great example of the power of open source, with the Symfony and Drupal security teams working together to release this fix. We all end up benefiting from having a larger community to fix issues. In my opinion, there’s no way an internal team working by itself can write applications as secure as open source software. It has nothing to do with how good you are, it’s the nature of development. With open source, you have a greater team with Drupal and then again, with Symfony, an even greater team to lean on. With each community that is included you are expanding your team and your ability to detect and prevent threats. 

“How was the security vulnerability discovered?”

Shrop: That’s generally never disclosed because you never want to tell malicious users how you found an opening. 

But we do have a few people to thank: Michael Cullum and @chaosversum were thanked by Symfony for separately reporting the two issues addressed in Symfony security releases. They also thanked Nicolas Grekas for implementing the fix. I would also give a huge thanks to Symfony and the Drupal Security Team for coming together to implement the fix and for coordinating the announcements. It’s hard work, and it shows the community at its best.

“So when we have an off-cycle security release, first the PSA comes out. Can you tell me a bit about what Mediacurrent does from the time the PSA comes out to just before the security advisory drops?”

Flynn: As someone on the team at Mediacurrent, I can see some of the things you do. But I’m wondering what else happens behind the scenes? 

Shrop: The first thing that happens is that I’m notified about the PSA coming out. I’m signed up for updates via email, Twitter, and RSS feeds from https://www.drupal.org/security, and so are a lot of other folks at Mediacurrent. Internally, we have some processes that we have standardized over time for how to deal with security updates that we follow across the company. We centralize information we have on the security PSA/advisory, recommend client communications, and talk about how to prepare as a team. We have multiple communication threads internally, as well, so no one can miss it. I send an email to the staff and I post in our Slack in a few places to get us ready.

Flynn: I know that we often clear time in advance for the team to implement the security updates.

Shrop: Yep. All of us share more information as a team as official information is released or as our own investigations reveal information. For example, early on the day the security advisory was released, our DevOps Lead, Joe Stewart, noticed that Symfony had put out a notice that they were also going to be releasing a security update that day, so that gave us a heads up that it might be related. We couldn’t know for sure until the security advisory actually came out, though. No one can do it by themselves, which is why we have a whole team working on it - it’s the only way to handle these things.

Christine and Shrop on another call

“So then the security advisory drops. How did we go about fixing the issue?” 

Shrop: First, we reviewed the advisory to assess risk and identify any mitigations, which helps determine how quickly we need to perform updates. With this advisory, the update was needed pretty much immediately, so we started to update Drupal core for our clients and pushed to test environments. Our QA team performed regression testing related to the update. Once QA approved each update for each client, we worked with folks to approve the updates and release them to the live environments. 

The important points are to line everyone and everything up in advance, have the talent in-house who can work on clients of all shapes and sizes and needs, and then to work as a team to resolve the issue on every client site as quickly as possible. 

“Were there any sites that were trickier to update? Why?”

Shrop: Clients that were on older versions of Drupal Core, who had delayed upgrading, were harder to update. Every site was still updated within a short time, but even though all the updates started at the same time, those sites finished later because each one needed more development and testing. 

Flynn: What was different about the process to update those sites? 

Shrop: If a client wasn’t on version 8.5.x, the lead technologist on the project had to work on an alternative update to secure the site or application, since there wasn’t a security update released for it. Figuring out an alternative process on the fly always introduces risk. It’s part of the value that we bring, that we have team members that have the expertise to evaluate that sort of thing. For example, we had one new client that was on an older version of Drupal 8 core. So one of our Senior Drupal Developers, Ryan Gibson, had to go in and determine what to do. He ended up updating Symfony itself to mitigate the risk. 

Flynn: I’m guessing that we are going to recommend to that client that we update Drupal core for them very soon?

Shrop: Yes. The big takeaway is you’re lowering your risk of problems by staying on the most recent, up-to-date minor version of Drupal 8. Version 8.5.x is current and stable right now, so you should be on that.

Flynn: Why would a client not update?

Shrop: There are always competing priorities. I hear lots of good excuses, and I’m not exaggerating, they are good, real reasons! The client is busy, the client has multiple workstreams, it’s hard - but it is getting to the point where I want to recommend even more strongly to clients that it is more expensive not to upgrade. It is going to cost them more when there is an update because of these additional evaluation and update tasks. The whole point of Drupal 8’s release cycle is to spread the maintenance cost over years rather than getting hit all at once. 

Flynn: And it introduces greater risk. A security breach is an order of magnitude more expensive than extra mitigation steps.

Shrop: Definitely.

“When is the next version of Drupal Core coming out?”

Shrop: Version 8.6.0 will be released in September. Our teams are already starting to test the early versions of this release on some of our projects. If a security update comes out in September, we want all of our clients to be prepared by being on the currently supported version of Drupal core. That way, they will receive security updates.

Flynn: One of the nice things about the Drupal development community is that they provide the betas of the next version of Drupal core so you can get ahead of the next release, right?

Shrop: Yes. When the community starts releasing betas or release candidates, especially release candidates, you want to start testing ahead of time. If you have a Drupal site, you can get your developers to test. If you find a problem, it may not be with your site, it might be an issue with Drupal core and this is a great opportunity to contribute your findings back to drupal.org and help the greater community. There might be a security release weeks after a version comes out and you want to be prepared to implement it.

Flynn: It goes back to risk mitigation.

Shrop: If you are on, say, an 8.2 site right now, you’re on the higher risk side, unfortunately. We advise our clients that it is in their best interest to be on the current, stable version. It costs our clients more in the long run if they don’t update on a steady basis.

Flynn: So if you’re on an older version of Drupal Core, you might not get an easy-to-implement security update when a vulnerability is discovered?

Shrop: The quotes from the Drupal Security Team I really want to emphasize are, “Previous minor releases will become unsupported when a new minor release is published,” and, “Any additional security updates for officially unsupported branches are at the sole discretion of the security team.” This is important to understand. For the SA-CORE-2018-002 fix earlier this year, they provided release updates for older versions of Drupal… but they didn’t have to. In the case of the fix last week, they did not.

“What was the best gif exchange of the Drupal core security update process?”

Flynn: I nominate this one, from mid-afternoon:

Slack Gif Example

Shrop: Definitely! 

“What story didn’t we tell yet?”

Shrop: I think we covered most of it. The last thing I’d put out there is for the technical folks reading this. You need to read the security advisories, join Drupal Slack, read what Acquia, Pantheon, and others are saying about each announcement. Then, you take all of that in and make your assessment of what actions you are going to recommend your organization take. This should lead your organization to a documented security plan that you follow. But, you know… 

Flynn: “Update all the things”?

Shrop: Exactly!

Other Resources
7 Ways to Evaluate the Security and Stability of Drupal Contrib Modules | Mediacurrent Pantheon Guest Blog 
Security by Design: An Introduction to Drupal Security | Mediacurrent Webinar

Aug 03 2018

We have all heard the debate about open source software versus closed (proprietary) software, but what is the real difference?

Simply: 

Open source software is available for the general public to use and modify from its original design free of charge. 

versus

Closed source software, where the source code is not shared with the public for anyone to view or change. 

One of the main advantages of open source software is the cost; however, when applied to OSS, the term "free" has less to do with overall cost and more to do with freedom from restrictions. 

90% 
According to Forrester Research, 90% of the code in a typical application is open source.  

For a Closed Source CMS, depending on the choice of software, the cost can vary between a few thousand to a few hundred thousand dollars, which includes a base fee for software, integration and services, and annual licensing/support fees. 

Open Source Software vs Proprietary Graphic

In a 2017 report from Black Duck by Synopsys, nearly 60% of respondents said their organizations’ use of open source increased in the last year, citing: 

  • cost savings, easy access, and no vendor lock-in (84%)
  • ability to customize code and fix defects directly (67%)
  • better features and technical capabilities (55%)
  • the rate of open source evolution and innovation (55%).

1M+ 

Websites across every industry vertical, from Georgia.gov to Harvard, trust Drupal as a secure open source CMS platform. 

Open source software has long-term viability and stays on the cutting edge of technology.  Selecting technologies means committing to solutions that will support an active, growing business over the long term, so it requires careful consideration and foresight.  Here are some benefits of open source to consider:

1. Value for cost 
       Worried about your marketing budget when looking at changing your CMS? Open source software has no licensing fees! It’s free, which means more room to spend your budget on other initiatives.
2. Added Security
       Open source means community. The more people (developers) looking at the source code, the more fixes and regular updates will be available. What can sometimes take weeks or months to resolve with proprietary software takes just hours or days with open source. This will give your marketing team peace of mind, knowing that if you don’t have time to look at the code of your site - or you don’t know how - there are developers all over the world continuously checking for bugs & fixes.
3. Customizability
       Have a really customized idea for your site that you’ve never seen elsewhere? Open Source can help. By customizing the code to fit your needs, it provides a competitive advantage for your business.
4. Flexibility
       Open-source technology naturally provides flexibility to solve business problems. Your team and organization can be more innovative, and it gives you the ability to stay on the cutting edge of latest trends & designs.
5. Integrations
       With open source, especially Drupal, you can integrate best-of-breed marketing technology. It is architected for easy integration with your tools - marketing automation, email service providers, CRM, etc. Drupal 8 gives you the foundation for your digital experience ecosystem.
6. Speed
       This isn’t just about site speed, but the ability to get your site up and running - with full marketing capabilities - on time & within budget. Open source allows you to deliver value right away.
7. Scalability 
       Drupal and other open source platforms give you the advantage of being able to scale your digital presence for the future. You're not confined to stick with what you already have. You can continue to evolve and build long-term with open source.

The benefits of open source could go on for pages, but when evaluating your options it’s important to think about your business and its goals. One consistent need we see is having access to a CMS that is easy for you and your team to manage on a day-to-day basis.

In the next blog of the series, we’ll hear from the Associate Director of Digital Marketing at the Shirley Ryan AbilityLab about how he is leveraging open source - particularly Drupal - to achieve his business goals. 

Aug 03 2018

If you haven’t heard the phrase “natural language processing” by now, you soon will. Natural language processing is an expanding and innovative use of technology to analyze large amounts of data or content and derive meaning from it, short-cutting the tremendous amount of manual effort needed to do that ourselves. It’s been around in some form for quite a while, but it was often relegated to complex enterprise systems or large corporations with a vested interest in automating the mining of huge amounts of data to find patterns in, for example, consumer purchasing trends or social media behavior. It’s a cool idea (it’s a form of artificial intelligence, after all) and it fuels a lot of our online experience now, whether through product recommendations, content recommendations, targeted ads, or interactive listening services like Siri or Alexa. What’s even better is that this sort of thing is becoming more and more accessible to use in our own software solutions, as many of these systems now provide services with APIs. This allows us to provide a more personalized or meaningful experience for site visitors on web projects that likely don’t have the budget or requirements to justify tackling natural language processing itself, and can instead find accessible ways to benefit from the technology.

One such use case that we’re talking about today is using the Google Natural Language Processing APIs on our own Drupal sites. We can use it to analyze our own site content and even autotag based on a common taxonomy. We really dig integrations here at Ashday, so we’ve just released two new Drupal modules to help you get hooked up with Google’s service. They are the Google NL API and Google NL Autotag modules.

The Google NL API Module

This module is intended to be your starter module to get things going. It provides functionality to connect to Google's Natural Language API and run analysis on text, including sentiment, entities, syntax, entity sentiment and content classification. It doesn’t decide what to do with this analysis, but it provides a service with a number of methods to analyze your content and then you can decide what to do with the information. All you need to get going is a Google NL API account. Full details of installation and usage can be found here.

Here is a brief outline of what each method provides, provided by Google.

Sentiment Analysis

“Sentiment Analysis inspects the given text and identifies the prevailing emotional opinion within the text, especially to determine a writer's attitude as positive, negative, or neutral. Sentiment analysis is performed through the analyzeSentiment method.” (from Analyzing Sentiment)

Entity Analysis

“Entity Analysis inspects the given text for known entities (proper nouns such as public figures, landmarks, etc.), and returns information about those entities. Entity analysis is performed with the analyzeEntities method.” (from Analyzing Entities)

Syntax Analysis

“While most Natural Language API methods analyze what a given text is about, the analyzeSyntax method inspects the structure of the language itself. Syntactic Analysis breaks up the given text into a series of sentences and tokens (generally, words) and provides linguistic information about those tokens.” (from Analyzing Syntax)

Entity Sentiment Analysis 

“Entity Sentiment Analysis combines both entity analysis and sentiment analysis and attempts to determine the sentiment (positive or negative) expressed about entities within the text. Entity sentiment is represented by numerical score and magnitude values and is determined for each mention of an entity. Those scores are then aggregated into an overall sentiment score and magnitude for an entity.” (from Analyzing Entity Sentiment)

Content Classification 

“Content Classification analyzes a document and returns a list of content categories that apply to the text found in the document. ” (from Classifying Content)

The Google NL Autotag Module

This module is the first step in actually doing something with the natural language analysis results provided by the API module. It provides a Google NL Autotag taxonomy that you can attach to whichever content types you choose and then will automatically create the relevant taxonomy terms and relate content whenever that content is saved. So a few clicks is all you need to have a nice auto-classification system in use on your site. You can even configure which text-based fields on your content should be used for the analysis as well as specify the confidence threshold, which determines at what confidence level you consider a Google classification as valid. So Google may say that the content matches the category /Home & Garden/Bed & Bath/Bathroom, with a confidence of .4 (on a scale of 0 to 1). You can decide what confidence level is good enough to categorize your content since different use cases may justify different approaches. A full list of Google’s content categories can be found here.
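To illustrate that confidence-threshold decision, here is a minimal sketch (not the module’s actual code; the `filterByConfidence` helper and the sample category data are made up for the example) of what the filtering logic amounts to:

```javascript
// Sketch of confidence-threshold filtering, not the Google NL Autotag
// module's actual implementation. Category names mirror Google's format.
function filterByConfidence(categories, threshold) {
  // Keep only the classifications Google is confident enough about.
  return categories.filter((c) => c.confidence >= threshold);
}

// Hypothetical classification result for a piece of content:
const result = [
  { name: '/Home & Garden/Bed & Bath/Bathroom', confidence: 0.4 },
  { name: '/Travel/Hotels & Accommodations', confidence: 0.82 },
];

// With a threshold of 0.5, only the second category would become
// a taxonomy term on the node.
const accepted = filterByConfidence(result, 0.5);
```

A lower threshold tags more content but risks irrelevant terms; a higher one tags less content with higher accuracy, which is exactly the trade-off the module’s setting exposes.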

That’s all this module essentially does, but it’s meant to be a simple solution for sites to easily start benefiting from Google’s natural language services. You can, of course, extend the service or add your own functionality to use these APIs however you find beneficial because it’s Drupal 8, and Drupal 8 rocks when it comes to flexibility. So install these modules and start tinkering and don’t hesitate to ask if you have any questions or suggestions for future functionality.

Aug 02 2018
Have you seen the latest changes for DrupalCon? They have replaced a day of sessions with an additional workshop/summit day (for an additional expense) and have increased the early-bird basic ticket price from $450 to $800. I went to DrupalCon Nashville and it was a good conference, but it felt more commercial and like a trade show with sponsors everywhere you look. I've also been to DrupalCamp Asheville (didn't get to go this year because of a scheduling conflict) and for me, I believe camps are a better bang for the buck and where I'll be focusing for continuing education.
Aug 02 2018

One of the most popular components on any website I've worked on is the card component.  Depending on the website, the card component is used to group various pieces of content into a container which can be used throughout your website.  Here’s an example of a card component in one of our projects.

In some instances, a card can be a user profile, a content teaser/excerpt, or the latest news or events.  Today we will build a card component which will contain the information for the latest events.  This card can be used on the /events page to list all the latest events taking place.  The card can also be used on other pages, where it may need to be displayed slightly differently than the original card.  This kind of transformation is called a "variation" and is usually achieved by passing a modifier CSS class to the original component, which we can use to write different styles for the card.  More on this later.

Let's get started

First, let's take a look at what we are building so we have a visual of the code we will be writing.

Component Example 1

As you can see from the images above, our card takes different shapes but the information in it remains the same or similar.  This is a common practice to modify components based on where and how they are being displayed.

Let's build the card

Identify component's fields

Before starting to write any code, let's take a close look at the card images to identify the fields or data elements the card needs.  From the images above we can see the following fields will be needed:

  • Image
  • Title
  • Subtitle
  • Excerpt
  • Label or category
  • Label for comments
  • Label for time posted

Typically, components are built following the Atomic Design approach which basically means breaking things down into small pieces (atoms) and then combining those pieces to build more elaborate components.

In the interest of time, we are going to skip building each piece as its own separate component and instead build the card as a whole.  I recently wrote a blog post on building flexible headings which demonstrates the approach I usually take when building components.

Architecting the component fields

We know the fields we need but it is helpful to determine what type of fields they are and what they will be called.  The table below gives us a good visual of our fields architecture.

List of Example Component Fields

Writing the code

1. In your components or patterns directory in your project, create a new folder called `card`

2. Inside the card folder, create two new files
          -  card.html
          -  card.css (we're using CSS to keep things simple.  Feel free to use Sass if you'd like).

3. In your card.html we'll start with the basic structure of the card

<article class="card">...</article>

So we've created an article tag as the card's container.  Why <article>  and not <div>?  Well, since the card is intended to contain an event's info, if we place a bunch of cards together it makes sense that each card is an individual event article.

4. Next we are going to create wrappers for the two types of data the card will hold.  The two types of data as we have seen from the table above are: Image/media and text or content.  Let's modify our code so it looks like this:

<article class="card">
 <div class="card__media">...</div>
 <div class="card__content">...</div>
</article>

The reason for splitting the image and the rest of the content is that this allows us to move things around independently of each other to create different variations of the card.  More on this later.

Let's pause to review how content will be laid out

If we look at the card images above, we see that some of the fields sit on the image side while others are grouped together, separate from the image.  Although we could place fields in either wrapper, it makes sense to place them in the wrapper where they visually appear.

For example, the date and label fields (Photos), look like they belong in the same group as the image.  The rest of the fields will go in the Content wrapper.

5. Let's start with the image wrapper:

<article class="card">
 <div class="card__media">
   <img src="http://placehold.it/300x300" alt="Card image" />
   <div class="card__date">
     <span class="date--day">27</span>
     <span class="date--month">Mar</span>
   </div>
   <span class="card__category">Photos</span>
 </div>

 <div class="card__content">...</div>
</article>

So let's focus on the fields inside card__media.  We see we have our image, which in this case is rendering a placeholder image.  Then we have a wrapper to hold the date fields (day and month).  Again, wrapping these two together makes it easy to manipulate them as a single item rather than two separate fields.  Lastly, we have the category for this content (Photos).  Since all these fields seem to belong together, it makes sense to place them the way we have.  Later on, we will use CSS to place each of the fields in their right positions.
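As a sketch of that positioning, card.css could overlay the date badge and category label on the image like this (the class names come from our markup; the specific offsets, sizes, and colors are illustrative assumptions, not final design values):

```css
/* Sketch only: overlays the date badge and category label on the image.
   Offsets and colors are placeholder assumptions. */
.card__media {
  position: relative; /* positioning context for the badges below */
}

.card__date {
  position: absolute;
  top: 1rem;
  left: 1rem;
  padding: 0.5rem;
  background: #fff;
  text-align: center;
}

.card__date .date--day {
  display: block; /* stack the day number above the month */
  font-size: 1.5rem;
  font-weight: bold;
}

.card__category {
  position: absolute;
  bottom: 1rem;
  left: 1rem;
  padding: 0.25rem 0.75rem;
  background: #0077cc;
  color: #fff;
}
```

Because both badges live inside `.card__media`, they stay anchored to the image no matter where the card as a whole moves.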

6. Now let's place the remaining fields inside the .card__content wrapper.  We will temporarily hide the fields in the .card__media wrapper to focus on the other fields.  Modify your markup to look like this:

<article class="card">
 <div class="card__media">...</div>

 <div class="card__content">
   <header class="card__header">
     <h2 class="card__title">City Lights in New York</h2>
     <div class="card__subtitle">The city that never sleeps</div>
   </header>
   <p class="card__excerpt">New York, the largest city in the U.S., is an architectural marvel with plenty of historic monuments, magnificent buildings and countless dazzling skyscrapers.</p>

   <footer class="card__meta" role="contentinfo">
     <span class="card__post-date">6 min ago</span>
     <span class="card__comments">39 comments</span>
   </footer>
 </div>
</article>

The first thing we did was add a `<header>` tag to wrap the title and subtitle fields.  The header tag is a great way to indicate the role content plays in our component, and using semantically correct markup makes the component both SEO friendly and accessible.  Next we have the teaser content wrapped in a `<p>` tag, and finally we add another semantic tag, `<footer>`, to wrap the publish date and comments fields.

Why this markup?

I'd like to bring a couple of things to your attention regarding how I arrived at the markup above.  First, we are using BEM for naming CSS classes (`card__title`, `card__comments`, etc.).  BEM is great for creating associations between a component and its fields: by looking at the markup, you know where each field belongs.  This makes it easier to identify where you would find the styles or other assets for the component within your project when you have tons and tons of components.  The class `card` and all the other field classes are unique, not only to this component but across the entire project.  This gives us peace of mind that our styles will not inadvertently affect other areas of the website.

Notice I created two containers within the card (`card__media` and `card__content`). This allows me to split the card content into groups I can manipulate individually and independently of each other.  This will come in handy when we need to create a long or wide card.  In addition, by grouping multiple fields into a single container it is easier to manipulate the fields with CSS as a group rather than individually.

Atomic Design

If you are familiar with Atomic Design, you know the recommended way to build components is to break things down into the smallest pieces possible.  For example, in the card structure above, each of the fields that make up the card could be its own individual component, and we would then combine them all into a whole new component (the card).  In the interest of time, I am not breaking the card component into atoms.  However, if I were building the card for a real project I would definitely do that, as it allows components to be reused and saves time and work in the long run.

Putting the entire card together

Now that we have built each piece of the card, your **card.html** file should look like this:

<article class="card">
 <div class="card__media">
   <img src="http://placehold.it/300x300" alt="Card image" />
   <div class="card__date">
     <span class="date--day">27</span>
     <span class="date--month">Mar</span>
   </div>
   <span class="card__category">Photos</span>
 </div>

 <div class="card__content">
   <header class="card__header">
     <h2 class="card__title">City Lights in New York</h2>
     <div class="card__subtitle">The city that never sleeps</div>
   </header>
   <p class="card__excerpt">New York, the largest city in the U.S., is an architectural marvel with plenty of historic monuments, magnificent buildings and countless dazzling skyscrapers.</p>

   <footer class="card__meta" role="contentinfo">
     <span class="card__post-date">6 min ago</span>
     <span class="card__comments">39 comments</span>
   </footer>
 </div>
</article>

Card Styles

Now that we have the markup ready for the card component, we can begin writing our Sass/CSS to style it.  I am not going to go into detail about the styles, as I have commented the parts that may need some explanation; the rest should be easy to understand.  The main takeaway from writing styles for components with unique class names is that there is little to no CSS nesting.  This makes overriding styles an easy task if there is ever a need to do so.  In addition, styles written for a component like this don't run the risk of leaking into other areas of the website.

That's a lot of CSS, but it's all necessary to achieve the look and feel of the card.  It is important to know that your markup file (card.html) needs to reference your styles file (card.css) for things to work.  This may be obvious, but if you are following along, that's an extra step that needs to be done.
Creating a card variation

If you would like to see the card displayed wide (horizontally), modify your card wrapper to look like this:

<article class="card card--wide">...</article>
 

Component Example 2

Notice that by adding the CSS class `card--wide` to the original card component, we change the orientation of the card from long to wide, giving us an alternative variation to use without having to rebuild the card from scratch.  These variations are pretty powerful and present many advantages.

If you look at the styles, you will notice there is a block of CSS in which `card--wide` moves the card's content to the right of the image.  This is a great way to provide alternative views of a component without having to recreate its markup or styles.
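That block of CSS could look something like the following sketch (the breakpoint and proportions are illustrative, not taken from the actual stylesheet):

```css
/* Default (long) card: image stacked on top of the content. */
.card {
  display: flex;
  flex-direction: column;
}

/* Wide variation: at larger screens, place the content to the
   right of the image.  Breakpoint and widths are illustrative. */
@media (min-width: 48rem) {
  .card--wide {
    flex-direction: row;
  }

  .card--wide .card__media {
    flex: 0 0 40%; /* image column */
  }

  .card--wide .card__content {
    flex: 1; /* content fills the remaining space */
  }
}
```

Because the modifier only changes layout, everything else about the card (markup, colors, typography) is inherited unchanged from the base component.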

In closing

Building components is one of my favorite things to do as a developer and there are many advantages to this style of theming.  It's important to think through how components will be used so you can plan ahead how to best build a component that can be reused.  Rushing through building components without having a full understanding of where and how a component may be used can lead to duplicating code or redoing your work to accommodate new project requirements.

Aug 01 2018

This is part two of a two-part series.

In part one, we cogently convinced you that regardless of what your organization does and what functionality your website has, it is in your best interest to serve your website securely over HTTPS exclusively. Here we provide some guidance as to how to make the move.

How to transition to HTTPS

To fully transition to HTTPS from HTTP means to serve your website HTML and assets exclusively over HTTPS. This requires a few basic things:

  • A digital certificate from a certificate authority (CA)
  • Proper installation of the certificate on your website’s server
  • Ensuring all assets served from your website are served over HTTPS

Let’s break these down.

Acquire a digital certificate

As briefly discussed in part one, to implement HTTPS for your website you must procure a digital certificate from a certificate authority. Just like domain name registrars lease domain names, CAs lease digital certificates for a set time period. Each certificate has a public and private component. The public component is freely shared and allows browsers to recognize that a “trusted” CA was the one to distribute the certificate. It is also used to encrypt data transmitted from the browser. The private complement is only shared with the purchaser of the certificate, and can uniquely decrypt data encrypted by the public key. CAs use various methods through email or DNS to “prove” that the person who purchased the certificate is a rightful administrator of the domain for which they purchased it. Once you’ve received the private key associated with the certificate, you’ll need to install it on your server. Annual certificate costs can be as much as $1,000 or as little as nothing. More on that in a moment.
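As a hypothetical example of the first step, here is how you might generate a private key and the certificate signing request (CSR) a CA asks for, using OpenSSL; the domain and subject fields below are placeholders, and your CA's exact process may differ:

```shell
# Generate a new 2048-bit RSA private key and a CSR in one step.
# -nodes leaves the key unencrypted on disk; -subj fills in the
# certificate fields non-interactively (placeholder values).
openssl req -new -newkey rsa:2048 -nodes \
  -keyout example.com.key \
  -out example.com.csr \
  -subj "/C=US/ST=State/L=City/O=Example Org/CN=www.example.com"

# Sanity-check the CSR before submitting it to the CA.
openssl req -in example.com.csr -noout -subject
```

The `.key` file is the private component you keep secret and later install on your server; the `.csr` file is what you hand to the CA.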

Install the certificate


Installing an HTTPS certificate manually on a server is not a trivial engineering task. We explain this at a high-level in part one. It requires expertise from someone who is experienced and comfortable administering servers. There are many different installation methods unique to each permutation of server software and hosting platform, so I won’t expend any real estate here attempting to lay it out. If you have to do a manual installation, it’s best to search your hosting provider’s documentation. However, depending on the complexity of your website architecture, there are ways to ease this process. Some hosting platforms have tools that substantially simplify the installation process. More on that in a moment as well.
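For a sense of what the end result can look like, here is a minimal sketch for nginx, with placeholder paths and domain; your server software and hosting platform will dictate the real configuration:

```nginx
# Serve the site over HTTPS using the installed certificate and key.
server {
    listen 443 ssl;
    server_name www.example.com;

    ssl_certificate     /etc/ssl/certs/example.com.crt;
    ssl_certificate_key /etc/ssl/private/example.com.key;
}

# Redirect all plain-HTTP requests to the HTTPS version of the site.
server {
    listen 80;
    server_name www.example.com;
    return 301 https://$host$request_uri;
}
```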

Serve all resources over HTTPS: avoid mixed content

Once you’ve installed your certificate, it’s time to ensure all assets served from your pages are served over HTTPS. Naturally, this entire process should be completed in a staging environment before making the switch to your production environment. Completing a full transition to HTTPS requires attention to detail and diligence. “Mixed content”, or serving assets from a page over both HTTP and HTTPS, can be both tedious and insidious to rectify. The longer your site has been around, the more content there is and the more historic hands have been in the pot of content creation, the more work there will be to make the switch. Depending on your platform (CMS or otherwise) and how it was designed, there may be many avenues for different stakeholders to have included assets within a page over time. Developers, site admins, and content editors usually have the ability to add assets to a page. If any assets start with http://, they’ll need to be updated to https:// to prevent mixed content warnings.

We have recently helped a client who has been publishing articles at a high cadence for over 10 years with many different stakeholders over that period. Practices weren’t consistent and uncovering all the ways in which HTTP resources were served from the page was a substantial undertaking. Therefore, be prepared for a time investment here – there may be many areas to audit to ensure all assets from all your pages are being served over HTTPS. Some common ways mixed content HTTP assets are served from a site whose HTML is served over HTTPS are:

  • Hard-coding a resource: e.g. http://www.example.com/img/insecure-image.jpg
  • Using a 3rd-party library or ad network reference: http://www.example.com/js/analytics.js
    • This is common for libraries that haven’t been updated in a while. Almost all of them now serve the same assets securely over the same path via HTTPS.
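One low-tech way to start the audit is simply to search your codebase or content export for hard-coded `http://` references. This is a sketch with a made-up sample file, not a complete audit:

```shell
# Create a tiny sample "export" to search through (illustrative only).
mkdir -p site-export
cat > site-export/old-article.html <<'EOF'
<img src="http://www.example.com/img/insecure-image.jpg" alt="">
<script src="https://cdn.example.com/js/analytics.js"></script>
EOF

# Recursively list every src/href attribute still pointing at plain HTTP.
# Each hit is a file and line number that needs updating to https://.
grep -rnE '(src|href)="http://' site-export/
```

Only the image line is reported, since the script tag already uses HTTPS.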

Even practices that were previously encouraged by Paul Irish, a leading web architect for the Google Chrome team, may have contributed to your mixed content problem, so don’t feel bad. Just know, there will likely be work to be done.

The risk of mixed content

These “not secure” bits of mixed-content expose the same risk that your HTML does when served over HTTP, so browsers rightfully show the user that the experience on your site is “not secure”.

Mixed content is categorized in two ways: active and passive. A passive mixed-content asset, such as an image, doesn’t interact with the page but is merely presented on it. An active mixed-content asset, such as a JavaScript file or stylesheet, exists to manipulate the page. Passive mixed content, albeit categorically less severe than active, still gives an attacker opportunities to learn a lot about an individual’s browsing patterns and even trick them into taking actions they don’t intend. Therefore, passive mixed content still constitutes enough of a threat for the browser to display a warning.

mixed content browser errors

In the case of a compromised active asset, such as a JavaScript file, an attacker can take full control of the page and any interaction you have with it, like entering passwords or credit card information. The mechanism behind this is a somewhat sophisticated man-in-the-middle attack, but suffice it to say, if the browser recognizes the vulnerability, the best scenario is the poor user experience we discussed in part one; the worst is total data compromise. Your audience, and by association your organization, will be seeing red.
Chrome misconfigured HTTPS

The good news about moving to HTTPS

Ensuring your website serves assets exclusively over HTTPS is not as hard as it used to be, and is getting easier by the day.

There are free digital certificates

There’s no such thing as a free lunch, but a free certificate from a reputable CA? It would seem so. People are just giving them away these days… Seriously: many years in the making since its founding by two Mozilla employees in 2012, the Let’s Encrypt project has vowed to make the web a secure space and has successfully endeavored to become a trusted CA that literally does not charge for the certificates it provides. Its certificates have a shorter lease cycle of 90 days, but the project also offers tooling to automate the process of reinstalling newly provided certificates.

There are easier and cheaper ways to install certificates

With the advent of the aforementioned free certificate, many platform-as-a-service (PaaS) hosting options have incorporated low cost or free installation of certificates through their hosting platform sometimes as easily as a few clicks. Let’s Encrypt has been adopted across a broad range of website hosting providers like Squarespace, GitHub Pages, Dreamhost, all of which we use alongside many others.

For many of our Drupal partners, we prefer to use a platform as a service (PaaS) hosting option like Pantheon, Acquia, or Platform.sh. Both Pantheon and Platform.sh now provide a free HTTPS upgrade for all hosting plans; Acquia Cloud, another popular Drupal PaaS, is still a bit behind in this regard. We have found that the efficiency gains of spending less time in server administration translates to more value to our clients, empowering additional effort for the strategy, design, and development for which they hired us. In addition to efficiency, the reliability and consistency provided by finely tuned PaaS offerings are, in most cases, superior to manual installation.

A good example of the evolution of hosting platforms maturing into the HTTPS everywhere world is our own Jekyll-based site, which we’ve written about and presented on before. We first set up HTTPS over GitHub pages using CloudFlare guided by this tutorial since we found it necessary to serve our site over HTTPS. However, about a year later GitHub announced they would provide HTTPS support for GitHub pages.

Similarly, we had previously implemented Pantheon’s workaround to make HTTPS on all of their tiers accessible to our clients on their hosting platform. Then they announced HTTPS for all sites. We’re thankful both have gotten easier.

There are tools to help with the transition to HTTPS

Through its powerful Lighthouse suite, Google has a tool to help audit and fix mixed content issues. Given the aforementioned tedium and potential difficulty of tracking down all the ways in which people have historically added content to your site, this can be an invaluable time saver.

You can also use tools like Qualys SSL Labs to verify the quality of your HTTPS installation. See how our site stacks up.

Wrap-up

Given the much greater ease at which many modern hosting platforms allow for HTTPS, the biggest barrier, primarily a matter of effort, is to clean up your content and make sure all assets are served over HTTPS from all pages within your website. So, if you haven’t already, start the transition now! Contact us over the phone or email if you need assistance and feel free to comment below.

Jul 31 2018

The Superfish module allows you to create multi-level dropdown menus in Drupal 8. The module uses the JavaScript Superfish library to create and display a Superfish menu block for each menu available on your site.

With a few configuration options, you can control how it behaves on mobile, turn on multi-column menus, change the styling, and more.

The module does come with a few styling options, but you’ll have to style it yourself to match your theme. When you configure Superfish for the first time, the dropdown functionality will work; however, it may not look good out of the box.

In this tutorial, you’ll learn how to install the module and how to configure it.

Let’s start!

Getting Started

It’s recommended that you install the module using Composer. This way Composer will automatically download the required library.

composer require drupal/superfish

Installing the Superfish module

If you don’t want to use Composer then you’ll need to manually download the library and extract it into the libraries directory in your Drupal site. View the Installation section on the project page for more details.

After downloading, enable the module either by clicking the module’s checkbox in the Extend Page or using Drush.

drush en superfish -y

Enabling the module with Drush

Create Menu

As an example, let’s create a site with a 4 level menu. Superfish is capable of handling unlimited menu levels.

1. Click Structure, Menus and click the “Edit menu” button beside the “Main Navigation” menu.

2. Click the “Add link” button to add a menu item, and make sure you mark the “Show as expanded” checkbox if the menu item will contain children.

Creating the menu

3. Click the Save button each time you create a menu item and repeat the process for the whole menu structure. At the end you should have something like this:

Menu structure

Configure Superfish Menu Block

It’s time to get rid of the default menu block and place the Superfish menu block in one of Drupal’s regions.

1. Click Structure, “Block layout”. Look for the “Primary menu” region, click the dropdown button by the “Main Navigation” block and choose Remove. Confirm by clicking the blue Remove button.

Removing the main menu block

2. Now click “Place block” in the “Highlighted” region, type the word main in the search box and click “Place block” by the Superfish “Main navigation” block.

Inserting the Superfish block

Let’s review the settings one by one.

  • Uncheck the “Display title” option. This is Drupal’s default block configuration.
  • “Menu levels” option lets you configure the display of the menu and the number of levels permitted. Leave these options untouched.
  • “Block settings” area contains configuration about the menu appearance and its basic behavior.
  • The “Menu type” option “Horizontal (double row)” shows the first- and second-level menu items in a horizontal display.
  • The “Vertical (stack)” option is the right choice if you want to place the menu in one of the sidebars, for example.
  • The style option lets you choose a menu with some predefined styles.
  • The “Slide in” effect option is there to configure the dropdown functionality of all submenus. You can install the jQuery Easing library (read the module’s documentation) if you want to have more options here.

Superfish settings

  • The “Superfish plugins” area deals with how the dropdown menus display in the browser and on small screens. Take a look at the possibilities, but generally speaking you won’t need to change these options; leave the defaults. Pay attention to the Supersubs area, where you can define the width of your submenus.

Superfish plugins area

  • In the “Advanced Settings” area, you can control the animation speed of the Superfish menu, some hyperlink properties and you can also add custom CSS classes to the menu components.

Superfish advanced settings

Create Multi-column Menus

It is possible to create multi-column menus with the Superfish module; however, you need a specific menu structure to achieve this.

Wrong menu structure

Right menu structure

Just check “Enable multi-column sub-menus” in order to use them.

Multi-column menus setting

Summary

The Superfish module allows you to build dropdown and multi-column menus and place them in your Drupal site in an easy way. But you’ll still need to spend some time styling the menus to match your theme.

Jorge Montoya

About Jorge Montoya

Jorge has been reading about Drupal and playing with it for almost 5 years. He likes it very much! He also likes to translate German and English documents into Spanish. He's lived in some places. Right now, he lives in the city of Medellín in his homeland Colombia with his wife and son.

Jul 30 2018
Comparing Drupal POS, Shopify POS and Square POS


If you need to accept card payment in a physical location, you need a point of sale (POS) system. There are many different POS systems out there, so choosing the right one for your business can be challenging. Every system claims to be everything you need; however, this might not be the case for all businesses. Most POS systems are designed around “industry best practices,” meaning they try to serve the majority of businesses based on the most common needs. Many systems start to fail when the requirements of the business break away from the norm.

How do you choose the right point of sale for your business? The best way I’ve found is to look at three or four different examples and do a direct comparison. Today I’ll compare three different web-based point of sale systems: Drupal POS, Shopify POS, and Square POS. I’ll look at features, costs, usability, integrations, and more. In the end, I’ll try to understand the strengths and weaknesses of each and ultimately determine which business types they work best with.

All of the POS systems I examine today are web-based (or cloud-based), meaning they are connected to the internet and all of the data is kept online. Web-based systems are becoming increasingly popular because they are generally easier to set up and require less time and knowledge to maintain. They can also integrate with your eCommerce store. You can read more benefits here.

The point of sale systems

Here is an introduction to the three POS systems I’ll be comparing.

Drupal POS

Drupal POS is a free add-on to the popular Drupal content management system. Drupal is open-source and completely free to use. It’s known as a very developer-friendly platform to build a website on and has a massive community, over a million strong, helping to advance the software and keep it secure. The open-source eCommerce component for Drupal is called Drupal Commerce. While Drupal Commerce has a relatively small market share, the platform is very powerful and can be a very good choice for businesses that have demanding requirements or unique product offerings.

Shopify POS

Shopify POS integrates with the popular Shopify SaaS eCommerce platform. Unlike Drupal Commerce, Shopify is a standalone product and stores running on the platform pay a monthly subscription fee to use it. With that said, business owners are given a well developed tool out-of-the-box that has all of the bells and whistles most stores require to get up and running fast. Shopify aims to serve the common needs of most businesses, so very unique business requirements can be hard to achieve.

Square POS

Square POS is an add-on point of sale service for your business and is not really a platform for running your entire store, although it does now offer a basic eCommerce component. It can also integrate with many eCommerce platforms, including Drupal Commerce. Square aims to make the process of accepting card payment easy to do, without bulky equipment.

Service comparison

Below is a side-by-side comparison of each service (as of July 2018). Note that some of the information below applies to stores that also have an eCommerce component. If you don’t need eCommerce, you can ignore those items.


 

| Feature | Drupal POS | Shopify POS | Square POS |
| --- | --- | --- | --- |
| Service philosophy | Open-source | Proprietary | Proprietary |
| Service support | Yes (via Drupal Commerce, in-house IT, or third-party support) | Yes (via Shopify or third-party support) | Yes (via Square) |
| Setup costs for basic service | $0 (the software is free, but you may need to pay someone to set it up) | $29 USD (basic package pricing) | $0 |
| Ongoing costs for basic service | $0 (free software; you may pay someone to apply occasional updates; third-party transaction fees may apply; website domain and hosting also required) | $29/mth plus transaction fees and add-on product fees (monthly fee increases with package) | Transaction fees and add-on product fees |
| Payment gateways | Third-party | Shopify or third-party | Square |
| Accept cash payments | Yes | Yes | Yes |
| Accept card payments | Yes | Yes | Yes |
| Save cards (card on file) | Yes | Yes | Yes |
| Process recurring payments (i.e. subscriptions) | Yes | Yes (third-party add-on required, with separate monthly fees) | Yes |
| Accept mobile payments | Yes (third-party hardware required) | Yes (monthly fee for service hardware) | Yes ($59 USD one-time price for service hardware) |
| Built-in invoicing | Yes (using free add-on) | Yes | Yes |
| Apply discounts and promotions | Yes | Yes | Yes |
| Use with gift cards & coupon codes | Yes | Yes (not available for basic plan) | Yes |
| Printed gift cards provided by service | No (an add-on could be created to allow this, but none currently exists) | Yes (additional fee for printing) | Yes (additional fee for printing) |
| Integrated taxes | Yes (advanced taxes via third-party add-ons or configured directly within the platform) | Yes | Yes (third-party add-ons required) |
| Apply additional custom fees (environment fees, tipping, donations, etc.) | Yes | Yes | Yes (limited to tipping) |
| Built-in eCommerce shop | Yes (Drupal POS is an add-on for Drupal Commerce) | Yes (Shopify POS is an add-on for Shopify) | Yes (basic Square store, or integrate with third-party platforms) |
| Built-in website and blog | Yes | Yes | Yes |
| Multi-business (separate businesses using same platform or account) | Yes | No (separate account required for each business) | No (separate account required for each business/bank account) |
| Multi-store (multiple locations or stores of the same business) | Yes | Yes | Yes |
| Number of products allowed | Unlimited | 2,000–7,000 (depends on device used to manage inventory) | Unlimited (Square eCommerce store only displays 1,000 products; third-party platform needed to run a larger store) |
| Number of product variations allowed | Unlimited | 4,000–10,000 (depends on device used to manage inventory) | Unlimited (same 1,000-product display limit as above) |
| Number of registers allowed | Unlimited | Unlimited | Unlimited |
| Number of cashier accounts allowed | Unlimited | 2 (number of accounts increases with service plan) | Unlimited |
| Access controls | Yes | Yes | Yes (additional fee of $6/employee) |
| Create new user roles for advanced access controls | Yes | No | Yes (grouped with the per-employee fee above) |
| Mobile POS (i.e. use at trade shows, markets, etc.) | Yes | Yes | Yes |
| Sync inventory between online and offline stores | Yes | Yes | Yes (third-party platforms may not be able to sync inventory) |
| Sync user accounts between online and offline stores | Yes | Yes | Yes |
| Sync orders between online and offline stores | Yes | Yes | Yes |
| Park & retrieve orders | Yes | Yes | Yes |
| Abandoned cart recovery (eCommerce) | Yes (using free add-on or third-party solutions) | Yes | Yes (requires third-party solutions) |
| Generate product labels | Yes | Yes | Yes |
| Print receipt | Yes | Yes | Yes |
| Email receipt | Yes | Yes | Yes |
| Customize receipt information | Yes | Yes (no layout customization, only the information shown) | Yes (no layout customization, only the information shown) |
| Process returns | Yes | Yes | Yes |
| Basic reporting | Yes | Yes (not available for basic plan) | Yes |
| Advanced reporting | Yes (using free add-on) | Yes (not available for basic or mid-tier plans) | Yes |
| Supported operating systems | Any (requires only a web browser) | Android, iOS (requires app; iPad recommended, with limited support for iPhone and Android) | Android, iOS (requires app) |
| Themable (i.e. brand the POS interface) | Yes | No | No |
| Customer-facing display | Yes | No | No |
| Integrate with accounting/bookkeeping services | Yes | Yes | Yes |
| Integrate with other eCommerce sales platforms (Amazon, eBay, etc.) | Yes | Yes | Yes (only if using a third-party eCommerce platform that supports this) |
| Integrate with marketing services (MailChimp, HubSpot, etc.) | Yes | Yes | Yes (only if using a third-party eCommerce platform that supports this) |
| Integrate with shipping providers (FedEx, UPS, etc.) | Yes | Yes | Yes |
| Third-party calculated shipping rates | Yes | Yes (not available for basic or mid-tier plans) | No |
| Generate shipping labels | Yes | Yes | Yes (integration with ShipStation adds this functionality for an extra monthly cost) |
| Custom integrations with third-party services | Yes | Yes | Yes |
| Use offline (and have transactions sync once back online) | No (a requested feature currently in discussion) | Yes (can only accept cash or other manual payments) | Yes |
| Personalized customer feedback/support | Yes | Yes | Yes |

Hardware Requirements

| Hardware | Drupal POS | Shopify POS | Square POS |
| --- | --- | --- | --- |
| Cashier terminal | Third-party (anything that runs a web browser: computer, tablet, phone, etc.) | Third-party (iPad recommended, with limited support for iPhone and Android) | Third-party (any device running Android or iOS) |
| Card reader | Third-party | Provided | Provided |
| Contactless payment | Third-party | Third-party | Proprietary only |
| Cash drawer | Third-party | Third-party | Third-party |
| Barcode scanner | Third-party (a traditional barcode scanner or anything with a camera: phone, tablet, webcam, etc.) | Third-party | Third-party |
| Receipt printer | Third-party | Third-party | Third-party |
| Barcode printer | Third-party | Third-party | None |
| Customer-facing display | Third-party (anything that runs a web browser) | None | None |
| Custom/DIY hardware | Yes | No | No |

What business is best suited for each POS?

As you can see, all three options have most of the same features. Most businesses would probably be fine with any of them, but let’s see if we can distil down where each system fits best.

Drupal POS

Who’s it for?

If you have a medium to large business with unique business requirements, Drupal POS could be the ideal platform for you. For a small business, Drupal POS and Drupal Commerce might not be the right fit: the initial cost to get a site built might be too high for your budget. However, if you look at the long-term fees charged month after month by the other vendors, this upfront cost pays for itself in a matter of time. Also, if you have a really obscure need that no other platform will accommodate, Drupal Commerce can.

If you’re already running a Drupal Commerce store and now want to add point of sale to your physical locations, Drupal POS is probably a no-brainer. It’s built on top of the existing Commerce architecture, so you know it will integrate properly in every way, and you can utilize your existing web development service provider to help you set it up.

Demo Drupal Commerce today! View our demo site.

Additional details:

If you’re not already using Drupal then you have some larger questions to consider. Do you already have an ecommerce website? Would you be willing to invest in replatforming? Since Drupal Commerce is an eCommerce platform, you would ideally be running your whole operation from Drupal Commerce. That’s not necessarily a bad thing though. Drupal can readily handle any business case you can throw at it. It can integrate with virtually any third-party service, it can provide you with a single location to manage all of your products, orders, customer accounts, etc., it’s built to scale with your business, and on top of all that it’s a powerful content management system that will run your blog and any other content need you might have.

From a support point of view, because Drupal is open-source, you don’t have a single source of support to contact. Instead, you would need to utilize your current web development service provider (if you have one), or work with one of the many agencies out there specialized in Drupal development. This means you can shop around and find the company that will work best with you.

Another advantage to Drupal POS (and Drupal as a whole) is that because it’s free, open-source software, you don’t actually have any type of fee to use it. Not one cent. You can have as many stores, products, staff accounts, transactions, registers, etc. as you need, and the price is still $0. Instead of spending your hard earned money on platform fees, you can now redirect those funds to developing your website and POS to do whatever you need it to, or towards marketing, or staffing, or growing your business.

Shopify POS

Who’s it for?

If you’re a small to medium sized business that is just getting started, you don’t have a large budget, and you want the best eCommerce site with POS capabilities, Shopify and Shopify POS are probably your best bet. Likewise, if you’re already running a Shopify site and are happy with it, Shopify POS is probably ideal for you.

If your business is growing, or you run a large, enterprise-level company, Shopify and Shopify POS probably won’t cut it for what you need. For one, the fees associated with this level of company can be significant. If you’re at that point, replatforming to something like Drupal Commerce can recoup a lot of lost earnings and give you full control of your development path, without restrictions.

Additional details:

Shopify has built their business around being easy. Whether it’s opening up a new store or managing your inventory and customers, the Shopify interface is clean and straightforward. As mentioned earlier, it’s ideal for small and medium sized companies just getting started.

However, Shopify starts to fail when your business growth is strong and your requirements become more complicated. With Shopify, the number of products and product variations you’re allowed can limit your growth. As you add more staff, your costs go up, and you can go from a $29/month plan to a $300+/month plan in short order.

Another possible deal-breaker is if your product offerings have very unique requirements. Shopify is built to work around the most common business requirements; when your business breaks out of this mold, the platform isn’t designed to accommodate it. If you can stay within “typical” business requirements, though, Shopify probably has everything you need, as long as you’re willing to pay for it.

Square POS

Who’s it for?

Square POS is great for small businesses and food service businesses. It’s an easy-to-use, low-cost option that doesn’t require anything more than your phone and the provided card reader, and its software interface is clean and easy to understand.

If you’re a medium to large business, or you have very high traffic, Square POS might not be for you. Square is mainly an add-on service to existing businesses, so don’t expect much from an eCommerce perspective. 

Additional details:

Square has become a pretty common sight around town these days, especially at small businesses such as cafes or at farmers’ and artisan markets. Square provides a very good product that lets people start taking card transactions easily, and it fills that need well.

When your business grows and you start having multiple stores and an eCommerce component, you may quickly grow beyond Square’s capabilities. Drupal POS and Shopify POS both have native eCommerce that they work with. This is important when you’re talking about inventory management and other integrations. While Square does have a basic eCommerce component and can integrate with various eCommerce platforms (Drupal Commerce being one of them), you may struggle to get some of the features that Drupal Commerce and Shopify have by default.

Your point of sale integrator

Acro Media is an open-source eCommerce development agency. Our experience in this area is vast, and we would love to share it with you. If you have a project you’d like to discuss, one of our friendly business developers is always available to have that conversation at no cost to you.

Contact Acro Media Today!

Jul 27 2018
Jul 27

Five Drupal Features essential to the publishing industry.

If you are in the publishing industry, you already know that Drupal 8 is by far the most useful CMS for publishers. It was great in Drupal 6 and 7, and it keeps getting better with each release of Drupal 8. Combined with community-contributed modules, Drupal 8 is the best platform for publishers yet. Here are five features in Drupal 8 that are essential to publishing.

1. Workflows

Moderation of content is crucial for quality when you have a lot of authors across multiple platforms. Drupal 8 has that solved out of the box with the core Workflows and Content Moderation modules. Check out this short post we did on Workflows in Drupal 8 core for more info. Your CMS workflow should match your internal publishing workflow, and Drupal 8 can automate most manual processes that you currently have. This is a prime example of getting your CMS to do some heavy lifting within your organization.

2. Integrations

Drupal is a fantastic framework for integrations, and in the publishing business you are going to need a decent number of them: Google Tag Manager, DoubleClick for Publishers, HubSpot, and Salesforce, just to name a few. Drupal’s object-oriented framework allows integrations to be built quickly and efficiently, and they can be custom-tailored to your business needs. Many other popular CMSs and site builders are limited to their available integrations and options; Drupal is an open framework that can be custom-built to do just about anything you need.

3. Content Management

Organizing, managing, and scheduling content is key to a successful publishing site. Drupal 8 builds on top of the fantastic content management system started in Drupal 7. Custom admin tools tailored to your business can be built on the fly using the core Views module. With custom tools you can build admin pages for any type of content you wish, even for niche needs. For example, if you want an admin screen for articles about cats that were published in 2012, you can very easily make one. Content admin tools can also be built around the permissions of the different roles in your organization, so you can fine-tune what access your editors, authors, admins, and others have. This is all out-of-the-box functionality in Drupal 8 and requires no custom code.

4. Asset Management

Like content management, asset management is fantastic in Drupal 8. Assets are treated much in the same way that content is. When I say assets I’m loosely defining them as any reusable element. Images, video, audio clips and other files are obvious assets, but you can also create some more obscure assets like tweets or Instagram posts. Assets are fieldable, so you can add additional data for display and categorization. Using the core Media module, you can manage all of your assets. Custom libraries can be created as well as custom selection widgets that are restricted to specific types of assets. This can be a huge timesaver for assets that are commonly used across many authors.

5. Web Services

Drupal 8 comes with a RESTful Web Services API in core. To put it simply, a RESTful web service can be a conduit that sends your content from Drupal to another system, like a native app. This can jump-start a “publish once, distribute everywhere” content model, which for publishers can be essential for content control and efficiency. There is an API-first initiative active in Drupal 8 development that is pushing new features into each release. Beyond the out-of-the-box RESTful Web Services API, there are several contrib modules, including JSON API and GraphQL. These web services are also what give Drupal the ability to go headless and communicate with a JavaScript front end such as React.
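As a sketch of what the consuming side of “publish once, distribute everywhere” looks like: core’s REST module serves a node at a path like /node/1?_format=json, with field values wrapped in arrays. The payload below is illustrative only (not output from a real site), but it mimics that shape so you can see how a decoupled client would pick out the pieces it needs.

```javascript
// Illustrative payload in the shape Drupal 8 core REST returns for a node.
// A native app or React front end would fetch this over HTTP; here we just
// parse a sample string to show how the structure is consumed.
const response = JSON.stringify({
  nid: [{ value: 1 }],
  type: [{ target_id: 'article' }],
  title: [{ value: 'Hello from Drupal' }],
  body: [{ value: '<p>Publish once, distribute everywhere.</p>', format: 'basic_html' }],
});

const node = JSON.parse(response);

// Each field is an array of value objects, so index into it first.
const title = node.title[0].value;
console.log(title); // "Hello from Drupal"
```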

Drupal as the hub to a publishing ecosystem

We will cover more on this in the near future, but these five features really show why Drupal 8 is a first-class CMS for the publishing industry. Drupal 8, combined with great content, is a killer recipe for a successful publishing site, and its out-of-the-box features will save you time and money. If you are a publisher who is constantly struggling and fighting with your website, it’s time to give Drupal 8 a look.

Offer for a free consultation to determine if Drupal is the right choice for your organization

Jul 26 2018
Jul 26

Intro

In this post, I’m going to run through how I set up visual regression testing on sites. Visual regression testing is essentially the act of taking a screenshot of a web page (whether the whole page or just a specific element) and comparing that against an existing screenshot of the same page to see if there are any differences.
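At its core, the comparison step works like this toy sketch. Plain arrays stand in for PNG pixel data, and the function names (mismatchPercentage, isWithinMisMatchTolerance) are modeled on the idea behind the wdio visual regression service rather than copied from it:

```javascript
// Compare two "screenshots" pixel by pixel and compute a mismatch percentage.
function mismatchPercentage(baseline, latest) {
  if (baseline.length !== latest.length) {
    throw new Error('Screenshots must have the same dimensions');
  }
  let diffs = 0;
  for (let i = 0; i < baseline.length; i++) {
    if (baseline[i] !== latest[i]) diffs++;
  }
  return (diffs / baseline.length) * 100;
}

// A test "passes" when the mismatch is within a configured tolerance.
function isWithinMisMatchTolerance(baseline, latest, tolerance = 0.01) {
  return mismatchPercentage(baseline, latest) <= tolerance;
}

const baseline = [0, 0, 0, 255];    // e.g. a black button (#000)
const changed  = [51, 51, 51, 255]; // the same button recolored to #333
console.log(isWithinMisMatchTolerance(baseline, baseline)); // true
console.log(isWithinMisMatchTolerance(baseline, changed));  // false
```

Real tools operate on rendered screenshots and can also highlight *where* pixels differ, but the pass/fail decision is essentially this tolerance check.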

There’s nothing worse than adding a new component, tweaking styles, or pushing a config update, only to have the client tell you two months later that some other part of the site is now broken, and you discover it’s because of the change that you pushed… now it’s been two months, and reverting that change has significant implications.

That’s the worst. Literally the worst.

All kinds of testing can help improve the stability and integrity of a site. There’s Functional, Unit, Integration, Stress, Performance, Usability, and Regression, just to name a few. What’s most important to you will change depending on the project requirements, but in my experience, Functional and Regression are the most common, and in my opinion are a good baseline if you don’t have the capacity to write all the tests.

If you’re reading this, you probably fall into one of two categories:

  1. You’re already familiar with Visual Regression testing, and just want to know how to do it
  2. You’re just trying to get info on why Visual Regression testing is important, and how it can help your project.

In either case, it makes the most sense to dive right in, so let’s do it.

Tools

I’m going to be using WebdriverIO to do the heavy lifting. According to the website:

WebdriverIO is an open source testing utility for nodejs. It makes it possible to write super easy selenium tests with Javascript in your favorite BDD or TDD test framework.

It basically sends requests to a Selenium server via the WebDriver Protocol and handles its response. These requests are wrapped in useful commands and can be used to test several aspects of your site in an automated way.

I’m also going to run my tests on Browserstack so that I can test IE/Edge without having to install a VM or anything like that on my mac.

Process

Let’s get everything set up. I’m going to start with a Drupal 8 site that I have running locally. I’ve already installed that, along with a custom theme with Pattern Lab integration based on Emulsify.

We’re going to install the visual regression tools with npm.

If you already have a project running that uses npm, you can skip this step. But, since this is a brand new project, I don’t have anything using npm, so I’ll create an initial package.json file using npm init.

  • npm init -y
    • Update the name, description, etc. and remove anything you don’t need.
    • My updated file looks like this:
{
  "name": "visreg",
  "version": "1.0.0",
  "description": "Website with visual regression testing",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  }
}

Now, we’ll install the npm packages we’ll use for visual regression testing.

  • npm install --save-dev webdriverio chai wdio-mocha-framework wdio-browserstack-service wdio-visual-regression-service node-notifier
    • This will install:
      • WebdriverIO: The main tool we’ll use
      • Chai syntax support: “Chai is an assertion library, similar to Node’s built-in assert. It makes testing much easier by giving you lots of assertions you can run against your code.”
      • Mocha syntax support: “Mocha is a feature-rich JavaScript test framework running on Node.js and in the browser, making asynchronous testing simple and fun.”
      • The Browserstack wdio package: so that we can run our tests against Browserstack, instead of locally (where browser/OS differences across developers can cause false-negative failures)
      • Visual regression service: this is what provides the screenshot capturing and comparison functionality
      • Node notifier: this is totally optional, but supports native notifications for Mac, Linux, and Windows. We’ll use these to be notified when a test fails.

Now that all of the tools are in place, we need to configure our visual regression preferences.

You can run the configuration wizard by typing ./node_modules/webdriverio/bin/wdio, but I’ve created a git repository with not only the webdriver config file but an entire set of files that scaffold a complete project. You can get them here.

Follow the instructions in the README of that repo to install them in your project.

These files will get you set up with a fairly sophisticated, but completely manageable visual regression testing configuration. There are some tweaks you’ll need to make to fit your project that are outlined in the README and the individual markdown files, but I’ll run through what each of the files does at a high level to acquaint you with each.

  • .gitignore
    • The lines in this file should be added to your existing .gitignore file. It’ll make sure your diffs and latest images are not committed to the repo, but allow your baselines to be committed so that everyone is comparing against the same baseline images.
  • VISREG-README.md
    • This is an example readme you can include to instruct other/future developers on how to run visual regression tests once you have it set up
  • package.json
    • This just has the example test scripts. One for running the full suite of tests, and one for running a quick test, handy for active development. Add these to your existing package.json
  • wdio.conf.js
    • This is the main configuration file for WebdriverIO and your visual regression tests.
    • You must update this file based on the documentation in wdio.conf.md
  • wdio.conf.quick.js
    • This is a file you can use to run a quick test (e.g. against a single browser instead of the full suite defined in the main config file). It’s useful when you’re doing something like refactoring an existing component, and/or want to make sure changes in one place don’t affect other sections of the site.
  • tests/config/globalHides.js
    • This file defines elements that should be hidden in ALL screenshots by default. Individual tests can use this, or define their own set of elements to hide. Update these to fit your actual needs.
  • tests/config/viewports.js
    • This file defines what viewports your tests should run against by default. Individual tests can use these, or define their own set of viewports to test against. Update these to the screen sizes you want to check.

Running the Test Suite

I’ll copy the example homepage test from the example-tests.md file into a new file /web/themes/custom/visual_regression_testing/components/_patterns/05-pages/home/home.test.js. (I’m putting it here because my wdio.conf.js file is looking for test files in the _patterns directory, and I like to keep test files next to the file they’re testing.)

The only thing you’ll need to update in this file is the relative path to the globalHides.js file. It should be relative from the current file. So, mine will be:

const visreg = require('../../../../../../../../tests/config/globalHides.js');

With that done, I can simply run npm test and the tests will run on Browserstack against the three OS/browser configurations I’ve specified. While they’re running, we can head over to https://automate.browserstack.com/ to watch the tests being run against Chrome, Firefox, and IE 11.

Once tests are complete, we can view the screenshots in the /tests/screenshots directory. Right now, the baseline shots and the latest shots will be identical because we’ve only run the test once, and the first time you run a test, it creates the baseline from whatever it sees. Future tests will compare the most recent “latest” shot to the existing baseline, and will only update/create images in the latest directory.

At this point, I’ll commit the baselines to the git repo so that they can be shared around the team, and used as baselines by everyone running visual regression tests.

If I run npm test again, the tests will all pass because I haven’t changed anything. I’ll make a small change to the button background color which might not be picked up by a human eye but will cause a regression that our tests will pick up with no problem.

In the _buttons.scss file, I’m going to change the default button background color from $black (#000) to $gray-darker (#333). I’ll run the style script to update the compiled css and then clear the site cache to make sure the change is implemented. (When actively developing, I suggest disabling cache and keeping the watch task running. It just makes things easier and more efficient.)

This time all the tests fail, and if we look at the images in the diff folder, we can clearly see that the “search” button is different as indicated by the bright pink/purple coloring.

If I open up one of the “baseline” images, and the associated “latest” image, I can view them side-by-side, or toggle back and forth. The change is so subtle that a human eye might not have noticed the difference, but the computer easily identifies a regression. This shows how useful visual regression testing can be!

Let’s pretend this is actually a desired change: the original component was created before the color was finalized, black was used as a temporary color, and now we want to capture the update as the official baseline. Simply move the “latest” image into the “baselines” folder, replacing the old baseline, and commit that to your repo. Easy peasy.

Running an Individual Test

If you’re creating a new component, or you run the suite and find a regression in a single image, it’s useful to be able to run one test instead of the entire suite. This is especially true once you have a large set of test files covering dozens of aspects of your site. Let’s take a look at how this is done.

I’ll create a new test in the organisms folder of my theme at /search/search.test.js. There’s an example of an element test in the example-tests.md file, but I’m going to do a much more basic test, so I’ll actually start out by copying the homepage test and then modify that.

The first thing I’ll change is the describe section. This is used to group and name the screenshots, so I’ll update it to make sense for this test. I’ll just replace “Home Page” with “Search Block”.

Then, the only other thing I’m going to change is what is to be captured. I don’t want the entire page, in this case. I just want the search block. So, I’ll update checkDocument (used for full-page screenshots) to checkElement (used for single element shots). Then, I need to tell it what element to capture. This can be any css selector, like an id or a class. I’ll just inspect the element I want to capture, and I know that this is the only element with the search-block-form class, so I’ll just use that.

I’ll also remove the timeout. Since we’re just taking a screenshot of a single element, we don’t need to worry about the page taking longer to load than the default of 60 seconds. (It really wasn’t necessary on the homepage test either, but it didn’t hurt.)

My final test file looks like this:

const visreg = require('../../../../../../../../tests/config/globalHides.js');

describe('Search Block', function () {
  it('should look good', function () {
    browser
      .url('./')
      .checkElement('.search-block-form', {hide: visreg.hide, remove: visreg.remove})
      .forEach((item) => {
        expect(item.isWithinMisMatchTolerance).to.be.true;
      });
  });
});

With that in place, this test will run when I use npm test because it’s globbing, and running every file that ends in .test.js anywhere in the _patterns directory. The problem is this also runs the homepage test. If I just want to update the baselines of a single test, or I’m actively developing a component and don’t want to run the entire suite every time I make a locally scoped change, I want to be able to just run the relevant test so that I don’t waste time waiting for all of the irrelevant tests to pass.

We can do that by passing the --spec flag.

I’ll commit the new test file and baselines before I continue.

Now I’ll re-run just the search test, without the homepage test.

npm test -- --spec web/themes/custom/visual_regression_testing/components/_patterns/03-organisms/search/search.test.js

We have to add the first set of -- because we’re using custom npm scripts to make this work. Basically, it passes anything that follows directly to the custom script (in our case test is a custom script that calls ./node_modules/webdriverio/bin/wdio). More info on the run-script documentation page.

If I scroll up a bit, you’ll see that when I ran npm test there were six passing tests: one for each test file against each browser. We have two tests, and we’re checking against three browsers, so a total of six tests were run.

This time, we have three passing tests because we’re only running one test against three browsers. That cut our test run time by more than half (from 106 seconds to 46 seconds). If you’re actively developing or refactoring something that already has test coverage, even that can seem like an eternity if you’re running it every few minutes. So let’s take this one step further and run a single test against a single browser. That’s where the wdio.conf.quick.js file comes into play.

Running Test Against a Subset of Browsers

The wdio.conf.quick.js file will, by default, run test(s) against only Chrome. You can, of course, change this to whatever you want (for example if you’re only having an issue in a specific version of IE, you could set that here), but I’m just going to leave it alone and show you how to use it.

You can use this to run the entire suite of tests or just a single test. First, I’ll show you how to run the entire suite against only the browser defined here, then I’ll show you how to run a single test against this browser.

In the package.json file, you’ll see the test:quick script. You could pass the config file directly to the first script by typing npm test -- wdio.conf.quick.js, but that’s a lot more typing than npm run test:quick, and you (as well as the rest of your team) would have to remember the file name. Capturing the file name in a second custom script simplifies things.
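For reference, the scripts section might look something like this (the exact contents come from the scaffolding repo; this is just the plausible shape, given that test calls the wdio binary with its default config and test:quick points at the quick config):

```json
{
  "scripts": {
    "test": "./node_modules/webdriverio/bin/wdio",
    "test:quick": "./node_modules/webdriverio/bin/wdio wdio.conf.quick.js"
  }
}
```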

When I run npm run test:quick, you’ll see that two tests were run: our two tests against one browser, which simplifies things quite a bit. And you can see it ran in only 31 seconds. That’s definitely better than the 100 seconds the full test suite takes.

Let’s go ahead and combine this with the technique for running a single test to cut that time down even further.

npm run test:quick -- --spec web/themes/custom/visual_regression_testing/components/_patterns/03-organisms/search/search.test.js

This time you’ll see that it only ran one test against one browser and took 28 seconds. There’s actually not a huge difference between this and the last run because we can run three tests in parallel. And since we only have two tests, we’re not hitting the queue which would add significantly to the entire test suite run time. If we had two dozen tests, and each ran against three browsers, that’s a lot of queue time, whereas even running the entire suite against one browser would be a significant savings. And obviously, one test against one browser will be faster than the full suite of tests and browsers.

So this is super useful for active development of a specific component or element that has issues in one browser, as well as when you’re refactoring code to make it more performant and want to make sure your changes don’t break anything significant (or, if they do, to be alerted sooner rather than later). Once you’re done with your work, I’d still recommend running the full suite to make sure your changes didn’t inadvertently affect another random part of the site.

So, those are the basics of how to set up and run visual regression tests. In the next post, I’ll dive into our philosophy of what we test, when we test, and how it fits into our everyday development workflow.

Brian Lewis

Brian Lewis is a frontend engineer at Four Kitchens, and is passionate about sharing knowledge and learning new tools and techniques.

Jul 26 2018
Jul 26

Encryption is an important part of any website that needs to store sensitive information. Encryption takes sensitive data that is in a readable form and encodes it, making it unreadable. This essentially hides the information from anyone who might try to access it without permission to do so. The encoded information can only be decoded by an entity that has a paired decryption key.

Our requirements for this particular Drupal website build included:

  • Acquia Cloud - One of the leading Drupal hosting providers.
  • Libsodium - Because we're on Acquia Cloud, we needed a custom-compiled PHP extension.
  • Encrypt - A Drupal module that exposes encryption APIs to other modules.
  • Key and Lockr.io - Drupal modules for managing the encryption key.
  • Sodium - A Drupal module to provide libsodium to the Encrypt module.

Why use libsodium instead of mcrypt?

Libsodium is a portable, cross-platform implementation of NaCl. Experts recommend libsodium for its simple interface and strong cryptography. The sodium Drupal module takes an easier approach, which is to use a high-level package, paragonie/halite, to work with libsodium.

The other choice for encryption in PHP is mcrypt. It's the default method in the Drupal 7 version of the Encrypt module. Despite that, it's a bad choice because it's difficult to use correctly: mcrypt was deprecated in PHP 7.1 and removed in PHP 7.2.

Contact us and learn more about our custom ecommerce solutions

Installing Libsodium on Acquia's PHP 7.0

PHP 7.2 has libsodium built in, and on 7.1 or below you can install it from PECL. But we're going to be using Acquia Cloud, so we can't yet run PHP 7.2, and we can't install any PHP extension we want, at least not as easily as we'd like to.

Acquia requires that extensions be compiled together with their dependencies. The php-libsodium extension depends on libsodium itself, so we have to produce one binary for both libraries. We'll compile libsodium (the crypto library) as a static library, and php-libsodium (the PHP extension that provides libsodium bindings for PHP applications) as a dynamically linked library, so it can be loaded by a regular PHP install.

Let's get started!

  1. Download the latest libsodium from https://github.com/jedisct1/libsodium/releases.
  2. Compile libsodium so it's static, not shared. Put it in a directory we'll use later.

    $ ./configure --libdir=/home/me/sodium/library --disable-shared --enable-static

    --enable-static makes it static, not shared. It'll be part of the PHP extension when we build it instead of a separate dependency.

    --disable-shared prevents creating a shared library version of the library.

    --libdir puts it in a directory where we'll use it later.

  3. Compile with PIC (Position Independent Code).

    $ make CFLAGS='-g -O2 -fPIC'
    $ sudo make install
    Here's our sodium library and a pkgconfig directory we'll need to point the php extension at.

    $ ls /home/me/sodium/library
    libsodium.a libsodium.la pkgconfig

  4. Download the latest version 1 release of the libsodium php extension from https://github.com/jedisct1/libsodium-php/releases.

    Use phpize to get the extension ready to compile. Normally a PHP extension is compiled as part of PHP; this script sets things up so it's as if we were doing that. You need the -dev version of PHP to get phpize, so install php7.1-dev or the equivalent for your situation.

    $ phpize7.1
    Configuring for:
    PHP Api Version: 20160303
    Zend Module Api No: 20160303
    Now you'd notice a lot more files in the directory, like the configure script.

  5. Set the package config directory to the one where we installed libsodium.

    $ export PKG_CONFIG_DIR=/home/me/sodium/library/pkgconfig

  6. Configure libsodium-php with the path to libsodium.

    $ ./configure --with-libsodium=/home/me/sodium/library --libdir=/home/me/sodium/library

    --with-libsodium tells it where to find the dependency we just created.

  7. Check that libsodium.so is not looking for a shared libsodium library.

    $ ldd modules/libsodium.so
    linux-vdso.so.1 => (0x00007ffcdd68e000)
    libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f71f26eb000)
    /lib64/ld-linux-x86-64.so.2 (0x00007f71f2d0f000)
    There's no libsodium dependency there, so we're good to use our libsodium.so PHP extension! Deploy the file and configure PHP to load the extension. Since we're on Acquia Cloud, Acquia does that after we provide the file.

Get encrypted!

If you're running Drupal and need encryption setup, or if you're looking to start a new project and exploring options and requirements, get in touch with us! One of our business developers will be happy to help.

Jul 26 2018
Jul 26
I had a hard time getting the #default_value to work for the date form field, because the Examples module and the documentation use this approach:

    $form['send_date'] = [
      '#type' => 'date',
      '#title' => $this->t("Send Date"),
      '#default_value' => [
        'month' => 2,
        'day' => 15,
        'year' => 2020,
      ],
    ];

After playing around with it for a while, I was able to get it to work by passing in a YYYY-MM-DD value, using the date.formatter service (the replacement for format_date()):

    $date_formatter = \Drupal::service('date.formatter');
    $form['send_date'] = [
      '#type' => 'date',
      '#title' => $this->t("Send Date"),
      '#default_value' => $date_formatter->format(REQUEST_TIME, 'html_date'),
    ];

Looks like this is a known issue that has cost others time as well. :( Hopefully a fix will get rolled out soon.

Jul 25 2018
Jay
Jul 25

 image of spotlights focused on drupal logo

It is essential that content be well-categorized, especially on large websites. Drupal includes the Taxonomy module for doing just this, and it is able to account for most content tagging scenarios. However, when you actually go to categorize content, it can be a bit confusing.

Drupal offers a helpful autocomplete, but this doesn't always work as well as you might hope if you have a taxonomy tree which includes nested terms (for instance, "Cats" as a subtopic of "Pets"). In fact, the autocomplete doesn't show the hierarchy of these terms at all, which can get confusing if you have multiple terms with the same (or similar) names which are children of different terms. And although Drupal allows you to create new terms right from the content editing page, doing so always causes it to create the new term at the topmost level of the taxonomy, which often isn't where you actually want it.

What is Straw?

Enter Straw - the Super Term Reference Autocomplete Widget. We built Straw when we were working with a multilevel taxonomy and found that Drupal's default widgets just weren't cutting it. Now, we've made Straw available as a module for anyone else who might find themselves in a similar situation. Straw does two main things which make working with these taxonomies a breeze.

Find Matching Content Tags

First, both when displaying a selected term on the edit page and when searching to find matching tags based on what the editor is typing, the Straw widget shows the entire hierarchy of terms. That means that instead of simply showing "Cats", it will show "Pets >> Cats", and if you start typing "Pets" it will show both "Pets" and "Pets >> Cats", rather than only showing "Pets" like Drupal's default autocomplete would. This makes it much easier to see which exact term is being matched, and also makes it easier to find sub-terms if you don't quite remember their names.

screen shot of Drupal Straw Widget

Create New Terms

Straw makes it a breeze to create new terms on the fly. Say you're writing a new article for your site about a particular breed of dog, but so far your taxonomy only has terms related to cats. Without Straw, you could type in "Beagles" and it would create a new term of that name, but then you'd have to go to your taxonomy tree and create a "Dogs" term beneath the "Pets" term and then move "Beagles" into it. With Straw, this is much simpler… you just type in "Pets >> Dogs >> Beagles", and it creates all the necessary terms and puts them in the right place in the taxonomy relative to each other.

With Straw, tagging your content can be easier than ever before. It's easy to install (with only a little bit of configuration required) and then you'll be able to see right away how much of a difference it makes.

Offer for a free consultation with an Ashday expert

Jul 20 2018

Illustration of workflow concept

Did you know that setting up a content workflow is included in Drupal 8 core? It can be set up simply by turning on the Workflows and Content Moderation modules. The Workflows module gives you the ability to define a workflow, and the Content Moderation module sets up a simple workflow for drafts along with the ability to create more content moderation workflows.

Introduction to Content Moderation

If you aren’t familiar with content moderation in Drupal, let’s fix that with a quick overview of what it means. Without these modules, Drupal content can be in only one of two states, published or unpublished: the content is either available to the public or it isn’t. Users with the appropriate permissions can manage the publishing status of certain types of content. All of this combines to make a pretty useful and familiar experience for anyone who has ever used a content management system.

The Content Moderation module comes in when this workflow needs to be more than just on or off, allowing content to be placed into different statuses or states. With built-in states like Draft, content can be staged by one editor and then approved by a user with different permissions. In the publishing world, this matches workflows that already exist for print content, so it is an attractive feature for those in that vertical.

Set up in 5 minutes

Content Moderation in core provides a very simple but useful workflow. Simply turning on the Workflows and Content Moderation modules adds the Editorial workflow. The Editorial workflow adds the Draft state and sets up the content workflow to go from Draft to Published and provides an admin view for drafts that need to be moderated. Using permissions you can restrict authors to only be able to create and edit drafts. Then grant the "Transition drafts to published" permission to your editors and boom!—you have content moderation set up in a matter of minutes.

View of content moderation in Drupal core

 

If you are running a Drupal 8 site and have multiple content contributors whose work needs review, there is little reason not to use moderation. The out-of-the-box Content Moderation module should be able to handle most situations. If it doesn’t quite fit the needs of your content workflow, there is still good news: you can create a custom workflow.

Custom workflows

If you need a more complex workflow, you probably still don’t need to write any custom code. If you are building a workflow for content types, blocks, or Paragraphs-based entities, you can simply create a new workflow based on the "Content" moderation type. You can define your own states; for example, let's say you have draft, ready for review, ready for second review, reviewed, and published. Next you define the transitions; for the states above, one transition would be “move ready for review to ready for second review”. These transitions are what you grant different user roles permission to perform. After that is set up, you and your team are ready to roll with your new workflow.
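Under the hood, a custom workflow like this is just exported configuration. As a rough sketch (the machine names, weights, and entity types below are hypothetical; a real export from your site is the authority), the config file might look something like this:

```yaml
# Hypothetical sketch of workflows.workflow.editorial_review.yml
# for the example states and transition described above.
id: editorial_review
label: 'Editorial Review'
type: content_moderation
type_settings:
  states:
    draft:
      label: Draft
      published: false
      default_revision: false
      weight: 0
    ready_for_review:
      label: 'Ready for review'
      published: false
      default_revision: false
      weight: 1
    ready_for_second_review:
      label: 'Ready for second review'
      published: false
      default_revision: false
      weight: 2
    reviewed:
      label: Reviewed
      published: false
      default_revision: false
      weight: 3
    published:
      label: Published
      published: true
      default_revision: true
      weight: 4
  transitions:
    ready_for_second_review:
      label: 'Move ready for review to ready for second review'
      from:
        - ready_for_review
      to: ready_for_second_review
      weight: 0
  entity_types:
    node:
      - article
```

Each transition then appears as its own permission on the Permissions page, which is where you decide which roles can perform it.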

Setting up a complex workflow in Drupal 8 Core

 

Another example of a custom moderation workflow could be for a site that publishes to multiple platforms. A workflow could be set up to allow the editors of different platforms to approve content onto the system they manage. Let’s say you have a front-facing site, a native app, and an internal portal. You can create a workflow that goes through moderation for the front-facing site and then adds content to a queue to be published or declined for each of the other outlets. This is just one of the many possible use cases for a custom workflow.

Illustration of workflow in Drupal 8 Core for publishing to multiple platforms

If you need to extend Workflows further, perhaps for a custom entity, you can write a WorkflowType plugin that covers your needs. This can be used for any entity that needs to change states, so think beyond content moderation: it could track steps in a manufacturing process, or steps for ordering in a restaurant app. The possibilities are limitless.
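For illustration, a minimal WorkflowType plugin is little more than an annotated class extending core's base class (the module and plugin names here are hypothetical; this is a sketch of the pattern, not a finished implementation):

```php
<?php

namespace Drupal\mymodule\Plugin\WorkflowType;

use Drupal\workflows\Plugin\WorkflowTypeBase;

/**
 * A hypothetical workflow type for tracking restaurant orders.
 *
 * @WorkflowType(
 *   id = "restaurant_order",
 *   label = @Translation("Restaurant order"),
 * )
 */
class RestaurantOrder extends WorkflowTypeBase {

  // WorkflowTypeBase provides sensible defaults; override its
  // methods here when your entity needs custom state behavior.

}
```

With the plugin in place, site builders can create workflows of this type in the admin UI just like the built-in content moderation workflows.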

Do you need it?

Workflows are super powerful, and moderation comes mostly ready to go with Drupal core, but does that mean you should always use them? On a site with only a handful of admins and not a lot of roles, they may be more cumbersome than useful. Just because workflows are an option doesn't mean they should be implemented unless your users understand the human element of a workflow: moderators should know their role in content moderation. If most of your authors have admin privileges and can push content straight through the flow, then your workflow is mostly standing in the way of efficiency. Every good workflow should start with a meeting of your entire editorial team, with things worked out on a whiteboard or in a Slack channel first. Workflows are amazing as long as everyone understands their role in the publishing chain.


Jul 18 2018

Selecting a CMS for a university can be a challenging decision. There are so many needs and nuances to consider - costs of implementation and maintenance, a wide range of technical ability among site administrators, developers, and content editors, a variety of end users looking for different information, and the list goes on and on. While your answer likely isn’t as easy as “let’s just do what everyone else is doing,” better understanding why other universities made the choice they did can shed light on your decision-making process. 

Drupal is far and away the most used CMS in higher education - 26% of all .edu domain sites run on Drupal, including 71 of the top 100 universities. 

So why are universities like MIT, Georgia Tech, Louisiana State University, Butler, Stanford, Harvard and the rest of the Ivy League universities choosing Drupal? 

Simply put, Drupal makes good business sense, especially with the added benefits of Drupal 8. At Mediacurrent, we believe your website is your greatest digital asset and can be leveraged to accomplish organizational-wide goals. Drupal makes that possible. Here’s how:  

Communicate With All Students - Prospective, Current, and Alumni 

If you want to reach your full recruiting and fundraising potential, you need to communicate with your entire audience. There are a variety of Drupal features that ease the stress of common communication challenges. 

Language:  Not only are there multiple languages spoken within the U.S., but the country also hosts over a million international students. Drupal makes creating a multilingual digital experience simpler. Native language handling is built directly into Drupal 8’s core APIs, giving you over 100 languages to choose from. With that functionality it is easier than ever to engage with prospective students across the globe in a meaningful way.

Accessibility: The CDC estimates that 20% of U.S. adults identify as having a disability, and these disabilities often hinder people’s ability to interact with the average website. Drupal is an inclusive community and has committed to ensuring that all features of Drupal conform with W3C guidelines and WCAG 2.0. Pair that with a strong higher-education-focused accessibility strategy and your potential audience could grow by 20%. 

Technology: According to the 2017 College Explorer Market Research Study, the average college student owns 5.6 devices and spends 137+ hours on them! This may seem like common sense now, but if you want to engage with students, you need to account for a variety of screen sizes. Thankfully, Drupal 8 is designed with a mobile-first mentality and includes out-of-the-box responsive functionality. 

Personalization: Universities face added complexity in their digital strategy because of the broad audiences they appeal to. With so many unique people coming to the same pages, content strategy, conversion path mapping and optimization, and defining strong calls to action can be a struggle. By incorporating personalization into your content strategy, whether based on user authentication or by integrating tools like Acquia Lift or Salesforce Marketing Cloud, you can speak to the masses while making each visitor feel like you’re speaking specifically to them. 

Reduce Overhead Costs + Increase Operational Efficiencies with Drupal

Drupal can have a dramatic impact on reducing overhead costs and increasing operational efficiency. Universities have a big need for multiple websites: departments, colleges, libraries, and student organizations all want their own. The direct cost of supporting that many sites, along with resourcing the training and support, is expensive and encourages unnecessary technology sprawl. Because Drupal is open source (no licensing fees!) and offers a multisite feature, creating sites for these different groups is exponentially easier and more cost effective, and it ensures brand consistency. 

You can also increase efficiency, ensure content consistency and improve the user experience by creating a “source of truth”.

Write content once and publish it anywhere it’s relevant.

Having to update content such as curriculum or an academic calendar on multiple pages is inefficient and unnecessary. Write once, publish everywhere, save time. 

Improve Brand Equity + Amplify Digital Strategy

As a university, your brand is a powerful asset. You spend significant energy and resources building loyalty to bolster several organizational goals, from recruiting efforts to engaging current students on campus to fundraising among alumni.

With your website being the hub of your marketing strategy, it is critical for your CMS of choice to play nice with your marketing efforts.

Drupal happens to be very SEO-friendly out of the box, but advanced configuration options are also available to support a more sophisticated SEO strategy. You can amplify your digital strategy by integrating your marketing tools and communication platforms directly with Drupal. And the 26% of other .edu sites using Drupal make integrating your university-specific tools with your website easier. 

Reduce Risk

I’d be remiss not to mention security and GDPR compliance. As a university, you hold sensitive information about the students who have attended your school, and they are trusting you to keep that secure.

The Drupal community is passionate about security and has an industry leading global security team to ensure your site is protected.

Additionally, as the landscape of privacy rights changes around the world (most recently, GDPR), it’s in your best interest to stay on top of it and reduce the risk of being penalized for data collection practices. 

Have questions about how Drupal can benefit your university? Let us know. We’d be happy to chat. 

Jul 18 2018
Jay

drupal 8 logo featured alongside drupal logos showing the growth of Drupal

Now that Drupal 8 has gained some momentum, it is time to start planning your upgrade strategy. You want to upgrade to get the latest benefits and take advantage of the future stability that comes with the direction Drupal will be taking from here on out. Before upgrading, you will want to consider a few things about your current site. In this article we cover some of those questions, with context to assist the decision-making process. Let’s determine whether your website is adequately serving the current needs of your business and which content will need to be brought over to the new Drupal 8 site. The switch may be difficult in places, but being prepared will put you in a position to handle whatever comes up.

So, you have a nice Drupal 7 site that's been running happily for years, and maybe you're thinking it's time to upgrade – or, if you haven't given it much thought yet, there are certainly plenty of reasons to do so now. But Drupal 8 is a pretty big step forward from Drupal 7. What exactly is an upgrade going to entail? What do you need to consider before you start? We've upgraded numerous sites now, and today I'd like to go over a few things that I think are important to know before starting such a project.

Before You Start

Just because you've decided to upgrade doesn't mean that you can do it right away and be done with it by dinner. Unfortunately, there isn't any one-click solution to take your site from Drupal 7 to Drupal 8 (although future upgrades should come closer to that level of simplicity). Here are a few things to consider before you touch the first line of code or content:

Do you still want the same site? Doing a large upgrade such as this is an excellent time to re-evaluate your site's functionality and appearance. Now that you've had it for some time, is there anything you want to change? Any new features you want to add that maybe weren't feasible before because of how the old site was built? Although upgrading can be done in such a way that everything stays almost exactly the same for your site's visitors, it's good to at least consider what potential improvements you could make, perhaps by leveraging some of Drupal 8's great new features. Maybe your site just needs a facelift, or has some long-standing bugs you want to fix, or you want to take advantage of new CSS or JavaScript technologies which weren't available when you first built the site. Upgrading is the perfect time to take a fresh look at things.

How are you migrating? One critical part of most upgrades is to migrate content and user information from the old site to the new one. Databases for Drupal 8 are structured differently than they were in Drupal 7, which means that to upgrade the site you can't just swap out the code while keeping the same database… you have to actually move the content into a new database as well and adjust it to fit Drupal 8's slightly different structure. This is a pretty big departure from how Drupal upgrades worked in the past.

There are two main ways to do this migration. Drupal 8 includes a Migrate API, which is perfect for some simple sites. However, not all contributed modules have the plugins necessary to migrate their data in this way, and any data you might be storing in some customized way certainly doesn't. Although it is possible to write your own plugins, in our experience it's easier to take a different approach by writing an entirely custom migration. This does however require having some more familiarity with Drupal coding than the Migrate API route does. Generally, what a custom migration involves is writing a script and appropriate queries to first retrieve your information from the old database, and then to write it to the new Drupal 8 database in the appropriate format.
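As a rough illustration of the custom approach, the script below reads legacy article nodes straight from the Drupal 7 database and recreates them in Drupal 8. This is only a sketch: it assumes a stock Drupal 7 article content type, and that the old database has been defined as a 'migrate' connection in settings.php. A real migration would also handle revisions, authors, dates, and field types specific to your site.

```php
<?php

use Drupal\Core\Database\Database;
use Drupal\node\Entity\Node;

// Connect to the legacy D7 database (the 'migrate' connection key
// is assumed to be configured in settings.php).
$d7 = Database::getConnection('default', 'migrate');

// Pull article titles and body text from the D7 schema.
$rows = $d7->query("SELECT n.nid, n.title, b.body_value
  FROM {node} n
  LEFT JOIN {field_data_body} b ON b.entity_id = n.nid
  WHERE n.type = :type", [':type' => 'article']);

// Write each row into the new Drupal 8 site as a fresh node.
foreach ($rows as $row) {
  Node::create([
    'type' => 'article',
    'title' => $row->title,
    'body' => ['value' => $row->body_value, 'format' => 'basic_html'],
  ])->save();
}
```

Keeping the old node IDs in a mapping table is also worth considering, so a later "final migration" pass can tell which content has already been brought over.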

Do you know enough Drupal 8? If your site is mostly built using core and contributed modules and themes, you'll probably be good to go. Drupal 8's admin UI does have some changes from Drupal 7's, but none of them are very difficult to get used to and most introduce new ways to navigate and improve your site. However, if your site uses much in the way of custom modules or if it has a custom theme, then before upgrading you should make sure your developers and themers are familiar with Drupal 8, perhaps by working on some other simple Drupal 8 project first or by getting some training on the changes, either in-person or through an online course. The changes in how you code custom modules and themes in Drupal 8 are significant, so they may take some getting used to, but ultimately they are the foundation of what makes Drupal 8 leaps and bounds better than its predecessor. Knowing how to use them effectively is key to having a Drupal 8 site that continues to be maintainable for years to come.

Still Not Sure You're Ready to Upgrade?  Learn more about the benefits and our upgrade process. 

What to Expect

Alright, so you've gotten everything figured out. All that's left to do now is to, well, actually upgrade the site. Here are a couple of things you should plan for when starting most any upgrade.

Expect to have a few difficulties: Even if the upgrade seems like it will be a simple one, you should go into it expecting to have a few unexpected challenges come up, because they almost certainly will. Maybe it's one particular type of field that doesn't have the same settings in 8 as it does in 7. Maybe it's a combination of contrib modules which used to work together one way and don't anymore (one example: The Context module can no longer be used in conjunction with Metatags to set metadata in contextual circumstances). Maybe it's that your site uses a lot of different webforms, and although the submission data can get migrated without too much trouble, recreating the forms themselves can't be done automatically. 

Regardless of what it ends up being, plan to have some unexpected difficulties during the upgrade. In my experience, these challenges tend to be fairly easy to fix or work around through some combination of contributed modules and custom code, so although they can add some complexity to the upgrade it's rare to find a dealbreaker so late in the process.

Expect to have two sites for a little while: Your new Drupal 8 site is completely separate from your old Drupal 7 site, and that means that it needs to be installed somewhere other than where your old site is, and it probably has to have its own domain so you can access it during the upgrade process. You'll be able to keep your old domain though - once the upgrade is done and your Drupal 8 site is ready to go, all you should have to do is change your main domain's settings to point to the new website rather than the old one.

When you first start the upgrade, your Drupal 8 site is likely to be almost empty; if you have lots of custom modules to recreate and the migration itself isn't done yet, it may only have some placeholder content to start with. Then, once the migration is done, all the relevant content from the old site gets migrated into the new one. At that point, your content and user information are stored in both sites. It also means that, from now until the new version of the site launches, any changes you make to the content of the old site won't be automatically reflected in the new site and will be lost when you deploy the upgrade. For many sites, this problem can be mitigated by having a period of time in which changes get made to both sites instead of just one of them.

Still, it isn't ideal to have to enter new content twice for an extended period of time, so to solve this, one thing we like to do is to actually have two migrations. The first migration happens early on in the development of the Drupal 8 site, and we use the content from that migration while finishing most of the upgrade. Then, shortly before the upgrade goes live, we do a "final migration" to get any changes which have been made on the Drupal 7 site since the original migration. If all goes well, this results in only a day or two (or zero!) during which changes have to be made in both sites.

Once the upgrade is finished and your domain name settings adjusted so that the new site is live, it's worthwhile to do one last check of the old site to see if there are any recent changes or user registrations that may have been missed. Depending on how your migration script works, these could be migrated again with that, or they could be replicated manually. Once that's done though, huzzah! Your newly upgraded Drupal 8 site is finally ready for the world to see.


Jul 17 2018

[embedded content]

In the video above, you’ll learn how to build powerful media management functionality using Drupal 8.5. I start the webinar with a review of what’s new in Drupal 8 and then jump right into a live demo.

If you prefer text, then read our tutorial on “Managing Media Assets using Core Media in Drupal 8”.

Modules

In the video we use the following modules:

  • Media (core)
  • Inline Entity Form
  • Entity Embed
  • Entity Browser
  • DropzoneJS

Below is a time-coded list of all the sections in the video. Just click on a time code and it’ll jump to that part of the video.

Video Sections

  • Agenda: (00:41)
  • What’s new in Drupal 8: (01:35)
  • Required Modules: (09:53)
  • Demo time: (11:02)
  • Install Media module: (11:49)
  • How to Upload and view assets: (12:34)
  • Overview of Media types: (14:42)
  • Create remote video media type: (15:34)
  • Field mapping in media types: (17:45)
  • Create remote video asset: (18:52)
  • Customize remote video formatters: (19:48)

Media Field

  • Attach media field to Article content type: (20:24)
  • Using Inline Entity Form on media field: (22:53)
  • Install Inline Entity Form: (23:27)
  • Configure IEF on media field widget: (24:46)

Entity Embed

  • Install Entity Embed: (27:44)
  • Create embed button: (28:37)
  • Configure text format for embed button: (29:40)
  • Testing embed button: (32:27)

Entity Browser

  • Install Entity Browser: (35:50)
  • Create entity browser: (38:21)
  • Integrating entity browser and entity embed: (39:45)
  • Create view for entity browser: (41:02)
  • Add IEF to entity browser: (47:35)

Entity Browser and DropzoneJS

  • Add dropzonejs to entity browser: (51:59)
  • Customize fields on media types using form modes: (54:26)

Drupal 8.6


About Ivan Zugec

Ivan is the founder of Web Wash and spends most of his time consulting and writing about Drupal. He's been working with Drupal for 10 years and has successfully completed several large Drupal projects in Australia.

Jul 13 2018

Omeda and Drupal are a perfect match for managing customer relationships

As you may have figured out by now, Drupal is a great platform for 3rd party integrations. Whether it’s eSignatures with Hellosign, more sophisticated search with Solr, or a host of other options, Drupal works best when it’s not trying to reinvent every wheel and is instead used to leverage existing business tools by tying them all together into a robust and useful package. Today, we’re going to take a look at a new set of integration modules that Ashday has just contributed back to the Drupal community: Omeda, Omeda Subscriptions and Omeda Customers.

The Omeda Solution

If you haven’t heard of Omeda by now, let’s take care of that. They are a family-founded company that has been around for over 30 years and offers a host of Audience Relationship Management tools meant to leverage modern technology to properly segment and target your customer base. They are on top of their game and really know their stuff (they even collaborated on the brainstorming of these new modules). And best of all, they offer a very complete and well-documented API, which is key to any good integration.

While their API is extensive, much of it can be tailored to a particular customer’s needs, so we set out to build a contributed integration that covers some typical use cases while still allowing for custom development where necessary. So far, this has taken the shape of three modules that can potentially be extended later on.

Omeda Base Module

The Omeda base module is meant to be a simple core module to get you wired up to Omeda so you can start doing things. It has settings to configure your connection to the API as well as a setting to work in testing mode. At its heart is a Drupal service that is meant to be injected into other contrib modules that can tackle specific Drupal solutions. This service provides the Comprehensive Brand Lookup, which the module uses to cache your Omeda brand config daily, as well as a generic utility for making API calls, which has the benefit of consolidating some basic error handling and formulation of HTTP requests. Here’s a little example of how simple it is to call the API using this service:

 

$brand_lookup = \Drupal::service('omeda')->brandComprehensiveLookup();

 

So that’s pretty easy, right? If anything is wrong with the API config, or the service is down, or something along those lines, the service throws an exception with a helpful message. Otherwise, it returns the Omeda API response to the caller. Again, it’s simple, but it provides the base functionality you need every time you wish to connect with Omeda.
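Because the service signals failure by throwing an exception, calling code will typically wrap the lookup in a try/catch. A minimal sketch (logging the error is just one reasonable way to handle it):

```php
<?php

// Minimal sketch: the Omeda service throws on API or config
// problems, so catch and log rather than letting the page crash.
try {
  $brand_lookup = \Drupal::service('omeda')->brandComprehensiveLookup();
}
catch (\Exception $e) {
  \Drupal::logger('omeda')->error('Omeda brand lookup failed: @message', [
    '@message' => $e->getMessage(),
  ]);
}
```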

Omeda Subscriptions Module

The first module to leverage the Omeda base module is the Omeda Subscriptions module. This module adds a My Subscriptions tab to the user profile, for the roles you select, where users can manage their deployment subscriptions. You can also configure which Omeda deployments you wish to allow your users to manage. The available deployments come from the stored Comprehensive Brand Lookup data cached by the base Omeda module. This module includes a new Omeda Subscriptions service that adds functions to fetch a logged-in user’s opt-ins and opt-outs, as well as to opt the user in to or out of a deployment. It’s a simple, specialized module (which is the best kind) that provides an out-of-the-box solution for the most common Omeda user subscription management needs.

Omeda Customers Module

The Omeda Customers module is another extension to the base Omeda module that allows you to map user fields to your Omeda customer entities and sync them on user updates. You can choose which roles will sync and which user fields will sync. Since field mapping can get quite hairy, we provide a solution for simple use cases where you need to simply tell Omeda that “this field = that field”. If it’s a standard base Omeda field like “First Name”, those are called base fields and are meant to be a simple hand off of the Drupal field value to Omeda. If it’s an email, phone number, or address field, you can choose that type and we will then ask you to determine which contact type it represents so that it gets into Omeda properly. It should be noted that for addresses, we only support Address fields from the Address module since mapping a bunch of individual Drupal fields to a single Address entity in Omeda is more complicated and likely needs a custom solution.

This mapping config also provides a simple solution for Omeda Demographic fields, which are more complex, dynamic fields that store IDs instead of literal values. It allows you to choose which demographic field a Drupal user field maps to, and then create a mapping of possible Drupal field values to available Omeda field values. So if you have a field on the Drupal user called “Primary business role”, but you want to map it to the Omeda “Job Title” demographic field, you can do that. You would then hand-enter a mapping indicating that a Drupal field value of “President” maps to the Omeda value of “President / CEO”, so that we can send Omeda its desired ID value instead of the literal text of “President / CEO”, which would be invalid. Again, this is for simpler use cases; if your Omeda fields don’t map 1-to-1 to Drupal fields, the necessary business logic is wide-ranging and you will likely need custom programming. The great thing, though, is that we’ve included support for a custom hook (hook_omeda_customer_data_alter) to inject your own adjustments into the data mapping process.
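The module's own documentation is the authority on the hook's exact signature; as a hedged sketch (the parameters, array keys, and role name below are illustrative assumptions, not the real API), an implementation might look like:

```php
<?php

/**
 * Implements hook_omeda_customer_data_alter().
 *
 * NOTE: The parameters and array keys shown here are assumptions
 * for illustration; consult the Omeda Customers module's API docs
 * for the actual hook signature.
 */
function mymodule_omeda_customer_data_alter(array &$data, $user) {
  // Example: supply a value that has no 1-to-1 Drupal field mapping.
  if (empty($data['JobTitle']) && $user->hasRole('faculty')) {
    $data['JobTitle'] = 'Faculty';
  }
}
```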

In Conclusion

Hopefully these modules prove useful to you - especially if you’re already an Omeda client with a Drupal site. The goal, again, is to provide some basic Omeda functionality that doesn’t require developer resources to integrate. We ourselves have custom needs beyond what the modules offer and find it quite easy to extend them for particular customer solutions, largely thanks to Drupal 8’s architecture. As always, if you need assistance please feel free to contact us, and if you’d like to offer any module patches or report a bug, feel free to use the issue queues for each project and we’ll check it out!

 If you need to integrate Drupal with anything, talk to us! We can integrate just about anything with Drupal

Jul 12 2018

This tutorial will walk you through setting up an awesome Drupal Commerce product catalog using Search API and Solr, and then adding various ways of filtering the results (product search, sorting options and Facet categories). The end result of this guide will be a catalog that functions in the same way as our Urban Hipster Drupal Commerce demo site’s catalog. You can try it here. If you don’t already know why we use Search API, Solr and Facets for catalogs, check out this article to get up to speed.

Even though we’re going to be using products, once you understand how it works you’ll be able to apply the same method to other types of content, such as a blog, videos, resources, and more. The datasource can change, but the process is the same.

Let's get started! Follow along with this video or skip below for a written guide. 

[embedded content]

What you need before starting

  1. A running Solr server (Solr 6.4+)
    This tutorial assumes you have Solr running and can make a new core.
  2. Drupal 8 installed with the following modules:

    TIP: Get most of what you need quickly with Commerce Kickstart. Note that you will still need to install the Facets module after getting your Kickstart package.
    • Commerce 
      composer require drupal/commerce
    • Search API 
      composer require drupal/search_api
    • Solr
      NOTE: This module requires that you're running Solr 6.4+ and PHP 7
      composer require drupal/search_api_solr
    • Facets 
      composer require drupal/facets

Getting started

  1. Install/start Solr and add a new core that we’ll use later.
  2. Enable the Commerce, Search API, Solr and Facets modules.

Setup a basic Commerce store

For this tutorial, get your site and basic store set up by doing the following:
  1. Add a product category taxonomy vocabulary that is at least 2 levels deep.
    If we use clothing as an example, we might have Men and Women as the first level, then t-shirts, shorts and shoes as the second level for each.
  2. Set up your basic Commerce store, product types and product variation types.
    If you’re new to Commerce, take a look at their documentation to get up and running quickly.

    NOTE: Make sure to add the taxonomy vocabulary you created as a ‘taxonomy reference’ field to your Product Type.

  3. Add 10 or more simple products for testing the catalog as we go.


Add a Search API server and index

Search API server

Admin: Configuration > Search and metadata > Search API
Admin menu path: /admin/config/search/search-api

  1. Click ‘Add server’.
  2. Configure the server.
    1. Name your server and enable it.
    2. Set ‘Solr’ as the server ‘Backend’.
    3. Configure the Solr connector.
      The defaults are usually fine. The main things to add are:
      • Solr connector = ‘Standard’.
      • Solr core = Whatever you named your core.
    4. Under ‘Advanced’, check ‘Retrieve result data from Solr’.
    5. Look over the remaining settings and update if you need to.
Search API index

Admin: Configuration > Search and metadata > Search API
Admin menu path: /admin/config/search/search-api

The index is where you set what data source is used by Search API. Eventually, you’ll also specify the fields to be used for filtering the displayed results.

  1. Click ‘Add index’.
  2. Configure the index.
    1. Name your index.
    2. Data source should be ‘Product’.
      This can be anything, but we’re creating a Commerce catalog and so we want to use the store products.
    3. Select the server you just created.
    4. Save. Don’t add any fields for now; we’ll do that later.
    5. Go to the ‘View’ tab and index your results. This will index all of the products you have added so far.

Create a View for the catalog

Admin: Structure > Views
Admin menu path: /admin/structure/views

The View will use the data source we’ve identified in our index and allow us to create a catalog with it, and then assign ways of filtering the catalog results (i.e. a search field and/or facets).

  1. Create a new View.
    1. View Settings, select your index.
    2. Create a page (this will become our catalog).
  2. View Display settings.
    1. Format > Show
      Set as ‘Rendered entity’, then in the settings, set your product types to use a ‘Teaser’ view mode instead of the default.

      NOTE: You may need to create this view mode if it doesn’t already exist.

      NOTE: You could alternatively use Fields instead of view modes, but I like to keep my product display settings within the product type’s display settings. Then you can customize the display per product type if you have more than one.

  3. Save the view.
    These basic settings should give us our overall catalog. You can confirm by previewing the view or visiting the page you just created.

Add Fulltext datasource fields for a catalog search field

Now we’ll start setting up a Fulltext search field to let our users filter results using a product search field. The first thing we need to do is add some datasource fields to our index that the search will use.

  1. Go to your Search API Index and go to the Fields tab.
  2. Add Fulltext fields that you would like to be searchable (such as Title, SKU, Category taxonomy name, etc.).
    Here’s an example for adding the title:
    1. Click ‘Add fields’.
    2. Under the ‘Product’ heading, click ‘Add’ beside the ‘Title’ field.

      NOTE: If you’re adding a different field instead, you may need to drill down further into the field by clicking ( + ) next to the field. For example, to make a taxonomy term a searchable field, you would go to Your Vocabulary > Taxonomy Term > Name .

    3. Click ‘Done’.
    4. Set the field ‘Type’ to ‘Fulltext’.
      This is an important step as only Fulltext fields are searchable with the user-entered text search we are currently setting up.

      NOTE: Under the fields section is a ‘Data Types’ section. You can open that to read information about each available type.

    5. Optionally change the ‘Boost’ setting.
      If you have more than one Fulltext field, the boost setting allows you to give a higher priority to specific fields for when the terms are being searched.

      For example, multiple products could have a similar title, but each product would have an individual SKU. In this case, SKU could be given a higher boost than title to make sure that search results based on the SKU come back first.

  3. Next, add another field for the ‘Published’ status.
  4. Once you’ve added this field, set its type to ‘Boolean’.
  5. Reindex your data (from within the index view tab).

Set up the catalog search field within the catalog View

We can now set up the actual search field that our customers will use to find products, and use the datasource fields we added to our index to do this.

  1. Go to your catalog View.
  2. Under ‘Filter criteria’.
    1. Add ‘Fulltext search’ and configure its settings.
      • Check ‘Expose this filter to visitors, to allow them to change it’.
        IMPORTANT: This is what gives the user the ability to use this search field.
      • ‘Filter type to expose’, set as ‘Single filter’.
      • ‘Operator’, set as ‘Contains any of these words’.
      • ‘Filter identifier’: optionally add an identifier into the URL to identify a search-term filter.
        (i.e. yoursite.com/products?your-filter-identifier=search-term)
      • Apply/save your settings.
    2. Add ‘Published’ and configure it so that it is equal to true.
      This uses the field we added to the index earlier to make sure the product is actually published. Chances are you don’t want unpublished results shown to your customers.
  3. Under ‘Sort criteria’.
    1. Add ‘Relevance’.
    2. Configure so that the order is sorted descending.
      This will show the most relevant results first (factoring in the boost you may have applied to your index fields).
  4. Now we need to expose the search field to our customers. To do this:
    1. Open the ‘Advanced’ section of your catalog view.
    2. In the ‘Exposed Form’ area.
      • Set ‘Exposed form in block’ to ‘Yes’.
        This creates a block containing a search field that we can place on the site somewhere.
      • Set ‘Exposed form style’ to ‘Basic’ and update the settings. For now, the settings you might change are customizing the submit button text and maybe including a reset button.
  5. Add the search block to your site.
    Admin menu path: /admin/structure/block
    1. In your preferred region, click the ‘Place block’ button.
    2. Find the Views block that starts with ‘Exposed form’ and click ‘Place block’.
      Its full name will be determined by your view’s machine name and page display name (i.e. Exposed form: products-page_1).
    3. Configure the block as you see fit, and save.
  6. Test your search!
    You should now be able to see the search field on your site frontend and try it out.

Add more datasource fields for sorting options

We can optionally sort the catalog and search results with some additional sorting filters, such as sorting by Title, Price, Date added, etc. Let’s add the ability to sort our products by title with the option to choose ascending or descending order.

  1. Go to your Search API Index fields and add another 'Title' field the same as you did earlier. However, this time change the field ‘Type’ to ‘String’. You should now have two Title fields added, one as ‘Fulltext’ and one as ‘String’.

    NOTE: The field type can be different depending on what field you’re adding. If you’re adding a sorting field such as Price > Number, you might use the ‘Decimal’ field type.

    TIP: I would recommend changing the Label for the new Title field to something like ‘Title for sorting’ so that it’s easier to identify later. You could even change the fulltext Title label to ‘Title for search’, just to keep them organized and easy to understand.

  2. Reindex your data (from within the index view tab).
  3. Go to your catalog View.
    1. Under ‘Sort criteria’.
      • Add the new title datasource and configure it.
        • Check ‘Expose this sort to visitors, to allow them to change it’.
          IMPORTANT: This is what gives the user the ability to use this sorting option.
        • Add a label for the filter.
        • Set the initial sorting method.
      • Add any additional sorting fields if you added more.
    2. Open the settings for the ‘Exposed form style’ (within the view’s ‘Advanced’ section).
      • Check ‘Allow people to choose the sort order’.
      • Update the labels as you see fit.
    3. Save your view!
      Refresh your catalog page and you should now see sorting options available in the search block that you added earlier.

      TIP: If you DO NOT see the sorting options, this is a bug and is easily fixed. All you need to do is remove the search block and then re-add it.

      TIP: You can place this block in multiple regions of your site and hide the elements you don’t want to see with CSS. This way you can have a block with the site search and no filters in your header, and then also have another block on your catalog pages that shows the sorting filters but no search field.

Add Facets to the catalog

The filters we added earlier can only be used one at a time; however, we often want to filter the results based on a number of different options. For example, if I’m browsing an online store looking for shoes of a certain style and size, I don’t want to see everything else the store has to offer. I want to be able to go to a ‘shoe’ category, then pick the ‘type’ of shoe that I’m after, and finally pick the ‘size’ of shoe that’s relevant to me. I want to see all of the results that fit that criteria. Facets let us use taxonomy (and other data sources) to achieve this.

Let’s add a Facet that uses the taxonomy vocabulary we created in the initial store setup. This will be our main catalog menu for narrowing down the product results. Each facet you create provides a block that we can add to any region of our template.

  1. Add a Search API index field for your taxonomy vocabulary. Set the field ‘Type’ as ‘String’.

    TIP: Like we did earlier, I would recommend renaming the label for this field to something like ‘Categories for Facet’.
  2. Reindex your data (from within the index view tab).
  3. Go to the Facets page.
    Admin: Configuration > Search and metadata > Facets
    Admin menu path: /admin/config/search/facets

    You should see a ‘Facet source’ available to use. When we created a View using our index, this is what added the Facet source here. Now that we have a source, we can create Facets to filter it.

  4. Click ‘Add facet’.
    1. Choose the ‘Facet source’ to use.
    2. Select the index ‘Field’ that this Facet will use (i.e. Categories for Facet, or whatever you labelled your field).
    3. Name your Facet (i.e. Categories).
  5. Configure the Facet.
    This will cover the basic settings that you will need. Start with this and then you can always play around with other settings later. Each setting has a pretty good description to help you understand what it does.
    1. Widget.
      Choose a ‘Widget’ for displaying the categories. For categories, I like to use ‘List of checkboxes’.
    2. Facet Settings.
      Check the following:
      • Transform entity ID to label.
      • Hide facet when facet source is not rendered.
      • URL alias as ‘cat’ (or whatever you like).
      • Empty facet behavior as ‘Do not display facet’.
      • Operator as ‘OR’.
      • Use hierarchy.
      • Enable parent when child gets disabled.
      • NOTE: the Facets Pretty Paths module can be used to give you nicer looking URL paths.
    3. Facet Sorting.
      Configure as you see fit. In this example, I would only check the following. These settings make sure that the taxonomy follows the same order that you have set within the vocabulary itself.
      • Sort by taxonomy term weight.
      • Sort order as ‘Ascending’.
    4. Save.
  6. Add the Facet block to your site.
    Admin: Structure > Block layout
    Admin menu path: /admin/structure/block
    1. In your preferred region, click the ‘Place block’ button.
    2. Find the ‘Categories’ facet block (or whatever you named it) and click ‘Place block’.
    3. Configure the block as you see fit.
    4. Save.
  7. Test your Facet!
    You should now see your Facet on the catalog page. Click the checkboxes and test out how it works!

One last thing...

The sites we've been building use the Facets Pretty Paths module for making nicer looking URLs with our catalog and filters. For a while we were plagued with a problem where, when the user selects a Facet category and then uses the sorting options, the Facets would uncheck and reset. This is obviously not good because the user is trying to sort the filtered down items, not the overall catalog. We need to be able to maintain the active facets when using the filters.

Luckily, a coworker came up with this nice little solution that you can apply to your theme's .theme file. You just need to replace YOUR_THEME, YOUR-VIEW (i.e. products-page-1), and YOUR-PATH (i.e. products) in the code below. Ideally, this will be fixed within the module itself soon, but this will work while we wait.

use Drupal\Core\Form\FormStateInterface;

/**
 * Implements hook_form_alter().
 */
function YOUR_THEME_form_alter(&$form, FormStateInterface $form_state, $form_id) {
  // Store - Product Listing view exposed form.
  if ($form['#id'] == 'views-exposed-form-YOUR-VIEW') {
    $current_path = \Drupal::request()->getRequestUri();

    // If the current path is within your catalog, correct the form action path.
    if (strpos($current_path, '/YOUR-PATH') === 0) {
      // Fix for views using facets with pretty paths enabled.
      // Replace form action with current path to maintain active facets.
      $form['#action'] = $current_path;
    }
  }
}

Done!

There you have it! You have now created a Search API index using Solr, set up a View to display the results of the index, and then implemented three different ways to filter the results (search, sorting and Facets). This is the start of an awesome product catalog and you can expand on it with more datasource fields however you want. Cool!

Contact us and learn more about our custom ecommerce solutions

Jul 11 2018

Coffee is a magical thing. It gets you going, clarifies your thoughts, makes you feel all warm inside. I don’t know what I’d do without it. So when we consider installing the Drupal module named after this irreplaceable daily beverage, we see that it has a similar effect. It just makes things better. Am I overstating things? Probably. But I haven’t had enough coffee yet today and I need to get this blog going with some pizzazz.

So simply put, the Coffee module gives you a Mac-like live search experience of the Drupal admin menu. It is triggered by the shortcut Alt+D on the Mac, or Alt+Ctrl+D in Windows IE. When the search bar pops up, just start typing and you can do an Ajax live search of that deeply nested admin menu and get to things in a hurry!

Screenshot of Coffee Drupal Module

Now, it’s not perfect. It can’t dig beyond certain levels, such as actually finding the settings of individual contact forms, but it gets nearly everywhere in the first couple of layers and can save you a lot of clicks when in development and site building mode. Even cooler: you can add your own custom Coffee commands in your custom module to make your actions discoverable. That’s great flexibility.
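For illustration, here’s a rough sketch of what such a hook implementation might look like in a custom module. The `mymodule` name and the paths are hypothetical, and the exact array keys should be verified against the Coffee module’s own coffee.api.php:

```php
<?php

/**
 * Implements hook_coffee_commands() (sketch; verify keys in coffee.api.php).
 *
 * Each entry exposes an extra item to Coffee's live search: a 'value'
 * (the path to visit), a 'label' (the text shown to the user), and a
 * 'command' (a shortcut keyword, conventionally prefixed with ':').
 */
function mymodule_coffee_commands() {
  return [
    [
      'value' => '/admin/reports/status',
      'label' => 'Site status report',
      'command' => ':status',
    ],
    [
      'value' => '/node/add/article',
      'label' => 'New article',
      'command' => ':article',
    ],
  ];
}
```

With something like this in place, typing `:status` into the Coffee search bar would jump straight to the status report.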

So there you have it! A simple module that shaves off a few seconds of time all day long. Pretty awesome!

Time to refill my mug.

Jul 11 2018

Someone recently asked the following question in Slack. I didn’t want it to get lost in Slack’s history, so I thought I’d post it here:

Question: I’m setting a CSS background image inside my Pattern Lab footer template which displays correctly in Pattern Lab; however, Drupal isn’t locating the image. How is sharing images between PL and Drupal supposed to work?

My Answer: I’ve been using Pattern Lab’s built-in data.json files to handle this lately. e.g. you could do something like:

footer-component.twig:

... {% set footer_background_image = footer_background_image|default('/path/relative/to/drupal/root/footer-image.png') %} ...

This makes the image load for Drupal, but fails for Pattern Lab.

At first, to fix that, we used the footer-component.yml file to set the path relative to PL. e.g.:

footer-component.yml:

footer_background_image: /path/relative/to/pattern-lab/footer-image.png

The problem with this is that on every Pattern Lab page where we included the footer component, we had to add that line to the page’s yml file. e.g.:

basic-page.twig:

... {% include '/whatever/footer-component.twig' %} ...

basic-page.yml:

... footer_background_image: /path/relative/to/pattern-lab/footer-image.png

Rinse and repeat for each example page… That’s annoying.

Then we realized we could take advantage of Pattern Lab’s global data files.

So with the same footer-component.twig file as above, we can skip the yml files, and just add the following to a data file.

theme/components/_data/paths.json: (* see P.S. below)

{ "footer_background_image": "/path/relative/to/pattern-lab/footer-image.png" }

Now, we can include the footer component in any example Pattern Lab pages we want, and the image is globally replaced in all of them. Also, Drupal doesn’t know about the json files, so it pulls the default value, which of course is relative to the Drupal root. So it works in both places.

We did this with our icons in Emulsify:

_icon.twig

paths.json

End of the answer to your original question… Now for a little more info that might help:

P.S. You can create as many json files as you want here. Just be careful you don’t run into name-spacing issues. We accounted for this in the header.json file by namespacing everything under the “header” array. That way the footer nav doesn’t pull our header menu items, or vice versa.

An example homepage, home.twig, that pulls menu items for the header and the footer from data.json files:

header.json

footer.json

Brian Lewis

Brian Lewis is a frontend engineer at Four Kitchens, and is passionate about sharing knowledge and learning new tools and techniques.

Jul 09 2018

Stata Center at MIT

Nestled right off Main Street in Cambridge, Massachusetts lies the Ray and Maria Stata Center on the campus of the Massachusetts Institute of Technology (MIT). This abstract building, designed by Pritzker Prize-winning architect Frank Gehry, was the perfect venue to house the longest-running front-end-focused Drupal conference in the US, Design 4 Drupal. It demonstrates that the modern, abstract design Cambridge and MIT are known for can work perfectly with the structure needed within.

The Design 4 Drupal conference highlights training sessions and seminars focusing on designing for, and building the “front-end” of websites, or what gets seen and used by end users.  This area of focus encompasses graphic design, user experience, accessibility, performance, tooling, and much more.

Like a lot of our Higher-Ed clients, MIT is a user of Drupal, and is proud to offer this space to the Drupal community to learn and share knowledge.  I was pleased to be asked to present two sessions at the conference, and even more pleased with the knowledge I was able to take away from attending the event.

Meta and Schema: Defining the Content about your Content

The first session I presented focuses on designing and implementing a metadata strategy for your website.  Metadata is the content that describes your content. It is very important in how web pages are found in search engines, and how they are displayed on social network sites.

The presentation is a deep dive into the different specifications for meta tags and Schema.org schemas, how to decide what to mark up, and then how to test and validate that you’ve done it correctly.

This session was not recorded due to technical difficulties, but the slides are available at jimbir.ch/meta-schema-drupal

Building a Better Page Builder with Bootstrap Paragraphs

The second session I presented reviews the Bootstrap Paragraphs module for Drupal 8 that I developed, and how it combines the power of the world’s most popular front-end framework, Bootstrap, with Drupal Paragraphs, the powerful module that allows content creators to build layouts and structured pages using content components.

Last session begins for #D4DBoston #Design4Drupal 2018 with @thejimbirch and Lego Uncle Jim explaining the Bootstrap Paragraphs pic.twitter.com/p1zBJv2bMZ

— Dwayne McDaniel (@McDwayne) June 28, 2018

I have been working on this module since I first presented it at the BADCamp 2016 Front-end Summit.  The module installs a suite of components that allow content creators to quickly build out pages.

I love giving this presentation because I always get great feedback from people who use the module; who are going to use the module; or who are going to use my methodology to create their own version that fits their specific needs.  The module currently has over 25,000 downloads, and has had users from all around the world.

You can watch a recording of the session here.


The Keynotes

The Building Blocks Of The Indie Web – Jeremy Keith

The conference featured not one, but two great keynotes.  On the first day author and developer Jeremy Keith, who was also in town for An Event Apart Boston, presented a session in which he encouraged a return from social media publishing to independent publishing.  It was a great reminder that the web was ham radio before it was cable, and can still be so.

Learning about the IndieWeb principals from @adactio at @D4DBoston #D4DBoston #Drupal pic.twitter.com/nRDdbCKsld

— Jim Birch (@thejimbirch) June 27, 2018

Exploring the New Drupal Front-end with JavaScript – Dries Buytaert

The second keynote was given by founder and project lead of Drupal, Dries Buytaert. Dries was the keynote at the very first Design 4 Drupal, so it was a special treat to have him back for the tenth anniversary. His presentation reviewed the history of JavaScript on the web and API-first vs. API-only approaches, and gave behind-the-scenes insights into Drupal’s JavaScript modernization initiative.

Exploring the New Drupal Front-end with JavaScript at @D4DBoston with Drupal founder @Dries #Drupal #Design4Drupal #D4DBoston pic.twitter.com/Eonj7XdAQz

— Jim Birch (@thejimbirch) June 28, 2018

The Sessions

Thanks to Kevin Thull, and the organizers of Design 4 Drupal,  most of the presentations were recorded and are available to anyone at Design 4 Drupal’s YouTube channel.  There was a broad mix of different types of sessions that covered developers, designers, User Experience (UX), Accessibility (A11Y), and Tools.  Below are some highlights of the sessions I went to.

Web Accessibility Tips and Tools – Abby Kingman

Abby gave a session that was near and dear to my heart. We are always learning about how to make our websites more accessible, and Abby’s presentation covered where to find current guidelines and specifications, then went on to review tools for testing. There are lots of great links to follow from this one.

This session validated the approach we take at Kanopi to accessibility in design and development.  A lot of the tools and testing techniques were all part of our processes, and I look forward to exploring the ones I didn’t know about!

Great detailed presentation reviewing the guidelines, laws, and tools used in Web Accessibility #a11y by @abbykinz at @D4DBoston !#Drupal #design4drupal #d4dboston pic.twitter.com/8QFlI4KFvx

— Jim Birch (@thejimbirch) June 28, 2018

Webform Accessibility – Jacob Rockowitz

Jacob is the current maintainer of the Webform module and a prolific blogger and thought leader in the Drupal-sphere. He penned an article in advance of this presentation where he reviewed his thought process, and recorded his presentation. My favorite takeaway from this presentation was:

“Learning about accessibility can be overwhelming. We don’t have to be accessibility experts. We just need to care about accessibility.”

Kanopi has a long history of both building new and retrofitting existing sites to be WCAG compliant.  This presentation showed me that our approach, ongoing learning and iteration have us on the right track.

“Learning about accessibility can be overwhelming. We don’t have to be accessibility experts. We just need to care about accessibility.” – @jrockowitz on working on #a11y in the Drupal Webform module at @D4DBoston #D4DBoston #Drupal #Design4Drupal pic.twitter.com/nEDIMdtyhs

— Jim Birch (@thejimbirch) June 27, 2018

Variable Fonts and Our New Typography – Jason Pamental

I’m a big fan of Jason’s body of work, from his book, Responsive Typography: Using Type Well on the Web, to his blog, and frequent appearances on the Talking Drupal Podcast.

Jason’s deep knowledge of typography truly shows in this presentation that gives a brief history of type, how it moved from paper to the screen, and how the future of digital typography will be with variable fonts.

I look forward to exploring more about variable fonts with the designers at Kanopi.  The design possibilities, and the performance gains make these new tools very attractive.

“Type is never neutral” and “Type is how we hear what we read”. Great thoughts from Variable Fonts and Our New Typography by @jpamental at @D4DBoston #Drupal #Design4Drupal #D4DBoston pic.twitter.com/v7TXhAFXRD

— Jim Birch (@thejimbirch) June 27, 2018

Building a Living Style Guide with Herman – in Your Sass! – Chris Wells

In this technical presentation, Chris Wells, CTO of Redfin Solutions, gave a nice overview of Herman, which uses SassDoc to read comments in your stylesheets and build a static website that is your style guide. It is not as extensive as a full-blown style guide like Pattern Lab, but can be very useful for smaller teams.

This presentation has me researching simple style guide solutions.  Not every project has the budget or need for a solution like Pattern Lab, but since I already try to comment style sheets and templates, it makes sense to do it in a way that something like Herman or KSS Node can parse.

Building a Living Style Guide with Herman presented by @sceo at @D4DBoston @Drupal #Design4Drupal #D4DBoston #Drupal pic.twitter.com/MOMct6Feld

— Jim Birch (@thejimbirch) June 27, 2018

Thanks!

Thanks to all of the volunteer organizers, especially Leslie Glynn, who was my point of contact before, during, and after the event, and, in true New England fashion, made sure I took home some famous Boston cannolis for my mother. Kanopians help organize a few different conferences across the states, including BADCamp and MidCamp, and we know putting on these conferences is a labor of love, so thank you!

Jul 03 2018

What’s the greatest entrepreneurship lesson that Mediacurrent partner Dave Terry has learned?

In a recent guest spot on the Good People, Good Marketing podcast, Dave weighs in on the evolution of open source technology and shares his path to building a leading Drupal-based agency.

Interview Sound Bites 

Technology should be shared and free. 

Giving back to the Drupal community is embedded in Mediacurrent’s DNA. Dave explains why that’s so important.

Culture is about people and who we hire. I know a lot of companies say that, but it’s really about putting the processes behind how you identify the right people within the company.

A successful company culture attracts great talent while also managing accountability with a distributed team. Here, Dave shares the three tenets of culture at Mediacurrent.

Tune in

Listen to the complete podcast, Episode 47: Interview with Dave Terry, on the Sideways8 blog.

Related Content 
Why Should Companies Support Drupal? | Blog
Drupal Contrib: Why it's Important | Video
Creating a Culture of Giving in Your Organization | Blog
 

Jul 02 2018

Recorded June 21st, 2018

This episode we welcome Jess Snyder, who is on the planning committee for GovCon, coming up real soon in August, to talk about… GovCon… which is coming up real soon in August. We’ll cover some Drupal news, it looks like Bob has the Pro Project Pick this episode, and Ryan will bring it to a close with The Final Bell.

Episode 38 Audio Download Link

Updates:

Jess Snyder from WETA, DC talking about GovCon

  • What do you do at WETA?

  • What got you into Drupal?

  • What got you involved in GovCon?

  • Any notable keynotes / sessions you’re looking at?

  • ** Random questions **

Drupal News:

Pro Project Pick:

Events: (https://www.drupical.com/?type=drupalcon,regional)

The Final Bell: (Ryan)

Jun 28 2018

The majority of Drupal's underlying code is PHP. As a Drupal developer, the better you know PHP, the better your code will be. In this Acro Media Tech Talk video, Drupal developer Rob Thornton discusses code nesting and how you can optimize your code in order to reduce unnecessary nesting. 


Code nesting can basically be described as when a block of code is contained within another block of code. If your code isn't well thought out, you can end up with deep nesting that is both hard to read and difficult to maintain. Aside from improving readability and maintainability, reducing the amount of nesting helps you find bugs and lets other developers contribute to your code more easily. Rob uses a number of examples of common nesting scenarios, walking you through how to find and fix them.
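As a rough sketch of the kind of cleanup Rob talks about (the function names and array shape here are made up for illustration), deep nesting can often be flattened with early-return guard clauses:

```php
<?php

// Nested version: every check adds another level of indentation.
function discount_nested($order) {
  $discount = 0;
  if ($order) {
    if (!empty($order['items'])) {
      if ($order['total'] >= 100) {
        $discount = $order['total'] * 0.1;
      }
    }
  }
  return $discount;
}

// Flattened version: guard clauses bail out early, so the happy path
// stays at one indentation level and is easier to read and debug.
function discount_flat($order) {
  if (empty($order) || empty($order['items']) || $order['total'] < 100) {
    return 0;
  }
  return $order['total'] * 0.1;
}
```

Both functions return the same results, but the flattened one makes the failure conditions explicit up front instead of burying the interesting logic three levels deep.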

If you liked this video, you might also like these posts.


Jun 27 2018

Community. Sharing. Helping. This is the spirit of Drupal. These things bind us all together. Be a part of it by joining us during Drupal Europe between 10–14 September 2018 in Darmstadt, Germany.

photo credit Susanne Coates @flickr

The track dedicated to Social + Non-Profit will gather ambitious life stories about helping others and projects whose purpose is to invest everything in making the world a better place. You will have the opportunity to meet colleagues from your field of interest and join forces, learn how to use pre-configured Drupal distributions and get inspired by ambitious social impact projects built with Drupal. Also learn how Drupal can be used to ensure accountability, trustworthiness, honesty, and openness to every person who has invested time, money, and faith into a non-profit organization. Talk and share ideas, learn from each other, improve, innovate … and take a leap forward. There are a lot of things you will learn, no matter your technical skill level. From developers to people with a big heart, you will for sure find something that inspires you.

Interested in attending? Buy your ticket now at https://www.drupaleurope.org/tickets.

We are looking for submissions in various topics. Here are some ideas to share your experience on with the rest of the world.

  1. Every nonprofit organization must apply the 3 E’s: Economy, Efficiency, Effectiveness. Economy forces you to handle your project with low budgets, that is almost always the case with non-profit organizations. Efficiency is required also due to low resources available to most non-profit organizations. Effectiveness ensures you get the job done and complete your targets. How are you doing that? What tools and practices ensure this?
  2. We live in a world that is changing every day, and technology is a big part of it. What are the new technologies you integrate into social projects? What do you need and what do you find on the market? How is Drupal helping you achieve your goals?
  3. Transparency, accountability and full disclosure on operations are a must for all non-profit organizations. People will donate to and support campaigns only if they know exactly where the money goes and how things are handled. This way, organizations ensure their credibility in front of the world. How do you technically implement this?
  4. A lot of people talk about making the world a better place. But talking is not enough. You have to take action! How do you plan to do it? How do social activities raise the level of engagement in your community? How are people’s lives improved by your actions?
  5. Non-profit work is done mainly from the heart. Volunteering is the key word. What are your life stories about helping others, your inspirational first-hand experiences? Why, what and how did you do it? What drives you? What are your goals?

We look forward to your submissions sharing your experience with the other attendees.

See you in Darmstadt!

About industry tracks

As you’ve probably read in one of our previous blog posts, industry verticals are a new concept being introduced at Drupal Europe, replacing the summits that typically took place on Monday. At Drupal Europe these industry verticals are integrated with the rest of the conference (same location, same ticket) and provide more opportunities to learn and exchange within each vertical throughout the three days.

Now is the perfect time to buy your ticket for Drupal Europe. Session submission is only open for a few more days so please submit your sessions and encourage others who have great ideas.

Please help us to spread the word about this awesome conference. Our hashtag is #drupaleurope.

To recommend speakers or topics please get in touch at [email protected].

About the Drupal Europe Conference

Drupal is one of the leading open source technologies empowering digital solutions in the government space around the world.

Drupal Europe 2018 brings over 2,000 creators, innovators, and users of digital technologies from all over Europe and the rest of the world together for three days of intense and inspiring interaction.

Location & Dates

Drupal Europe will be held in Darmstadtium in Darmstadt, Germany — which has a direct connection to Frankfurt International Airport. Drupal Europe will take place 10–14 September 2018 with Drupal contribution opportunities every day. Keynotes, sessions, workshops and BoFs will be from Tuesday to Thursday.

Jun 26 2018
Jun 26

There were a lot of amazing sessions at DrupalCon Nashville 2018, but one that particularly sparked my interest was “PDFs in Drupal,” presented by Dan Hansen. In this session, Dan went through the importance of PDFs, gave a short introduction to some of the more popular PDF rendering libraries, and demoed some tips and tricks that I found very useful for future projects.

Most, if not all, of us have opened a PDF recently. PDFs are popular because they are a universal document format and can easily be sent to others without having to worry about whether their machine can open them. Despite this, Dan notes that PDF support feels behind, and it would be nice to have better PDF handling in Drupal core, similar to images in media libraries.

PDF Rendering Libraries

This session introduced a handful of popular PDF rendering libraries:

  • Print-to-PDF
  • jsPDF
  • FPDF
  • mPDF
  • TCPDF
  • FPDI
  • DOMPDF
  • Wkhtmltopdf
  • PDFtk

PDFs in Drupal

In Drupal 7, the most popular module for generating PDFs is the Print module, but it does not support Drupal 8. Fortunately, there are options available for Drupal 8:

  • Printable - based on the Print module to allow generation of PDFs. It relies on the PDF API, which is currently not stable.
  • Entity Print (recommended) - allows for printing any Drupal entity or View (D8 only) to PDF. This module provides flexibility in the choice of PDF rendering library, is more lightweight than the Print module, and has a stable release for both D7 and D8.
  • FillPDF - allows for filling a PDF with values. This module can be used with the PDFtk library or a paid third-party service, and can help reduce the overhead of rendering PDFs.
     

Tips and Tricks

I found Dan’s demos to be the most interesting, as he showed code examples of various (and seemingly common) tasks related to PDFs. The following examples from Dan’s session show how simple and straightforward it is to work with PDFs:

Making a PDF from HTML

Using the DOMPDF library, a custom controller can simply stream the following output:

use Dompdf\Dompdf;

$dompdf = new Dompdf();
// Pass the HTML markup.
$dompdf->loadHtml($markup);
// Render the HTML as PDF.
$dompdf->render();
// Stream the generated PDF back to the user via the browser.
$dompdf->stream();

Combining 2 PDFs

Using the PDFtk library:

use mikehaertl\pdftk\Pdf;

$pdf = new Pdf([
  'A' => '/path/file1.pdf',                 // A is an alias for file1.pdf
  'B' => ['/path/file2.pdf', 'pass**word'], // B is an alias for file2.pdf
]);
$pdf->send();

Notice that you can specify a password for a PDF file (if it has one). You can also extract specific pages from the PDF files:

$pdf->cat(1, 5, 'A')                 // pages 1-5 from A
  ->cat(3, null, 'B')                // page 3 from B
  ->cat(7, 'end', 'B', null, 'east') // pages 7-end from B, rotated east
  ->cat('end', 3, 'A', 'even')       // even pages 3-end in reverse order from A
  ->cat([2, 3, 7], 'C')              // pages 2, 3 and 7 from C
  ->saveAs('/path/new.pdf');

More of these examples can be found at https://packagist.org/packages/mikehaertl/php-pdftk.

Fill in a PDF Template

Using the PDFtk library (which the FillPDF module can use under the hood):

$pdf = new Pdf(['PATH_TO_PDF']);
$pdf->fillForm([
  'name_of_text_field' => 'Some value',
])
->needAppearances()
->send('filled.pdf');

I really enjoyed Dan’s session and learned a lot of useful tips from it, and I encourage anyone looking to work with PDFs in Drupal to check it out.

Related Content:
Accessibility: Let's Talk PDFs | Blog
Top Drupal 8 Modules | Blog
Mediacurrent Top Drupal 8 Modules: Drupalcon 2018 Edition | Blog

Jun 26 2018
Jun 26

The Block field module lets you insert a Drupal block as a field on your content.

A Drupal theme is divided into regions, and you can place blocks or your own custom blocks into these regions. You accomplish this task by dragging and ordering blocks in the “Block Layout” screen. That means you can append blocks before or after the main content of your content type. This “Block Layout” screen will soon become cluttered if you have multiple content types and/or multiple single nodes, each one with a different custom block.

Block Layout Screen

However, there’s a way to insert a block (or many blocks) directly into your content as a field. Thus, you don’t have to place the block in the “Block Layout” screen, instead, you insert the block as a field on the node.

Blocks as fields

In this tutorial, we’re going to cover the usage of the Block field module. Let’s start!

Installing the Block Field module

The first thing you’ll need to do is install the Block field module.

Using Composer:

composer require drupal/block_field

This module has no dependencies, so just install it and you’re good to go.

Enable Block Field module

Add Block Field to Content Type

Let’s start by adding a block field to the Article content type.

1. Go to Structure, “Content types”, click “Manage fields” on the Article row, then click on “Add field”.

2. Select “Block (plugin)” under the “Reference” category and give it a proper label.

Adding a block field

3. Leave the “Allowed number of values” set to 1 and click “Save field settings”.

On the edit screen, you can add a default value for this field. There’s also a list of all blocks, which you can select or deselect in order to make them available to this particular field (custom blocks will appear here too). Click “Save settings” once again.

Available Blocks

Test Block Field

Click Content, “Add Content”, Article in order to create a test article. Choose a system block (for example the “Powered by Drupal” block) from the drop-down and click Save.

Creating content with a block field

You’ll see the selected block inserted as a field in the content. This block won’t appear in the “Block Layout” screen.

Block field inside the content

You can also use this module to insert a view via a block inside the content. Edit the article and select the “Who’s online” block under the “Lists (Views)” category, click Save and you will see this block in your content.

Please note: you won’t be able to use Views contextual filters with the Block Field module.

Display a view via a block in a field

Summary

The Block Field module lets you insert not only views rendered as blocks but all types of blocks, even custom ones, into a field. This allows the creation of complex content types and/or nodes and provides great flexibility when building a site.

Jorge Montoya

About Jorge Montoya

Jorge has been reading about Drupal and playing with it for almost 5 years. He likes it very much! He also likes to translate German and English documents into Spanish. He's lived in some places. Right now, he lives in the city of Medellín in his homeland Colombia with his wife and son.

Jun 26 2018
Jun 26
photo: Paul Johnson @ flickr

Drupal is our business.

Whether you are a freelancer, a two-person shop or a hundred-plus agency, Drupal is vital to our success in growing and supporting our business.

The business ecosystem is changing rapidly, thereby making it a necessity for agency leaders, managers and advisors to focus on a multitude of challenges and opportunities.

Understanding how the marketplace is evolving, driving innovation, fostering the right company culture, and adopting efficient project management methodologies, are all challenges faced by businesses today.

We all want to transform our businesses by working with the smartest teams, creating and delivering amazing projects, and having ideal customers lining up to work with us.

No Drupal conference would be complete without in-depth discussions and debates about these challenges and more.

What is this track about?

The Agency Business track will provide insight, support and real stories from people running businesses and managing projects. Learn about other people’s experiences, and get tips and ideas on how to tackle the challenges faced in your business or project.

Come to this track to learn / speak about

Photo: Michael Cannon @ Flickr

Agency growth

Growing and scaling your business can be a tricky and daunting task. We need to consider strategies for how to grow our businesses, and how to do so sustainably.

With increased competition from both other agencies and other platforms, we need to look not only at how we generate new leads for our businesses, but also at how we convince potential clients that Drupal is the best, and that we are the best.

Leadership and Culture

What is the right company culture for my business? How can I better lead my agency through the challenges ahead? How can I provide good leadership to my team? How can we grow and scale our business, without losing our company culture along the way? These are just some of the questions we will look to answer in the Agency Business track.

Operations

Project management is a bit of a juggling act, with many different needs and tasks that need to be taken care of simultaneously. We’re always on the look-out for ways to increase a project’s effectiveness and efficiency, while reducing the risk of it getting out of control. Let’s share our experiences and ideas on how we can improve project planning, better manage timelines & budgets, and keep staff motivated, while all the time keeping clients happy and engaged in the process.

Diversification

Markets change faster and faster, and ours is no exception. We need to adapt our products and offerings to stay competitive and minimize our business risks. Perhaps that means diversifying your service offerings, developing a product, or extending into new markets or verticals. However, we also need to consider how to keep clients happy and how to continue to meet their changing needs through innovation and/or diversification.

How to get involved

At Drupal Europe, we want to ensure that attendees get the most from this track through highly valuable and insightful sessions. We are looking for speakers to openly and honestly share stories about their challenges and how they solved them. We want to hear about your experiments, successes and failures, process discoveries, strategies, and tactics. We want real-life learnings, supported by facts and figures — prove to us that your way is best.

Want to submit a session under Agency Business Track?

Session submissions are open and will close on 30 June 2018.

Whatever your experience is, whether it be running a small 2 person operation or scaling to 30 and beyond, or managing projects and project teams, we want to hear from you. Your experience and insight is invaluable and we know others will think so too.

Come to Drupal Europe and share your experiences with us — submit a session to the Agency Business track today!

Know a great speaker?

Do you know someone who could be a great speaker? Or perhaps you know someone who has an interesting story to share? If so, please get in touch with the program team at [email protected].

And don’t forget to help us to spread the word about this awesome conference. Our hashtag is #drupaleurope.

We look forward to seeing you in Darmstadt!

About Drupal Europe Conference

Drupal is one of the leading open source technologies empowering digital solutions in the government space around the world.

Drupal Europe 2018 brings over 2,000 creators, innovators, and users of digital technologies from all over Europe and the rest of the world together for three days of intense and inspiring interaction.

Location & Dates

Drupal Europe will be held in Darmstadtium in Darmstadt, Germany — with a direct connection to Frankfurt International Airport. Drupal Europe will take place 10–14 September 2018 with Drupal contribution opportunities every day. Keynotes, sessions, workshops and BoFs will be from Tuesday to Thursday.

Jun 26 2018
Jun 26

This is part one of a two-part series on transitioning to HTTPS

For some time, major internet players have advocated for a ubiquitous, secure internet, touting the myriad benefits for all users and service providers of “HTTPS everywhere”. The most prominent and steadfast among them is Google. In the next week, continuing a multi-year effort to shepherd more traffic to the secure web, Google will make perhaps its boldest move to date which will negatively impact all organizations not securely serving their website over HTTPS.

To quote the official Google Security Blog

Beginning in July 2018 with the release of Chrome 68, Chrome will mark all HTTP sites as “not secure”

Chrome insecure message for HTTP
Google blog

Given the ambiguous “in July 2018”, with no clearly communicated release date for Chrome 68, it’s wise to err on the side of caution and assume it will roll out on the 1st. We have readied our partners with this expectation.

So what does this mean for your organization if your site is not served over HTTPS? In short, it’s time to make the move. Let’s dig in.

What is HTTPS?

HTTP, or HyperText Transfer Protocol, is the internet technology used to communicate between your web browser and the servers that the websites you visit are on. HTTPS is the secure version (s for secure) which is served over TLS: Transport Layer Security. What these technical acronyms equate to are tools for internet communication that verify you’re communicating with who you think you are, in the way you intended to, in a format that only the intended recipient can understand. We’ll touch on the specifics in a moment and why they’re important. Put simply, HTTPS enables secure internet communication.

Why secure browsing matters

Leaving aside the technical details for a moment and taking a broader view than communication protocols reveals more nuanced benefits your organization receives by communicating securely with its audience.

HTTPS improves SEO

Since Google accounts for 75-90% of global search queries (depending on the source), SEO is understandably often synonymous with optimizing for Google. Given their market domination, competitors take their cues from Google, and in most cases it’s safe to assume that what’s good for SEO in Google is good for optimizing for competing search engines.

In the summer of 2014, Google announced on their blog that they would begin to rank sites that used HTTPS more favorably than those using HTTP. It has already been nearly four years since we’ve known HTTPS to be advantageous for SEO. Since then, Google has consistently advocated HTTPS ubiquity, frequently writing about it in blog posts and speaking about it at conferences. The extent to which serving your site over HTTPS improves your SEO is not cut and dried, and can vary slightly by industry. However, the trend toward favoring HTTPS is well under way, and the scales are tipped irreversibly at this point.

HTTPS improves credibility and UX

Once a user has arrived at your site, their perception may be largely shaped by whether the site is served over HTTP or HTTPS. The user experience when interacting with a site served over HTTPS is demonstrably better. SEMrush summarizes well what the data clearly indicate: people care a great deal about security on the web. A couple of highlights:

You never get a second chance to make a first impression.

When engaging a member of your target audience, you have precious few moments to establish credibility with them. This is certainly true of the first time a user interacts with your site, but it is also true for returning users. You have to earn your reputation every day, and it can be lost quickly. We know credibility decisions are highly influenced by design choices and are made in well under one second. Combine these two insights with the visual updates Chrome is making to highlight the security of a user’s connection to your site, and drawing the user’s attention to a warning in the URL bar translates to a potentially costly loss of credibility. Unfortunately it’s the sort of thing that users won’t notice unless there’s a problem, and per the referenced cliché, at that point it may be too late.

Browsers drawing attention to insecure HTTP

Much like search, browser usage patterns have evolved over the last five years to heavily favor Google Chrome. Therefore, what Google does carries tremendous weight internet-wide. Current estimations of browser usage put Chrome between 55% and 60% of the market (again, depending on sources). Firefox has followed suit with Chrome as far as HTTP security alerts go, and there’s no indication we should expect this to change. So it’s safe to assume a combined 60-75% of the market is represented by Chrome’s updates.

Google Chrome HTTP warning roll out

Google (with Firefox closely mirroring behind) has been getting more stringent in displaying the security implications of a site served over HTTP (in addition to sites misconfigured over HTTPS). They’ve shared details of the six-step roll-out on their general blog, as well as on a more technical, granular level on the Chrome browser blog.

In January 2017, they began marking any site that collects a password field or credit card information over HTTP as subtly (in grey text) not secure.

Chrome insecure message for HTTP
Laravel News

Then, in October 2017, they tightened things up so that a site that collected any form information over HTTP would show the same “not secure” messaging. They also added a more action-based aspect: showing the warning in the URL bar as soon as a user enters data into a form. This is an especially obtrusive experience on mobile due to space constraints, and it more deeply engages the user cognitively as to exactly what is unsafe about how they’re interacting with the site.

Chrome insecure message for HTTP
Google blog

Next, in July 2018, all HTTP sites will be marked as not secure.

In September 2018, secure sites will be marked more neutrally, removing the green “secure” lock by default, connoting a continuing expectation that HTTPS is the norm and no longer special.

Chrome insecure message for HTTP
Google blog

In October 2018, any HTTP site that accepts any form fields will be marked affirmatively not secure with a bold red label, much as a misconfigured HTTPS site is now.

Chrome insecure message for HTTP
Google blog

Though they haven’t yet announced a date, Google intends to mark all HTTP sites affirmatively not secure. The drive is clearly to establish the norm that all web traffic should be served over HTTPS and that outdated HTTP is not to be trusted. This is a pretty strong message that if Google has their way (which they usually do), HTTPS will inevitably become virtually mandatory. And “inevitably,” in internet years, may be right around the corner.

HTTPS vastly improves security for you and your users

Returning to the technical, as mentioned previously, HTTPS helps secure communication in three basic ways.

  • Authentication: “you’re communicating with who you think you are”
  • Data integrity: “in the way you intended to”
  • Encryption: “in a format that only the intended recipient can understand”

What authentication does for you

In order for the browser to recognize and evaluate an HTTPS certificate, it must be verified by a trusted certificate authority (CA). There is a limited number of CAs entrusted to distribute HTTPS certificates. Through public-key cryptography (a fairly complex but interesting topic) and the browser’s inherent trust in the CA that provided the HTTPS certificate for a given site, the browser can verify that a visitor is positively communicating with the expected entity, with no way for anyone else to pose as that entity. No such verification is possible over HTTP, and it’s fairly simple to imagine what identity theft would be possible if you were communicating with a different website than you appeared to be. In the event any of the major browsers cannot validate the expected certificate, they will show a strong, usually red, warning that you may not be communicating with the expected website, and strongly encourage you to reconsider interacting at all.

Chrome misconfigured HTTPS

Therefore, the authentication gives your users the confidence you are who you say you are, which is important when you’re engaging with them in any way whether they’re providing an email, credit card or simply reading articles.
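To make the verification step concrete, here is a minimal Python sketch (not from the article; the hostname is illustrative) showing the two checks a browser-style TLS client enforces by default: the certificate must chain to a trusted CA, and its hostname must match the requested site.

```python
import ssl

# The default context mirrors browser behavior: it refuses connections
# whose certificate cannot be validated against a trusted CA bundle,
# and it verifies that the certificate matches the hostname requested.
context = ssl.create_default_context()

print(context.verify_mode == ssl.CERT_REQUIRED)  # True: cert must validate
print(context.check_hostname)                    # True: hostname must match

# Wrapping a socket with this context would raise
# ssl.SSLCertVerificationError if the chain of trust fails:
#
#   with socket.create_connection(("example.com", 443)) as sock:
#       with context.wrap_socket(sock, server_hostname="example.com") as tls:
#           print(tls.version())
```

Browsers perform the same checks automatically; disabling either one is what produces the red certificate warnings described above.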

How data integrity helps you

Ensuring perfect preservation of communication over the internet is another guarantee HTTPS provides. When a user communicates with a website over HTTPS, the browser takes the input of that communication and using a one-way hashing function creates a unique “message digest”: a concise, alphanumeric string. The digest may only be reliably recreated by running the exact same input through the same hash algorithm irrespective of where and when this is done. For each request the user makes to the website, the browser passes a message digest alongside it and the server then runs the input it receives from the request through the hash algorithm to verify it matches the browser-sent digest. Since it is nearly computationally impossible to reverse engineer these hash functions, if the digests match, it proves the message was not altered in transit. Again, no such data integrity preservation is possible over HTTP, and there is therefore no way to tell if a message has been altered en route to the server from the browser.
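The digest round trip can be sketched in a few lines of Python (a simplification: real TLS uses keyed MACs negotiated per connection rather than bare hashes, and the messages here are hypothetical):

```python
import hashlib

# The browser hashes the outgoing message into a fixed-length digest.
message = b"POST /donate amount=50"
digest = hashlib.sha256(message).hexdigest()

# The server recomputes the digest over what it received; a match
# proves the bytes were not altered in transit.
received_intact = b"POST /donate amount=50"
print(hashlib.sha256(received_intact).hexdigest() == digest)    # True

# Any tampering, however small, produces a completely different digest.
received_tampered = b"POST /donate amount=5000"
print(hashlib.sha256(received_tampered).hexdigest() == digest)  # False
```

Because the hash function is one-way, an attacker cannot craft an altered message that produces the same digest.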

What encryption does for you

Communicating over an unencrypted HTTP connection exposes easily exploitable security risks around site authentication. To demonstrate how easy it can be to take over someone’s account on an HTTP connection, a tool called Firesheep was developed and openly released in mid-2010. The major social media platforms Facebook and Twitter were both susceptible to this exploit for some time after Firesheep was released. The identity theft is carried out through a technique called session hijacking. With Firesheep installed, a few clicks could log you in as another user browsing nearby over WiFi on any HTTP website. This form of session hijacking is possible when the authentication cookies (small identifying pieces of information that live in your browser while you’re logged into a site) are transmitted to the server on each request over HTTP. Over WiFi these messages are broadcast into the air in plain text and can be picked up by anyone listening. HTTPS prevents this, since the communication is encrypted and unintelligible to eavesdroppers.

In the example of a CMS like Drupal, or any other system with a login, an administrator with elevated site permissions who logs in over HTTP is subject to the same risk if that traffic is monitored or “sniffed” at any point along its path from the browser to the server. This is especially easy over WiFi but is not limited to WiFi. The cookies are sent to the server with every request, regardless of whether the user entered their password during the active session. Depending on the admin’s privileges, this access can easily be escalated to complete control of the website. Encryption is a big deal.
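A small Python sketch (the cookie value and paths are hypothetical) illustrates why this sniffing works: over plain HTTP, the raw request bytes carry the session cookie in cleartext, readable by anyone on the network path.

```python
# Hypothetical authentication cookie held by the browser.
session_cookie = "SESS1234=abc123secret"

# The raw HTTP request as it travels over the wire: headers,
# cookie and all, are plain ASCII text.
raw_request = (
    "GET /admin HTTP/1.1\r\n"
    "Host: example.com\r\n"
    f"Cookie: {session_cookie}\r\n"
    "\r\n"
).encode("ascii")

# An eavesdropper capturing these bytes can lift the cookie directly
# and replay it to hijack the session.
print(session_cookie in raw_request.decode("ascii"))  # True

# Over HTTPS the same request is encrypted before transmission, so the
# cookie never appears on the wire in readable form.
```

This is exactly the weakness Firesheep automated: no password cracking is needed when the session identifier itself is broadcast in the clear.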

HTTPS is required for the modern web

One of the more promising developments of the last few years is the pervasiveness and effectiveness of Progressive Web Apps (PWAs). PWA is the name coined for a set of technologies that provide a feature set for mobile browsing akin to native applications, yet served entirely through the web browser. PWAs require all communication to be done over HTTPS. Some of the capabilities PWAs provide that were previously relegated to native applications are:

  • Providing content and services based on the user’s location data
  • Providing interaction with the user’s camera and microphone within the browsing experience
  • Sending push notifications
  • Serving off-line content

If you aren’t taking advantage of any of the features PWAs make possible, your organization should strongly consider them to further engage users. Even before the ambition of full feature parity with native applications is borne out, PWAs will continue to evolve, layering deeper engagement with users on top of your existing mobile experience with minimal effort. PWAs simply do not work over HTTP. HTTPS is required to open the door to their possibilities.

Barriers to HTTPS have been lifted

Historically, considering a move to HTTPS has been held back by some valid concerns for webmasters whose job it was to select where and how their websites were hosted. A few of the fundamental apprehensions could be categorized as:

  • No perceived benefit. People often assumed if they weren’t collecting financial or personal information, it wasn’t necessary. We’ve covered why holding this belief in 2018 is a misconception. Savas Labs made the move in July 2017 to serve exclusively over HTTPS for our statically-generated Jekyll website even though at the time we had no forms or logins.
  • Performance costs. We know reducing latency is crucial for optimizing conversions and HTTPS does require additional communication and computation. However, with the broad adoption of the HTTP/2 protocol over the last few years, HTTPS now usually outperforms HTTP.
  • Financial costs. HTTPS was too complex and costly to implement for some. Large strides have been made across many hosting providers who now bundle HTTPS into their hosting offerings by default, often at no additional cost. Let’s Encrypt, a relatively new and novel certificate authority, first began offering free certificates (which they still do) and then made it easy to automatically renew those certificates, helping to ease the burden and cost of implementation.

We cover each of these in more detail in part two, which will help guide you in making the move to HTTPS.

Conclusion

To revisit Google’s announcement:

Beginning in July 2018 with the release of Chrome 68, Chrome will mark all HTTP sites as “not secure”.

Interpreting that and providing our perspective:

You’re not part of the modern web unless you’re exclusively using HTTPS.

A bold, if slightly controversial, statement; but for ambitious organizations like the folks we’re fortunate enough to work with each day, HTTPS-only is the standard in mid-2018 and beyond. Given the benefits, the lifting of previous barriers, and the opportunity for the future, very few organizations have a good reason not to serve their sites exclusively over HTTPS.

Have we convinced you yet? Great! Read part two for some assistance on how to make the move.

Additional resources

Jun 25 2018
Jun 25

Drupal Europe is both a technology conference and a family reunion for the Drupal community. Bringing together 1600+ attendees, it is the largest community driven Drupal event taking place on the European continent this year. For anyone connected with Drupal this is a unique opportunity to share your experience, learn, discuss, connect and contribute back to the community.

As a community-driven conference, we wanted to focus on real-life case studies rather than the usual technology-driven structure. So we’ve introduced industry tracks which focus on specific industry sectors.

Calling all college, university, and educational Drupalers!

Photo with CCO licence via Pexels.com from StartupStockPhotos

The Higher Education track is for anyone using Drupal or thinking of migrating to Drupal at a college or university who is looking to connect with other Higher-Ed Drupal users.

If you have experience delivering Drupal solutions in the higher education sector, or are looking for inspiration on how to continue developing your CMS further, this is the right track for you.

Be inspired

Drupal is a popular choice in higher education, and many of us are using it in creative and inventive ways. With Drupal 8, the opportunities for exploration and experimentation expand even further — from headless Drupal to top-tier configuration management. Let’s showcase our successes and best-practices with Drupal 8!

We know many universities are still on Drupal 7 and are keen to migrate to Drupal 8, so come to share what works for you and see wins from your peers.

Share your experience

Photo with CCO licence via Pexels.com from StatusStockphoto

Have you launched a Drupal 8 project recently that you are proud of? Started a campus Drupal users group and have tips for others looking to create their own? Developed a great user support model for your content editors? Conquered decoupled Drupal with your frontend stack? Share your awesome projects and lessons learned with your peers.

Target groups:

  • Education sector

Example talks:

Photo with CC0 licence via Pexels.com from Pixabay
  • Drupal in a Day (how Global Training Days got to be a localized event)
  • From CMS to LMS
  • Web accessibility in higher education
  • GDPR and children’s information
  • Javascript for higher education
  • Migration from Drupal 7 to 8
  • How Drupal 8 API-first helps to integrate with existing IT infrastructure
  • Build your own Drupal Community

Session submission is open, and we ask you to submit interesting session proposals to create an awesome conference. Session proposals are not limited to Drupal; all topics related to Higher Education are welcome.

Please also help us to spread the word about this awesome conference. Our hashtag is #drupaleurope.

If you want to participate in the organisation or want to recommend speakers or topics please get in touch at [email protected].

About Drupal Europe Conference

Drupal is one of the leading open source technologies empowering digital solutions around the world.

Drupal Europe 2018 brings over 2,000 creators, innovators, and users of digital technologies from all over Europe and the rest of the world together for three days of intense and inspiring interaction.

Location & Dates

Drupal Europe will be held in Darmstadtium in Darmstadt, Germany — with a direct connection to Frankfurt International Airport. Drupal Europe will take place 10–14 September 2018 with Drupal contribution opportunities every day. Keynotes, sessions, workshops and BoFs will be from Tuesday to Thursday.

Jun 25 2018
Jun 25
Cybercrime is a growing problem in today's connected world.


In 2017 alone, we saw the record-breaking WannaCry ransomware attack bring plenty of businesses to their knees. In fact, the spread of the ransomware across 150 countries led many professionals in the industry to call it the biggest offensive in cybercrime history.

As the digital world continues to evolve, it brings with it countless new opportunities for success, along with various vulnerabilities that modern entrepreneurs need to be aware of. If you're launching an eCommerce company in the current marketplace, you'll need more than just a solid USP and a great marketing strategy - you'll need a CMS (content management system) solution that keeps your online presence secure.

After all, if customers don't believe that your site is secure, they're not going to buy your products. It doesn't matter if you sell the best products or services in the world; no one is going to risk giving their details to an unsecured store.

Drupal Commerce is one of the most trusted and secure eCommerce solutions on the web today - used by organizations across the globe. To help boost your chances of a successful eCommerce venture, we're going to explore some of the ways you can develop a more secure site with Drupal Commerce.

1. Commit to Regular Updates

Updates are key to any cyber-security strategy. When a company rolls out an update for a piece of software, they're not just giving you new features to play with, they're also delivering bug fixes and patches that can protect your system against vulnerabilities that exist within the network.

People often postpone updates because they get in the way of day-to-day tasks. However, this could mean that you're leaving entry windows open for people who want to worm their way into your files. The developers of Drupal and Drupal Commerce regularly release timely updates that fix the security issues in your CMS. You should be making sure that these updates are being applied when they become available, either by doing it yourself or by contracting a company to do them.

2. Use the Right Login and Password Security

The login page at the front of your eCommerce site acts as the door to your organization. The best way to protect your future is to fortify that door with the correct security measures. While a strong username and password is a good way to get started, some statistics suggest that around 35% of users have weak passwords - and many of the remaining 65% can still be cracked.

With Drupal as your CMS, from the moment you first install the system, the passwords in your database are hashed and "salted". This makes your passwords far harder to crack. Additionally, thanks to its open-source framework, Drupal offers a range of user-contributed modules that support everything from SSL certificates to two-factor authentication.
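As a rough sketch of the idea (Drupal core has its own password hashing service, so this is not Drupal's actual implementation), PHP's standard library shows how salted hashing works: identical passwords produce different stored hashes, yet each can still be verified.

```php
<?php
// Illustration only: salting means a random value is mixed into each
// hash, so two users with the same password get different stored values.
// password_hash() generates a fresh salt on every call.
$hashA = password_hash('correct horse battery staple', PASSWORD_DEFAULT);
$hashB = password_hash('correct horse battery staple', PASSWORD_DEFAULT);

var_dump($hashA !== $hashB);                                       // true: unique salts
var_dump(password_verify('correct horse battery staple', $hashA)); // true
var_dump(password_verify('wrong guess', $hashA));                  // false
```

Because the salt is stored inside the hash itself, `password_verify()` can check a login attempt without the site ever storing the plain-text password.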

3. Configure Your eCommerce Access Controls

When it comes to securing your eCommerce site, there's only so much any CMS can do to help you. At the end of the day, you'll need to make sure that you're making full use of the control systems that solutions like Drupal provide to give you absolute authority over the accounts that have access to your website. For instance, a blog account might have access to write content on your site, but not change the price of products.

Drupal and Drupal Commerce have a range of access controls that allow you to choose authority levels throughout your website. You can create categorized accounts for specific parts of your website, ensuring that each person gets only the permissions they need and no more. Essentially, this reduces the risk of human error as you add more people to your eCommerce team.
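The principle behind these role-based access controls can be sketched in plain PHP; the role and permission names below are illustrative, not Drupal's actual configuration keys.

```php
<?php
// Plain-PHP sketch of role-based permissions: each role maps to a list
// of allowed actions, and a user's roles determine what they can do.
$rolePermissions = [
    'blogger'       => ['create article content', 'edit own article content'],
    'store_manager' => ['update commerce_product', 'administer commerce_order'],
];

// A user may hold several roles; permission is granted if any role allows it.
function userHasPermission(array $roles, $permission, array $map) {
    foreach ($roles as $role) {
        if (in_array($permission, isset($map[$role]) ? $map[$role] : [], true)) {
            return true;
        }
    }
    return false;
}

// A blogger can write content but cannot touch product pricing.
var_dump(userHasPermission(['blogger'], 'create article content', $rolePermissions)); // true
var_dump(userHasPermission(['blogger'], 'update commerce_product', $rolePermissions)); // false
```

Granting each role only the permissions it needs, as above, is what limits the blast radius of a compromised or careless account.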

Demo Drupal Commerce today! View our demo site.

4. Stay Ahead of the Curve

One of the most important things you can do to protect your eCommerce site is make sure that you're always aware of the latest DDoS attacks, web issues, and attacks facing your industry. The more you know about the threats you're facing every day, the easier it will be to establish a strategy that helps you to fight back against them.

Drupal helps eCommerce site owners to stay ahead of the curve, with one of the largest communities in the world, packed full of more than 1 million strategists, developers, and designers. This kind of attention ensures that any time an issue or error in the code might be subject to a new attack, it's duly reported and dealt with.

Aside from the support of the community, Drupal users can also access the "Status Report" function in the UI, which keeps you updated on any problems with the code on your site. This is the easiest way to keep on top of your website management and make sure you're not falling behind on security.

5. Enable the Right Security Modules

When you want to make your home more secure, you add new alarm systems, motion detectors, and even locks. On the other hand, when you want to keep your eCommerce site safe, Drupal offers modules to help you accomplish specific security measures. For instance, there's:

  • Password policy: This module allows you to establish specific rules for passwords. You might require that all passwords contain at least one capital letter and one special character, for instance.
  • Username enumeration prevention: This module stops attackers from discovering valid usernames on your site to target in login attacks.
  • Automated logout: This module allows you to choose a time limit for user sessions. If someone remains inactive for too long, they'll be automatically logged out.
  • Honeypot: This module helps to eliminate spam-bots from using website forms and spamming your administrators and users.

6. Add HTTPS

Building a secure Drupal Commerce website doesn't just protect your business from attacks, it can also give you a better reputation in your chosen eCommerce industry. Adding the HTTPS certificate to your Drupal hosting set-up is a great way to deliver that peace of mind. 

HTTP Secure is what you get when you install an SSL certificate onto your website server. It ensures that cybercriminals can't intercept and tamper with the information sent back and forth between you and a customer. Aside from the obvious protection it offers, an SSL certificate gives you that little green padlock next to your URL, which leaves your customers feeling confident and helps you achieve a better ranking in the search engines. 

In fact, there's no reason not to include an SSL certificate now, since you can get one for free through Let's Encrypt!

Secure eCommerce Starts with a Trusted CMS

In the ever-evolving world of website security, it's difficult to guarantee any company's security. One moment, it can seem as though everything is running smoothly, then the next, you're struggling to retrieve your details from a hacker. The only thing you can do is take every possible step to protect yourself from an intrusion. 

Drupal, with its security modules, password protection, and state-of-the-art open-source software has earned the trust of everyone from industry giants to government agencies. The websites of UNESCO, the White House, Fox News, and Harvard University are all built on the Drupal framework. After all, just because it's open-source doesn't mean that Drupal isn't secure. Every module contributed by a user is thoroughly reviewed by the community. Drupal also has a dedicated security team that is always leading the security initiative. 

With this safety-first approach, Drupal ensures that every eCommerce site you build has the best chance of standing strong against attackers and delighting your customers in the process. 

Contact us and learn more about our custom ecommerce solutions

Raj Jana is the CEO and founder of the JavaPresse Coffee Company. As an eCommerce entrepreneur, Raj knows a thing or two about running a secure website, and he's always looking for new ways to keep his customers safe.

Jun 22 2018
Jun 22

Happy Friday everyone! In this episode, Front End Developer Grayson Hicks joins the show to answer some questions about Gatsby.js, an up-and-coming static site generator that you can use with Drupal or any other data source. Video and full transcription below. 

[embedded content]

Jun 22 2018
Jay
Jun 22

One of the many things Drupal excels at is integrating with other services. Some popular integrations are made even easier by the existence of contributed Drupal modules (such as the one for Google Analytics). But, many times, there isn't a ready-built solution, or the one that's available doesn't quite suit your needs. At Ashday, we've built many integrations between Drupal and other systems, and although every integration is different, there are some things we've learned that are good to consider when writing most any integration.

Do You Need to Build Something New?

Even if there is an existing Drupal module for the integration that you need, it's important to determine if you actually want to use it. Sometimes, integration modules are written for certain use cases, which may not line up with exactly what you need the integration to do, and could in fact make things more complicated for you than not using a pre-existing module at all.

For instance, when we created a Solr integration, we had to evaluate whether to use the existing Search API Solr module or write our own integration, and in the end, we decided to write our own. Why? Because when we were evaluating our options, we found that the module wasn't designed to index the content of our site in the particular way we needed it to, and it also didn't quite properly support the "More Like This"-style search which Solr itself is capable of. Because of this, we only had a few options.

  • We could modify our site to fit the capabilities of the module, but most of the time, as in our case, that isn't really an option.
  • We could use the module and try to adapt it to our needs. This would mean not only installing the module and figuring out how to use it, but also digging into how it was written in order to make changes to it. With some modules, such as Search API Solr, this can be a pretty big task. It's a large, complicated module, and figuring out where and how to make the necessary changes can be a challenge… and that's assuming that the module is structured in such a way that the changes are easy to make.
  • We could write an entirely custom integration.

We wound up going with the third option. "But wait," you might wonder, "If the existing module is so complicated, wouldn't writing your own integration mean needing to create all of that same stuff yourself?"

And the answer to that is no. Whereas a Drupal contrib module usually has to be designed in a rather generic way, so that it can suit as many people's disparate needs as it possibly can, an entirely custom-coded integration can focus solely on the functionality that is needed for your particular use case. Just because the system you're integrating with supports something, doesn't mean you necessarily have to. With the Solr integration in particular, building exactly what we needed as a custom integration wound up being much faster and more maintainable than it probably would have been to take the existing module and modify it to suit our needs.

How Much Drupal Should You Use?

Depending on what sort of integration you're writing, one thing to consider is how much it should rely on Drupal. Drupal includes numerous powerful tools, such as its service architecture and Form API, which can be leveraged to make your integration more flexible, and even save time in the process. At the same time, tying your integration to Drupal makes it less portable. For instance, if you were creating an integration which would display content from a remote system on several different websites, and only some of those sites actually use Drupal, then it may be worth creating the integration mostly using regular ol' PHP. By creating the integration as a set of classes or functions that don't depend on Drupal, each site can then use those same functions, and add just the few things that are needed to make them work well on that site; for instance, a Drupal site may need routes and permissions set up "the Drupal way" separate from the reusable part of the integration.

Creating a more generic integration in this way can also make it more compatible between different versions of Drupal. If your main integration doesn't rely on Drupal, it will work on Drupal 7, 8, 9, and even more, with only the minimal code to make it actually work on each site needing to be changed between them.
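As a hypothetical sketch of this approach, here is a small framework-agnostic client class. The class name and remote API are invented for illustration; nothing in it touches Drupal APIs, so the same code could be reused on Drupal 7, Drupal 8, or a non-Drupal PHP site, with each site adding its own thin glue (routes, permissions, caching) around it.

```php
<?php
// Hypothetical portable integration class: no Drupal dependencies.
class RemoteContentClient {
    private $baseUrl;

    public function __construct($baseUrl) {
        $this->baseUrl = rtrim($baseUrl, '/');
    }

    // Build the URL for a single remote item.
    public function itemUrl($id) {
        return $this->baseUrl . '/items/' . rawurlencode($id);
    }

    // Decode a JSON response body into an array. The HTTP transport
    // (curl, Guzzle, or Drupal's http_client service) is left to the
    // caller, which is what keeps this class portable.
    public function parseResponse($json) {
        $data = json_decode($json, true);
        return is_array($data) ? $data : [];
    }
}

$client = new RemoteContentClient('https://api.example.com/');
echo $client->itemUrl('press-release-7'); // https://api.example.com/items/press-release-7
```

On a Drupal site, a thin wrapper module would register a route and permission and hand the fetched body to `parseResponse()`; another site could call the same class from a plain controller.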

Another example: Say you're integrating with an analytics tool. You could either create a complex configuration form that fits into the Drupal administration interface and uses Drupal's form API, config system, and permissions, or you could just hardcode the specific settings you need somewhere in your codebase. The former way is the more "Drupal" way of doing it and is more configurable (this is the approach taken by the Google Analytics module), but the latter method can be very fast to code, and if you only rarely need to change anything about your configuration, it can be a great time savings over creating a complex Drupal-specific integration module.

Can This Be Contributed?

If your integration ultimately does make use of Drupal's helpful built-in features, great! Now it's time to ask, "Can I contribute this module so that other people can use it on their own Drupal sites?" If you are integrating with a tool or remote system that other people might also want to integrate with in a similar fashion, then the answer should probably be yes.

Contributing your custom integration module to Drupal.org can have a number of benefits—not the least of which is that it is an excellent way to give back to the Drupal community as a way to say "thank you" for all they've done to make Drupal possible. But there's another benefit as well: If you contribute a custom integration module, that means that other developers may start using it, looking at the code, and making improvements to it. Maybe your integration is fairly small, and only accounts for the handful of use cases that you need. The next person who uses the module thinks that something could be added or changed to improve the module. Suddenly, you have other people making suggestions or even writing code for you, which you can then benefit from by including such changes in the integration you're already using.

This is what we did with our HelloSign module for creating eSignature requests; we built it for one site which needed such functionality, then we contributed it. Once we did so, other developers started looking at the code and helping us make all sorts of improvements, and some even submitted patches. We also got free t-shirts out of it, so hey, who knows what might happen when you contribute!

Conclusion

There are many things to think about when you're building a Drupal integration, and although I've only written about three of them today, I think they are the guiding questions which help to determine how your integration is built. If you can answer all three of these questions before getting too far into writing your integration, you're much more likely to end up with a system in place that will serve you well long-term, even if the exact requirements for the integration eventually change—which, in my experience, they almost certainly will.

If you need to integrate Drupal with anything, talk to us! We can integrate just about anything with Drupal.

Jun 22 2018
Jun 22

The e-commerce industry continues to grow rapidly year over year, bringing more merchants online and driving larger profits. With that growth comes the increased need for rich content, innovative product merchandising, and integration into an ever increasing number of third party sales, marketing, and fulfillment tools. Drupal has always excelled as a platform for building unique customer experiences, and it continues to come into its own as an adaptive sales platform via projects like Drupal Commerce.

Photo by Mike Petrucci on Unsplash

This track includes content that helps merchants understand how to start and grow their online businesses, demonstrates to developers how to build ambitious e-commerce sites, and incorporates solution providers who improve the whole process via integrations.

In the e-commerce track you will learn how to start to sell online, how to grow your existing business and reach a wider audience, and the best tools to use for developing your platform.

The track is focused on the following topics:

  • Drupal vs other e-commerce solutions: comparison, the cost of entry and scale
  • What competitive advantages does Drupal bring to online merchants?
  • What are the benefits of Drupal-native eCommerce solutions vs. integrating external systems?
  • Case studies for unique or ambitious implementations of Drupal for e-commerce
  • Latest trends in eCommerce (e.g. payment, fulfillment, security, taxes, etc.)
  • Latest trends in building eCommerce websites (e.g. headless, multichannel, AI, etc.)

About industry tracks

As you’ve probably read in one of our previous blog posts, industry verticals are a new concept being introduced at Drupal Europe and replace the summits, which typically took place on Monday. At Drupal Europe these industry verticals are integrated with the rest of the conference — same location, same ticket and provide more opportunities to learn and exchange within the industry verticals throughout three days.

Now is the perfect time to buy your ticket for Drupal Europe. Session submission is only open for a few more days, so please submit your sessions and encourage others who have great ideas to do the same.

Please help us to spread the word about this awesome conference. Our hashtag is #drupaleurope.

To recommend speakers or topics please get in touch at [email protected].

About the Drupal Europe Conference

Drupal is one of the leading open source technologies empowering digital solutions around the world.

Drupal Europe 2018 brings over 2,000 creators, innovators, and users of digital technologies from all over Europe and the rest of the world together for three days of intense and inspiring interaction.

Location & Dates

Drupal Europe will be held in Darmstadtium in Darmstadt, Germany — which has a direct connection to Frankfurt International Airport. Drupal Europe will take place 10–14 September 2018 with Drupal contribution opportunities every day. Keynotes, sessions, workshops and BoFs will be from Tuesday to Thursday.

Jun 21 2018
Jun 21

Drupal is built on PHP so any developer working with Drupal needs some PHP knowledge. PHP memory management is something that can initially be a difficult concept to grasp.

In this Acro Media Tech Talk video, Rob Thornton covers PHP arrays and how they use memory. He goes over various examples, helping to shed some light on how to use arrays effectively. Along the way, Rob discusses passing arrays by value vs. by reference and shares some tips about each.
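The by-value vs. by-reference distinction Rob covers can be seen in a few lines of plain PHP:

```php
<?php
// PHP arrays are assigned by value (with copy-on-write under the hood),
// so modifying a copy does not touch the original until you explicitly
// opt into a reference with &.
$original = [1, 2, 3];

$copy = $original;   // by value: cheap until one of them is modified
$copy[] = 4;         // triggers a real copy; $original is unchanged

$alias = &$original; // by reference: both names point at the same array
$alias[] = 99;       // this change is visible through $original too

var_dump(count($original));       // 4  (1, 2, 3, 99)
var_dump(count($copy));           // 4  (1, 2, 3, 4)
var_dump(in_array(99, $copy));    // false: the copy split off earlier
var_dump(in_array(4, $original)); // false: 4 only exists in the copy
```

Copy-on-write is why passing large arrays around is usually cheap in PHP, and why an unexpected `&` can make memory usage and behavior surprising.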

If you find this video helpful, you may also be interested in these related topics:

Contact us and learn more about our custom ecommerce solutions

Jun 20 2018
Jun 20
Jeff Geerling @flickr

With Drupal 8 core now in full swing and the contrib space rapidly maturing, it is an excellent time to get more deeply involved with one of the world’s largest open-source development communities. The Drupal + Technology track is focused on educating developers on the latest techniques and tools for increasing the quality and efficiency of their projects.

The Drupal + Technology track is the place for Drupal experts actively working on and contributing to Drupal to share their knowledge and help attendees to grow their Drupal skills.

We expect deeply technical sessions that inspire developers to see what is possible with Drupal. We welcome sessions sharing knowledge about integrating Drupal with bleeding-edge technologies (blockchain, IoT, decoupled frontend, etc) to empower the audience to create amazing digital experiences.

This year, the Drupal Europe program is designed around the idea of industry verticals, with sessions and workshops based on specific industries. We expect a huge number of session submissions in the Drupal + Technology track, so we kindly advise you to check whether an industry track might be a better fit for your talk, giving it a better chance of being accepted.

Be ready to sharpen your skills and connect with other tech-minded folks. Convince your boss to invest in your skills and get you a regular Drupal Europe ticket before prices increase on the 12th of August.

There will also be plenty of contribution opportunities during the event. All levels of expertise and energy are equally welcome!

Susanne Coates @flickr

Call for Papers

The deadline for the call for papers is 30th of June. Share your skills and empower other developers at the Drupal + Technology track. Submit your session now!

About Drupal Europe 2018

Drupal Europe will be held in Darmstadtium in Darmstadt, Germany — with a 15 min direct connection to Frankfurt International Airport. Drupal Europe will take place 10–14 September 2018 and will bring over 2,000 creators, innovators, and users of digital technologies from all over Europe and the rest of the world together for three days of intense and inspiring interaction.

Jun 19 2018
Jun 19
Amazee labs @flickr

Drupal Europe brings a unique opportunity to connect, share and learn from the Drupal community and to talk about what holds us together. We grew to be the biggest open source community under the tagline “Come for the code and stay for the community” which we strive to uphold.

Join us on September 10–14, 2018 in Darmstadt, Germany to discuss and learn about growing and strengthening communities and the challenges that come with that.

Drupal has been a historic example of how open source communities can thrive. To maintain this leading position, we need to learn from each other, include others, and inspire everybody to be an active contributor. This might bring its challenges from time to time, so please come and share your stories, expertise and lessons learned with us. This is the only way to keep our community strong, diverse and open minded.

Who should attend?

You! This vertical topic will be the meeting place for everyone in Drupal and other communities.

Whether you want to organise events, you’re new to the community and want to know where you can get involved, or you want to share a success story from your community, you are welcome.

Target groups:

  • Members of the Drupal community
  • Other open source communities
  • Organisations and those interested in how communities work and prosper

Example talks:

  • Being Human
  • Challenges of contribution
  • Community help
  • Community retention
  • Growing leaders & influencers (by empowering, enabling and adding trust)
  • Growing the Drupal Community
  • Improving diversity
  • Mentorship, sponsorship and allies
  • Organizing events
  • Succession planning for organizers and leaders

As you’ve probably read in one of our previous blog posts, industry verticals are a new concept being introduced at Drupal Europe and replace the summits, which typically took place on Monday. At Drupal Europe, these industry verticals are integrated with the rest of the conference — same location, same ticket — and provide more opportunities to learn and exchange within the industry verticals throughout three days.

Industry vertical icons by @sixeleven

Now is the perfect time to buy your ticket for Drupal Europe. Session submission is already open so please submit your sessions and encourage others who have great ideas.

Please help us to spread the word about this awesome conference. Our hashtag is #drupaleurope.

To recommend speakers or topics please get in touch at [email protected].

About Drupal Europe Conference

Drupal is one of the leading open source technologies empowering thousands of digital solutions around the world.

Drupal Europe 2018 brings over 2,000 creators, innovators, and users of digital technologies from all over Europe and the rest of the world together for three days of intense and inspiring interaction.

Location & Dates

Drupal Europe will be held in Darmstadtium in Darmstadt, Germany — which has a direct connection to Frankfurt International Airport. Drupal Europe will take place 10–14 September 2018 with Drupal contribution opportunities every day. Keynotes, sessions, workshops and BoFs will be from Tuesday to Thursday.

Jun 19 2018
Jun 19

Omnichannel generally means the shopping experience is unified and seamless whether you do it on your laptop, in store, through your phone, etc. The team at Acro Media set out to demonstrate just how easy it is to give your customers a true omnichannel experience using Drupal and Drupal Commerce.

The omnichannel setup

As part of our demo at DrupalCon in Nashville, we did a pseudo T-shirt pre-order. Before the conference, attendees could use our Urban Hipster eCommerce demo site to pre-order a Drupal Commerce shirt in their size. When they completed their pre-order, they got an order number to bring with them to our booth. 

People who didn't pre-order could also come to our booth and "purchase" (for free) a T-shirt using a self-serve kiosk running the same demo site. 

So one side of the booth was set up as the cashier/fulfillment area. The other side had the self-serve kiosk. We also had other laptops available so that we could bring up the admin interface as if we were a customer support person assisting a customer over the phone. The "support person" could find the customer's order number or email address and fulfill the order. Easy peasy.

The whole time, our inventory of shirt sizes was counting down until the stock count hit 0. When our inventory reached 0 for a certain size, orders for that size could no longer be placed.

Why is this so amazing?

Some people were impressed but also a little puzzled, thinking that this sort of setup should just exist everywhere. Which it should, but it doesn't. With most retail stores, the online and in-store experiences are completely separate. They might as well be two different companies. If you buy something online and try to return it in store, it often can't happen. Loyalty points often don't transfer. The list goes on. Some places will let you buy online and pick up in store, but there might be a delay. They might say sure, you can pick it up in store, but not for 24 hours. In that case, you might as well just go to the store and find it yourself. Even knowing if an item is in stock can be tricky. The website might say there are three left, but that's just a snapshot from a certain point in time, and you don't know how often that gets updated. Maybe that was valid six hours ago, but that item has since sold out.

Why Drupal rocks

What makes Drupal so cool is that the point of sale and the Commerce module both use the same orders. A point of sale order is just a Drupal Commerce order. It has some specifics to the point of sale, but it can be loaded up in a regular interface. They use the same stock, the same products, everything. This is surprisingly rare. A lot of POS systems in particular are very antiquated. They date from pre-Internet times and have no concept of syncing up with things.

But we've created a true omnichannel experience. We've done it, and implemented it, and it's all open source and freely available. Anyone else could set up the same omnichannel setup that we did. We used a laptop, a cash drawer, a couple of iPads, nothing too fancy.

What's more, as the software matures, we're working on an even better demo with more smoothed out features, better integration, nicer interface, etc. Stay tuned.

Demo Drupal Commerce today! View our demo site.

More from Acro Media

Let's talk omnichannel!

We're always happy to help you understand how you can deliver a true omnichannel experience for your customers. Contact us today to talk to one of our business development experts.

Contact Us

Jun 18 2018
Jun 18

Last month, Ithaca College introduced the first version of what will represent the biggest change to the college’s website technology, design, content, and structure in more than a decade—a redesigned and rebuilt site that’s more visually appealing and easier to use.

Over the past year, the college and its partners Four Kitchens and design firm Beyond have been hard at work within a Team Augmentation capacity to support the front-to-back overhaul of Ithaca.edu to better serve the educational and community goals of Ithaca’s students, faculty, and staff. The results of the team’s efforts can be viewed at https://www.ithaca.edu.  

Founded in 1892, Ithaca College is a residential college dedicated to building knowledge and confidence through a continuous cycle of theory, practice, and performance. Home to some 6,500 students, the college offers more than 100 degree programs in its schools of Business, Communications, Humanities and Sciences, Health Sciences and Human Performance, and Music.

Students, faculty, and staff at Ithaca College create an active, inclusive community anchored in a keen desire to make a difference in the local community and the broader world. The college is consistently ranked as one of the nation’s top producers of Fulbright scholars, one of the most LGBTQ+ friendly schools in the country, and one of the top 10 colleges in the Northeast.

We emphasized applying automation and continuous integration to focus the team on the efficient development of creative and easy to test solutions.

On the backend, the team—including members of Ithaca’s dev org working alongside Four Kitchens—built a Drupal 8 site. The transition to Drupal 8 keeps the focus on moving the college to current technology for sustainable success. Four Kitchens emphasized applying automation and continuous integration to focus the team on the efficient development of creative and easy-to-test solutions. To achieve that, the team set up automation in CircleCI 2.0 as middleware between the GitHub repository and hosting in Pantheon. GitHub was used throughout the project to implement, automate, and optimize visual regression testing, communication between systems, and a solid workflow, ensuring fast and effective release cycles. Learn from the experiences obtained from implementing the automation pipeline in the following posts:

The frontend focused heavily on the Atomic Design approach. The frontend team utilized Emulsify and Pattern Lab to facilitate pattern component-based design and architecture. This again fostered long-term ease of use and success for Ithaca College.

The team worked magic with content migration. Using the brainchild of Web Chef David Diers, the team devised a plan to migrate portions of the site one by one. Subsites corresponding to schools or departments were moved from the legacy CMS to special Pantheon multidevs that were built off the live environment. Content managers then performed a moderated adaptation and curation process to ensure legacy content adhered to the new content model. A separate migration process then imported the content from the holding environment into the live site. This process allowed Ithaca College’s many content managers to thoroughly vet the content that would live on the new site and gave them a clear path to completion. Learn more about migrating using Paragraphs here: Migrating Paragraphs in Drupal 8

Steady scrum rhythm, staying agile, and consistently improving along the way.

In addition to the stellar dev work, a large contributor to the project’s success was establishing a steady scrum rhythm, staying agile, and consistently improving along the way. Each individual and unit solidified into a team through daily 15-minute standups, weekly backlog grooming meetings, weekly ‘Developer Showcase Friday’ meetings, regular sprint planning meetings, and biweekly retrospective meetings. This has been such a shining success that the internal Ithaca team plans to carry forward this rhythm even after the Web Chefs’ engagement is complete.

Engineering and Development Specifics

  • Drupal 8 site hosted on Pantheon Elite, with GitHub as the canonical source of code and CircleCI 2.0 as the Continuous Integration and Delivery platform.
  • Hierarchical and decoupled architecture based mainly on the use of group entities (Group module) and entity references that allowed the creation of subsite-like internal spaces.
  • Selective use of configuration files through the utilization of custom and contrib solutions like Config Split and Config Ignore modules, to create different database projections of a shared codebase.
  • Migration process based on 2 migration groups with an intermediate holding environment for content moderation.
  • Additional migration groups support the indexing of not-yet-migrated, raw legacy content for Solr search, and the events thread, brought through our Localist integration.
  • Living style guide for site editors, built by integrating Twig components with Drupal templates.
  • Automated visual regression testing.
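The selective configuration approach mentioned above is typically expressed as Config Split definitions. As a hypothetical example (the split name, folder, and module list are invented for illustration, not the project's actual splits), a split that keeps development-only modules out of the live site could look like:

```yaml
# config_split.config_split.dev.yml -- hypothetical example;
# the split name, folder, and modules are assumptions.
id: dev
label: 'Development split'
folder: ../config/dev-split
status: true
weight: 0
module:
  devel: 0              # enabled only where this split is active
  stage_file_proxy: 0
theme: {}
blacklist: {}
graylist: {}
```

The split is then toggled per environment in `settings.php` (e.g. `$config['config_split.config_split.dev']['status'] = TRUE;` on development copies), which is how a single shared codebase yields different database projections of its configuration.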
Aerial view of the Ithaca College campus, from the Ithaca College homepage.

A well-deserved round of kudos goes to the team. As a Team Augmentation project, the success of this project was made possible by the dedicated work and commitment to excellence from the Ithaca College project team. The leadership provided by Dave Cameron as Ithaca Product Manager, Eric Woods as Ithaca Technical Lead and Architect, and John White as Ithaca Developer for all things legacy-system-related was crucial to the project’s success. Ithaca College’s Katherine Malcuria, Senior Digital User Interface Designer, led the creation of design elements for the website.

Katherine Malcuria, Senior Digital User Interface Designer, works on design elements of the Ithaca.edu website

Ithaca Developer Michael Sprague, Web Chef David Diers, Architect, and former Web Chef Chris Ruppel, Frontend Engineer, also stepped in at various points in the project. At the tail end of the project, Web Chef Brian Lewis introduced a new baby Web Chef to the world, so the amazing Randy Oest, Senior Designer and Frontend Engineer, stepped in to push things across the finish line from a frontend perspective. James Todd, Engineer, pitched in as a jack-of-all-trades, helping out wherever needed.

The Four Kitchens Team Augmentation team for the Ithaca College project was led by Brandy Jackson, Technical Project Manager, playing the roles of project manager, scrum master, and product owner interchangeably as needed. Joel Travieso, Senior Drupal Engineer, was the technical lead, backend developer, and technical architect. Brian Lewis, Frontend Engineer, meticulously worked magic in implementing intricate design elements provided by the Ithaca College design team, as well as a third-party design firm, Beyond, at different parts of the project.

A final round of kudos goes out to the larger Ithaca project team; from content to DevOps to quality assurance, there are too many to name. A successful project would not have been possible without their collective efforts as well.

The success of the Ithaca College Website is a great example of excellent team unity and collaboration across multiple avenues. These coordinated efforts are a true example of the phrase “teamwork makes the dream work.” Congratulations to all for a job well done!

Special thanks to Brandy Jackson for her contribution to this launch announcement. 
