Jun 20 2019

What does Drupal 8 do that Laravel does not? What key functionalities, that Drupal ships with, do you need to build from scratch in Laravel? And how would opting for Laravel benefit your specific type of project? In short: Laravel or Drupal 8?

“It's like comparing apples to oranges,” some might say, since one is a framework and the other a CMS.

Even so, if it's unclear to you what their particular use cases and built-in features are, you can't know whether it's a CMS or a framework that best suits your project type, right? That best serves your project-specific needs:
 

  • to be super fast
  • to leverage a solid, off-the-shelf content management system for publishing different pieces of content on the website
  • to feature an easy-to-scale database
  • to support multisite
  • to tap into robust user and content management features that are already implemented
  • to be built on top of a solid framework acting as a reliable back-end application
  • to leverage a highly intuitive admin user interface
  • to be 101% secure
  • to leverage a mixture of server and client-side logic
     

Now, keep your list of project requirements and constraints at hand to evaluate these two technologies' pros and cons against it:
 

1. Drupal 8: Top Benefits, Main Drawbacks, and Specific Use Cases

If a robust user and content management system is critical for your project, then Drupal 8 is the smartest choice. It's the “thing” that Drupal excels at, and one that would take you a whole lot more time to build in Laravel.

And it's not just its robustness that might “lure you in”, but the level of convenience that it provides: a lot of the essential features and functionalities that you might need are already built-in.

Moreover, you can easily manage them and custom-tune them via your admin interface...

By comparison, you'd need to build these functionalities, from the ground up, if you chose to go with Laravel.
 

Top benefits:
 

  • you can rest assured that your website runs on a particularly robust, Symfony-based CMS
  • there's a huge, dedicated community backing it up
  • you get to create various content types, for different parts of your website, assigned with different roles; unlike basic CMSs, which only enable you to write... posts and to create new web pages
  • you can set up different editorial workflows and assign specific user roles, with fine-grained access control
  • you can always further extend its CMS-specific functionalities: extensibility is one of the strongest Drupal 8 benefits
     

Main drawbacks:
 

  • you do need a team of Drupal experts (senior-level preferably) to keep an eye on your Drupal 8 website/app and keep everything properly maintained
  • you can't get away with a “get it up and running and... move on” type of philosophy; Drupal 8 is more of a long-term commitment: there's always a newly launched promising module to consider adding on, a new update to run...
     

Specific Use Cases for Drupal 8:
 

  • large-scale projects that depend on a robust and reliable content management system; one that withstands an intense, ongoing process of creating, editing and publishing lots of fresh content
  • Laravel or Drupal 8? Definitely the latter if it's a multi-site, multi-language web project that you plan to develop; not only does it streamline content publishing across your whole network, but it significantly speeds up localization thanks to its server-side caching capabilities
     

It means that no matter where on the globe your users might be located, they get to access your web pages and have them load... instantly.
 

2. Laravel: Pros, Cons, and Project Types that It's Best Suited For

Laravel stands out as a highly reputed, powerful PHP framework.

If:
 

  • maintainability is one of your biggest concerns
  • you're looking for a robust framework
  • you need to carry out your project fast enough
  • you need a framework that ships with all the latest functionalities
     

... then Laravel is what you need.
 

Top Benefits:
 

  • a fast-growing, devoted community
  • you can easily integrate LDAP authentication 
  • it leverages the Model-View-Controller architecture
  • it's just... fast
  • it provides you with a great admin user interface
  • it “spoils” you with intuitive, beautifully written code
  • it ships with a heavy “toolbox”: scan through and pick the most suitable one(s) for your project
  • in-built code for social login and sending out emails
  • everything you might need to set up during the development process is right there, already integrated into your code: cron jobs, database queries, routes... (see the sketch below)
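
For instance, routes and scheduled tasks (Laravel's take on cron jobs) are each just a few lines in a stock Laravel application. This is a minimal sketch assuming Laravel 5.x-era syntax; the controller, route path, and Artisan command name are made up for illustration:

// routes/web.php: a basic route definition.
Route::get('/reports', 'ReportController@index');

// app/Console/Kernel.php: a scheduled task defined with the built-in scheduler.
protected function schedule(Schedule $schedule)
{
    $schedule->command('reports:send')->dailyAt('07:00');
}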
     

Main drawbacks:
 

  • more often than not identifying performance issues isn't that straightforward
  • upgrading to the latest version of Laravel can turn out to be quite a challenge: be prepared for “buggy scenarios” and for the need to rewrite code
  • you can't just jump straight to Laravel: learning the basics of OOP first is a must
     

Specific Use Cases:
 

  • your project needs a back-end application (rather than an off-the-shelf CMS)
  • when the benefits of the MVC architecture (faster development process, suitable for large-scale projects, multiple views, etc.) are critical for the given project 
  • whenever you need to mix the client-side with server logic
  • whenever the time factor is the main concern for you: you just need your project developed super fast
     

3. So... Laravel or Drupal 8? 

Now, I'm sure that you already anticipate my answer:

The choice depends strictly on your project requirements and objectives.

On your own hierarchy of priorities in terms of features and functionalities.

And depending on these key aspects, which should be clearly defined, one technology will benefit you over the other.

So... what type of project are you looking to build?

Photo by Raquel Martínez on Unsplash 

Jun 20 2019

We’ve been starting many of our projects using Acquia’s Lightning distribution. This gives a good, consistent starting point and helps speed development through early adoption of features that are still in the works for Drupal 8. Like other distributions, Lightning bundles Drupal Core with a set of contributed modules and pre-defined configuration.

While Lightning is a great base to start from, sometimes you want to deviate from the path it provides. Say for example you want to use a Paragraphs based system for page components, your client has a fairly complex custom publishing workflow, and you also have different constraints for managing roles. Out-of-the-box, Acquia Lightning has a number of features you may find yourself in conflict with. Things like Lightning Layout provide a landing page content type that may not fit the needs for the site. Lightning Roles has a fairly hard-coded set of assumptions for role generation. And while it is a good solution for many sites, Lightning Workflow may not always be the right fit.

You may find yourself tempted to uninstall these modules and delete the configuration they brought to the party, but things are not always that simple. Because of the inter-relationships and dependencies involved, simply uninstalling these modules may not be possible. Usually everything looks fine at first; then, when it comes time for a deployment, things fall apart quickly.

This is where sub-profiles can save the day. By creating a sub-profile of Acquia Lightning you can tweak Lightning’s out-of-the-box behavior and include or exclude modules to fit your needs. Sub-profiles inherit all of the code and configuration from the base profile they extend. This gives the developer the ability to take an install profile like Acquia Lightning and tweak it to fit her project’s needs. Creating a sub-profile can be as easy as defining it via a *.info.yml file.

In our example above, you may create a sub-profile like this:

name: 'example_profile'
type: profile
description: 'Lightning sub-profile'
core: '8.x'
base profile: lightning
themes:
  - mytheme
  - seven
install:
  - paragraphs
  - lightning_media
  - lightning_media_audio
  - lightning_media_video
exclude:
  - lightning_roles
  - lightning_page
  - lightning_layout
  - lightning_landing_page

This profile includes dependencies we’re going to want, like Paragraphs – and excludes the things we want to manage ourselves. This helps ensure that when it comes time for deployment, you should get what you expect. You can create a sub-profile yourself by adding a directory and info.yml file in the “profiles” directory (see the sketch below), or, if you have Drupal Console and you’re using Acquia Lightning, you can follow Acquia’s instructions: Lightning’s Drupal Console command will walk you through a wizard to pick and choose modules you’d like to exclude.
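
For the manual route, the layout is minimal: a directory named after the profile, containing the *.info.yml file shown above. Roughly, assuming the profile lives directly in the docroot’s “profiles” directory:

profiles/
  example_profile/
    example_profile.info.yml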

Once you’ve created your new sub-profile, you can update your existing site to use this profile. First, edit your settings.php and update the ‘install_profile’ setting.

$settings['install_profile'] = 'example_profile';

Then, use Drush to make the profile active.

drush cset core.extension module.example_profile 0

Once your profile is active and in-use, you can export your configuration and continue development.

Jun 20 2019

Acquia Content Cloud, a new content-as-a-service solution for simplified content creation and syndication across multi-channel digital experiences, is now available in private beta.

Earlier this week at our Acquia Engage conference in London, Acquia announced a new product called "Content Cloud", a headless, SaaS-based content-as-a-service solution built on Drupal.

Years ago, we heard that organizations wanted to:

  • Create content that is easy to re-use across different channels, such as websites and mobile applications, email, digital screens, and more.

  • Use a content management system with a modern web service API that allows them to use their favorite front-end framework (e.g. React, Angular, Vue.js, etc) to build websites and digital experiences.

As a result, Acquia spent the last 5+ years helping to improve Drupal's web services capabilities and authoring experience.

But we also heard that organizations want to:

  • Use a single repository to manage all their organization's content.
  • Make it really easy to synchronize content between all their Drupal sites.
  • Manage all content editors from a central place to enable centralized content governance and workflows.
  • Automate the installation, maintenance, and upgrades of their Drupal-based content repository.

All of the above becomes even more important as organizations scale the number of content creators, websites and applications. Many large organizations have to build and maintain hundreds of sites and manage hundreds of content creators.

So this week, at our European customer conference, we lifted the curtain on Acquia Content Cloud, a new Acquia product. Acquia Content Cloud is a content-as-a-service solution that enables simplified, headless content creation and syndication across multi-channel digital experiences.

For now, we are launching an early access beta program. If you’re interested in being considered for the beta or want to learn more as Content Cloud moves toward general availability, you can sign up here.

In time, I plan to write more about Content Cloud, especially as we get closer to its initial release. Until then, you can watch the Acquia Content Cloud teaser video below:

Jun 19 2019

Earlier this month we launched a redesign of the Commonwealth Fund’s Health System Data Center, a platform for exploring state health system data through custom tables, graphs and maps. With interactive visualizations covering dozens of topics, roughly 100 indicators, and tens of thousands of individual metrics, the platform helps make underlying data actionable for advocates, policy makers, and journalists tackling healthcare system issues all over the country.

We used Drupal 8 to build the data center backend, and we used React and Highcharts to render its interactive charts and graphs. Drupal 8’s flexible entity storage system made it a perfect fit for housing the data. Its capabilities for leveraging third-party APIs and JavaScript libraries made integrating with React and Highcharts far simpler than other alternatives.

We’re all incredibly excited to see the Health System Data Center live. For us at Aten, this is the latest in a series of project launches dealing with data visualization. Along the way, we’ve been working on a collection of tools specifically tailored to the unique needs of data-intensive projects. Here are six Drupal 8 modules that help solve specific challenges when working with data. (Note that some of these are sandbox modules. While sandbox modules don’t have official releases, you can still download the code, try them out, and of course, get involved in the issue queue!)

Six Drupal Modules for Working with Datasets

Datasets

We’ve worked on a lot of data projects that use a common architecture. Typically, projects include a collection of Datasets, each of which references a variable number of specific Indicators and Metrics. This module provides custom entities and related functionality for quickly deploying this common architecture.

JS Entity

When embedding multiple instances of a JavaScript application on a page (in this specific case, a React app), we often need a way to very quickly pass data to the DOM. This module provides a configurable approach for defining which fields should be passed directly to Drupal’s JavaScript API for each specific view mode. It also offers a number of configuration options, including the ability to rename properties (or field names) to match what your application is looking for.
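
Conceptually, the module streamlines what you would otherwise wire up by hand: picking field values off an entity and attaching them to drupalSettings so the embedded app can read them. Here is a minimal hand-rolled sketch of that pattern, for comparison only; the “jsEntity” settings key and the field_chart_data field are hypothetical:

/**
 * Implements hook_ENTITY_TYPE_view() for node entities.
 */
function mymodule_node_view(array &$build, $node, $display, $view_mode) {
  // Hand-rolled equivalent of exposing field data to JavaScript for a
  // specific view mode; JS Entity makes this configurable per view mode.
  if ($view_mode === 'full') {
    $build['#attached']['drupalSettings']['jsEntity'][$node->id()] = [
      'title' => $node->label(),
      'chartData' => $node->get('field_chart_data')->value,
    ];
  }
}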

JS Component

Here at Aten, data projects often involve dynamic visualizations built as JavaScript applications (specifically with React or Vue) that are both embedded within a page rendered by Drupal and leverage data stored in Drupal’s entity system. This module provides an easy way for developers to define JavaScript apps entirely in YAML configuration files, which are exposed in Drupal automatically as blocks. Since they are ultimately just blocks, defined applications can be added to pages by any of the typical means.

MarkJS Search

We often need a way for users to quickly search through a lengthy list of indicators. This module provides fast, responsive highlighting and filtering for search input by leveraging the 3rd-party Mark.js JavaScript library.

Entity Importer

Site owners need a way to keep data accurate, relevant and up-to-date. This module provides a drag-and-drop interface for Drupal’s migrate functionality, making it easy to upload datasets as a series of CSV files. (Learn more from an earlier post: Entity Import: A User Interface for Drupal 8 Migrations.)

Entity Extra Field

When working with JavaScript applications exposed in Drupal as custom blocks, we often want a way to push those blocks directly into the node view page. This module provides a way for site builders to define Extra Fields on entities, which can be blocks, views, or tokens. Extra Fields can be placed and rearranged like any other entity field. (Entity Extra Fields module leverages Drupal’s “Extra Field” system. To learn more about Extra Fields, read Placing Components with Drupal's Extra Fields).

Let’s Talk

If you’re considering a data project for your organization and having trouble getting started, we’d love to help – whether that means talking through long-term goals, responding to a formal RFP, or anything in between. Get in touch and let’s talk about your data.

Jun 19 2019


Our frequent DrupalCamp attendee and speaker, DevOps engineer Tess Flynn, returns to the podcast to recap her recent experience at Flyover Camp, a brand new Drupal camp in Kansas City, Missouri.

Host: Ivan Stegic
Guest: Tess Flynn, DevOps Engineer at TEN7

Podcast highlights: 

  • We in the midwest totally own the “flyover” jokes!
  • The continuing diversity in camp talks (business, self-care, human focus tracks)
  • Tess reviews both her talks (Return of the Clustering: Kubernetes for Drupal, and Health Check Your Site)
  • How you should stretch your mind to prepare for all the rapid-fire information you get in the Kubernetes talk!
  • Location, location, location is as important for conference talks as it is for real estate
  • Listen to Ivan and Tess geek out over the Raspberry Pi session

Links:

TRANSCRIPT

IVAN STEGIC: Hey everyone you're listening to the TEN7 Podcast, where we get together every fortnight and sometimes more often, to talk about technology, business, and the humans in it. I'm your host Ivan Stegic. Let's talk about Drupal Flyover Camp 2019, that happened from Friday the 31st of May to Sunday, the 2nd of June, in Kansas City. Joining me to give her thoughts is socketwench. That's wench, not wrench. Welcome back to the podcast.

TESS FLYNN: Hello.

IVAN: Now, did I say it the right way, because I know you always have a specific way of saying it when you give your intro to socketwench.

TESS: Well, that’s pretty close. That works.

IVAN: Close? Okay. Good. So, you were at a Flyover Camp. What's in a name? I just love how Flyover Camp were poking fun at themselves in Kansas. I mean, we're pretty much in flyover land here in Minneapolis too, so I totally get it.

TESS: [laughing] So let's first frame what that is because if we're having international listeners, they might not get what the reference is.

IVAN: Good idea.

TESS: So, the thing that goes with it is, if you're from the Midwest you're considered in flyover country. And the reason why is because the joke goes, that there is nothing in the United States that's of interest unless if you're on either coast, which is actually completely untrue. However, that is what a lot of people tend to think of it. So as a result, if you're in the Midwest you kind of go, Well, you know, what we're going to own that turf.

IVAN: Exactly.

TESS: We're going to go and name things after it and take that world.

IVAN: I love it. I love that they did that. Drupal Flyover Camp in Kansas City, Missouri. And so, this is a brand-new camp, right? This is the first time they've ever done this camp. How great is that? We have a new camp on the schedule.

TESS: Yeah, I was surprised that it was new because they hit everything running. It felt like this was a well-oiled machine for a camp.

IVAN: That's wonderful. It's wonderful to have that on the calendar again. So, well-oiled machine. Did you recognize any of the organizers? Maybe these people have done it before.

TESS: I think that I recognized a few people from…oh, what is their name…VML and YL. What are they called now, because they merged with somebody?

IVAN: I don't know.

TESS: VML Y&R. Wow, that is a mouthful.

IVAN: What?

TESS: Victor, Mike, Lima, Yankee and Romeo.

IVAN: Okay. What are they, a global marketing agency that needs a new name? [laughing]

TESS: [laughing] That is their new name. [laughing]

IVAN: [laughing] Okay. I don't even know how to say it.

TESS: They used to be two different companies that got merged, and this is the resulting name.

IVAN: Oh, it's on their BOF page. If you're looking for something about VML you can still visit the VML website. If you're looking for something about Y&R, don’t sweat, you can still visit the Y&R. So, it's basically like you said, a concatenation of their former names. Maybe it's just temporary. Ok. A little bit of a tangent. [laughing] So, some sort of experience in Flyover Camp organization. Sounds like you said they were a well-oiled machine. It was a three-day camp?

TESS: I believe so. There was a day of trainings which I did not attend, and then two days of sessions, which actually has been bucking the trend lately.

IVAN: Yeah. And also, from what I can tell there were contributions as well on Sunday so, maybe it was a four-day camp, if there were trainings as well.

TESS: Might’ve been.

IVAN: Yeah. So, you were there Friday and Saturday. It looked like they had numerous tracks. So, I thought, usually these camps have five tracks and then you have five rooms and people go to the room for the track that they're interested in. This felt like it had a dozen tracks, but three rooms and it sort of was interspersed track sessions and BOFs as well amongst these three rooms. Is that what it was like? I mean I’m only gauging from the website.

TESS: So, you know the thing with the tracks is that a lot of the time it depends on how promoted they are as their own top-level entity in the data, as it were. And some camps do a very good job of this, that they have this track, this track and this track. I think DrupalCon recently reorganized so that there's only particular tracks that they directly advertise to different audiences, like a business audience, a frontend audience, something like that. Some camps have a lot of tracks and they're not particularly consistently organized, or if they are, it doesn't feel like that when you're attending because you don't tend to notice it, and Flyover Camp seemed to fall into this latter category. That's not bad but it's just a thing.

IVAN: Yeah. And I love that the tracks were so diverse as well, right? There was security, QA, site building, the usual frontend/backend stuff and there was a self-care track as well. I mean, more of that please. More mental health stuff, more business stuff, more human focus sessions. I love it.

TESS: Mm hmm.

IVAN: I love it. I think that's awesome. And, it looked like there were about 30 sessions, so similar to Drupaldelphia and those 30 sessions were spread across two days as opposed to one day at Drupaldelphia.

TESS: Yeah and it seemed to attract a lot of people from the area. I mean I was there from Minnesota and I saw people that usually I see at DrupalCorn there as well. So, it attracted a lot of people from the Midwest.

IVAN: That's wonderful. And there were BOFs as well, and it kind of looked like they were spread out across the two days as well.

TESS: I think there were, but I was so focused on other stuff that I completely missed it.

IVAN: Yeah. This was a heck of a camp for you. I mean it wasn't one session it was two sessions.

TESS: Oh man, it was a double feature. That was hard.

IVAN: I'm sure you absolutely shone on that and I'm sure you did really well. So, let's talk about those two sessions. So, your first session on Friday was the famous cloaked talk, Kubernetes called Return of the Clustering, right? The third part of the trilogy. So that was Friday. And then Saturday you gave a talk essentially about the Healthcheck module, right? What can you do to keep tabs on the health of your Drupal site?

TESS: Well, it was also about site auditing as well, in general.

IVAN: That’s right, and site auditing. So, I guess the critical question here is, did you wear a costume for both talks?

TESS: So, here's the problem with that. I don't have a car. And in order to actually get the costume for that one I would have probably had to rent a car to go to a local thrift store chain called Ax Man surplus and see if I could find like a stethoscope or whatever that little satellite dish head gear thing that they wear, I forget what it's called, and see if I could shove one of those into my luggage. But I didn't have the time to do that. Every weekend that I've had lately has just been completely booked up.

IVAN: Well maybe we'll have to work on that if you get asked to do that talk again and we'll figure out another costume for you.

TESS: Well, rumor has it that's going to happen. [laughing]

IVAN: [laughing] So, comparatively, how were the two sessions attended? Was there a drop of people on Saturday compared to Friday or was it comparable?

TESS: It was actually the other way around. I think a lot of people find the Kubernetes talk is fun, but it can be very intimidating because it seems like, “oh, that's a very devopsy, very technical talk and it's going to be way over my head.” And I was able to attract some people to come to it, especially by making a fool out of myself, by dressing up like a Jedi and standing outside of my door waving a lightsaber to have people come and join the session. But, it was a smaller room and it was still well attended, but the site audit talk actually had a lot more people in it, mostly because it was also in the main auditorium, so a lot of people who were just there were also just there, but there was a lot of people paying attention to it as well, because it tends to be a really fun, engaging talk and it tends to appeal to a much broader audience than the Kubernetes talk, which tends to be more infrastructury devopsy people. Even though I try to make that as broadly appealing as I can.

IVAN: So, location, location, location. Right. You had a wonderful location in the auditorium for that talk.

TESS: Yeah. The only downside is that when you're in an auditorium you're usually on a pedestal or a dais or something like that, and the problem is that it sounds like I'm a T-Rex walking around on stage, because the thing is hollow so the microphone just picks up everything, and I don't tend to stay still when I give a talk, I tend to gesticulate and walk around and do lots of weird things.

IVAN: Jump around I believe you do as well. [laughing]

TESS: [laughing] Yeah. Well, I think the site audit talk, I also fall to my knees at one point, dramatically. [laughing]

IVAN: [laughing] It's a good talk.

TESS: I still remember skinning my knee at DrupalCorn.

IVAN: Well, it’s a good talk. I think it gets valuable as you do that. It certainly reminds people how important it is. Right? So, what do you think the biggest question was that people had from that health check talk, from that audit talk?

TESS: You know, I didn't get many questions. I was actually thinking about this a few days ago. I tend not to get that many questions directly after a talk because usually my talks last the entire amount of time, and afterwards, I have to rush out the door for the next person to start setting up their talk. And usually I don't get many questions, and I do try to anticipate a lot of the potential questions as well within the contents of the talk. So sometimes people will come by and ask me questions later, but that hasn't happened lately.

IVAN: It's a similar sort of thing for both talks then.

TESS: Yeah. I did have a nice conversation with someone, I think they're from the U of Kansas. I forget. I remember their face. I know that they go to DrupalCorn regularly too, but they were telling me about Kubernetes operators and all of that nifty technical stuff and that was a really interesting conversation to have, but it really wasn't a question.

IVAN: Well that actually leads me into my next question. Usually you're the one educating people about whatever you're talking about. What do you think your biggest takeaway from a session was in Flyover camp? What did you learn from each of them?

TESS: Ok. Geez, I’m trying to remember all the sessions I went to because Twin Cities Drupal was last weekend, and now I'm trying to remember any of the sessions I went to.

IVAN: Oh, I think maybe you misunderstood. What I meant was, what did you learn in your talk from the audience?

TESS: Oh. So, one thing that definitely occurred to me is that, when it comes to the Kubernetes talk is, just how much technical knowledge you need, all the technical terms you need to pick up very, very quickly to get anywhere with understanding Kubernetes without feeling like you're “drowning,” in technical terms, all of a sudden. And I certainly had that experience myself just trying to learn Kubernetes in the first place and that is after having a very strong background in how containers work and how Docker works and some of the top terminologies I picked up from running production workloads in Docker Swarm, and I realized that after that talk, like, wow, in 45 minutes I take you from, you kind of, sort of know what Docker is, and you might have heard of Ansible, but you don't know too much about it, to, here’s how you run a Drupal site in production on Kubernetes using a simple effective formula. And that kind of struck me as Wow, not many people are doing that because, wow that can be really complicated.

IVAN: Yeah, it's the bleeding edge of it isn't it?

TESS: It's not just the bleeding edge, it’s just that the underlying design that I went for strives for minimalism and simplicity, and a lot of people find that appealing because it reduces the number of working parts that you have to know. A good example would be memcached. The way that it's presented in the talk is as a stateful set and that works. A lot of people will say, "What you should do is run it as another object called a daemon set." But in order to introduce a daemon set, I'd have to introduce a completely new object type that only works for that, and afterwards it's like, "Is that really necessary to talk about it?” “How often do you add or remove nodes?” If you are already thinking about adding and removing nodes, you're probably going to look this stuff up yourself. So, I don't need to actually tell you about this in this talk. [laughing]

IVAN: Yeah, I love that you're able to educate people in one session even at a very high level. To go from, kind of knowing to, being interested in the technology and in what we're doing and in being interested in continuing to find out more. And, maybe that's a good reason to do a separate podcast just on the talk you gave and the contents of the talk and why are we doing that? Why is TEN7 investing as much as we are in Kubernetes and in Docker and in Drupal, and, you know, sending you to all of these camps, and then putting all of this work into the open source domain? Like, maybe there's enough there to talk about. I mean, just from my perspective, we want to be independent, and using a hosting solution that is supported in the open source that is vendor agnostic. And, if we're doing it for ourselves, there's no reason why we couldn't put it out there and have others learn and leverage from it as well. So, we should probably talk about this a little more in a separate podcast.

TESS: That's not a bad idea at all.

IVAN: I love it. Okay. We'll do that. We'll ask Jonathan to make that happen for us. Okay, so, a little more about Kubernetes. I was looking through the schedule of talks and as you, Tess, know, Raspberry Pis are really near and dear to my heart. I've used them for many different things at home, most recently as an ad blocker for the whole network, but I saw that Jeff Geerling was at Flyover Camp, and he had a talk about the cluster of RPis, or the Raspberry Pis, that he's been building since 2012-2013, something like that, and how it taught him everything he knows about Kubernetes. Did you catch that talk by any small chance?

TESS: I actually did go to that talk.

IVAN: You're kidding?

TESS: Because I was like, Oh that sounds really fun and I'd like to see what he does. Is he going to use straight K8s or is he going to use that K3 that I heard about? And, what was funny to me is that I remember watching a talk that he gave, not about Kubernetes, but about Ansible. Way, way, way back in the day at MidCamp with a very similar block of a Raspberry Pi cluster in a box. And I really wanted to see what he was going to do with this. So, sure, I went to it.

IVAN: And was it everything you wished it could be? I mean, I looked at the slides and there was a shout out to socketwench in one of the slides.

TESS: Yeah, I was like thankful. I was in the front row and no one could see how I was blushing [laughing] the entire time. Like, Oh, stop talking about me please. This is your talk. [laughing]

IVAN: [laughing] That’s great. I mean, you guys are related and connected by Kubernetes, so, how wonderful that that would be the case. So, can you give me a quick synopsis of the talk? What was the nugget that you took out of it?

TESS: So, what was interesting is that Jeff has built a small Kubernetes cluster using a standard distribution Kubernetes to run on, I think it's four or five Raspberry Pis.

IVAN: I think it’s four.

TESS: I think it's four now. Four Raspberry Pis with a single ethernet switch with power over ethernet so that it reduces the amount of additional circuitry and cables he has to carry around to power them altogether.

IVAN: Hold up. Hold up. He's actually powering the RPis now through power over ethernet? That's amazing. Of course, you could.

TESS: You can get an adapter board for that.

IVAN: That's awesome.

TESS: It's not really complicated.

IVAN: That’s so great. I'm sorry. I totally interrupted you there. What was the nugget?

TESS: [laughing] Well he didn't even mention the power over ethernet except for one thing, but I was looking at the screenshots like, Oh, you’re using power over ethernet now. Nice. [laughing]

IVAN: Nice. That’s nice.

TESS: So, a lot of the talk was about how he was running his own personal site, using a Raspberry Pi cluster out of his home network. And, I used to run my own single node server out of a home network way back, many, many years ago. And there's a number of challenges that come with that out of the box. You tend not to get static IPs from most ISPs. They'll get a Dynamic IP, some of them don't like that you have a significant amount of outbound traffic or incoming traffic that's coming from the net and they may block you for that reason, if you're on a particular service tier. Some ISPs are better at that than others, it really depends. But running his own site on a Kubernetes cluster on Raspberry Pis it's like, it reminds me of this meme that I saw passed around Kubernetes Twitter a while ago, which is, the subtext is, I deployed my blog on Kubernetes and it's this big semitrailer and it has a toy truck trailer box in the middle of it, completely dwarfed by the full size trailer. [laughing] That’s kind of like, Yeah that's pretty accurate.

IVAN: Well, I mean if you ever get reddited or slashdotted, I guess maybe it'll survive?

TESS: [laughing] Kind of. There’s a degree of front side caching I think that he also used. This kind of a project always comes across to me as not a serious, You should use this instead of traditional hosting and more like, I wanted to see if I could do that and it would be fun and it's something to do and it's something that lets me learn by doing. And that's you know, a worthy pursuit in its own right.

IVAN: But if you look at the other side of that coin, you're hosting your own website, you own the hardware, you own the software, you own your site, you can see it, you're not putting the risk of hosting in another large company's data center, right? You own it all from top to bottom, and honestly if you have a small blog and you're using your ISPs connection, and you have this overkill of a Raspberry Pi cluster that is powering the static site, you're probably not going to ever get enough traffic to bring that thing down. You're probably fine.

TESS: Probably not.

IVAN: Yeah.

TESS: Although I think Jeff's site is Drupal 8.

IVAN: Oh [laughing] so, not static, not static. Well, I'm very jealous of you getting to see that talk. That must’ve been pretty cool. I'm hoping that maybe we can get Jeff on the podcast to talk about his cluster and what he's been through and how it's evolved soon. So, Jeff if you happen to be listening, watch out for an email from us about that.

Let's talk about diversity at Flyover Camp. What did it look like? Were there the kind of usual cast of people that look like I do, white males, or what did that look like amongst attendees and speakers this year?

TESS: So, there certainly is a large contingent of white straight cis male people there as well. There were a lot of women there as well, and there were several POC as well. I didn't actually take any moment to really do any kind of headcounts on that. It just never crossed my mind to log that kind of information. But I did sit with several people which were really fascinating and really interesting to talk to, and that was really nice.

IVAN: I hope we can have more of that and more attention to that in the future and we'll try to continue to talk about it and bring it up in our podcast as well. What about attendance as a whole at Flyover Camp? Was it comparable to Drupaldelphia or to Twin Cities Drupal Camp? Did you get a feel for what it was like?

TESS: I think that it was more closer to the size of Twin Cities Drupal than Drupaldelphia. Drupaldelphia had a surprising amount of people in it. And it could have been complicated by the space, because it was a smaller space than Flyover Camp or Twin Cities Drupal, but there was certainly a large number of people there.

IVAN: Any particular sessions besides Jeff’s, that were memorable to you?

TESS: Oh geez, I'd have to look it over because so much of it was kind of a blur. I was kind of sad that I missed John Rearick's session about 45 Modules and Forty-Five Minutes. I caught the end of it. But, yeah, that would have been a really interesting talk to go to, because literally every slide has a timer. So, the talk is only 45 minutes long. So every slide is only a minute long.

IVAN: So, it's kind of like an ignite session, where it's 30 seconds per slide, 20 slides, something like that?

TESS: Mm hmm. I saw one by Ria Dixon called CloudWatch-ing, which was all about creating logs and alerts using AWS CloudWatch. That was really fascinating. And, it makes me wonder if there is a way to create similar mechanisms and use similar strategies in a purely open source implementation that doesn't rely on AWS's productized version of that.

IVAN: What is CloudWatch?

TESS: It's kind of an event and log tracking mechanism meant for distributed logging. There's a lot of that I didn't get into because it was mostly a case study about how they implemented it, and how they solved their own problems. There was a lot of additional research that I'd love to circle back to, but it was a really good session and I really enjoyed it.

IVAN: So, is CloudWatch then, kind of similar to Splunk?

TESS: I think that it's a bit similar to Splunk. I know that there might be part of that that's similar to Prometheus and Grafana which is a common Kubernetes logging mechanism.

IVAN: Yeah, Prometheus is pretty widespread as well, isn't it?

TESS: Mm hmm.

IVAN: Yeah. Okay. So, a couple of good sessions. Generally, you had a good time at the camp, gave two wonderful talks. Where was the camp? Was it at the University?

TESS: I believe it was. It was a pretty good location, although because it is in the middle of Missouri, I did have a problem getting to and from the Camp, because I didn't have a car rental. So I ended up walking there and that was a 20 minute walk in Missouri in June, which was a bit warmer than I’m used to. [laughing]

IVAN: Yeah, I guess the flipside of that is it could have been Missouri in December or January.

TESS: I mean I would have been fine with that but that’s me, I like winter. [laughing]

IVAN: [laughing] Well, go figure. I like it too. So, what do you know? Okay. So, the event venue was good. The attendance was comparable to TCDrupal. And before we wrap up, overall impression of the event? If there's another one next year are you going to go?

TESS: I would love to go again. It was a lot of fun to go there. And it’s a lot more interesting than I had expected it to be, which kind of lives up to the name. [laughing]

IVAN: [laughing] That's great. Well, I appreciate the time you spent with us today once again. Thank you so much for being with me. It's really been a pleasure.

TESS: Mm hmm.

IVAN: Well, Tess Flynn, or socketwench, is the DevOps Engineer here at TEN7, and she was just at Drupal Flyover Camp 2019, where she gave her talk, “Return of the Clustering: Kubernetes for Drupal.” Of course, that's the third in a trilogy and the other talk, “Dr. Upal Is In - Healthcheck your Site.” Those slides are all online and recordings of the sessions are also available. Just visit this episode's webpage for those links. You’ve been listening to the TEN7 Podcast. Find us online at ten7.com/podcast. And if you have a second, do send us a message, we love hearing from you. Our email address is [email protected]. Until next time, this is Ivan Stegic. Thank you for listening.

Jun 19 2019

Over the past few months working on migrations to Drupal 8, researching best practices, and contributing to core and contributed modules, I discovered that there are several tools available in core and contributed modules, plus a myriad of how-to articles. To save you the trouble of pulling it all together yourself, I offer a comprehensive overview of the Migrate module plus a few other contributed modules that complement it in order to migrate a Drupal site to Drupal 8.

Let's begin with the most basic element: migration files.

Migration files

In Drupal 8, the Migrate module splits a migration into a set of YAML files, each of which is composed of a source (like the node table in a Drupal 7 database), a process (the field mapping and processing), and a destination (like a node entity in Drupal 8). Here is a subset of the migration files present in core:

$ find core -name *.yml | grep migrations
core/modules/statistics/migrations/statistics_settings.yml
core/modules/statistics/migrations/statistics_node_counter.yml
core/modules/shortcut/migrations/d7_shortcut_set.yml
core/modules/shortcut/migrations/d7_shortcut.yml
core/modules/shortcut/migrations/d7_shortcut_set_users.yml
core/modules/tracker/migrations/d7_tracker_node.yml
core/modules/tracker/migrations/d7_tracker_settings.yml
core/modules/tracker/migrations/d7_tracker_user.yml
core/modules/path/migrations/d7_url_alias.yml
...

And below is the node migration, located at core/modules/node/migrations/d7_node.yml:

id: d7_node
label: Nodes
audit: true
migration_tags:
  - Drupal 7
  - Content
deriver: Drupal\node\Plugin\migrate\D7NodeDeriver
source:
  plugin: d7_node
process:
  nid: tnid
  vid: vid
  langcode:
    plugin: default_value
    source: language
    default_value: "und"
  title: title
  uid: node_uid
  status: status
  created: created
  changed: changed
  promote: promote
  sticky: sticky
  revision_uid: revision_uid
  revision_log: log
  revision_timestamp: timestamp
destination:
  plugin: entity:node
migration_dependencies:
  required:
    - d7_user
    - d7_node_type
  optional:
    - d7_field_instance
    - d7_comment_field_instance

Migration files can be generated dynamically via a deriver, as the node migration above does: it uses D7NodeDeriver to generate a migration for each content type's data and revision tables (yielding derived IDs such as d7_node:article). On top of that, migrations can be classified via the migration_tags section (the migration above has the Drupal 7 and Content tags).

Configuration vs content migrations

Migration files may have one or more tags to classify them. These tags are used for running them in groups. Some migration files may have the Configuration tag—like the node type or field migrations—while others might have the Content tag—like the node or user migrations. Usually, you would run the Configuration migrations first in order to configure the new site, and then the Content ones so the content gets fetched, transformed, and inserted on top of such configuration.

Notice though that depending on how much you are planning to change the content model, you may decide to configure the new site manually and write the Content migrations by hand. In the next section, we will examine the differences between generating and writing migrations.

Generating vs Writing migrations

Migration files living within core, contributed, and custom modules are static files that need to be read and imported into the database as migration plugins before they can be executed. This process, depending on the project needs, can be implemented in two different ways:

Using Migrate Upgrade

Migrate Upgrade module implements a Drush command to automatically generate migrations for all the configuration and content in an existing Drupal 6 or 7 site. The best thing about this approach is that you don’t have to manually create the new content model in the new site since Migrate Upgrade will inspect the source database and do it for you by generating the migrations.

If the existing content model won’t need to go through major changes during the migration, then Migrate Upgrade is a great choice to generate migrations. There is an API that developers can interact with in order to alter migrations and the data being processed. We will see a few examples further down in this article.

Writing migrations by hand

If the content model will go through a deep reorganization such as merging content from different sources into one, reorganizing fields, and changing machine names, then configuring the new site manually and writing content migrations may be the best option. In this scenario, you would write the migration files directly to the config/sync directory so then they can be imported via drush config:import and executed via drush migrate:import.

Notice that if the content model has many entity types, bundles, and fields, this can be a tough job, so even if the team decides to go this route, generating content migrations with Migrate Upgrade can be useful since the resulting migrations can serve as templates for the ones to be written. A minimal hand-written migration is sketched below.
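
Just to illustrate the shape such a file takes, here is a minimal sketch of a hand-written node migration that could live in config/sync. Everything in it is hypothetical: the machine name, the source node type, and the mappings would be adapted to your own content model, and real migrations would usually map many more fields:

# config/sync/migrate_plus.migration.custom_article.yml (hypothetical)
id: custom_article
label: 'Custom article migration'
migration_tags:
  - Content
source:
  plugin: d7_node
  node_type: article
process:
  title: title
  status: status
  created: created
  changed: changed
  uid: node_uid
destination:
  plugin: entity:node
  default_bundle: article
migration_dependencies:
  required:
    - upgrade_d7_user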

Setting up the new site for running migrations

Assuming that we have a new site created using the Composer Drupal Project and we have run the installer, we need to require and install the following modules:

$ composer require drupal/migrate_tools drupal/migrate_upgrade drupal/migrate_plus
$ drush pm:enable --yes migrate_tools,migrate_upgrade,migrate_plus

Next, we need to add a database connection to the old site, which we would do at web/sites/default/settings.local.php:

// The default database connection details.
$databases['default']['default'] = [
  'database' => 'drupal8',
  'username' => 'root',
  'password' => 'root',
  'prefix' => '',
  'host' => '127.0.0.1',
  'port' => '3306',
  'namespace' => 'Drupal\\Core\\Database\\Driver\\mysql',
  'driver' => 'mysql',
];

// The Drupal 7 database connection details.
$databases['drupal7']['default'] = [
  'database' => 'drupal7',
  'username' => 'root',
  'password' => 'root',
  'prefix' => '',
  'host' => '127.0.0.1',
  'port' => '3306',
  'namespace' => 'Drupal\\Core\\Database\\Driver\\mysql',
  'driver' => 'mysql',
];

With the above setup, we can move on to the next step, where we will generate migrations.

Generating migrations with Migrate Upgrade

The following command will read all migration files, create migrate entities out of them and insert them into the database so they are ready to be executed:

$ drush --yes migrate:upgrade --legacy-db-key drupal7 --legacy-root sites/default/files --configure-only

It is a good practice to export the resulting migrate entities as configuration so we can track their changes. Therefore, we will export configuration after running the above command, which will create a list of files like the following:

$ drush --yes config:export
 [notice] Differences of the active config to the export directory:
+------------------------------------------------------------------+-----------+
| Config                                                           | Operation |
+------------------------------------------------------------------+-----------+
| migrate_plus.migration.upgrade_d7_date_formats                   | Create    |
| migrate_plus.migration.upgrade_d7_field                          | Create    |
| migrate_plus.migration.upgrade_d7_field_formatter_settings       | Create    |
| migrate_plus.migration.upgrade_d7_field_instance                 | Create    |
| migrate_plus.migration.upgrade_d7_field_instance_widget_settings | Create    |
| migrate_plus.migration.upgrade_d7_file                           | Create    |
| migrate_plus.migration.upgrade_d7_filter_format                  | Create    |
| migrate_plus.migration.upgrade_d7_filter_settings                | Create    |
| migrate_plus.migration.upgrade_d7_image_styles                   | Create    |
| migrate_plus.migration.upgrade_d7_menu                           | Create    |
| migrate_plus.migration.upgrade_d7_menu_links                     | Create    |
| migrate_plus.migration.upgrade_d7_node_article                   | Create    |
| migrate_plus.migration.upgrade_d7_node_revision_article          | Create    |
...

In the above list, we see a mix of Configuration and Content migrations being created. Now we can check the status of each migration via the migrate:status Drush command:

$ drush migrate:status
-----------------------------------------------------------------------------------------------------
 Migration ID                                 Status   Total   Imported   Unprocessed   Last Imported
-----------------------------------------------------------------------------------------------------
 upgrade_d7_date_formats                      Idle     7       0          0
 upgrade_d7_filter_settings                   Idle     1       0          0
 upgrade_d7_image_styles                      Idle     193     0          0
 upgrade_d7_node_settings                     Idle     1       0          0
 upgrade_d7_system_date                       Idle     1       0          0
 upgrade_d7_url_alias                         Idle     25981   0          0
 upgrade_system_site                          Idle     1       0          0
 upgrade_taxonomy_settings                    Idle     0       0          0
 upgrade_d7_path_redirect                     Idle     518     0          0
 upgrade_d7_field                             Idle     253     0          0
 upgrade_d7_field_collection_type             Idle     9       0          0
 upgrade_d7_node_type                         Idle     16      0          0
 upgrade_d7_taxonomy_vocabulary               Idle     34      0          0
 upgrade_d7_field_instance                    Idle     738     0          0
 upgrade_d7_view_modes                        Idle     24      0          0
 upgrade_d7_field_formatter_settings          Idle     1280    0          0
 upgrade_d7_field_instance_widget_settings    Idle     738     0          0
 upgrade_d7_file                              Idle     5731    0          0
 upgrade_d7_filter_format                     Idle     6       0          0
 upgrade_d7_menu                              Idle     5       0          0
 upgrade_d7_user_role                         Idle     6       0          0
 upgrade_d7_user                              Idle     82      0          0
 upgrade_d7_node_article                      Idle     322400  0          0
 upgrade_d7_node_page                         Idle     342     0          0
 upgrade_d7_menu_links                        Idle     623     0          0
 upgrade_d7_node_revision_article             Idle     742577  0          0
 upgrade_d7_node_revision_page                Idle     2122    0          0
 upgrade_d7_taxonomy_term_tags                Idle     1729    0          0
-------------------------------------------- -------- ------- ---------- ------------- ---------------

You should inspect any contributed modules in use on the old site and install them in the new one, as they may contain migrations. For example, if the old site is using the Redirect module, by installing it and generating migrations like we did above, you should see a new migration provided by this module ready to go.

Running migrations

Assuming that we decided to run migrations generated by Migrate Upgrade (otherwise skip the following sub-section), we would first run configuration migrations and then content ones.

Configuration migrations

Here is the command to run all migrations with the tag Configuration along with their dependencies.

$ drush migrate:import --tag=Configuration --execute-dependencies
 [notice] Processed 10 items (10 created, 0 updated, 0 failed, 0 ignored) - done with 'upgrade_d7_date_formats'
 [notice] Processed 5 items (5 created, 0 updated, 0 failed, 0 ignored) - done with 'upgrade_d7_filter_settings'
 [notice] Processed 20 items (20 created, 0 updated, 0 failed, 0 ignored) - done with 'upgrade_d7_image_styles'
 [notice] Processed 10 items (10 created, 0 updated, 0 failed, 0 ignored) - done with 'upgrade_d7_node_settings'
 [notice] Processed 1 items (1 created, 0 updated, 0 failed, 0 ignored) - done with 'upgrade_d7_system_date'
 [notice] Processed 5 items (5 created, 0 updated, 0 failed, 0 ignored) - done with 'upgrade_system_site'
 [notice] Processed 100 items (100 created, 0 updated, 0 failed, 0 ignored) - done with 'upgrade_d7_field'
 [notice] Processed 200 items (200 created, 0 updated, 0 failed, 379 ignored) - done with 'upgrade_d7_field_instance'
...

The above command will migrate site configuration, content types, taxonomy vocabularies, view modes, and such. Once this is done, it is recommended to export the resulting configuration via drush config:export and commit the changes. From then on, if we make changes in the old site’s configuration or we alter the Configuration migrations (we will see how further down), we will need to roll back the affected migrations and run them again.

For example, the Media Migration module creates media fields and alters field mappings in content migrations, so after installing it we should run the following commands to re-run them:

$ drush --yes migrate:rollback upgrade_d7_view_modes,upgrade_d7_field_instance_widget_settings,upgrade_d7_field_formatter_settings,upgrade_d7_field_instance,upgrade_d7_field
$ drush --yes migrate:import --execute-dependencies upgrade_d7_field,upgrade_d7_field_instance,upgrade_d7_field_formatter_settings,upgrade_d7_field_instance_widget_settings,upgrade_d7_view_modes
$ drush --yes config:export

Once we have executed all Configuration migrations, we can run content ones.

Content migrations

Content migrations are straightforward. We can run them with the following command:

$ drush migrate:import --tag=Content --execute-dependencies

Logging and tracking

Migrate keeps track of all the migrated content via the migrate_* tables. If you check out the new database after running migrations, you will see something like this:

  • A set of migrate_map_* tables, storing the old and new identifiers of each migrated entity. These tables are used by the migrate:rollback Drush command to roll back migrated data.
  • A set of migrate_message_* tables, which hold errors and warnings that occurred while running migrations. These can be seen via the migrate:messages Drush command, as shown below.
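
For example, to review what went wrong in a particular migration you could run something like the following (the migration ID matches the article migration generated earlier; adjust it to the migration you are inspecting):

$ drush migrate:messages upgrade_d7_node_article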

Rolling back migrations

In the previous section, we rolled back the field migrations before running them again. This process is great for reverting imported Configuration or Content, which you will do often while developing a migration.

Here is an example. Let’s suppose that you have executed content migrations but articles did not get migrated as you expected. The process you would follow to fix them would be:

  1. Find the migration name via drush migrate:status | grep article.
  2. Roll back migrations with drush migrate:rollback upgrade_d7_node_revision_article,upgrade_d7_node_article.
  3. Perform the changes that you need, either directly in the exported migrations at config/sync or by altering them and then recreating them with migrate:upgrade as we did in Generating migrations with Migrate Upgrade. We will see how to alter migrations in the next section.
  4. Run the migrations again with drush migrate:import upgrade_d7_node_article,upgrade_d7_node_revision_article.
  5. Verify the changes and, if needed, repeat steps 2 to 4 until you are done.

Migrate events and hooks

Before diving into the APIs to alter migrations, let’s clarify that there are two processes that we can hook into:

  1. The migrate:upgrade Drush command, which reads all migration files in core, contributed, and custom modules and imports them into the database.
  2. The migrate:import Drush command, which runs migrations.

In the next sub-sections, we will see how we can interact with these two commands.

Altering migrations (migrate:upgrade)

Drupal core offers hook_migration_plugins_alter(), which receives the array of available migrations that migrate:upgrade creates. Here is a sample implementation in mymodule.module where we delegate the logic to a service:

/**
 * Implements hook_migration_plugins_alter().
 */
function mymodule_migration_plugins_alter(array &$migrations) {
  $migration_alterer = \Drupal::service('mymodule.migrate.alterer');
  $migration_alterer->process($migrations);
}
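
For the service lookup above to work, mymodule.migrate.alterer has to be declared in mymodule.services.yml. A minimal sketch, assuming the class lives at src/MigrationAlterer.php in the module:

services:
  mymodule.migrate.alterer:
    class: Drupal\mymodule\MigrationAlterer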

And here is a subset of the contents of the service:

class MigrationAlterer {

  /**
   * Processes migration plugins.
   *
   * @param array $migrations
   *   The array of migration plugins.
   */
  public function process(array &$migrations) {
    $this->skipMigrations($migrations);
    $this->disablePathautoAliasCreation($migrations);
    $this->setContentLangcode($migrations);
    $this->setModerationState($migrations);
    $this->persistUuid($migrations);
    $this->skipFileCopy($migrations);
    $this->alterRedirect($migrations);
  }

  /**
   * Skips unneeded migrations.
   *
   * @param array $migrations
   *   The array of migration plugins.
   */
  private function skipMigrations(array &$migrations) {
    // Skip unwanted migrations.
    $migrations_to_skip = [
      'd7_block',
      'd7_comment',
      'd7_comment_entity_display',
      'd7_comment_entity_form_display_subject',
      'd7_comment_field',
      'd7_comment_entity_form_display',
      'd7_comment_type',
      'd7_comment_field_instance',
      'd7_contact_settings',
    ];
    $migrations = array_filter($migrations, function ($migration) use ($migrations_to_skip) {
      return !in_array($migration['id'], $migrations_to_skip);
    });
  }

  // The remaining methods would go here.

}

In the next section, we will see how to alter the data being migrated while running migrations.

Altering data while running migrations (migrate:import)

Drupal core offers hook_migrate_prepare_row() and hook_migrate_MIGRATION_ID_prepare_row(), which are triggered before each row of data is processed by the migrate:import Drush command. Additionally, there is a set of events that we can subscribe to, such as before and after the migration starts or before and after a row is saved.

On top of the above, Migrate Plus module exposes an event that wraps hook_migrate_prepare_row(). Here is a sample subscriber for this event:

namespace Drupal\mymodule\EventSubscriber;

use Drupal\migrate\MigrateSkipRowException;
use Drupal\migrate_plus\Event\MigrateEvents;
use Drupal\migrate_plus\Event\MigratePrepareRowEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

class MyModuleMigrationSubscriber implements EventSubscriberInterface {

  /**
   * {@inheritdoc}
   */
  public static function getSubscribedEvents() {
    // React to the prepare-row event that Migrate Plus dispatches for each row.
    return [MigrateEvents::PREPARE_ROW => 'onPrepareRow'];
  }

  /**
   * Prepare row event handler.
   *
   * @param \Drupal\migrate_plus\Event\MigratePrepareRowEvent $event
   *   The migrate row event.
   *
   * @throws \Drupal\migrate\MigrateSkipRowException
   *   If the row needs to be skipped.
   */
  public function onPrepareRow(MigratePrepareRowEvent $event) {
    $this->alterFieldMigrations($event);
    $this->skipMenuLinks($event);
    $this->setContentModeration($event);
  }

  /**
   * Alters field migrations.
   *
   * @param \Drupal\migrate_plus\Event\MigratePrepareRowEvent $event
   *   The migrate row event.
   *
   * @throws \Drupal\migrate\MigrateSkipRowException
   *   If a row needs to be skipped.
   * @throws \Exception
   *   If the source cannot be changed.
   */
  private function alterFieldMigrations(MigratePrepareRowEvent $event) {
    $field_migrations = [
      'upgrade_d7_field',
      'upgrade_d7_field_instance',
      'upgrade_d7_view_modes',
      'upgrade_d7_field_formatter_settings',
      'upgrade_d7_field_instance_widget_settings',
    ];

    if (in_array($event->getMigration()->getPluginId(), $field_migrations)) {
      // Here are calls to private methods that alter these migrations.
    }
  }

  /**
   * Skips menu links that are either implemented or not needed.
   *
   * @param \Drupal\migrate_plus\Event\MigratePrepareRowEvent $event
   *   The migrate row event.
   *
   * @throws \Drupal\migrate\MigrateSkipRowException
   *   If a row needs to be skipped.
   */
  private function skipMenuLinks(MigratePrepareRowEvent $event) {
    if ($event->getMigration()->getPluginId() != 'upgrade_d7_menu_links') {
      return;
    }

    $paths_to_skip = [
      'some/path',
      'other/path',
    ];

    $menu_link = $event->getRow()->getSourceProperty('link_path');
    if (in_array($menu_link, $paths_to_skip)) {
      throw new MigrateSkipRowException('Skipping menu link ' . $menu_link);
    }
  }

  /**
   * Sets the content moderation field on node migrations.
   *
   * @param \Drupal\migrate_plus\Event\MigratePrepareRowEvent $event
   *   The migrate row event.
   *
   * @throws \Exception
   *   If the source cannot be changed.
   */
  private function setContentModeration(MigratePrepareRowEvent $event) {
    $row = $event->getRow();
    $source = $event->getSource();

    if (('d7_node' == $source->getPluginId()) && isset($event->getMigration()->getProcess()['moderation_state'])) {
      $state = $row->getSourceProperty('status') ? 'published' : 'draft';
      $row->setSourceProperty('moderation_state', $state);
    }
    elseif (('d7_node_revision' == $source->getPluginId()) && isset($event->getMigration()->getProcess()['moderation_state'])) {
      $row->setSourceProperty('moderation_state', 'draft');
    }
  }

}
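
As with any event subscriber, the class also needs to be registered as a tagged service, for example in mymodule.services.yml (a minimal sketch, matching the namespace used in the snippet above):

services:
  mymodule.migration_subscriber:
    class: Drupal\mymodule\EventSubscriber\MyModuleMigrationSubscriber
    tags:
      - { name: event_subscriber }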

Conclusion

When you are migrating a Drupal site to 8, Migrate Upgrade module does a lot of the work for free by generating Configuration and Content migrations. Even if you decide to write any of them by hand, it is convenient to run the command and use the resulting migrations as a template for your own. In the next article, we will see how to work seamlessly on a migration while the rest of the team is building the new site.

Thanks to Andrew Berry, April Sides, Karen Stevenson, Marcos Cano, and Salvador Moreno for their feedback and help. Photo by Barth Bailey on Unsplash.

Jun 19 2019
Jun 19

Jen Lampton (Backdrop user account), co-founder of Backdrop CMS, senior Drupal developer at Jeneration.com joins Mike Anello and Ryan Price to get reacquainted with Backdrop and to discuss why it could be a good long-term solution for sites after Drupal 7's end-of-life.

Discussion

DrupalEasy News

Upcoming Events

Sponsors

Follow us on Twitter

Subscribe

Subscribe to our podcast on iTunes, Google Play or Miro. Listen to our podcast on Stitcher.

If you'd like to leave us a voicemail, call 321-396-2340. Please keep in mind that we might play your voicemail during one of our future podcasts. Feel free to call in with suggestions, rants, questions, or corrections. If you'd rather just send us an email, please use our contact page.

Jun 19 2019
Jun 19

In the last article, we programmed a new module which created CSV files from data on a different server. Now, we need to migrate this data into our live-system. As first step we need to create a commerce store to which our products and product variations belong. Otherwise, they cannot be imported.

Commerce product and product variations

Afterwards we need a Commerce product type and a Commerce product variation type. A product type stores a general product, while a product variation type stores different variations of said product type. E.g. "Pair of cotton pants" would be defined as a product type, while "Pair of cotton pants in red/blue/yellow, size s/m/l" would be defined as a product variation type.

Our product type is a 'room' and as fields, we want to store an id, a title and an image carousel showcasing the room.

Our product variation type needs to display different variations of a 'room'. Here we also need the id and additional fields for a variation title and a description of the room variation.

Feed settings

After these types are finished, we need to define two corresponding feed types. Both feed types need almost the same settings.

Feeds Tamper basic settings

The fetcher is 'Directory' and as parser we choose 'CSV'. As processor we choose 'product' for our rooms and 'product variation' for our room variations. As product type/product variation type we choose the Commerce types we just created in the first step.

The other settings can be changed to personal liking, except the default delimiter in the parser settings, which needs to be a comma (,) because that is what we chose for our CSV files.

Now the feed type mapping has to be set up. The feeds will only work if all mandatory fields are mapped correctly. We need to select 'Feed: Authored by' and 'Feed: Authored on' as source fields. Additionally, we need 'Status' and 'SKU', which must be set for Commerce products. These are the minimum settings we need for the feeds to run. Our product type room also needs 'Store', otherwise Commerce cannot import the product. Store and status need to be imported from the CSV files; authored by and authored on don't need to be. Store has to be the store for which the product and product variation types were created.

feeds mapping

We select all other fields from the CSV file we'd like to map to our product type / product variation type and save.

If we were to import now, the variations would not appear. They are linked to their corresponding products by the SKU defined in the CSV file, but the Feeds module needs help to understand the link. The Feeds Tamper module adds another tab, 'Tamper', to the feed type settings. Here we create a plugin for our room feed type: we select 'Explode', choose the SKU as the field and '|' as the delimiter, because that is what we chose during the CSV file creation.
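
For illustration, a purely hypothetical row of the rooms CSV could carry the variation SKUs in one column, separated by the pipe character:

id,title,sku,store,status
1,"Standard room","room-1-basic|room-1-deluxe|room-1-suite",1,1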

feeds tamper

Creating Feeds and importing

After defining the feed types, we have to create actual feeds from them under Content > Feeds. All we need is the path to our CSV file and the correct delimiter ','. After creating one feed for the products and one for the product variations, all we have to do is activate them. Feeds belong to website content, which means they need to be imported like content; otherwise you have to create them again, and they can't be exported to config files.

feeds

When actually importing, the order is important. We need to import the product variations before the products, otherwise they will not appear. The Feeds module creates a temporary table with the newly imported product variations, compares their SKUs with the SKUs of newly imported products and connects them accordingly. It does not work the other way around.

Jun 19 2019
Jun 19

As a follow up of the articles from my colleagues Philipp and Blaize regarding Open Source as a business model and a career perspective, I would like to describe WHY and HOW we can contribute to Drupal in our daily work, on client projects.

This article is aimed at the stakeholders of a project. The second one from this series will focus on the technical aspect of contributing to Drupal in this context, and thus will be more developer oriented.

The WHY


The Developer Perspective
I will cite my colleague Blaize here: This is not simply about developing your taste, or reading code, but learning about the collaborative nature of programming. Contributing is the perfect opportunity to apply some inherent collaboration principles. Mentoring and pair programming enables your team to share knowledge and skills, cultivates mutual respect and ultimately improves the quality of the codebase and the documentation.

The Client / PM Perspective
Among other benefits, generalizing the components of a client project for Open Source can lower costs and technical debt and favours a clear, well-documented maintenance flow. Generalizing these components enforces modularity and documentation, and thus offers flexibility while working with several teams or onboarding new team members. Gathering feedback from a broader user group helps to identify new use cases and edge cases earlier.

The Community
The Drupal ecosystem is powered and diversified by contributions, so encourage your team to join community initiatives and promote the company expertise.

The HOW


Identify Opportunities
Here is an excerpt from our internal documentation that summarises pretty well the leading theme that drives our development team:

Maximize open source: Lower initial costs, technical debt and maintenance costs by using and contributing to open source code as much as possible. For every feature required by a project that is not solvable by configuration or theming, try to find a generic solution that can be contributed (...)

Set the Contribution Scope


Extend an existing Drupal project
Contributions should improve the ecosystem, so first check whether existing solutions already match a use case. Extending an existing project can be achieved by:

  • Porting a project to Drupal 8
    
The required solution could exist for a previous version of Drupal. The Drupal 8 Contrib Porting Kanban can be used to get an idea of the remaining tasks. 

  • Creating a patch or a submodule

    An existing contributed project could meet, let’s say, 90% of your use case. The remaining 10% can be covered with a patch that is then contributed back.

  • Writing unit tests

    If the contributed project is critical for your use case, widening the test coverage might be a good way to ensure that every feature is working properly.

  • Making a project Drupal 9 ready

    Help to remove deprecations for Drupal 9 compatibility.


Create a New Project
The initial development of a generic solution can cost more (time/budget-wise), averaging 2 to 3 times more. So before starting, here are a few points to consider. Does it implement a generic feature? Can the exceptions be made configurable? Can this set of features be shared between projects? If it can be deployed in at least 2 or 3 projects as is, that is probably a good indicator.

Can the feature be isolated into a generic (packagist) library or reuse a generic library? This is often the case for third-party libraries like APIs. Is the initial release feature complete or does it allow progressive enhancements? It will give a hint about the needed maintenance time. Can it be maintained by the company that develops it? Does it fit internal or community feedback/contribution?

Integration to the Company Culture


Defining a development workflow will save you time
As a first evaluation, if it’s uncertain that the project will be reused, choose a generic name (not client specific) so it could be shared at a later stage without modifying the codebase. Use an easy-to-grasp name, one that reflects a single responsibility. If the project is doing too many things, it could be divided into several projects or submodules. 

Create a readme to allow quick developer onboarding. Move it as a standalone repository (GitHub, GitLab, Drupal.org, ...) and require it via Composer. If possible, split the generic features into packagist libraries and wrap them into your Drupal project.

If you are using untagged branches (e.g. a unique dev branch), set a commit hash via Composer to be sure that you are using the right version in your client project. Make your project discoverable internally, also to non-developers, especially when there are several teams, as it will favour cross-project management. Make it public by creating a Drupal.org release and discoverable externally: document it, share it on social media, write a blog post.
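
For the commit pinning mentioned above, Composer lets you append a commit hash to a dev branch constraint when the package is installed from source; a hypothetical composer.json requirement could look like this (package name and hash are made up):

"require": {
    "drupal/my_generic_module": "dev-1.x#4f2a9c1"
}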

Developer Onboarding and Documentation
If the project is released on Drupal.org, make sure that your team is familiar with the contribution processes like semantic versioning, maintenance, release and external contribution policy, and the code of conduct.

Add your team members as project maintainers and introduce them to the contribution tooling: how to create patches and interdiffs, apply coding standards (phpcs / phpcbf), write unit tests, …

The sprint participant cards could be a great way to gamify your internal contribution sprints.

Sprint participant cards

Open up your project to an internal/external contribution by making sure everybody can start quickly. Create or update the readme file and project documentation (site builder and developer) and set a clear roadmap by creating task issues on Drupal.org so the goals and current status are clear. Use the issue template summary and compare other related modules on the project page.

Resources

In our next article, we will give a brief introduction to the following points:

  • Work on the Drupal.org issue queue: create patches, re-rolls and reviews

  • Publish a project on Drupal.org and be prepared for Drupal 9



Jun 19 2019
Jun 19

Joris Snoek - Business Dev

+31 (0)20 - 261 14 99

Last month we worked on a project where we implemented a progressively decoupled Drupal platform. We needed a React.js frontend to do all kinds of magic that was less available in Twig. Also, this way frontend engineers and backend engineers could work more loosely coupled, which is great.

The main reason we chose progressively decoupled instead of fully decoupled is that we wanted to make use of Drupal's roles, permissions and authentication, plus some other Drupal-native stuff that would become very cumbersome to deal with in a fully headless setup.

Check out this blog on 'How to decouple Drupal in 2019' and find out if you also need Drupal beheading or not.

React libraries dynamically in Drupal

The target was to implement the React.js libraries in Drupal pages, so React would load within Drupal and we could use all native Drupal goodies like authentication.

By the way: you can also use Drupal authentication when fully decoupled, check this article

So, the thing with React is: every time you release a new version of your app, you'll have to create a new build, which sort of looks like this:
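
For a typical create-react-app project that build output is roughly the following (file hashes invented purely for illustration):

build/
  asset-manifest.json
  index.html
  static/
    css/main.0c231b57.chunk.css
    js/main.8a3b5dfe.chunk.js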

More specifically, the .js and .css files we need to load in Drupal are defined in asset-manifest.json:
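
An excerpt of such a manifest might look like this (again, the hashes are made up for illustration):

{
  "main.css": "/static/css/main.0c231b57.chunk.css",
  "main.js": "/static/js/main.8a3b5dfe.chunk.js"
}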

So, those files need to be loaded in Drupal, and they change every time you run a new build of your React app, for example with npm run build.

But you can't add these React JavaScript libraries to your theme's YOURTHEME.libraries.yml, because with the next build the links will be different.

hook_library_info_build() to the rescue

Aaaah, so Drupal 8 provides us with hook_library_info_build() that will save our day (づ。◕‿‿◕。)づ

Now, how we did this:

Implement the Drupal hook in your .module file (for highlighting's sake, I added .php, lose that. #nobrainer):
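
A minimal sketch of such a hook implementation (the module name, the library key and the path to the React build are assumptions, not the exact code from the project):

/**
 * Implements hook_library_info_build().
 */
function mymodule_library_info_build() {
  $libraries = [];

  // Location of the manifest written by `npm run build`; adjust this to where
  // the React build ends up inside your docroot.
  $manifest_path = DRUPAL_ROOT . '/react/build/asset-manifest.json';
  if (!file_exists($manifest_path)) {
    return $libraries;
  }

  $manifest = json_decode(file_get_contents($manifest_path), TRUE);
  foreach ($manifest as $asset) {
    // Register every .js and .css file from the manifest under one library.
    if (substr($asset, -3) === '.js') {
      $libraries['react-app']['js']['/react/build' . $asset] = ['preprocess' => FALSE];
    }
    elseif (substr($asset, -4) === '.css') {
      $libraries['react-app']['css']['theme']['/react/build' . $asset] = ['preprocess' => FALSE];
    }
  }

  return $libraries;
}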

As you see, it will read React's asset-manifest.json and register all .js and .css files of React as a library in Drupal.

Next up, you can render this library for example in a Controller like this:
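
A sketch of what that could look like, assuming the hook above lives in a module named mymodule (so the library is addressed as mymodule/react-app):

/**
 * Returns a page that mounts the React app.
 */
public function reactApp() {
  return [
    // The element the React build mounts into.
    '#markup' => '<div id="root"></div>',
    '#attached' => [
      'library' => [
        'mymodule/react-app',
      ],
    ],
  ];
}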

Wrap up

So that's how we integrated React.js within Drupal pages and combined the power of a React frontend with the power of Drupal goodies, in a 'progressively decoupled' way.

Please let me know if you have any questions.

Jun 18 2019
Jun 18

While many things stayed the same as previous years, such as the camp location and month it is held, this year was a first for many small changes.

Programming
The first big change was programming. Unlike previous years, where there was 1 day of training, and two full days of sessions and tracks (like you would have at a DrupalCon), this year’s schedule had one day of training, and only one day of traditional sessions.

The third day of the conference was dedicated to “unconferencing,” where those attending decided as a group what topics to discuss, and everyone participated in round table discussions for the day. The feedback was extremely positive for the new schedule, so it seems likely to stick.

The @TCDrupal unconference is underway! pic.twitter.com/WqaQsNL2rk

— Twin Cities Drupal (@TCDrupal) June 8, 2019

Welcome Party
The second big change was adding a new social event, the Welcoming Party. While the first night of camp has traditionally included a party for sponsors and speakers, this year it was opened up to ALL camp attendees. One of the core values of TCDC has always been inclusiveness, and this small change just made a lot of sense.

Community Sponsors
The third big change related to sponsors. There were still a number of great organizations who stepped up to help sponsor the conference, but this year included a dedicated push to encourage more “community sponsors.” The conference was still a very affordable $50/person, but during registration, there was an ask for those who use Drupal professionally to give more, if they could afford to.

And people really came through! While there had been some individual sponsors in the past, 2019 had more community sponsors than ever before. 34 people joined in, and helped keep the conference both affordable to all, and in the black.

Jun 18 2019
Jun 18

Our normally scheduled call to chat about all things Drupal and nonprofits will happen this Thursday, June 20, at 1pm ET / 10am PT. (Convert to your local time zone.)

Feel free to share your thoughts and discussion points ahead of time in our collaborative Google doc: https://nten.org/drupal/notes

We have an hour to chat so bring your best Drupal topics and let's do this thing!

Some examples to get your mind firing: how do I recreate [feature] on my Drupal 7 site in Drupal 8? I need to explain [complicated thing] to a non-technical stakeholder -- any advice? How can I get Drupal and my CRM to play nicely?

This free call is sponsored by NTEN.org but open to everyone.

View notes of previous months' calls.

Jun 18 2019
Jun 18

Drupal is lucky to benefit from a very active community of developers meaning that there is a wide and varied range of contributed modules to extend the functionality of Drupal core. In this article we’ll take a quick look at 10 interesting contributed modules; some are well known whilst others are a little bit more obscure.

1. Admin_menu (D7) / Admin_toolbar (D8)

Out of the box the Drupal admin interface can be a bit unwieldy and whilst this has been significantly improved over the years, especially with the advent of Drupal 8, there’s still room for improvement. Enter admin_menu/admin_toolbar which are two similar modules to make navigating the admin interface a whole lot easier by providing a neat little toolbar with drop downs so you can navigate the whole admin interface from any page of your site.

2. Kraken

This module allows you to use the kraken.io web service to optimise images on your website. It works by exposing a Kraken optimise image style effect which can be applied to image styles on your Drupal website.

3. Popup_field_group

This is a nice little module maintained by ComputerMinds which gives the option to display the children of a field group in a popup overlay. Buttons are exposed to toggle the popup.

4. Flood_unblock

Drupal 7 introduced a feature to prevent brute force attacks meaning that no more than five failed login attempts per user in any six hour period or no more than 50 failed attempts from an IP address in a one hour period are allowed. Failed login attempts are recorded in the flood table and this module gives administrators an easy way to unblock users that have exceeded these limits.

5. Paragraphs

Paragraphs give much greater control over content creation on your Drupal site. A paragraph is a set of fields which has its own theme associated with it to give much greater flexibility over how content gets rendered on your site. So for example you might have a paragraph which floats an image left and displays text on the right - the possibilities are endless. Take a look at tiles in action to find out more about working with paragraphs (we use the term tiles to mean the same thing!)

6. Stage_file_proxy

This module is useful when working on a development version of your Drupal site by providing a proxy to the production site’s files directory. When you need a production file, the module maps the request to the production files directory and downloads the file to your development files directory.

7. Field_display_label

This is a nice little module to allow you to change a field label on the edit form so that it’s different from what’s rendered when the field is displayed. So for example you might have a name field labelled ‘what’s your name?’ on the edit form which just renders ‘name’ when it’s displayed.

8. Custom_add_another

Another simple module maintained by ComputerMinds which gives site admins the ability to customise the ‘add another’ text for multi valued fields.

9. Notice_killer

This is a nice little module that will split out PHP notices and warnings from other Drupal notices and also logs a bit more information about each so that you can track them down and fix them more easily.

10. Rabbit_hole

This is a useful module that prevents certain entities from being viewable on their own page. So for example if you have an image content type which you never want to be accessible on node/xx then this is the module for you!

Jun 18 2019
Jun 18

The new version of Acquia Lift makes it simpler for marketers to run website personalization campaigns themselves, without the need for writing code.

Today, we released a new version of Acquia Lift, our web personalization tool.

In today's world, personalization has become central to the most successful customer experiences. Most organizations know that personalization is no longer optional, but have put it off because it can be too difficult. The new Acquia Lift solves that problem.

While before, Acquia Lift may have taken a degree of fine-tuning from a developer, the new version simplifies how marketers create and launch website personalization. With the new version, anyone can point, click and personalize content without any code.

We started working on the new version of Acquia Lift in early 2018, well over a year ago. In the process we interviewed over 50 customers, redesigned the user interface and workflows, and added various new capabilities to make it easier for marketers to run website personalization campaigns. And today, at our European customer conference, Acquia Engage London, we released the new Acquia Lift to the public.

You can see all of the new features in action in this 5-minute Acquia Lift demo video:

The new Acquia Lift offers the best web personalization solution in Acquia's history, and definitely the best tool for Drupal.

Jun 18 2019
Jun 18

Agiledrop is highlighting active Drupal community members through a series of interviews. Now you get a chance to learn more about the people behind Drupal projects.

Meet Mario Hernandez, Senior Frontend Developer at Mediacurrent. With over 10 years of experience in Drupal, Mario has seen the CMS evolve significantly throughout the years. Read on to find out more about some of Mediacurrent's most interesting projects and what aspect of his work Mario enjoys the most.

1. Please tell us a little about yourself. How do you participate in the Drupal community and what do you do professionally?

I am a Senior Front End Developer at Mediacurrent and the majority of my contribution to the Drupal community is around conducting training workshops and writing blog posts. In the past I’ve contributed to the Out Of The Box initiative and also helped with the implementation of the Simplify Menu module.

2. When did you first come across Drupal? What convinced you to stay, the software or the community, and why?

Back in 2007 I was working for the Federal Government in Los Angeles as a developer and was tasked with building an Emergency Preparedness website which would provide access to resources to employees for disaster recovery in the event of a major disaster. I looked and tested several content management systems and ultimately decided on Drupal as it was the one which provided the most flexibility and ability to scale.

3. What impact has Drupal made on you? Is there a particular moment you remember?

I believe after building the website I cited above, I realized the power of Drupal. As a Front End Developer with minimum programming experience, I realized Drupal could do all the heavy lifting for me in the Back-End while I focus on my area of expertise, Front-End. Nowadays and for the past 20+ years I’ve been making a living building Drupal Websites for some of the most well known brands in the world.

4. How do you explain what Drupal is to other, non-Drupal people?

First I find out if they are familiar with content management systems at all and tell them Drupal is like one of those but for enterprise level websites. If they know about WordPress or Powerpoint I draw some kind of comparison such as being open source or written in PHP. I also share examples of high profile websites built with Drupal. Best example is The Weather Channel website which most people use and I tell them we at Mediacurrent built it. (shameless plug).

5. How did you see Drupal evolving over the years? What do you think the future will bring?

I started working with Drupal 6 back in 2007 and it is great to see how far Drupal has come. As a Front-End developer going from PHP templates to Twig was a huge improvement. Mobile first and accessibility have improved drastically from the old days. Other significant improvements are APIs, Layout Builder and Configuration Management. Lastly, the promise of a better upgrade path will be a big win for people building websites as well as site owners when Drupal 9 is released.

6. What are some of the contributions to open source code or to the community that you are most proud of?

Although I have made code contributions to the Drupal ecosystem, my proudest contribution is being able to provide training and technical resources for others to consume. I personally enjoy conducting training workshops at small camps where people are eager to learn. In most cases this contribution is free of charge but it is the most rewarding experience for me personally.

7. Is there an initiative or a project in Drupal space that you would like to promote or highlight?

Mediacurrent recently introduced our own Drupal Distribution called Rain, which is an Enterprise Distribution. We have worked and continue to work very hard on providing a turn-key solution for anyone looking to build an enterprise level website. In addition to hand-picked modules and functionality, the Rain distribution provides an optional Decoupled front-end which runs on Gatsby. We are very proud of Rain and are excited to see what people build with it.

8. Is there anything else that excites you beyond Drupal? Either a new technology or a personal endeavor. 

We at Mediacurrent are working hard to take our training offerings to a new level. We have great plans for providing custom training for teams and the community and are very excited to make Mediacurrent the go-to agency for training that goes beyond Drupal.  

Jun 17 2019
Jun 17

open waters

Mediacurrent is proud to announce the launch of our new podcast with the release of our pilot episode. Open Waters is a podcast exploring the intersection of open source technology and digital marketing. It’s made especially for CMOs, Directors and Managers of Marketing, technology, and digital strategy.

Our Purpose

We think open source is an ocean of opportunity to maximize your martech investment. 

We encourage you to listen and learn about using open source technology and forward thinking marketing strategy to generate and convert leads, improve security, increase speed to market, and identify the ROI of your digital investments. Our goal is to educate about the challenges and solutions we see with our own clients and in the market.

Welcome to Open Waters - Episode 0 

Dive in to our pilot episode!

Audio Download Link

In this episode:

  • New format, shorter but more frequent episode release schedule.
  • We're taking a different direction from our Mediacurrent Dropcast, no longer focused strictly on Drupal. Instead, we will be talking about the business benefits of open source software.
  • We are going to change up some sections. A little less news, and more about solutions.
  • We will probably still do the Pro Project pick from our Dropcast
     

Upcoming Episodes:

  •  Ben Robertson, who presented at the GatsbyJS Days conference in December, will join us to talk about the benefits of Gatsby JS.
  • Mario Hernandez will be on the podcast to talk about our upcoming expanded training for components. 
  • We’ll have an episode to talk about how to choose a CMS, whether it’s Drupal, WordPress, or any of the other bazillion options.
  • Bob Kepford, you may have heard of him, will be on to talk about serverless 101 and the problems it can solve.
  • We will have Jason Lengstorf from Gatsby on to talk about the project.
  • And much, much more.

Subscribe to Open Waters 

New episodes will be released tri-weekly on Tuesdays.

How Can I Support the Show?

Subscribe, leave a review, and share about us on social media with the hashtag #openwaterspodcast. Have an idea for a topic? Tweet at our hosts Mark Casias, Bob Kepford, and Mario Hernandez. Be sure to subscribe in your favorite podcast platform. Links to follow.

Jun 17 2019
Jun 17

Subscribe to the TCDrupal News

Jun 17 2019
Jun 17

Drupal 8 to Drupal 8

Drupal 9’s release is just around the corner. If you are on Drupal 8, then you are well on your way to Drupal 9 readiness. The planned release date for Drupal 9 is June 2020 with Drupal 8 and Drupal 7 reaching end of life in November 2021. Drupal 8’s new release cycle allows a straightforward upgrade path but developers no longer get the benefit of rearchitecting for the new version. Thinking about architectural decisions, technical debt, and clean site building now will help your Drupal 9 transition and success.

Upgrade to minor releases

The simplest way to prepare for Drupal 9 is by staying up to date on Drupal’s minor release cycle. New features roll out every six months, and each release adds new functionality and can supersede contrib modules (Workbench Moderation in favor of Content Moderation, contrib Media in favor of core Media, Panels in favor of Layout Builder, etc.). Drupal 8.7 came out on May 1 while Drupal 8.8 is planned for December 4.

With each minor release, the following should be considered in your deployment process:

Consider soon-to-be-deprecated code

Run `drupal-check` to validate what deprecated code exists in your repository or use the GUI based Upgrade Status module. Create tasks in your backlog to remove these over the following 6-months. Check custom code but also key contrib modules utilized. If deprecations are found in contrib, create an issue then patch the deprecation.
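
For example, a typical run against custom code looks roughly like this (the install method and paths vary per project):

$ composer require --dev mglaman/drupal-check
$ vendor/bin/drupal-check web/modules/custom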

Look ahead to Drupal 9 core

Review the release notes and see what core features can take the place of something you’re using contrib or custom to achieve. Evaluate updating to the core solution. Removing custom code or contrib in favor of core ensures long term support and automated testing. Example: Are you still running contrib media? Work to upgrade this to core media.

Talk to your editors and site builders

Ask them, “What is the one thing we could do or stop doing to make your day to day better?” Work to include this feedback over your next few sprints. Yes, this won’t make your Drupal 9 upgrade easier, but it will ensure team support for Drupal across your organization. Preparing for Drupal 9 includes keeping connected to the overall team’s needs.

Audit Site Building

Drupal’s power lies in how easy it is to add fields, make customizations, and tag on new features — without ever touching code.   

Drupal’s pain lies in how easy it is to add fields, make customizations, and tag on new features — without ever touching code.

Since the core codebase and configuration will not change in Drupal 9, these customizations can add bloat, cause regression risks, and slow performance. 

Regularly check in on your content types, site configuration, and fields to see what can be deprecated, removed, or done in a more reusable way. Things to look for include:

  • Fields added for a one-time event that now sit dormant

    • Think about removing this field completely or utilize an existing block, field, or paragraph to achieve the results.
  • Content types with little to no content added

    • Is this content type still needed? Could you instead create a more generalized content type utilizing paragraphs or layout builder to allow more customization of one-off page creation requests?
  • Out-of-date help text

    • Help text is often forgotten about by power users. But what happens when a new teammate joins the organization? What about when someone takes a vacation? Ensuring up-to-date help text, consistent field titles, and reasonably organized fields on content types are all ways to support an intuitive admin interface.

Automation

In past versions of Drupal, a new major release meant any work on automating processes would grind to a halt, as these would need to be recreated for the next major version. Since Drupal 9 will no longer require tearing down and rebuilding your project, any improvement made to automation will give you more time to focus on features, your editors, and your organization’s mission. No stakeholder wants to hear they can’t have a new feature because of the latest security release or a botched release. Give your team confidence in releases and features by automating testing today.

Two aspects of automation should specifically be focused on:

Releases

Every part of a release that can be automated should be. This allows for releases to happen more often with less risk.

  • Aim to create a release process that is not dependent on a single person
    • When a security risk opens up and your release engineer is on vacation, can someone quickly and reliably execute the release?
  • Take one step at a time.
    • Automate one step at a time and improve over time. Don’t try to make everything happen at once.
  • Look at what continuous integration your hosts uses.
    • Leverage something you already have existing through your host or Git provider versus building from scratch.

Testing

Every developer loves squashing technical debt. Every product owner cringes when those two words are mentioned. Technical debt never ends and can take hundreds of hours away from feature development. Though automated testing can’t change that, it can help to prevent rework and regressions as code or features are updated. Similar to automating releases, automating testing provides more confidence in your site’s stability and performance with each test added. 

There are a variety of testing infrastructures to look at, but Drupal core provides PHPUnit tests out of the box and a testing infrastructure for them. Behat integrates well for quick smoke tests after releases to ensure that core pages or links work as expected.

  • Start small.
    • Try to make a simple hello world test, then expand to an easy and somewhat trivial check after a release.
  • Scale up smoke tests.
    • Add in new tests over time for items you check in a smoke test manually after each release. Soon you’ll have your smoke testing fully automated and can start implementing regression tests and testing in all future code.
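
To make the "start small" advice concrete, a hello world unit test in a custom module can be as tiny as the following sketch (the module name is a placeholder):

namespace Drupal\Tests\mymodule\Unit;

use Drupal\Tests\UnitTestCase;

/**
 * A trivial smoke test that proves the test runner is wired up.
 *
 * @group mymodule
 */
class HelloWorldTest extends UnitTestCase {

  /**
   * Asserts a known value to verify PHPUnit executes module tests.
   */
  public function testHelloWorld() {
    $this->assertSame('hello world', strtolower('Hello World'));
  }

}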

Plan Strategically for Drupal 9 

Upgrading to Drupal 9 will be an easier process than ever before but will not equal a clean slate to start from. Decisions made today will affect your site for years to come. Throughout your day-to-day, think of the following:

  • Stay up to date on Drupal Minor Upgrades (8.7, 8.8, etc.).
  • Audit your site building to ensure it applies globally, remove one-off fields and deprecated features.
  • Automate your releases and smoke/regression testing.

If you have questions about how to build a strategy for a successful transition to Drupal 9, we’re here to help. Drop us a line.

Jun 17 2019
Jun 17

Submitted by heykarthikwithu on Monday, 17 June 2019 - 14:18:28 IST

Run simple test cases from the terminal via Docker commands

From the Drupal root, we can use the command below to run the test cases.

$ scripts/run-tests.sh --verbose --php /usr/local/bin/php Globe

If we have a Docker setup for our local environment, we can use the command below to run the test cases.

$ docker exec -it drupal_phpfpm_1 /usr/local/bin/php scripts/run-tests.sh --verbose --php /usr/local/bin/php Custom

Where
drupal_phpfpm_1: Docker container name
/usr/local/bin/php: Path to the PHP installation
Custom: Group name of the test cases

The same command can be used to run the test cases in your CI/CD environment.

Jun 16 2019
Jun 16

I was pleasantly surprised earlier this week to discover how easy it is to lazy load images in Lightning*.

Lightning ships with the Blazy module, which integrates the javascript library of the same name. Accomplishing lazy loading of images is as easy as changing the display widget to use the Blazy formatter. As a test, I set up a content type with an image Media field on my local environment, set the Media entity up to use Blazy, and created a View to display all the content.

With the formatter set, images are only loaded when they are within a configurable distance of the display window. The impact, as you can see in the screenshot below, is that with Blazy enabled initial page load makes 3 image requests, load time is just over half a second, and transfers about 5K. When scrolling the page, images are loaded in as they approach the viewport.

Screenshot of page and web inspector network traffic with lazy loading enabled.

With normal, full content loading, the page makes 19 image requests, the load time is roughly 4 seconds, and the transfer size is 600K (my source images are pretty small).

Screenshot of page and web inspector network traffic using normal loading.

Limitations

The 1.x branch of Blazy, which is what Lightning ships with, does not work when embedding Media through CKEditor nor when using the CKEditor image embed option. There is a "Blazy With Media" formatter option in Lightning. I've not played with it but it may resolve at least the first issue. The 2.x branch of Blazy also apparently supports Media oEmbed and has an input filter for inline images.

Footnote

Ironically, I don't actually use Lightning on this site, a fact which I expect I'll get some grief about from the folks at work, and which, perhaps, I'll reconsider at some point. Nor do I have Blazy enabled, though I expect I'll be changing that in short order also.

Jun 15 2019
Jun 15

This year's edition of Drupal North has come to an end and what a great experience it has been. We got to connect with so many people in the community, share our experiences, and learn from others. Here are some of the highlights and key takeaways from the last 3 days:

Accessibility really is on the top of everyone's minds

Throughout the conference, there were so many conversations about accessibility -- from best practices to the future of accessibility tools and websites. With the passing of Bill C-81, government and higher education institutions in Canada will need to make this a top priority as their websites will need to be compliant with Level AA WCAG standards. Those using Drupal will already be one step ahead as it is more accessibility-friendly than other content management platforms.

Just like Drupal, the Drupal North community values openness

The people that make up the Drupal community are always willing to share tips, tools, information, and what processes work best for them. Some of the attendees we spoke with specifically said they were excited to implement some of the techniques and tools that were discussed at Drupal North. This is a community of learners and innovators, always striving to move the web forward.

We are a group of problem-solvers

Some of the most valuable conversations this week were held when attendees were encouraged to share problems they've been encountering and ask the community for ideas and suggestions. The roundtable breakouts at the summits were great for brainstorming and connecting with others having similar issues, and lots of great feedback and input was shared during the Interactive UX Workshop that Alex Dergachev and I hosted.

We hope everyone had a great time at Drupal North 2019 and we are very proud to have been a Gold Sponsor for this year's conference. Keep following along with us on LinkedIn and Twitter, stay up-to-date on future trainings and UX Meetups, and reach out to us if you want to talk Drupal.

Jun 14 2019
Jun 14

Drupal will be represented again at this year's O'Reilly Open Source Convention (OSCON) in Portland, OR in July. We'll have table P8 in the exhibit hall. If you're attending, make sure to visit our table on Wednesday / Thursday.

Community volunteers and local Drupal Association staff members will be at OSCON to talk about Drupal with attendees. We're looking forward to spreading the Drupal love!

If you'd like to help out, please contact me.

Dates: July 15-18

Location: Oregon Convention Center

Register here and use code USRG to save 25% off most registration types.

Jun 14 2019
Jun 14

In April 2018 at DrupalCon Nashville, Dries and then-Executive Director of the Drupal Association Megan Sanicki announced the launch of an initiative to help promote Drupal in the marketplace. This initiative was designed for agencies worldwide, offering marketing and sales support that unifies the Drupal brand and provides standardized materials that can be customized to each user's needs.

The fundraising goal for this initiative was $100,000, and we committed to starting work once we hit the $75,000 mark. In summer 2018 we reached $76,000 in funding and outlined the first phases of work we would complete with these funds. That funding was utilized and those first phases completed in Spring of 2019. We’ll talk about what was accomplished here, and what comes next for the Promote Drupal program.

Purpose (as stated in previous posts)

One of the Drupal Association goals is to grow adoption of Drupal. The audience for Drupal is broad and varied, as are the decision-makers choosing to adopt Drupal, as well as the people and agencies selling Drupal services.

Our enterprise market competition has deep pockets for product marketing and heavy sales support. Even our mid-market open source competition’s marketing is heavily backed by corporate funding.

So how can the Drupal Association help grow adoption of the product, across such a diverse market, with our limited budget? It won’t be easy, and it won’t be perfect for everyone, but an ideal outcome will create a source for standardized Drupal materials and stories that the worldwide community can use in their own regions to promote Drupal to new audiences and grow adoption.

To that end, this initiative is focused on creating materials targeted to the decision-makers who choose to adopt Drupal for business; specifically, for marketing decision-makers—the audience most underserved in current materials. 

What we've accomplished

The Promote Drupal Initiative was designed in four phases:

  • Phase 1: Update Drupal's brand and strategic messaging to connect with new decision-makers and influencers
  • Phase 2: Provide sales and marketing materials that everyone can use (and translate!)
  • Phase 3: Coordinate PR campaigns
  • Phase 4: Create "marketing campaigns in a box" to support localized ad and industry event marketing

This fundraising campaign supported Phase 1 & 2.

In the past 10 months, we've moved forward on the following deliverables:

Phase I

  • Create Drupal brand book - Complete (Found here)
  • Building an open source marketing infrastructure for collaborative projects - and governance to guide its use - Complete (You can join here)
  • Create a "Why Drupal?" Video for widespread use - Complete (Video here)
  • Distributing press releases as they come up - sharing finished releases with international regional associations for translating and sharing in their own communities - Complete (Found here)
  • Create Drupal Pitch Deck: More about it—and how you can use it - Complete (Found here)
  • Redesigning the submission process and template for case studies and how they are selected as Drupal business case studies—and using these for general Drupal brand collateral - Near completion (This is in final development and will be launched in August.)

Phase II 

  • Set a roadmap for the most-needed marketing and sales materials in the community - Complete (Work found here)
  • Begin implementing that roadmap through the online community interface - Complete (Work found here)
  • Redesign home page of drupal.org to be persona-focused - Complete (https://www.drupal.org/home)
  • Continue pushing press releases and media recognition - Complete (Found here)
  • Have "Why Drupal" Video available in different languages - German translation complete (Here)
  • Complete a Competitive Comparison Chart - In Progress (Work found here)
  • Drupal in the Enterprise: Integration with other Digital tools/solutions - In Progress (Work found here)
  • Infographic(s): Number of sites using Drupal, other stats about Drupal usage - In Progress (Work found here)
  • Drupal and Future Technologies: Drupal integration with AI, data science, etc. - Not yet launched (Idea found here)
  • Video interviews of companies using Drupal - Not yet launched (Idea found here)

This work is important

We see this initiative as an important way to support agencies in building new business, and creating a unified brand for Drupal worldwide. By understanding Drupal and its capabilities - particularly as the shift to Drupal 9 will be happening in 2020 - we aim to educate the curious regarding the vast ecosystem of Drupal's digital marketing technology and other business applications. 

Funding & moving forward

The idea of the initiative was to be a combination of self-sustaining community contribution - much like the code developed for Drupal - and partially an ongoing program supported by the Drupal Association, assuming funding to pay staff to support the work. 

Now that the initial funding is done, the Association plans to continue hosting Promote Drupal meetings, and to help guide the process for those participating until a plan is in place for future funding of this work. 

Promote Drupal needs your support

In order for the work to continue we are asking for community support. 

If you own an agency, please consider allocating some of your marketing team's time toward the Promote Drupal Initiative to move it forward. You do not need to reach out to Drupal Association staff directly for this to happen - you can simply have your team members join the meetings and participate in the work. We invite you to view projects in progress and see where you can lend your expertise and time. 

One element of work that would be particularly useful is to have the "Try Drupal" video translated into languages beyond English and German. Likewise, the additional materials listed above would have further reach worldwide if translated. 

Another opportunity is to add your materials to the Promote Drupal Pitch Deck (a.k.a. "Drupal in the Wild"); see the process here.

Thank you!

On behalf of the Drupal Association, we extend a big thanks to the original sponsors of the Promote Drupal Project—more than 50 agencies, as seen here. We also have a shout out to individual contributors; thanks to all 54 Team Members.

It is our hope that the community engages in this initiative in a way that helps sustain its momentum and continue producing meaningful content for the Drupal community. 

Jun 14 2019
Jun 14
Once upon a time the authoring experience (AX) in Drupal left much to be desired. Content editors using Drupal 6 or Drupal 7 glanced longingly across at WordPress editing screens, wistfully hoping for some of that ease of use. The flexibility of Drupal from a content management standpoint was never in doubt, but we all just wished that the edit screen looked better and behaved in a manner we were accustomed to from other modern digital products and services. Well, finally the wait is over! Welcome to the new Drupal authoring experience!

Let's focus on three main areas of the Drupal authoring experience which have made Drupal 8 a game changer for digital marketing professionals.

1. Gutenberg Editor

It's nice... it is really nice! Below is a screenshot of the new Gutenberg editor experience available in Drupal 8.
Jun 14 2019
Jun 14

What’s the one big challenge that marketers and CMOs we partner with are facing this year? It’s really tough to put a finger on just one. Proving impact on revenue, marketing team staffing, personalization, and marketing-IT alignment are among the hurdles voiced in discussions that Mediacurrent’s sales team is having with prospects and clients. We are finding CMOs are pressed more than ever to show marketing’s value while the complexities and opportunities sprouting within digital continue to evolve. Let’s dive into each challenge and uncover what makes these hurdles difficult to jump — and the tools or approach that can help marketers overcome them.

Proving Impact on Revenue

It's probably not surprising that last year Gartner surveyed where CMOs were spending their marketing budgets. They found that marketing budgets have shrunk slightly year over year since 2016, while a higher percentage of budgets is being allocated to digital. The pressure is on for marketers to prove how specific marketing campaigns and investments directly contribute to an organization’s revenue. Owners and shareholders want more specificity in understanding how much budget to allocate to higher revenue generating activities. Furthermore, marketers need to react faster to fluctuating market conditions that impact customer experience.

How can you attribute revenue to specific marketing activities and demonstrate ROI so you can invest and optimize in the right activities? There are a number of SaaS tools available and most implement a specific approach to measure marketing attribution and achieve granular ROI tracking. 

  • Matomo - offers a GPL-licensed, on-premise web analytics stack.
  • Bizible - analytics and multi-touch revenue attribution.
  • Terminus / Brightfunnel - product suite that offers account-based marketing analytics and sales insights.
  • Conversion Logic - cross-channel attribution with AI-powered insights and budget forecast simulations.
  • Allocadia - a marketing performance management platform that offers revenue attribution and insights into marketing budget allocation.
  • Full Circle Insights - product stack that tracks marketing and sales attribution, built as a native Salesforce App.
  • Google Attribution - formerly called Adometry, it’s now part of the Google Marketing Platform.
  • Salesforce CRM - ROI tracking can be enabled with additional campaign configuration.
  • Domo Digital 360 - full suite of analytics, funnel, and ROI tracking.
  • VisualIQ - strategic measurement, multi-touch attribution, audience analysis, and predictive insights.
  • Oracle Marketing Cloud - integrated suite of tools that include analytics, marketing automation, content/social marketing, and data management.

Because each tool specializes in a specific aspect of ROI tracking, you will need to do some research to understand which tool best fits your organization. Most of the tools listed above implement some form of attribution tracking that will help achieve more robust ROI calculations. Our Director of Marketing Adam Kirby gives a helpful overview of how marketing attribution works in his MarTech West slide deck. Organizations we speak with often need help from consultants and agencies to understand how to optimally configure their martech stack with ROI tracking tools. This need coincidentally brings us to the next challenge marketers are facing...

Staffing Teams - The Right Blend

Organizations are becoming more careful to find the proper balance between internal team staffing and engaging help from an outside agency. In the early 2010s, there was a movement within Fortune 2000 companies to bring more expertise in-house. As martech complexity has evolved into the latter part of this decade, organizations are realizing that their in-house teams' exposure to new technologies and approaches is limited. By engaging with a wide spectrum of industries, clients, and projects, agencies provide a broad view into the martech landscape that in-house teams don’t have. What’s the right blend? It depends on the vertical. Organizations with one large website typically outsource at least half of their digital marketing. Higher Ed and Government have longer procurement cycles and, consequently, need at least 75% of their overall marketing team to be full-time in-house.

Not only is outside help needed by in-house teams to stay informed, budget scrutiny is also forcing CMOs to seek off-shore development help. However, they are finding that off-shore falters when technology projects aren’t led by one or more on-shore architects who maintain a project’s integrity between on-shore stakeholders and off-shore teams. These technical liaisons are critical to off-shore development success. We see too many organizations assume that if off-shore developers demonstrate technical competency, they should be fully capable of leading an implementation. Yet those organizations fail to consider how strongly local culture influences communication dynamics and how off-shore teams perceive requirements.

Personalization

Another challenge marketers are targeting is how personalization can impact KPIs and produce a higher ROI percentage compared to other digital marketing efforts. In 2017, the concept of personalization was buzzing while marketers were trying to understand what it takes, in content and labor, to implement. Since GDPR went into effect a little over a year ago, personalization efforts have had to take into account how GDPR laws impact customer data acquisition and retention. That makes personalization trickier and more complex to implement, both in terms of data analysis and in the ability to capitalize on personalization opportunities. Tools like Acquia Lift, the open source marketing automation platform Mautic (recently acquired by Acquia), Triblio, and Optimizely Web Personalization offer slightly different perspectives on personalization. 

When evaluating if you’re ready for personalization, here are eight considerations that will dictate success when carefully planned or potential failure if not addressed:

  1. Do you have enough content that’s written for each persona your personalization effort needs to target?
  2. Do you have content creators who can continually create new and evergreen content?
  3. Do you have KPIs defined to track the performance of your personalization efforts?
  4. Is your martech stack compatible with personalization technologies that fit your business model?
  5. Do accurate, fresh data metrics exist in usable forms? Is data structured uniformly and exclusive of redundancies that might skew its meaning?
  6. How do data privacy laws impact the efficacy of a personalization initiative? Can enough of the right user data legally be captured to supply the initiative?
  7. Are data governance guidelines in place that ensure data integrity stays intact well beyond the implementation phase of a personalization initiative?
  8. Finally, is your department or organization committed to investing time and energy into personalization? It’s a long game and shouldn’t be misinterpreted as an off-the-shelf-set-it-and-forget-it type of content solution.

If you’re starting a personalization strategy from ground zero, Mediacurrent Senior Digital Strategist Danielle Barthelemy wrote a quick guide to creating a content strategy with personalization as the end-goal. Danielle illustrates how a sound personalization strategy positively influences purchase intent, response rate, and acquisition costs. 

Marketing-IT Alignment

In order for digital marketing execution to be as effective and efficient as possible with initiatives like ROI tracking and personalization, it’s imperative for marketing and IT teams to collaborate cohesively. A frictionless environment is critical for marketers to keep up with ever-increasing market speed. In some organizations, these two departments still maintain competing interests in relation to policy, security, infrastructure, and budget. Example scenarios include strict IT policies that can stifle speed-to-market, cowboy marketers all but ignoring technical security when implementing new tools, and executives missing the budgetary discord that echoes when both departments operate in their own silos.

These independent agendas must be meshed together into one for the betterment of the organization. But how? 

  • Learn how to empathize by understanding each other’s goals and challenges across departments. Define a shared list of KPIs and set a timeframe for each.
  • Schedule weekly touch point meetings between IT and marketing leaders.
  • Conduct a quarterly tools review to understand the “why” behind tools that each department uses.
  • Demonstrate discipline-specific concepts that require collaboration from the other department. For instance, show IT how marketing attribution works and what’s required of them to make it successful. Or, show marketing what a normalized database is and how it will help marketing be successful by reducing duplicate data.

Marketing ROI: An Ongoing Challenge

Overall, the challenges CMOs are asking us about as we move into the latter half of 2019 are heavily rooted in accurately tracking ROI and putting tools in place to boost it. While marketers have been challenged with proving ROI for years, digital has evolved to a point where tools and systems exist that embolden marketers to aggressively pursue understanding where their money is best spent. For most organizations, there are still talent hurdles to overcome and knowledge gaps to fill to properly implement the martech and systems that accurately track ROI. 

How about you — what challenges is your marketing department working to solve this year? Have you found the right in-house to agency team blend? Have you had success with ROI tracking and personalization?

Jun 14 2019
Jun 14

A lot of buzz around “Decoupled Drupal” is taking place and it has quickly become ubiquitous in the industry. Drupal has won hearts by embracing the newest technology and presenting the best of possibilities. The full separation of the structure from the content has given content management systems the means to accelerate the pace of innovation. 

In this blog, we will address the what, why and when of Decoupled Drupal for you. 

A headless robot

Decoupled Drupal Is For You

By providing separate frameworks for the front-end and the back-end content management experience, Decoupled Drupal delivers a content presentation that is completely independent of the content management. It is also known as ‘Headless Drupal’, where the ‘head’ refers to the front-end rendering or presentation of the content and the ‘body’ refers to the back-end storage. 

Addressing the 3 Ws: Why, What, When 

In this section, we will take one head at a time and examine the core functionalities of Decoupled (Headless) Drupal. 

Why Decoupled?

Being a flexible framework for developing websites, web/native apps and similar digital products, Decoupled Drupal allows designers and front-end developers to build without limitations. As an organisation you can leverage a decoupled approach for progressive web apps and native apps. Decoupled Drupal has created a buzz in the community with its divide-and-conquer development strategy.

What’s your Intention?

Your intentions always determine the outcome, i.e., how your product will be built with Decoupled Drupal. For the developers working on it, here are a few scenarios and their outcomes: 

  • In case of standalone websites/applications, decoupled Drupal might not be a wise choice. 
  • For multiple web applications and websites, decoupled Drupal can be leveraged in two different ways. 
  • When building non-web native apps, you can employ decoupled Drupal to attain a content repository without its own public-facing front end.
Flow chart (Source: Dri.es)

Once the intentions are clear, the next step is to see if it can be executed given a proper apparatus. Here are a few questions that should influence your decision to choose decoupled Drupal: 

  • Is it right for your project and your team?
  • Do you have a strong grasp on your data needs?
  • Can your current hosting provider support this architecture?
  • Are you prepared to handle the complexity of serving content to multiple clients?
  • Do the URL alias values have a unique identifier that makes API requests easy?
  • Can your metadata logic power meta tags, JSON-LD, analytics to be generated with standardised rules?
  • Where are menus created, ordered, and managed? 
  • Do you have an architecture that supports combining multiple redirect rules into a single redirect?

When to Decouple

By now we have established that Decoupled Drupal is a package full of advantages. It’s time to delve deeper and look at the circumstances in which it can be put into effect: 

Decoupled Drupal allows for designers and front-end developers to build without limitations

Resources 

Decoupling Drupal, even progressively, requires separate development of the back end and the front end, and thus separate resources are a must. Two individually capable teams that can collaborate and support each other make for a successful decoupled Drupal project. 

Multiple Channels

The ability to publish content and data across platforms and products can affect the way you go headless.

Applicable Content

Decoupling is a great fit if you already have interactive data. Visualisations, animations, and complex user flows push you towards frameworks like Ember, React, Vue.js or Angular.

Drupal Interface

Sometimes, a rich interface and built-in features can hinder the work. Even Drupal’s flexible content model for storing content requires, in some cases, a different interface for adding and managing that content. 

When Not to Decouple

Inversely, it is equally important to know in which situations a decoupled Drupal might not thrive. Gauge these possibilities to rule out unsuitable situations or projects:

  • Drupal has the advantage of a huge pool of free modules from the open source community. But with decoupled Drupal, the ability to easily “turn on” front-end functionality goes out of the window. The separate content management system eliminates the possibility of managing your website's front end directly. 
  • Drupal’s front-end proficiency should align with your front-end requirements. Without such a match, your decoupled dream can land in doubt.

Conclusion

There’s no confusion about the abilities of Decoupled Drupal. It’s your business requirements that should fit in like a puzzle with the headless architecture. With necessary technical leadership skills and expertise in this web infrastructure, you can sail your decoupling aspirations to the other end. 

We’d love to hear your feedback on our social media platforms: Twitter, Facebook and LinkedIn

And do not forget to share more ideas at [email protected]

Jun 14 2019
Jun 14

Another successful day at Drupal North is now complete! This day was packed with sessions from all kinds of speakers, including our very own Jigar Mehta and Robert Ngo. Some great discussions were had amongst the Drupal community which was out in full force. Here are some of the ideas that we saw repeated throughout the day:

Content must be modular

Making your content modular allows you to easily plug it into any new type of channel. There's no need for you to start from scratch just because you're creating something for a different platform or user base. And, if you keep this content in a centralized hub, all users have access to the most accurate and up-to-date versions.

Plan out where you're going in the initial design phase

Knowing where you're going makes it that much easier to get there. You need to start with solid components so you don't have to go back later on and make constant revisions. A detailed plan allows you to take advantage of UI Patterns that will save you time and headaches in the future.

More and more people actually know about Drupal

Years ago, many within the Drupal community would have to explain to people what Drupal, and even open source, was. This made the task of convincing them to switch to a Drupal site even harder. Now, executives and decision-makers will often have already heard of Drupal and just need to be convinced of what value YOU can bring to them.

Accessibility is key

The web is for everyone and that means your website needs to be accessible for everyone. It's also important to maintain this accessibility; technology is always improving so just because your site was accessible when you launched it 3 years ago, doesn't mean it is today. And when you conduct user tests, try and recruit diverse participants in order to get more inclusive results.

Drupalers love basketball!

To wrap up the day, conference attendees went to the after party to catch game 6 of the NBA Finals -- GO RAPTORS!

Just one more day left of Drupal North and we hope you've been making the most of it! Make sure you're following along with us on LinkedIn and Twitter, and check out the rest of our daily recaps on this blog.

Jun 13 2019
Jun 13

You’ve decided it’s time to rebuild your website. Research has been done, conversion rates have been analyzed, the team has selected a rebuild over a focused fix, and you and your team are committed to making this happen. One of the easiest ways of ensuring your success is to remain mindful of a few key things as you work your way through this larger process.

Regarding that term, “mindful:” one of the Kanopi team’s favorite authors is Brené Brown. She writes, “Mindfulness requires that we not “over-identify” with thoughts and feelings so that we are not caught up and swept away by negativity.” For the purposes of your website rebuild, I’d adapt this to be, “Mindfulness requires that we not “over-focus” on what we’ve done before, and rather remain aware of what’s important for our success so that we can focus on where we want to be.”

So, let’s get to it and break down what the top five things we need to be mindful of when executing a rebuild project.

1. YOU are the difference! Be engaged.

Stakeholder engagement can make or break a rebuild. But rebuilds are time-consuming, and you and your stakeholders will likely be pulled in several directions as you try to execute a rebuild while balancing other priorities and projects.

Your availability, open communication, and timely feedback is critical to enable your team to create the web presence your organization needs to reach its goals. Be realistic in what time your team can devote to the project so you can be as fully engaged as possible. Define roles and responsibilities early as well so it’s clear who is handling what.

If you need an assist from an outside agency to keep the project moving quicker, be direct with them about your business needs and wants. Help them to understand your users and audiences. An agency will make every effort to dive deeply into understanding your market, but at the end of the day, you and your team are the experts on what you do. So view any outside agency as a partner who can work with you towards success, and stay engaged with them throughout the process.

2. Define success & track it

We cannot know if we’re successful until we have identified what success will look like. For some sites, it’s simply exposure. For others, it’s a need to meet specific goals. Take the time to define what your organization needs to achieve, and which key metrics will allow us to quantify success.

Not sure where to start? Here are common metrics you should benchmark now as you prepare for the rebuild:

  • Users: note how many users are regularly coming to your site
  • Bounce Rate: record the overall bounce rate. Make note if this is at, above or below your industry’s standard.
  • Average Session Duration: how long are users staying on your page?
  • Sessions by Channel: where are your users coming from? How much organic traffic is coming in?
  • Top Keywords: identify what words are being used in the search engines when users are finding you. Are these surprising?
  • Competitor Keywords: are users who are looking at your competitors using the same keywords?
  • Top Referrers: who is sending traffic to your site? Maybe social media is key, or you’re more focused on industry referrals. Determine where you should be in the market.
  • Conversion Rates: what forms do you need users to fill out? What conversions are critical to your business goals? These can take the form of contact or forms from your CRM tools such as Marketo or Pardot, or even visits to a specific page or video views.   
  • Accessibility: does your site meet national or international compliance standards?

In short, benchmark where you are now, and use this data to help round out that definition of success. Then come back a few months after launch to reevaluate and compare so you can quantify the success to your stakeholders.

3. Get your content strategy in order

The old saying “Content is King” is truer today than ever. Users are more educated. Search engines have become smarter, looking for more than keywords — they look for meaning in phrases to help determine the focus of a given page.

As one of the most effective methods of growing audience engagement, developing your brand presence, and driving sales, content marketing is a mission-critical growth method for most businesses. — Hubspot

This is where most people turn to me and tell me they’ll get their team on it so they can move further along in the content process. But don’t underestimate the time and energy content development/aggregation can take, even if your larger project is hiring a copywriter to augment your team. All too often, when content becomes a late-stage endeavor a few things happen:

  • timelines get pushed out, waiting for content to be approved.
  • changes to the previous UX are often required to account for unrealized navigation or calls to action, causing potential budget overages.
  • content is rushed and not in alignment with the overall vision.

To help this process come together for your team, here are a few action items to start with:

  • Audit your content: take a full inventory of your site’s content to better identify:
    • what to keep
    • what to repurpose
      • for example: the video may look dated, but could your team write a blog post from that material?
    • what should not be migrated to your new site
      • this can be archived to be referenced at a later date
  • Build a sitemap: determine the hierarchy of the content on the new site.
  • Identify missing content: comparing your audit to your sitemap, what needs to be produced?
  • Track content creation: track who is responsible for writing, editing and approving content — and give them deadlines
  • Start thinking ahead: you may need to start planning future content. Developing an editorial calendar will help keep the process moving. Content typically included in an editorial calendar:
    • blog posts
    • social media posts
    • videos
    • infographics

When preparing for a rebuild, your content strategy has to be one of the first things your team takes on. This approach will save you time, headaches, and likely budget moving forward. 

4. Consider your users’ digital experience

By this stage in the process you should know your target market, their buying habits and why your product or service is of value to them. You likely have personas and other data to help back this up. But in the omnichannel world in which we thrive, there is often more to architecting an effective user journey. Understanding the nuances of the devices, the influence of how a user comes to your site, and the overall adherence to best practices are complex. For example, consider the following:

  • What percentage of users are coming from mobile devices?
    • Are your CTAs and main conversion points easy to access on a small screen?
    • Is the user journey simplified?
  • Are your users coming from social media?
    • Is it your blog driving traffic or more word of mouth?
    • Is it positive or negative attention?
  • Have you produced a user journey map to identify the different pathways to conversion?
    • Is your site currently set up to promote these journeys?
    • Are you utilizing personalization to customize that user journey?

You can learn more about how to use user research to gain insight into audience behavior, helping you frame your thoughts about your personas' overall user journey to conversion.

5. Think about the future of your site

Websites need to evolve and adapt as the needs of your users change over time, but as you rebuild, are you setting yourself up for more incremental changes moving forward? Keep in mind that most rebuilds are focused on the MLP or “Minimum Lovable Product.” It’s the simplest iteration of your site that will meet your current needs with the intent to continually improve it over time. Regardless of whether you’re focused on an MLP launch due to either time or budget constraints, we need to keep these future goals in mind as we progress.

And then there’s the technology side of this: whether you’re looking ahead to Drupal 8 or 9 or the next major evolution with WordPress, consider those needs now to help ‘future proof’ your new site. The web changes too quickly to risk your site being stale when it’s still brand new. Talk this through from the start with your team.

These steps will set you up for success.

Your site speaks to who you are as an organization to your target market. Whether you’re a non-profit, higher education or a corporate entity, being mindful now will set your team’s rebuild up for success. And if you need help with your rebuild, contact us. We’d love to partner with you and help you recognize that success.

Jun 13 2019
Jun 13

Just when you think Drupal couldn’t get any dumber, it goes and adds some great new features….. And TOTALLY redeems itself!


Released back in November of 2015, Drupal 8 has been slowly but steadily upping its game.

In case you’ve been lost in a jungle for the past couple of years, or maybe you just don’t keep up with that kind of thing, we’ve got you covered.

Here are just some of the things Drupal 8 and soon to be Drupal 9 have us jumping around like crazy apes about.

BigPipe

BigPipe is a technique invented by Facebook back in 2009, when they used it to make their site twice as fast, which is an amazing feat in itself. 

Here’s how it works: it first breaks pages up into multiple “sections”, which are then loaded in parallel so your users don’t have to wait for the page to be completely loaded before they can start interacting with it.

Page speed is extremely important, considering 47% of people expect your site to load in less than 2 seconds and 40% will abandon it entirely if it takes longer than 3 seconds.

Not only that, but Google has indicated site speed (and as a result, page speed) is one of the signals used by its algorithm to rank pages.

If you care about SEO, you should care about the speed of your pages.

Page speed (Source: Google Developers)

The BigPipe module has been included in Drupal 8 core since 8.1 and became stable in 8.3.

Just having BigPipe enabled makes your pages faster with zero configuration needed. We can thank Drupal 8’s improved render pipeline & render API for that.

Yay Drupal 8!

Layout Builder

Drupal’s Layout Builder is probably what we are most excited about. 

For far too long, Drupal has been very restrictive when it comes to building out pages and putting the content that you want where you want it.

Think about if Display Suite and Panels had a baby gorilla. That’s Drupal’s new Layout Builder.

Layout Builder was introduced in Drupal 8.5.0 as an experimental core module, but as of Drupal 8.7.0, it is now stable and production ready!

It offers a powerful visual design tool and is meant for the following three use cases, according to Drupal.org:

Layouts for templated content. The creation of "layout templates" that will be used to layout all instances of a specific content type (e.g. blog posts, product pages).

Customizations to templated layouts. The ability to override these layout templates on a case-by-case basis (e.g. the ability to override the layout of a standardized product page)

Custom pages. The creation of custom, one-off landing pages not tied to a content type or structured content (e.g. a single "About us" page).

The Layout Builder gives developers/site builders the ability to drag and drop site-wide blocks and content fields into regions within a given layout.

Layout Builder (Source: Drupal.org)

With custom and unique landing pages being so important nowadays, this is finally the flexibility and freedom we need!

Media

Media management has always been an afterthought in Drupal. 

Today we consume more videos and pictures than ever, with the likes of Youtube, Instagram and Facebook.

Cisco predicted that video would make up 80 percent of all internet traffic by 2019. That's like.... today!

Thanks to the Media in Drupal 8 Initiative, an experimental core media module was introduced in Drupal 8.4. Then in 8.5, it was moved to stable and has gotten even better in Drupal 8.6, with the addition of oEmbed, additional media type support, and a media library.

Media timeline (Source: Webwash)

Let’s break down all three of these for you.

Additional Media Type Support
Support for local audio, video, images and generic files, along with being able to embed remote YouTube and Vimeo videos.

oEmbed Support
Needed to handle the new remote video media type mentioned above.

Media Library
The most exciting of the three, and it pretty much speaks for itself: a library of all your media. You can use a grid view, which shows a thumbnail, title, and bulk edit checkbox, or a table view, if you prefer that sort of thing.

All-in-all, media management is still not where it needs to be, but all these additions to core are a massive jump in the right direction.

There is no reason to wait until Drupal 9

If you’re currently on Drupal 6 or 7 and aren’t totally pumped after reading this, you should be.

Finally, Drupal has given us the speed, flexibility, and freedom we need to improve workflow, save time and succeed online. 

What’s even better is that Drupal 9 will essentially be just like another minor core update in Drupal 8. It will be more seamless than ever before. 

There is just no reason to wait. Make the update today and enjoy all these great features.

Let’s start a conversation about it.
 

Jun 13 2019
Jun 13

By default, a Drupal 8 user account collects only very basic information about the user. 

And, most of that information is not visible to visitors or other users on the site.

Fortunately, Drupal makes it easy to modify and expand this profile so that people can add useful information about themselves such as their real name (versus a username), address, employer, URLs, biography, and more.

If you're new to how Drupal handles users, read this tutorial before starting. In this tutorial, I'm going to show you how to create expanded user profiles for your Drupal users.

First, let's add some fields to your user profiles. This allows users to provide more information about themselves.

  • Go to "Configuration", "People", "Account settings", and then "Manage fields". You can now see a screen which looks the one below:

manage fields

Let's add the following Text (plain) fields:

  • First Name. Set the "Maximum length" to 50 characters.
  • Last Name. Set the "Maximum length" to 50 characters.

Next, add the following Link fields:

  • LinkedIn
  • Facebook
  • Personal Website

fields user profile
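
If you manage your site's configuration in code, each of these fields also ends up as exported YAML. As a rough sketch only (the machine name field_first_name and the exact values are assumptions based on the steps above, so your own export may differ slightly), the storage definition for the First Name field would live in a file such as field.storage.user.field_first_name.yml:

# Illustrative field storage export; values assume the 50-character plain text field created above.
langcode: en
status: true
dependencies:
  module:
    - user
id: user.field_first_name
field_name: field_first_name
entity_type: user
type: string
settings:
  max_length: 50
  is_ascii: false
  case_sensitive: false
module: core
locked: false
cardinality: 1
translatable: true

A matching field.field.user.user.field_first_name.yml would then hold the per-form settings, such as the label and whether the field is required.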

  • Go to the "Manage display" tab and arrange the new fields in the order you want them to show to site visitors.

manage display user fields

  • Go to "People" and "Permissions".
  • Give the "View user information" permission to the Anonymous and Authenticated users.

view drupal user information

Now, go and see those user profile fields that you just created:

  • Click your user name to go to "My account" in the black menu bar at the top.
  • Click the "Edit profile" tab.
  • Scroll down and you can use all the fields that you just created.
  • Fill in the fields.
  • Save your data and click the "View" tab to see your profile:

drupal user profile

Now, see how these fields appear to your site’s users. For many users, this user profile editing area should look similar, but slightly different:

  • You can use the Masquerade module to see the site as the user would. If you're not familiar with Masquerade, read this tutorial.
  • Click the article writer name to go to "My account".
  • Click the "Edit profile" tab and see what the user sees:

drupal article writer

Finally, see how this appears to a new user:

  • Log out or visit your site in another browser.
  • Visit http://[your_web_address]/user/register
  • The registration screen should show the default Drupal fields, plus your new fields:

new user registration

If you want to remove any fields from the registration area, you can hide them by going to "Configuration, "People", "Account settings", and then "Manage form display".

Want to Learn More?

This tutorial was an extract from Drupal 8 Explained, the best-selling guide to Drupal 8. Grab a copy today to learn all the fundamentals of Drupal 8.


About the author

Steve is the founder of OSTraining. Originally from the UK, he now lives in Sarasota in the USA. Steve's work straddles the line between teaching and web development.
Jun 13 2019
Jun 13

CMI 2.0 session at Drupaldevdays in Cluj-Napoca

Fabian Bircher, 13 Jun 2019 - 0 Comments

Session slides for the dev days CMI 2.0 session

Today I presented the CMI 2.0 updates at the Drupal Dev Days in Cluj-Napoca. The session went well; I received good feedback and had interesting conversations about configuration management afterwards.

Attached are the slides I presented.

There are plenty of issues to work on; join us in the sprint room or in the #config channel on Drupal Slack. Find more information on the CMI 2 project page and in the issues tagged with "CMI 2.0 candidate".

Attachment: CMI 2.0 Devdays 2019.pdf (932.61 KB)
Jun 13 2019
Jun 13

It’s an undeniable truth that images have a strong power to engage customers. And there are plenty of ways to enhance this engagement, one of which is the image hover effect (aka mouseover effect). 

Features like this are meant to add a creative and interactive touch to your website, spark users’ interest, save space, make your website more user-friendly, and increase conversions.

Drupal 8 has easy content creation as a priority, and there are also many useful modules for creating image hover effects. Let’s take a look at a very simple but nice one — the Imagepin button module.

The Imagepin button Drupal module’s brief overview

The Imagepin button module combines image hover effect with an interesting image pinning effect. It allows you to add pins to images that will show some text when someone hovers the mouse over them. 

During content creation, editors will have a special option — “Pin widgets on this image.” They can create as many pins as they wish, write the text for them, and position them anywhere throughout the image.

Pins appear on the image:

Pins on image with Imagepin Drupal 8 module

Text appears as you hover the mouse over each pin:

Image hover effect with Imagepin Drupal 8 module

The module’s extensibility

Out of the box, the Imagepin button module cooperates with the Slick module and displays all widgets as a carousel. But that’s just the beginning. The module is extensible with custom widgets, and the use of JavaScript can add more interactivity. 

So it is easy to adjust the module’s behavior to your website’s particular needs with the help of a good Drupal team. Let’s now see the module’s “classic” work on a simple example.

Creating image hover effect with the Imagepin button module

1. Installing and enabling the module

It begins with installing the Imagepin button module — either via Composer or by downloading it from drupal.org. Then it should be enabled in the “Media” section on the module list.

Enabling Imagepin Button Drupal 8 module

2. Enabling Imagepin for a content type’s image field

We go to the Manage Display tab of a content type that contains our image field and click the cogwheel next to this field. We need to check “Enable users to pin widgets on this image.” While doing this, we can:

  • choose the image style in which we will be pinning the image
  • optionally set breakpoints for mobile devices

Enabling Imagepin for image field Drupal 8

3. Creating pins for images

Next, when we add content, we see the “Pin widgets on this image” button and click it.

Adding pins to images with Imagepin Drupal 8 module

This button brings us to the “Pin widgets” UI where we see available widgets (none added yet). Let’s click “Add new” and submit the text we want to display. 

For example, on our image of Europe Travel Map, we could pin sightseeing objects — and start with “Eiffel Tower.” So we write its name in the text field.

We check it in the “Available widgets” and it turns orange, which means it is active. So we can drag the position of the pin to the place where we want to see it. And then we click “Save these positions.”

Adding pins to images with Imagepin Drupal 8 module

As we save the content item, we can see our pinned images.

Pins on image with Imagepin module Drupal 8

As we hover over it, we see the image hover effect in action — the text shows up.

Image hover effect with Imagepin module Drupal 8

Get nice image hover effect created for your Drupal website!

We have shown you a simple example of image hover effect in Drupal 8 created by the Imagepin button module. Of course, it can be fully customized to meet your ideas. 

So if you need any help with:

  • configuring the Imagepin button module
  • extending it with custom plugins
  • creating image hover effect using other tools 

contact our expert Drupal team!

Jun 13 2019
Jun 13

In a world where there is no limit to the devices used to access information, you must ensure your data is always available on the go! The pace of innovation in content management is accelerating, along with the number of channels that web content has to support.

A content marketer’s work is not done once the content for the website has been created. It also has to be made available across devices, managed across content channel hubs and, most importantly, kept user-friendly.

But first, let’s understand how the content presentation has changed over the years.

The Traditional Approach: Coupled Architecture

CMSs like Drupal are traditionally monolithic, owning the entire technological stack, both backend (data layer) and frontend (presentation layer). The benefit: easy site management, with admin users able to write and publish on the same system.


Monolithic architecture is one cohesive unit of code whose components work together within the same memory space and share the same resources.

Content editors have preferred the traditional approach because of its clear and logical modular architecture which allowed them to control in-place editing and layout management.


They Broke Up! What?

Even though the coupled architecture was easy to develop, deploy, and test, decoupled application architecture has become popular lately owing to the break-through user experiences it provides.

Decoupling segregates the concerns of frontend and backend of an application by simply adding a layer of technical abstraction (API).

One layer handles content creation and storage, and the other acts as the presentation layer. In this scenario, a CMS like Drupal serves as the backend data repository and a third-party application like React JS owns the frontend, providing interactive UIs.

In the video, watch how Drupal backend interacts with the decoupled apps in the decoupled approach and how it’s different from the traditional approach.

A decoupled architecture separates how a website's content is created from how it is displayed across multiple independent systems.

Fully Decoupled or Headless Architecture

Headless architecture is similar to decoupled architecture in that both have content management and storage backends and deliver content from that database through a web service or API.

The only difference between the two is that headless Drupal does not have a defined front-end system and has no functionality to present content to an end user on its own. The API, which exposes information for consumption, connects to an array of applications.


Srijan has implemented the headless approach for its various clients with ease within the stipulated time frames. For example, Srijan helped Estee Lauder reduce its training session costs by up to 30% with a decoupled and multilingual LMS.

Understand your situation. Who is it for?

Here are some key pointers that will help you figure out if a fully decoupled approach is an option for your project:

  • Not for basic editorial websites

Decoupling will do no good for a standalone website with basic editorial capabilities, such as a blogging site. Such websites require little or no user interactivity and, further, decoupling can hamper the performance of crucial features required by content editors, like content preview, layout management and in-line editing, making a simple website rather complicated.

  • Not if you can’t afford to lose functionality

One of the major advantages of a CMS like Drupal is that you can access a plethora of modules in just one click. By simply selecting the Simple Google Maps module from your admin toolbar, you can have it on your website.

However, implementing decoupled architecture takes away such easy-to-use features since the CMS no longer manages the frontend.

  • Not if you want to avoid a more complex procedure

If your design goals are closely aligned with traditional coupled architecture, then implementing a decoupled approach will complicate the process and add the extra cost of creating those features from scratch.

Take a look here to know in which use cases decoupled Drupal works best and how to decide whether you need the architecture for your next project or not.

Enter Progressive Decoupling: A Hybrid Approach

Progressive decoupling gives you the best of both worlds. It allows Drupal to leverage JavaScript frameworks (like React and Angular) for dynamic user experience by injecting JS only where it’s needed in the front-end.

The approach is in the best interest of both - editors and developers as it strikes a balance between editorial and developer needs.

This means the editor can control the layout of the page while the developer can use more JavaScript by embedding a JavaScript framework into the Drupal frontend.

The video attached below will help you better understand the concept.

Comparison Chart: Coupled, Progressive or Fully Decoupled

This comparison chart will help you understand the features.

Feature: Coupled | Decoupled (Progressive) | Fully Decoupled/Headless
  • Architecture: Tightly coupled | Loosely coupled | Separated
  • Performance: Fast | Fast | Fastest
  • Fixed presentation environment: Yes | Yes | No
  • Use cases: Complete text-based sites involving no user interactivity | Websites that require rich/interactive elements of user experience | Websites that require rich/interactive user experience
  • Layout style: Overrides built-in themes and templates | Easy and secure third-party integrations | Easy and secure third-party integrations
  • Integration: No | Future-proof | Future-proof
  • SEO: Most SEO friendly | SEO friendly | Non-SEO friendly
  • Delivery channels: Limited | Limited | Unlimited
  • API usability: No APIs | Based on architecture | Completely API based
  • Preview availability: Available | Available | Unavailable

What’s in Store for the Future of Decoupling

With enterprises choosing to opt for a more flexible and scalable experience, the gap between the developers and content editors needs to be reduced.

The rapid evolution in decoupling allows you to construct a content model once, preview it on every channel, use familiar tools to edit or place content on any channel in question, and decrease delivery time.

At Srijan, we believe in a mature agile engineering process delivering better results. Contact us to get started.

Jun 13 2019
Jun 13

In this article we are going to explore some of the powers of the Drupal 8 migration system, namely the migration “templates” that allow us to build dynamic migrations. And by templates I don’t mean Twig templates but plugin definitions that get enhanced by a deriver to make individual migrations for each of the things that we need in the application. For example, as we will explore, each language.

The term “template” I inherit from the early days of Drupal 8 when migrations were config entities and core had migration (config) templates in place for Drupal to Drupal migrations. But I like to use this term to represent also the deriver-based migrations because it kinda makes sense. But it’s a personal choice so feel free to ignore it if you don’t agree.

Before going into the details of how the dynamic migrations work, let’s cover a few of the more basic things about migrations in Drupal 8.

What is a migration?

The very first thing we should talk about is what actually is a migration. The simple answer to this question is: a plugin. Each migration is a YAML-based plugin that actually brings together all the other plugins the migration system needs to run an actual logical migration. And if you don’t know what a plugin is, they are swappable bits of functionality that are meant to perform a similar task, depending on their type. They are all over core and by now there are plenty of resources to read more about the plugin system, so I won’t go into it here.

Migration plugins, unlike most others such as blocks and field types, are defined in YAML files inside the module’s migrations folder. But just like all other plugin types, they map to a plugin class, in this case Drupal\migrate\Plugin\Migration.

The more important thing to know about migrations, however, is the logical structure they follow. And by this I mean that each migration is made up of a source, multiple processors and a destination. Makes sense, right? You need to get some data (the source reads and interprets its format), prepare it for its new destination (the processors alter or transform the data) and finally save it in the destination (which has a specific format and behaviour). And to make all this happen, we have plugins again:

  • Source plugins
  • Process plugins
  • Destination plugins

Source plugins are responsible for reading and iterating over the raw data being imported. And this can be in many formats: SQL tables, CSV files, JSON files, URL endpoint, etc. And for each of these we have a Drupal\migrate\Plugin\MigrateSourceInterface plugin. For average migrations, you’ll probably pick an existing source plugin, point it to your data and you are good to go. You can of course create your own if needed.

Destination plugins (Drupal\migrate\Plugin\MigrateDestinationInterface) are closely tied to the site being migrated into. And since we are in Drupal 8, these relate to what we can migrate to: entities, config, things like this. You will very rarely have to implement your own, and typically you will use an entity based destination.

In between these two, we have the process plugins (Drupal\migrate\Plugin\MigrateProcessInterface), which are admittedly the most fun. There are many of them already available in core and contrib, and their role is to take data values and prepare them for the destination. And the cool thing is that they are chainable so you can really get creative with your data. We will see in a bit how these are used in practice.

The migration plugin is therefore a basic definition of how these other 3 kinds of plugins should be used. You get some meta, source, process, destination and dependency information and you are good to go. But how?

That’s where the last main bit comes into play: the Drupal\migrate\MigrateExecutable. This guy is responsible for taking a migration plugin and “running” it. Meaning that it can make it import the data or roll it back. And some other adjacent things that have to do with this process.

Migrate ecosystem

Apart from the Drupal core setup, there are few notable contrib modules that any site doing migrations will/should use.

One of these is Migrate Plus. This module provides some additional helpful process plugins, the migration group configuration entity type for grouping migrations and a URL-based source plugin which comes with a couple of its own plugin types: Drupal\migrate_plus\DataFetcherPluginInterface (retrieve the data from a given protocol like a URL or file) and Drupal\migrate_plus\DataParserPluginInterface (interpret the retrieved data in various formats like JSON, XML, SOAP, etc). Really powerful stuff over here.
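
To make this more concrete, here is a minimal, hypothetical sketch of what the source section of a migration using that URL-based plugin could look like, with the http data fetcher and the json data parser (the endpoint, item selector and field names are made up purely for illustration and are not part of the example we build below):

# Hypothetical source definition using the Migrate Plus url plugin.
source:
  plugin: url
  data_fetcher_plugin: http
  data_parser_plugin: json
  urls:
    - 'https://example.com/api/categories'
  # Where in the JSON response the list of items lives.
  item_selector: data
  fields:
    -
      name: id
      label: 'Unique Id'
      selector: id
    -
      name: label
      label: 'Label'
      selector: label
  ids:
    id:
      type: string

The process and destination sections would then look just like in any other migration, which is exactly the point: the fetcher and parser plugins isolate the "where does the data come from" concern from the rest of the definition.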

Another one is Migrate Tools. This one essentially provides the Drush commands for running migrations. To do so, it provides its own migration executable that extends the core one and adds all the necessary goodies. So in this respect, it’s a critical module if you want to actually run migrations. It also makes an attempt at providing a UI, but I guess more of that will come in the future.
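
The commands it provides include, among others, the following (my_migration is a placeholder ID; older Drush 8 setups use the dash-separated form such as migrate-import):

drush migrate:status
drush migrate:import my_migration
drush migrate:rollback my_migration
drush migrate:messages my_migration
drush migrate:reset-status my_migration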

The last one I will mention is Migrate Source CSV. This one provides a source plugin for CSV files. CSV is quite a popular data source format for migrations so you might end up using this quite a lot.

Going forward we will use all 3 of these modules.

Basic migration

After this admittedly long intro, let’s see what one of these migrations looks like. I will create one in my advanced_migrations module, which you can also check out on GitHub. But first, let’s see the source data we are working with. To keep things simple, I have this CSV file containing product categories:

id,label_en,label_ro
B,Beverages,Bauturi
BA,Alcohols,Alcoolice
BAB,Beers,Beri
BAW,Wines,Vinuri
BJ,Juices,Sucuri
BJF,Fruit juices,Sucuri de fructe
F,Fresh food,Alimente proaspete

And we want to import these as taxonomy terms in the categories vocabulary. For now we will stick with the English label only. We will see afterwards how to get them translated with the corresponding Romanian labels as well.

As mentioned before, the YAML file goes in the migrations folder and can be named advanced_migrations.migration.categories.yml. The naming is pretty straightforward to understand so let’s see the file contents:

id: categories
label: Categories
migration_group: advanced_migrations
source:
  plugin: csv
  path: 'modules/custom/advanced_migrations/data/categories.csv'
  header_row_count: 1
  keys:
    - id
  column_names:
    0:
      id: 'Unique Id'
    1:
      label_en: 'Label EN'
    2:
      label_ro: 'Label RO'
destination:
  plugin: entity:taxonomy_term
process:
  vid:
    plugin: default_value
    default_value: categories
  name: label_en

It’s this simple. We start with some meta information such as the ID and label, as well as the migration group it should belong to. Then we have the definitions for the 3 plugin types we spoke about earlier:

Source

Under the source key we specify the ID of the source plugin to use, together with any source-specific configuration. In this case we point it to our CSV file and kind of “explain” to it how to understand the CSV file. Do check out the Drupal\migrate_source_csv\Plugin\migrate\source\CSV plugin if you don’t understand the definition.

Destination

Under the destination key we simply tell the migration what to save the data as. Easy peasy.

Process

Under the process key we do the mapping between our data source and the destination specific “fields” (in this case actual Drupal entity fields). And in this mapping we employ process plugins to get the data across and maybe alter it.

In our example we migrate one field (the category name), and for this we use the Drupal\migrate\Plugin\migrate\process\Get process plugin, which is assumed unless another one is actually specified. All it does is copy the raw data as it is, without making any change. It’s the most basic and simple process plugin. And since we are creating taxonomy terms, we need to specify a vocabulary, which we don’t necessarily have to take from the source. In this case we don’t, because we want to import all the terms into the categories vocabulary. So we can use the Drupal\migrate\Plugin\migrate\process\DefaultValue plugin to specify what value should be saved in that field for each term we create.
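
In other words, the shorthand name: label_en in the process section above is equivalent to spelling out the Get plugin explicitly:

process:
  name:
    plugin: get
    source: label_en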

And that’s it. After clearing the cache, we can now see our migration using Drush:

drush migrate:status

This will list our one migration and we can run it as well:

drush migrate:import categories

Bingo bango we have categories. Roll them back if you want with:

drush migrate:rollback categories

Dynamic migration

Now that we have the categories imported in English, let’s see how we can import their translations as well. And for this we will use a dynamic migration using a “template” and a plugin deriver. But first, what are plugin derivatives?

Plugin derivatives

The Drupal plugin system is an incredibly powerful way of structuring and leveraging functionality. You have a task in the application that needs to be done and can be done in multiple ways? Bam! Have a plugin type and define one or more plugins to handle that task in the way they see fit within the boundaries of that subsystem.

And although this is powerful, plugin derivatives are what really makes this an awesome thing. Derivatives are essentially instances of the same plugin but with some differences. And the best thing about them is that they are not defined entirely statically but they are “born” dynamically. Meaning that a plugin can be defined to do something and a deriver will make as many derivatives of that plugin as needed. Let’s see some examples from core to better understand the concept.

Menu links:

Menu links are plugins that are defined in YAML files and which map to the Drupal\Core\Menu\MenuLinkDefault class for their behaviour. However, we also have the Menu Link Content module which allows us to define menu links in the UI. So how does that work? Using derivatives.

The menu links created in the UI are actual content entities. And the Drupal\menu_link_content\Plugin\Deriver\MenuLinkContentDeriver creates as many derivatives of the menu link plugin as there are menu link content entities in the system. Each of these derivatives behaves almost the same as the ones defined in code but contains some differences specific to what has been defined in the UI by the user. For example, the URL (route) of the menu link is not taken from a YAML file definition but from the user-entered value.

Menu blocks:

Keeping with the menu system, another common example of derivatives is menu blocks. Drupal defines a Drupal\system\Plugin\Block\SystemMenuBlock block plugin that renders a menu. But on its own, it doesn’t do much. That’s where the Drupal\system\Plugin\Derivative\SystemMenuBlock deriver comes into play and creates a plugin derivative for each menu on the site. In doing so, it augments the plugin definitions with the info about the menu to render. And like this we have a block we can place for each menu on the site.

Migration deriver

Now that we know what plugin derivatives are and how they work, let’s see how we can apply this to our migration to import the category translations. But why would we actually use a deriver for this? We could simply copy the migration into another one and just use the Romanian label as the term name, no? Well, yes… but no.

Our data is now in 2 languages. It could be 23 languages. Or it could be 16. Using a deriver we can make a migration derivative for each available language dynamically and simply change the data field to use for each. Let’s see how we can make this happen.

The first thing we need to do is create another migration that will act as the “template”. In other words, the static parts of the migration which will be the same for each derivative. And as such, it will be like the SystemMenuBlock one in that it won’t be useful on its own.

Let’s call it advanced_migrations.migration.category_translations.yml:

id: category_translations
label: Category translations
migration_group: advanced_migrations
deriver: Drupal\advanced_migrations\CategoriesLanguageDeriver
source:
  plugin: csv
  path: 'modules/custom/advanced_migrations/data/categories.csv'
  header_row_count: 1
  keys:
    - id
  column_names:
    0:
      id: 'Unique Id'
    1:
      label_en: 'Label EN'
    2:
      label_ro: 'Label RO'
destination:
  plugin: entity:taxonomy_term
  translations: true
process:
  vid:
    plugin: default_value
    default_value: categories
  tid:
    plugin: migration_lookup
    source: id
    migration: categories
  content_translation_source:
    plugin: default_value
    default_value: 'en'

migration_dependencies:
  required:
    - categories

Much of it is like the previous migration. There are some important changes though:

  • We use the deriver key to define the deriver class. This will be the class that creates the individual derivative definitions.
  • We configure the destination plugin to accept entity translations. This is needed to ensure we are saving translations and not source entities. Check out Drupal\migrate\Plugin\migrate\destination\EntityContentBase for more info.
  • Unlike the previous migration, we also define a process mapping for the taxonomy term ID (tid). And we use the migration_lookup process plugin to map the IDs to the ones from the original migration. We do this to ensure that our migrated entity translations are associated with the correct source entities. Check out Drupal\migrate\Plugin\migrate\process\MigrationLookup for how this plugin works.
  • Specific to the destination type (content entities), we also need to import a default value into the content_translation_source field if we want the resulting entity translation to be correct. We just default this to English because that is the language the original migration imported in; it is the source language of the translation set.
  • Finally, because we need to lookup in the original migration, we also define a migration dependency on the original migration. So that the original gets run, followed by all the translation ones.

You’ll notice another important difference: the term name is missing from the mapping. That will be handled in the deriver based on the actual language of the derivative because this is not something we can determine statically at this stage. So let’s see that now.

In our main module namespace we can create this very simple deriver (which we referenced in the migration above):

namespace Drupal\advanced_migrations;

use Drupal\Component\Plugin\Derivative\DeriverBase;
use Drupal\Core\Language\LanguageInterface;
use Drupal\Core\Language\LanguageManagerInterface;
use Drupal\Core\Plugin\Discovery\ContainerDeriverInterface;
use Symfony\Component\DependencyInjection\ContainerInterface;

/**
 * Deriver for the category translations.
 */
class CategoriesLanguageDeriver extends DeriverBase implements ContainerDeriverInterface {

  /**
   * @var \Drupal\Core\Language\LanguageManagerInterface
   */
  protected $languageManager;

  /**
   * CategoriesLanguageDeriver constructor.
   *
   * @param \Drupal\Core\Language\LanguageManagerInterface $languageManager
   */
  public function __construct(LanguageManagerInterface $languageManager) {
    $this->languageManager = $languageManager;
  }

  /**
   * {@inheritdoc}
   */
  public static function create(ContainerInterface $container, $base_plugin_id) {
    return new static(
      $container->get('language_manager')
    );
  }

  /**
   * {@inheritdoc}
   */
  public function getDerivativeDefinitions($base_plugin_definition) {
    $languages = $this->languageManager->getLanguages();
    foreach ($languages as $language) {
      // We skip EN as that is the original language.
      if ($language->getId() === 'en') {
        continue;
      }

      $derivative = $this->getDerivativeValues($base_plugin_definition, $language);
      $this->derivatives[$language->getId()] = $derivative;
    }

    return $this->derivatives;
  }

  /**
   * Creates a derivative definition for each available language.
   *
   * @param array $base_plugin_definition
   * @param LanguageInterface $language
   *
   * @return array
   */
  protected function getDerivativeValues(array $base_plugin_definition, LanguageInterface $language) {
    $base_plugin_definition['process']['name'] = [
      'plugin' => 'skip_on_empty',
      'method' => 'row',
      'source' => 'label_' . $language->getId(),
    ];

    $base_plugin_definition['process']['langcode'] = [
      'plugin' => 'default_value',
      'default_value' => $language->getId(),
    ];

    return $base_plugin_definition;
  }

}

All plugin derivers extend Drupal\Component\Plugin\Derivative\DeriverBase and have only one method to implement: getDerivativeDefinitions(). And to make our class container-aware, we implement the deriver-specific ContainerDeriverInterface, which provides us with the create() method.

getDerivativeDefinitions() receives an array containing the base plugin definition, essentially our entire YAML migration file turned into an array. It needs to return an array of derivative definitions keyed by their derivative IDs, and it’s up to us to say what these are. In our case, we simply load all the available languages on the site and create a derivative for each. The definition of each derivative needs to be a “version” of the base one, and we are free to do what we want with it as long as it still remains correct. So for our purposes, we add two process mappings (the ones we need to determine dynamically):

  • The taxonomy term name. But instead of the simple Get plugin, we use the Drupal\migrate\Plugin\migrate\process\SkipOnEmpty one because we don’t want to create a translation at all for this record if the source column label_[langcode] is missing. Makes sense, right? Data is never perfect.
  • The translation langcode which defaults to the current derivative language.

And with this we should be ready. We can clear the cache and inspect our migrations again. We should see a new one with the ID category_translations:ro (the base plugin ID + the derivative ID). And we can now run this migration as well and we’ll have our term translations imported.
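
Running it works like before, except that with Migrate Tools we target the full plugin ID of the derivative (base plugin ID plus derivative ID):

drush migrate:import category_translations:ro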

Other examples

I think dynamic migrations are extremely powerful in certain cases. Importing translations is an extremely common thing to do, and this is a nice way of doing it. But there are other examples as well. For instance, importing Commerce products: you’ll create a migration for the products and one for the product variations. But a product can have multiple variations depending on the actual product specification. For example, the product can have 3 prices depending on 3 delivery options. So you can dynamically create a product variation migration for each of the delivery options. Or whatever the use case may be.

Conclusion

As we saw, the Drupal 8 migration system is extremely powerful and flexible. It allows us to concoct all sorts of creative ways to read, clean and save our external data into Drupal. But the reason this system is so powerful is because it rests on the lower-level plugin API which is meant to be used for building such systems. So migrate is one of them. But there are others. And the good news is that you can build complex applications that leverage something like the plugin API for extremely creative solutions. But for now, you learned how to get your translations imported which is a big necessity.

Jun 13 2019
Jun 13

Burnout is becoming an increasingly prevalent problem, especially in a field as fast-paced as development. With more and more businesses undergoing a digital transformation, the demand for experienced developers has never been higher - and with it, naturally, come higher and higher demands from these developers.

This is further accentuated by the work- and career-oriented mentality we see widespread today. You can frequently spot people on social media either bragging or complaining about how hard or how long they’ve worked, but, even in the first case, such a workflow is certainly not sustainable. 

It’s true that more work yields more profit; but what good is profit when one’s mental health, and by consequence also physical health, suffer on account of work overload?

Another reason for burnout that should also be mentioned, besides excessive working hours, is a general dissatisfaction with how the work is done and a suboptimal workplace experience. 

In fact, we could argue that monotony or having very little control over one’s work is even more detrimental than working really long hours. Put the two together and you’re practically calling for burnout to arrive. 

This two-part series explores how you can spot the symptoms of your developers burning out and how you can mitigate or even prevent developer burnout. In the first part, we’ll focus on the symptoms of burnout; in the second, we’ll take a look at how to reduce the risks of burnout as a developer, as well as what measures to take as a manager to reduce those risks and mitigate burnout when it happens.

Symptoms of burnout - and how to spot them

Let’s start with the symptoms of burnout. Logically, it’s easier to spot these through self-reflection (e.g. you notice a lack of energy and/or motivation), but it’s even more crucial for managers to be able to spot them in their employees. So, let's take a look at what signs to look for as indicators that your developers are burning out.

  • They’re lacking energy and/or motivation: this is likely the most obvious symptom of burnout, but should nonetheless be mentioned. If you notice that certain developers on your team constantly seem sleepy and unmotivated, especially in a more hectic period, this should be a red flag that something is wrong.
  • They’re frequently late to work: in line with the previous point, sleepiness and late working hours may result in sleeping through morning alarms and consequently arriving late. The first instinct would be to scold or punish the person in question, but a deeper investigation may reveal other reasons for it - especially if they still seem lacking in energy after arriving late, and this happens on a relatively regular basis.
  • They’ve isolated themselves and stopped talking to coworkers: this can be difficult to spot in employees who are more introverted by nature, or those who work on specific projects that don’t require as much collaboration (or even disallow it altogether, e.g. when working under a very strict NDA). This means that you need to be extra mindful of these employees so that potential signs of their burnout don’t go overlooked. 
  • They’ve stopped participating at meetings: this point is similar to the previous one in that it concerns a kind of isolation. If someone is physically present at meetings, but “not really there” in the practical sense, it can either be because they have so much on their mind already, or because they’re too tired to actively participate. Both of these can be signs of burnout. 
  • The quality of their work has decreased: if you notice an increase of bugs and mistakes in a certain developer’s code, or if they take longer than usual to solve relatively simple tasks that involve familiar technologies, this could indicate that they’re suffering from burnout. Make sure to thoroughly explore this possibility before you sanction them.

Granted, some of these are almost impossible to spot if you have a freelancer or a team of developers working for you remotely. In such a case, you should also look for the following indicators: a remote worker fails to complete certain tasks or delivers them very late, stops responding to calls and direct messages, fails to track their time, etc.

A word of warning, though: most of the points we’ve discussed here can be indicators of other issues, not necessarily burnout, but also personal issues such as family troubles and health issues (but, again, these could be the result of burnout, so it’s a bit of a “chicken-and-egg” situation). 

Nevertheless, if you are an open company that has a healthy company culture and a pretty good grasp of the goings-on in the lives of your employees (without being too Big Brother-y, of course), you can assume these are symptoms of burnout - especially if they start appearing in periods that demand more, or more difficult, work than usual.

As a manager or a CEO of a smaller company, you need to communicate frequently and clearly with your subordinates and establish a trusting relationship with them. This will make it more likely that they’ll be willing to open up to you about their work and any difficulties they might be facing, and getting to know them will help you spot that something is off.

This holds true for teammates as well - be mindful of changes in your coworkers’ behavior that may indicate that they are overworked and on a path towards burnout. It’s much easier to spot something when you’re aware of it and know what you’re looking for. 

A very useful tool for collecting feedback from your employees, which we at Agiledrop also make good use of, is Officevibe. By guaranteeing anonymity, it gives those individuals who don’t want to expose themselves a chance to voice their opinions and/or dissatisfactions. With it, you’ll be able to get honest feedback and therefore a better overview of your team.

Conclusion

Hopefully, we’ve shed some light on the main signs of developer burnout and how to spot them. If you want to learn more, make sure to check back early next week for part 2 of the series, in which we’ll dive into some ways of reducing the risks of burnout occurring or even preemptively preventing it. 

Other posts in this series:

  • Part 2: How to prevent or mitigate developer burnout (coming soon)
Jun 13 2019
Jun 13

Google Maps don't look appealing or pretty by default when you embed them in your Drupal content. Nor do they always coordinate nicely with your site's look and feel.

What if you found a way to give them a custom design? For example - your own color? In this tutorial, you will learn how to give your Drupal Google Maps a custom style with the Styled Google Map contrib module.

Step #1. Download the Required Modules

For this example, you’ll have to download and enable 3 modules.

  • Styled Google Map.
  • Geofield Map.
  • Geofield (this is a dependency for the other two modules).

Use your preferred method to download the modules. I’m using Composer since it will automatically take care of all the needed dependencies.

Step #2. Configure the Styled Google Map Settings

  • Click Configuration > Web services > Styled Google Map settings page.
  • Click the link above the blue button.
  • Get your API Key from Google.

  • Scroll down until you see the blue Get a Key button and click on it.
  • Create a project name.
  • Click Next.

  • Copy the generated key.
  • Click Done.

  • Paste the key in your Drupal site.
  • Click Save configuration.

Step #3. Create a Content Type with a Location

  • Click Structure > Content types > Add content type.
  • Give your content type a proper name.
  • Click Save.
  • Add fields.

  • Click the blue Add field button.
  • Choose Geofield.
  • Add a proper label.
  • Click Save and continue.

  • Leave the default number of values.
  • Click Save field settings.

Notice that you can choose multiple (or unlimited) values here if you want to show more than one marker on the same map (for example, a fast food chain with multiple locations).

Step #4. Configure the Content Type Display

  • Click Structure > Content types > Location > Manage display.
  • Look for your Geofield field.
  • Change the format to Styled Google Map.
  • There’s a cogwheel on the right; it handles various configuration options for the map (we’ll come back here later).
  • Click Save.

Step #5. Configure the Form Display

  • Click Structure > Content types > Location > Manage form display.
  • Look for your Geofield field.
  • Change the widget to Geofield Map.
  • Click Save.

Step #6. Create a Node

  • Create a node of the Location type.
  • The Geofield Map widget you chose in the last step will help you to position the marker with an address (and not with latitude and longitude values).
  • Click Save.

Step #7. Configure The Map Design

There are lots of map designs on this site.

  • Choose your preferred one.
  • Copy the JavaScript code on the left.

  • Click Structure > Content types > Location > Manage display.
  • Click the cogwheel on the right of your Geofield field. You’ll find a lot of configuration options. Feel free to explore and experiment with them.
  • Scroll down and select MAP STYLE.
  • Paste the code you selected into the textbox.

  • Click Update.
  • Click Save.

Take a look at your node; the map now has a custom look!

If you want to customize your maps even further, with your own colors, take a look at this style wizard application on GitHub; it helps you generate the JSON code required to style the map.
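
For reference, such a style definition is just a JSON array of styling rules. A tiny, made-up example that recolors water might look like this:

[
  {
    "featureType": "water",
    "elementType": "geometry",
    "stylers": [
      { "color": "#a0d6d1" }
    ]
  }
]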

Additional Reading: 

Would you like to know more about how to build great websites with Drupal 8? Sign up for our Video Club and watch its easy-to-follow lessons at your convenience.


About the author

Jorge has lived in Ecuador and Germany. Now he is back in his homeland, Colombia. He spends his time translating from English and German into Spanish. He enjoys playing with Drupal and other open source content management systems and technologies.
Jun 13 2019
Jun 13

With Drupal 9 approaching rapidly, it is an exciting time to be on the Drupal Association Board. The Association must continue to evolve alongside the project so we can continue providing the right kind of support. And, it is the Drupal Association Board who develops the Association’s strategic direction by engaging in discussions around a number of strategic topics throughout their term. As a community member, you can be a part of this important process by becoming an At-large Board Member.

We have two At-large positions on the Association Board of Directors. These positions are self-nominated and then elected by the community. Simply put, each At-large Director position is designed to ensure there is community representation on the Drupal Association Board.

Inclusion

2018

Map of 2018 candidates

In 2018, we made a special effort to encourage geographic inclusion through the people who were candidates for election, and we were delighted that candidates stood on six continents all across the world — thank you!

2019

Drupal Association logo, Pride version

Now, in 2019, and recognising we are in the middle of Pride Month, we want to particularly encourage nominations from candidates from underrepresented or marginalised groups in our community. As referenced later in this blog post, anyone is eligible to nominate themselves, and voters can vote for whichever candidate they choose, but we want to encourage this opportunity to amplify the voices of underrepresented groups with representation on the Association Board. And as we meet the candidates, whether they are allies or members of these groups themselves, we hope to center issues of importance to these communities - in addition to the duties of care for the management of the Association that are always central to a board role.

As always, any individual can stand for election to the board, but by centering these important issues we are determined to encourage a board made of diverse members as that gives them the best ability to represent our diverse community.

If you are interested in helping shape the future of the Drupal Association, we encourage you to read this post and nominate yourself between 29 Jun, 2019 and 19 July 2019.

What are the Important Dates?

Self nominations: 29 Jun, 2019 to 19 July, 2019

Meet the candidates: 22 July, 2019 to 26 July, 2019

Voting: 1 August, 2019 to 16 August, 2019

Votes ratified, Winner announced: 3 September, 2019

How do nominations and elections work?

Specifics of the election mechanics were decided through a community-based process in 2012 with participation by dozens of Drupal community members. More details can be found in the proposal that was approved by the Drupal Association Board in 2012 and adapted for use this year.

What does the Drupal Association Board do?

The Board of Directors of the Drupal Association are responsible for financial oversight and setting the strategic direction for serving the Drupal Association’s mission, which we achieve through Drupal.org and DrupalCon. Our mission is: “Drupal powers the best of the Web.  The Drupal Association unites a global open source community to build and promote Drupal.”

New board members will help steer the strategic direction of the Drupal Association. Board members are advised of, but not responsible for, matters related to the day-to-day operations of the Drupal Association, including program execution, staffing, etc.

Directors are expected to contribute around five hours per month and attend three in-person meetings per year (financial assistance is available if required).

Association board members, like all board members for US-based organizations, have three legal obligations: duty of care, duty of loyalty, and duty of obedience. In addition to these legal obligations, there is a lot of practical work that the board undertakes. These generally fall under the fiduciary responsibilities and include:

  • Overseeing Financial Performance

  • Setting Strategy

  • Setting and Reviewing Legal Policies

  • Fundraising

  • Managing the Executive Director

To accomplish all this, the board comes together three times a year during two-day retreats. These usually coincide with the North American and major European Drupal Conferences, as well as one February meeting. As a board member, you should expect to spend a minimum of five hours a month on board activities.

Some of the topics that will be discussed over the next year or two are:

  • Strengthen sustainability

  • Grow Drupal adoption through our channels and partner channels

  • Evolve drupal.org and DrupalCon goals and strategies.

Who can run?

There are no restrictions on who can run, and only self-nominations are accepted.

Before self-nominating, we want candidates to understand what is expected of board members and what types of topics they will discuss during their term. That is why we now require candidates to:

What will I need to do during the elections?

During the elections, members of the Drupal community will ask questions of candidates. You can post comments on candidate profiles here on assoc.drupal.org.

In the past, we held group “meet the candidate” interviews. With many candidates over the last few years, group videos didn’t allow each candidate to properly express themselves, so we have replaced the group interview: candidates now create their own 3-minute video and add it to their candidate profile page. These videos must be posted by 19 July, 2019, and the Association will promote them to the community from 22 July, 2019. Hint: great candidates are those that exemplify the Drupal Values & Principles; that might provide structure for a candidate video. You are also encouraged to especially consider diversity and inclusion.

How do I run?

From 29 June, 2019, go here to nominate yourself. If you are considering running, please read the entirety of this post, and then be prepared to complete the self-nomination form. This form will be open from 29 June, 2019 through 19 July, 2019 at midnight UTC. You'll be asked for some information about yourself and your interest in the Drupal Association Board. When the nominations close, your candidate profile will be published and available for Drupal community members to browse. Comments will be enabled, so please monitor your candidate profile so you can respond to questions from community members. We will announce the new board member via our blog and social channels on 3 September, 2019.

Reminder, you must review the following materials before completing your candidate profile:

Who can vote?

Voting is open to all individuals who have a Drupal.org account by the time nominations open and who have logged in at least once in the past year. If you meet these criteria, your account will be added to the voters list on association.drupal.org and you will have access to the voting.

To vote, you will rank candidates in order of your preference (1st, 2nd, 3rd, etc.). You do not need to enter a vote on every candidate. The results will be calculated using an "instant runoff" method. For an accessible explanation of how instant runoff vote tabulation works, see videos linked in this discussion.

Elections process

Voting will be held from 1 August, 2019 to 16 August, 2019. During this period, you can review and comment on candidate profiles on assoc.drupal.org.

Finally, the Drupal Association Board will ratify the election and announce the winner on 3 September, 2019.

Have questions? Please contact Drupal Association Community Liaison, Rachel Lawson.

Finally, many thanks to nedjo for pioneering this process and documenting it so well!

Update to elected board member responsibilities

As detailed in a previous blog post, the elected members of the Drupal Association Board now have a further responsibility that makes their understanding of issues related to diversity & inclusion even more important: they provide a review panel for our Community Working Group. This is a hugely important role in our global community.

A note from the nomination committee

While this blog post is primarily directed at the community elections process for the board - the Nomination Committee of the Drupal Association wants to affirm that diversity is a top priority during the Drupal Association Board of Directors nomination process for the appointed positions as well. We will work to identify the best way to provide more insight regarding how the committee evaluates candidates.
