Jun 14 2019
Jun 14
Once upon a time, the authoring experience (AX) in Drupal left much to be desired. Content editors using Drupal 6 or Drupal 7 glanced longingly at WordPress editing screens, wishing for some of that ease of use. The flexibility of Drupal from a content management standpoint was never in doubt, but we all wished the edit screen looked better and behaved the way we were accustomed to from other modern digital products and services. Well, the wait is finally over! Welcome to the new Drupal authoring experience! Let's focus on three main areas of the Drupal authoring experience that have made Drupal 8 a game-changer for digital marketing professionals.

1. Gutenberg Editor

It's nice... it is really nice! Below is a screenshot of the new Gutenberg editor experience available in Drupal 8.
Jun 14 2019
Jun 14

What’s the one big challenge that the marketers and CMOs we partner with are facing this year? It’s really tough to put a finger on just one. Proving impact on revenue, marketing team staffing, personalization, and marketing-IT alignment are among the hurdles voiced in discussions that Mediacurrent’s sales team is having with prospects and clients. We are finding CMOs are pressed more than ever to show marketing’s value while the complexities and opportunities within digital continue to evolve. Let’s dive into each challenge, uncover what makes these hurdles difficult to clear, and look at the tools and approaches that can help marketers overcome them.

Proving Impact on Revenue

It’s probably not surprising that last year Gartner surveyed where CMOs were spending their marketing budgets. The firm found that marketing budgets have shrunk slightly year over year since 2016, while a higher percentage of those budgets is being allocated to digital. The pressure is on for marketers to prove how specific marketing campaigns and investments directly contribute to an organization’s revenue. Owners and shareholders want more specificity in understanding how much budget to allocate to higher revenue-generating activities. Furthermore, marketers need to react faster to fluctuating market conditions that impact customer experience.

How can you attribute revenue to specific marketing activities and demonstrate ROI so you can invest and optimize in the right activities? There are a number of SaaS tools available and most implement a specific approach to measure marketing attribution and achieve granular ROI tracking. 

  • Matomo - a GPL-licensed, on-premise web analytics stack.
  • Bizible - analytics and multi-touch revenue attribution.
  • Terminus / Brightfunnel - product suite that offers account-based marketing analytics and sales insights.
  • Conversion Logic - cross-channel attribution with AI-powered insights and budget forecast simulations.
  • Allocadia - a marketing performance management platform that offers revenue attribution and insights into marketing budget allocation.
  • Full Circle Insights - product stack that tracks marketing and sales attribution, built as a native Salesforce App.
  • Google Attribution - formerly called Adometry, it’s now part of the Google Marketing Platform.
  • Salesforce CRM - ROI tracking can be enabled with additional campaign configuration.
  • Domo Digital 360 - full suite of analytics, funnel, and ROI tracking.
  • Visual IQ - strategic measurement, multi-touch attribution, audience analysis, and predictive insights.
  • Oracle Marketing Cloud - integrated suite of tools that include analytics, marketing automation, content/social marketing, and data management.

Because each tool specializes in a specific aspect of ROI tracking, you will need to do some research to understand which tool best fits your organization. Most of the tools listed above implement some form of attribution tracking that will help achieve more robust ROI calculations. Our Director of Marketing, Adam Kirby, gives a helpful overview of how marketing attribution works in his MarTech West slide deck. Organizations we speak with often need help from consultants and agencies to understand how to optimally configure their martech stack with ROI tracking tools. That need brings us to the next challenge marketers are facing...

Staffing Teams - The Right Blend

Organizations are becoming more careful about finding the proper balance between internal team staffing and engaging help from an outside agency. In the early 2010s, there was a movement within Fortune 2000 companies to bring more expertise in-house. As martech complexity has grown into the latter part of this decade, organizations are realizing that their in-house teams have limited exposure to new technologies and approaches. By engaging with a wide spectrum of industries, clients, and projects, agencies provide a broad view into the martech landscape that in-house teams don’t have. What’s the right blend? It depends on the vertical. Organizations with one large website typically outsource at least half of their digital marketing. Higher Ed and Government have longer procurement cycles and, consequently, need at least 75% of their overall marketing team to be full-time in-house.

Not only do in-house teams need outside help to stay informed; budget scrutiny is also pushing CMOs to seek offshore development help. However, they are finding that offshore work falters when technology projects aren’t led by one or more onshore architects who maintain a project’s integrity between onshore stakeholders and offshore teams. These technical liaisons are critical to offshore development success. We see too many organizations assume that if offshore developers demonstrate technical competency, they should be fully capable of leading an implementation. Yet those organizations fail to consider how strongly local culture influences communication dynamics and how offshore teams perceive requirements.

Personalization

Another challenge marketers are targeting is how personalization can impact KPIs and produce a higher ROI percentage compared to other digital marketing efforts. In 2017, the concept of personalization was buzzing while marketers were trying to understand what it takes, in terms of content and labor, to implement. Since GDPR went into effect a little over a year ago, personalization efforts have had to account for how GDPR impacts customer data acquisition and retention, making personalization trickier and more complex to implement with respect to data analysis and the ability to capitalize on personalization opportunities. Tools like Acquia Lift, the open source marketing automation platform Mautic (recently acquired by Acquia), Triblio, and Optimizely Web Personalization offer slightly different perspectives on personalization. 

When evaluating if you’re ready for personalization, here are eight considerations that will dictate success when carefully planned or potential failure if not addressed:

  1. Do you have enough content that’s written for each persona your personalization effort needs to target?
  2. Do you have content creators who can continually create new and evergreen content?
  3. Do you have KPIs defined to track the performance of your personalization efforts?
  4. Is your martech stack compatible with personalization technologies that fit your business model?
  5. Do accurate, fresh data metrics exist in usable forms? Is data structured uniformly and exclusive of redundancies that might skew its meaning?
  6. How do data privacy laws impact the efficacy of a personalization initiative? Can enough of the right user data legally be captured to supply the initiative?
  7. Are data governance guidelines in place that ensure data integrity stays intact well beyond the implementation phase of a personalization initiative?
  8. Finally, is your department or organization committed to investing time and energy into personalization? It’s a long game and shouldn’t be mistaken for an off-the-shelf, set-it-and-forget-it content solution.

If you’re starting a personalization strategy from ground zero, Mediacurrent Senior Digital Strategist Danielle Barthelemy wrote a quick guide to creating a content strategy with personalization as the end-goal. Danielle illustrates how a sound personalization strategy positively influences purchase intent, response rate, and acquisition costs. 

Marketing-IT Alignment

In order for digital marketing execution to be as effective and efficient as possible with initiatives like ROI tracking and personalization, it’s imperative for marketing and IT teams to collaborate cohesively. A frictionless environment is critical for marketers to keep up with ever-increasing market speed. In some organizations, these two departments still maintain competing interests in relation to policy, security, infrastructure, and budget. Example scenarios include strict IT policies that stifle speed-to-market, cowboy marketers all but ignoring technical security when implementing new tools, and executives missing the budgetary discord that echoes when both departments operate in their own silos.

These independent agendas must be meshed together into one for the betterment of the organization. But how? 

  • Learn how to empathize by understanding each other’s goals and challenges across departments. Define a shared list of KPIs and set a timeframe for each.
  • Schedule weekly touch point meetings between IT and marketing leaders.
  • Conduct a quarterly tools review to understand the “why” behind tools that each department uses.
  • Demonstrate discipline-specific concepts that require collaboration from the other department. For instance, show IT how marketing attribution works and what’s required of them to make it successful. Or, show marketing what a normalized database is and how it will help marketing be successful by reducing duplicate data.

Marketing ROI: An Ongoing Challenge

Overall, the challenges CMOs are asking us about as we move into the latter half of 2019 are heavily rooted in accurately tracking ROI and putting tools in place to boost it. While marketers have been challenged with proving ROI for years, digital has evolved to a point where tools and systems exist that embolden marketers to aggressively pursue an understanding of where their money is best spent. For most organizations, there are still talent hurdles to overcome and knowledge gaps to fill to properly implement the martech and systems that accurately track ROI. 

How about you — what challenges is your marketing department working to solve this year? Have you found the right in-house to agency team blend? Have you had success with ROI tracking and personalization?

Jun 14 2019
Jun 14

There is a lot of buzz around “Decoupled Drupal,” and the term has quickly become ubiquitous in the industry. Drupal has won hearts by embracing new technology and opening up new possibilities. Fully separating the presentation layer from the content has given content management systems the means to accelerate their pace of innovation. 

In this blog post, we will address the what, why, and when of Decoupled Drupal for you. 

A headless robot

Decoupled Drupal Is For You

By providing separate frameworks for the front-end and back-end experience, Decoupled Drupal keeps content presentation completely separate from content management. It is also known as ‘Headless Drupal’: the ‘head’ refers to the front-end rendering, or presentation of the content, and the ‘body’ refers to the back-end storage. 

Addressing the 3 Ws: Why, What, When 

In this section, we will take one question at a time and examine the core considerations around Decoupled (Headless) Drupal. 

Why Decoupled?

As a flexible framework for developing websites, web and native apps, and similar digital products, Decoupled Drupal allows designers and front-end developers to build without limitations. As an organisation, you can leverage a decoupled approach for progressive web apps and native apps. Decoupled Drupal has created a buzz in the community with its divide-and-conquer development strategy.

What’s your Intention?

Your intentions determine the outcome, i.e., how your product will be built with Decoupled Drupal. For the developers working on it, here are a few scenarios and their outcomes: 

  • In case of standalone websites/applications, decoupled Drupal might not be a wise choice. 
  • For multiple web applications and websites, decoupled Drupal can be leveraged in two different ways. 
  • When building non-web native apps, you can employ decoupled Drupal to attain a content repository without its own public-facing front end.
Flow chart: choosing a decoupled approach (Source: Dri.es)

Once your intentions are clear, the next step is to see whether the approach can be executed with the resources you have. Here are a few questions that should influence your decision to choose decoupled Drupal: 

  • Is it right for your project and your team?
  • Do you have a strong grasp on your data needs?
  • Can your current hosting provider support this architecture?
  • Are you prepared to handle the complexity of serving content to multiple clients?
  • Do the URL alias values have a unique identifier that makes API requests easy?
  • Can your metadata logic generate meta tags, JSON-LD, and analytics with standardised rules?
  • Where are menus created, ordered, and managed? 
  • Do you have an architecture that supports combining multiple redirect rules into a single redirect?

When to Decouple

By now we have established that Decoupled Drupal comes with plenty of advantages. It’s time to delve deeper and look at the circumstances in which it works best: 

Decoupled Drupal allows for designers and front-end developers to build without limitations

Resources 

Decoupling Drupal requires developing the back end and front end separately, so separate resources are a must. Two individually capable teams that can collaborate and support each other make for a successful decoupled Drupal project. 

Multiple Channels

If you need to publish content and data across many platforms and products, that need will shape how, and whether, you go headless.

Applicable Content

Decoupling is a great fit if you already have interactive content. Visualisations, animations, and complex user flows push you toward frameworks like Ember, React, Vue.js, or Angular.

Drupal Interface

Sometimes a rich interface and built-in features can get in the way. Even with Drupal’s flexible content model for storing content, some cases call for a different interface for adding and managing that content. 

When Not to Decouple

Conversely, it is equally important to know in which situations a decoupled Drupal build will not thrive. Weigh these possibilities to rule out unsuitable projects:

  • Drupal has the advantage of a huge pool of free modules from the open source community. But with decoupled Drupal, the ability to easily “turn on” front-end functionality goes out of the window: separating the presentation layer means you can no longer manage your website’s front end directly from the CMS. 
  • Your front-end requirements should align with the front-end expertise available to you. Without a solid match, your decoupled plans can quickly run into trouble.  

Conclusion

There’s no question about what Decoupled Drupal can do. The real question is whether your business requirements fit the headless architecture like pieces of a puzzle. With the necessary technical leadership and expertise in this kind of web infrastructure, you can see your decoupling ambitions through to the end. 

We’d love to hear your feedback on our social media platforms: Twitter, Facebook and LinkedIn

And do not forget to share more ideas at [email protected]

Jun 14 2019
Jun 14

Another successful day at Drupal North is now complete! This day was packed with sessions from all kinds of speakers, including our very own Jigar Mehta and Robert Ngo. Some great discussions were had amongst the Drupal community which was out in full force. Here are some of the ideas that we saw repeated throughout the day:

Content must be modular

Making your content modular allows you to easily plug it into any new type of channel. There's no need for you to start from scratch just because you're creating something for a different platform or user base. And, if you keep this content in a centralized hub, all users have access to the most accurate and up-to-date versions.

Plan out where you're going in the initial design phase

Knowing where you're going makes it that much easier to get there. You need to start with solid components so you don't have to go back later on and make constant revisions. A detailed plan allows you to take advantage of UI Patterns that will save you time and headaches in the future.

More and more people actually know about Drupal

Years ago, many within the Drupal community would have to explain to people what Drupal, or even open source, was. This made the task of convincing them to switch to a Drupal site even harder. Now, executives and decision-makers will often have already heard of Drupal and just need to be convinced of what value YOU can bring to them.

Accessibility is key

The web is for everyone, and that means your website needs to be accessible to everyone. It's also important to maintain this accessibility: technology is always improving, so just because your site was accessible when you launched it three years ago doesn't mean it is today. And when you conduct user tests, try to recruit diverse participants in order to get more inclusive results.

Drupalers love basketball!

To wrap up the day, conference attendees went to the after party to catch game 6 of the NBA Finals -- GO RAPTORS!

Just one more day left of Drupal North and we hope you've been making the most of it! Make sure you're following along with us on LinkedIn and Twitter, and check out the rest of our daily recaps on this blog.

Jun 13 2019
Jun 13

You’ve decided it’s time to rebuild your website. Research has been done, conversion rates have been analyzed, the team has selected a rebuild over a focused fix, and you and your team are committed to making this happen. One of the easiest ways of ensuring your success is to remain mindful of a few key things as you work your way through this larger process.

Regarding that term, “mindful”: one of the Kanopi team’s favorite authors is Brené Brown. She writes, “Mindfulness requires that we not ‘over-identify’ with thoughts and feelings so that we are not caught up and swept away by negativity.” For the purposes of your website rebuild, I’d adapt this to be, “Mindfulness requires that we not ‘over-focus’ on what we’ve done before, and rather remain aware of what’s important for our success so that we can focus on where we want to be.”

So, let’s get to it and break down the top five things we need to be mindful of when executing a rebuild project.

1. YOU are the difference! Be engaged.

Stakeholder engagement can make or break a rebuild. But rebuilds are time-consuming, and you and your stakeholders will likely be pulled in several directions as you try to execute a rebuild while balancing other priorities and projects.

Your availability, open communication, and timely feedback are critical to enable your team to create the web presence your organization needs to reach its goals. Be realistic about how much time your team can devote to the project so you can be as fully engaged as possible. Define roles and responsibilities early as well so it’s clear who is handling what.

If you need an assist from an outside agency to keep the project moving quicker, be direct with them about your business needs and wants. Help them to understand your users and audiences. An agency will make every effort to dive deeply into understanding your market, but at the end of the day, you and your team are the experts on what you do. So view any outside agency as a partner who can work with you towards success, and stay engaged with them throughout the process.

2. Define success & track it

We cannot know if we’re successful until we have identified what success will look like. For some sites, it’s simply exposure. For others, it’s a need to meet specific goals. Take the time to define what your organization needs to achieve, and which key metrics will allow us to quantify success.

Not sure where to start? Here are common metrics you should benchmark now as you prepare for the rebuild:

  • Users: note how many users are regularly coming to your site
  • Bounce Rate: record the overall bounce rate. Make note if this is at, above or below your industry’s standard.
  • Average Session Duration: how long are users staying on your page?
  • Sessions by Channel: where are your users coming from? How much organic traffic is coming in?
  • Top Keywords: identify what words are being used in the search engines when users are finding you. Are these surprising?
  • Competitor Keywords: are users who are looking at your competitors using the same keywords?
  • Top Referrers: who is sending traffic to your site? Maybe social media is key, or you’re more focused on industry referrals. Determine where you should be in the market.
  • Conversion Rates: what forms do you need users to fill out? What conversions are critical to your business goals? These can take the form of contact forms or forms from your CRM tools such as Marketo or Pardot, or even visits to a specific page or video views.
  • Accessibility: does your site meet national or international compliance standards?

In short, benchmark where you are now, and use this data to help round out that definition of success. Then come back a few months after launch to reevaluate and compare so you can quantify the success to your stakeholders.

3. Get your content strategy in order

The old saying “Content is King” is truer today than ever. Users are more educated. Search engines have become smarter, looking for more than keywords — they look for meaning in phrases to help determine the focus of a given page.

As one of the most effective methods of growing audience engagement, developing your brand presence, and driving sales, content marketing is a mission-critical growth method for most businesses. — Hubspot

This is where most people turn to me and tell me they’ll get their team on it so they can move further along in the content process. But don’t underestimate the time and energy content development/aggregation can take, even if your larger project is hiring a copywriter to augment your team. All too often, when content becomes a late-stage endeavor a few things happen:

  • timelines get pushed out, waiting for content to be approved.
  • changes to the previous UX are often required to account for unrealized navigation or calls to action, causing potential budget overages.
  • content is rushed and not in alignment with the overall vision.

To help this process come together for your team, here are a few action items to start with:

  • Audit your content: take a full inventory of your site’s content to better identify:
    • what to keep
    • what to repurpose
      • for example: a video may look dated, but could your team write a blog post from that material?
    • what should not be migrated to your new site
      • this can be archived to be referenced at a later date
  • Build a sitemap: determine the hierarchy of the content on the new site.
  • Identify missing content: comparing your audit to your sitemap, what needs to be produced?
  • Track content creation: track who is responsible for writing, editing and approving content — and give them deadlines
  • Start thinking ahead: you may need to start planning future content. Developing an editorial calendar will help keep the process moving. Content typically included in an editorial calendar:
    • blog posts
    • social media posts
    • videos
    • infographics

When preparing for a rebuild, your content strategy has to be one of the first things your team takes on. This approach will save you time, headaches, and likely budget moving forward. 

4. Consider your users’ digital experience

By this stage in the process you should know your target market, their buying habits and why your product or service is of value to them. You likely have personas and other data to help back this up. But in the omnichannel world in which we thrive, there is often more to architecting an effective user journey. Understanding the nuances of the devices, the influence of how a user comes to your site, and the overall adherence to best practices are complex. For example, consider the following:

  • What percentage of users are coming from mobile devices?
    • Are your CTAs and main conversion points easy to access on a small screen?
    • Is the user journey simplified?
  • Are your users coming from social media?
    • Is it your blog driving traffic or more word of mouth?
    • Is it positive or negative attention?
  • Have you produced a user journey map to identify the different pathways to conversion?
    • Is your site currently set up to promote these journeys?
    • Are you utilizing personalization to customize that user journey?

You can learn more about how to use user research to gain insight into audience behavior, which will help you frame your thoughts about your personas’ overall user journey to conversion.

5. Think about the future of your site

Websites need to evolve and adapt as the needs of your users change over time, but as you rebuild, are you setting yourself up for more incremental changes moving forward? Keep in mind that most rebuilds are focused on the MLP or “Minimum Lovable Product.” It’s the simplest iteration of your site that will meet your current needs with the intent to continually improve it over time. Regardless of whether you’re focused on an MLP launch due to either time or budget constraints, we need to keep these future goals in mind as we progress.

And then there’s the technology side of this: whether you’re looking ahead to Drupal 8 or 9 or the next major evolution with WordPress, consider those needs now to help ‘future proof’ your new site. The web changes too quickly to risk your site being stale when it’s still brand new. Talk this through from the start with your team.

These steps will set you up for success.

Your site speaks to who you are as an organization to your target market. Whether you’re a non-profit, higher education or a corporate entity, being mindful now will set your team’s rebuild up for success. And if you need help with your rebuild, contact us. We’d love to partner with you and help you recognize that success.

Jun 13 2019
Jun 13

Just when you think Drupal couldn’t get any dumber, it goes and adds some great new features….. And TOTALLY redeems itself!



Released back in November of 2015, Drupal 8 has been slowly but steadily upping its game.

In case you’ve been lost in a jungle for the past couple of years, or maybe you just don’t keep up with that kind of thing, we’ve got you covered.

Here are just some of the things Drupal 8 and soon to be Drupal 9 have us jumping around like crazy apes about.

BigPipe

BigPipe is a technique that was invented by Facebook back in 2009 when they made the site twice as fast, which is an amazing feat in itself. 

How it works: pages are first broken up into multiple “sections,” which are then loaded in parallel so your users don’t have to wait for the page to load completely before they can start interacting with it.

Page speed is extremely important, considering 47% of people expect your site to load in less than 2 seconds and 40% will abandon it entirely if it takes longer than 3 seconds.

Not only that, but Google has indicated site speed (and as a result, page speed) is one of the signals used by its algorithm to rank pages.

If you care about SEO, you should care about the speed of your pages.

Page speed chart (Source: Google Developers)

The BigPipe module has been included in Drupal 8 core since 8.1 and became stable in 8.3.

Just having BigPipe enabled makes your pages faster with zero configuration needed. We can thank Drupal 8’s improved render pipeline & render API for that.

Yay Drupal 8!

Layout Builder

Drupal’s Layout Builder is probably what we are most excited about. 

For far too long, Drupal has been very restricting when it comes to building pages out and putting the content that you want, where you want it.

Imagine if Display Suite and Panels had a baby gorilla. That’s Drupal’s new Layout Builder.

Layout Builder was introduced in Drupal 8.5.0 as an experimental core module, but as of Drupal 8.7.0, it is now stable and production ready!

It offers a powerful visual design tool and is meant for the following three use cases, according to Drupal.org:

Layouts for templated content. The creation of "layout templates" that will be used to layout all instances of a specific content type (e.g. blog posts, product pages).

Customizations to templated layouts. The ability to override these layout templates on a case-by-case basis (e.g. the ability to override the layout of a standardized product page)

Custom pages. The creation of custom, one-off landing pages not tied to a content type or structured content (e.g. a single "About us" page).

The Layout Builder gives developers/site builders the ability to drag and drop site-wide blocks and content fields into regions within a given layout.

Layout Builder screenshot (Source: Drupal.org)

With custom and unique landing pages being so important nowadays, this is finally the flexibility and freedom we need!

Media

Media management has always been an afterthought in Drupal. 

Today we consume more videos and pictures than ever, with the likes of YouTube, Instagram, and Facebook.

Cisco predicts that video will make up 80 percent of all internet traffic by 2019. That's like... today!

Thanks to the Media in Drupal 8 Initiative, an experimental core media module was introduced in Drupal 8.4. Then in 8.5, it was moved to stable and has gotten even better in Drupal 8.6, with the addition of oEmbed, additional media type support, and a media library.

Media timeline (Source: Webwash)

Let’s break down all three of these for you.

Additional Media Type Support
Support for local audio, video, images and generic files, along with being able to embed remote YouTube and Vimeo videos.

oEmbed Support
Needed to handle the new remote video media type mentioned above.

Media Library
The most exciting of the three, and it pretty much speaks for itself: a library of all your media. You can use a grid view, which shows a thumbnail, title, and bulk-edit checkbox, or a table view if you prefer that sort of thing.

All-in-all, media management is still not where it needs to be, but all these additions to core are a massive jump in the right direction.

There is no reason to wait until Drupal 9

If you’re currently on Drupal 6 or 7 and aren’t totally pumped after reading this, you should be.

Finally, Drupal has given us the speed, flexibility, and freedom we need to improve workflow, save time and succeed online. 

What’s even better is that Drupal 9 will essentially be just another minor core update for Drupal 8 sites. The upgrade will be more seamless than ever before. 

There is just no reason to wait. Make the update today and enjoy all these great features.

Let’s start a conversation about it.

Drupal Development Experts

Jun 13 2019
Jun 13

By default, a Drupal 8 user account collects only very basic information about the user. 

And, most of that information is not visible to visitors or other users on the site.

Fortunately, Drupal makes it easy to modify and expand this profile so that people can add useful information about themselves such as their real name (versus a username), address, employer, URLs, biography, and more.

If you're new to how Drupal handles users, read this tutorial before starting. In this tutorial, I'm going to show you how to create expanded user profiles for your Drupal users.

First, let's add some fields to your user profiles. This allows users to provide more information about themselves.

  • Go to "Configuration", "People", "Account settings", and then "Manage fields". You will now see a screen which looks like the one below:

manage fields

Let's add the following Text (plain) fields:

  • First Name. Set the "Maximum length" to 50 characters.
  • Last Name. Set the "Maximum length" to 50 characters.

Next, add the following Link fields:

  • LinkedIn
  • Facebook
  • Personal Website

fields user profile
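Behind the scenes, each field added through the UI is saved as configuration. As a rough, trimmed sketch, the exported field storage config for the First Name field might look something like the following (the machine name field_first_name and the file name are assumptions, and a real export will contain a few more keys):

# field.storage.user.field_first_name.yml (hypothetical machine name)
langcode: en
status: true
dependencies:
  module:
    - user
id: user.field_first_name
field_name: field_first_name
entity_type: user
type: string
settings:
  # Matches the "Maximum length" of 50 set in the UI.
  max_length: 50
  is_ascii: false
  case_sensitive: false
module: core
locked: false
cardinality: 1
translatable: true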

  • Go to the "Manage display" tab and arrange the new fields in the order you want them to show to site visitors.

manage display user fields

  • Go to "People" and "Permissions".
  • Give the "View user information" permission to the Anonymous and Authenticated users.

view drupal user information
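If you keep configuration in code, the permission change above shows up in the exported role config. Here is a minimal sketch of the relevant excerpt, assuming 'access user profiles' is the machine name behind the "View user information" permission (worth verifying on your own site):

# Excerpt of user.role.anonymous.yml after granting the permission
langcode: en
status: true
id: anonymous
label: 'Anonymous user'
permissions:
  - 'access user profiles'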

Now, go and see those user profile fields that you just created:

  • Click your user name to go to "My account" in the black menu bar at the top.
  • Click the "Edit profile" tab.
  • Scroll down and you can use all the fields that you just created.
  • Fill in the fields.
  • Save your data and click the "View" tab to see your profile:

drupal user profile

Now, see how these fields appear to your site’s other users. For most users, the profile editing area will look similar to yours, but slightly different:

  • You can use the Masquerade module to see the site as the user would. If you're not familiar with Masquerade, read this tutorial.
  • Click the article writer name to go to "My account".
  • Click the "Edit profile" tab and see what the user sees:

drupal article writer

Finally, see how this appears to a new user:

  • Log out or visit your site in another browser.
  • Visit http://[your_web_address]/user/register
  • The registration screen should show the default Drupal fields, plus your new fields:

new user registration

If you want to remove any fields from the registration area, you can hide them by going to "Configuration", "People", "Account settings", and then "Manage form display".

Want to Learn More?

This tutorial was an extract from Drupal 8 Explained, the best-selling guide to Drupal 8. Grab a copy today to learn all the fundamentals of Drupal 8.


About the author

Steve is the founder of OSTraining. Originally from the UK, he now lives in Sarasota in the USA. Steve's work straddles the line between teaching and web development.
Jun 13 2019
Jun 13

CMI 2.0 session at Drupal Dev Days in Cluj-Napoca

Fabian Bircher

13 Jun 2019

0 Comments

Session slides for the dev days CMI 2.0 session

Today I presented the CMI 2.0 updates at Drupal Dev Days in Cluj-Napoca. The session went well; I received good feedback and had interesting conversations about configuration management afterwards.

Attached are the slides I presented.

There are plenty of issues to work on: join us in the sprint room or in the #config channel on Drupal Slack. Find more information on the CMI 2.0 project page and in the issues tagged "CMI 2.0 candidate".

Attachment: CMI 2.0 Devdays 2019.pdf (932.61 KB)
Jun 13 2019
Jun 13

It’s an undeniable truth that images have a strong power to engage customers. And there are plenty of ways to enhance this engagement, one of which is the image hover effect (aka mouseover effect). 

Features like this are meant to add a creative and interactive touch to your website, spark users’ interest, save space, make your website more user-friendly, and increase conversions.

Drupal 8 has easy content creation as a priority, and there are also many useful modules for creating image hover effects. Let’s take a look at a very simple but nice one — the Imagepin button module.

A brief overview of the Imagepin button Drupal module

The Imagepin button module combines image hover effect with an interesting image pinning effect. It allows you to add pins to images that will show some text when someone hovers the mouse over them. 

During content creation, editors will have a special option — “Pin widgets on this image.” They can create as many pins as they wish, write the text for them, and position them anywhere throughout the image.

Pins appear on the image:

Pins on image with Imagepin Drupal 8 module

Text appears as you hover the mouse over each pin:

Image hover effect with Imagepin Drupal 8 module

The module’s extensibility

Out of the box, the Imagepin button module works with the Slick module and displays all widgets as a carousel. But that’s just the beginning. The module is extensible with custom widgets, and the use of JavaScript can add more interactivity. 

So it is easy to adjust the module’s behavior to your website’s particular needs with the help of a good Drupal team. Let’s now see the module’s “classic” work on a simple example.

Creating image hover effect with the Imagepin button module

1. Installing and enabling the module

It begins with installing the Imagepin button module — either via Composer or by downloading it from drupal.org. Then it should be enabled in the “Media” section on the module list.

Enabling Imagepin Button Drupal 8 module

2. Enabling Imagepin for a content type’s image field

We go to the Manage Display tab of a content type that contains our image field and click the cogwheel next to this field. We need to check “Enable users to pin widgets on this image.” While doing this, we can:

  • choose the image style in which we will be pinning the image
  • optionally set breakpoints for mobile devices

Enabling Imagepin for image field Drupal 8

3. Creating pins for images

Next, when we add content, we see the “Pin widgets on this image” button and click it.

Adding pins to images with Imagepin Drupal 8 module

This button brings us to the “Pin widgets” UI where we see available widgets (none added yet). Let’s click “Add new” and submit the text we want to display. 

For example, on our image of a Europe travel map, we could pin sightseeing spots, starting with “Eiffel Tower.” So we write its name in the text field.

We check it in the “Available widgets” list and it turns orange, which means it is active. We can then drag the pin to the spot where we want it and click “Save these positions.”

Adding pins to images with Imagepin Drupal 8 module

When we save the content item, we can see our pinned image.

Pins on image with Imagepin module Drupal 8

As we hover over it, we see the image hover effect in action — the text shows up.

Image hover effect with Imagepin module Drupal 8

Get a nice image hover effect created for your Drupal website!

We have shown you a simple example of an image hover effect in Drupal 8, created with the Imagepin button module. Of course, it can be fully customized to meet your ideas. 

So if you need any help with:

  • configuring the Imagepin button module
  • extending it with custom plugins
  • creating image hover effect using other tools 

contact our expert Drupal team!

Jun 13 2019
Jun 13

In a world where there is no limit to the devices used to access information, you must ensure your data is always available on the go. The pace of innovation in content management is accelerating along with the number of channels that web content has to support.

A content marketer’s work is not done once the content for the website is created. Just as important is making it available across devices, managing content channel hubs, and keeping it user-friendly.

But first, let’s understand how the content presentation has changed over the years.

The Traditional Approach: Coupled Architecture

CMSs like Drupal are traditionally monolithic: they own the entire technology stack, both the back end (data layer) and the front end (presentation layer). The benefit is easy site management, with admin users able to write and publish on the same system.


In a monolithic architecture, everything is part of one cohesive unit of code, where components work together within the same memory space and resources.

Content editors have preferred the traditional approach because of its clear and logical modular architecture which allowed them to control in-place editing and layout management.

Diagram: tightly coupled architecture

They Broke Up! What?

Even though the coupled architecture was easy to develop, deploy, and test, decoupled application architecture has become popular lately owing to the break-through user experiences it provides.

Decoupling segregates the concerns of the front end and back end of an application by simply adding a layer of technical abstraction (an API).

One layer handles content creation and storage, and the other serves as the presentation layer. In this scenario, a CMS like Drupal serves as the back-end data repository, and a third-party application like React owns the front end, providing interactive UIs.

In the video, watch how the Drupal back end interacts with decoupled apps in the decoupled approach and how it differs from the traditional approach.

 

 

A decoupled architecture separates how a website’s content is created and stored from how it is displayed across multiple independent systems.

Fully Decoupled or Headless Architecture

Headless architecture is similar to decoupled architecture in that both have content management and storage back ends and deliver content from that database through a web service or API.

The only difference between the two is that headless Drupal has no defined front-end system and no functionality to present content to an end user on its own. The API, which exposes information for consumption, can connect to any number of applications.

Diagram: headless Drupal application

Srijan has implemented the headless approach for various clients, with ease and within the stipulated time frames. For example, Srijan helped Estee Lauder reduce its training session costs by up to 30% with a decoupled, multilingual LMS.

Understand your situation. Who is it for?

Here are some key pointers that will help you figure out if a fully decoupled approach is an option for your project:

  • Not for basic editorial websites

Decoupling will do no good for a standalone website with basic editorial capabilities, such as a blog. Such websites require little or no user interactivity, and decoupling can also hamper crucial features content editors rely on, like content preview, layout management, and in-line editing, making a simple website rather complicated.

  • Not if you can’t afford to lose built-in functionality

One of the major advantages of a CMS like Drupal is that you can access a plethora of contributed modules in just one click. By simply enabling the Simple Google Maps module from your admin toolbar, you can have a map on your website.

However, implementing a decoupled architecture takes away such easy-to-use features, since the CMS no longer manages the front end.

  • Not if you want to avoid added complexity

If your design goals are closely aligned with a traditional coupled architecture, then implementing a decoupled approach will complicate the process and add the extra cost of recreating those features from scratch.

Take a look here to learn in which use cases decoupled Drupal works best and how to decide whether you need this architecture for your next project.

Enter Progressive Decoupling: A Hybrid Approach

Progressive decoupling gives you the best of both worlds. It allows Drupal to leverage JavaScript frameworks (like React and Angular) for dynamic user experience by injecting JS only where it’s needed in the front-end.

The approach is in the best interest of both editors and developers, as it strikes a balance between editorial and developer needs.

This means the editor can control the layout of the page while the developer can use more JavaScript by integrating a JavaScript framework into the Drupal front end.

The video attached below will help you better understand the concept.

Comparison Chart: Coupled, Progressive or Fully Decoupled

This comparison chart will help you understand the features.

Feature | Coupled | Decoupled (Progressive) | Fully Decoupled/Headless
Architecture | Tightly coupled | Loosely coupled | Separated
Performance | Fast | Fast | Fastest
Fixed presentation environment | Yes | Yes | No
Use cases | Complete text-based sites involving no user interactivity | Websites that require rich/interactive elements of user experience | Websites that require rich/interactive user experience
Layout style | Overrides built-in themes and templates | Easy and secure third-party integrations | Easy and secure third-party integrations
Integration | No | Future-proof | Future-proof
SEO friendly | Most SEO friendly | SEO friendly | Non-SEO friendly
Delivery channels | Limited | Limited | Unlimited
API usability | No APIs | Based on architecture | Complete API based
Preview availability | Available | Available | Unavailable

What’s in Store for the Future of Decoupling

With enterprises choosing to opt for a more flexible and scalable experience, the gap between the developers and content editors needs to be reduced.

The rapid evolution of decoupling lets you construct a content model once, preview it on every channel, use familiar tools to edit or place content on any channel in question, and decrease delivery time.

At Srijan, we believe in a mature agile engineering process delivering better results. Contact us to get started.

Jun 13 2019
Jun 13

In this article we are going to explore some of the powers of the Drupal 8 migration system, namely the migration “templates” that allow us to build dynamic migrations. And by templates I don’t mean Twig templates but plugin definitions that get enhanced by a deriver to make individual migrations for each of the things that we need in the application. For example, as we will explore, each language.

The term “template” I inherit from the early days of Drupal 8 when migrations were config entities and core had migration (config) templates in place for Drupal to Drupal migrations. But I like to use this term to represent also the deriver-based migrations because it kinda makes sense. It’s a personal choice so feel free to ignore it if you don’t agree.

Before going into the details of how the dynamic migrations works, let’s cover a few of the more basic things about migrations in Drupal 8.

What is a migration?

The very first thing we should talk about is what actually is a migration. The simple answer to this question is: a plugin. Each migration is a YAML-based plugin that actually brings together all the other plugins the migration system needs to run an actual logical migration. And if you don’t know what a plugin is, they are swappable bits of functionality that are meant to perform a similar task, depending on their type. They are all over core and by now there are plenty of resources to read more about the plugin system, so I won’t go into it here.

Migration plugins, unlike most others such as blocks and field types, are defined in YAML files inside the module’s migrations folder. But just like all other plugin types, they map to a plugin class, in this case Drupal\migrate\Plugin\Migration.

The more important thing to know about migrations, however, is the logical structure they follow. And by this I mean that each migration is made up of a source, multiple processors and a destination. Makes sense, right? You need to get some data (the source reads and interprets its format), prepare it for its new destination (the processors alter or transform the data) and finally save it in the destination (which has a specific format and behaviour). And to make all this happen, we have plugins again:

  • Source plugins
  • Process plugins
  • Destination plugins

Source plugins are responsible for reading and iterating over the raw data being imported. And this can be in many formats: SQL tables, CSV files, JSON files, URL endpoint, etc. And for each of these we have a Drupal\migrate\Plugin\MigrateSourceInterface plugin. For average migrations, you’ll probably pick an existing source plugin, point it to your data and you are good to go. You can of course create your own if needed.

Destination plugins (Drupal\migrate\Plugin\MigrateDestinationInterface) are closely tied to the site being migrated into. And since we are in Drupal 8, these relate to what we can migrate to: entities, config, things like this. You will very rarely have to implement your own, and typically you will use an entity based destination.

In between these two, we have the process plugins (Drupal\migrate\Plugin\MigrateProcessInterface), which are admittedly the most fun. There are many of them already available in core and contrib, and their role is to take data values and prepare them for the destination. And the cool thing is that they are chainable so you can really get creative with your data. We will see in a bit how these are used in practice.
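For instance, here is a small hypothetical mapping (not one from the migration we build below) showing two process plugins chained on a single destination field: the first splits a comma-separated string and the second trims each resulting item:

# Hypothetical chained mapping; field_tags and tags_string are made-up names.
field_tags:
  -
    plugin: explode
    source: tags_string
    delimiter: ','
  -
    plugin: callback
    callable: trim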

The migration plugin is therefore a basic definition of how these other 3 kinds of plugins should be used. You get some meta, source, process, destination and dependency information and you are good to go. But how?

That’s where the last main bit comes into play: the Drupal\migrate\MigrateExecutable. This guy is responsible for taking a migration plugin and “running” it. Meaning that it can make it import the data or roll it back. And some other adjacent things that have to do with this process.

Migrate ecosystem

Apart from the Drupal core setup, there are a few notable contrib modules that any site doing migrations will (or should) use.

One of these is Migrate Plus. This module provides some additional helpful process plugins, the migration group configuration entity type for grouping migrations and a URL-based source plugin which comes with a couple of its own plugin types: Drupal\migrate_plus\DataFetcherPluginInterface (retrieve the data from a given protocol like a URL or file) and Drupal\migrate_plus\DataParserPluginInterface (interpret the retrieved data in various formats like JSON, XML, SOAP, etc). Really powerful stuff over here.

Another one is Migrate Tools. This one essentially provides the Drush commands for running the migrations. To do so, it provides its own migration executable that extends the core one to add all the necessary goodies. So in this respect, it’s a critical module if you wanna actually run migrations. It also makes an attempt at providing a UI but I guess more of that will come in the future.

The last one I will mention is Migrate Source CSV. This one provides a source plugin for CSV files. CSV is quite a popular data source format for migrations so you might end up using this quite a lot.

Going forward we will use all 3 of these modules.
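If your site is managed with Composer and you have Drush available, one possible way to fetch and enable them looks like this (module machine names as they appear on drupal.org):

# Download the contrib modules and enable them together with core Migrate.
composer require drupal/migrate_plus drupal/migrate_tools drupal/migrate_source_csv
drush en -y migrate migrate_plus migrate_tools migrate_source_csv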

Basic migration

After this admittedly long intro, let’s see what one of these migrations looks like. I will create one in my advanced_migrations module which you can also check out on GitHub. But first, let’s see the source data we are working with. To keep things simple, I have this CSV file containing product categories:

id,label_en,label_ro
B,Beverages,Bauturi
BA,Alcohols,Alcoolice
BAB,Beers,Beri
BAW,Wines,Vinuri
BJ,Juices,Sucuri
BJF,Fruit juices,Sucuri de fructe
F,Fresh food,Alimente proaspete

And we want to import these as taxonomy terms in the categories vocabulary. For now we will stick with the English label only. We will see afterwards how to get them translated with the corresponding Romanian labels as well.

As mentioned before, the YAML file goes in the migrations folder and can be named advanced_migrations.migration.categories.yml. The naming is pretty straightforward to understand so let’s see the file contents:

id: categories
label: Categories
migration_group: advanced_migrations
source:
  plugin: csv
  path: 'modules/custom/advanced_migrations/data/categories.csv'
  header_row_count: 1
  keys:
    - id
  column_names:
    0:
      id: 'Unique Id'
    1:
      label_en: 'Label EN'
    2:
      label_ro: 'Label RO'
destination:
  plugin: entity:taxonomy_term
process:
  vid:
    plugin: default_value
    default_value: categories
  name: label_en

It’s this simple. We start with some meta information such as the ID and label, as well as the migration group it should belong to. Then we have the definitions for the 3 plugin types we spoke about earlier:

Source

Under the source key we specify the ID of the source plugin to use and any source-specific definition. In this case we point it to our CSV file and kind of “explain” to it how to understand the CSV file. Do check out the Drupal\migrate_source_csv\Plugin\migrate\source\CSV plugin if you don’t understand the definition.

Destination

Under the destination key we simply tell the migration what to save the data as. Easy peasy.

Process

Under the process key we do the mapping between our data source and the destination specific “fields” (in this case actual Drupal entity fields). And in this mapping we employ process plugins to get the data across and maybe alter it.

In our example we migrate one field (the category name) and for this we use the Drupal\migrate\Plugin\migrate\process\Get process plugin, which is assumed unless another one is actually specified. All it does is copy the raw data as it is without making any change. It’s the most basic and simple process plugin. And since we are creating taxonomy terms, we need to specify a vocabulary, which we don’t necessarily have to take from the source. In this case we don’t, because we want to import all the terms into the categories vocabulary. So we can use the Drupal\migrate\Plugin\migrate\process\DefaultValue plugin to specify what value should be saved in that field for each term we create.
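For clarity, the shorthand name: label_en mapping above is just the compact form of spelling out the Get plugin explicitly, something along these lines:

# Equivalent to the shorthand 'name: label_en' used in the migration above.
name:
  plugin: get
  source: label_en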

And that’s it. Clearing the cache, we can now see our migration using Drush:

drush migrate:status

This will list our one migration and we can run it as well:

drush migrate:import categories

Bingo bango we have categories. Roll them back if you want with:

drush migrate:rollback categories

Dynamic migration

Now that we have the categories imported in English, let’s see how we can import their translations as well. And for this we will use a dynamic migration using a “template” and a plugin deriver. But first, what are plugin derivatives?

Plugin derivatives

The Drupal plugin system is an incredibly powerful way of structuring and leveraging functionality. You have a task in the application that needs to be done and can be done in multiple ways? Bam! Have a plugin type and define one or more plugins to handle that task in the way they see fit within the boundaries of that subsystem.

And although this is powerful, plugin derivatives are what really makes this an awesome thing. Derivatives are essentially instances of the same plugin but with some differences. And the best thing about them is that they are not defined entirely statically but they are “born” dynamically. Meaning that a plugin can be defined to do something and a deriver will make as many derivatives of that plugin as needed. Let’s see some examples from core to better understand the concept.

Menu links:

Menu links are plugins that are defined in YAML files and which map to the Drupal\Core\Menu\MenuLinkDefault class for their behaviour. However, we also have the Menu Link Content module which allows us to define menu links in the UI. So how does that work? Using derivatives.

The menu links created in the UI are actual content entities. And the Drupal\menu_link_content\Plugin\Deriver\MenuLinkContentDeriver creates as many derivatives of the menu link plugin as there are menu link content entities in the system. Each of these derivatives behaves almost the same as the ones defined in code but contains some differences specific to what has been defined in the UI by the user. For example, the URL (route) of the menu link is not taken from a YAML file definition but from the user-entered value.

Menu blocks:

Keeping with the menu system, another common example of derivatives is menu blocks. Drupal defines a Drupal\system\Plugin\Block\SystemMenuBlock block plugin that renders a menu. But on its own, it doesn’t do much. That’s where the Drupal\system\Plugin\Derivative\SystemMenuBlock deriver comes into play and creates a plugin derivative for each of the menus on the site. In doing so, it augments the plugin definitions with the info about the menu to render. And like this we have a block we can place for each menu on the site.

Migration deriver

Now that we know what plugin derivatives are and how they work, let’s see how we can apply this to our migration to import the category translations. But why would we actually use a deriver for this? We could simply copy the migration into another one and just use the Romanian label as the term name, no? Well, yes…but no.

Our data is now in 2 languages. It could be 23 languages. Or it could be 16. Using a deriver we can make a migration derivative for each available language dynamically and simply change the data field to use for each. Let’s see how we can make this happen.

The first thing we need to do is create another migration that will act as the “template”. In other words, the static parts of the migration which will be the same for each derivative. And as such, it will be like the SystemMenuBlock one in that it won’t be useful on its own.

Let’s call it advanced_migrations.migration.category_translations.yml:

id: category_translations
label: Category translations
migration_group: advanced_migrations
deriver: Drupal\advanced_migrations\CategoriesLanguageDeriver
source:
  plugin: csv
  path: 'modules/custom/advanced_migrations/data/categories.csv'
  header_row_count: 1
  keys:
    - id
  column_names:
    0:
      id: 'Unique Id'
    1:
      label_en: 'Label EN'
    2:
      label_ro: 'Label RO'
destination:
  plugin: entity:taxonomy_term
  translations: true
process:
  vid:
    plugin: default_value
    default_value: categories
  tid:
    plugin: migration_lookup
    source: id
    migration: categories
  content_translation_source:
    plugin: default_value
    default_value: 'en'

migration_dependencies:
  required:
    - categories

Much of it is like the previous migration. There are some important changes though:

  • We use the deriver key to define the deriver class. This will be the class that creates the individual derivative definitions.
  • We configure the destination plugin to accept entity translations. This is needed to ensure we are saving translations and not source entities. Check out Drupal\migrate\Plugin\migrate\destination\EntityContentBase for more info.
  • Unlike the previous migration, we define also a process mapping for the taxonomy term ID (tid). And we use the migration_lookup process plugin to map the IDs to the ones from the original migration. We do this to ensure that our migrated entity translations are associated to the correct source entities. Check out Drupal\migrate\Plugin\migrate\process\MigrationLookup for how this plugin works.
  • Specific to the destination type (content entities), we also need to import a default value into the content_translation_source field if we want the resulting entity translation to be correct. And we just default this to English because that was the default language the original migration imported in. This is the source language in the translation set.
  • Finally, because we need to lookup in the original migration, we also define a migration dependency on the original migration. So that the original gets run, followed by all the translation ones.

You’ll notice another important difference: the term name is missing from the mapping. That will be handled in the deriver based on the actual language of the derivative because this is not something we can determine statically at this stage. So let’s see that now.

In our main module namespace we can create this very simple deriver (which we referenced in the migration above):

namespace Drupal\advanced_migrations;

use Drupal\Component\Plugin\Derivative\DeriverBase;
use Drupal\Core\Language\LanguageInterface;
use Drupal\Core\Language\LanguageManagerInterface;
use Drupal\Core\Plugin\Discovery\ContainerDeriverInterface;
use Symfony\Component\DependencyInjection\ContainerInterface;

/**
 * Deriver for the category translations.
 */
class CategoriesLanguageDeriver extends DeriverBase implements ContainerDeriverInterface {

  /**
   * @var \Drupal\Core\Language\LanguageManagerInterface
   */
  protected $languageManager;

  /**
   * CategoriesLanguageDeriver constructor.
   *
   * @param \Drupal\Core\Language\LanguageManagerInterface $languageManager
   */
  public function __construct(LanguageManagerInterface $languageManager) {
    $this->languageManager = $languageManager;
  }

  /**
   * {@inheritdoc}
   */
  public static function create(ContainerInterface $container, $base_plugin_id) {
    return new static(
      $container->get('language_manager')
    );
  }

  /**
   * {@inheritdoc}
   */
  public function getDerivativeDefinitions($base_plugin_definition) {
    $languages = $this->languageManager->getLanguages();
    foreach ($languages as $language) {
      // We skip EN as that is the original language.
      if ($language->getId() === 'en') {
        continue;
      }

      $derivative = $this->getDerivativeValues($base_plugin_definition, $language);
      $this->derivatives[$language->getId()] = $derivative;
    }

    return $this->derivatives;
  }

  /**
   * Creates a derivative definition for each available language.
   *
   * @param array $base_plugin_definition
   * @param LanguageInterface $language
   *
   * @return array
   */
  protected function getDerivativeValues(array $base_plugin_definition, LanguageInterface $language) {
    $base_plugin_definition['process']['name'] = [
      'plugin' => 'skip_on_empty',
      'method' => 'row',
      'source' => 'label_' . $language->getId(),
    ];

    $base_plugin_definition['process']['langcode'] = [
      'plugin' => 'default_value',
      'default_value' => $language->getId(),
    ];

    return $base_plugin_definition;
  }

}

All plugin derivers extend the Drupal\Component\Plugin\Derivative\DeriverBase and have only one method to implement: getDerivativeDefinitions(). And to make our class container aware, we implement the deriver specific ContainerDeriverInterface that provides us with the create() method.

The getDerivativeDefinitions() method receives an array which contains the base plugin definition, so essentially our entire YAML migration file turned into an array. And it needs to return an array of derivative definitions keyed by their derivative IDs. And it’s up to us to say what these are. In our case, we simply load all the available languages on the site and create a derivative for each. And the definition of each derivative needs to be a “version” of the base one. And we are free to do what we want with it as long as it still remains correct. So for our purposes, we add two process mappings (the ones we need to determine dynamically):

  • The taxonomy term name. But instead of the simple Get plugin, we use the Drupal\migrate\Plugin\migrate\process\SkipOnEmpty one because we don’t want to create a translation at all for this record if the source column label_[langcode] is missing. Makes sense, right? Data is never perfect.
  • The translation langcode which defaults to the current derivative language.

And with this we should be ready. We can clear the cache and inspect our migrations again. We should see a new one with the ID category_translations:ro (the base plugin ID + the derivative ID). And we can now run this migration as well and we’ll have our term translations imported.
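For example, assuming Romanian is the only other language enabled on the site (as in our CSV), importing the derived migration would look something like this:

# Base plugin ID and derivative ID, separated by a colon.
drush migrate:import category_translations:ro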

Other examples

I think dynamic migrations are extremely powerful in certain cases. Importing translations is an extremely common thing to do and this is a nice way of doing it. But there are other examples as well. For instance, importing Commerce products. You’ll create a migration for the products and one for the product variations. But a product can have multiple variations depending on the actual product specification. For example, the product can have 3 prices depending on 3 delivery options. So you can dynamically create a product variation migration for each of the delivery options. Or whatever the use case may be.

Conclusion

As we saw, the Drupal 8 migration system is extremely powerful and flexible. It allows us to concoct all sorts of creative ways to read, clean and save our external data into Drupal. But the reason this system is so powerful is because it rests on the lower-level plugin API which is meant to be used for building such systems. So migrate is one of them. But there are others. And the good news is that you can build complex applications that leverage something like the plugin API for extremely creative solutions. But for now, you learned how to get your translations imported which is a big necessity.

Jun 13 2019
Jun 13

Burnout is becoming an increasingly prevalent problem, especially in a field as fast-paced as development. With more and more businesses undergoing a digital transformation, the demand for experienced developers has never been higher - and with it, naturally, come higher and higher demands from these developers.

This is further accentuated by the work- and career-oriented mentality we see widespread today. You can frequently spot people on social media either bragging or complaining about how hard or how long they’ve worked, but, even in the first case, such a workflow is certainly not sustainable. 

It’s true that more work yields more profit; but what good is profit when one’s mental health, and by consequence also physical health, suffer on account of work overload?

Another reason for burnout that should also be mentioned, besides excessive working hours, is a general dissatisfaction with how the work is done and a suboptimal workplace experience. 

In fact, we could argue that monotony or having very little control over one’s work is even more detrimental than working really long hours. Put the two together and you’re practically calling for burnout to arrive. 

This two-part series explores how you can spot the symptoms of your developers burning out and how you can mitigate or even prevent developer burnout. In the first part, we’ll focus on the symptoms of burnout; in the second, we’ll take a look at how to reduce the risks of burnout as a developer, as well as what measures to take as a manager to reduce those risks and mitigate burnout when it happens.

Symptoms of burnout - and how to spot them

Let’s start with the symptoms of burnout. Logically, it’s easier to spot these through self-reflection (e.g. you notice a lack of energy and/or motivation), but it’s even more crucial for managers to be able to spot them in their employees. So, let's take a look at what signs to look for as indicators that your developers are burning out.

  • They’re lacking energy and/or motivation: this is likely the most obvious symptom of burnout, but should nonetheless be mentioned. If you notice that certain developers on your team constantly seem sleepy and unmotivated, especially in a more hectic period, this should be a red flag that something is wrong.
  • They’re frequently late to work: in line with the previous point, sleepiness and late working hours may result in sleeping through morning alarms and consequently arriving late. The first instinct would be to scold or punish the person in question, but a deeper investigation may reveal other reasons for it - especially if they still seem lacking in energy after arriving late, and this happens on a relatively regular basis.
  • They’ve isolated themselves and stopped talking to coworkers: this can be difficult to spot in employees who are more introverted by nature, or those who work on specific projects that don’t require as much collaboration (or even disallow it altogether, e.g. when working under a very strict NDA). This means that you need to be extra mindful of these employees so that potential signs of their burnout don’t go overlooked. 
  • They’ve stopped participating at meetings: this point is similar to the previous one in that it concerns a kind of isolation. If someone is physically present at meetings, but “not really there” in the practical sense, it can either be because they have so much on their mind already, or because they’re too tired to actively participate. Both of these can be signs of burnout. 
  • The quality of their work has decreased: if you notice an increase in bugs and mistakes in a certain developer’s code, or if they take longer than usual to solve relatively simple tasks that involve familiar technologies, this could indicate that they’re suffering from burnout. Make sure to thoroughly explore this possibility before you sanction them.

Granted, some of these are almost impossible to spot if you have a freelancer or a team of developers working for you remotely. In such a case, you should also look for the following indicators: a remote worker fails to do certain tasks, or delivers them very late, they stop responding to calls and direct messages, they fail to track their time, etc. 

A word of warning, though: most of the points we’ve discussed here can be indicators of other issues, not necessarily burnout, but also personal issues such as family troubles and health issues (but, again, these could be the result of burnout, so it’s a bit of a “chicken-and-egg” situation). 

Nevertheless, if you are an open company that has a healthy company culture and a pretty good grasp of the goings-on in the lives of your employees (without being too Big Brother-y, of course), you can assume these are symptoms of burnout - especially if they start appearing in periods that demand more, or more difficult, work than usual. 

As a manager or a CEO of a smaller company, you need to communicate frequently and clearly with your subordinates and establish a trusting relationship with them. This will make it more likely that they’ll be willing to open up to you about their work and any difficulties they might be facing, and getting to know them will help you spot that something is off.

This holds true for teammates as well - be mindful of changes in your coworkers’ behavior that may indicate that they are overworked and on a path towards burnout. It’s much easier to spot something when you’re aware of it and know what you’re looking for. 

A very useful tool for collecting feedback from your employees, which we at Agiledrop also make good use of, is Officevibe. By guaranteeing anonymity, it gives those individuals who don’t want to expose themselves a chance to voice their opinions and/or dissatisfactions. With it, you’ll be able to get honest feedback and therefore a better overview of your team.

Conclusion

Hopefully, we’ve shed some light on the main signs of developer burnout and how to spot them. If you want to learn more, make sure to check back early next week for part 2 of the series, in which we’ll dive into some ways of reducing the risks of burnout occurring or even preemptively preventing it. 

Other posts in this series:

  • Part 2: How to prevent or mitigate developer burnout (coming soon)
Jun 13 2019
Jun 13

Google Maps don't look appealing or pretty by default when you embed them in your Drupal content. Nor do they always nicely coordinate with your site look and feel.

What if you found a way to give them a custom design? For example - your own color? In this tutorial, you will learn how to give your Drupal Google Maps a custom style with the Styled Google Map contrib module.

Step #1. Download the Required Modules

For this example, you’ll have to download and enable 3 Modules.

  • Styled Google Map.
  • Geofield Map.
  • Geofield (this is a dependency for the other two modules).

Use your preferred method to download the modules. I’m using Composer since it will automatically take care of all the needed dependencies.
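As a rough sketch, assuming a Composer-based Drupal 8 site with Drush available, the download and enable steps could look like this:

# Module machine names as they appear on drupal.org.
composer require drupal/styled_google_map drupal/geofield_map drupal/geofield
drush en -y geofield geofield_map styled_google_map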


Step #2. Configure the Styled Google Map Settings

  • Click Configuration > Web services > Styled Google Map settings page.
  • Click the link above the blue button.
  • Get your API Key from Google.


  • Scroll down until you see the blue Get a Key button and click on it.
  • Create a project name.
  • Click Next.


  • Copy the generated key.
  • Click Done.


  • Paste the key in your Drupal site.
  • Click Save configuration.


Step #3. Create a Content Type with a Location

  • Click Structure > Content types > Add content type.
  • Give your content type a proper name.
  • Click Save.
  • Add fields.


  • Click the blue Add field button.
  • Choose Geofield.
  • Add a proper label.
  • Click Save and continue.


  • Leave the default number of values.
  • Click Save field settings.

Notice that you can choose multiple (or unlimited) values here if you want to show more than one marker on the same map (for example, a fast food chain with multiple locations).


Step #4. Configure the Content Type Display

  • Click Structure > Content types > Location > Manage display.
  • Look for your Geofield field.
  • Change the format to Styled Google Map.
  • There’s a cogwheel on the right, it handles various configuration options for the map (we’ll come back here later).
  • Click Save.


Step #5. Configure the Form Display

  • Click Structure > Content types > Location > Manage form display.
  • Look for your Geofield field.
  • Change the widget to Geofield Map.
  • Click Save.


Step #6. Create a Node

  • Create a node of the Location content type.
  • The Geofield Map widget you chose in the last step will help you to position the marker with an address (and not with latitude and longitude values).
  • Click Save.


Step #7. Configure The Map Design

There are lots of map designs on this site.

  • Choose your preferred one.
  • Copy the JavaScript code on the left.


  • Click Structure > Content types > Location > Manage display.
  • Click the cogwheel on the right of your Geofield field. You’ll find a lot of configuration options. Feel free to explore and experiment with them.
  • Scroll down and select MAP STYLE.
  • Paste the code you selected into the textbox.


  • Click Update.
  • Click Save.

Take a look at your node; the map now has a custom look!


If you want to customize your maps even further with your own colors, take a look at this style wizard application on GitHub; it helps you generate the JSON code required to style the map.
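To give you an idea of what that generated code looks like, here is a minimal, purely illustrative snippet with a single rule that recolors water geometry (a real style you copy or generate will be much longer, and the color value here is made up):

// Hypothetical minimal style for illustration only.
[
  {
    "featureType": "water",
    "elementType": "geometry",
    "stylers": [
      { "color": "#0e4c92" }
    ]
  }
]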


Additional Reading: 

Would you like to know more about how to build great websites with Drupal 8? Sign up for our Video Club and watch its easy-to-follow lessons at your convenience.


About the author

Jorge lived in Ecuador and Germany. Now he is back in his homeland, Colombia. He spends his time translating from English and German to Spanish. He enjoys playing with Drupal and other Open Source Content Management Systems and technologies.
Jun 13 2019
Jun 13

With Drupal 9 approaching rapidly, it is an exciting time to be on the Drupal Association Board. The Association must continue to evolve alongside the project so we can continue providing the right kind of support. And, it is the Drupal Association Board who develops the Association’s strategic direction by engaging in discussions around a number of strategic topics throughout their term. As a community member, you can be a part of this important process by becoming an At-large Board Member.

We have two At-large positions on the Association Board of Directors. These positions are self-nominated and then elected by the community. Simply put, each At-large Director position is designed to ensure there is community representation on the Drupal Association Board.

Inclusion

2018

Map of 2018 candidates

In 2018, we made a special effort to encourage geographic inclusion through the people who were candidates for election and we were delighted that candidates stood on six continents all across the world — thank you!

2019

Drupal Association logo, Pride version

Now, in 2019, and recognising we are in the middle of Pride Month, we want to particularly encourage nominations from candidates from underrepresented or marginalised groups in our community. As referenced later in this blog post, anyone is eligible to nominate themselves, and voters can vote for whichever candidate they choose, but we want to encourage this opportunity to amplify the voices of underrepresented groups with representation on the Association Board. And as we meet the candidates, whether they are allies or members of these groups themselves, we hope to center issues of importance to these communities - in addition to the duties of care for the management of the Association that are always central to a board role.

As always, any individual can stand for election to the board, but by centering these important issues we are determined to encourage a board made of diverse members as that gives them the best ability to represent our diverse community.

If you are interested in helping shape the future of the Drupal Association, we encourage you to read this post and nominate yourself between 29 June, 2019 and 19 July, 2019.

What are the Important Dates?

Self nominations: 29 Jun, 2019 to 19 July, 2019

Meet the candidates: 22 July, 2019 to 26 July, 2019

Voting: 1 August, 2019 to 16 August, 2019

Votes ratified, Winner announced: 3 September, 2019

How do nominations and elections work?

Specifics of the election mechanics were decided through a community-based process in 2012 with participation by dozens of Drupal community members. More details can be found in the proposal that was approved by the Drupal Association Board in 2012 and adapted for use this year.

What does the Drupal Association Board do?

The Board of Directors of the Drupal Association are responsible for financial oversight and setting the strategic direction for serving the Drupal Association’s mission, which we achieve through Drupal.org and DrupalCon. Our mission is: “Drupal powers the best of the Web.  The Drupal Association unites a global open source community to build and promote Drupal.”

New board members will help steer the strategic direction of the Drupal Association. Board members are advised of, but not responsible for, matters related to the day-to-day operations of the Drupal Association including program execution, staffing, etc.

Directors are expected to contribute around five hours per month and attend three in-person meetings per year (financial assistance is available if required).

Association board members, like all board members for US-based organizations, have three legal obligations: duty of care, duty of loyalty, and duty of obedience. In addition to these legal obligations, there is a lot of practical work that the board undertakes. These generally fall under the fiduciary responsibilities and include:

  • Overseeing Financial Performance

  • Setting Strategy

  • Setting and Reviewing Legal Policies

  • Fundraising

  • Managing the Executive Director

To accomplish all this, the board comes together three times a year during two-day retreats. These usually coincide with the North American and major European Drupal Conferences, as well as one February meeting. As a board member, you should expect to spend a minimum of five hours a month on board activities.

Some of the topics that will be discussed over the next year or two are:

  • Strengthen sustainability

  • Grow Drupal adoption through our channels and partner channels

  • Evolve drupal.org and DrupalCon goals and strategies.

Who can run?

There are no restrictions on who can run, and only self-nominations are accepted.

Before self-nominating, we want candidates to understand what is expected of board members and what types of topics they will discuss during their term. That is why we now require candidates to:

What will I need to do during the elections?

During the elections, members of the Drupal community will ask questions of candidates. You can post comments on candidate profiles here on assoc.drupal.org.

In the past, we held group “meet the candidate” interviews. With many candidates in the last few years, group videos didn’t allow each candidate to properly express themselves. We have replaced the group interview and now allow candidates to create their own 3-minute video and add it to their candidate profile page. These videos must be posted by 19 July, 2019, and the Association will promote the videos to the community from 22 July, 2019. Hint: great candidates would be those that exemplify the Drupal Values & Principles; that might provide structure for a candidate video. You are also encouraged to especially consider diversity and inclusion.

How do I run?

From 29 June, 2019, go here to nominate yourself.  If you are considering running, please read the entirety of this post, and then be prepared to complete the self-nomination form. This form will be open on 29 June, 2019 through 19 July, 2019 at midnight UTC. You'll be asked for some information about yourself and your interest in the Drupal Association Board. When the nominations close, your candidate profile will be published and available for Drupal community members to browse. Comments will be enabled, so please monitor your candidate profile so you can respond to questions from community members. We will announce the new board member via our blog and social channels on 3 September, 2019.

Reminder, you must review the following materials before completing your candidate profile:

Who can vote?

Voting is open to all individuals who have a Drupal.org account by the time nominations open and who have logged in at least once in the past year. If you meet these criteria, your account will be added to the voters list on association.drupal.org and you will have access to the voting.

To vote, you will rank candidates in order of your preference (1st, 2nd, 3rd, etc.). You do not need to enter a vote on every candidate. The results will be calculated using an "instant runoff" method. For an accessible explanation of how instant runoff vote tabulation works, see videos linked in this discussion.

Elections process

Voting will be held from 1 August, 2019 to 16 August, 2019. During this period, you can review and comment on candidate profiles on assoc.drupal.org.

Finally, the Drupal Association Board will ratify the election and announce the winner on 3 September, 2019.

Have questions? Please contact Drupal Association Community Liaison, Rachel Lawson.

Finally, many thanks to nedjo for pioneering this process and documenting it so well!

Update to elected board member responsibilities

As detailed in a previous blog post, the elected members of the Drupal Association Board now have a further responsibility that makes their understanding of issues related to diversity & inclusion even more important; they provide a review panel for our Community Working Group. This is a hugely important role in our global community.

A note from the nomination committee

While this blog post is primarily directed at the community elections process for the board - the Nomination Committee of the Drupal Association wants to affirm that diversity is a top priority during the Drupal Association Board of Directors nomination process for the appointed positions as well. We will work to identify the best way to provide more insight regarding how the committee evaluates candidates.

Jun 13 2019
Jun 13

This is actually quite a common question from our students. They start building their Drupal site. Then they go to work with their blocks or menus.

Then they accidentally disable the "Log in" menu link. There is no "Log in" link displayed on the site anymore. Neither for them nor for their visitors.

In this short tip, you will learn how to log in to your Drupal admin page in such a situation.

The image below shows the normal Drupal login link. When you first install Drupal it usually appears in the top right-hand corner of your site.

login link displayed

For example, you may accidentally disable it in your Drupal admin dashboard:

login menu link disabled

Then, the "Log in" link disappears.

login link gone

How can you now log in back to your site and get the "Log in" link back to its place?

So here's what you do. Use one of these URLs, replacing example.com with your own domain:

  • https://example.com/user/login
  • https://example.com/user

Visit one of those URLs and you should see a login screen like the one below:

login block

Once you are back to your admin panel, simply enable the menu link, and you are all set.

Jun 13 2019
Jun 13

To perform an HTTP request in Drupal 7 we can use the drupal_http_request() function. This is a flexible and powerful HTTP client implementation. It correctly handles GET, POST, PUT or any other HTTP request, and it handles redirects.

Parameters

$url: A string containing a fully qualified URI.

array $options: (optional) An array that can have one or more of the following elements:

  • headers: An array containing request headers to send as name/value pairs.
  • method: A string containing the request method. Defaults to 'GET'.
  • data: An array or object containing the values for the request body or a string containing the request body, formatted as 'param=value&param=value&...'; to generate this, use http_build_query(). Defaults to NULL.
  • max_redirects: An integer representing how many times a redirect may be followed. Defaults to 3.
  • timeout: A float representing the maximum number of seconds the function call may take. The default is 30 seconds. If a timeout occurs, the error code is set to the HTTP_REQUEST_TIMEOUT constant.
  • context: A context resource created with stream_context_create().

Return value

object: An object that can have one or more of the following components:

  • request: A string containing the request body that was sent.
  • code: An integer containing the response status code, or the error code if an error occurred.
  • protocol: The response protocol (e.g. HTTP/1.1 or HTTP/1.0).
  • status_message: The status message from the response, if a response was received.
  • redirect_code: If redirected, an integer containing the initial response status code.
  • redirect_url: If redirected, a string containing the URL of the redirect target.
  • error: If an error occurred, the error message. Otherwise not set.
  • headers: An array containing the response headers as name/value pairs. HTTP header names are case-insensitive (RFC 2616, section 4.2), so for easy access the array keys are returned in lower case.
  • data: A string containing the response body that was received.

Examples

<?php

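// Note: FULLY_QUALIFIED_URL and authorization() are assumed to be defined elsewhere in the module.
// create(): POSTs a new record to the remote service; returns the decoded response on HTTP 201, NULL otherwise.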
function create($data) {
    try {
        $response = drupal_http_request(FULLY_QUALIFIED_URL, [
            "headers" => [
                "Authorization" => "Basic " . authorization(),
                "Content-Type" => "application/json",
            ],
            "method" => "POST",
            "data" => json_encode(array(
                'id' => $data['id'],
                'values' => array(
                    'name' => $data['name'],
                    'email' => $data['email'],
                ),
            ))
        ]);
        if ($response->code == 201) {
            return json_decode($response->data);
        }
        else {
            watchdog('error', t($response->code . ' Error, while creating.'), (array) $response, WATCHDOG_ERROR);
            return NULL;
        }
    }
    catch (Exception $exception) {
        watchdog('exception', t('Exception - while creating.'), (array) $exception, WATCHDOG_ERROR);
    }
}

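// get_all(): GETs all records, optionally filtered via a query string built from $filters; returns the decoded response on HTTP 200, NULL otherwise.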
function get_all($filters) {
    try {
        $query = '';
        $build_query = http_build_query($filters);
        if(isset($build_query) && !empty($build_query)){
            $query = '?' . $build_query;
        }
        $response = drupal_http_request(FULLY_QUALIFIED_URL . $query, [
            "method" => "GET",
            "headers" => [
                "Authorization" => "Basic " . authorization(),
                "Content-Type" => "application/json",
            ],
        ]);
        if ($response->code == 200) {
            return json_decode($response->data);
        }
        else {
            watchdog('error', t($response->code . ' Error, while getting all.'), (array) $response, WATCHDOG_ERROR);
            return NULL;
        }
    }
    catch (Exception $exception) {
        watchdog('exception', t('Exception - while getting all.'), (array) $exception, WATCHDOG_ERROR);
    }
}

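// get(): GETs a single record by ID; returns the decoded response on HTTP 200, NULL otherwise.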
function get($id) {
    try {
        $response = drupal_http_request(FULLY_QUALIFIED_URL . '/' . $id, [
            "method" => "GET",
            "headers" => [
                "Authorization" => "Basic " . authorization(),
                "Content-Type" => "application/json",
            ],
        ]);
        if ($response->code == 200) {
            return json_decode($response->data);
        }
        else {
            watchdog('error', t($response->code . ' Error, while getting.'), (array) $response, WATCHDOG_ERROR);
            return NULL;
        }
    }
    catch (Exception $exception) {
        watchdog('exception', t('Exception - while getting.'), (array) $exception, WATCHDOG_ERROR);
    }
}

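// update(): PUTs new values for an existing record, falling back to the current remote values for fields not supplied; returns the decoded response on HTTP 200, NULL otherwise.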
function update($id, $data) {
    try {
        $response = get($id);
        $result = $response->values;
        $response = drupal_http_request(FULLY_QUALIFIED_URL . '/' . $id, [
            "headers" => [
                "Authorization" => "Basic " . authorization(),
                "Content-Type" => "application/json",
            ],
            "method" => "PUT",
            "data" => json_encode(array(
                'values' => array(
                    'name' => isset($data['name']) ? $data['name'] : $result->name,
                    'email' => isset($data['email']) ? $data['email'] : $result->email,
                ),
            ))
        ]);
        if ($response->code == 200) {
            return json_decode($response->data);
        }
        else {
            watchdog('error', t($response->code . ' Error, while updating.'), (array) $response, WATCHDOG_ERROR);
            return NULL;
        }
    }
    catch (Exception $exception) {
        watchdog('exception', t('Exception - while updating.'), (array) $exception, WATCHDOG_ERROR);
    }
}

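// delete(): DELETEs a record by ID; returns TRUE on HTTP 204, FALSE otherwise.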
function delete($id) {
    try {
        $response = drupal_http_request(FULLY_QUALIFIED_URL . '/' . $id, [
            "method" => "DELETE",
            "headers" => [
                "Authorization" => "Basic " . authorization(),
                "Content-Type" => "application/json",
            ],
        ]);
        if ($response->code == 204) {
            return TRUE;
        }
        else {
            watchdog('error', t($response->code . ' Error, while deleting'), (array) $response, WATCHDOG_ERROR);
            return FALSE;
        }
    }
    catch (Exception $exception) {
        watchdog('exception', t('Exception - while deleting'), (array) $exception, WATCHDOG_ERROR);
    }
}

By using the above examples, one can make POST, GET, PUT and DELETE requests via drupal_http_request().
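As a quick, hypothetical usage sketch (the ID, name, and email below are made up, and FULLY_QUALIFIED_URL plus authorization() are assumed to be defined as noted above), calling these helpers could look like this:

<?php

// Create a new record on the remote service (hypothetical values).
$created = create(array(
  'id' => 42,
  'name' => 'Jane Doe',
  'email' => 'jane@example.com',
));

if ($created) {
  // Read it back, change the name, then remove it again.
  $record = get(42);
  update(42, array('name' => 'Jane D.'));
  delete(42);
}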

Cheers :)

Jun 13 2019
Jun 13

Today marked the kick-off of Drupal North 2019, and Evolving Web is excited to be a part of it for the 4th year in a row. Day 1 was packed with trainings, summits (for the 1st time!), and networking opportunities. Here were the key takeaways we saw:

Drupal is for everyone

In the "What is Drupal?" and "Qu'est-ce que c'est Drupal?" trainings by Evolving Web's own Trevor Kjorlien and Adrian Cid Almaguer, everyone from developers, to project managers, to graphic designers and more, took part in a hands-on demonstration on how to build a site with Drupal.

Nobody wants a website

A website is just a tool for you to achieve your larger goals. Whether that be building a community, selling a product, getting donations, providing information, or anything else, your website has to be designed with your goals in mind. That being said:

Focus on what your audience wants, not what you want

Your website should always be making your audience's life easier and give them what they are looking for as quickly as possible. It's important to step out of your own shoes and into theirs in order to have a good understanding of what they want so you can cater to those needs.

Students really love chocolate

While sharing her experiences in getting students to participate in UX/UI studies, Joyce Peralta from McGill University explained that sometimes it's the small incentives that can be the most effective. Through many attempts, she found that students could be easily swayed by a simple table full of chocolate bars situated in a prime location in the library. Simple but effective!

Drupal North started off on a great foot and we're looking forward to the next two days of sessions. If you're attending, make sure to check out presentations from our team:

Jun 12 2019
Jun 12

In my post, Drupal is frustrating, I stated that enterprise websites need, want, and are willing to pay for better support options when using Open Source software. Organizations have reached out to me as a Webform module subject matter expert (SME) seeking to start a 1-to-1 support relationship. Occasionally, these relationships result in a sponsored feature request. Sometimes organizations want to ask me a simple question or at least know that I am available to answer questions. In the past, I shied away from the idea of setting up regular office hours because it would be an unpaid commitment of my time during business hours. Fortunately, with the existing funds collected by the Webform module's Open Collective, I feel that now is a good time to experiment and set up some initial office hours for the Webform module.

About office hours

The goal of office hours is to make it easier for me to help people and organizations with questions and issues related to the Webform module for Drupal 8 as well as to assist current and future Webform module contributors.

Sponsor office hours

Sponsor office hours are intended to help backers of the Webform module's Open Collective with any Webform related questions or challenges. These office hours will be strictly for monthly sponsors and backers of the Webform module's Open Collective.

Add-ons office hours

Add-ons office hours are for anyone in the Drupal community building Webform add-ons and extensions that are being contributed back to the open source community. The goal of these hours is to help support and improve the quality of the projects and community around the Webform module.

Office hour guidelines

I've been thinking about how to structure office hours for maximum efficiency, so I've decided to publish some guidelines. Consider these a starting point; they will likely evolve over time.

Guidelines

  • Office hours will be once a month for 1 hour on Fridays at 11 AM EST.

  • Maximum of 4 topics per hour.

  • Discussions per topic will be limited to 15 minutes.

  • Topics must be limited to only the Webform module for Drupal 8.

  • Office hours will be posted as an event to the Webform module's Open Collective.

  • Discussion requests need to be posted as a comment to the "[Office Hours]" ticket in the Webform Issue Queue.

  • All participants agree to follow Drupal's code of conduct.

  • Discussions may be recorded and posted to YouTube.

Sponsor office hours

Goals

  • Answer questions

  • Provide guidance

  • Reduce frustration

  • Ensure success

Open to

  • Sponsors who contribute more than $50 USD per month

  • Individuals or organizations who contribute under $50 USD per month may join two office hours per year.

  • Individuals or organizations who make a one-time $50 USD contribution can join one upcoming office hour.

Guidelines

Possible topics

  • Webform add-on review and recommendations

  • Creating custom handlers, elements, and exporters

  • Theming and styling questions

  • Project plan review

  • Reviewing project requirements

Add-ons office hours

Goals

Open to

Guidelines

Possible topics

  • Creating integrations with third-party services

  • Implementing webform elements

  • Adding webform support to contributed modules

  • Code review and recommendations

Office hours payment

Maintainers/contributors are going to be compensated $125 USD for each online office hour, with an additional 30-60 minutes of preparation time. Office hours can be extended for an additional hour as needed.

About the $125 USD hourly rate

The hourly rate of $125 USD is meant to be a starting place between what a U.S.-based agency charges and what a freelance consultant charges. This rate is based on the $100 USD per hour rate established in 2014 via the Drupal 8 Accelerate program. The additional $25 USD is to account for a 5-year cost of living increase.

Final thoughts about office hours

Support is a significant issue within Open Source and Drupal. Developers generally enjoy the experience of sharing code and ideas but supporting the code is not as rewarding. No one wants to provide free support, yet everyone wants to know an open source project is supported. The more maintainers can make themselves available to individuals and organizations, the stronger the community and code is around a project. Sponsor office hours provide tangible and more reliable support for the Webform module. If you need more support around the Webform module, consider backing the Webform module's Open Collective and sign-up for the next sponsor office hours.

Open source works best when the community around a project grows and more people get involved in contributing code and ideas. The main purpose around the Webform module's Open Collective is to strengthen and support the community around the Webform module and Drupal. Add-on office hours are intended to encourage mentorship and code contribution to the Webform module and Drupal community. If you have an idea that you want to contribute back to Drupal or need help getting involved in the Drupal community, sign-up for the next add-ons office hours.


Jun 12 2019
Jun 12

The World is Moving Towards Open Source Software

Open source software has been around for some time now. When it first came out, open source software was perceived as risky and immature. However, with the passage of time, more and more companies started developing and building upon open source. A couple of great open source examples that have been pioneering the industry for a while now are Drupal CMS and Linux OS.

What is Open Source Software?

So, what exactly is open source software? Well, open source describes the type of software that has no proprietary license attached to it. Instead, it's published with a license that guarantees the software will forever be free to download, distribute, and use. This also means that unlike proprietary software, the code can be inspected by anybody. On top of that, if somebody wants to customize the code to their needs by changing it, they are free to do it.

Proprietary software is often the exact opposite. The code of proprietary software cannot be copied and distributed freely, and modifications to the code are also prohibited. If issues arise, you cannot fix them by yourself; you have to rely on the software vendor to fix the problem for you.

Open source has its set of advantages as well as its disadvantages. 

Advantages of Open Source Software

So, you might wonder what the specific advantages of open source are as opposed to software with a proprietary license. Here are some advantages:

  • Flexibility: Open source software is known for having great flexibility. The great flexibility is granted by the fact that the code is open. Thus, people are able to customize it to their needs.

  • Speed: Competition in the digital era is fiercer than ever before. One of the defining factors dictating the success of a company over its competition is the speed of innovation. Luckily, the companies that are using open source software know that open source facilitates speed. By not having to deal with the bureaucracy that comes with proprietary software, everything can be set up to work in a fast and reliable way.

  • Cost efficiency: Another trump card in the arsenal of open source software is the cost efficiency provided. Open source can be used by anyone free of charge because it is released under licenses such as the GNU General Public License, which basically ensures that if somebody distributes the software, they also have to make the code available for other people to be able to use it. Successful open source communities leverage the power of the community by providing good infrastructure for the community to share and review software extensions and improvements.

  • Security: Proprietary software has had a reputation of being more secure than the open source counterpart. Part of this was due to the popular belief that if the source code is hidden from the public, then hackers will have a harder time cracking it. However, this is far from the truth. The code for open source software is available for everybody to see, which, in turn, could make it more vulnerable. However, because of the fact that everyone has access to it, it is easier to peer review the code. In this way, people will be able to spot vulnerabilities way easier than with proprietary code, making it easier for developers to fix said vulnerabilities.

Disadvantages of Open Source Software

Now that we’ve talked about the advantages of open source, we should also discuss its shortcomings.

  • Not user-friendly: A common problem with open source projects is a lack of focus on design and user-friendliness. People might have a harder time being able to adapt to the interface of an open source software compared to competing proprietary platforms. Of course, this is not true for all open source projects, but it is common to see that well-funded companies are better able to attract and afford the best designers.

  • Hidden costs: Although open source software is hailed to be free to use, it actually is not. When adopting new software for a business, a decision maker also has to take into account different factors. For example, it is easy to overlook the cost of setting up and customizing the software for the company, paying for the training of the employees or hiring skilled personnel that is able to actually operate the software. Even if the adoption is not for business use, a time investment still has to be made in order to properly be able to use the software to its full potential.

  • Lackluster support: When it comes to proprietary software, there are often dedicated departments that are ready to help a struggling user with their issues. In contrast, most open source software does not enjoy the same level of support. That said, open source tends to gather dedicated communities around it that can be helpful in solving some issues. Keep in mind, however, that these people are not paid for their service and might not be able to solve all the issues that arise.

  • Orphan software: Proprietary software can enjoy a longer lifespan than its open source counterparts. One of the risks of using OSS is that the community, the developers, or both lose interest in the project or move on to another project. What this means is that the software will stop being developed and supported. The users of the software will be left high and dry and will have to migrate to another platform. Of course, there are also plenty of commercial software projects that go out of business, but strong commercial backing does increase confidence in the continuity of the software. Some open source projects have loosely associated commercial backing, like Red Hat backing Linux and Acquia backing Drupal.

Tech Giants buy Open Source Software Companies


Lately, more and more tech giants are willing to start having some presence on the open source market. A couple of these examples are IBM, AT&T and Microsoft.

IBM acquires Red Hat

On 28 October 2018, IBM acquired Red Hat for $34 billion, a gargantuan amount of money. The aim of this acquisition is for IBM to shape the cloud and open source market for the years to come. IBM is betting a lot of money on this acquisition, in order to secure a lead on the market. However, there are some skeptics of this acquisition. They claim that IBM is going to ruin the Red Hat culture, as it was proven by their track record until now, kind of like some sort of corporate colonization. Only time will tell how this acquisition is going to shape the future of open source software. Nevertheless, the willingness of IBM to dish out so much money proves that open source software is seriously a path of the future.

AT&T acquires AlienVault

AlienVault is the developer of an open source security platform for detecting and managing cyber attacks, and it runs the Open Threat Exchange, the world's largest crowd-sourced computer security platform. AT&T acquired the company on August 22, 2018, and it has since been renamed from AlienVault to AT&T Cybersecurity. With AT&T's reach and resources, the former AlienVault is sure to have a bigger impact on the cyber safety of the world. The acquisition did spark controversy, with some AlienVault supporters claiming it was the end for the brand. In a literal sense that is true, since the company was renamed. Time will tell whether there will be more radical changes to its business model under AT&T's ownership.

Acquia acquires Mautic

With the acquisition of the open source marketing automation tool Mautic on 8 May 2019, Acquia is strengthening its presence on the open source software scene. Together with Mautic, Acquia aims to deliver the only open source alternative to the proprietary marketing clouds, expanding on its vision of delivering the industry's first Open Digital Experience Platform. On top of that, unlike the other two companies, Acquia has a strong open source culture, which makes the acquisition of Mautic a well-thought-out business decision.

Apps, Plug-ins, and Services: When Open Source  Mingles With Closed Source Software

Android, Google, and Huawei

Android is an open source operating system for mobile phones, formally known as the AOSP (Android Open Source Project). The project is developed by Google. The OS is based on a modified version of the Linux kernel and is designed primarily for touchscreen mobile devices. It is licensed under Apache 2.0, which lets users modify it and distribute those modifications if they choose to. Even so, in the recent case of the U.S. ban on Huawei, Google announced that the new trade embargo forced it to revoke Huawei's Android license. Since Android is open source, the OS itself remains free to use. However, practically all Android devices outside of China ship with Google services and apps pre-installed, and these Google apps play an important role on any Android device. Google can do this because apps like Google Maps, YouTube, Gmail, and the Play Store are not open source, and companies need a license agreement to ship them on their devices. The Google Play Store is also a paid service; it provides security checks and code validation for app updates, which forms a very important security layer on the Android platform.

To add insult to injury, losing the partnership with Google means Huawei will not get timely security updates to the AOSP Android platform. When Google fixes vulnerabilities, it first sends the fix to partners, and only after partners have had time to publish the update to their devices does the patch become public. Huawei's devices will therefore have increased exposure to hackers and malware in the window before each security patch is published and pushed out.

Sooperthemes: Providing and Supporting Paid Drupal Extensions

Here at Sooperthemes, we are passionate about the Drupal project. We want to see Drupal thrive and become better than its competitors, and to do that we had to find out where Drupal could be improved. As it turns out, there was a strong need for Drupal to be easier to navigate and to use for site-building by people in marketing or communication departments who do not have deep technical knowledge. That's why Sooperthemes developed Glazed Builder, a powerful visual page builder that anyone can use without needing to write, or even see, any code. With Glazed Builder, Sooperthemes wants to make the power of Drupal accessible to a wider audience and to make it easy for them to build, maintain, and grow a Drupal-based website.

Although other open source platforms like Android, WordPress, and even Linux have had a thriving ecosystem of paid applications and plugins for many years, the same cannot be said for Drupal. Fortunately, with our 13+ years of experience in the Drupal community, we were able to create a combination of product and service that thrives in the Drupal ecosystem.

Conclusion

As the latest trends show, open source is here to stay and is becoming a staple of software for the near future. This prediction is based not only on the benefits that open source software brings, but also on the amount of interest that major tech companies are showing in it. The most successful recipe seems to be a mix of an open source platform and paid-for applications. The paid-for applications are especially handy for components that require more involvement from marketing and UX design experts, who are not typical contributors in open source software communities.

Jun 12 2019
Jun 12

There are many beautiful words you can use to tell your customers that your website is trustworthy, reliable, and transparent. But one small widget can say it better than a thousand words.

So let us introduce the UpTime Widget Drupal module. See how it could help you always stay aware of your website uptime, build customer trust, and stand out from competitors.

Module maintained by our developers

Before we move on, we are especially happy to mention that the UpTime Widget Drupal module is maintained by our guys.

Knyshuk.vova is the owner of the module. Its creator, Lolandese, transferred ownership to him in accordance with the Open Ownership Pledge. Vladimirrem and ApacheEx are maintainers of the module who also make important commits.

These are Drupal developers from Drudesk and Drudesk's parent company InternetDevels, which is also listed as a supporting organization on the module's page.

What UpTime Widget Drupal module does

The UpTime Widget module connects your website to the popular free uptime monitoring service — UpTimeRobot.com.

It shows your website uptime (the percentage of time that your website is available to visitors online). Ideally, it should be 100%, although this figure may be a little bit lower in reality.

Your website uptime figure appears in the form of a handy widget to be placed anywhere on your website as a Drupal block. It can also optionally show a configurable copyright notice.

Uptime widget for a Drupal website

The UpTimeRobot service is able to monitor your website uptime every 5 minutes or at an interval you choose. You can get notifications about it by:

  • email
  • SMS
  • Twitter
  • RSS
  • push notifications for iPhone or iPad.

How the UpTime Widget module works in more detail

Getting your keys on the UptimeRobot service

First, we need to register our website with the UptimeRobot.com service and get an API key and a monitor ID. It takes a few easy steps:

  • sign up, activate your account, and log in at UptimeRobot.com
  • add a new monitor of the HTTP(s) type, give our website a name, and submit its URL

Register website at UpTime Robot service

The UptimeRobot service has plenty of interesting things like informative dashboards or detailed notification settings. We can come back here any time, but now let’s grab the API key and monitor ID and move on to our Drupal 8 website.
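
Before moving on, you can sanity-check the key and ID by querying the UptimeRobot API directly from the command line. This is just a quick test, a minimal sketch assuming the v2 getMonitors endpoint and placeholder values for the key and monitor ID:

curl -s -X POST https://api.uptimerobot.com/v2/getMonitors \
  -d "api_key=YOUR_API_KEY" \
  -d "monitors=YOUR_MONITOR_ID" \
  -d "custom_uptime_ratios=30" \
  -d "format=json"
# The JSON response should list your monitor together with its uptime ratio.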

Installing and configuring the UpTime Widget module

The UpTime Widget module can be installed on the Drupal 8 website in any way you prefer. Although it is using a third-party service, installation with Composer is not obligatory.
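
For reference, the usual Composer-and-Drush route looks roughly like this (assuming the project's machine name is uptime_widget and Drush is available in your project):

composer require drupal/uptime_widget
drush en uptime_widget -y
drush cr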

When the module is installed and enabled, its settings appear at admin/config/system/uptime_widget. Let’s run through some of them.

  • There are two key required fields where we need to enter the previously received API key and monitor ID.
  • The “decimal separator” and “scale” fields have nice defaults, but we can play with the ways our website uptime digits are displayed.
  • The monitoring interval and the refresh interval fields also have sensible defaults. But we can choose how often the website uptime should be checked and how often Drupal should receive this information.

Configuring UpTime Widget Drupal module

Configuring the copyright notice

By default, the uptime widget comes with a copyright notice, which can optionally be disabled. Hiding or showing the copyright is also available in the block configuration, described below in the part about placing and configuring the Uptime block.

The module’s settings page at admin/config/system/uptime_widget lets us configure how the copyright will look. It offers:

  • several options for the copyright notice
  • the option to specify the year that our domain was first online
  • the option to write a custom “Prepend text” instead of “All rights reserved.”

Configuring copyright notice of Drupal uptime widget

Placing and configuring the Uptime block on the website

It's now time to place the uptime widget block on our Drupal website. In Structure > Block layout, we choose the theme region (for example, Footer first), click on it, find the Uptime block in the list of blocks, place the block, and save the layout.

Placing UpTime Widget as block on Drupal website

We can configure the block to our liking — either on the Block Layout page or by clicking the “quick edit” pencil near the block on the website.

We can leave or hide its title by checking or unchecking “Display title,” configure visibility for specific roles, specific pages or content types, and so on.

Configuring UpTime Widget as Drupal block

We can also choose to show both the uptime and copyright widgets, or only one of them.

UpTime and copyright widgets Drupal

More features to come in the future

Our guys have many plans for improving the module in version 8.2. Here are some of them:

  • Uptime check notifications should be configurable directly from the Drupal website, which is for now only possible on the UpTimeRobot service.
  • The Uptime information should be included into the “Reports” page on the Drupal dashboard.
  • Public Status Pages, or detailed boards about uptime information, should be integrated into Drupal.

UpTime Robot service dashboard

Get yourself a website uptime widget

Show your visitors they can rely on you all the time! And you can always rely on our Drupal support team if you need any help in:

  • installing and configuring the UpTime Widget Drupal module
  • customizing its look on your website
  • creating a custom Drupal module in accordance with your requirements

Stay reliable and build your customer trust!

Jun 12 2019
Jun 12

The tour and travel business has started to catch up in the digital realm; in fact, it's growing faster than the total travel market. The overall tours and activities segment is predicted to grow to $183 billion by 2020.

A clear opportunity for businesses in the travel industry.

However authentic your services are, failing to choose the right technology to back your digital transformation can make the entire game plan chaotic and tedious, not to mention cost you revenue. Drupal is a leading choice in the travel industry (we have some case studies lined up below).

Here’s why it can be the right option for yours too.

What Difference Does it Make?

Travel marketers already know it, and as a business owner you need to understand it too: to deliver value to customers, you need to invest in the user experience beyond the first click. It's all about 'what they want'.

Here's what the right CMS technology needs to focus on:

Need for speed: More than half of users will leave a site if it takes more than 3 seconds to load. Your CMS needs to give you the ability to track site speed performance. Further, for mobile sites, the technology needs to be AMP-friendly.

Going ga-ga over mobile: Today, 48% of mobile users in the U.S. are comfortable researching, planning, and booking an entire trip to a new travel destination using only their smartphone. The CMS should help make the mobile site an assistive and delightful experience, which is exactly what customers are looking for.

Travel blogging is popular: Consumption of digital travel content sees double-digit growth year over year. On top of that, when planning trips, 49% of people look at travel content sites.

Videos, for their part, provide an authentic outlook on the experience, and travel vlogging is gaining traction. The CMS should be scalable and help you leverage the power of content.

Layout matters: Travellers' basic needs have remained the same - they want to experience things. A clear, easy-to-navigate layout will help users get the most out of your site. Your CMS should help you avoid irrelevant navigation links and non-transactional elements that distract the user.

They ‘Book’ right now: Any action you want a user to take must be obvious. Using contrasting colours and fonts for high visibility should come easy with your themes. Also, most hotel bookings turn out to be last minute rather than pre-planned, which shows how critical your CTA is.


Drupal for your Travel and Hospitality Website

Anything travellers expect to accomplish online should be just as easy to achieve. Here’s what Drupal can do.

1. Content Friendly and Scalable

‘Content’ and then ‘more content’ has been the recent trend across the business space. As consumer expectations grow for more personalized and relevant experiences, travel businesses too are producing more and more content.

For travel visitors, a website that offers a lot of engaging content without many blockers always wins hearts.

Nearly 9 out of 10 customers tend to walk over to a competitor’s website when your website goes down.


Drupal scales to support the most content-rich sites and experiences. It lets you create every kind of content you want, helps you handle it, and ensures your site always runs in turbo mode. As holiday planners throng the internet with their bookings during the holiday season, you must be equipped to avoid any unfortunate downtime.

Besides, it is highly flexible, capable of accommodating and organizing your information into structured blocks for better visibility. Its WYSIWYG editor makes adding content easy.

2. Multilingual capabilities

While building content is a good way to start, it needs to reach out to the right audience too. Acting on consumer intent and preference is one of the keys to unlocking growth.

Did you know that your most likely customers can be found in non-English-speaking places like China, Germany, and Hong Kong, which rank among the highest in international departures? Language, however, should never be the reason someone walks away from your website.

The Drupal 8 Multilingual Initiative rebuilt language support so you can target an international audience. Providing content translation into 94 additional languages, the four core modules (Language, Interface Translation, Content Translation, Configuration Translation) make running a multilingual travel business easy and efficient.
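
As a rough illustration, those four core modules can be switched on with Drush in one go (locale is the machine name of the Interface Translation module):

# Enable the Drupal 8 core multilingual modules
drush en language locale content_translation config_translation -y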

3. SEO friendly features

Once you've got your content up on your site, it is important to ensure people can find you through search. Search is, after all, the number 1 channel high-value travellers turn to.

When travellers search with the intent to book, they browse through one to five sites.


You can rock your SEO with Drupal. With more than 20 SEO modules and core features, it can optimize your site without breaking any best practices. Modules like Path Alias (in core), Metatag, Alternate Hreflang, SEO Checklist, XML Sitemap, Google Analytics, and Linkit are easy to get started with.
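
Most of those are contributed modules you can pull in with Composer. The machine names below are the usual drupal.org project names, so double-check them against the project pages before running this:

composer require drupal/metatag drupal/pathauto drupal/xmlsitemap \
  drupal/google_analytics drupal/seo_checklist drupal/hreflang drupal/linkit
drush en metatag pathauto xmlsitemap google_analytics seo_checklist hreflang linkit -y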

4. Layout and Navigation

Labyrinths and twisted lanes appeal to tourists and travellers, but the same on your website will not. Navigation must always be easy and intuitive. The whole point of successful UX is to let the visitor easily get the information they want without driving them mad.

Taxonomy in Drupal can help you define tags, tabs and call-to-action buttons helping you highlight the key terms.

Layout Builder gives you page-building capability without restrictions, and it is unique in offering a single, powerful visual design tool. It supports layouts for templated content, customizations of those layouts per item, and the creation of custom landing pages that are not tied to a content type.


Having the ability to customize blocks without the need to code, you can highlight your services, recommendations and offers with different campaigns.

Search functionality is an easy addition to navigation. For travel sites, it is important that visitors can look up exactly what they need.

In addition to providing you with incredible SEO, Drupal has search API integrations like Federated Solr and Solr Search. Not only do they give users an easy search experience, they also help index your content.

5. Mobile friendly

People who have negative experiences on mobile are 62% less likely to purchase from that brand in the future than if they have a positive experience. It’s critical to remember that mobile is one big part of the user experience.


Your mobile version should be responsive without cutting out important information, easy to navigate and demonstrate how you deliver benefit to the visitor.

Drupal 8 was made with mobile-first usability, so accessibility is part of its very essence. Drupal supports responsive design best practices and ensures your users get a seamless content experience every time, on every device.


6. Secure payment gateway 

Nothing is more troubling to a visitor than the thought of being conned while making a payment. That can be an absolute deal breaker, so security is very important for your reputation.

Drupal Commerce supports a core payment API for a smooth payment collection procedure through the checkout form. It offers feature-rich payment handling, with integrations for PayPal, Braintree, Amazon Pay, and 106 other gateways.
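
As a rough idea (the exact gateway module names vary, so verify them on drupal.org before use), a Commerce setup with a PayPal gateway could start like this:

composer require drupal/commerce drupal/commerce_paypal
drush en commerce commerce_payment commerce_checkout commerce_paypal -y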


Improving Digital Experience and Commerce Platform for TUI India


TUI Group is a multinational travel and tourism company, headquartered in Germany, with presence in 180 countries. TUI India is part of the TUI Group.

We built a new architecture for the website to strengthen multiple functionalities, like integrating the site with Salesforce, PayU, Zomato, and other APIs.

It helped improve user experience with additional features. 

Read the complete case study here.


7. Improve your Marketing

For businesses, it is important to track and analyze user behaviour and provide users with personalized services. Third-party tools such as CRMs, Google Analytics, and feedback tools (like Hotjar) help you get the 'bigger picture' of how to improve user experience and increase conversion rates.


Drupal offers easy integration with enterprise-level CRM modules, ERP systems, social networks, real-time analytics, payment gateways to enhance the versatility of your website.

This ensures that your website visitors can find a solution at each step of their travel planning.

Enabling Cleartrip’s Marketing Teams to Quickly Serve New, Relevant Offers

Cleartrip.com is a travel portal that offers travel bookings for flights, hotels, trains, and buses. The company operates primarily in India and the Middle East.

Srijan developed a new marketing portal for them which simplified their process of presenting deals to their customers. We helped them:

  • Build ready-to-use templates that helped the team deploy offers and packages quickly

  • Replicate deals across different country domains easily

Read the complete case study here.


8. Easy website setup with travel utilities

For quick assistance in developing a travel website, Drupal has BAT.

The Booking and Availability Management Tool (BAT) comes in handy when you want to provide your customers with booking and reservation facilities while helping you manage availability at the accommodation concerned.

An incredible result of the collaboration between BAT and Drupal is Roomify for Accommodation,  which is, in essence, an all-in-one solution for vacation rentals, hotels and agencies with multiple properties.

Besides this, you can use BEE. Yes, the Drupal module BEE (Bookable Entities Everywhere) grants booking and availability features to any node type, and it is built on top of BAT.
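
Both are regular contributed projects, so trying them out is straightforward (assuming the machine names bat and bee; check the project pages for any required submodules):

composer require drupal/bat drupal/bee
drush en bat bee -y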

Bon Voyage!

Capable of powering a website that lets people plan and book an entire vacation, Drupal is a powerful, secure, and highly reliable platform. It not only helps you retain customers but also win over new travellers.

With its stability and versatility, your website can support a large volume of content, personalize what you offer, and smooth out workflows, reducing extra work.

Contact our experts for a bon voyage!

Jun 12 2019
Jun 12

Decoupled Days 2019 is almost here and Amazee Labs has been busy gathering the most valuable practices we’ve discovered since last year to share with the community.

Amazee Labs is proud to sponsor Decoupled Days, a conference for architects, developers, and business people involved in implementing decoupled CMS architectures. Since its successful debut in 2017, the conference's mission of sharing experiences with back-end CMS development as a content service and with front-end development of non-CMS applications (especially JavaScript ones) that consume CMS content has been a success for everyone who's chosen to be involved.

See what we’re sharing at this year’s conference:

Wednesday, 17 July

Jamie Hollern will be presenting a session called Storybook and Drupal: Seamless frontend integration for decoupled sites. This session will explain how to use Twig with Storybook and Drupal to bring all the advantages of UI component libraries into a decoupled Drupal project, and how to build a component library for decoupled Drupal sites. 

13:45 in Room 2
 

Learn how to Decouple your teams using GraphQL with Bryan Gruneberg. In this lightning talk, he will present strategies to decouple your team by putting GraphQL at the centre.

16:30 in Room 3


Pascal Kappeler and Stephanie Lüpold take on Shaping the future of decoupled Drupal: An unusual case study. In this session, they'll show how an SME can drive technical innovation at scale when a corporate partnership makes it possible. If you are interested in a first-hand account of how two very different companies work together to push technical boundaries, this is the right session for you.

15:45 in Room 2
 

Join Fran Garcia-Linares, who will delve into the question GraphQL V4: what's next? In this session, Fran will outline all the exciting new features that are (or will be) included in the 4th version of the GraphQL module for Drupal and the new possibilities they open up -- like support for SDL-based schemas!

16:00 in Room 2
 

Thursday, 18 July

John Albin Wilkins will discuss Gatsby and GraphQL Schema Stitching with Drupal, the benefits of a universal GraphQL graph over traditional web services like REST and JSON:API, configuring Gatsby’s gatsby-source-graphql plugin and more. A demo website, including full Git repository, will be provided for attendees to try out.

11:15 am in Room 1 

Decoupled Drupal is the future, but learning an entirely new stack can be daunting. Stew West presents GraphQL and Twig, a beginner's guide and demo on how to use GraphQL, demonstrating the advantages of changing Drupal's push model to a pull model by letting the template define its data requirements.

16:15 in Room 1

You should also check out these sessions from amazee.io and Michael Schmid:

In his session, Caching decoupled websites is hard, but freaking fast if done right, Michael will share his best practices around caching decoupled websites: things that worked, things that didn't, and how we run websites today.

18th Jul at 09:00 am in Room 1

Michael Schmid presents 3 Years Decoupled Website Building and Hosting - Our Learnings. In this session, he shares what we’ve learned as a business so far and what we envision for the future. Attendees will walk away with insights into how decoupled projects can affect everything from hiring and client relationships to profits, processes, maintenance and more. 

18 July at 11:15 am in Room 2

Catch up on a year’s worth of Amazee Labs’ best practices in just two days at Decoupled Days 2019.

Jun 12 2019
Jun 12

It’s exciting to see how once unimaginable things become popular digital practices! A vivid example is artificial intelligence. We have shared with you an article about artificial intelligence coming to your apps thanks to cognitive services. What about Drupal websites — are they ready for AI? The answer is a definite yes! Let’s see how artificial intelligence and Drupal 8 come together.

Benefits of artificial intelligence for Drupal 8 websites

Smart machine thinking is meant to create a new level of user experiences. By leveraging the huge potential of artificial intelligence on your Drupal website, you will be able to:

  • offer smarter, more engaging, and more advanced features to your users
  • better analyse your users’ behavior
  • create more personalized experiences for them
  • raise e-commerce conversions through more targeted offers
  • streamline and automate many workflows and take the load off your staff
  • raise your company’s image as customers see you using advanced technologies

and much more.

Useful Drupal modules for artificial intelligence & their capabilities

The Drupal community is highly interested in innovative trends. AI has been a hot topic at the biggest Drupal events, including DrupalCon Baltimore and Drupal Europe in Darmstadt. And, of course, useful Drupal modules have been created that bring artificial intelligence to websites. Let's review a few of them.

Azure Cognitive Services API 

We have mentioned Microsoft Azure Cognitive Services at the beginning, and here is a module that connects them to Drupal websites. 

The Azure Cognitive Services API Drupal module allows developers to enrich Drupal websites with features like speech, vision, and facial recognition, emotion detection, and much more. There is a pack of 4 submodules.

  • Azure Face API

The Azure Face API module integrates with the Microsoft Face API service. It is meant to detect and recognize human faces. The tool is able to compare similar faces and group their pictures together, as well as identify people who were previously tagged. Face API can detect age, gender, hair color, and more.


The module can currently detect up to 64 human faces in an image with a high level of precision. The detection result is returned in a file.


  • Azure Emotion API

The Azure Emotion API module brings Microsoft's emotion recognition to your website. It recognizes the emotions of one or more people in a picture (happiness, sadness, surprise, anger, neutral, and so on). The tool puts a bounding box around each person's face and returns the result in a JSON file.



  • Azure Computer Vision API

The Azure Computer Vision API module analyzes the content of images and returns information about them. It uses tags and descriptions, identifies image types and color schemes, and can be a helpful assistant in content moderation. For example, you can configure it to apply restrictions on adult or abusive content.


  • Azure Text Analytics API

The Azure Text Analytics API module is meant for natural language processing. Its key features, with a sample request sketch after the list, are:

  • Sentiment analysis. It detects negative or positive sentiments and gives them a score.
  • Key phrase extraction. It extracts the key points in the text. 
  • Language detection. It detects the text language. (The module supports almost 20 languages for now).
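
To give a rough idea of what the underlying service does, the module wraps Microsoft's Text Analytics REST API. A direct sentiment request, assuming a v2.1 endpoint and your own Azure region and subscription key, looks something like this:

curl -s -X POST \
  "https://YOUR_REGION.api.cognitive.microsoft.com/text/analytics/v2.1/sentiment" \
  -H "Ocp-Apim-Subscription-Key: YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{"documents":[{"id":"1","language":"en","text":"The new booking flow is great!"}]}'
# Returns a sentiment score between 0 (negative) and 1 (positive) per document.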



Automatic Alternative Text

Adding ALT text to images is standard practice on today's web; it is vital for web accessibility and SEO. The Automatic Alternative Text module provides ALT text automatically by using Microsoft Cognitive Services, namely the Computer Vision API.

It detects the content of images and describes it in human-readable text while specifying the level of confidence. The tool can generate more than one ALT tag, create thumbnails, and much more. 


Cloudwords for Multilingual Drupal

It’s a fact that Drupal 8 is a great choice for multilingual websites, but everything is even greater with this module. The Cloudwords for Multilingual Drupal module uses AI to help you create high-quality multilingual campaigns. 

The module lets you manage your content localization quickly and efficiently. It also has strong workflow automation and project management capabilities. 

Drupal Chatbot

The era of chatbots is already here! The Drupal Chatbot module allows you to create a basic voice- or text-based chatbot. The module uses Dialogflow as its natural language processing agent, but you can also extend it with Alexa.

The Drupal Chatbot can be enabled as a block and extended with various functionalities. For example, it supports Latest Pages, Top Rated Pages, Latest Article Search, and so on. The module is actively being developed.



Chatbot API

The Chatbot API module facilitates the integration of various chatbots and personal assistants with your Drupal website. It creates a layer to serve Drupal data to these services.

For now, the module supports Alexa and Dialogflow, for which it uses submodules, and works together with the corresponding modules (Alexa and Dialogflow Webhook). However, it’s just the beginning, and more integrations are expected.

Acquia Lift Connector

The Acquia Lift Connector module integrates your website with the Acquia Lift personalization tool. It enables you to create highly personalized offers based on users' behavior. The module offers real-time audience segmentation, behavioral targeting, A/B testing, and more.

It has a unified drag-and-drop user interface with all the customer information where you can create personalizations. Acquia Lift uses machine learning to automatically recommend content to users.



Embrace artificial intelligence on your Drupal 8 website

Just imagine, all this and much more could come to your website today! Our Drupal team is ready to help you with building AI integrations based on these and other modules. And if there is no module yet for your idea, we will create a custom one. Enjoy artificial intelligence and Drupal 8!

Jun 12 2019
Jun 12

Telegram is an easy to use free chat application that is rapidly winning fans all over the world. 

There is a Telegram plugin for WordPress but there is not yet a Telegram module for Drupal.

In this tutorial, you will learn how to integrate the Telegram app with your Drupal 8 site using JavaScript from Re:plain.

Step #1. Create a Telegram Account

If you don’t have a Telegram account yet, you’ll have to create one. The process is pretty straightforward. Download the Telegram app to your smartphone and activate an account with your mobile phone number.


  • Allow Telegram to receive and make phone calls and send SMS messages.
  • Enter your phone number and the code provided by Telegram.


  • Allow Telegram to access pictures and contacts and you’re good to go.

Step #2. Get the JavaScript Code

  • Open your web browser and type web.telegram.org in the address bar.
  • Choose your country, type your phone number.
  • Click Next.


  • Click OPEN IN WEB.

  • Click Start.
  • Follow the instructions.


  1. Click Menu.
  2. Click Connect the site ("Connectar el sitio" on my screenshot below).


  • Create a name for your chat room, for example, “Customer Support”.
  • Enter a description and a welcome message for your “Customers”.
  • Choose the default widget language for the site (e.g. English).
  • Your chat room is created.
  • Copy the JavaScript for later use.


Step #3. Add the Javascript to Your Site

The code has to be inserted into the page right before the closing </body> tag. In Drupal 8, that means adding the JS code to the html.html.twig template.

The theme I’m using is the default Bartik. For demonstration purposes, I’m going to use the default core template. However, this is not a best practice.

The right way of doing this is creating a Bartik subtheme, copying the template inside the new theme and modifying it there.

You can read more about creating a subtheme here.
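
If you do go the subtheme route, the template copy itself is just a couple of commands. This is a minimal sketch assuming a custom subtheme named mytheme that has already been created and enabled:

mkdir -p themes/custom/mytheme/templates/layout
cp core/themes/classy/templates/layout/html.html.twig themes/custom/mytheme/templates/layout/
# Paste the Re:plain script right before </body> in the copied file, then rebuild caches:
drush cr

For this demonstration, though, let's continue with the core template: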

  • Locate the core/themes/classy/templates/layout/html.html.twig file.
  • As you can see, Classy is the base theme for Bartik.
  • Open the file in your text editor and paste the script right before the closing </body> tag.


  • Save the file.
  • Clear the site cache. You’ll see the Telegram icon at the bottom right corner of your screen.

Good job! We haven't installed any Drupal module, which is why you're seeing the Telegram icon even on your administrative pages: they make use of the html.html.twig template as well.

  • Open another browser and test the chat as an anonymous user.
  • The system will prompt you to enter your contact data - this is a Telegram answering template. You can configure your templates in the Telegram web application or on your phone.
  • I can read and answer the message on my cell phone.


Step #4. Change the Logo of the Chat Window

  • In your web/mobile Telegram application click Menu > Customer Support (or whatever you named your channel). You have a bunch of options here. They’re pretty self-explanatory.
  • Tap/Click Edit logo.
  • Click the Camera icon in order to upload a picture.
  • Refresh your Drupal site.
  • There’s the logo.


Feel free to explore the different configuration options available.

Telegram has extensive documentation about how to customize and enhance the functionality of your chats with the help of bots that perform different tasks.

As you already noticed, this method is useful for any type of site, not just Drupal sites.

If you want to learn more Drupal, join OSTraining now. You'll get access to a vast library of Drupal training videos, plus the best-selling "Drupal 8 Explained" book!


About the author

Jorge has lived in Ecuador and Germany. Now he is back in his homeland, Colombia. He spends his time translating from English and German to Spanish. He enjoys playing with Drupal and other Open Source Content Management Systems and technologies.
Jun 12 2019
Jun 12

Some years ago, a frontend developer colleague mentioned that we should introduce Sass, as it requires almost no preparation to start using, and we could then adopt more and more of it as we progress. He proved to be right. A couple of months ago, our CTO, Amitai, made a similar move: he suggested using ddev as part of rebuilding our starter kit for Drupal 8 projects. I had the same feeling, even though I did not know all the details of the tool, but introducing it felt right, and it quickly became evident that it would be beneficial.

Here’s the story of our affair with it.

For You

After the installation, a friendly command-line wizard (ddev config) asks you a few questions:

The configuration wizard holds your hand

It gives you an almost perfect configuration, and in the .ddev directory you can review the YAML files. In .ddev/config.yaml, pay attention to router_http_port and router_https_port: these ports should be free, but the default port numbers are almost certainly already occupied by a local Nginx or Apache on your development system.

After the configuration, ddev start creates the Docker containers you need, nicely pre-configured according to your selections. Even if your site was installed previously, you'll be faced with the installation process when you visit the URL, as the database inside the container is empty, so you can install there (again) by hand.
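
If you already have a database dump from another environment, you can skip the manual reinstall and import it instead, for example (the paths are placeholders):

ddev start
ddev import-db --src=dumps/site.sql.gz
# Optionally bring over the public files as well
ddev import-files --src=dumps/files.tar.gz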

You have a site inside ddev, congratulations!

For All of Your Coworkers

So now ddev serves the full stack under your site, but is it ready for teamwork? Not yet.

You probably have your own automation that bootstraps the local development environment (site installation, specific configurations, theme compilation, just to name a few), now it’s time to integrate that into ddev.

The config.yaml provides various directives to hook into the key processes.

A basic Drupal 8 example in our case looks like this:

hooks:
  pre-start:
    - exec-host: "composer install"
  post-start:
    # Install Drupal after start
    - exec: "drush site-install custom_profile -y --db-url=mysql://db:[email protected]/db --account-pass=admin --existing-config"
    - exec: "composer global require drupal/coder:^8.3.1"
    - exec: "composer global require dealerdirect/phpcodesniffer-composer-installer"
  post-import-db:
    # Sanitize email addresses
    - exec: "drush sqlq \"UPDATE users_field_data SET mail = concat(mail, '.test') WHERE uid > 0\""
    # Enable the environment indicator module
    - exec: "drush en -y environment_indicator"
    # Clear the cache, revert the config
    - exec: "drush cr"
    - exec: "drush cim -y"
    - exec: "drush entup -y"
    - exec: "drush cr"
    # Index content
    - exec: "drush search-api:clear"
    - exec: "drush search-api:index"

After the container is up and running, you might like to automate the installation. In some projects, that’s just the dependencies and the site installation, but sometimes you need additional steps, like theme compilation.

In a development team, you will probably have a dev, stage and a live environment that you would like to routinely sync to local to debug and more. In this case, there are integrations with hosting providers, so all you need to do is a ddev pull and a short configuration in .ddev/import.yaml:

provider: pantheon
site: client-project
environment: test

After the files and database are in sync, everything in post-import-db will be applied, so we can drop the existing scripts we had for this purpose.

We still prefer to have a shell script wrapper in front of ddev, so we have even more freedom to tweak things and keep everything automated. Most notably, ./install does a regular ddev start, which results in a fresh installation, while ./install -p saves the time of a full install when you would rather get a copy of a Pantheon environment.
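
The wrapper does not need to be fancy. Here is a trimmed-down sketch of what such an ./install script can do; the -p flag and the exact steps are specific to our setup, so treat it as an illustration only:

#!/usr/bin/env bash
set -e

PULL_FROM_PANTHEON=0
while getopts "p" opt; do
  case $opt in
    p) PULL_FROM_PANTHEON=1 ;;
    *) exit 1 ;;
  esac
done

# Bring up the containers; the post-start hooks handle the fresh install.
ddev start

if [[ "$PULL_FROM_PANTHEON" -eq 1 ]]; then
  # Reuse the hosting provider integration instead of keeping the fresh install.
  ddev pull -y
fi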

For the Automated Testing

Now that the team is happy with the new tool, some issues might still come up, but for us nothing was a blocker. The next step is to make sure that the CI also uses the same environment. Before doing that, you should think about whether it's more important to match the production environment or to make Travis really easy to debug. If you execute realistic, browser-based tests, you might want to go with the first option and leave ddev out of the testing flow; for us, though, it was desirable to spin up a local site identical to what's inside Travis. And unlike with our old custom Docker image, maintenance of the image is now solved.

Here’s our shell script that spins up a Drupal site in Travis:

#!/usr/bin/env bash
set -e

# Load helper functionality.
source ci-scripts/helper_functions.sh

# -------------------------------------------------- #
# Installing ddev dependencies.
# -------------------------------------------------- #
print_message "Install Docker Compose."
sudo rm /usr/local/bin/docker-compose
curl -s -L "https://github.com/docker/compose/releases/download/1.22.0/docker-compose-$(uname -s)-$(uname -m)" > docker-compose
chmod +x docker-compose
sudo mv docker-compose /usr/local/bin

print_message "Upgrade Docker."
sudo apt -q update -y
sudo apt -q install --only-upgrade docker-ce -y

# -------------------------------------------------- #
# Installing ddev.
# -------------------------------------------------- #
print_message "Install ddev."
curl -s -L https://raw.githubusercontent.com/drud/ddev/master/scripts/install_ddev.sh | bash

# -------------------------------------------------- #
# Configuring ddev.
# -------------------------------------------------- #
print_message "Configuring ddev."
mkdir ~/.ddev
cp "$ROOT_DIR/ci-scripts/global_config.yaml" ~/.ddev/

# -------------------------------------------------- #
# Installing Profile.
# -------------------------------------------------- #
print_message "Install Drupal."
ddev auth-pantheon "$PANTHEON_KEY"

cd "$ROOT_DIR"/drupal || exit 1
if [[ -n "$TEST_WEBDRIVERIO" ]];
then
  # As we pull the DB always for WDIO, here we make sure we do not do a fresh
  # install on Travis.
  cp "$ROOT_DIR"/ci-scripts/ddev.config.travis.yaml "$ROOT_DIR"/drupal/.ddev/config.travis.yaml
  # Configures the ddev pull with Pantheon environment data.
  cp "$ROOT_DIR"/ci-scripts/ddev_import.yaml "$ROOT_DIR"/drupal/.ddev/import.yaml
fi
ddev start
check_last_command
if [[ -n "$TEST_WEBDRIVERIO" ]];
then
  ddev pull -y
fi
check_last_command

As you see, we even rely on the hosting provider integration, but of course that's optional. All you need to do after setting up the dependencies and the configuration is ddev start; then you can launch tests of any kind.
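
For example, once the containers are up, the test runners can be executed inside the web container; the paths and tools below are placeholders for whatever your project actually uses:

# Run PHP unit tests inside the ddev web container
ddev exec ./vendor/bin/phpunit --testsuite unit
# Or run Drupal coding standards checks
ddev exec ./vendor/bin/phpcs --standard=Drupal web/modules/custom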

All the custom bash functions above are adapted from https://github.com/Gizra/drupal-elm-starter/blob/master/ci-scripts/helper_functions.sh, and we are in the process of ironing out a starter kit for Drupal 8, needless to say with ddev.

One key step is to make ddev non-interactive, see global_config.yaml that the script copies:

APIVersion: v1.7.1
omit_containers: []
instrumentation_opt_in: false
last_used_version: v1.7.1

So it does not ask about data collection opt-in, as it would break the non-interactive Travis session. If you are interested in using the ddev pull as well, use encrypted environment variables to pass the machine token securely to Travis.

The Icing on the Cake

ddev has a welcoming developer community. We got a quick and meaningful reaction to our first issue, and at the time of writing this blog post we already have a merged PR to make ddev play nicely with Drupal-based webservices out of the box. Contributing to this project is definitely rewarding: there are 48 contributors and the number is growing.

The Scene of the Local Development Environments

Why ddev? Why not the most popular choices, Lando or Drupal VM? For us, the main reasons were the Pantheon integration and the pace of development. It definitely has momentum. In 2018, it was the 13th choice for a local development environment amongst Drupal developers; in 2019, it's in 9th place according to the 2019 Drupal Local Development survey. This is what you sense when you try to contribute: the open and active state of the project. What's for sure, based on the survey, is that Docker-based environments are now the most popular. And with a frontend that hides all the pain of working with pure Docker/docker-compose commands, it's clear why. Try it (again) these days: you can really forget the hassle and enjoy the benefits!

Jun 12 2019
Jun 12

Glitzy websites are all the rage these days. Everybody seems to be looking for ways to create multimedia-rich pages with ease. Yet there is a big downside to the current crop of page builders -- if you're not careful, you might end up making your long-term content management far harder than it should be.

WordPress 5 made its Gutenberg "Block editor" the standard for all WordPress sites going forward. Drupal 8.7 added a new "Layout Builder" in its core, adding sophisticated layout capabilities. Both of these are playing catchup to Software-as-a-Service (SaaS) offerings like Squarespace and Weebly -- and a whole bunch of 3rd party modules and plugins that have been filling the gap so far.

The goal for all of these is to make it easy to interleave text with photography, video, galleries, and animations using something approaching a drag-and-drop interface. Yet how they go about doing this varies drastically under the hood. In broad strokes, you can group all of these layout builders into one of 3 categories:

Broad categories of layout builders

Field-Oriented

  • Modules or plugins: Drupal "Layout Builder"; Drupal Panels, Panelizer; Display Suite; custom themes and page-type templates
  • Where items are stored: individual fields live in individual database tables/columns
  • Best for: managing huge numbers of similar items
  • Drawbacks: slightly less flexible -- harder to change up the sequence of rich elements

Repeating Templates

  • Modules or plugins: Drupal "Paragraphs"; Field Collections; Entity References/Inline Entity Form; Commerce Variations
  • Where items are stored: multiple entities are linked together to build up a page
  • Best for: keeping content and presentation separate, to allow re-skinning down the road, while still making rich authoring relatively easy
  • Drawbacks: not as organized as field-based layouts -- harder to extract, search, and aggregate information

Embedded Layouts

  • Modules or plugins: WordPress Gutenberg; Drupal Entity Embed; the vast majority of WP layout plugins
  • Where items are stored: everything is dropped into one huge text field
  • Best for: very easy authoring
  • Drawbacks: very poor at reusing information on other pages; inconsistent look across the site; hard to update the overall look and feel; finicky to use and get "just right"; often has accessibility issues

That's the summary. Now let's take a look under the hood...

How do layout builders store their data, and why should I care?

Which is the best tool -- Excel or Word? Entirely depends on the job you're trying to do, of course. Yet these layout builders are as different as Word and Excel -- some are excellent at creating long documents with lots of variation, while others are far better at preserving data so you can show it in a chart, do math, and more. You wouldn't pick Excel to write an essay, for example, and you shouldn't pick Word to track your finances.

If you are creating a rich landing page for a campaign, a layout builder that takes the Embedded approach can get you there quickly. Lots of support for drag-and-drop, lots of ways to quickly get a result. You can build 5 more while you're at it -- but now try to compare things across 50 of these one-off pages -- now suddenly not having a template and simple fields to fill in makes the job much harder. You create pages for a bunch of products, and then you go to create a product comparison chart, and you're building that table by hand, cut-and-paste.

Or say for example you are a research institution, publishing research papers from dozens of contributors. You can make a nice landing page for each paper, with sections for the author's biography, the category, methodology, supporting organizations, and various other items -- but if you don't put each of these in its own field, it gets a lot trickier to build a nice search interface that will help your visitors find what they are looking for.

What is Content Management?

There are plenty of definitions of Content Management out there, mostly by vendors looking to sell you on a specific system, or pedantic descriptions of how big companies (Enterprises) need all this sophisticated stuff. While we are a vendor trying to sell you on something, let's take a moment to clear away all the B.S.

Website content management is about creating, maintaining, curating, and cultivating the content on your website for the entire life of the website. The problem with the current obsession with layout builders is that all the focus is on that very first step -- creating. It ignores the rest of the lifecycle of your content.

At Freelock, we believe the longer you keep good content on your website, the more valuable it becomes. And keeping old content maintained and relatively fresh is a big part of that job. A Content Management System can help you keep your old content fresh -- keeping it looking consistent with rest of your site, bringing in new traffic, guiding people down your sales funnel to become customers, providing reputation and longevity that assure your customers you're not just another fly-by-night operation.

Embedding all of your rich content into one-off pages hampers this very process, especially when you want to change the look-and-feel of your website -- or find, re-use, or change the overall experience of your oldest content. Let's drill down into these different types of builders to see how they compare, for the longer term.

Field Oriented Layout Builders -- the Excel approach

Adding a custom block to a page using Layout Builder

Drupal excels at information architecture, and so the layout builder Drupal chose to include in its core supports this way of thinking about content. With the ability to easily create fields on different content types, and aggregate content using the powerful "Views" module, Drupal is all about information reusability.

There are dozens of different kinds of fields out there, and an even larger number of ways to use each one. For example, if you add a date field for an event, you can show it on a list of upcoming (or past) events automatically. You can show it on a calendar view. You can show it in a list, or a set of cards.

Add a geolocation field, and now you can show it on a map -- and you can filter that for upcoming events near me. Add a price and now you can create a "facet" that lets you see items between certain price ranges. All of this power builds on all of the other kinds of information stored in fields, and makes it easy to manage hundreds, thousands of similar items.

The new Drupal Layout Builder lets you easily create a template for showing each of these fields in a particular spot on the page, create multiple columns, drag-and-drop items around. In addition, you can create one-off blocks to highlight certain items, and reuse that on other items -- change up the layout entirely on a single item, if you wish.

Managing field layouts in the future

Down the road, if a product is no longer a special snowflake, hit a button and it reverts to the same layout as all the rest of your products -- the layout is stored in "just" another field on the item.

If you want to show a Google Map or a thumbnail linked to a file, you would have a field for the location to show and another field for the media. Then when you place the location field on the layout template, you would pick the "map" renderer to show a map for the field, and when you want to show the downloadable file, you could specify the size to generate the thumbnail and place it where you want it -- and it will look consistent across all the items in your site.

Want to change your design? Swap out the Google Map renderer for OpenStreetmaps, and all of the maps on your site use the new service immediately. Change the thumbnail size for the document, and move it to the left sidebar, and it's done across your site.

Embedded Layouts - the Word approach

The Gutenberg editor in action

The new WordPress Gutenberg editor is the poster child for the opposite way of creating rich layouts. Instead of having a field for each kind of data, you start with a blank page and a collection of blocks you can drop into it.

Honestly, I like using Gutenberg -- once you figure out the basics, it's mostly fun to use. Its killer feature is management of "Reusable Blocks" -- create a chunk of boilerplate, save it as a reusable block, and then you can reuse it on any other Gutenberg page. You can keep it in your page as a "reusable block" or you can convert it to a regular block and edit it.

You can create entire templates this way.

This... this is awesome for creating proposals! Or reports, or anything you need to do once, and don't care much about how it will look in 5 years.

It's very rapid for creating pages, and if you are constantly editing some key landing pages, Gutenberg seems like a fine way to go.

However, for content that's going to stick around for years, especially through a site redesign, it's going to be a bit of a nightmare. And right from the start it stops being useful for a huge number of scenarios modern sites are trying to support.

Very little design control

One thing a CMS attempts to do is make your site look consistent. One challenge with Gutenberg and other approaches that largely dump styles as well as content into a big text area is that it makes it much easier to violate your site's design, leading to ugly, confusing, jarring sites. Having spent several years as a professional writer, seeing garish colors and inconsistent fonts and font sizes makes me shudder. I don't want to have to think about what it looks like -- I just want to write it and trust the designer to make it look good.

Useful data is embedded, hard to reuse

I see blocks for "Product Comparisons" for Gutenberg. Wow, drop these in and you get some boxes where you can enter stuff -- cool!

But... you have to enter that stuff. And it already exists, on the product pages. Wait a second -- I thought this was supposed to make my job easier? And... hey, I have a new product that just varies in two of these areas. Which page did I put that product comparison on?

Managing changes in the future

Back to the earlier scenarios, now I want to switch from Google Maps to OpenStreetmap. To make this change, I need to do a search and replace -- find every map on my site, and generate a new widget from the new map provider. Lots of manual work. Maybe I can find or create a script, but even so, it feels a little brittle -- if I chose a different map style on one page, I might not find that one. And change my document thumbnail to move it to the other side of the page and shrink the thumbnail? Geez, I have dozens of those to do now.

This is the big "mistake" of embedded layouts -- managing content down the road.

And this is not new to Gutenberg -- the vast majority of page builders for WordPress essentially work the same way, embedding "short codes" into the body, and the only way to find them is search.

This is part of why I've heard many shops say you just need to start over and redo your website from scratch every few years.

If you've kept your content separate from the design, that is entirely not true -- having to rebuild your site is entirely the result of having your design too entwined with your content.

Repeating Templates -- a Hybrid

Nested Paragraphs

In between these two extremes, there is a third way. The best current example of this approach is the Paragraphs module for Drupal.

Compared to field-based layouts, you can easily make pages with a bunch of varied layouts, changing the layout as desired, one row at a time. If you do this very much with a field-based layout, you end up with a bunch of blocks hanging out that can clutter other parts of your CMS, and you end up constantly tweaking the design to get a result that looks good.

Compared to Embedded layouts, your content is still entirely separate from the design, making it easy to re-skin down the road. And you can still use fields that can be extracted and compared/searched/reused, although doing that effectively takes a fair amount of upfront planning.

We typically create a set of paragraph types, such as:

  • Plain text
  • Pull quote
  • Image and text (image to the left or right)
  • Photo Gallery
  • Large media (video or image)
  • Columns, cards
  • Tab set, Accordion set
  • Slide carousel
  • Embed a view

When creating your content, you can add these in any order you like. We can provide particular classes for color variations, locked down to your brand's style guide.

The design remains very tightly controlled. Data is not necessarily easily reused -- but you can have a section of Paragraphs on content that still uses fields for all of the data management scenarios you like.

Because everything is templated, a re-skin of the site can make changes to one of these templates and it instantly applies everywhere it is used.

So which layout type should I use?

Should you use Excel or Word? Well, of course, you should use both, if you need them. There are very compelling reasons to use fields -- they are essential to Content Management, and many, many ways they make your work much easier. But there are times when dashing out a quick document, or web page, is needed right now.

By making Gutenberg its default editor, WordPress has gone entirely to the right side of that table -- they are trying to focus on being a good page builder, potentially at the expense of content management. Do you need content management? Depends on how much content you have to manage! If you're only talking about having a nice brochure site, and a steady stream of blog or news posts, this is all fine. But the more sophisticated your needs, the more you're starting to go against the grain. You can add fields to WordPress, and create views of content -- but this involves either creating some code or finding plugins that will help you do this -- many of which are not free/open source (a discussion for another post).

With Drupal, on the other hand, you can choose all three. You can even have all 3 on the same website! We are already using Gutenberg in Drupal on some sites, and we're using Paragraphs almost everywhere. Meanwhile we are very impressed with the new Layout Builder, and find it just the thing for making attractive layouts for certain types of content. You can have your Word, and your Excel too!

Jun 11 2019
Jun 11

Our lead community developer, Alona Oneill, has been sitting in on the latest Drupal Core Initiative meetings and putting together meeting recaps outlining key talking points from each discussion. This article breaks down highlights from meetings this past May. You'll find that the meetings, while providing updates on completed tasks, are also conversations looking for community member involvement. There are many moving pieces as things are getting ramped up for Drupal 9, so if you see something you think you can provide insights on, we encourage you to get involved.

Drupal 9 Readiness

Meetings are for core and contributed project developers as well as people who have integrations and services related to core. Site developers who want to stay in the know to keep up-to-date for the easiest Drupal 9 upgrade of their sites are also welcome.

  • Usually happens every other Monday at 18:00 UTC.
  • Is done over chat.
  • Happens in threads, which you can follow to be notified of new replies even if you don’t comment in the thread. You may also join the meeting later and participate asynchronously!
  • Has a public agenda anyone can add to at https://www.drupal.org/project/drupal/issues/3054328
  • Transcript will be exported and posted to the agenda issue.

Meeting Highlights From 05/13/2019

Update on Drupal 9 dependency targets (CKEditor, Symfony, PHP)

  • The plan is to release Drupal 9 with CKEditor 4 support, add optional CKEditor 5 support during D9's active support, and deprecate CKE4 before D10
  • D9 will probably be EOL in Dec. 2023 and CKE 4 can have security coverage through then
  • Policy issue

Documentation Updates for Drupal 9

Drupal.org tasks for Drupal 9

Opened an issue today to track these and currently working on Drupal 9 plan summary via https://www.drupal.org/project/drupalorg/issues/3046058.

Deprecations

Upgrading to Drupal 9 version of Classy and Stable could be hard given the number of unknowns. We are proposing to provide the current versions of Classy and Stable in contrib space. This would allow pre-existing themes to add a dependency to the contrib version of the themes and have the exact same behavior in Drupal 9 as they had in Drupal 8.

To avoid this problem in Drupal 10, we are proposing to not ship Classy as part of Drupal 9. The feature gap would be replaced with a new starter kit theme.

Risks for June 2020 Target Date

  1. Biggest worries right now are: D7 to D8 migration support, including finishing multilingual support and UX improvements.
  2. Filtering fails that are not actionable.
  3. Roadmap for Simpletest moving forward. Currently Simpletest isn't formally deprecated; we welcome any thoughts and suggestions at https://www.drupal.org/project/drupal/issues/2866082.

Meeting Highlights From 05/27/2019

New Drupal 9 Readiness Features & New Contrib Readiness Data

  1. https://twitter.com/DropIsMoving/status/1130868996771844096 is now live on drupal.org.

  2. Also Dwayne McDaniel figured out a new way to run his contrib analysis scripts and published a fresh set of data at https://github.com/mcdwayne/d9-drupal-check-allthemodules.

  3. Gábor Hojtsy took that and analyzed the top ones up to 50 uses and created this summary doc with docs pulled from api.d.o with a script: http://hojtsy.hu/blog/2019-may-24/analysis-top-uses-deprecated-code-drupal-contributed-projects-may-2019.

  4. Also the coder rules got fixed to conform better to the core trigger_error()/@deprecated formats and a new coder release is out now, so core can update to that https://www.drupal.org/project/drupal/issues/3049433.

Documentation Updates for Drupal 9

An extensive review of the Drupal 9 docs was completed. The review yielded grammatical fixes and improved English formatting.

New Critical for the D9 Roadmap

There's some discussion already in the other meta https://www.drupal.org/project/drupal/issues/2866082.

Sprint plans for DrupalDevDays Transylvania?

  • WebTestBase is officially deprecated, "Convert contrib test suites to BTB" might be a good sprint topic.
  • Alex Pott has a good start on getting the conversion docs a little more up to date and we're trying to continue to add to it, https://www.drupal.org/node/2166895.

Release schedules (branch opening, alpha, beta, rc) for Drupal 8.9 and 9.0

  • Beta1 would be tagged the first week of April 2020 if we are ready for 9.0.0 in June.
    • If we're not ready for 9.0.0 in June by say March, then we switch 8.9 back to a normal minor with a normal minor schedule (and announce the December release date).
    • Also, with regards to documenting on the releases page, we should probably inform the rest of the committers first. The email should outline the alphas, March as a go/no-go decision, the deadline for June vs. Dec. 3, as well as the beta and RC dates.

Freezing entity schema updates between 8.9 and 9.0

  • With 8.7 especially, but not only, entity schema updates are causing a lot of trouble for sites trying to update. Sometimes it is a bug in the update, just as often it is corrupted data or custom code on sites trying to update. Given 9.0.0 will have a lot of other changes, we should consider not committing any big entity schema updates to 8.9/9.0.

  • Opened an issue, "Freeze entity schemas in 8.9.x and 9.0.x", to work on solving this.

Out of the Box Initiative Meeting 05/28/2019

  • We want to create a roadmap for Drupal 8.8 release. You can find the details here: https://www.drupal.org/project/drupal/issues/3047414
  • Talking about SimplyTest.me Umami Demo.
    • Trying to shave off all installation time and figure out where it goes.
    • Adding an option with and without multilingual.
  • Talking about creating help pages about all new features and things in 8.8 release.
  • Add a floating tour button on pages that have an explanation (need to create an issue).
  • Implement Layout Builder on every page, recipe, and article.
  • Working on Umami’s Language-switcher as a drop-down menu.

Layout Initiative Meeting 05/29/19

Currently, when using Layout Builder, the already added sections give no indication of their layout or its configuration. This isn't a big deal for sighted users who are only using the default Layout Builder layouts, because the only configuration is the widths of the columns, which they'll be able to see visually.

However, for non-sighted users, or if a layout has more complex configuration (for example, using different proportions for different view port sizes, or adding class names, or anything since layout plugins have full control over their settings and rendering), then there is no indication given of the section's layout or its configuration.

An issue was created to document this concern.


That's A Wrap

Check back for frequent meeting recaps to gather insights on the latest Drupal Core developments and find ways to get involved. Our community thrives when we all come together!

Jun 11 2019
Jun 11

Every once in a while you have those special pages that require a little extra something. Some special functionality, just for that page. It could be custom styling for a marketing landing page, or a third party form integration using JavaScript. Whatever the use case, you need to somehow sustainably manage JavaScript or CSS for those pages.

Our client has some of these special pages. These are pages that live outside of the standard workflow and component library and require their own JS and CSS to pull them together. Content authors want to be able to manage these bits of JavaScript and CSS on a page-by-page basis. Ideally, these pages would go through the standard development and QA workflow before code makes it out to production. Or perhaps you need to work in the opposite direction, giving the content team the ability to create in Production, but then capturing those changes and pulling them back into the development pipeline in future deployments?

This is where Drupal 8’s Configuration Entities become interesting. To tackle this problem, we created a custom config entity to capture these code “snippets”. This entity gives you the ability to enter JavaScript or CSS into a text area or to paste in the URL to an externally hosted resource. It then gives you a few choices on how to handle the resulting Snippet. Is this JavaScript, or CSS? Do you want to scope the JavaScript to the Footer or the Header? Should we wrap the JavaScript in a Drupal Behavior?

Once the developer makes her selections and hits submit, the system looks at the submitted configuration and if it’s not an external resource, it writes a file to the filesystem of the Drupal site.

Now that you’ve created your library of Snippets, you can then make use of them on your content. From either your Content Type, Paragraph, or other Content Entity – simply create a new reference field. Choose “Other”, then on the next page scroll through the entity type list till you get to the configuration section and select JSnippet. Your content creators will then have access to the Snippets when creating content.

By providing our own custom Field Formatter for Entity Reference fields, we’re then able to alter how that snippet is rendered on the final page. During the rendering process, when the Snippet reference field is rendered, the custom field formatter loads the referenced configuration entity and uses its data and our dynamically generated library info to attach the relevant JavaScript or CSS library to the render array. During final rendering, this will result in the JavaScript or CSS library being added to the page, within its proper scope.
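To make that concrete, here is a minimal sketch of what such a field formatter could look like. The class name, plugin ID and library naming below are hypothetical and simplified -- the actual JSnippet module may organise things differently -- but the core idea is the same: load the referenced snippet and attach its library through '#attached'.

<?php

namespace Drupal\jsnippet\Plugin\Field\FieldFormatter;

use Drupal\Core\Field\FieldItemListInterface;
use Drupal\Core\Field\FormatterBase;

/**
 * Hypothetical formatter that attaches a referenced snippet's library.
 *
 * @FieldFormatter(
 *   id = "jsnippet_attach",
 *   label = @Translation("Attach snippet library"),
 *   field_types = {
 *     "entity_reference"
 *   }
 * )
 */
class SnippetAttachFormatter extends FormatterBase {

  /**
   * {@inheritdoc}
   */
  public function viewElements(FieldItemListInterface $items, $langcode) {
    $elements = [];
    foreach ($items as $delta => $item) {
      // Each item references a snippet config entity; the module exposes a
      // matching library (e.g. via hook_library_info_build()).
      $elements[$delta] = [
        // Nothing visible is rendered; the JS/CSS library is simply attached.
        '#markup' => '',
        '#attached' => [
          'library' => ['jsnippet/' . $item->target_id],
        ],
      ];
    }
    return $elements;
  }

}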

Because these snippets are configuration entities, they can be captured and exported with the site's configuration. This allows them to be versioned and deployed through your standard deployment process. When the deployed configuration is imported, the library is built up and any JS or CSS is written to the file system.

Want to try it out? Head on over to Drupal.org and download the JSnippet module. If you have any questions or run into any issues just let us know in the issue queue.

Jun 11 2019
Jun 11

Virtual. Remote. Distributed. Pick your label. This style of organization is becoming wildly popular and increasingly in demand among agencies and organizations. It saves the cost of office space, allows for hiring the best talent possible regardless of location, can be a huge bonus to employees who require flexibility in their schedules, and saves everyone time commuting, assuming they don't go to a shared work space. You can even wear what you want (being mindful of video chats, of course).

The flipside? While many folks have gone remote, some people find the experience quite isolating and disconnected. Does remote work make people happier? Does it make them more productive? From my experience running a remote-only agency, the answer is not really. Going for days without seeing another human in person can be extremely isolating and demotivating. And while it seems as though you'd have more time at your computer, and therefore would be more productive, often the opposite is true: it can be harder to find focused time to work on tasks when you are at home with multiple screens. And it's even worse if you are distracted by anything at home (deliveries at your door, that laundry in the corner, etc.).

It can also be physically damaging: the human body is not designed to sit at a desk for long periods of time, and there’s less incentive to get up and move if you don’t have to move more than a few feet to your computer.

I know I’ve experienced all those issues. So I feel everyone’s pain. Literally.

The main reason Kanopi Studios exists is to support humans in every way.

We support our clients by giving them great work so they can be successful online, but additionally Kanopi serves to support its employees so they are successful in both their work and home lives. We want our people to always be happy, fulfilled, and constantly evolving in a positive way. So it’s critical that we create an environment and culture that fosters practices that provide meaning, collaboration, and happiness regardless of location. It’s also critical that employees feel empowered to speak up if they are feeling the negative repercussions of remote work.

As CEO, it’s my job to give my staff the right tools and systems so that they are as happy and healthy as possible, and to create connectivity in Kanopi’s culture. Building and sustaining strong relationships requires a unique approach that makes use of a variety of tools to create the right work culture to combat the isolation.

There’s a session I give on this very topic, and the DrupalCon video is linked below. I cover how to be the best remote employee, as well as how to support your team if you are a leader of a remote team. I give key tactics to keep you (and all other staff) inspired, creative, productive and most importantly, happy! I hope you find it helpful in making your own work environment as connected and collaborative as possible, no matter where you are.

[embedded content]
Jun 11 2019
Jun 11

Recently I was interviewed on RTL Z, the Dutch business news television network. In the interview, I talk about the growth and success of Drupal, and what is to come for the future of the web. Beware, the interview is in Dutch. If you speak Dutch and are subscribed to my blog (hi mom!), feel free to check it out!

Jun 11 2019
Jun 11

Websites need to look pretty and be blazing fast. That often means lots of beautiful high-quality images, but they can be pretty enormous to download, making the page slow to load. Images are often one of the 'heaviest' parts of a website, dragging a visitor's experience down instead of brightening it up as intended. If a website feels even a tiny bit unresponsive, that tarnishes your message or brand. Most of us have sat waiting frustratedly for a website to work (especially on mobile), and given up to go elsewhere. Drupal can be configured to deliver appropriately-resized versions, but what's even better than that?

Lazy image loading

Don't send images to be downloaded at all until they're actually going to be seen! Browsers usually download everything for a page, even if it's out of sight 'below the fold'. We know we can do better than that on a modern website, with this technique called lazy image loading.

Lazily loading an image means only sending it for a user to download once they are scrolling it into view. Modern web browsers make this surprisingly simple to achieve for most images, although there are often a few that need special attention. When combined with optimisation from Kraken.io, and other responsive design tricks, performance can sky-rocket again. Check out our case study of NiquesaTravel.com for a great example using this.

Niquesa is a luxury brand for busy people, so the website experience needs to be smooth, even when used on the go over a mobile network. Perhaps more than that, SEO (search engine optimisation) is critical. Their bespoke packages need to show up well in Google searches. Google promotes websites that perform well on mobile devices - so if your site is slow, it needs to be sped up. It's not just that you'll lose out on competitive advantage and tarnish your brand: people simply won't find you.

You can see what Google thinks of your website performance by using their PageSpeed Insights tool. That gives you an overall score and lists specific improvements you can make. Niquesa asked us to boost their score, especially for mobile devices. So we looked to speed up anything slow, and to reduce the amount of things there are to download in the first place. Any website can use that approach too. Lazy image loading speeds up the initial page load, and reduces the amount to download.

This stuff should be standard on most websites nowadays. But many web projects began well before browsers supported this kind of functionality, so it still needs adding in. As an ever-improving platform, the internet allows you to continually improve your site. There's no need to feel locked in to a slow site! Get in touch with us if you're interested in improving your website with lazy loaded imagery. Who wouldn't want beautiful high-quality media and great performance on any device?

Can you teach me to be lazy?

Sure! Rather than using the normal src attribute to hold the image file location, use a data-src attribute. Browsers ignore that, so nothing gets downloaded. We then use the browser's Intersection Observer API to observe when the image is being scrolled up into view. Our javascript can jump in at this point to turn that data-src attribute into a real src attribute, which means the browser will download the real image.

On its own, that wouldn't take very long to set up on most websites. But on top of this, we often go the extra mile to add some extra optimisations. These can take up the majority of the time when applying lazy loading to a website, as they are a great improvement for the user experience, but usually need crafting specifically for each individual project:

  • Images defined via style or srcset attributes (rather than a src attribute) and background images in CSS files, need similar handling. For example, use a data-style or data-srcset attribute.
  • Images that we expect to be immediately in view are excluded from any lazy loading, as it is right to show them immediately.
  • It may be important to keep a placeholder in place of the real image, perhaps either to keep a layout in place or in case javascript is not running. Styling may even need to be tweaked for those cases. Sadly it's not unusual for third-party javascript out of your control to break functionality on a page!
  • Dimensions may need some special handling, as Drupal will often output fixed widths & heights, but responsive design usually dictates that images may need to scale with browser widths. If the real image is not being shown, its aspect ratio may still need to be applied to avoid breaking some layouts.
  • Some design elements, like carousels, hide some images even when they are within the viewport. These can get their own lazy magic. One of our favourite carousel libraries, Slick, supports this with almost no extra work, but many designs or systems will need more careful bespoke attention.

Here is a basic example javascript implementation for Drupal:

(function($) {
  // Set up an intersection observer.
  Drupal.lazy_load_observer = new window.IntersectionObserver(function(entries) {
    for (var i in entries) {
      if (entries.hasOwnProperty(i) && entries[i].isIntersecting) {
        var $element = $(entries[i].target);
        // Take the src value from data-src.
        $element.attr('src', $element.attr('data-src'));
        // Stop observing this image now that it is sorted.
        Drupal.lazy_load_observer.unobserve(entries[i].target);
      }
    }
  },
  {
    // Specify a decent margin around the visible viewport.
    rootMargin: "50% 200%"
  });

  // Get that intersection observer acting on images.
  Drupal.behaviors.lazy_load = {
    attach: function (context, settings) {
      $('img[data-src]', context).once('lazy-load').each(function() {
        Drupal.lazy_load_observer.observe(this);
      });
    }
  };
})(jQuery);

(This does not include a fallback for older browsers. The rootMargin property, which defines how close an element should be to the edge of the viewport before being acted on, might want tweaking for your design.)

Drupal constructs most image HTML tags via its image template, so a hook_preprocess_image can be added to a theme to hook in and change the src attribute to be a data-src attribute. If required, a placeholder image can be used in the src attribute there too. We tend to use a single highly-cacheable transparent 1x1 pixel lightweight image, but sometimes a scaled down version of the 'real' image is more useful.
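As a rough illustration, a theme-level preprocess might look something like the sketch below; the theme name "mytheme" and the placeholder path are hypothetical:

/**
 * Implements hook_preprocess_image() for a hypothetical "mytheme" theme.
 *
 * Moves the real image URL into data-src so the lazy-loading JavaScript can
 * restore it when the image scrolls into view.
 */
function mytheme_preprocess_image(&$variables) {
  if (!empty($variables['attributes']['src'])) {
    // Keep the real location for the lazy loader.
    $variables['attributes']['data-src'] = $variables['attributes']['src'];
    // Serve a tiny, highly cacheable transparent placeholder in the meantime.
    $variables['attributes']['src'] = '/themes/custom/mytheme/images/pixel.png';
  }
}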

The lazy loading idea can be applied to any page element, not just images. Videos are a good candidate - and I've even seen ordinary text loaded in on some webpages as you scroll further through long articles. Enjoy being lazier AND faster!

Image: Private beach by Thomas

Jun 11 2019
Jun 11

We use the Advanced Encryption Standard, "AES-256", to encrypt the data with a cipher. The encrypt and decrypt approach taken is the Cipher Block Chaining mode: "AES-256-CBC".

AES Encrypt

  • Store the "Secret" in a file outside the web root.
  • Hash the "Secret" with sha256; this gives you the "Key" that will be used by openssl_encrypt.
$key = hash('sha256', $secret, true);
  • Generate pseudo-random bytes as the "IV", which is used during encryption and is also attached to the encrypted data.
$iv = openssl_random_pseudo_bytes(16);
  • Encrypt the "String" with openssl_encrypt, passing the "AES-256-CBC" method, the "Key" and the "IV". openssl_encrypt encrypts the given data with the given method and key, and returns a raw or base64 encoded string.
$ciphertext = openssl_encrypt($plaintext, $method, $key, OPENSSL_RAW_DATA, $iv);
  • Hash the returned "Cipher" text with the sha256 HMAC method.
$hash = hash_hmac('sha256', $ciphertext, $key, true);
  • Concatenate the "IV", "Hash" and "Cipher" and store the result in the database as the encrypted value (a consolidated sketch follows this list).
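Putting those steps together, a minimal sketch of the encrypt side could look like this (assuming $secret has already been read from the file kept outside the web root):

function aes_encrypt($plaintext, $secret) {
  $method = 'AES-256-CBC';
  // Derive the 256-bit key from the secret.
  $key = hash('sha256', $secret, true);
  // Generate a random 16-byte IV for this encryption.
  $iv = openssl_random_pseudo_bytes(16);
  // Encrypt the plaintext as raw bytes.
  $ciphertext = openssl_encrypt($plaintext, $method, $key, OPENSSL_RAW_DATA, $iv);
  // HMAC the ciphertext so tampering can be detected later.
  $hash = hash_hmac('sha256', $ciphertext, $key, true);
  // Store IV (16 bytes) + HMAC (32 bytes) + ciphertext together.
  return $iv . $hash . $ciphertext;
}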

AES Decrypt

  • Hash the "Secret" with sha256; this gives you the "Key" that will be used by openssl_decrypt.
$key = hash('sha256', $secret, true);
  • Split the concatenated string back into the "IV", "Hash" and "Cipher".
$iv = substr($ivHashCiphertext, 0, 16);
$hash = substr($ivHashCiphertext, 16, 32);
$ciphertext = substr($ivHashCiphertext, 48);
  • Decrypt the "Cipher" with openssl_decrypt, passing the "AES-256-CBC" method, the "Key" and the "IV". openssl_decrypt takes a raw or base64 encoded string and decrypts it using the given method and key.
openssl_decrypt($ciphertext, $method, $key, OPENSSL_RAW_DATA, $iv);
  • Return the decrypted "String" (see the sketch below).
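And the matching decrypt sketch. The hash_equals() comparison is our addition, implied by storing the HMAC alongside the ciphertext, so tampered values are rejected before decryption:

function aes_decrypt($ivHashCiphertext, $secret) {
  $method = 'AES-256-CBC';
  $key = hash('sha256', $secret, true);
  // Split the stored value back into its three parts.
  $iv = substr($ivHashCiphertext, 0, 16);
  $hash = substr($ivHashCiphertext, 16, 32);
  $ciphertext = substr($ivHashCiphertext, 48);
  // Verify the HMAC before trusting the ciphertext.
  if (!hash_equals($hash, hash_hmac('sha256', $ciphertext, $key, true))) {
    return false;
  }
  return openssl_decrypt($ciphertext, $method, $key, OPENSSL_RAW_DATA, $iv);
}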

Cheers :)

Jun 11 2019
Jun 11

Change is the only constant, and yet what one fears most is change. But it is rightly said about change: "Don't be afraid of change. You may lose something good, but you may gain something better." We'd like to say the same about the fear you hold about moving your current Drupal 6/7 site to Drupal 8. Well, we also know that it's more confusion than fear of change, since you're stuck between two thoughts -- whether to upgrade now to Drupal 8 or wait for Drupal 9. What if we say we offer you a solution that kills both birds with one stone?

An Easy, Inexpensive & Drupal 9 Compatible Migration!

We have been an active Drupal community member for the past 6+ years, with 7+ Drupal projects supported, 5000+ successfully delivered international projects and 500+ international Drupal projects -- out of which 100+ are Drupal migration projects. Hence, we can help you migrate your current Drupal 6/7 site to Drupal 8, and in a way that you will not have to spend a single penny extra when migrating to Drupal 9 in the future. There's a bunch of rational reasons to back this statement and offer of ours, which we'd like to share with you:
 

  • Change in Drupal Philosophy
    Previously, every Drupal upgrade was considered a tedious and highly technical task compared to its counterpart CMS platforms. That changed with Drupal 8, which was created with a philosophy of bridging the gap between the technical developer and a non-technical admin. Carrying this philosophy of positive change forward, Drupal 9 is going to bridge the upgrade gap by being compatible with the previous major version -- making the entire process effortless and inexpensive.
     

  • Upgrade-based Modules
    The compatibility between older and newer versions of Drupal has largely depended upon the modules and themes used while building the older version. Until those modules and themes were upgraded, the migration was a time-consuming and tedious task that required technical assistance. This has changed with the new upgrade path for content, which makes the migration easier if you prepare for it.
     

  • Drupal Core Deprecating Policy
    Drupal 8 is capable of introducing new APIs and features alongside the old ones. Once these new ones are launched, the old ones get deprecated. Though these old APIs cannot be removed in a minor release of Drupal 8, they will be removed in the next major version, Drupal 9. Hence, if you migrate to Drupal 8 now, the migration to Drupal 9 can easily be done with just a handful of changes to make your site compatible.
     

Looking at the above three major reasons, it should be clear that migrating to Drupal 9 from Drupal 8 is far easier than migrating from Drupal 6/7 to Drupal 9. Dries Buytaert, the founder of Drupal, has also shared similar information about the planning being done for Drupal 9. According to him, Drupal 9 is basically being built in Drupal 8 instead of in a different codebase altogether. This implies that new features are added as backward-compatible code and experimental features, and once that code is stable the old functionality is deprecated.
 

Dries, in his blog on 'Plan for Drupal 9', has cited contributed module authors as one of the core reasons behind the easy migration from Drupal 8 to Drupal 9. He notes that module authors are already well-equipped with the upcoming technologies of Drupal 9 and hence can work ahead of time in a manner that is Drupal 9 compatible. AddWeb, being one of these contributing members of the community, can assure you of an easy and inexpensive migration to Drupal 9 as and when it arrives.
 

Why Vouch for Drupal 9?
Now, after grasping all the above information regarding the upcoming major release, you must be wondering what's in Drupal 9 to vouch for. Let us throw some light on the same and bring some clarity. Drupal 9 is all about eliminating the use of deprecated modules and APIs. Drupal 8, which depends on Symfony 3, reaches end of life in November 2021. Hence, it is highly advisable to upgrade and avail the benefits of all that's latest!
 

Concluding Words:
As an expert #Drupal-er and active community member, AddWeb is all set to offer you this amazing opportunity to migrate from your current Drupal 6/7 site to Drupal 8, in a way that the future migration to Drupal 9 will be super easy and inexpensive. Share your details with us here and let our Drupal Migration Experts get back to you. In case of any queries or suggestions, feel free to get in touch with us!

Jun 11 2019
Jun 11

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

Recently, GitHub announced an initiative called GitHub Sponsors where open source software users can pay contributors for their work directly within GitHub.

There has been quite a bit of debate about whether initiatives like this are good or bad for Open Source.

On the one hand, there is the concern that the commercialization of Open Source could corrupt Open Source communities or harm contributors' intrinsic motivation and quest for purpose.

On the other hand, there is the recognition that commercial sponsorship is often a necessary condition for Open Source sustainability. Many communities have found that to support their growth, as a part of their natural evolution, they need to pay developers or embrace corporate sponsors.

Personally, I believe initiatives like GitHub Sponsors, and others like Open Collective, are a good thing.

It helps not only with the long-term sustainability of Open Source communities, but also improves diversity in Open Source. Underrepresented groups, in particular, don't always have the privilege of free time to contribute to Open Source outside of work hours. Most software developers have to focus on making a living before they can focus on self-actualization. Without funding, Open Source communities risk losing or excluding valuable talent.

Jun 10 2019
Jun 10

Why?

Because instead of building a radically new version of Drupal in a separate codebase, Drupal 9 is being built in Drupal 8.

You might be thinking… “Huh?!”

Well, what this means is that the upgrade experience will be as smooth as a monkey's bottom.

Drupal 9 will essentially be just like another minor core update in Drupal 8. 

What is a minor core update? Quite simply, it’s the middle number in the version of Drupal you are running.

Core updates come out roughly every 6 months and keeping your site up-to-date with these is critical in making sure it’s well maintained.

Drupal patch meaning (photo credit: Acquia webinar)

Drupal 9 release date

So when is Drupal 9 expected to be released?

From the information we have so far, it's scheduled for the second quarter of 2020.

So why was this date chosen in the first place?

Simple. A little thing called Symfony 3.

Drupal 8’s biggest dependency is Symfony 3, which has an end-of-life date of November 2021.

This means that after November 2021, developers will not resolve any security bugs in Symfony 3, and Drupal will be in the same situation.

Drupal 9 will be using Symfony 4 or 5 and won’t have to worry about this issue.

Drupal 9 release date (photo credit: Acquia webinar)

What does this mean for Drupal 6 or 7 sites?

Well, it means you're missing out on a helluva lot of great new features, you have potential security risks, and you're flat-out hindering yourself from being able to deliver amazing user experiences that will help your business grow.

If you’re on Drupal 6, support ended as far back as February 2016 and you’ve got serious unmitigated security risks.

If you're on Drupal 7, support ends in 2021, and we all know how fast time goes.

If you haven't started planning or budgeting for this, it's time to start now.

The migration from either 6 or 7 to 8/9 is going to be painful and cost-intensive, but it will be the last great migration you'll need to undertake with Drupal.

Does this resonate with you? Let's chat!
 


Jun 10 2019
Jun 10

This post was written by Adam Bergstein with input and feedback from Aimee Degnan and Ryan Bateman.

Hook 42 recently had a project that required the use of Kubernetes. Kubernetes is an orchestration tool that helps solve many enterprise problems when using multiple containers. People often want redundancy and scale across a series of containers or the same set of containers across multiple machines. Kubernetes helps solve this. It also can help perform orchestration tasks during failures or to distribute load between containers. Managing containers at scale can be a challenge and the goal of Kubernetes is to help.

We have long been tracking the efforts of Lagoon, a promising open source continuous integration and continuous delivery (CI/CD) hosting platform that is developed with Kubernetes. CI/CD is built around the concept of rapid deployments of ongoing, frequent changes. This lowers the risk presented by larger, fixed deployments by promoting smaller, more routine deployments. Lagoon not only offers hosting-related tools, but the platform is able to run locally as well. The CI/CD capabilities helped create testable environments as we pushed changes through our development workflow. We want to share our experience. 

Understanding Considerations

There are some key concepts to understand how Lagoon works before diving in.

Persistence

Drupal applications require a “source of truth” for persistent content, which includes a Drupal application’s managed files and database. Production environments (or pre-production before a system is launched) conventionally serve as the source of truth environment within a development workflow. Content is subsequently pulled from the source of truth and it is only managed by changes made directly on the production system. Where code can be pushed through environments, content should always get pulled from production. 

Repositories

Code repositories are critical for managing code and deployments. Each change is one or more commits that can be staged within specific branches. Changes can be merged into other branches, rolled back if there is an issue, and audited as a series of changes. 

Hosting providers offer varying conventions and integrations tied to code repositories. As a simple example: both Pantheon and Acquia offer their own Git repositories commonly synchronized with a Github or Gitlab repository. The Github/Gitlab repositories offer development tools like pull requests or integrations to help support team-based workflows. 

Hooks/Events

Both repositories and hosting platforms expose relevant hooks that are useful for performing actions during DevOps events. This is how automation can be built into specific changes. Automation is critical for any CI/CD infrastructure, as it’s not manageable or practical to manually rebuild environments as each frequent change occurs. 

Creating and maintaining branches, tags, and pull requests tied to repositories create opportunities for automation that are commonly leveraged repository events in our DevOps infrastructure. Even synchronizing between two repositories can be a useful DevOps trigger, as this signifies code is ready for some degree of deployment or testing. 

Any sort of deployment found in CI/CD workflows often require rebuilding containers. This is common for all environments. Persistent aspects may be left untouched, while the containers for a given environment are rebuilt as new changes are deployed.

On-demand Environments

Cloud infrastructures changed the traditional way of understanding environments. On-demand environments are a result of rapid change and are transient. This is in contrast to an environment traditionally configured on a bare metal server. On-demand environments are commonly provisioned with new branches, tags, or pull requests and destroyed when the changes are deployed. They are not intended to persist.

In a CI/CD workflow, incremental changes are verified before the production deployment. Development-specific branches and pull requests can build new environments known as on-demand environments. Persistent contents from the “source of truth” are copied with changes pushed to a branch on a new environment. This helps to verify and mimic production behavior with the new change. And, the environment subsequently goes away when the change is deployed.

Fixed Environments

Hosting providers still commonly offer "fixed" environments, as people are not often comfortable merging an on-demand environment right into production. But fixed environments exist for an intended purpose. As an example, a production environment is intended to be what end-users access. Other environments are commonly used to vet changes before a production launch. Vetting may include any of the following purposes: proper deployment, stakeholder approval, automated tests, and verification against the most recent code (changes pushed while the code was developed and/or staged). The same fixed environments can be uniquely configured and used for those purposes on a more permanent basis.

Exploring Lagoon

Every hosting platform has a set of best practices tied to their platform that drives intended use. While any platform, like Lagoon, can seem opinionated, our focus is to connect the aforementioned concepts to their specific Lagoon equivalent. 

Repositories, Hooks, and Events

Lagoon, at this point, attaches to a remote repository, like Github or Gitlab. Lagoon integrates through the repository platform, via a webhook or continuous integration system. The hook is intended to be invoked during specific events like branch creation and/or pull request to help create the CI/CD on-demand environments. This approach shaped our development and release workflow to the Lagoon platform, which we elaborate more on below. 

Fixed Environments

The first thing our team did was set up fixed environments tied to specific branches. This met the need of having changes go through a conventional development, staging, and production release cycle. Within Github, there are subsequent branches for each fixed environment. As a best practice, each branch was locked within Github. This is to ensure a branch was not accidentally removed, which may remove a fixed environment entirely. 

Github repositories often leverage “master” as the default branch. We’ve selected that as our branch for our development server. This is useful for pull requests, as “master” is selected by default and our team didn’t need to worry about selecting the wrong branch. Doing so may trigger a code deployment to another, unintended environment.

On-demand Environments

Lagoon maintains two events for provisioning on-demand environments. The first is pull requests. Pull requests seem to be useful if you are operating in a pure CI/CD environment where changes can be tested as an environment tied to the change. But, pull requests often are not created until a potential change is ready to be reviewed and possibly merged. This would be ideal for an environment to do smoke testing. But, we desired to have environments for work-in-progress as well, where anyone could push a commit to a branch, demonstrate some work, or get help on something. We opted not to use pull requests for that reason.

The second provisioning event is through a branch pattern. This leverages a naming convention to create on-demand environments. Lagoon monitors the creation of any branch that matches the pattern and creates an environment. This is helpful for the initial testing of changes.

Release Workflow 

Our release workflow is based on staggered branches. Changes are pushed to the on-demand branches and prepared for initial review. A pull request is made for the code review and smoke testing occurs on the on-demand environment. Once the pull request is merged into master, our development server is rebuilt (any commits pushed to master trigger a rebuild of the development server). We close all on-demand branches at this point, which subsequently removes the environment on Lagoon. 

Once the merge is complete, additional quality assurance, client review, and automated testing occurs through development and staging environments. This happens by making a pull request from the master to staging branches, creating a release candidate. 

With all of the verification passing, we are able to initiate the release. This is done via a pull request from the staging to production branches. Once the pull request is accepted, the production release occurs. All of this is automated thanks to the hooks and events tied to the Lagoon platform.

Persistence

Lagoon rebuilds environments on every push to a branch. In our workflow, accepted pull requests push vetted code to branches. Persistence becomes a major factor when environments can be rebuilt in this fluid manner. 

Lagoon also provides the ability to configure what environment is deemed the production environment. This is subsequently protected within Lagoon and persistent. The “production” branch, and its subsequent environment, represent the source of truth for database and files. Rebuilding the database and files on a production release is risky, so this mechanism needs to exist to differentiate from the other more fluid environments that get rebuilt more routinely. These protection mechanisms helped avoid production data being overwritten through the API or any unintended impact by a production deployment and subsequent rebuild. Not only that, but this helps identify what data needs to be routinely backed-up and maintain high fidelity.

DevOps

All non-production environments should automatically load a copy of the production database and files when new changes are pushed to the subsequent branch. Verifying changes before being released to production was a critical DevOps automation for us.  We leverage hooks in Lagoon (defined in the .lagoon.yml file) with Drush commands and aliases to identify the environments that synchronize the database and files from production every time an environment is rebuilt.  

Code artifacts were also a vital part of our DevOps automation. We leveraged Composer to build the Drupal codebase (using the great Composer template for Drupal projects - drupal-composer/drupal-project) and Gulp to build the theme. Custom code was committed to the repository. This allowed us to easily and routinely evaluate changes to core and contrib.

Once our code was built, we executed Drush commands to import the configuration, run database updates, and clear caches to ensure changes were properly deployed. While this does not catch every possible nuance in deploying code (e.g. rebuilding entities), we automated a significant portion of this that should minimize the need for running manual commands. 

Observations

Lagoon is doing some innovative work, especially for Drupal teams looking for a CI/CD platform adopting rapid releases. Their “infrastructure as code” implementation, through their Docker images and .lagoon.yml configuration, enable rapid, effective change at the heart of DevOps that can help continuous learning and the subsequent predictability of automation. 

As expected, coming from the perspective of using other hosting platforms like Acquia and Pantheon, there was some learning to adapt to the fluid CI/CD nature of the platform. Lagoon's local implementation replaced our standard MAMP, DDEV, and Lando setups. Being able to add in restrictions, like branch locking, was beneficial to our team's transition when configuring the infrastructure. Also, there were some different conventions, like leveraging Drush aliases through Lagoon's CLI container rather than through a local Drush, that were unique. But much of our existing knowledge and many of our concepts were the same or could easily be mapped to their Lagoon equivalent.

After some trial and error, we were able to share some feedback with the Amazee team on improvements to their documentation and relevant code samples we felt could help others. Hopefully that shapes the path forward for others being onboarded and makes the platform easier to digest for those new to it.

Overall, Lagoon shows a lot of promise for modern workflows. While different than other hosting platforms, Lagoon enabled our team to work effectively and efficiently from start to finish. We’re excited to see how the platform evolves and continues to provide solutions capable of rapidly changing to our customer’s needs. 

Jun 10 2019
Jun 10

The audience revels in the magnificent performances of the actors, picturesque visuals, breathtaking action sequences, alluring background score, thoughtful dialogues, and emotions attached to the narrative. To bring them all out in the best possible way on to the screen, there goes exceptional direction and screenplay behind the scenes, in addition to a massive swathe of people involved in different parts of the film. Evidently, a film works wonders when both the onscreen elements and the off-screen elements strike the right chord.



A similar theory is of paramount significance in the case of web development. The rapid evolution of diverse end-user clients and applications has resulted in a plethora of digital channels to support. Monolithic architecture-powered websites leverage web content management solutions for disseminating content via a templating solution tightly coupled with the content management system on the backend. Propelled by the need to distribute content-rich digital interactions, application development and delivery (AD&D) professionals who support content management systems (CMS) are showing an inclination towards an API-first approach.
 
Headless CMSes have been leading the way forward to provide a spectacular digital experience and Drupal, being API-first, is a quintessential solution to implement a headless architecture. Before we move forward, let’s briefly look at how significant is content for your online presence and how the headless CMS is fulfilling the needs of organisations.

Content: Linchpin of ambitious digital experience

It is difficult to envisage a digital screen without content as every single moment that we spend on a smartphone, laptop, tablet, or a smartwatch is enriched with digital content like images, text, video, product reviews and so on. Even when we talk to a voice assistant and inquire about something, its answers constitute words, links, pictures, maps etc. (again, it’s all content). The relevance quotient of that content should be top-of-the-line as it is the medium that enables users to experience their digital interactions. This makes content the linchpin of ambitious digital experiences.

The relevance quotient of content should be top-of-the-line as it is the medium that enables users to experience their digital interactions

Several content repositories are struggling to meet today's digital requirements. When the world was just web and email, governance of dynamic content dissemination worked perfectly fine using a web CMS. A web CMS has been a remarkable solution for offering unique designs, WYSIWYG authoring, a workflow for approvals and translation, fantastic marketing capabilities and internet-scale delivery.

Forrester's "The rise of the headless content management system" report states that web CMSes have led to content cluttered with markup and metadata. Moreover, if you optimise your content repository for HTML templates, you would have to undo all those optimisations in order to use the content elsewhere in a non-HTML format. Also, tightly coupled approaches did not need APIs (Application Programming Interfaces) connecting the repository to the delivery tier or the content editing and workflow tools. And selling the content repository and delivery environment together is great for web-only scenarios, but reusing the content in a mobile application or in email marketing would still require you to run the entire web CMS stack.

This is where the need for a headless CMS kicks in. It uses modern storage, stateless interfaces and cloud infrastructure for the efficacious delivery of Internet-scale content experiences on any device.

Uncloaking headless CMS

Source: Forrester

Headless CMS is a content component in a digital experience architecture that interacts with other components comprising of authoring, delivery front ends, analytics tools through loosely coupled APIs. It does not do any rendering of content and the rendering is decoupled from the management interface which is why terms ‘headless’ and ‘decoupled’ are used interchangeably.

Headless CMS stores the content, offers a user interface for the creation and management of content, and provides a mechanism for accessing the content through REST APIs as JSON

While ‘head’ refers to the frontend rendering or the presentation of the content, the ‘body’ refers to the backend storage and the governance of the content.

Headless CMS stores the content and offers a user interface for the creation and management of content. It provides a mechanism for accessing the content through REST APIs as JSON. So, it is also referred to as API-first CMS.

Content can be delivered to and integrated with a third-party system like an e-commerce tool. Or, it can be delivered to and exhibited using frontend technology in the browser, a mobile app or a syndication service. Headless CMS is a wonderful content-as-a-service solution.

Source: Contentstack

A traditional CMS handles the creation of content, its dissemination and its display. It has a backend where users can enter content, which is stored in a database, retrieved, rendered into HTML on the server and then delivered as fully rendered pages to the browser.

In contrast, headless CMS decouples the rendering and presentation system thereby enabling you to replace it with frontend or other technologies of your choice. The CMS will be a content store and web application for the content producers and the content is delivered to the frontend or another system through an API.

With the stupendous rise of headless architectures, a portion of the web is turning server-centric for data and client-centric for the presentation. This has given momentum to the ascension of JavaScript frameworks and on the server side it has led to the growth of JSON:API and GraphQL for better serving the JavaScript applications with content and data. Among the different web services implementations like REST, JSON:API and GraphQL, when we consider request efficiency, JSON:API is the better option as a single request is usually sufficient for most needs. JSON:API also is great in operational simplicity and is perfect while writing data.
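As a small illustration of the JSON:API side, a consumer application could pull article nodes from Drupal's JSON:API endpoint with nothing more than an HTTP GET; the host name below is a placeholder and error handling is omitted:

// Fetch the default JSON:API collection for "article" nodes.
$url = 'https://example.com/jsonapi/node/article';
$document = json_decode(file_get_contents($url), true);

// Each resource object exposes its fields under "attributes".
foreach ($document['data'] as $article) {
  echo $article['attributes']['title'] . PHP_EOL;
}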

Headless CMS decouples the rendering and presentation system thereby enabling you to replace it with frontend or other technologies of your choice

Headless CMS is advantageous for the following reasons:

  • You can instantly start with headless with no hurdles.
  • It does not require you to alter your existing delivery tier, as it seamlessly fits into the existing architecture.
  • It is perfect for building web and mobile applications as it allows practically any application -- be it web, mobile, IoT (Internet of Things), smart TV or touchscreens -- to pull and push content.
  • Frontend developers, backend developers, marketing and content editors can get started quickly and work autonomously.
  • You can give more power to the front-end developers as they simply work with content APIs and do not have to learn the inner functionalities of the CMS or its templating system.
  • It follows the approach of ‘Create Once, Publish Everywhere’ thereby allowing you to reuse content for different channels.
  • It works tremendously well in a microservices environment and enables cross-functional teams to work via agile processes and get tasks done swiftly.

Going the Drupal way

Call it headless or decoupled, leveraging Drupal, as the central content service, is a magnificent solution to power your complete application and device ecosystem. Decoupled Drupal has the provision for omnichannel delivery of content that is quintessential for marketers and publishers.

Decoupled Drupal has the provision for omnichannel delivery of content that is quintessential for marketers and publishers

It enables the developer to leverage any technology for rendering the frontend experience instead of theming and presentation layers in Drupal. The Drupal backend exposes content to native applications, JavaScript application, IoT devices and other such systems. In addition to the modules for web service implementations like REST, GraphQL and JSON:API, Decoupled Drupal ecosystem also offers several other alternative modules that can be of huge help.

Source: Dries Buytaert's blog

There are different approaches to decouple Drupal:

Coupled Drupal

In traditional Drupal, also referred to as coupled Drupal, monolithic implementation is done in which Drupal has the authority over all frontend and backend side of your web application setup. Coupled Drupal is fantastic for content creators, especially when you are in dire need of achieving fast time to market without relying too much on front-end developers. Developers, who love Drupal 8 and want it to own the entire stack, still find it a great way of building a web application.

Progressively decoupled Drupal

Another way to utilise the power of Drupal is the progressively decoupled approach. It is a compelling approach for developing Drupal's frontend where the governance of contiguous experiences is shared by content editors, site assemblers and front-end developers. While content authors and site assemblers get the benefits of contextualised interfaces, content workflow, site preview and so on, staying usable and integrated with Drupal as a whole, a portion of the page is dedicated to a JavaScript framework so that front-end developers can work autonomously. Progressive decoupling utilises Drupal's rendering system while simultaneously using a JavaScript framework to power client-side interactivity.

Fully decoupled Drupal

In fully decoupled Drupal, there is a complete separation between Drupal’s frontend and the backend. The Twig theme layer is replaced with a different frontend entirely. Native mobile or desktop applications, JavaScript single-page applications or IoT applications are some of the examples. RESTful API is leveraged by these applications to communicate with Drupal. RESTful API, which acts as a middle layer between frontend and backend, exposes resources as JSON or XML that can be queried or modified with the help of HTTP methods like GET, POST etc. Even though integral features like in-place editing and layout management are not available, the fully decoupled approach is preferred by developers as it offers ultimate authority over the frontend and is superb for those who are already experienced with the development of applications in frameworks like React, Vue etc.

The increasing intricacy of JavaScript development has given birth to the JAMstack (JavaScript, APIs, Markup), which has in turn resulted in another favoured approach called fully decoupled static sites. Enhanced performance, security and reduced complexity for developers have made static sites a favourite option among many developers. For instance, Gatsby, a static site generator, can retrieve content from Drupal, generate a static site, and deploy it to a content delivery network (CDN) via a specialised cloud provider like Netlify.

Meritorious features of decoupled Drupal

Following are some of the major benefits of decoupled Drupal:

  • Syndication of content: Whether it is a coupled approach or a decoupled approach, Drupal remains the hub while developing experience ecosystems with all of them ingesting content from one source of truth.
  • Full separation: Even though the monolithic and progressively decoupled approaches in Drupal have an implicit separation of concerns that is mostly invisible to the user, a fully decoupled architecture gives you an explicit separation between structured content that is governed by Drupal and its presentation, which is managed by consumer applications.
  • User experience: Decoupled architecture offers an amazing user-centred experience. For instance, a JavaScript framework can be more suited to the task when it comes to an interactive application which is in dire need of frequent re-renderings of content.
  • Work in parallel: Decoupling also brings efficacy to a pipelined development process which involves teams working in parallel. A team of front-end developers can develop applications against a dummy web service API that is utilised only for the purpose of testing but not actually completed whereas the team of backend developers can administer the backend that exposes the API and the underlying processes yielding it.

Challenges of Decoupled Drupal

Some of the major hurdles while decoupling Drupal are mentioned below:

  • Editing and governance: Drupal 8's wonderful features like in-place editing, configuration menus constituting certain page components, and some modules that include contextualised tools for Drupal governance won't be available.
  • Security: Although JavaScript and application frameworks have provisions for defending against cross-site scripting attacks, fully decoupled and progressively decoupled approaches place the obligation of carefully scrutinising the security implications on your team.
  • Point of failure: A fully decoupled architecture requires the use of stacks like MERN (MongoDB, Express, React, NodeJS) or MEAN (Angular instead of React), or other solutions that may be imperative for native mobile or IoT applications. That means it can be challenging to introduce an additional hosting stack into your firm's infrastructure, and it adds an additional point of failure.
  • Layout management: Having to drop modules like Panels and Display Suite can be an issue for developers, and it creates obstacles for marketing teams who do not have access to developers who can help implement layout changes.
  • Previews: It can be challenging if your editorial team wants a previewable content workflow, as it is used to working with a coupled CMS.
  • Notifications: In a fully decoupled architecture, Drupal system messages, which are frequently highlighted at the top of rendered pages, are not accessible. Providing these messages in a progressively decoupled setup, however, is not much of an issue.
  • Performance: The BigPipe module works tremendously well in enhancing web performance in Drupal and can match the page load performance of JavaScript applications. A fully decoupled architecture is devoid of this feature, but a progressively decoupled setup gives you the option of leveraging it.
  • Accessibility: Drupal won't be providing the ready-made frontend code or the roster of core interface components and interactions that can normally be relied upon, which means front-end developers have to build a suitable UX and ensure accessibility without the assistance of Drupal.

Strategies for choosing decoupled Drupal

Assessing organisational needs is central to the decision-making process. A clear view of the business requirements behind your digital presence helps you form a sound strategy for whether, and how, to decouple Drupal.

For a single standalone website, decoupled Drupal may or may not be the right option; it depends on which features your developers and content editors consider genuinely necessary. If you are building multiple web experiences, a decoupled Drupal instance can be used either as a content repository without its own public-facing frontend, or as a traditional site that also acts as a content repository. How dynamic your web application needs to be will, in turn, guide the choice of JavaScript framework or static site generator.

Developing native mobile or IoT applications may push you towards a fully decoupled approach in which you expose web service APIs and consume the Drupal site as a central content service without a public-facing frontend of its own.

The important point is that Drupal can support almost any of these use cases, which streamlines the process of going decoupled.

Case studies

Some of the biggest names in different industries have chosen decoupled Drupal to power their digital presence.

The Economist

Established in 1843, The Economist, which set out to take part in “a severe contest between intelligence, which presses forward, and an unworthy, timid ignorance obstructing our progress”, has grown considerably over the years and earned wide recognition in the media world. Working with a digital agency, it chose a decoupled architecture to build a Drupal backend for its native iOS and Android Espresso applications.

Screengrab of The Economist Espresso application showing the logo in red and black strips on top left corner


Drupal turned out to be a strong solution for The Economist’s editorial team. It allowed iterative design and delivered a content creation and publishing workflow that met their requirements, incorporating features such as automatic issue creation, content approval and control over the look and feel of the interfaces.

The Drupal content creation interface was customised to avoid formatting errors and let authors focus on content. Editorial teams were given a dashboard for creating and publishing new issues quickly and efficiently, with visual indicators of approval status, countdown timers for each region and quick links to all of the articles.

Produce Market Guide

The website of Produce Market Guide (PMG), a resource for produce commodity information, fresh trends and data analysis, was rebuilt by OpenSense Labs. A JavaScript framework was interpolated into the Drupal frontend using a progressively decoupled approach, striking a balance between developer and content editor workflows. The rebuild relied chiefly on the progressively decoupled approach, React and the Elasticsearch Connector module, among other tools.

Homepage of Produce Market Guide with red strip on top and images showing people on right


Mapping and indexing content on the Elastic server required the Elasticsearch Connector and Search API modules. Once the Elastic backend architecture was in place, a faceted search application was developed with React and integrated into Drupal as a block and template page. The project structure for the search was designed and built in a sandbox with modern tooling such as Babel and Webpack and third-party libraries like Searchkit.
 
Logstash and Kibana, which sit on top of Elasticsearch, were also incorporated on the Elastic server to collect, parse, store and visualise the data. The sandbox application was then prepared for production, and all of its CSS and JavaScript was embedded in Drupal as a block, making it a progressively decoupled feature. Following Agile and Scrum principles helped deliver a user-friendly site for PMG with a search application that loads results rapidly.
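
To illustrate this integration pattern, the sketch below shows how a bundled React application can take over a placeholder element rendered by a Drupal block. The container id and the SearchApp component are assumptions for illustration, not the actual PMG code.

```js
// A minimal sketch of how a React search application can take over markup
// rendered by a Drupal block, making it a progressively decoupled feature.
// The container id and the <SearchApp /> component are assumptions.
import React from 'react';
import ReactDOM from 'react-dom';
import SearchApp from './SearchApp';

// Drupal renders the page and an empty placeholder element inside a block;
// the bundled JS (built with Webpack/Babel) mounts the React app into it.
const container = document.getElementById('pmg-search-app');

if (container) {
  ReactDOM.render(<SearchApp />, container);
}
```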

Princess Cruises

As one of the premier cruise lines in the world, Princess Cruises transformed its marketing landscape with decoupled Drupal and fundamentally changed the way guests access information while on board its ships.

Princess Cruises webpage showing a mobile phone over a bluish background


Guests on its ships rely on their smartphones to quickly access information, purchase items and notify management about anything. This led to the development of [email protected], a mobile application that lets guests plan their day, check the ship’s itinerary, browse restaurant menus and book shore excursions on the go, with the objective of transforming the Princess experience.

With ships sailing in different parts of the world, the digital experience had to be reliable, which called for a centralised way of administering content across multiple channels and touchpoints and a uniform experience on mobile and on the digital signage on board each ship. Decoupled Drupal was chosen to serve content across these touchpoints, so Princess Cruises could create content once and publish it everywhere, connecting every passenger to [email protected].

NASA

NASA, an independent agency of the executive branch of the United States federal government, adopted a decoupled setup for the overhaul of its site with the help of an agency. Drupal and Amazon Web Services (AWS) proved a strong match for meeting the content needs of both NASA and the public, with user-driven APIs, dynamic host provisioning, scalability and security.

Homepage of NASA showing images of planets in space


NASA’s website is deployed across numerous AWS availability zones and supports almost 500 content editors updating over 2,000 pieces of content every day. On average it receives nearly a million page views a day, and it has handled a peak load of approximately 40,000,000 page views in a single day, with more than 2,000,000 simultaneous users, during NASA’s 2017 total solar eclipse coverage.

Conclusion

Application development and delivery teams have already started exploring headless CMS tools along with numerous other sets of API-first microservices for building innovative solutions. These digital natives are adopting a do-it-yourself approach to digital experience architectures and dragging their organisations into the digital-first age.

Headless opens up interesting possibilities and challenges traditional ways of doing things. For many organisations it is no longer a question of whether to go headless, but of where headless fits in their setup. The continued growth of microservices architecture will give headless and decoupled approaches a further push.

Decoupled Drupal is an outstanding way to implement a headless architecture. It acts as a central hub, processing and curating content and data from other tools and services while sharing its own content and data via APIs. With the flexibility, scalability and content authoring capabilities of headless approaches, digital teams can build their architectures with plenty of room for creativity and innovation.

We are continually working to deliver great digital experiences with our suite of services.

Contact us at [email protected] to get the best out of decoupled Drupal and strengthen your digital presence with its capabilities.

Jun 10 2019
Jun 10

Hussain began working with PHP in 2001, and at that time, wouldn't touch any CMS or framework and preferred to write his own. He grew tired of issues with PHP and was about to switch to another language when he came across a volunteer project that needed Drupal's capabilities, so in 2010 he tried Drupal 6.

Jun 10 2019
Jun 10

Smart business decisions tend to be equated with cutting costs and saving money.
 
Over the past decade or so, “Better! Faster! Cheaper!” has become the rallying cry for business process reengineering and new initiatives within every sector. As a developer and former business owner, I get this. Efficiency is essential.
 
I tend to look favorably on the fastest, most streamlined solution, and as such, I have a lot of empathy for clients who are seeking fast fixes to ensure that their websites and all of their digital assets get into compliance with WCAG 2.1 for ADA accessibility.
 
But as a developer, my focus is, first and foremost, on solving problems, and I can state unequivocally that overlays can't be counted on to solve the challenges associated with digital accessibility.
 
A recent web accessibility legal case, Haynes vs. Hooters, set the precedent that organizations are required to remediate their actual code rather than rely on band-aid dashboards or overlay solutions that appear to offer a quick fix requiring seemingly little hands-on maintenance.

Here are 4 key challenges inherent to overlays:

  1. Visually impaired users don't typically use them. They tend to have their own tools with their own voice and reader settings with which they are comfortable and proficient based on their experience and ability level. Your goal is to make your code available to whatever tools and devices they prefer using, not force them to use your overlay tool that has pre-selected settings and options.
  2. Visually impaired users typically have their own stylesheets and ways to access the web. They don’t tend to use presets from widgets because widgets complicate the experience for them and the inability to disable or override them can be frustrating.
  3. Overlays simply don't work well with mobile devices unless a significant expenditure is invested in customizing them to the individual site.
  4. Overlays basically amount to a line of JavaScript code that pulls preloaded information onto your site. Even if the overlay has been customized as part of your package to make the site fully compliant, it's nearly impossible to keep it that way, because accessibility issues can re-emerge with any subsequent change to your site.

 

Sustainable Website Compliance Solutions

Promet Source serves as an accessibility partner, committed to real and lasting accessibility solutions.
 
We conduct both automated and manual testing holistically, from the perspective of the entire spectrum of disabled users and available Assistive Technology -- recognizing that there is no one-size-fits-all fix. This list of automated testing tools, recognized by the World Wide Web Consortium (W3C), demonstrates the wide range of testing options and the need for focused expertise. 
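
As one concrete example of what an automated check looks like, the sketch below runs axe-core, one of the W3C-listed tools, against the current page and logs any violations. It assumes axe-core has already been loaded on the page, and it is only a starting point; automated scans catch a fraction of issues and never replace manual testing with assistive technology.

```js
// A minimal sketch of automated accessibility checking with axe-core, one
// of the many W3C-listed testing tools. Automated checks like this
// complement, but never replace, manual testing with assistive technology.
// Assumes axe-core has already been loaded on the page (e.g. via a script tag).
axe.run(document)
  .then((results) => {
    results.violations.forEach((violation) => {
      console.log(`${violation.id}: ${violation.help}`);
      violation.nodes.forEach((node) => console.log('  at', node.target));
    });
  })
  .catch((error) => console.error('axe-core failed to run:', error));
```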
 
Our clients interact closely with both accessibility and developer certified experts throughout engagement and have the opportunity to ask questions and seek clarification every step along the way.
 
After guiding clients through the remediation process of actually fixing code to conform to WCAG 2.1 standards, we provide tools and resources to ensure that your development team has the training and knowledge to maintain your site's conformance.
 
We look forward to consulting with you about your specific accessibility objectives and working toward a solution that best addresses your needs.
 
