Aug 30 2019

In this 5-part series, take a deep dive into universal design concepts in the context of creating component-based systems for dynamic web content. Get a bird's-eye view of the inner workings of user experience (UX) architecture. Brand strategy, user psychology, objective methodology, and data-driven decisions come together, guided by timeless fundamental ideas, to construct today's digital journeys.

a dancer mid-split

The Art of Setting Type for Dynamic Systems

It is the typographer’s task to divide up, organize, and interpret this matter in such a way that the reader will have a good chance of finding what is of interest to him.

- Emil Ruder

In the 1950s, Swiss typographer and graphic designer Emil Ruder was involved in developing the International Typographic Style. Cleanliness, objectivity, and the use of “the grid” are key features of this approach which still plays a major role in design today.

In 2006, Oliver Reichenstein said,

Web Design is 95% Typography.

And while faster connection speeds have allowed us to provide more media-rich experiences, text-based information is still vital, and it is the only thing that search engines like Google, voice assistants like Alexa, and assistive technologies for differently-abled users can "see."

Today, the timeless challenge of how to visually present text-based information is as formidable as it’s ever been. Contemporary design decisions take into account the experiences of billions of potential users, located anywhere in the world, in front of screens of all sizes, with a wide range of backgrounds, situations, and abilities. 

Fortunately, standards such as those for Heuristic Evaluation, the W3C’s Web Content Accessibility Guidelines and user testing tools such as the System Usability Scale and the Single Ease Question have provided an objective structure to inform the creative works of designers today.

Fast, Scalable, Legible Type Styling

Guiding principles for font selection and implementation 

Limit the number of fonts 

In the name of consistency, efficiency, and optimized load times, consider combining two harmonious fonts that create a visual hierarchy; two is generally considered the ideal number of fonts. However, choose a font with enough members in its family to allow design options. A family with light and black weights, as seen in some of Google's best fonts, will provide a versatile UI toolkit.

Most brand standards contain definitions for print as well as web fonts. In the absence of such standards, a review of mission and vision statements can guide you to a font that reflects the brand identity.

Use a responsive type scale

A responsive modular type scale is an easy way to preserve visual hierarchy across the various screen widths of devices. A modular type scale is a series of type sizes that relate to each other because they increase by the same ratio.

Mediacurrent Rain typography example shows Raleway and Merriweather fonts

Example of how responsive typography looks across various breakpoints on Mediacurrent’s Rain demo site.
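For illustration, here is a minimal CSS sketch of a modular scale driven by a single ratio. The 1rem base size and 1.25 ratio are assumptions for the example, not values taken from the Rain theme or any particular design system:

:root {
  --font-size-base: 1rem;  /* roughly 16px in most browsers */
  --type-ratio: 1.25;      /* a "major third" scale */
}

p  { font-size: var(--font-size-base); }
h3 { font-size: calc(var(--font-size-base) * var(--type-ratio)); }
h2 { font-size: calc(var(--font-size-base) * var(--type-ratio) * var(--type-ratio)); }
h1 { font-size: calc(var(--font-size-base) * var(--type-ratio) * var(--type-ratio) * var(--type-ratio)); }

/* Bump the base size at a wider breakpoint; every step in the scale follows. */
@media (min-width: 48em) {
  :root { --font-size-base: 1.125rem; }
}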

 

Avoid All Caps for headings and paragraph text

Not only have studies shown that text formatted in all caps decreases reading speed by 10-20%, but this styling presents substantial barriers for users with dyslexia, who according to the latest estimates make up 15-20% of the population.

When headings are less than around 25 characters, all caps is still considered readable for everyone. All caps can also be a valid choice for short navigation titles and concise button copy.

Avoid Italics for headings and paragraph text

A first-time user may spend seconds on your website, not minutes, and as reading speed is dependent on legibility, styling is of paramount importance for all users. In terms of accessibility, italicized letters are nearly illegible to some people with reading issues.

Addressing accessibility concerns is a great investment that provides an opportunity to bring in objective best practices to increase usability, readability, and visual impact for all users, opening up your website to a larger audience.

Modern, Readable, Accessible Type Styling 

Guiding principles for line and paragraph spacing

Look to the gold standard for accessibility

Because Section 508 compliance is currently attuned to the A and AA Level WCAG 2.0 guidelines, the AAA Guidelines are sometimes left out of the conversation. The guidelines around line and paragraph spacing, however, are especially easy to incorporate. Because they improve the experience for all users, they are well worth considering. 

a diverse group of people laughing

Avoid justified text

People with certain cognitive disabilities have problems reading text that is both left and right justified. The uneven spacing between words in fully justified text can cause "rivers of white" space to run down the page making reading difficult and in some cases impossible. Text justification can also cause words to be spaced closely together so that it is difficult for them to locate word boundaries.

Implement a maximum width for containers of heading and paragraph text

For people with some reading or vision disabilities, long lines of text can become a significant barrier. They have trouble keeping their place and following the flow of text. Having a narrow block of text makes it easier for them to continue on to the next line in a block. Text blocks must be no wider than 80 characters. 

When determining how to limit the width of a responsive text block, it can be helpful to use a character counting tool. Or try advanced math! The Golden Ratio Typography Calculator generates suggestions for your entire type hierarchy based on the Golden Ratio: a pattern found in nature that is inherently pleasing to the human eye.
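A minimal CSS sketch of this guideline, assuming a wrapper class of your choosing (the class name here is illustrative):

/* The ch unit is based on the width of the "0" glyph, so 80ch approximates
   the 80-character limit recommended by WCAG 1.4.8. */
.text-content {
  max-width: 80ch;
  margin-left: auto;
  margin-right: auto;
}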

Provide ample line spacing for paragraph text

People with some cognitive disabilities find it difficult to track text where the lines are close together. Providing extra space between lines and paragraphs allows them to better track the next line and to recognize when they have reached the end of a paragraph. Provide line spacing of at least 1.5 (a space and a half) within text blocks, and paragraph spacing of at least 1.5 times the line spacing.
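In CSS terms, that guidance could look something like this sketch (the selector is illustrative):

.text-content p {
  line-height: 1.5;       /* at least a space and a half within blocks of text */
  margin-bottom: 2.25em;  /* paragraph spacing of at least 1.5 times the line height (1.5 x 1.5) */
}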

In the first post of this series, we discussed the advantages of generously-sized type comfortably line-spaced within a width-restricted column. Not only does this approach align with that of today’s best-in-class websites and provide a comfortable, welcoming reading experience, it is universal design that is accessible and all-inclusive.

Preparing to Reach a Global Audience

Scratching the surface of localization

Designing type for Right To Left (RTL) languages

Arabic is the 4th most popular language globally, and 60% of Arabic speakers prefer browsing internet content in Arabic. Aligning text to the right side is key, and so is upsizing the font to preserve readability. Factor in word length as well, since most Arabic words are shorter than their English equivalents. Read more about web design for RTL languages.

Dubai.ae screenshot

Example of RTL language typography on dubai.ae.
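As a rough CSS sketch, right-to-left support can start with the dir attribute and a modest size bump; the 112.5% value below is an assumption for illustration, not a standard:

html[dir="rtl"] body {
  direction: rtl;
  text-align: right;
  font-size: 112.5%;  /* upsize slightly to keep Arabic script readable */
}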

Accommodating translated text

Character counts expand or contract according to language. Expect a 15 - 30% expansion when translating English to Spanish or Portuguese, for example. When planning to translate English to Chinese or Japanese, however, expect variation. Read more about typography for translated text here.

Planning for character-based languages

Written Japanese, for example, consists of thousands of characters across four character sets: hiragana, katakana, kanji and the Latin alphabet. Japanese users perceive carelessly designed websites as reflective of brand quality. They describe products as “unnatural”, “foreign”, and “suspicious.” The work to get from unnatural to perfect is not hard. Read more about standards for presenting Japanese text on the screen.

Applying and Iterating Type Systems

Continuous improvement based on user feedback

After carefully selecting type styles representative of the brand experience, applying them with accessibility in mind, defining parameters within responsive components, and perhaps including specifications for translated type, what comes next?

We take these specs and build a living style guide to catalog the site’s elements and components. Having the building blocks all in one place not only ensures consistency but allows developers to rapidly adjust or efficiently leverage the existing pieces to create more complex components.

A living style guide also makes it easier to add new patterns to meet your needs and the needs of your users as they arise once "the rubber hits the road," when content editors begin using the site and feedback is collected from users, ideally through methodical tests.

Technology has evolved rapidly since pioneers like Emil Ruder developed the Swiss style, and is always offering new challenges for designers. But happily, this evolution has allowed us to collect data to objectively guide our practice and to create, share, and continuously improve standards for accessibility and usability, allowing us to meet the challenge of designing with confidence for the global audience. What a great time to be a designer!

In the next installment, we’ll cover pattern libraries/ living style guides in greater detail. We’ll discuss how style libraries ensure a consistent experience and make ongoing enhancements much easier. Learn about the steps it takes to build an emotion-rich, brand-true user journey within the restricted structure of a website built to last for years displaying dynamic, user-entered content.

Aug 30 2019

 

The quantum leap in technology and the ubiquitous demand for personalized content have made it pivotal for companies to embrace a web content management system, for all the right reasons: engaging customers, generating leads, and growing overall revenue across various platforms.

However, it can be daunting and tricky for enterprises to choose the right one for their requirements, given the varied features and potential that these CMSs offer.

One of the most cited reports, Gartner's Web Content Management Magic Quadrant, names Acquia Drupal, Adobe Experience Manager, and Sitecore as the three most prominent WCM platforms.

A square divided into 4 with text and dots inside

These three were at the forefront of Gartner's report based on two factors: "ability to execute" and "completeness of vision". They undoubtedly outperformed on both parameters, and hence it goes without saying that all three are proficient in fulfilling enterprises' WCM needs.

However, as one size does not fit all and there can be multiple parameters for comparing them, it's better to group them broadly by stakeholder perspective: the business decision-makers, the content and marketing teams, and the IT teams. Each of these teams plays a vital role in the process.

Source: Gartner

The content, marketing, and IT teams matter because they'll be the ones directly working with the CMS they choose. The business decision-makers (CIO, CTO, and the freshly coined Chief Data Officer, or CDO) are the ones who will be held accountable for their choices and for showing adequate ROI.

This blog will deal with the comparison of AEM and Drupal from the perspective of business decision-makers. Here we’ll primarily focus on factors like cost of implementation, scalability, security, and maintenance to see how Drupal and AEM fare against each other.

Pick What’s Best For Your Organization

Rectangular box with logo and text of Drupal and AEM

Identifying your company's particular requirements should be the first task on your to-do list. This will help you, as a decision-maker, draw the right conclusion.

Cost of Implementation

"A high price may be a part of its charisma in selling difficult art”, quoted by Robert Genn, a renowned Canadian artist. 

You should get the desired output for the price you pay for a CMS. 

Setting up any large technology solution always draws attention to the cost of implementation. Though the expenditure is justified, since the CMS forms the framework of the enterprise technology architecture and paves the way for digital transformation, it is always under analysis.

"Being expensive doesn't make a CMS equipped or competent enough for the enterprise"

Furthermore, migrating to a new CMS is a cumbersome task for any enterprise IT team. So this crucial decision of investing in any particular CMS is made after several rounds of deliberation, and once a platform is deployed, organizations become extremely wary of further change.

The objective behind all such discussions is to get it right the very first time, and the investment and ROI of any particular enterprise CMS is one of the biggest factors to consider.

Licensing Fees

The first and foremost difference between AEM and Drupal is their nature of availability: AEM is proprietary while Drupal is open-source. AEM's licensing fee starts at around 40,000 USD, with additional cost implications based on an organization's size and expected usage. And this is not a one-time investment; it is a recurring yearly expenditure.

Contrarily, Drupal is an open-source platform and hence has no licensing fee at all.

Build vs. Buy

The way current market trends are soaring does not leave enterprises time for lengthy CMS deployments. They expect a readily available solution with a quick installation process and some training for the content, marketing, and larger digital experience teams to run full-fledged campaigns.

Because Drupal is an open-source, developer-driven solution, the general perception is that it requires complete expert teams to deploy. There is a definite apprehension that enterprises have to build from scratch to set up a Drupal CMS, which, without a doubt, consumes an ample amount of time, resources, and cost.

AEM, by contrast, is perceived as a ready-to-roll product requiring minimal expert setup.

“In contrast to the assumption, Drupal does have over 20,000 modules and distributions that provide on-the-go functionalities and features”

However, it's important to understand that the build vs. buy debate is not as straightforward as it appears. Because:

  • In contrast to that assumption, Drupal has over 20,000 modules and distributions that provide on-the-go functionalities and features like multisite, multilingual, dynamic content, personalization, and more. With distributions like Acquia Lightning, Drupal caters to highly targeted, easy-to-implement enterprise-level content management solutions, unlike AEM.
  • The other noteworthy point is that AEM does need specialized teams for designing, hosting, and maintaining an efficient digital platform; it demands almost the same degree of build effort as Drupal. But even here, the total cost of implementation for Drupal averages around 250,000 USD to 350,000 USD, and around 500,000 USD if you include hosting. A typical AEM implementation budget, on the other hand, starts above 500,000 USD and can go as high as 1 million USD.

Support & Maintenance

Both Drupal and AEM demand substantial maintenance post-deployment.

In regards to maintenance, Drupal will safeguard your content, but to lay the foundation for a modern architecture it won't hesitate to break some components to ensure that the platform stays up-to-date and relevant. To achieve this, it eliminates the unnecessary bloat associated with rote adherence to backward compatibility.

 

Licensing
  • Drupal: No licensing fees.
  • AEM: Licensing fee applicable.

Build or Buy
  • Drupal: Considerable build; costs approx. 250,000-350,000 USD.
  • AEM: Needs specialized teams for build; costs approx. 500,000 USD to 1 million USD.

Support & Maintenance
  • Drupal: Support costs can be tailored to fit different enterprises.
  • AEM: Dedicated support teams by Adobe.

Security

AEM is certainly a secure platform with predefined functionalities to ensure security and administration. User identification and access can be easily managed. There are tried-and-tested techniques by which the risk of cross-site request forgery can be alleviated, and third-party data storage, role-based authentication, and access to data objects can be successfully secured.

A bar graph with 11 bars of different height in peach color on white background

Source: Sucuri

As per the stats, Drupal performs substantially better than other leading CMS platforms in terms of protecting websites from any external attacks.

Sucuri, a security platform for websites, compiled its Hacked Website report by analyzing more than 34,000 infected websites. Among the statistics it shared, one of the factors for comparison was affected open-source CMS applications.

The infections crippled other CMSs largely due to improper deployment, configuration, and poor maintenance.

With Drupal being an open-source platform, it is generally presumed less secure in comparison to a proprietary platform like AEM. However, that’s not the case! 

Drupal has security and governance features comparable to those embedded in AEM. It has dedicated modules for identification, authentication, and access-based controls.

"Drupal's security standards are informed by the Open Web Application Security Project (OWASP), making it secure by design"

Database encryption at various levels ensures data security so that no unauthenticated user can breach it. There is also a built-in security reporting feature that lets teams highlight and raise security concerns within the community.

It is worth noting that AEM's security initiative is led solely by one team, with Adobe extending support whenever necessary.

But Drupal has a different story to tell! Its security team not only works to safeguard the platform but also ensures that there are multiple touchpoints for resolving issues.

Security documentation helps developers write secure code and maintain modules for added security; all of this creates a larger system that continuously safeguards Drupal. Its security standards are informed by the Open Web Application Security Project (OWASP), making it secure by design.

Scalability and Future Proofing

A CMS lays the foundation for providing a unique digital experience to users, so it ought to be scalable and customizable. It should be capable of integrating seamlessly with other technologies and simultaneously supporting new solutions, so that it's ready to serve the organization's growing needs.

So how are Drupal and AEM different in this regard?

The Adobe ecosystem of products offers all-powerful solutions to amplify the enterprise digital experience. However, its features and full potential can only be realized when organizations are using a combination of several different Adobe solutions. While they continuously come up with new capabilities, the chance to leverage these is often dependent on having up-to-date versions of other Adobe products. AEM is a part of the Adobe Marketing Cloud and makes a pertinent case for buying all the other products to create an efficient digital platform.

For instance, Adobe's Smart Layout was a new AI feature rolled out in 2018. However, to make use of this feature, enterprises need the most recent versions of AEM Sites, AEM Assets, Target, Analytics, and Audience Manager. And that adds several thousand dollars to the overall budget.

While large enterprises might think of implementing it, the vendor lock-in associated with Adobe does not bode well for scalability.

That's because most enterprises today opt for solutions following a best-of-breed, best-of-need philosophy. The complete enterprise technology stack gets built in an agile manner, where every new piece appended is the best possible solution for the enterprise given current requirements and constraints. But if you decide to choose products from the Adobe ecosystem, the need to stick to Adobe becomes crucial, and if you want to leverage any of its products to their full capacity, you may lose the flexibility of opting for other solutions that might be more suitable to your needs.

In a nutshell, once you have decided to go with Adobe, or even with just AEM, getting out of it becomes difficult, even as you keep adding new moving parts to your digital experience stack.

“Drupal empowers enterprises with the flexibility of choosing any technology solution that works best as per their respective requirements and integrates with them hassle-free”

Four circles with text written over them

However, with Drupal, scalability, expansion, and flexibility have never been a cause of concern. Since Drupal is co-created by a whole community of developers, its feature of integrating seamlessly with various other technology solutions is inculcated in its DNA.

  • The Drupal CMS integrates effortlessly with multiple enterprise systems, even Adobe's, and does not rely on the existence of other Drupal-based solutions to work effectively.
  • Decoupled Drupal applications have proven Drupal's strength as a content manager while letting other technologies build on top of it to deliver augmented user-facing functionality.
  • The open-source community also makes sure that new modules and functionalities built on any project are contributed back and made readily available for others to use and build upon.

In essence, Drupal empowers enterprises with the flexibility of choosing any technology solution that works best as per their respective requirements and integrates with them hassle-free. 

The Drupal community also keeps the core up-to-date, so the platform is always up to speed to work with other enterprise technologies.

The question that arises here is whether you want to switch completely to Adobe solutions or retain the flexibility to choose the best solutions for your company.

For uninterrupted flexibility and extensibility, it’s suggested to avoid vendor lock-in and choose Drupal.

Have a look at this video to get a better understanding of different web content management systems and their capabilities:

[embedded content]

 

Drupal or AEM

Here is a compiled list of factors, attribute by attribute, comparing Drupal and AEM to help you reach a final decision:

Major Differentiation
  • Drupal: A web content management system built by the largest community of developers. Agility, cost, and open-source availability are its USPs.
  • AEM: A web content management system with built-in personalization and analytics features. Proprietary; licenses are required.

Flexibility
  • Drupal: Being an open-source platform, it does not require a vendor roadmap to implement change. It is innovation-friendly!
  • AEM: Since it is proprietary and requires a license, changes have to wait for the vendor to roll out its roadmap.

Development Speed
  • Drupal: Easy prototyping leads to faster implementations.
  • AEM: Requires considerable time to execute the process.

Responsiveness
  • Drupal: Highly responsive! It helps in building interactive websites for an enhanced digital experience.
  • AEM: User experience and responsive interface building are available.

Integration Capabilities
  • Drupal: REST APIs in core make it easy to integrate.
  • AEM: Integratable; relies on partners for integration.

Enterprise Fit
  • Drupal: Meets all the criteria of enterprises.
  • AEM: Meets all the criteria of enterprises.

Security
  • Drupal: Dedicated teams ensure the security of the platform, with a more transparent and open process for communicating issues within the community.
  • AEM: Security is directed by the proprietary team.

Scalability
  • Drupal: Highly scalable.
  • AEM: Scalable.

Maintenance
  • Drupal: Sign up with any Drupal shop; a vendor-agnostic approach is possible.
  • AEM: Proprietary support.

Cost
  • Drupal: Source code is free, but organizations pay for added functionality.
  • AEM: License costs are high.

Final Words

The final choice between Drupal and AEM will be made by the decision-makers based on an array of questions around the cost of building, hosting, and maintenance. This also involves weighing the associated ROI of both Drupal and AEM to ensure a smooth workflow.

The trade-off between making Adobe products work together well and the freedom to choose the best technology solutions as per your requirements needs to be carefully evaluated.

And even though both Drupal and AEM ensure top-notch security, your enterprise's internal security and compliance requirements will also need to be taken into account before choosing one over the other.

Aug 30 2019

For over a decade* the Drupal community has gathered in volunteer-led "camps", based on the BarCamp model, pursuing the initial goals of those first camps:

  • Introduce web developers to the Drupal platform.
  • Share skills among established Drupal developers.
  • Give back to the Drupal project (e.g. fix bugs, documentation, lesson materials).

The scope of Drupal Camps has expanded as our community has grown and matured, and camps now serve everyone in the community, not just developers. They serve as an on-ramp for new developers, a networking opportunity for clients and agencies, a marketing opportunity for service providers, and much more.

The Drupal Event Organizers have documentation of over 50 volunteer-led events with over 10,000 attendees worldwide in the past year, and we know there are almost double this number that we don’t have good data for. These events have evolved organically, with little central organization, and are estimated to comprise more than one million dollars in sponsorship and ticket sales worldwide, yearly.

The Event Organizers have organized in various forms over that time, from assorted documentation efforts to our current monthly meeting format. At Kaleem's instigation, the group started toward a more formal organization in late 2018, and that process is now almost a reality.

We’ve established a formation board composed of organizers from across the globe and have been collaborating on a charter for the Event Organizers Working Group over the past few months. We’re looking forward to presenting it soon and beginning the next chapter of the Drupal Event Organizers.

Thanks to the following folks who have contributed to the effort thus far.

Our Formation Board members:

And our trusted advisers:

* Possibly starting at Drupal Camp Toronto, 2006? If you know of an earlier camp, please let us know in the comments.

Aug 30 2019

The first step is to install Drupal Console. You may have already watched the last episode that covers this; if not, you can install it with Composer using:

composer require drupal/console

This will install Drupal console in your project in the ./vendor/bin/ directory. This means you can run Drupal console using:

./vendor/bin/drupal [command]

Note, in my examples I am using Lando for my local development environment. I have it set up to allow me to run Drupal console commands on my site by running lando drupal. You can decide to install Drupal Console globally and add it to your path. Depending on the development environment your setup might be slightly different.

You can get a list of all the Drupal console commands by running Drupal Console command with no command argument:

drupal

To generate a boilerplate module with Drupal console run:

drupal generate:module

Fill out the questions asked by Drupal console. Make sure to enter a module name and description. In this example, I added a package name, didn’t include a composer.json file, and didn’t include a themable template:

Drupal Console Generate Module

At the end of the questions Drupal console will ask you if it should proceed with creating the module for you. Hit enter to default this to yes.

Drupal Console Generate Module

The next step is to create our custom entity. We want our entity to be a content entity, not a config entity. This can be accomplished with the following Drupal console command:

drupal generate:entity:content

Make sure to select the module you just created. In this case, I will call the entity Statistic and accept everything else at its defaults.

Drupal Console Generate Entity

When you get through all the questions, you will see all the files that were created:

Drupal Console Generate Entity

Now before you go turn this module on, we are going to take a quick look at the code. At the time of this guide, there is a bug that will cause you some headaches if you don’t fix it prior to enabling the module!

Take a look through all the files generated to create the entity. Don’t worry if you are a bit confused at this point, there are a lot of files and a lot of things going on here!

Open up the Statistic.php file (located in the src/Entity folder). If you called your entity something else, it will be the name of the Entity class.

At the top of the file you might notice that the line of code importing EditorialContentEntityBase is missing a semicolon. This is a bug in the code generation tool, make sure to add a semicolon there!

Fix Content Entity Error
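That class lives in Drupal core's Drupal\Core\Entity namespace, so the corrected import should look roughly like this:

// Generated line - note the missing semicolon at the end:
//   use Drupal\Core\Entity\EditorialContentEntityBase
// Corrected line:
use Drupal\Core\Entity\EditorialContentEntityBase;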

This file is doing a lot, but at its core it creates the structure for your entity. This structure will be created in the database when you install the module. Go ahead and install the module to see how it works. You will notice that once you install the module you are given some menu options under the Admin > Structure menu.

The Statistics settings page is a placeholder where you could eventually add additional configuration settings. This page is controlled by the src/Entity/StatisticSettingsForm.php file.

You will notice that you can manage fields and manage the field display just like you can with a content type. At this point, we could be done and just use the UI to configure the entity from here. However, in this case we want to add one more field to our entity before we call it finished.

Uninstall the module and open the Statistic.php file back up. Go down to the baseFieldDefinitions method and add the following code after the $fields['name'] variable is set.

$fields['value'] = BaseFieldDefinition::create('string')
  ->setLabel(t('Value'))
  ->setDescription(t('The value of the Statistics entity.'))
  ->setRevisionable(TRUE)
  ->setSettings([
    'max_length' => 50,
    'text_processing' => 0,
  ])
  ->setDefaultValue('')
  ->setDisplayOptions('view', [
    'label' => 'above',
    'type' => 'string',
    'weight' => -4,
  ])
  ->setDisplayOptions('form', [
    'type' => 'string_textfield',
    'weight' => -4,
  ])
  ->setDisplayConfigurable('form', TRUE)
  ->setDisplayConfigurable('view', TRUE)
  ->setRequired(TRUE);

What this will do is add a new field called value. In this case we are making it a string field and the form will be displayed as a textfield. We could have made this an integer field or a decimal field, but for simplicity we will just make it a string.

If you re-install the module, you will now notice your value field is available. Go ahead and create some Statistics now!
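If you prefer to create entities in code rather than through the UI, a rough sketch looks like this. It assumes the generated entity type ID is statistic and that the base fields are name plus the value field added above; run it from a hook, a form submit handler, or drush php:eval rather than as a standalone file:

// Create and save one Statistic entity programmatically.
$statistic = \Drupal::entityTypeManager()
  ->getStorage('statistic')
  ->create([
    'name' => 'Monthly visitors',
    'value' => '12500',
  ]);
$statistic->save();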

There is a lot more you can do with it. You could spend an hour looking through all the code that was generated as there is a lot to learn! This is a good stopping point for now, as we were able to quickly create our own custom entity using Drupal console!

Aug 29 2019

Masonry is a very popular JavaScript library that stacks items in columns and rows, like a masonry brick wall.

The items reorder themselves according to their size as the viewport size changes with a nice animation effect. For some examples, take a look at the official Masonry site.

It is possible to create a view in Drupal with this style of layout. Keep reading to learn how!

Example of the Masonry layout

Step #1. Install the required modules

  • Open the Terminal application of your PC and type:

composer require drupal/masonry

Type the composer installation command

composer require drupal/masonry_views

Type this command

This will download the required modules: Masonry API and Masonry Views. The API provides integration between the Masonry JavaScript library and Drupal. Masonry Views defines the style of the view (its layout).

There are two more libraries to install for the proper functioning of the modules:

  • Create the /libraries folder in the root of your site
  • Inside this folder create two more folders called /masonry and /imagesloaded
  • Right-click the download link on the Masonry library page and select Save As
  • Place this file (masonry.pkgd.min.js) inside the /masonry folder
  • Do the same with the imagesloaded.pkgd.min.js file, placing it inside the /imagesloaded folder.

Create the folders and move the files

  • Click Extend
  • Select Masonry API and Masonry Views
  • Click Install.

Click Install

The system will also prompt you to enable the Libraries module.

  • Click Continue.

Step #2. Create the View

For this tutorial, I’ve generated some dummy content with the Devel Generate module.

Some dummy content

Some more dummy content

Use this module only during development. I generated 50 articles with images. These images have different width and height values.

  • Click Structure > Views > Add View
  • Select Create a block
  • Show Content of type Article
  • Display format: Masonry of fields
  • Items per block: 12
  • Click Save and edit

Click Save and edit

  • In the Fields section click Add

Click Add

  • In the Content category, select the Body and Image fields
  • Click Add and configure fields

Click Add and configure fields

  • Change the Formatter and set a character limit for the text
  • Click Apply and continue

Click Apply and continue

  • Leave the default image style
  • Link the image to the content
  • Click Apply.

Click Apply

  • In the Fields section click the dropdown and select Rearrange
  • Put the image between title and body
  • Click Apply.

Click Apply

  • Save the view

Step #3. Place the Block

  • Click Structure > Block Layout
  • Scroll down to the Content section
  • Click Place block.

Click Place block

  • Search for the block you created in the last step
  • Click Place block.

Click Place block

  • Hide the title
  • Click Save block.

Click Save block

  • Rearrange the block above the Main page content
  • Scroll down and click Save blocks.

Scroll down and click Save blocks

If you used the Devel module to generate content like I did, you will notice that the body text is not trimmed. Apart from that, you will see the content inside the block neatly organized in a mason grid.

You will see the content inside the block neatly organized in a mason grid

Step #4. Theming the Grid

There are two classes you can target in your stylesheet to change the appearance of the container and the items inside it:

  • .masonry-layout
  • .masonry-item

The two CSS classes to target
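As a starting point, here is a small sketch of what that styling might look like (all values are illustrative, not defaults shipped by the module):

/* Container for the whole grid. */
.masonry-layout {
  margin: 0 -10px;  /* offset the per-item gutters below */
}

/* Each view row rendered as a brick. */
.masonry-item {
  padding: 10px;
  box-sizing: border-box;
}

.masonry-item img {
  display: block;
  width: 100%;   /* let images fill their column */
  height: auto;
}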

More on Drupal theming at OSTraining here. I hope you liked this tutorial. Thanks for reading!


About the author

Jorge has lived in Ecuador and Germany. Now he is back in his homeland, Colombia. He spends his time translating from English and German to Spanish. He enjoys playing with Drupal and other Open Source Content Management Systems and technologies.
Aug 29 2019

In my opinion, there’s nothing in the world quite like learning a new skill. It’s a huge rush figuring out how to do something you didn’t know how to do before. You start with the basics, and then once you’ve got the gist of it, you begin to improve. Over time you’ll get better at it, and you’ll get faster at it. Aside from just improving your abilities, your confidence level might change too. This process can go a few different ways. One is imposter syndrome, where you feel like a fraud that doesn’t know what they’re doing. The other is overconfidence, where you think that your skills are better than they are. Be it due to imposter syndrome, overconfidence, or just plain ol’ inexperience; you’re bound to make some mistakes. And we don’t wanna make mistakes, do we?

Chef at work

Like everything else in the world, managing Drupal modules takes time and effort to learn. There's so much going on under-the-hood that it's all too easy to make a mistake accidentally. In this article, I'll point out a couple of common mistakes that new Drupal users make, and how you can avoid them. This is all going down while I do my best to make everything fit my metaphor that compares Drupal to a professional kitchen. Let's get right into it!

Too Many Modules

Have you ever watched the Too Many Cooks video by Adult Swim? No? Well, here it is. Watch the whole thing; it’s worth it. Trust me. Okay, done with that? Cool. Where were we… Oh yeah, too many modules!

[embedded content]

Having too many cooks (or modules) working in one kitchen (your Drupal site) is a disaster waiting to happen for a whole schwack of reasons. Each module you have installed needs to be downloaded and loaded on the user’s device. More modules mean longer load times. Did you know that 53% of mobile users will abandon a page if it takes over 3 seconds to load? Yeah, so I’d say your load times are pretty important. By keeping your number of modules to a minimum, you’ll reduce your load times, sometimes drastically.

Another problem that can be caused by having too many modules is compatibility. For example, let’s say that you installed a contact form module that enables a user to send you an email along with an attachment. For some strange reason though, the blog comments module you’ve installed is broken now! Something somewhere in the new module is causing compatibility issues. Going back to our kitchen metaphor, it’s like the head chef is cooking a great Italian meal, but the sous chef is preparing Chinese food. That’s just not gonna work out for anyone. 

Outdated or Unsupported Modules

If your site has been around for a few years, first of all, great job! Secondly, you should check to see if your modules are all up to date. Not only that, but you should be checking to see if they’re even still supported. Why? Because as time goes on, the nature of the web changes. The nature of web tools, applications, systems, and services all change with it. Sometimes those changes mean that there are additions, alterations, or subtractions to a piece of code that your website relies on to run. When that happens, it could break the functionality of one of your modules. Since this is all happening behind the scenes, deep in code-world, it can be tough to spot the source of your problem. 

Let’s think about it as a restaurant again. In this scenario, our restaurant and kitchen are Drupal, the cooks are our modules, and the food we’re serving is our site content. Our modules work together using features built into Drupal to put all the code, copy, images, and other assets together into one big tasty meal. I mean website.

Alrighty, so, you’ve closed your restaurant this weekend for some remodelling, but forgot to tell some cooks. Fortunately, most of the kitchen floorplan is still familiar. However, it looks like some of the tongs, several knives, and a couple of pans got moved around. Oh, and the salad station has been completely removed. Gone. Vanished. Poof! This is sort of what can happen to a website and its modules when there’s a Drupal update.

Chef at work

The unprepared staff are like outdated modules. Since they don't know where everything in the kitchen is anymore, their job performance has dropped drastically. Until they get a new tour of the remodelled kitchen, they won't perform to their fullest, and might not be able to perform at all. In a Drupal update, there might only be tiny changes that barely impact performance, but because your module wasn't updated to run on this new version, it's stopped working or slowed down. It's critical to always ensure your modules are running on the latest version for whichever version of Drupal you're on.

Your salad tender is another story. That guy's an unsupported module. For some strange reason, you still offer salads on the menu, but you've removed the station for it, and your salad tender wants to get paid under the table from now on. He's still coming in to work every day even though he's got no station, recipes, or even acknowledgement from management. Occasionally though, he still gets salads out to the guests. Sometimes they look great, and other times, your guests might get food poisoning. For the health and safety of your guests, you should look at getting someone who actually has their Food Safe certification and the proper tools to make the salads. In Drupal terms, an unsupported module is a module that's been abandoned by the team, retired for security reasons, or has been rendered obsolete in some other way. While an unsupported module might function properly and give your site the feature you want it to have, it can also present significant security risks. Unsupported modules don't get regular updates, so hackers and other black-hats have all the time in the world to leisurely look for exploits in the module's code. Since their code isn't being updated along with Drupal's, their functionality and performance suffer, too. When you use unsupported modules, you're opening your site up to any number of security risks and performance issues. Just don't do it!

What Can I Do?

Well, first of all, be sure only to install modules that are still supported and being actively developed. They should be vetted by the community so that you know what you’re installing is high-quality code that’s both functional and secure. Make sure to read the documentation that comes with your modules so that you can make sure you don’t have any compatibility issues with other modules. Keep the number of installed modules to a minimum to improve load times and performance. Look for modules that have overlapping or similar features and decide, “Do I really need both of these?” To put it plainly: carefully research each module, make sure it’s what you need, make sure it works with your site, and make sure the developers and community are still supporting it.

A great way to get a detailed look at your site is to perform a site audit. With a site audit, you'll see what's working, what's not working, where your problems are, and tons of other information. It's kinda like when the health inspector walks through a kitchen and points out the things that need to be improved on. At Cheeky Monkey Media, when our expert team of Drupal developers perform a site audit, they collect all the information that you need to know about your site into one simple, concise, and easy-to-read document. Get in touch with one of the monkeys today to get started. We know how hard it can be to try and manage a site on your own, so reach out! We'd be happy to help.

Cooks working
Aug 29 2019

Drupal 8 has a wonderful migration system that generally boils a migration into a particular entity type down to a single YAML file. This file specifies the data source, the output entity type, and how to process input fields and where in the entity they get stored.

However, there can be a number of complications. Two, in particular, are bad or unexpected input data values, and exactly what form the input data, or even the partially processed data, is in (e.g., is it a scalar, an array, a nested array, etc.). Because the migration system doesn't provide a way to look at what's going on inside the process pipeline, it can be difficult and frustrating to debug these kinds of issues.

One way to quickly see what’s going on without having to get into xdebug and grope around in the code for the various plugins provided by the migrate, drupal_migrate, migrate_plus, and migrate_tools modules, is to create a debug plugin and place it wherever you need in the process pipeline.

Usually, the custom migration module code will be structured like this:

migration/
├── config
│   └── install
│       ...
│       ├── migrate_plus.migration.node_article.yml
│       ...
├── migration.info.yml
├── README.txt
└── src
    └── Plugin
        └── migrate
            ├── destination
            ├── process
            │   ...
            │   ├── Debug.php
            │   ...
            └── source

We’re going to look at the node_article migration pass for an example. The YAML file could look something like this:

id: node_article
label: Node - Article
migration_group: migration
migration_tags:
  - Drupal 6
source:
  plugin: node
  node_type: blog
  constants:
    type: article
process:
  nid: tnid
  type: constants/type
  langcode:
    plugin: default_value
    source: language
    default_value: "und"
  title: title
  uid: node_uid
  status: status
  created: created
  changed: changed
  promote: promote
  sticky: sticky
  'body/format':
    -
      plugin: static_map
      source: format
      map:
        2: 'filtered_text'      # Filtered HTML "no links"
        3: 'html_text'          # HTML
        5: 'plain_text'         # PHP code
        6: 'filtered_text'      # Filtered HTML
        7: 'plain_text'         # Email Filter
    -
      plugin: default_value
      default_value: 'html_text'
  'body/value': body
  'body/summary': teaser
  revision_uid: revision_uid
  revision_log: log
  revision_timestamp: timestamp
destination:
  plugin: entity:node

But say that it turns out that the body/format is not getting translated correctly when the node_article migration pass is run. How can this be debugged to understand if the YAML structure is wrong, or the data values are wrong?

Let’s have a look at migration/src/Plugin/migrate/process/Debug.php:

<?php
namespace Drupal\migration\Plugin\migrate\process;

use Drupal\migrate\MigrateExecutableInterface;
use Drupal\migrate\ProcessPluginBase;
use Drupal\migrate\Row;

/**
 * Debug process pipeline
 *
 * @MigrateProcessPlugin(
 *   id = "debug"
 * )
 */
class Debug extends ProcessPluginBase {

  /**
   * {@inheritdoc}
   */
  public function transform($value, MigrateExecutableInterface $migrate_executable, Row $row, $destination_property) {
    echo "DEBUG: " . $this->configuration['message'] . "\n";
    print_r(['value' => $value]);
    if (!empty($this->configuration['row']) &&
        $this->configuration['row']) {
      print_r(['row' => $row]);
    }
    return $value;
  }

}

Debug.php provides a process plugin that can be placed “invisibly” into a process pipeline and gives details about what’s happening inside there. The transform method for all process plugins receives the $value being processed and the $row of input values, among other things. It should return a value that is the result of processing that the plugin does.

The ProcessPluginBase object also contains a configuration object, where named parameters that appear in the process pipeline YAML code are stored. For example, the static_map plugin can find the source value in $this->configuration['source'].

In this case, the debug plugin returns exactly the value it receives, so it doesn’t change anything. At base, it prints a message parameter and dumps the $value so it can show what actual data is being passed and what format it is in. It has been useful to also be able to look at the $row values. To see them, simply add the parameter row: 1.

For example, to see what the result of the body/format mapping is before passing it along to the default_value plugin:

  'body/format':
    -
      plugin: static_map
      source: format
      map:
        2: 'filtered_text'      # Filtered HTML "no links"
        3: 'html_text'          # HTML
        5: 'plain_text'         # PHP code
        6: 'filtered_text'      # Filtered HTML
        7: 'plain_text'         # Email Filter
    -
      plugin: debug
    -
      plugin: default_value
      default_value: 'html_text'

If it would also be useful to see what the input row values looked like:

  'body/format':
    -
      plugin: static_map
      source: format
      map:
        2: 'filtered_text'      # Filtered HTML "no links"
        3: 'html_text'          # HTML
        5: 'plain_text'         # PHP code
        6: 'filtered_text'      # Filtered HTML
        7: 'plain_text'         # Email Filter
    -
      plugin: debug
      row: 1
    -
      plugin: default_value
      default_value: 'html_text'

This produces output for each row processed, which can get quite large. If you know what rows are causing problems, the migration can be run with the --idlist="<source keys>" parameter. If not, simply redirect the output into a file and use an editor like vim to hunt through it for the cases that are problematic.
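For example, assuming the Drush commands provided by Migrate Tools and the node_article migration above (the IDs and file name here are placeholders, and option syntax can vary slightly between Drush versions):

# Re-run only source IDs 42 and 57, updating existing rows, and capture the debug output.
drush migrate:import node_article --idlist="42,57" --update > node_article_debug.txt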

This idea can be expanded: if there is some logic to the problem, this can be added so that output is only created for rows that create an issue. It’s possible to create more than one debug plugin (with different names, of course) if there are multiple special purpose needs.

Aug 29 2019

The first step is to install Drupal Console. You can do that with composer using:

composer require drupal/console

This will install Drupal console in your project in the ./vendor/bin/ directory. This means you can run Drupal console using:

./vendor/bin/drupal [command]

Note, in my examples I am using Lando for my local development environment. I have it set up to allow me to run Drupal console commands on my site by running lando drupal. You can decide to install Drupal Console globally and add it to your path. Depending on the development environment your setup might be slightly different.

You can get a list of all the Drupal console commands by running Drupal Console command with no command argument:

drupal

To generate a boilerplate module with Drupal console run:

drupal generate:module

Fill out the questions asked by Drupal console. Make sure to enter a module name and description. The rest of the options can be left at their defaults (in the example below I also added a Package name):

Drupal Console Generate Module

At the end of the questions Drupal console will ask you if it should proceed with creating the module for you. Hit enter to default this to yes.

Drupal Console Generate Module

You can now see that this has generated a boilerplate module for you. You can now look at this generated code and even install this module on your Drupal site.

We will now use Drupal console to generate an administrative form. This can be started by running the following command:

drupal generate:form:config

The first step is to enter your module name. This should autocomplete as you type. After that you need to add a Form Classname. In the example below, I used SettingsForm. The next four questions can be left at their defaults:

Drupal Console Generate Form Config

When you tell Drupal console you want to generate a form structure, it will provide you a list of form elements and walk you through a form builder process. In this example, I just created a textfield called My Textfield. Go ahead and generate some form fields.

Drupal Console Generate Form Config

When you are finished, just leave the New field type question blank and press the Enter key. It will then ask you some questions regarding the path and menu links.

Drupal Console Generate Form Config

After this it will generate the code for you.

Drupal Console Generate Form Config

Now all you need to do is enable your module and find the link in your admin menu, and you have an admin settings form!

Custom Drupal Admin Form

There is so much more you can do with this. As the module currently sits, it doesn't do much; however, it provides a solid base for you to start developing your own custom module. Spend some time looking through the generated code and feel free to make modifications to the custom module you built.
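For instance, once the form has been saved at least once, other code in your module can read the stored value back out of configuration. This is only a sketch: 'my_module.settings' and 'my_textfield' are placeholders for whatever config object name and field key Drupal Console generated for your form:

// Read the saved setting elsewhere in your module.
$value = \Drupal::config('my_module.settings')->get('my_textfield');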

Aug 29 2019

In a recent press release, TopDevelopers.co declared Agiledrop one of 2019’s best Drupal development firms in the world. With over 10 years of extensive Drupal expertise and several Acquia certified developers, we see this recognition as a true testament to the quality of our work, both with clients and within the Drupal community.

We’re proud to announce that Agiledrop has been chosen as one of the Best Drupal Development companies of 2019! 

For those of you who don't know us - Agiledrop is a web development company specializing in the Drupal CMS. We're based in Central Europe in Slovenia, with headquarters in the capital Ljubljana and offices in several other cities in Slovenia.

Since our beginnings in 2008, our team has grown to over 50 members and we've distinguished ourselves as one of the country's top development companies as well as prolific contributors in the Drupal community, reserving a prestigious position on TopDevelopers.co's list of top Drupal development companies.

As a proud member of the Drupal Association, we’re constantly giving back to the community through code contributions, as well as with frequently organized meetups and free Drupal courses where we train new generations of Drupal enthusiasts. Our specialty, however, lies in helping digital agencies scale up their businesses and take on more work.

Thanks to our 300+ successfully completed projects for a wide range of clients and our 40+ skillful developers, we’re naturally attuned to the needs of particular clients and are able to provide them with the right people for their project in just a few days. Agility is literally in our name! 

Our developers are well-versed in working with Drupal-related services such as Thunder, Drupal Console, Acquia Lightning, Drupal VM, Drush and Open Social. Along with Drupal development, we provide enterprise WordPress development and front-end solutions with modern JavaScript frameworks, e.g. Angular and React. 

We’ve worked with a varied set of clients which includes names such as Ogilvy, UNESCO, Wunderkraut (now Appnovation) and T-Systems - to name just a few. If you’d like to learn about other clients we’ve worked with and what their experience of working with Agiledrop was like, check out our portfolio of references and case studies

And if you're interested in more specific details about our work, such as our service and industry focus, our profile page on TopDevelopers.co is the place to go; their analysts were super friendly and helped us in building a top-notch profile on their website.

Of course, we were also very curious to learn how we got chosen among all the amazing and proficient Drupal development companies. When we approached the spokesperson at TopDevelopers.co and asked him about the reasons for choosing us, he was happy to provide us with quite an extensive list:

  • Agiledrop gives its clients only proven Drupal developers who have previously worked on enterprise-level projects, which eliminates the possibility of a bad hire. 
  • Our developers are ready to work on clients’ projects quickly, with individual developers being able to start in only a few days and entire teams being able to start in less than a week, which significantly speeds up the work.
  • We make it a priority to communicate clearly and frequently with clients, no matter the time difference, at least a few times a day, keeping all stakeholders up-to-date on the project’s progress.
  • Our developers adhere to strict coding and security standards, saving costly overruns in the later stages of the project.
  • A lot of Agiledrop’s clients are repeat clients who were so satisfied with our reliability and the quality of our services that they opted for a long-term partnership with us; see our portfolio for their immensely positive reviews.
  • Owing to their comprehensive skillsets, our developers have an excellent grasp of what technologies are most suited to a particular project and are thus able to always choose the most adequate solution to realize the client’s goals and needs.

Who is TopDevelopers.co?

TopDevelopers.co is a directory and review platform for IT service providers. They offer unbiased service to service seekers, by providing them with a listing of genuine and highly professional IT firms, which can help the service seekers in achieving their goals by providing high-quality technical services.  

The research team of TopDevelopers.co chooses the best firms by filtering a vast list of companies and introduces only the competitive names to the businesses, enterprises, and entrepreneurs to partner with. 

The company has a friendly team of researchers and a hassle-free communication system. They provide the listing service for various technologies and services, which makes their platform a one-stop destination to find the perfect technology partner for any kind of project.

Are you just now considering taking on that big project, but lack the development capacity? Contact us at Agiledrop - our A-team will be happy to help you out!

Aug 29 2019
Aug 29

In the previous posts we talked about the option to manage migrations as configuration entities and some of the benefits this brings. Today, we are going to learn about another useful feature provided by the Migrate Plus module: migration groups. We are going to see how they can be used to execute migrations together and share configuration among them. Let’s get started.

Example definition of a migration group.

Understanding migration groups

The Migrate Plus module defines a new configuration entity called migration group. When the module is enabled, each migration can define one group it belongs to. This serves two purposes:

  1. It is possible to execute operations per group. For example, you can import or rollback all migrations in the same group with one Drush command provided by the Migrate Tools module.
  2. It is possible to declare shared configuration for all the migrations within a group. For example, if they use the same source file, the value can be set in a single place: the migration group.

To demonstrate how to leverage migration groups, we will convert the CSV source example to use migrations defined as configuration and groups. You can get the full code example at https://github.com/dinarcon/ud_migrations. The module to enable is UD configuration group migration (CSV source) whose machine name is ud_migrations_config_group_csv_source. It comes with three migrations: udm_config_group_csv_source_paragraph, udm_config_group_csv_source_image, and udm_config_group_csv_source_node. Additionally, the demo module provides the udm_config_group_csv_source group.

Note: The Migrate Tools module provides a user interface for managing migrations defined as configuration. It is available under “Structure > Migrations” at /admin/structure/migrate. This user interface will be explained in a future entry. For today’s example, it is assumed that migrations are executed using the Drush commands provided by Migrate Tools. In the past we have used the Migrate Run module to execute migrations, but that module does not offer the ability to import or rollback migrations per group.

Creating a migration group

The migration groups are defined in YAML files using the following naming convention: migrate_plus.migration_group.[migration_group_id].yml. Because they are configuration entities, they need to be placed in the config/install directory of your module. Files placed in that directory following that pattern will be synced into Drupal’s active configuration when the module is installed for the first time (only). If you need to update the migration groups, you make the modifications to the files and then sync the configuration again. More details on this workflow can be found in this article. The following snippet shows an example migration group:

uuid: e88e28cc-94e4-4039-ae37-c1e3217fc0c4
id: udm_config_group_csv_source
label: 'UD Config Group (CSV source)'
description: 'A container for migrations about individuals and their favorite books. Learn more at https://understanddrupal.com/migrations.'
source_type: 'CSV resource'
shared_configuration: null

The uuid key is optional. If not set, the configuration management system will create one automatically and assign it to the migration group. Setting one simplifies the workflow for updating configuration entities as explained in this article. The id key is required. Its value is used to associate individual migrations to this particular group.

The label, description, and source_type keys are used to give details about the migration group. Their values appear in the user interface provided by Migrate Tools. label is required and serves as the name of the group. description is optional and provides more information about the group. source_type is optional and gives context about the type of source you are migrating from. For example, "Drupal 7", "WordPress", "CSV file", etc.

To associate a migration with a group, set the migration_group key in the migration definition file. For example:

uuid: 97179435-ca90-434b-abe0-57188a73a0bf
id: udm_config_group_csv_source_node
label: 'UD configuration host node migration for migration group example (CSV source)'
migration_group: udm_config_group_csv_source
source: ...
process: ...
destination: ...
migration_dependencies: ...

Note that if you omit the migration_group key, it will default to a null value meaning the migration is not associated with any group. You will still be able to execute the migration from the command line, but it will not appear in the user interface provided by Migrate Tools. If you want the migration to be available in the user interface without creating a new group, you can set the migration_group key to default. This group is automatically created by Migrate Plus and can be used as a generic container for migrations.
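
As a minimal sketch, a migration assigned to that automatically created group could look like the following; the id and label used here are hypothetical:

id: udm_example_without_custom_group
label: 'Example migration placed in the default group'
# The default group is created automatically by Migrate Plus, so the migration
# shows up in the user interface without defining a new group.
migration_group: default
source: ...
process: ...
destination: ...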

Organizing and executing migrations

Migration groups are used to organize migrations. Migration projects usually involve several types of elements to import. For example, book reports, events, subscriptions, user accounts, etc. Each of them might require multiple migrations to be completed. Let’s consider a book report migration. The "book report" content type has many entity reference fields: book cover (image), support documents (file), tags (taxonomy term), author (user), citations (paragraphs). In this case, you will have one primary node migration that depends on five migrations of multiple types. You can put all of them in the same group and execute them together.

It is very important not to confuse migration groups with migration dependencies. In the previous example, the primary book report node migration should still list all its dependencies in the migration_dependencies section of its definition file. Otherwise, there is no guarantee that the five migrations it depends on will be executed in advance. This could cause problems if the primary node migration is executed before the images, files, taxonomy terms, users, or paragraphs have been imported into the system.
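
As a sketch of that idea, the migration_dependencies section of the primary node migration could look like the following; the migration IDs are hypothetical and only stand in for the five dependencies described above:

migration_dependencies:
  required:
    # Hypothetical IDs for the five migrations the node migration depends on.
    - book_report_cover_image
    - book_report_support_document
    - book_report_tag
    - book_report_author
    - book_report_citation_paragraph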

It is possible to execute all migrations in a group by issuing a single Drush command with the --group flag. This is supported by the import and rollback commands exposed by Migrate Tools. For example, drush migrate:import --group='udm_config_group_csv_source'. Note that as of this writing, there is no way to run all migrations in a group in a single operation from the user interface. You could import the main migration and the system will make sure that if any explicit dependency is set, they will be run in advance. If the group contains more migrations than the ones listed as dependencies, those will not be imported. Moreover, migration dependencies are only executed automatically for import operations. Dependent migrations will not be rolled back automatically if the main migration is rolled back individually.

Note: This example assumes you are using Drush to execute the migrations. At the time of this writing, it is not possible to rollback a CSV migration from the user interface. See this issue in the Migrate Source CSV module for more context.

Sharing configuration with migration groups

Arguably, the major benefit of migration groups is the ability to share configuration among migrations. In the example, there are three migrations all reading from CSV files. Some configurations like the source plugin and header_offset can be shared. The following snippet shows an example of declaring shared configuration in the migration group for the CSV example:

uuid: e88e28cc-94e4-4039-ae37-c1e3217fc0c4
id: udm_config_group_csv_source
label: 'UD Config Group (CSV source)'
description: 'A container for migrations about individuals and their favorite books. Learn more at https://understanddrupal.com/migrations.'
source_type: 'CSV resource'
shared_configuration:
  dependencies:
    enforced:
      module:
        - ud_migrations_config_group_csv_source
  migration_tags:
    - UD Config Group (CSV Source)
    - UD Example
  source:
    plugin: csv
    # It is assumed that CSV files do not contain a headers row. This can be
    # overridden for migrations where that is not the case.
    header_offset: null

Any configuration that can be set in a regular migration definition file can be set under the shared_configuration key. When the migrate system loads the migration, it will read the migration group it belongs to, and pull any shared configuration that is defined. If both the migration and the group provide a value for the same key, the one defined in the migration definition file will override the one set in the migration group. If a key only exists in the group, it will be added to the migration when the definition file is loaded.

In the example, the dependencies, migration_tags, and source options are being set. They will apply to all migrations that belong to the udm_config_group_csv_source group. If you updated any of these values, the changes would propagate to every migration in the group. Remember that you would need to sync the migration group configuration for the update to take effect. You can do that with partial configuration imports as explained in this article.

Any configuration set in the group can be overridden in specific migrations. In the example, the header_offset is set to null which means the CSV files do not contain a header row. The node migration uses a CSV file that contains a header row so that configuration needs to be redeclared. The following snippet shows how to do it:

uuid: 97179435-ca90-434b-abe0-57188a73a0bf
id: udm_config_group_csv_source_node
label: 'UD configuration host node migration for migration group example (CSV source)'
# Any configuration defined in the migration group can be overridden here
# by re-declaring the configuration and assigning a value.
# dependencies inherited from migration group.
# migration_tags inherited from migration group.
migration_group: udm_config_group_csv_source
source:
  # plugin inherited from migration group.
  path: modules/custom/ud_migrations/ud_migrations_csv_source/sources/udm_people.csv
  ids: [unique_id]
  # This overrides the header_offset defined in the group. The referenced CSV
  # file includes headers in the first row. Thus, a value of 0 is used.
  header_offset: 0
process: ...
destination: ...
migration_dependencies: ...

Another example would be multiple migrations reading from a remote JSON file. Let’s say that instead of fetching a remote file, you want to read a local file. The only file you would have to update is the migration group. Change the data_fetcher_plugin key to file and the urls array to the path to the local file. You can try this with the ud_migrations_config_group_json_source module from the demo repository.
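
As a sketch, the relevant part of the group’s shared configuration could change along these lines; the local path shown here is hypothetical:

shared_configuration:
  source:
    # Read the data from a local file instead of fetching a remote resource.
    data_fetcher_plugin: file
    urls:
      - modules/custom/ud_migrations/ud_migrations_config_group_json_source/sources/udm_data.json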

What did you learn in today’s blog post? Did you know that migration groups can be used to share configuration among different migrations? Share your answers in the comments. Also, I would be grateful if you shared this blog post with others.

This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors. Contact Understand Drupal if your organization would like to support this documentation project, whether it is the migration series or other topics.

Aug 29 2019
Aug 29

In the previous post we talked about migration groups provided by the Migrate Plus module. Today, we are going to compare them to migration tags. Along the way, we are going to dive deeper into how they work and provide tips to avoid problems when working with them. Let’s get started.

Example migration group definition containing migration tags.

What is the difference between migration tags and migration groups?

In the article on declaring migration dependencies we briefly touched on the topic of tags. Here is a summary of migration tags:

  • They are a feature provided by the core Migrate API.
  • Multiple tags can be assigned to a single migration.
  • They are defined in the migration definition file alone and do not require creating a separate file.
  • Both Migrate Tools and Migrate Run provide a flag to execute operations by tag.
  • They do not allow you to share configuration among migrations tagged with the same value.

Here is a summary of migration groups:

  • You need to install the Migrate Plus module to take advantage of them.
  • Only one group can be assigned to any migration.
  • You need to create a separate file to declare a group. This affects the readability of migrations as their configuration will be spread over two files.
  • Only the Migrate Tools module provides a flag to execute operations by group.
  • They offer the possibility to share configuration among members of the same group.
  • Any shared configuration can be overridden in the individual migration definition files.

What do migration tags and groups have in common?

The ability to put together multiple migrations under a single name. This name can be used to import or rollback all the associated migrations in one operation. This is true for the migrate:import and migrate:rollback Drush commands provided by Migrate Tools. What you have to do is use the --group or --tag flag. The following snippet shows an example of importing and rolling back the migrations by group and tag:

$ drush migrate:import --tag='UD Config Group (JSON Source)'
$ drush migrate:rollback --tag='UD Config Group (JSON Source)'

$ drush migrate:import --group='udm_config_group_json_source'
$ drush migrate:rollback --group='udm_config_group_json_source'

Note: You might get errors indicating that the "--tag" or "--group" options do not accept a value. See this issue if you experience that problem.

Neither migration tags nor groups replace migration dependencies. If there are explicit migration dependencies among the members of a tag or group, the Migrate API will determine the order in which the migrations need to be executed. But if no dependencies are explicitly set, there is no guarantee the migrations will be executed in any particular order. Keep this in mind if you have separate migrations for entities that you later use to populate entity reference fields. Also note that migration dependencies are only executed automatically for import operations. Dependent migrations will not be rolled back automatically if the main migration is rolled back individually.

Can groups only be used for migrations defined as configuration entities?

Technically speaking, no. It is possible to use groups for migrations defined as code. That said, migration groups themselves can only be created as configuration entities. You would have to rebuild caches for changes in the migrations to take effect and sync configuration for changes in the groups. This is error prone and can lead to hard-to-debug issues.

Also, things might get confusing when executing migrations. The user interface provided by Migrate Tools works exclusively with migrations defined as configuration entities. The Drush commands provided by the same module work for both types of migrations: code and configuration. The default and null values for the migration_group key are handled a bit differently between the user interface and the Drush commands. Moreover, the ability to execute operations per group using Drush commands is provided only by the Migrate Tools module. The Migrate Run module lacks this functionality.

Managing migrations as code or configuration should be a decision made at the start of the project. If you want to use migration groups, or some of the other benefits provided by migrations defined as configuration, stick with them from the very beginning. It is possible to change at any point and the transition is straightforward, but it is best avoided. In any case, try not to mix both workflows in a single project.

Tip: It is recommended to read this article to learn more about the difference between managing migrations as code or configuration.

Setting migration tags inside migration groups

As seen in this article, it is possible to set migration tags as part of the shared configuration of a group. If you do this, it is not recommended to override the migration_tags key in individual migrations. The end result might not be what you expect. Consider the following snippets as an example:

# Migration group configuration entity definition.
# File: migrate_plus.migration_group.udm_config_group_json_source.yml
uuid: 78925705-a799-4749-99c9-a1725fb54def
id: udm_config_group_json_source
label: 'UD Config Group (JSON source)'
description: 'A container for migrations about individuals and their favorite books. Learn more at https://understanddrupal.com/migrations.'
source_type: 'JSON resource'
shared_configuration:
  migration_tags:
    - UD Config Group (JSON Source)
    - UD Example
# Migration configuration entity definition.
# File: migrate_plus.migration.udm_config_group_json_source_node.yml
uuid: def749e5-3ad7-480f-ba4d-9c7e17e3d789
id: udm_config_group_json_source_node
label: 'UD configuration host node migration for migration group example (JSON source)'
migration_tags:
  - UD Lorem Ipsum
migration_group: udm_config_group_json_source
source: ...
process: ...
destination: ...
migration_dependencies: ...

The group configuration declares two tags: UD Config Group (JSON Source) and UD Example. The migration configuration overrides the tags with a single value: UD Lorem Ipsum. What would you expect the final value of the migration_tags key to be? Is it a combination of the three tags? Is it only the one tag defined in the migration configuration?

The answer in this case is not very intuitive. The final migration will have two tags: UD Lorem Ipsum and UD Example. This has to do with how Migrate Plus merges the configuration from the group into the individual migrations. It uses the array_replace_recursive() PHP function, which performs the merge operation based on array keys. In this example, UD Config Group (JSON Source) and UD Lorem Ipsum share the same index in the migration_tags array, so the migration’s value replaces the group’s value at that position, while UD Example is carried over from the group.
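
To make the merge behavior concrete, this is the effective result for the snippets above, annotated with the array indexes that array_replace_recursive() operates on:

# Effective configuration after Migrate Plus merges the group into the migration.
migration_tags:
  # Index 0 from the migration replaces index 0 from the group.
  - UD Lorem Ipsum
  # Index 1 exists only in the group, so it is carried over unchanged.
  - UD Example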

The example uses the migration_tags key because it is the subject of this article, but the same applies to any configuration with a nested structure. Some configuration is far more critical to a migration than a tag or a group, and debugging a problem like this can be tricky. If the end result might be ambiguous, it is better to avoid the situation in the first place: in general, set nested structures in either the group or the migration definition file, but not both. Additionally, all the recommendations for writing migrations presented in this article also apply here.

What did you learn in today’s blog post? Did you know the difference between migration tags and groups? Share your answers in the comments. Also, I would be grateful if you shared this blog post with others.

This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors. Contact Understand Drupal if your organization would like to support this documentation project, whether it is the migration series or other topics.

Aug 29 2019
Aug 29

Make sure you have downloaded and installed the ECK module. Once installed, click on the Configure link or go to Structure > ECK Entity Types.

ECK Module Install

The ECK module allows you to define different entity types. These are similar to the way you define content types, but at a slightly higher level. When you create a content type, you are technically creating a bundle within the node entity type. In the case of ECK, you are defining your own Entity type that may or may not have bundles within it.

Click on the Add entity type button.

ECK Add Entity Type

In this example, we will be creating a new entity to keep track of important statistics. We will use the label Statistic and check all the available Base Fields. You can use whatever Label you would like and the base fields are all optional. I would highly recommend adding a Title field though.

Once finished, click the Create entity type button.

ECK Add Stat Entity Type

Your new Entity will now be displayed in the Entity Types list.

ECK Entity Type List

Click the Add bundle button to add a new bundle to this entity type. In our example, we will only create one bundle. The bundle will be called Number Stat and we will use it to track numeric statistics. Once you have your bundle information filled out, click the Save bundle button.

ECK Add Entity Bundle

The new bundle is now listed on the Statistic bundles listing page.

ECK Bundles List

We could create additional bundles if we had different types of statistics, similar to how you can have different content types in Drupal 8. In this case, we will stick with just the one bundle and click on the Manage fields button.

Here you can add fields just like you add fields to a content type. In this case, just add a decimal number field called Statistic.

ECK Add Field

You can now go back to the ECK Entity Types admin page and click the Add content button on your Statistic entity type. Go ahead and fill out the form to add a few stats. Once you create some stats, you can go to the ECK Entity Types admin page and select the Content list option from the dropdown button. This will list all the entities for that entity type.

ECK Content

Now that you have the data stored on your site you can use Views or entity reference fields to display or reference this content throughout your site.

The ECK module has a number of permissions available. These allow you to decide which roles can administer your ECK entities, bundles, and types. Typically you will give all of these permissions to only your administrative users. However, the module provides flexibility if you have more complicated use cases.

ECK Permissions

You now know how to create custom entity types using the ECK module. Now go out and build all your entity data structures!

Aug 29 2019
Aug 29

usability testing and great government UX webinar

Crafting UX - for the people, by the people 

Mass.gov takes a data-informed approach to site decisions. An open ear to constituent feedback ensures a "wicked awesome" user experience. So it's no surprise that when the site’s navigation needed improvement, user testing became a guiding light.

In our upcoming webinar, hear how Mediacurrent teams with Mass.gov on a user testing strategy leveraging Drupal 8 and Google Optimize. 

Join our webinar

Watch how to deliver a constituent experience that is discoverable, accessible, and truly “for the people, by the people.” Join us on September 18, 2019, at 2:00 EDT to follow along with our process. Learn tips to user test your way to better website UX. 

Register for the webinar.

You'll learn

  • Our approach to testing and gathering user feedback
  • The A/B testing variants we used to steer site navigation and layout
  • A deep dive into testing with Google Optimize 

Presenters

  • Clair Smith, Senior Front End Developer, Mediacurrent
  • Becky Cierpich, UX/UI Designer, Mediacurrent

We hope to see you there!

Aug 28 2019
Aug 28

Join Hook 42 for Another Exciting BADCamp

Some of our team will be at BADCamp this year with a combination of to-dos and goals. Our team is not only going to be on-site for trainings, sessions, and booth-sitting; we're also behind the scenes helping organize and plan for this year's event. Our sponsorship, and those of many other amazing donors, is going towards putting together another BADCamp you don't want to miss! 

We're excited to be so hands-on for this event and would like to send a special thank you to those involved in the planning and coordination that have embraced Hook 42's participation thus far. The variety of ways we can contribute to this event and make an impact is extensive, and our team is happy to be giving back to an amazing community. We can't wait for October!

Hook 42 Trainings at BADCamp

We're excited to have four of our team members teaming up to give two trainings this year. In the first, Adam Bergstein and Ryan Bateman will be talking about GatsbyJS and Drupal 8. In this training you'll learn how to take a brand new Drupal 8 site running the Umami Demonstration Install Profile and build a recipe blog front-end using GatsbyJS and React, with no React experience required! 

We'll also have Aimee Hannaford and Lindsey Gemmill giving a training on web accessibility, covering a high-level, beginning-to-end dive into how to be inclusive on the web. For more information about each training, visit the respective training page on the BADCamp website, linked below.

Hook 42's Adam Bergstein and Ryan Bateman deliver a full day Drupal 8 and GatsbyJS training at BADCamp 2019

Hook 42's Aimee Hannaford and Lindsey Gemmill deliver a full day web accessibility training at BADCamp 2019

Registration For Training Sessions Is Now Open!

There is limited availability for training sessions, so sign up soon. Each room holds approximately 30 people. If you're interested in one of our training sessions, or any other training at BADCamp this year, please be sure to register to reserve your spot. See the training detail page on the BADCamp website for more information.

Sessions

Beyond full-day training, our team will also be giving a few sessions at BADCamp. We're passionate about the topics we submit to camps and conferences and are excited to be sharing our ideas and expertise with all of you.

Hook 42's Adam Bergstein and Ryan Bateman deliver a session at BADCamp

Hook 42's Aimee Hannaford and Ryan Bateman deliver a session at BADCamp

Hook 42's Lindsey Gemmill delivers a session at BADCamp

Hook 42's John QT Nguyen and Pantheon's David Needham deliver a session at BADCamp

We Hope To See You There

If we aren't behind the computer talking, you can find us out and about. Our team will be fully immersed in BADCamp 2019, so we're just as excited to attend the other amazing presentations selected for this year's event as we hope you are.

Hook 42 will also have a booth at the event this year. We're always looking to make new connections with people at these events and can't wait to talk about the amazing things our team has to offer. Whether you're a new face or an old face, come stop by and say hello!

Aug 28 2019
Aug 28

To always work smoothly and be up-to-date, your Drupal website needs support and maintenance. It’s nice to know you can rely on experts for things like Drupal updates, website performance audit, bug fixes, or anything else. However, you need to choose them carefully. 

First of all, there are plenty of reasons to choose a support company over a freelancer. But there are also the must-have characteristics of a good Drupal support agency. Let them help you with your choice. 

Good Drupal support agency’s features 

  • Decent time on the market

All companies were once startups. However, if a Drupal support agency popped up a week ago, it might be risky to immediately entrust it with major website tasks. After a little while, the agency is able to prove its stability on the market. 

The Drudesk support agency has celebrated its 4th birthday, and its parent company InternetDevels is preparing to celebrate its 12th anniversary. They have gained a lot of solid experience in challenging Drupal support projects and complex cases.

  • Positive customer feedback

No one says that 100% trust in online feedback is a panacea. Still, it’s important to know what experiences other people have had with a Drupal support agency, so it definitely should have positive reviews.

Drudesk’s customers speak warmly about the agency, call it a forward-thinking and conscientious team, communicative and organized, quick to respond to queries, a reliable partner, and more. Drudesk reviews on Clutch, testimonials on the company’s main page, and other feedback will give you a better picture. 


positive customer feedback

  • A wide range of support services

Website support embraces a wealth of various aspects. It’s convenient and effective to have the same agency working with all of them. So your ideal Drupal supporting partner needs to have broad expertise. 

Drudesk offers all kinds of support and related services. Among them:

and many more.

But this is not all — Drudesk has a strong web development background that allows it to develop websites from scratch. 

In addition, the agency employs quality assurance engineers, DevOps experts, SEO specialists, and others whose expertise may be needed for your case. 

a wide range of support services

  • Warranty for the work performed

Drupal support services, like any others, need to have a warranty. This means that the agency is responsible for its work. 

It should be noted that website support services have their peculiarities. Websites often have old bugs and legacy code that may interfere with the website’s work, making it harder to give a warranty for particular tasks. 

Still, Drudesk cares about customers and offers a warranty for the tasks performed. The standard acceptance period lasts 31 calendar days. During this time, you can check the results. If you find any bugs, everything is fixed free of charge.

  • Transparent task tracking

A good Drupal support agency will give you full control over the tasks performed. Its workflows will be transparent and open to you.

With Drudesk, all customers can see the working progress in real-time. All tasks are managed in Drudesk’s CRM. You can leave your comments on tasks and approve the results. You can also be notified about the team’s new comments in any way convenient to you. 

Since Drudesk relies on the Agile methodology, it has an iterative approach to projects. Large ones are broken into sprints and your feedback is considered at every stage in order to make the result more suited to your needs.

transparent task tracking

  • Openness to communication

When working with a decent Drupal support agency, you should feel they are ready to answer your questions and hear your feedback.

Effective communication is one of Drudesk’s priorities. The agency has English-speaking customer service managers and project managers. 

They make sure your ideas are brought to the developers in the right ways, and they are always open to communication and ready to schedule a Skype call with you or reply by email.  

  • Drupal.org profile & community engagement

Every agency working with the Drupal open-source CMS is part of the Drupal community. Its engagement in the life of the community is directly connected to its attitude and expertise. 

Drudesk is listed on drupal.org and its developers make their contribution to various Drupal projects. They help newbies understand Drupal through discussion forums and willingly mentor new talents. Drudesk developers also actively participate in Drupal events like code sprints, Drupal Camps, etc.

Drupal community engagement

Your Drupal support agency is here

When all these characteristics are met, this means you have found a perfect Drupal support services partner. Contact our Drupal support agency and let’s discuss how we can make your website’s work better. This is where good website support starts!

Aug 27 2019
Aug 27

Carole Bernard's Drupal.org user profile photo

The Drupal Association (DA) is pleased to announce the recent hire of Carole Bernard as the Director of Marketing and Outreach. She and her team will focus on increasing visibility for the Drupal Association and opportunities for Drupal adoption through marketing, community engagement, volunteer management and public relations activities.

With extensive nonprofit and public sector experience, Bernard has served in senior leadership roles for more than 15 years at local, regional and national organizations.  

Bernard, a Boston native, began her career as a speechwriter for the Mayor of Boston. She then worked as the Director of Public Information for the largest human service agency in New England, Action for Boston Community Development, Inc. She started her own consulting business in 2015, providing strategic communications, fundraising and executive management services to nonprofit organizations in the Washington, DC area. She recently served as the Director of Communications and Marketing for Sickle Cell Disease Association of America, Inc. She also has worked for Paralyzed Veterans of America, National Center for Women and Children and the National Minority AIDS Council as their Director of Communication.

“I am so excited to be a part of the dynamic team at the Drupal Association,” says Bernard. “I am blown away by the passion and commitment of the Drupal community, and I look forward to working with everyone to tell the DA story, to showcase the Drupal project, to broaden the organization’s reach to new audiences and to increase opportunities for Drupal adoption around the world through strategic communications and outreach efforts.”

“We want to continue to position Drupal as the leading open-source CMS for ambitious digital experiences that reach audiences across multiple channels around the world,” says Heather Rocker, DA Executive Director. “We also want to expand our efforts to bring new entities and individuals to the table to participate in our global community. We are excited to have Carole join us and to help us lay out and execute a plan that leverages all of the DA assets for growth, inclusion, awareness and participation.”

Bernard received her bachelor’s degree in Political Science from Framingham State College and her master’s degree in Journalism from Boston University.

Aug 27 2019
Aug 27

So you want to turn those plain URLs into something a little more fancy? Well, you have come to the right place. First make sure you have downloaded and installed the Drupal 8 URL Embed module. It’s recommended to download and install this module using Composer so it will download the Embed module and the PHP Embed library automatically. You can do that with:

composer require drupal/url_embed

The next step is to customize the embed button, if you want to. To do this, go to Configuration > Content Authoring > Text editor embed buttons.

Essentially, you can change the icon and add additional embed buttons. These buttons can then be used within your text formats. It’s fine to leave these at the defaults for now. By clicking on the Settings link you can change where the button icon files should be stored.

Leave these settings at their defaults as well. Next, navigate to Configuration > Content Authoring > Text formats and editors. Here you will see a list of all the text formats on your website. Click the Configure button on the Basic HTML text format.

This will bring you to a page that will allow you to edit the Toolbar Configuration. Drag the URL Embed button into the Toolbar Configuration. You can place this wherever you want, but it likely makes the most sense in the Media section.

We now need to enable and configure two additional filters. Check the boxes next to Display embedded URLs and Convert URLs to URL embeds.

In the Filter Processing order section, make sure to rearrange the filters so Convert URLs to URL embeds comes before Display embedded URLs. This makes sense, as you need to convert the URL before you can display it! You may also want to drag the Align images filter below both of these filters if you want to be able to align the embeds. To use it, you will need to make sure you have the alignment buttons added to your toolbar configuration.

Under Filter settings, you now need to allow the URL embed element in the Limit allowed HTML tags and correct faulty HTML section.

Still in the Filter settings section, you can specify an optional prefix which can be used to indicate what URLs to accept. This could be useful if you only wanted Twitter links to be used, for example. If you don’t specify anything here, then embed URLs from any provider will work.

Click the Save configuration button and now create some content on your site. You can go to Content > Add content > Article and go to the Body field to see your new Embed button in action. Make sure you add some content then click the URL Embed button and paste in a link (the example below used a Twitter link).

Now when you save the page, it will show up exactly like you would expect an embedded Tweet to show up.

Feel free to try out different types of links; the next example uses a YouTube link.

That’s all there is to it! As you can see, it’s incredibly easy to use the Drupal 8 URL Embed module to embed Twitter, YouTube, Instagram, Spotify, and many more types of links directly in your Drupal content. Now go out and start sharing those links on your Drupal site!

Aug 27 2019
Aug 27

Good news: the Drupal 9 release is planned for June 2020! In the meantime, you should ensure that your platform is ready for the upgrade. Although it should arrive easily and smoothly, your website still needs a plan.

The team at WishDesk explains how to get ready and plan for Drupal 9.

What should I know about Drupal 9?

Dries Buytaert, the Drupal creator, announced the planned release of D9 for June 3, 2020. This new version is developed in the same codebase as D8, which means that migration from Drupal 8 to Drupal 9 will be painless. What about D7 and D8? They will reach their end-of-life in November 2021. This means that there will be no official support and no more updates for these versions.

plan for Drupal 9

We recommend upgrading to Drupal 9 by November 2021.

How to plan for Drupal 9 if you are using Drupal 8

If you are using D8 and keep it updated, the upgrade will be no harder than a minor release update. When planning for Drupal 9, make sure that all your modules are updated and deprecated code is removed from your codebase.
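
If you want to scan your custom code for deprecated API usage, one commonly used option at the time of writing is the drupal-check tool. The commands below are only a sketch and assume a Composer-based project with custom code under web/modules/custom:

# Install the deprecation checker as a development dependency.
composer require --dev mglaman/drupal-check
# Scan the custom modules directory for deprecated code.
./vendor/bin/drupal-check web/modules/custom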

Why will migration from Drupal 8 to Drupal 9 be effortless?

  • there will be no need to rewrite the whole code for D9
  • regular D8 updates will keep your platform up-to-date and ready for migration
  • D8 modules will be compatible with D9

How to prepare for Drupal 9 if you are on Drupal 7

Things are more difficult for those whose website is still built on D7. We strongly recommend starting the upgrade process now if you want to have fewer problems when planning for Drupal 9. There are numerous reasons to upgrade to D8:

  • better development experience
  • powerful layout building
  • easy upgrades
  • improved media management
  • improved theming experience
  • decoupled content delivery
  • improved performance
  • and more!

Although the best option for D7 website owners is to migrate to Drupal 8, you can contact a web development company for support of your Drupal 7 website even after the official end-of-life.  

Plan for Drupal 9 with WishDesk!

Get ready for Drupal 9 with the web development specialists at the WishDesk team! No matter what version of this CMS you are using now, we will help you plan for Drupal 9, as well as perform the migration process effortlessly. Just contact our experts!

Aug 27 2019
Aug 27

Want an easy way to extend your market reach and ultimately your sales? Do you feel you need to personalize your website for every user, no matter which country they belong to or what language they speak? Getting yourself a multilingual website is your best bet. Not only is it a more cost-effective marketing strategy, it also helps increase your website traffic and overall SEO. The Drupal CMS has taken up this challenge of providing not only users but also developers with the ability to access Drupal in the language they prefer. And with Drupal 8 being multilingual out of the box, it has become an ideal choice for businesses and developers. Powerful Drupal translation modules offer developers granular configuration capabilities, allowing every content entity to be translated.

What are Multilingual Websites?

Multilingual basically means written or available in different languages. Multilingual websites connect better with users from different countries as they immediately add an element of familiarity. Drupal 8 provides a great experience for building a multilingual website. Currently, Drupal 8 supports 100 different languages for translation.

Drupal 8’s multilingual support starts with the installation interface. As soon as you install Drupal, it suggests a language for your website based on your browser preference, and the site is installed in the language you select. Drupal 8 provides four different core modules for language and content translation. We can enable the required modules on our site and use them according to the needs of the website. 

The four core modules provided by Drupal are:

  1. Language module
  2. Content translation module
  3. Interface translation module
  4. Configuration translation module

Let’s catch up with what each module does, its configurations and how each module can be used in our Drupal website.

First, you need to enable all four core modules on your Drupal site. All the translation modules can be configured at /admin/config/regional.
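
If you prefer the command line, the same four modules can be enabled with Drush; this sketch assumes the standard core machine names:

# Enable the Language, Content Translation, Interface Translation and
# Configuration Translation modules in one command.
drush en language content_translation locale config_translation -y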

Drupal Language Module

The Drupal 8 Language module is a core module located at core/modules/language. It lets you add and choose new languages for your Drupal website. Under /admin/config/regional/language you can add a new language to your site by clicking the “Add language” button and choosing from a list of available languages.

Drupal Language module

Choose the preferred language from the list and add it

Once the language is added, the interface will look similar to this. In the above picture, the default language of the interface is set to English and Spanish is the additional language installed. The 9172/9340 (98.2%) under Interface translation indicates that 9172 of the 9340 strings available for translation have been translated, i.e. 98.2% of the interface is translated.

It also provides a Language switcher block to switch from one language to another, which can be placed in any region of your Drupal website. Under /admin/structure/block we can place the Language switcher block and use it to switch the language in which our website is displayed.

Language Switcher

Once the block is placed in a region, we will be able to switch between languages on the web page itself.

Content Translation Module

The Drupal Content Translation module allows you to translate content entities such as comments, custom blocks, content, taxonomy terms, users, etc. In order to translate content entities, the website should have at least two languages installed. Content translation can be configured at /admin/config/regional/content-language. It provides a list of entity types which can be translated. 

For example, check the Content option to see the configuration that appears for each content type.

Let us consider enabling content translation for the Article content type. It provides an option to decide whether each subtype should be translatable or not. We can also change the default language for a particular content type, and each field has an option controlling whether its content is translatable. 

Content Translation Module - Choosing the content

It also allows editors to enter content in whichever installed language suits them when adding content from the backend interface. Once the above configuration is set up, when we add content to the Article content type we can see a select option listing the languages installed on our site. We can select any language and add content in that language.

Content Translation Module - Select the language


Once the content is saved, users with translation permissions will see links to translate it. An additional “Translate” tab appears along with the “Edit” link, and you’ll be able to add translations for each configured language.

Content Translation Module - Select the language

Interface Translation Module

The Drupal Interface Translation module is also a core module and can be enabled like any other Drupal module. Once it is enabled, it is possible to replace any string in the interface with a customized version. Whenever this module encounters a string, it tries to translate it into the current interface language. If a translation is not available, the string is remembered and appears in the list of untranslated strings.

Interface Translation Module


In the above example, both translated and untranslated strings are displayed, and we can modify the strings for any installed language.
Community translations are maintained in a single place, http://localize.drupal.org, and the updated translation strings for your selected languages are imported automatically. In Drupal 7 and earlier versions this was handled by the contributed Localization Update module; in Drupal 8 it is part of core.

Configuration Translation Module

The Drupal 8 Configuration Translation module allows configuration to be translated into different languages. The site name, view names, and other configuration can be translated easily using configuration translation.

Configuration Translation Module

Aug 26 2019
Aug 26

August 26th is a special day for the owners of Hook 42. As a woman-owned company, we are very familiar with the adversity women face in the workplace. We’re even more familiar with how intense that can get depending on the industry you’re in.

Our inspiration for starting Hook 42 was a culmination of many different things, but one of those things was creating our own voice outside of a male-dominated workforce. The signs were clear that everywhere we looked we found areas of improvement, so our owners took things into their own hands and built a business to solve those problems. 

Hook 42 strives to be a fully inclusive workplace, and we encourage our team to voice their opinions and work towards a better version of themselves, no matter what it takes. 

But what does this philosophy have to do with today specifically?

National Women’s Equality Day and National WebMistress Day fall on August 26th. Both of these topics are influenced by a history of inequality and strive to create an inclusive space that gives everyone an equal opportunity. We’d like to take time to recognize the fights that were won and those we’re still fighting today with the celebration of these two holidays.

A Little Bit of History

The United States Congress passed the 19th Amendment to the Constitution granting women full and equal voting rights on this day in 1920. [ source ]

“Later, On July 30, 1971, Rep. Bella Abzug (D-NY) presented a bill designating August 26th as Women’s Equality Day. That year, rallies, celebrations and political debate filled the country on August 26th. By 1973, Congress passed a joint resolution declaring the day to be observed on August 26th of each year. Every year since each president declares this day as Women’s Equality Day commemorating the certification of the 19th Amendment to the United States Constitution.” [ source ]

We’re predicting that it is no coincidence Kat Valentine founded National WebMistress Day on this very same day quite some years later. In only its third year, National WebMistress Day is putting a more granular focus on supporting equal rights for women and recognizing the adversity women face in the technology industry.

As members of that industry, we want to share our support for every woman in tech that is fiercely making an effort to leave their mark and find their place in a chaotic environment. We’re 100% behind supporting the wave-making, change-inducing women of tech and cultivating that same culture right within our company.

We talked to the ladies within Hook 42 to gather some insights into their journey within tech so far, and what the future holds. Let’s continue to drive change, and support those who are advocating these rights.

The Women of Hook 42

Aimee

What would you tell your younger woman-self if you had the chance to give her advice based on what you know now about the tech industry?

Don’t dress down. You don’t need to compromise your personal appearance to fit into the “hoodie and t-shirt crew”. If you want to rock your comfort wear then embrace it!

Believe in yourself. You are the only one that can affect change around you. Make goals, strive to reach them, and calibrate when you need to. You are the only one that can improve yourself and your life.

And enjoy the journey. 

How can people be an ally to women in tech?

Be supportive in all aspects of a woman’s life. The tech industry has long days, off-hours, and overtime baked into its core. This combined with the added mental and physical workload we may experience outside of work thrives on a supportive culture.

What helps you push boundaries in tech?

Technology is one of the fastest platforms to implement and execute change. I love bringing peace to chaos and to create new “things” that help improve our lives. Being able to leverage technology for good is a big driver for me.

Is there a woman in tech you look up to, maybe a mentor? (Can be someone you've never talked to but are just inspired by)

Gerri Sinclair. She balances a loving family and a wonderful extended group of friends while bringing bold technology changes through the years. Thank you for all of your energy and impact on technology Madame Modem!

What are you looking to accomplish with your career in tech?

I like to have a place where people can feel they can work safely and where they can grow in their personal and professional lives. I like to build cool things. My goal is to have more people be able to have positive interactions with said cool things. :) I’m working to extend the tech environment beyond the tech to include the human side of things, like growth, authenticity, inclusion, and accessibility.

Were there struggles in your journey? Do you still face them? How did you push past it?

There are and have been many struggles in my journey. Some struggles have changed over time, some are still the same. Struggles include breaking through glass ceilings, having to work harder / smarter / stronger / faster than men, and being criticized for having the courage to lead the charge and search out brave new worlds.

Mentors, friends, professional guidance, education, and grit got me through the struggles. Once again, a person (me) is fully responsible for my own actions, regardless if I’m advised in one way or another.

Kristen

What would you tell your younger woman-self if you had the chance to give her advice based on what you know now about the tech industry?

Even though you hear a lot about bad things about being a woman in tech, you can find nice communities who will welcome you and be supportive. If you are in a bad environment in tech (no matter what gender), try to talk to other people in tech who are in healthy environments and consider switching before leaving.

How can people be an ally to women in tech?

Be a good person and call out bad behavior. Try to show that you welcome others in your groups with your words and actions (e.g. the "Pac-Man Rule"). Be open to differing opinions when offered respectfully.

What helps you push boundaries in tech?

Having a supportive community helps us stand on the shoulders of giants and continue to innovate.

What fuels your passion to keep doing what you do every day?

Seeing something I helped build used by real people, in real life.

Is there a woman in tech you look up to, maybe a mentor? (Can be someone you've never talked to but are just inspired by)

Although there are a lot of great people in the Drupal community, there are 3 people who are amazing role models for all genders who immediately pop into my head: Angie Byron, Amanda Gonser, and Gábor Hojtsy.

What are you looking to accomplish with your career in tech?

Using the skills I have for giving back to the greater good and improving humanity.

Were there struggles in your journey? Do you still face them? How did you push past it?

Yes, often. Some things that may help: stay away from toxic people, give yourself permission to fail, give yourself permission to say "no" without guilt, be kind to yourself, and talk to people who have gone through similar things so you know you aren't alone.

Alona

What would you tell your younger woman-self if you had the chance to give her advice based on what you know now about the tech industry?

Keep learning and never give up.

How can people be an ally to women in tech?

Support, mentor, and encourage.

What fuels your passion to keep doing what you do every day?

I like working in tech. I’m passionate about what I do and am looking forward to learning new things and growing professionally. I have a dream to become a successful woman, and this dream keeps me going.

Ellen

How can people be an ally to women in tech?

Start nurturing curiosity at a young age. Encourage the girls and young women in your life to have bold dreams, because that’s the age where kids start conceptualizing what is “possible” and “impossible.” When I was little, I wanted to be an actress or a ballet dancer, but I never dreamed of being a scientist, astronaut, engineer, or inventor. Young people need to hear and see that women totally belong in every sector of public life.

What fuels your passion to keep doing what you do every day?

I’m lucky to have a strong network of family and friends who are my cheerleaders. My community has supported me every step of the way, and so far I’ve had a diverse, and storied career. I’m a career-changer, and my background is primarily in education, so I didn’t find my way into the tech world until recently. My passion comes from knowing who I am, where I come from, and the community that’s helped me along the way.

What are you looking to accomplish with your career in tech?

Build amazing websites and web applications! Make the digital universe a little more accessible with every project I touch. Along the way, I hope to encourage other underrepresented folks in the tech industry to keep chasing their curiosity.

Kris

What would you tell your younger woman-self if you had the chance to give her advice based on what you know now about the tech industry?

Do more research. It’s amazing to see how many companies flourish in the tech industry. I didn't know how well rounded it was until I got into it myself.

How can people be an ally to women in tech?

I feel there is a stigma around women in the tech industry that they aren't as smart or as innovative as a lot of men. I feel that if women were given the chance to prove themselves, that they are willing to improve their skills and show they want to learn and progress, there would be a lot more women in the industry and a lot more women-owned companies.

What fuels your passion to keep doing what you do every day?

My passion to do what I do every day comes from the great people I work with. Not only do they support me and encourage me to expand my knowledge, but they give me opportunities every day. My coworkers, leaders, and bosses all work hard to make this company run smoothly, and I strive to work like them. I want to prove that I am capable of taking on challenges and exceeding expectations. I want to be a better me everyday.

Lindsey

What would you tell your younger woman-self if you had the chance to give her advice based on what you know now about the tech industry?

Confidence from a woman might seem to be threatening, but continue to show your value and stand tall. Make sure the words you use are thoughtfully crafted before raising concerns and contradictions. The strong voice of a woman is often mistaken, so delivery is important. Take the time to think before you react in conflicting situations. 

How can people be an ally to women in tech?

When you see something awry, say something. If a woman confides in you, listen with the intent to help. See us based on our level of experience, not based on our gender. Fight our fights with us, and support our equality in the workplace.

What fuels your passion to keep doing what you do every day?

There are two parts of this for me, the first being the people around me. If it wasn’t for the support system I have inside and outside of the office, it would be much harder to hold my head high in the moments of struggle. 

I also thoroughly enjoy what I do and had the luxury of choice in deciding where my career would take me. Every career path comes with struggles, and being able to work through them and come out with something to show for is very rewarding for me and makes the work even more enjoyable.

Were there struggles in your journey? Do you still face them? How did you push past it?

There have always been struggles. My confidence is quite frequently confused with attitude. Things men often receive praise for, like telling it like it is and not backing down, are things women get reprimanded for. Not letting that defeat me and keeping my head held high are what help me push past the discrimination. It's not just men who take offense to it either; other women are often threatened by frequent requests to do better and be better for all of us. There is a fine line between intimidation and confidence when conversing with women, and it's a struggle I face daily and work to eliminate. Part of it is learning how receptive people are to certain words, and part is others being open to strong opinions coming from a variety of people.

Knowing who I am and keeping sight of my worth, not letting other people’s opinions change how I think of myself, and knowing that my experiences and education are just as valid as those of the people around me are how I stay my course. That confidence that gets "in the way" is also what keeps me going every day.

What About You?

Do you want to share your support for the women around you? It's imperative that we cultivate an environment of acceptance, one where everyone feels welcome. Take a moment to give a special shout-out to a woman in tech, or any woman you know who deserves some recognition.
 

Aug 26 2019
Aug 26

Promet Source is humbled to announce that this summer, two websites that we designed and developed for our clients have won three prestigious awards. 

2019 marks the 25th year in which the widely recognized Communicator Awards have honored “work that transcends innovation and craft” -- work that stands to make a “lasting impact.” The Communicator Awards receive more than 6,000 entries and are sanctioned and judged by the Academy of Interactive & Visual Arts (AIVA), an invitation-only group consisting of industry-leading professionals from media, communications, advertising, creative, and marketing firms. AIVA also judges the W3 Awards and the Davey Awards.

The American Association of School Librarians’ 2019 Best Websites for Teaching & Learning rankings are peer reviewed and awarded based on the following criteria: innovation, creativity, active participation, collaboration, free availability, user friendliness, and encouragement for a community of learners to explore and discover.

Awarding Endeavors

At Promet Source, the satisfaction of a job well done and a client whose expectations have been exceeded is the award we seek and the primary driver of creative solutions, collaborative passion, and constant dedication to staying on top of our game. 

Occasionally, though, there’s another level of achievement and recognition -- prestigious awards in which third parties evaluate our work and judge it to be at the leading edge of the rapidly evolving marketing and media landscape.

Both of these award-winning websites represented a labor of love for both Promet Source and our clients. Ready to Code was sponsored by Google and stemmed from a longstanding collaborative effort between the American Library Association and Google. The depth and breadth of resources available on the site are designed to empower librarians to advance computational thinking among children and teens of all backgrounds as an essential component of 21st Century literacy.

The Martin County Florida website served as an opportunity to optimize a utilitarian user experience that streamlined access to the full range of county services, updates, news, and social media connections, while strengthening civic pride with a visually engaging site that leveraged images of the county’s spectacular natural beauty and stunning infrastructural achievements.   

For Martin County, knowledge that their website has received recognition and attention on an international scale has compounded the impact of key objectives: heightened civic pride and greater awareness of a centralized location for conducting all county business.

For the American Library Association Ready to Code site, the recent ranking from the American Association of School Librarians represents a stamp of approval that has served to advance the mission of the site by increasing its credibility and magnifying its objectives, according to Marijke Visser, associate director and senior policy advocate for the American Library Association.

Why Awards Matter

In the rapidly evolving digital landscape, pushing the bounds of possibilities is an essential success factor. Leaning on what passed for innovation last year is the path toward tired solutions that don’t resonate with clients and constituents. That’s why we at Promet Source take awards such as these seriously and are honored to be in the company of agencies that share our dedication.

Ultimately, websites win awards because they work. They are built upon a deep level of inquiry that leads to extreme empathy for what users need and what they are hoping to accomplish when they visit a site. 

Users are bombarded every day with messages, media, and technologies that confuse and get in the way. The websites that win awards offer a surprisingly delightful experience in which extra effort and expertise on the part of designers and developers translates into streamlined simplicity -- a breath of fresh air within a cluttered and complex world.

Interested in the calibre of design and web development that wins awards while driving big-picture goals? Contact us today.

Aug 26 2019
Aug 26

Atomic Design is a great approach for achieving a consistent user experience and interface design.

Atomic Design is not only a budget saver but also your best friend for future requirements, allowing you to fulfill customers' wishes quickly and easily.

The first step is to define the elements that are most in use. This would be, for example, colors, fonts, buttons and images. Starting with these simple elements, different components can be assembled, such as cards or sliders.

Example: Image + Text + Link = Card
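
To make the card example concrete, such a component could be captured in a small definition file. The YAML below is only a hypothetical sketch (not tied to any specific module's schema), listing the atoms that the card molecule is assembled from:

card:
  label: 'Card'
  description: 'A molecule assembled from image, text, and link atoms.'
  parts:
    image:
      label: 'Image'
    text:
      label: 'Teaser text'
    link:
      label: 'Read more link'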

Atomic Design borrows its analogy from chemistry: components are organized from atoms to molecules to organisms to templates to pages. At each level, components are assembled from the smaller components of the level below.
 

Chemical notation in 'Atomic Design'

When we design user experiences (UX) at 1xINTERNET, we try to build the best possible user interfaces with the fewest possible components.

The focus is to create efficiency in the project and to potentially save budget. When fewer components are designed in UX, fewer components need to be programmed and tested. 
 

This approach not only reduces initial efforts considerably but also reduces the cost of the project and its maintenance.


Imagine the following example: You have three different interaction pages and you create a great but different UX for each of those. Then, the users have three different systems to learn. If you instead create an equally good UX by reusing as many components as possible, the user has to learn much less and can interact much faster with your application.

Conclusion

A strict design process and the right integration with the website technology certainly offer efficiency benefits. We plan to write more blog posts on this topic in the next few weeks and look forward to your feedback.

Aug 26 2019
Aug 26

Our lead community developer, Alona Oneill, has been sitting in on the latest Drupal Core Initiative meetings and putting together meeting recaps outlining key talking points from each discussion. This article breaks down highlights from meetings this past week. You'll find that the meetings, while providing updates on completed tasks, are also conversations looking for community member involvement. There are many moving pieces as things are getting ramped up for Drupal 9, so if you see something you think you can provide insights on, we encourage you to get involved.

We've had some gaps in August with limited meetings, so we're happy to be back in the swing of things and reporting a lot of great updates.

Drupal 9 Readiness (08/19/19)

Meetings are for core and contributed project developers as well as people who have integrations and services related to core. Site developers who want to stay in the know and keep up to date for the easiest possible Drupal 9 upgrade of their sites are also welcome.

  • It usually happens every other Monday at 18:00 UTC.
  • It is done over chat.
  • Happens in threads, which you can follow to be notified of new replies even if you don’t comment in the thread. You may also join the meeting later and participate asynchronously!
  • Has a public Drupal 9 Readiness Agenda anyone can add to.
  • The transcript will be exported and posted to the agenda issue.

Symfony development updates

Composer initiative issues affecting D8 and D9

Simpletest deprecation process

The active issue [meta] How to deprecate Simpletest with minimal disruption is one of the biggest deprecations in Drupal 8 and one of the least trivial to update for; however, work on it is still a priority.

Composer in Core meeting 08/21/19

  • It usually happens every other Wednesday at 15:30 UTC.
  • It is for contributors to the Composer initiative in core who are working on improving Drupal's relationship with Composer.
  • It is done over chat.
  • Happens in threads, which you can follow to be notified of new replies even if you don’t comment in the thread. You may also join the meeting later and participate asynchronously!
  • Has a public Composer meeting agenda anyone can add to.
  • The transcript will be exported and posted to the agenda issue.
  • For anonymous comments, start with a bust in silhouette emoji. To take a comment or thread off the record, start with a no entry sign emoji.

Scaffold files in core

The issue Add core scaffold assets to drupal/core's composer.json extra field is currently waiting on committer attention.

File security component

The issue Add drupal/core-filesecurity component for writing htaccess files has been fixed.

Relocate the scaffold plugin

The issue Relocate Scaffold plugin outside of 'core' directory has also been fixed.

Vendor Hardening Plugin

The issue Add Composer vendor/ hardening plugin to core is ready to be reviewed.

Template files

The template files are in place, but the scaffold files and the vendor hardening plugin are still needed. Then we will be able to make a template that can be used to create a tarball.

Drupal.org changes to support this

  • Change packaging so that it can package 8.7.x the old way and 8.8.x the new way using composer to create a project.
  • Add a variant of the tool that Webflow uses to generate the drupal/core-pinned-dependencies and drupal/core-development-dependencies into the drupal.org jenkins workflow that publishes those 'somewhere'.

Optionally supporting composer.json for extensions

The issue [PP-3] Provide optional support for using composer.json for dependency metadata still needs work done on it.

Core cross-initiative meeting 8/22/2019

Migrate

Status

  • On 8/21 we discussed the best course of action to resolve.
  • We hoped to deprecate the Drupal 6 migrations and move them to contrib, but we do not have a way to validate how to make deprecations.
  • V Spagnolo has been working on several issues and they are all advancing well.
  • Nathaniel Catchpole shared some criteria for getting into 9.0-beta1.
  • Good progress has been made on multilingual (biggest blocker right now).
  • Multilingual components must land before 9.0-beta1, thankfully the initiative as a whole is pretty stable.

Blockers (Drupal 6 Deprecations)

  • Needs release manager sign-off on the plan for making the deprecations.
  • Plan for moving the deprecated elements into contrib projects.

Next Steps

  • Need release manager criteria for making the multilingual migration stable.
  • Need release manager review on deprecation plan.
    • Need to document it, our target this week.
    • Need sign off on this and what we’ll do next, the target being next week. 
    • Need a way to get deprecated items into contrib. 
      • Are there any dependencies to deprecate items? Do we have to have them in contrib first?

Auto Updates

Status

  • Contrib Alpha-1 was released this week; it includes:
    • Notifying site owners when a public safety announcement has been released by the security team.
      • Optionally this can be done via email, or in a set message on the page (similar to existing alerts).
    • 8 or so readiness checks to determine if your site can be automatically updated.
    • The above features are available in Drupal 7 and 8.
    • A signature-signed package is now on Packagist (dev-master); credit to the MWDS folks.
      • This will ensure that MITM attacks are not possible because files are known.
      • This is a base that actual automated updates will rely on.
      • Timelines in the initiative update are still accurate.
      • No impact on 8.8, if added they will be new features on the update module.
      • More user-testing is needed for the PSA. We would like it to be in 8, but it could be in contrib in the meantime. The module is future-proofed enough to disable in contrib when it comes time to enable in core.

Blockers

There are currently no blockers.

Next Steps

  • Following the PSA there will be the beta release and user testing.
  • The core release is targeting 8.9?

CMI2

Status

  • The API patch was committed since the last call; it’s in an experimental module.
  • By the 8.8 feature freeze, the UI portion of the module will not yet be ready.
  • This means we can’t release it in core until 9.1.

Blockers

  • Resources are limited, the patch is large, and more people are needed to help with other aspects of the tool (maybe at Drupalcon Amsterdam).
  • The plan requires the release manager review & discussion.

Next Steps

  • Requires sign off for accessibility on WYSIWYG.
  • Two views issues fixed for content moderation.

Drupal 9

Status

Blockers

There are currently no blockers.

Next Steps

Continue working on Symfony 4, on Drupal 8 not using deprecated APIs, and on the multi-version compatibility Symfony 3 requirements, in order to open the Drupal 9 branch.

Claro

Status

  • We have made progress on the critical global issues; mini pager, messages, and others are not completed but look likely to land. Some specific one-off pages are most at risk.
  • Targeting 8.8 release, but hoping to descope some things to meet this goal.
  • Authoring components still need designs; we hope to get that work done in time to get into beta for 8.8.
  • Roadmap to stabilize Claro.

Blockers

There are two issues the release manager needs to review.

Next Steps

Composer (Much of the information here was covered in the Composer meeting on 8/21)

Status 

  • Moving along well; the major patches that have landed are focused on bells and whistles that will make the experience better.
    • One major patch changed the core development workflow; there have been no complaints so far (it requires Composer 1.9 if you’re working with the 8.8 repo).
  • The scaffolding plan today requires duplicating some files. It still needs to be confirmed with the core team.
  • The vendor cleanup plugin now strips all the testing files out of the vendor directory and makes sure there is an htaccess file; this is now the vendor hardening plugin, waiting to be committed.
  • After the scaffolding and vendor hardening plugins are complete, we can create a template that will generate a Composer-ready tarball.

Blockers

There are two patches waiting to be committed. 

Next Steps

Demo

Status

Aug 26 2019
Aug 26

My previous post explores how requesting a medical appointment online begins a patient's digital journey. A hospital's appointment request form might be a new patient's first interaction with a healthcare institution. As such, it is one of the most public-facing forms for a healthcare institution, establishing the baseline for the quality of other external and even internal facing forms. The user experience of the appointment request form sets the tone for the entire patient experience.

Experience

A patient's successful user experience when finding, filling out, and submitting an appointment request form is determined by the visual, information, and interactive design of the website and form itself.

Visual design matters

Visual design identifies a healthcare institution’s brand and aesthetic. The quality of care at a healthcare institution is reflected on their website. The website's visual design should be clean, efficient, and caring. Since an appointment request form results in a call back from a live person, including a photograph of a nurse or clinician on the form or landing page can visually reinforce this expectation and experience.

Information design matters

Information design ensures that the patient understands how to navigate the form and makes it clear what information is required and what information is optional. Appointment request forms act as an ambassador of sorts, beginning an interaction between a patient and healthcare clinicians. The form's information design and corresponding editorial sets the tone of a patient's interaction with a healthcare institution.

Interactive design matters

Interactive design improves the flow and process of filling out a form. As someone fills out an appointment request form, conditional logic may hide or show individual inputs or groups of inputs. When a user clicks submit, how a form validates the submitted data can increase or decrease a user's frustration.
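
For instance, in the Webform module for Drupal 8 (the module this series aims to build a form template for), conditional logic of this kind can be expressed with #states in the element YAML. The snippet below is a rough sketch; the element names and options are hypothetical and not taken from any of the hospital forms reviewed here:

seeking_care_for:
  '#type': radios
  '#title': 'Who are you seeking care for?'
  '#options':
    myself: 'Myself'
    someone_else: 'Someone else'
caregiver_name:
  '#type': textfield
  '#title': 'Caregiver name'
  '#states':
    visible:
      ':input[name="seeking_care_for"]':
        value: someone_else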

Keeping the visual, information, and interactive design in mind, let's look at how the top US hospitals approach their appointment request forms.

 U.S. News' Best Hospitals rankings

Evaluation

The U.S. News' Best Hospitals rankings are primarily based on the overall quality of a hospital's care, treatments, and outcomes. The U.S. News' patient experience ratings never directly ask questions about a patient's digital experience. However, the fact that the number 1 ranked hospital, the Mayo Clinic, has a 'Request Appointment' link on the U.S. News' Best Hospitals landing page reinforces the importance of this form.

Mayo Clinic's 'Request Appointment' link

We are going to look at how the top four hospitals allow patients to request an appointment on their individual websites. We will examine the positives and negatives of each hospital's appointment request form. The goal is to extrapolate some recommendations and best practices which can be used to build an appointment request form template for the Webform module for Drupal 8.

It’s essential for us to see how users navigate to the appointment form from a hospital's homepage. We will examine how many inputs, questions, steps, and conditions are included on a form, and then determine the level of effort needed to complete the form. Finally, we want to assess the overall user experience, which encompasses the visual, information, and interactive design of the form.

Mayo Clinic

Mayo Clinic's Request an Appointment

On the Mayo Clinic's home page, the appointments link is the first and largest element on the page. It takes three clicks from the home page to navigate to their appointment request form. The appointment request form contains 20-30 inputs depending on who is completing the form. Inputs are grouped by contact, insurance, and medical information. Most inputs are required.

Information matters

On top of the U.S. News' direct link to the Mayo Clinic's appointment form and the prominent appointment link on the Mayo Clinic's homepage, Mayo Clinic's appointment landing page does a great job of providing all the relevant contact information needed to make an appointment by phone. This ensures that even if a patient can't complete the form, they can call the hospital. This contact information, with links to dedicated international and referring physician forms, is then included as a sidebar on Mayo's appointment request form.

Experimentation matters

The goal of this blog post's exploration is to identify best practices and recommendations, which helps to establish a baseline for a good appointment request form. Once there is a baseline, it also helps to experiment to improve a form's user experience.

Typically, required inputs are marked with red asterisks. Since most inputs on the form are required and only a few are optional, Mayo's team inverts the more common approach of noting required inputs. Instead, they include the message "All fields are required unless marked optional" with a right-aligned "(optional)" label next to optional inputs. This approach appears to be successful. The more important lesson is to be cognizant of a form's required and optional inputs.
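
In Webform element YAML, this convention could be approximated along these lines (a rough sketch; the element names and titles are illustrative, not taken from Mayo Clinic's form):

first_name:
  '#type': textfield
  '#title': 'First name'
  '#required': true
preferred_contact_time:
  '#type': textfield
  '#title': 'Preferred time to be contacted (optional)'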

Layout matters

The form groups related inputs into visually defined and well-labeled sections. The form also uses a vertical layout with left-aligned labels, which make it more difficult for users to comprehend the relationship between inputs and labels. The form's mobile layout, on the other hand, uses the recommended top-aligned labels and is easier to fill out. Mayo Clinic should experiment with using top-aligned labels for the desktop form and see if it increases the form's completion rate.

Improvement matters

Mayo Clinic values and promotes their appointment request form. The Mayo Clinic's digital team has put a considerable amount of thought, care, and consideration into their appointment request form's user experience. They should continue to experiment, test, and improve this form.

Cleveland Clinic

Cleveland Clinic's Online Appointment Request Form

The Cleveland Clinic's homepage also includes a prominent link to appointments. It takes three clicks from the home page to navigate to the appointment request form. Their appointment request form is a multistep form with five pages and contains between 50-90 inputs. Most inputs are required. The form collects information about patients, caregivers, referring physicians, and comprehensive medical history. Entering this much information takes a considerable amount of time to complete.

Design matters

Cleveland Clinic's form is a beautifully designed and logically arranged multi-step form. Multi-step forms make it easier to collect a lot of information. It is important to ask, "Does an appointment request form need to collect a lot of information?"

Time matters​

When it comes to digital experiences, people have less patience. How long it takes to complete a task matters. Does a new patient need to supply their marital status or subscribe to a mailing list to book their first appointment? Wouldn’t it be more appropriate to collect a patient's medical history after the appointment is booked?

Information matters

Relating to how long it takes to fill out an appointment request form, it is important to ask the value and purpose of each question. Cleveland Clinic's form asks, "Has the patient been seen at Cleveland Clinic in the past?" If answered "Yes," the form then proceeds to ask for a Medical Record Number (MRN). It makes complete sense to collect an existing patient's MRN, which can be used to look up a patient's medical history. However, once the patient provides an MRN, does the form still need to obtain a comprehensive medical history?

Failure matters​

Asking for too much information increases the likelihood that a patient or caregiver will not be able to complete the form, or will become frustrated with the process of completing it. Let’s not forget that the appointment request form facilitates a phone call that ultimately results in an appointment. Cleveland Clinic's form should also include the phone number on every page so that if someone is frustrated, they can pick up the phone and still book an appointment.

Goals matter

Cleveland Clinic's form is an overwhelming user experience. At the very least, users should be given the option to complete a more straightforward appointment request form, with a secondary option to provide a full medical history.

Before even building any form, it helps to summarize the goal of the form with the scope of the information being collected. This summary can also become the form's introduction text. For public-facing forms, each input should be scrutinized to determine if the information is required to complete the immediate task at hand.

Johns Hopkins Hospital

Johns Hopkins Hospital's Request an Appointment

The Johns Hopkins Hospital homepage includes a small icon with a label linking to their appointments landing page. It takes four clicks from the home page to navigate to the appointment request form. Their form contains 20-25 inputs with five optional inputs. The form has one conditional question asking, "Who are you seeking care for?"

Simplicity matters

By and large, the Johns Hopkins Hospital's appointment request form is the simplest form that we are going to review. There is no input grouping, and it has a straightforward two-column layout, despite the fact that grouping related inputs, top-aligned labels, and a one-column layout are generally recommended. The simplicity of this form makes it easy for users to understand how much information is required to fill out the form. Having all the required inputs displayed before any optional inputs also makes it easier for a user to fill in the form quickly. The two-column layout does make the form more difficult to fill out. When the browser is resized to a mobile layout, the form is easier and faster to fill out. Making a form easy for everyone to fill out, including users with disabilities, is essential.

Accessibility matters

The Johns Hopkins Hospital's appointment request form is not accessible to users who depend on screen readers. The form's inputs are missing the necessary markup that allows screen readers to accurately describe the label of each input to the end user. Instead of a screen reader saying something like, "Input first name required," the screen reader just states, "Input required" for all the form's inputs.

The inaccessibility of the Johns Hopkins Hospital's appointment request form is a significant problem. An inaccessible appointment request form is the equivalent of not having a wheelchair ramp to enter a building. Fortunately, this issue is not hard to fix. The Johns Hopkins Hospital digital team needs to become more aware of and monitor the accessibility of their forms.

Expectations matter

Nowhere on the Johns Hopkins Hospital's appointment request form does it indicate that the patient or caregiver will receive a callback within a defined period of time. The Johns Hopkins Hospital's appointments landing page does include this information. It should be repeated at the top of the form, just in case the user has not been to the appointments landing page. Users need to know what to expect when they click submit on a form. In the specific case of an appointment request form, users need to know when (i.e., in how many days) and how they will be contacted (i.e., by phone or email).

Massachusetts General

Massachusetts General's Request an Appointment

The Massachusetts General homepage also includes two links to appointments. It takes three clicks from the home page to navigate to the appointment request form. The appointment request form contains 20-25 inputs in three groupings, with only two optional inputs. Yes/no conditional logic is used to ask for more information about the patient and referring physician.

Tone matters

Massachusetts General's appointment request form asks questions using a tone that guides the user through the process of completing the form. For example, every hospital wants to know if a patient has insurance. Instead of displaying a large dropdown menu labeled "Insurance," Massachusetts General's form states, "To facilitate processing, please provide insurance plan name if known." This phrasing explains to the user why the information is needed, while also stating that it is optional. It is an excellent example of the relationship between asking the right question and collecting the best answer.

Questions matter

Many patients decide which hospital to be treated at based on physician referrals. When intaking new patients, hospitals may need to contact a referring physician to get a patient's medical history. Most appointment request forms ask for the referring physician's contact information as individual inputs. Massachusetts General's appointment request form just asks for the "Name of referring provider, facility name, address and phone if known." This approach is a brilliant and deliberate decision. First off, it makes it very easy for a patient to enter whatever information they have about their physician, ensuring that at least some information is collected. Appointment requests are always reviewed by an actual person who can take the plain-text physician information, fix a misspelling, look up the referring physician's NPI (National Provider Identifier), and get all the needed information into a new patient's medical record.

Massachusetts General's appointment request form asks some very well-thought-out questions, which are supported by the right process.

Consistency matters

Massachusetts General has an exemplary appointment request form. There are no immediate issues that need to be addressed. Ironically, Massachusetts General's international appointment request form does have some design and user experience issues. Without getting into specifics, most of the problems on the international appointment request form would be resolved if it followed the best practices and care established by the domestic appointment request form. Providing a consistent user experience across an institution's webforms can ensure a better patient experience.

An appointment request form is a small but key piece to the complex puzzle of a patient’s experience and journey. Soon, someone is going to disrupt, change, and simplify this puzzle. For example, people will be able to schedule an appointment using voice-controlled systems. When we look at the user experience of these relatively simple appointment request forms, there is still some work to be done now and moving forward.

We have extensively explored four hospitals' appointment request forms. In my next blog post, I am going to make some concrete recommendations and strategies for building an exemplary appointment request form.


Aug 26 2019
Aug 26

The Drupal CMS is a universal platform for website development. It is equally suitable for small business websites and complex e-commerce solutions. This article explains how the price of Drupal website development is formed and answers the question, “How much does it cost to build a website?”

Website design

First of all, you need to understand what goals and objectives you have for your website. For example, say you need a website for your blog with a standard set of features and a simple design. You can create such a Drupal website yourself, absolutely free. Just read a guide or watch tutorial videos.

It is another matter if you need a promotional website that sells for your business, with a custom design; that is a serious step you will have to spend money on. It’s no secret that websites with a unique design increase sales (according to “The business value of design” by McKinsey). A professional web designer can help you develop a custom design for your project, but that's not all: you will also need a front-end developer to build out your website with the new design. Of course, if you have some extra money and your time is valuable, you can hire a project manager to monitor the progress of the project, but it’s not necessary.

Drupal development

Every Drupal website contains Drupal modules. Drupal modules allow you to add the necessary features to your website and you can install them for free. But every business is unique and sometimes a standard set of features is not enough to implement a project, that's why the development of additional Drupal modules is required. To do this, you need an experienced Drupal development team. Below you can look at average hourly rates for mid-level developers.

  • North America: $60-$140 per hour

  • Western Europe: $40-$60 per hour

  • Eastern Europe: $20-$40 per hour

  • Asia and Pacific: $10-$20 per hour

As you can see, the per-hour rate can vary dramatically based on the region of the world, which is why companies often prefer to work with outsourced web development firms from other countries.

Website development cost

Mainly, the cost of a website is calculated as an hourly rate multiplied by the number of hours spent on the project.

The website development cost directly depends on the features that you need and on the complexity of website design. Below, you’ll find some examples of the existing types of sites and see how long it takes to develop a website, based on our experience.

Simple promo page

If you need an opportunity to edit the content of your promo page by yourself, the Drupal CMS will be the best solution. The development of a simple promo page will take 50-70 hours.

Approximate total cost:

  • North America: $4200-$9800 

  • Western Europe: $2000-$4200 

  • Eastern Europe: $1000-$2800 

  • Asia and Pacific: $500-$1400 

Company website

The development of creative custom design is very important for corporate websites because the reputation of your company will directly depend on this. Development of a standard corporate website with such sections as "About Us", "Services", "Portfolio", "News", "Blog", "Contacts" will take 200-250 hours.

Approximate total cost:

  • North America: $15000-$35000 

  • Western Europe: $8000-$15000 

  • Eastern Europe: $4000-$10000 

  • Asia and Pacific: $2000-$5000 

Mobile app

Drupal is universal. It suits not only website development; it can also be used for the back-end development of mobile applications. This makes Drupal an even more attractive choice. The minimum estimated time for a back end built on Drupal is 150-200 hours.

Approximate total cost:

  • North America: $9000-$28000 

  • Western Europe: $6000-$12000 

  • Eastern Europe: $3000-$8000 

  • Asia and Pacific: $1500-$4000 

E-commerce

E-commerce websites are online stores and large trading platforms, such as Amazon. They are perhaps the most difficult type of project to implement, so the time needed for e-commerce website development is correspondingly higher. The minimum estimated time is 250-300 hours.

Approximate total cost:

  • North America: $15000-$42000 

  • Western Europe: $10000-$18000 

  • Eastern Europe: $5000-$12000 

  • Asia and Pacific: $2500-$6000 

Conclusion

In this article, we gave you approximate costs for different kinds of websites and now you have an idea of how the price is calculated.

Remember that the price of website development depends entirely on the development team and your requirements. It is very important to find an experienced development team to achieve excellent results on your project. And don't forget that every business is unique and requires individual solutions.


Liked the article? Tweet @adcisolutions and let us know what you think on Twitter.
Aug 25 2019
Aug 25

I have previously written about using Drupal’s definition files in component-based theming and how it is possible to have component-specific layout and asset library definition files. Now that Layout Builder is stable it is time to have a look at how it can be used with theme components.

Two schools

There are two main approaches to integrating independent theme components with Drupal. They could be described as the ‘Twig developer’ approach and the ‘site builder’ approach.

In the Twig developer approach integration is done entirely in Twig ‘presenter’ templates. A significant benefit of this straightforward approach is that no contrib modules are required (although Component Libraries is usually used). There is also no need for endless pointing and clicking in the admin UI, and dealing with related configuration. The downside is that in most cases this approach leaves the admin UI disconnected from the components. This means that a site builder not familiar with Twig or with no access to the Twig templates has no way of affecting the appearance of the integrated components.

In the site builder approach integration is done using the admin UI. Until now this has required sophisticated contrib modules like Display Suite and UI Patterns, with their own ecosystems of related modules. These are great modules, but ideally it should be possible to use just Drupal core for the site builder approach. With Layout Builder in core we are a huge leap closer to that goal.

Revisiting an old example

Back in January 2017 I wrote a blog post on how to use UI Patterns to integrate a simple Code component. I will now use the same component in an example of how to use Layout Builder for the same task.

Step 1: Create layout and asset library definitions

Starting from where we ended up in that old blog post, we need to provide information about the component’s variables and assets to Drupal in some other way than a .ui_patterns.yml file. This can now be easily done using Drupal core’s definition files.

Drupal core only supports theme (or module) specific definition files, but the Multiple Definition Files module can be used to add support for component-specific files. The results are identical in both approaches, so using Multiple Definition Files is completely optional.

Multiple Definition Files does add one useful feature though. Drupal places layout regions in content, so for example a code region would be accessed in Twig as content.code. To remove this Drupalism, Multiple Definition Files makes layout regions accessible in the Twig root context, so a code region is accessible in Twig as just code.

The definitions in this example are in component-specific files. If you want to use Multiple Definition Files, be sure to enable the module at this point.

The layout definition, shila-code.layouts.yml:

shila_code:
  label: 'Code'
  category: 'Shila'
  template: source/_patterns/01-molecules/content/shila-code/shila-code
  library: shila_theme/shila_code
  regions:
    language:
      label: 'Language'
    code:
      label: 'Code'

and the library definition, shila-code.libraries.yml:

shila_code:
  css:
    theme:
      source/_patterns/01-molecules/content/shila-code/prism.css: {}
  js:
    source/_patterns/01-molecules/content/shila-code/prism.js: {}

Note the library reference in shila-code.layouts.yml, linking the two definitions. Be sure to clear caches after adding the new definitions.

(As a side note, the UI Patterns From Layouts module makes it possible to use Drupal’s definition files with UI Patterns as well.)

Step 2: Enable Layout Builder

After enabling the Layout Builder module, Layout Builder has to be enabled for our Code paragraphs type’s display.

If you are moving away from a Display Suite layout, be sure to select - None - for the layout and save the settings before enabling Layout Builder. It seems that if you do not do this, Display Suite is still active and Layout Builder won’t work as expected.

Step 3: Integrate the component using Layout builder

Clicking on the ‘Manage layout’ button takes us to Layout Builder. Display Suite and UI Patterns show a textual representation of a layout, but Layout Builder is different as it displays the layout visually.

First we remove the default section, then we add a new section and choose the ‘Code’ layout for it.

As a reminder, this is what our extremely simple shila-code.html.twig template looks like:

<pre><code{% if language %} class="language-{{ language }}"{% endif %}>{{ code }}</code></pre>

Layout Builder’s visual representation will not be a problem in many cases. However, components often use variables for non-visual purposes or have layouts with overlapping regions. Our Code component is a good example, since language is used as a part of a CSS class name. Layout Builder does not expect layouts to be used in this way, and the result is a completely unusable UI.

Oh no. What we need is a text-based way to configure the layout. The good news is that, for accessibility reasons, a textual version of the Layout Builder UI is already in the works. So, let’s apply the patch from that issue, clear the caches, and see what happens.

A ‘Layout overview’ link has appeared! Let’s click on it.

This is looking good! We can now add the right paragraph fields to the respective regions in the Code layout and then click on the ‘Save layout’ button. Let’s have a look at a page that contains a Code paragraph.

Something is obviously going wrong here. Let’s have a look at the HTML source.

Since we have placed fields in Layout Builder blocks, there is unwanted HTML from the block and field templates. What we really need is just the contents of the fields, nothing else. This can be an issue with many independent theme components, since they often do not expect extra markup to be provided.

Display Suite has a ‘Full Reset’ field formatter and UI Patterns has an ‘Only content’ option. But what can we do here? Creating specific Twig templates to remove the HTML is an option, but not an ideal one. What we want is an easy UI based way to deal with the problem.

Step 4: Install and enable the Full Reset module (if required)

After searching for existing solutions and not coming up with anything, I ended up writing a small module for this particular purpose. Full Reset does two things: it provides a field formatter and adds a reset option to Layout Builder blocks. The name is a hat tip to the trusty Display Suite field formatter I have used so many times.

A good way to support third party settings for Layout Builder blocks is still being worked on, so a core patch must be applied for Full Reset to work.

With the core patch applied and Full Reset installed and enabled, we can select the ‘Full Reset’ field formatter and tick the ‘Full Reset this block’ checkbox.

Now the block and field templates are not used at all, and the component works and looks as expected! It still does not work in Layout Builder’s visual preview though, because Layout Builder expects the content to be a valid DOM node and have a surrounding HTML tag. For this reason Full Reset does not remove block theming in the preview.

Conclusion

Layout Builder can be successfully used in component-based theming today. There might be some problems with specific components like the one in this example, but these problems can be circumvented by applying two patches and using the Full Reset module.

For me this means that I can now stop using Display Suite, a module that I have used in almost all of my Drupal projects since 2011. Personally I find this change to be just as significant as it was when Field API and Views were added to core.

Aug 25 2019
Aug 25

The Drupal 8 Shield module allows you to protect your site with simple HTTP basic authentication. It’s great for sites that you are working on that you don’t want the world (or Google) to see yet. This way you can send the site to a client, or anyone really, to test and just provide them the username/password to view the site. Once it’s ready to go, you can launch the site and remove this module.

If you have ever wanted to password protect your Drupal site, the Shield module will help with that!

Install the Drupal 8 Shield module like you would any other Drupal module. Note that it shows up in the module list as PHP Authentication Shield.

Once it’s installed, click on the Configure link to get to the configuration page.

It’s recommended to leave the Allow command line access checkbox checked. This will not let the Shield module affect command line access using tools such as Drush. You will want to fill out the Username and Password based on what you want a user to use to access the site.
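
The saved settings end up in the module's configuration object. As a rough sketch, an exported shield.settings.yml might look something like the following; the exact key names are an assumption and can differ between versions of the module, so verify against your installation:

# Hypothetical export of shield.settings; key names may vary by module version.
user: 'preview'
pass: 'a-strong-password'
allow_cli: true
print: 'Please log in to view this site.'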

Note: The Authentication message does not seem to work when I tested it with Chrome.

Click the Save configuration button and you should immediately see the authentication popup. You can test this out in a private browsing or incognito window to see it again. Once you enter the correct username and password, you will be able to access every page on your site.

That’s all there is to it. This works well for development and staging environment while you are developing the site but want the client to be able to look at it. Now go out and password protect your Drupal sites!

Aug 24 2019
Aug 24

Greg Anderson, Open Source Contributions Engineer at Pantheon joins Mike Anello to talk about the Drupal Community's Composer Support in Core Initiative.

Discussion

DrupalEasy News

Upcoming Events

Sponsors

  • Drupal Aid - Drupal support and maintenance services. Get unlimited support, monthly maintenance, and unlimited small jobs starting at $99/mo.
  • WebEnabled.com - devPanel.

Follow us on Twitter

Subscribe

Subscribe to our podcast on iTunes, Google Play or Miro. Listen to our podcast on Stitcher.

If you'd like to leave us a voicemail, call 321-396-2340. Please keep in mind that we might play your voicemail during one of our future podcasts. Feel free to call in with suggestions, rants, questions, or corrections. If you'd rather just send us an email, please use our contact page.

Aug 24 2019
Aug 24

In the last blog post we were introduced to managing migration as configuration entities using Migrate Plus. Today, we will present some benefits and potential drawbacks of this approach. We will also show a recommended workflow for working with migration as configuration. Let’s get started.

Example workflow for managing migration configuration entities.

What is the benefit of managing migration as configurations?

At first sight, there does not seem to be a big difference between defining migrations as code or configuration. You can certainly do a lot without using Migrate Plus’ configuration entities. The series so far contains many examples of managing migrations as code. So, what are the benefits of adopting configuration entities?

The configuration management system is one of the major features that was introduced in Drupal 8. It provides the ability to export all your site’s configuration to files. These files can be added to version control and deployed to different environments. The system has evolved a lot in the last few years, and many workflows and best practices have been established to manage configuration. On top of Drupal core’s incremental improvements, a big ecosystem has sprung in terms of contributed modules. When you manage migrations via configuration, you can leverage those tools and workflows.

Here are a few use cases of what is possible:

  • When migrations are managed in code, you need file system access to make any changes. Using configuration entities allows site administrators to customize or change the migration via the user interface. This is not about rewriting all the migrations. That should happen during development and never on production environments. But it is possible to tweak certain options. For example, administrators could change the location to the file that is going to be migrated, be it a local file or on a remote server.
  • When writing migrations, it is very likely that you will work on a subset of the data that will eventually be used to get content into the production environment. Having migrations as configuration allows you to override part of the migration definition per environment. You could use the Configuration Split module to configure different source files or paths per environment (see the sketch after this list). For example, you could link to a small sample of the data in development, a larger sample in staging, and the complete dataset in production.
  • It would be possible to provide extra configuration options via the user interface. In the article about adding HTTP authentication to fetch remote JSON and XML files, the credentials were hardcoded in the migration definition file. That is less than ideal and exposes sensitive information. An alternative would be to provide a configuration form in the administration interface for the credentials to be added. Then, the submitted values could be injected into the configuration for the migration. Again, you could make use of contrib modules like Configuration Split to make sure those credentials are never exported with the rest of your site’s configuration.
  • You could provide a user interface to upload migration source files. In fact, the Migrate source UI module does exactly this. It exposes an administration interface where you have a file field to upload a CSV file. In the same interface, you get a list of supported migrations in the system. This allows a site administrator to manually upload a file to run the migration against. Note: The module is supposed to work with JSON and XML migrations. It did not work during my tests. I opened this issue to follow up on this.
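
To make the per-environment use case above more concrete, a development-only copy of a migration could live in a Configuration Split folder and point at a smaller sample file. The folder path, sample file name, and source keys shown below are assumptions for illustration only:

# config/splits/dev/migrate_plus.migration.udm_config_json_source_node_local.yml
# Hypothetical development split; only the source location differs from production.
id: udm_config_json_source_node_local
source:
  # ... same source plugin settings as in production ...
  urls:
    - modules/custom/ud_migrations/sample_data/udm_sample_small.json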

These are some examples, but many more possibilities are available. The point is that you have the whole configuration management ecosystem at your disposal. Do you have another example? Please share it in the comments.

Are there any drawbacks?

Managing migrations as configuration adds an extra layer of abstraction to the migration process. This adds a bit of complexity. For example:

  • Now you have to keep the uuid and id keys in sync. This might not seem like a big issue, but it is something to pay attention to.
  • When you work with migration groups (explained in the next article), your migration definition could live in more than one file.
  • The configuration management system has its own restrictions and workflows that you need to follow, particularly for updates.
  • You need to be extra careful with your YAML syntax, especially if syncing configuration via the user interface. It is possible to import invalid configuration without getting an error. It is not until the migration fails that you realize something is wrong.

Using configuration entities to define migrations certainly offers lots of benefits. But it requires being extra careful managing them.

Workflow for managing migrations as configuration entities

The configuration synchronization system has specific workflows to make changes to configuration entities. This imposes some restrictions in the way you make updates to the migration definitions. Explaining how to manage configuration could use another 31 days blog post series. ;-) For now, only a general overview will be presented. The general approach is similar to managing configuration as code. The main difference is what needs to be done for changes to the migration files to take effect.

You could use the “Configuration synchronization” administration interface at /admin/config/development/configuration. In it, you have the option to export or import a “full archive” containing all your site’s settings or a “single item” like a specific migration. This is one way to manage migrations as configuration entities, and it lets you find their UUIDs if they were not set initially. This approach can be followed by site administrators without requiring file system access. Nevertheless, it is less than ideal and error-prone. It is not the recommended way to manage migration configuration entities.

Another option is to use Drush or Drupal Console to synchronize your site’s configuration via the command line. Similarly to the user interface approach, you can export and import your full site configuration or only single elements. The recommendation is to do partial configuration imports so that only the migrations you are actively working on are updated.

Ideally, your site’s architecture is completed before the migration starts. In practice, you often work on the migration while other parts of the site are being built. If you were to export and import the entire site’s configuration as you work on the migrations, you might inadvertently override unrelated pieces of configuration. For instance, this can lead to missing content types, changed field settings, and lots of frustration. That is why doing partial or single configuration imports is recommended. The following code snippet shows a basic Drupal workflow for managing migrations as configuration:

# 1) Run the migration.
$ drush migrate:import udm_config_json_source_node_local

# 2) Rollback migration because the expected results were not obtained.
$ drush migrate:rollback udm_config_json_source_node_local

# 3) Change the migration definition file in the "config/install" directory.

# 4a) Sync configuration by folder using Drush.
$ drush config:import --partial --source="modules/custom/ud_migrations/ud_migrations_config_json_source/config/install"

# 4b) Sync configuration by file using Drupal Console.
$ drupal config:import:single --file="modules/custom/ud_migrations/ud_migrations_config_json_source/config/install/migrate_plus.migration.udm_config_json_source_node_local.yml"

# 5) Run the migration again.
$ drush migrate:import udm_config_json_source_node_local

Note the use of the --partial and --source flags in the migration import command. Also, note that the path is relative to the current working directory from where the command is being issued. In this snippet, the value of the source flag is the directory holding your migrations. Be mindful if there are other non-migration related configurations in the same folder. If you need to be more granular, Drupal Console offers a command to import individual configuration files as shown in the previous snippet.

Note: Uninstalling and installing the module again will also apply any changes to your configuration. This might produce errors if the migration configuration entities are not removed automatically when the module is uninstalled. Read this article for details on how to do that.
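
One common approach, sketched here with an illustrative module name, is to declare an enforced dependency on your module inside each migration configuration entity so that the entities are removed together with the module:

# Added to each migrate_plus.migration.*.yml file in config/install.
dependencies:
  enforced:
    module:
      - ud_migrations_config_json_source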

What did you learn in today’s blog post? Did you know the benefits and trade-offs of managing migrations as configuration? Did you know what to do for changes in migration configuration entities to take effect? Share your answers in the comments. Also, I would be grateful if you shared this blog post with others.

This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors. Contact Understand Drupal if your organization would like to support this documentation project, whether it is the migration series or other topics.

Aug 23 2019
Aug 23

Today, we are going to talk about how to manage migrations as configuration entities. This functionality is provided by the Migrate Plus module. First, we will explain the difference between managing migrations as code or configuration. Then, we will show how to convert existing migrations. Finally, we will talk about some important options to include in migration configuration entities. Let’s get started.

Example of migration defined as configuration entity.

Drupal migrations: code or configuration?

So far, we have been managing migrations as code. This is functionality provided out of the box. You write the migration definition file in YAML format. Then, you place it in the migrations directory of your module. If you need to update the migration, you make the modifications to the files and then rebuild caches. More details on the workflow for migrations managed in code can be found in this article.

Migrate Plus offers an alternative to this approach. It allows you to manage migrations as configuration entities. You still use YAML files to write the migration definition files, but their location and workflow are different. They need to be placed in a config/install directory. If you need to update the migration, you make the modifications to the files and then sync the configuration again. More details on this workflow can be found in this article.

There is one thing worth emphasizing. When managing migrations as code, you need access to the file system to update and deploy the changes to the file. This is usually done by developers. When managing migrations as configuration, you can make updates via the user interface as long as you have permission to sync the site’s configuration. This is usually done by site administrators. You might still have to modify files depending on how you manage your configuration. But the point is that file system access to update migrations is optional. Although not recommended, you can write, modify, and execute the migrations entirely via the user interface.

Transitioning to configuration entities

To demonstrate how to transition from code to configuration entities, we are going to convert the JSON migration example. You can get the full code example at https://github.com/dinarcon/ud_migrations The module to enable is UD config JSON source migration, whose machine name is udm_config_json_source. It comes with four migrations: udm_config_json_source_paragraph, udm_config_json_source_image, udm_config_json_source_node_local, and udm_config_json_source_node_remote.

The transition to configuration entities is a two-step process. First, move the migration definition files from the migrations folder to a config/install folder. Second, rename the files so that they follow this pattern: migrate_plus.migration.[migration_id].yml. For example: migrate_plus.migration.udm_config_json_source_node_local.yml. And that’s it! Files placed in that directory and following that pattern will be synced into Drupal’s active configuration when the module is installed for the first time (and only then). Note that subsequent changes to the files require a new synchronization operation to take effect. Changing the files and rebuilding caches does not update the configuration, as was the case with migrations managed in code.
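
A minimal sketch of those two steps from the command line could look like the following. It assumes the definition file in the migrations folder is named after the migration ID; adjust the paths to match your module:

# 1) Move the definition file from the "migrations" folder to "config/install".
$ cd modules/custom/ud_migrations/ud_migrations_config_json_source
$ mkdir -p config/install
$ mv migrations/udm_config_json_source_node_local.yml config/install/

# 2) Rename the file to follow the migrate_plus.migration.[migration_id].yml pattern.
$ mv config/install/udm_config_json_source_node_local.yml config/install/migrate_plus.migration.udm_config_json_source_node_local.yml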

If you have the Migrate Plus module enabled, it will detect the migrations and you will be able to execute them. You can continue using the Drush commands provided by the Migrate Run module. Alternatively, you can install the Migrate Tools module, which provides Drush commands for running both types of migrations: code and configuration. Migrate Tools also offers a user interface for executing migrations. This user interface is only for migrations defined as configuration, though. It is available at /admin/structure/migrate. For now, you can run the migrations using the following Drush command: drush migrate:import udm_config_json_source_node_local --execute-dependencies.

Note: For executing migrations on the command line, choose between Migrate Run and Migrate Tools. You pick one or the other, but not both, as the commands provided by the two modules have the same names. Another thing to note is that the example uses Drush 9. There were major refactorings between versions 8 and 9 which included changes to the names of the commands.

UUIDs for migration configuration entities

When managing migrations as configuration, you can set extra options. Some are exposed by Migrate Plus while others come from Drupal’s configuration management system. Let’s see some examples.

The most important new option is defining a UUID for the migration definition file. This is optional, but adding one will greatly simplify the workflow for updating migrations. The UUID is used to keep track of every piece of configuration in the system. When you add new configuration, Drupal will read the UUID value, if provided, and update that particular piece of configuration. Otherwise, it will create a UUID on the fly, attach it to the configuration definition, and then import it. That is why you want to set a UUID value manually: if changes need to be made, you want to update the same configuration, not create a new one. If no UUID was originally set, you can get the automatically created value by exporting the migration definition. That workflow is a bit complicated and error prone, so always include a UUID with your migrations. The following snippet shows an example UUID:

uuid: b744190e-3a48-45c7-97a4-093099ba0547
id: udm_config_json_source_node_local
label: 'UD migrations configuration example'

The UUID is a string of 32 hexadecimal digits displayed in 5 groups separated by hyphens, following this pattern: 8-4-4-4-12. In Drupal, two or more pieces of configuration cannot share the same UUID. Drupal checks both the UUID and the type of configuration during sync operations. In this case, the type is signaled by the migrate_plus.migration. prefix in the name of the migration definition file.

When using configuration entities, a single migration is identified by two different options: the uuid, used by Drupal’s configuration system, and the id, used by the Migrate API. Always make sure that this combination stays the same when updating the files and syncing the configuration. Otherwise, you might get hard-to-debug errors. Also, make sure you are importing the proper configuration type. The latter should not be something to worry about unless you use the user interface to export or import single configuration items.

Tip: If you do not have a UUID in advance for your migration, you can copy one from another piece of configuration and change some of the hexadecimal digits. Keep in mind that this could lead to a duplicated UUID if you happen to make the exact same changes in two separate files. Another option is to use a tool to generate the value; searching online for UUID generators will yield many options.
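
For example, any of the following approaches should produce a usable value. The Drush variants assume Drush 9 with a fully bootstrapped site:

# Generate a UUID with the operating system utility (available on most Linux and macOS systems).
$ uuidgen

# Or ask Drupal's own UUID service for one.
$ drush php:eval "print \Drupal::service('uuid')->generate();"

# If the migration was already imported without a UUID, read the value Drupal assigned to it.
$ drush config:get migrate_plus.migration.udm_config_json_source_node_local uuid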

Automatically deleting migration configuration entities

By default, configuration remains in the system even if the module that added it gets uninstalled. This can cause problems if your migration depends on custom migration plugins provided by your module. It is possible to enforce that migration entities get removed when your custom module is uninstalled. To do this, you leverage the dependencies option provided by Drupal’s configuration management system. The following snippet shows how to do it:

uuid: b744190e-3a48-45c7-97a4-093099ba0547
id: udm_config_json_source_node_local
label: 'UD migrations configuration example'
dependencies:
  enforced:
    module:
      - ud_migrations_config_json_source

You add the machine name of your module to the dependencies > enforced > module array. This adds an enforced dependency on your own module. The effect is that the migration will be removed from Drupal’s active configuration when your custom module is uninstalled. Note that the top-level dependencies array can have other keys in addition to enforced, for example config and module. Learning more about them is left as an exercise for the curious reader.

It is important not to confuse the dependencies and migration_dependencies options. The former is provided by Drupal’s configuration management system and was just explained. The latter is provided by the Migrate API and is used to declare migrations that need to be imported in advance. Read this article to learn more about this feature. The following snippet shows an example:

uuid: b744190e-3a48-45c7-97a4-093099ba0547
id: udm_config_json_source_node_local
label: 'UD migrations configuration example'
dependencies:
  enforced:
    module:
      - ud_migrations_config_json_source
migration_dependencies:
  required:
    - udm_config_json_source_image
    - udm_config_json_source_paragraph
  optional: []

What did you learn in today’s blog post? Did you know that you can manage migrations in two ways: code or configuration? Did you know that file name and location as well as workflows need to be adjusted depending on which approach you follow? Share your answers in the comments. Also, I would be grateful if you shared this blog post with others.

This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors. Contact Understand Drupal if your organization would like to support this documentation project, whether it is the migration series or other topics.

Aug 23 2019
Aug 23

Marketing departments these days are feeling the heat to make processes faster, more agile, and more efficient in this fast-paced digital world. That’s where Drupal comes into the picture!

Since its inception, Drupal has been considered crucial for companies trying to get on the digital transformation bandwagon in order to provide a faster and more nimble digital experience to customers.

Drupal comes equipped with a set of marketing tools to target the right audience and increase overall ROI for businesses. Additionally, its suite of tools and solutions builds a solid foundation that lets marketers unlock its martech capabilities, making it one of the most powerful and widely adopted platforms for marketing needs.

This blog gives you insights into how Drupal can be a perfect choice for marketers.

Leveraging Drupal For Marketing

Drupal has become an inseparable part of the marketing space. It not only drives business revenue but also strikes a remarkable balance between marketing technologies and their ecosystem with its content-first, commerce-first, and community-first marketing solutions.

Creative Freedom 

Dependence on the IT team for implementation is the most common problem for digital marketers. With a traditional CMS, it takes added effort, time, and resources for the design, development, and marketing departments to work in sync.

By the time changes are implemented, the model already looks outdated for the new situation.

“Drupal integrates seamlessly with enterprises’ existing marketing and sales technologies”


The cutting-edge Drupal modules and distributions empower different teams to exercise their creative freedom and manage the development of a project at their own pace and convenience.

With Drupal’s architecture, organizations have a platform on which they can dynamically launch their website. Marketing teams can curate the structure with segmented content and visuals to lay the foundation of a strong digital strategy.

Drupal turns out to be a useful asset for organizations looking to implement it in their digital business, as it integrates seamlessly with their existing marketing and sales technologies.

Balancing Marketing Ecosystem

CRM systems are important for running businesses and boosting sales management. It is equally important for organizations to integrate and customize the martech stack to their unique needs.

Salesforce written in blue cloud
Drupal offers features that help you strike a fine balance between the CRM and marketing ecosystems, with content-first, commerce-first, and community-first marketing solutions. Business, technology, and marketing professionals use Drupal to create agile solutions that are easy to use and offer a wide reach across the web. Built fully responsive, these solutions let customers and users discover products and services on any device.

It possesses enormous potential with native features and module extensions, including integration with third-party digital marketing tools.

In all, it’s a platform that helps you push your strategy into the next phase of digital customer engagement and digital business.

Responsive Customer Experience

Built on a system of engagement, Drupal provides the solid foundation needed for every interaction with customers and people within the organization alike, delivering a unified experience. Streamlining your business operations and aligning your digital strategies, it delivers on the mobile-first approach. Enterprise marketing teams can manage the website and administrative pages with ease across multiple platforms.

“Personalized customer experience is evolving at a blistering pace”

Organizations competing in this customer-centric age are working hard to provide a personalized experience that keeps customers engaged while simultaneously attracting new visitors for lead generation.

The digital strategies run by marketing teams focus mainly on gradually increasing the company’s leads, conversions, and revenue via digital channels.

As customers lie at the center of the organization, this can be accomplished by offering them a tailor-made experience across all channels.

The motive is the same, but the way it is delivered has evolved significantly, and Drupal has a large part to play in that. Here’s how:

Have a look at the video to understand more about Acquia Lift and how it can benefit marketers:

Focus on Personalization with Acquia Lift

There is no denying the fact that companies are leaving no stone unturned to provide personalized results to users. It’s not just about presenting content on the website but about tweaking the whole experience. Simply put, they are trying to ensure that every single user, prospect or customer, gets what they want, even before they realize it.

Acquia Lift is a powerful amalgamation of data collection, content distribution, and personalization that helps marketers deliver a refined user experience.

Acquia Lift bolsters personalized customer experiences. It is a powerful amalgamation of data collection, content distribution, and personalization that helps enterprise marketing teams deliver a refined user experience without much dependency on development or IT teams.

7 Vertical and 4 horizontal blocks with text written on them

Source: Acquia

Acquia Lift encompasses three key elements to boost personalization:

Profile Manager

It helps you build a complete profile of your users, from the moment they land on your website as anonymous visitors up to the stage where they are repeat visitors or loyal customers. Collecting user information such as demographics, historical behavior, and real-time interactions, it complements those insights on user preferences with the best suggested actions available to you.

Content Hub

This cloud-based tool provides secure content syndication, discovery, and distribution. Any content created within the organization can be consolidated and stored here, readily available to broadcast on any channel, in any format.

Searches on varied topics and automatic updates give insights into the wide spectrum of content being developed within the enterprise: in different departments, across websites, and on different platforms.

Experience Builder

This is the most essential element of Acquia Lift. It lets you create a personalized experience for your users from the beginning.

The Experience Builder is an easy-peasy drag-and-drop tool that allows you to personalize every section of your website to showcase different content to different target audiences, based on the information retrieved from the Profile Manager.

Marketing teams can:
  1. Define the rules and protocols for how content should be displayed to different segments of site visitors
  2. Carry out A/B testing to determine what type of content drives more conversions for which user segments.


All this can be executed with simple overlays onto the existing website segments, without disturbing the core structure of the site and without depending on IT teams for implementation.


Multilingual

With companies expanding their reach into international markets, Drupal’s multilingual features are definitely worth capitalizing on. Supporting 94 languages, Drupal can translate a complete website quickly with fewer than four modules in action, enabling marketing teams to deliver localized content experiences and thus increasing the probability of turning a visitor into a customer.

“Drupal’s multilingual features are definitely worth capitalizing on”

Regardless of the language the site presents to its audience, editors and admins can choose a different language for themselves on the backend. Marketing and editorial teams can create and manage multilingual sites with no need for additional local resources.

Layout Builder

The Layout Builder, which became stable in Drupal 8.7, allows content marketers to create and edit page layouts and arrange the presentation of individual pieces of content, content types, media, and nodes. It also lets you place user data, views, fields, and menus.

"Marketing teams can preview the pages with ease without impacting the user-experience”

It acts as a huge asset for enterprise marketing and digital experience teams:

  1. It offers the flexibility to create custom layouts for pages and other specific sections of websites. You can override the predefined templates for individual landing pages when required.
  2. Content authors can embed videos effortlessly across the site to enhance user experience and ultimately drive conversion rates.
  3. Marketers can preview pages with ease and without the fear of impacting the user experience.


Layout Builder explained with the help of flowers in square gift box

Source: Dri.es


All these features ensure that marketing teams have more control over the website and can work independently without needing help from developers. This ultimately reduces the turnaround time for launching campaigns and the dependency on development teams.

Improved UI with API-first headless architecture

With the ever-increasing demands of acquiring and retaining customers, marketing teams are always in a hurry to redesign and update the back end and front end in a short period. However, updating and redesigning digital properties rapidly while keeping up with evolving customer expectations is quite a strenuous task.

A traditional Drupal architecture can take considerable time for updates and redesigns, because the refinement needs to happen at both the front end and the back end, resulting in a dependency on developers and designers to complete the project.

“A decoupled CMS strategy can work wonders for a website that is in desperate need of a change”

But now, with decoupled Drupal, marketing teams can become more agile, effectively segregate processes, and streamline upgrades without impacting the user experience at the front end. This powerful feature of Drupal makes design and UX alterations easier to maintain.


Three blocks on the grey background with text written on them
Source: Acquia

Having the flexibility to come up with more ideas and implement them by easily adding them to the website is a huge boost for marketers. New requirements that could increase customer engagement and lead generation can pop up in a marketer’s mind at any time. The decoupled approach gives the extra agility needed to keep improving the public-facing site.
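
As an illustration of what the decoupled approach looks like in practice, the JSON:API module that ships with Drupal core (as of 8.7) exposes content over HTTP, so a separate front end can fetch it with a plain request. The domain and content type below are placeholders:

# Fetch article nodes as JSON from a hypothetical Drupal site.
$ curl --header "Accept: application/vnd.api+json" https://example.com/jsonapi/node/article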

Final Words

Drupal has the potential to deliver better digital experiences with every upcoming release. The addition of new features will help marketing teams become more flexible and scalable while providing a superior customer experience in less time. Consequently, conversion rates, sales, and brand visibility will increase manifold.

Marketing teams at organizations already running their sites on Drupal have a lot to be happy about. Specifically, the development around Acquia Lift and Acquia Journey gives them freedom from reliance on developers for website updates and publishing content, so they can target the audience at the right time and increase ROI.

And for the marketing teams at organizations planning to shift to Drupal for its all-inclusive features, the results delivered by a highly skilled and empowered team will make it worth all the effort.

Aug 23 2019
Aug 23

With enterprises looking for ways to stay ahead of the curve in the growing digital age, machine learning is providing them with the boost needed for a seamless digital customer experience.

Machine learning algorithms can transform your Drupal website into an interactive CMS that comes up with relevant service recommendations targeting each individual customer’s needs by understanding their behavioural patterns.


A machine-learning-integrated Drupal website ensures effortless content management and publishing and better targeting, and empowers your enterprise to craft personalized experiences for your customers. It automates customer service tasks and frees up your customer support teams, subsequently impacting ROI.

However, with various big names competing in the market, let’s look at how Amazon’s machine learning services stand out and provide customised offerings when integrated with Drupal.

Benefits of Integrating AWS Machine Learning with Drupal

AWS offers a wide set of machine learning services, including pre-trained AI services for computer vision, language, recommendations, and forecasting. These capabilities are built on a comprehensive cloud platform and are optimized without compromising security. Let’s look at the host of advantages they offer when integrated with Drupal.

Search Functionality

One of the major problems encountered while searching on a website is the need for exact keywords. If the content uses a related keyword, you will not be able to find it without entering the exact term.

This problem can be solved by using machine learning to train the search algorithm to look for synonyms and display related results. Search can also be improved by automatically filtering and ranking results based on signals such as past reads and click-through rate.

Amazon CloudSearch is designed to help users improve the search capabilities of their applications and services by setting up a scalable search domain with low latency and high throughput.
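
As a rough sketch, a search domain can be provisioned from the AWS CLI in a couple of commands; the domain name below is a placeholder:

# Create a CloudSearch domain for the site's content.
$ aws cloudsearch create-domain --domain-name drupal-site-search

# Check the domain status until it is ready to receive documents.
$ aws cloudsearch describe-domains --domain-names drupal-site-search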

Image Captioning

Amazon Machine Learning helps in the automatic generation of relevant captions for all images on the website by analyzing the image content. The admin can configure whether captions should be added automatically or after manual approval, saving a lot of time for the content curators and administrators of the website.

Amazon Rekognition searches images to find content within them and helps segregate them almost effortlessly, with minimal human interaction.
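
As a hedged sketch of what that looks like at the API level, an image already stored in S3 can be labeled from the AWS CLI; the bucket and file names are placeholders:

# Ask Rekognition for labels describing an image stored in S3.
$ aws rekognition detect-labels --max-labels 10 \
    --image '{"S3Object":{"Bucket":"my-drupal-assets","Name":"uploads/photo.jpg"}}'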

Website Personalization

Machine learning ensures users view content tailored to their favorite reads and searches by assigning them a unique identifier (UID) and tracking their behaviour (clicks, searches, favourite reads, etc.) on the website for a personalized web experience.

Machine learning analyzes the data connected with the user’s UID and provides personalized website content.

Amazon Personalize is a machine learning service that makes it easy for developers to create individualized recommendations for their customers. It saves up to 60% of the time needed to set up and tune the infrastructure for machine learning models compared to building your own environment.

Another natural language processing (NLP) service that uses machine learning to find insights and relationships in text is Amazon Comprehend. It can identify which topics are most popular, which makes recommendations easier. So, when you’re trying to add tags to an article, instead of searching through all possible options, it suggests tags that match the topic.
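
As a quick sketch, the key phrases that could become suggested tags can be pulled straight from the AWS CLI; the sample text is illustrative:

# Extract key phrases from a piece of article text to suggest tags.
$ aws comprehend detect-key-phrases --language-code en \
    --text "Drupal 8 ships with a new migration system and an API-first architecture."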

Vulnerability Scanning

A website is always exposed to potential threats, with the risk of losing confidential customer data.

Using machine learning, Drupal-based websites can be made more secure and resilient to data loss by automatically scanning themselves for vulnerabilities and notifying the administrator about them. This gives websites a great advantage and helps them save the extra cost of using external software for this purpose.

Amazon Inspector is an automated security assessment service, which helps improve the security and compliance of the website deployed on AWS and assesses it for exposure, vulnerabilities, and deviations from best practices.

Voice-Based operations

With machine learning, it’s possible to control and navigate your website using your voice. With Drupal standing by its commitment to accessibility, integrating Amazon machine learning features promotes inclusion and makes web content more accessible to people.

Amazon Transcribe is an automatic speech recognition (ASR) service. When integrated with a Drupal website, it benefits the media industry with live subtitling of news or shows, helps video game companies stream transcriptions for hearing-impaired players, enables stenography in courtrooms, helps lawyers make legal annotations on top of live transcripts, and boosts business productivity by capturing meeting notes through real-time transcription.
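
A minimal sketch of kicking off such a transcription from the AWS CLI, assuming the audio file already lives in S3 (the job name, bucket, and file are placeholders):

# Start an asynchronous transcription job for an uploaded audio file.
$ aws transcribe start-transcription-job \
    --transcription-job-name drupal-podcast-episode-1 \
    --language-code en-US \
    --media-format mp3 \
    --media MediaFileUri=s3://my-drupal-assets/podcasts/episode-1.mp3

# Poll for the result; the transcript URI appears once the job completes.
$ aws transcribe get-transcription-job --transcription-job-name drupal-podcast-episode-1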

The future of websites looks interesting and is predicted to benefit users through a seamless experience driven by data and behavior analysis. The benefits of integrating Amazon Machine Learning with Drupal will clearly give it an advantage over other CMSs and will pave the way for a brighter future and a better roadmap.

Srijan has certified AWS professionals and expertise across AWS competencies. Contact us to start the conversation.

Aug 23 2019
Aug 23

Agiledrop is highlighting active Drupal community members through a series of interviews. Now you get a chance to learn more about the people behind Drupal projects.

This time we talked with Greg Dunlap, pinball wizard and Lullabot's senior digital strategist. We spoke of how satisfying it is to work on interesting things with the right people, the importance of the Drupal Diversity and Inclusion initiative, and the close similarities between the Drupal community and Greg's local pinball community.

1. Please tell us a little about yourself. How do you participate in the Drupal community and what do you do professionally?

I am a Senior Digital Strategist at Lullabot; historically, I’ve been very much involved in the development and technical side of website building, but in recent years I’ve gotten much more into the content, digital strategy and information architecture part, and so that’s more how I do my work these days, sort of dealing with bigger picture problems. 

As far as my participation in the Drupal community goes, it’s been pretty light these days. I still speak at conferences here and there about various things, but my contribution beyond that has dropped off quite a bit. I’ll pop in on an issue here and there, and I’ve been involved in the Drupal Diversity and Inclusion group and similar projects, but other than that, my participation is pretty light right now.

I think it’s also been a bit of me taking my life back, I was very very involved in the Drupal community for almost a decade, and so I think a lot of it was also just sort of me taking some time for myself. It’s hard because I built my career through participating in the community, so to some extent that’s necessary, but you really need to find a balance. 

2. When did you first come across Drupal? What convinced you to stay, the software or the community, and why?

I came across Drupal at an interesting time in my life. It was around 2007, Drupal 5 was out, and I was working for a newspaper in Seattle called the Seattle Times. We were doing a migration to Drupal, and through part of that migration we hired Lullabot to come in and help us out. And that was when I first met Jeff Eaton and Matt Westgate, and it was Jeff Eaton who pushed me to get involved in contributing. I was talking to him about a problem and he said “wow, you should really file a core issue about that”, which I did, and 10 years later it got marked “won’t fix”, so that was great. 

But at the time I was looking for a new job anyway and Drupal was just taking off, so I started getting involved with the local Drupal user group, and through that I met a bunch of really cool people. I also needed to figure out what my next thing was going to be professionally, and so all of this stuff kind of came at just about the right time for me. I was looking for something in Drupal to dig my teeth into, and we were having a bunch of problems around deployment and configuration management at the Seattle Times, and so that kind of just became my niche. 

And through that I met a lot of people who helped and I also wrote a lot of code, and that ended up getting me my first Drupal job, which was at Palantir; then, everything just kind of snowballed after that.

Now I’m basically working for the company that was my first contact with Drupal. And through that Eaton and I became really close friends, and the two of us are the strategy team at Lullabot. One of the things that I’ve learned over the years is that people always tell you to follow your passions, but I really think that the people that you do things with are much more important than what you do. Because, granted, it’s great to do what you want, but if you don’t have the right people around you, it’s not going to be any good anyways.

3. What impact has Drupal made on you? Is there a particular moment you remember?

I’m doing a project right now for Lullabot which is involving me going back and listening to a lot of our old podcasts, and one of the things I did was, I went back and listened to my old podcasts (we used to do this series called Drupal Voices and I got interviewed on it a lot). 

And I really noticed as I listened to them, that every year as I was on the podcast, I could hear in my voice my confidence level growing. The first year, for example, was at DrupalCon D.C. and I was very scattered, I could tell I was very nervous; then the next year, I sounded much more confident, and then the year after that when I was at DrupalCon Chicago, I could tell I had really found my steps and stride.

Chicago turned out to be a really formative DrupalCon for me because I gave the very first ever core conversation at a DrupalCon, and I was very very nervous about it. And as a result of that core conversation, Dries came and asked me to lead the initiative for Drupal 8, for CMI. So, DrupalCon Chicago really stands out as a turning point for me in the Drupal community, and where I really hit my stride. I’m just a little upset that the talk wasn’t recorded.

4. How do you explain what Drupal is to other, non-Drupal people?

Even when I explain my job, usually all I say is just “I build websites”, and so I just say that Drupal is software that you use to build websites. Sometimes you’ll meet somebody who actually understands conceptually what a content management system is, but I usually don’t even bother going down that rabbit hole. “I build websites” is close enough for anybody to at least get the idea.

5. How did you see Drupal evolving over the years? What do you think the future will bring?

It’s crazy seeing how Drupal has changed. Returning to the podcasts from question three: the first Lullabot podcast was in 2006, and at that time, while there were a couple of shops doing things, the global economy in Drupal was essentially nothing. Now, it’s grown to billions of dollars in thirteen years, and I think that change has been incredible; not in a good or bad way in particular, it’s just been change. 

For a lot of people who prefer a small scrappy group, it’s probably been a negative thing, but for people who prefer a more mature industry that they can grow into and make a career out of, it’s been a positive change. And I think that, as Drupal grows, we’re going to be seeing more and more focus on that maturity, that focus on stability. One of the things that we hear all the time now in the Drupal community is that we’re much more focused on predictable releases, on backwards compatibility, easier migrations, all of the stuff that you focus on when your focus is much more on stability.

Because, given the kind of industries that we’ve grown into, stability and predictability are super important. And I think that theme is going to continue to grow over the years, not that we won’t have new features, but I think that the turnaround time on them is going to continue to be more stretched out; we’re already seeing this now with Drupal 9, for example.

The experimental modules are also interesting, as they’ve allowed us to get new features into core in the middle of the release cycle, e.g. the Content Moderation came in and Migration and stuff like that. This takes a long time, however; Content Moderation is a functionality that’s been in development for years in contrib, it’s not as straightforward as somebody just writing a patch and whipping that out in 3 months.

Getting new functionality into core is a very long process that demands lots of testing, and even then the module has to have the “experimental” status for 6 months. All these smaller processes make the overarching process longer, but they also make it more transparent, predictable and stable - it’s essentially just a trade-off. 

6. What are some of the contributions to open source code or to the community that you are most proud of?

I wrote a lot of code for the Configuration Management initiative, and even though that code ended up all getting thrown away and rewritten, to me the mere ability to put that on a path to getting done is extremely satisfying. I ran it for about 2 years and then I handed it off to Alex Pott who ran it for about 2 years before Drupal 8 got released. 

Getting all of those concepts in place and putting together a team of people to get those concepts in place and get it working and rolling forward is something that I’m really happy with. And it was also really great because it represented the end of a long period for me; I started with configuration management as my niche that I dug into in the Drupal 5 era, and then to see that all the way through to getting done in Drupal 8 was really satisfying for me. 

Recently, working with the Drupal Diversity and Inclusion group has been really satisfying. It’s a truly amazing group of people who are really interested in growing our community in positive ways, and making sure our community is open to everybody who wants to contribute and welcoming to everybody who wants to contribute. 

I think this is going to be more and more important going forward, because as Drupal becomes a global enterprise, we need to be able to bring all of those voices in to speak. Even Dries is starting to talk about how important that is now (e.g. in Seattle in his keynote). 

I think that work is really important and I’m really glad to see more focus on the community management side, because traditionally Drupal has been a place where we bring in contributors and then we kind of burn through them. We need to realize that contribution is hard and takes a lot of time, and focusing on how we can make that contribution cycle more healthy for people is really crucial to sustaining the community - so, in whatever ways that comes in or works towards is really great. 

I think that anything we could do to make the Drupal community more welcoming to people is going to be really important. Obviously, growing the community is important, but so is bringing in different voices and viewpoints, so that we can make the community more open and more interesting and really bring in all of the wonderful differences we have in the world.

7. Is there an initiative or a project in Drupal space that you would like to promote or highlight?

Well, the one that we just talked about, of course! Anybody who’s interested in the Drupal Diversity and Inclusion initiative can join the DDI channel (diversity-inclusion) on Slack. That’s where a lot of that discussion happens; there are weekly meetings on Thursdays, and from there you can get links to their website and other similar resources. 

I’ve also been really interested in the work on the Drupal product side for Layout Manager recently. I was a little skeptical of that when it first came out, but we’ve been using it on a couple of client projects and I’ve been really impressed with it. I think that it’s going to fill a lot of gaps and needs in the Drupal community. 

While the UI is still a little rough, I think that once the usability gets some polish on it, it’s going to be a really important thing for Drupal going forward. I’ve been really really pleased to see how that’s been working, and the clients just adore it; every time we demo it for a client, they completely freak out, so, I’m really looking forward to seeing how that progresses.

8. Is there anything else that excites you beyond Drupal? Either a new technology or a personal endeavor. 

I don’t get into a lot of technology stuff out of work anymore these days as a hobby. My biggest hobby outside of Drupal is pinball; I’ve been playing it competitively for 25 years and I’ve always been very involved in the local community here in Portland, Oregon. 

I’ve been really involved in that for a really long time - running tournaments and playing in tournaments, I was also the state representative for the group that runs the pinball rankings for many years, and recently I’ve gotten really into fixing up and repairing pinball machines, which has been really cool. It’s very physical and manual, not at all like working on your computer, so it kind of like rubs a different part of my brain than computer work does - and then when you’re done, you have something fun you can play, which is really cool. 

But I will say that one of the nicest things about running tournaments and being involved, similarly to Drupal, in the pinball community, is that you can build the community that you want to see. One of the things I’ve really done in Portland is trying to bring together a different set of voices to help run tournaments, to be the face of the community here, to create welcoming and safe spaces for people. And we have seen, for instance, the number of women that we have competing in tournaments here grow by leaps and bounds as a result of that work, and that’s been extremely gratifying too. 
 

Aug 22 2019
Aug 22
ReThink Orphanages

ReThink Orphanages is a courageous organization with a benevolent charge, grand ambition, a network of high-powered partners, and a commitment to make the world a better place. The organization is…

Visit Site Witness.org: Using Video to Document and Tell Stories

Witness.org is a Brooklyn-based non-profit that “…makes it possible for anyone, anywhere to use video and technology to protect and defend…

Visit Site NWETC

The Northwest Environmental Training Center (NWETC) is an organization that is committed to helping environmental professionals improve their career opportunities. The organization focuses on two…

Visit Site Fred Hutchinson Cancer Research Center eagle-i Integration About Fred Hutchinson Cancer Research Center

Fred Hutchinson Cancer Research Center’s (FHCRC) Shared Resources core facilities support biomedical research by providing services and expertise that…

Visit Site University of Washington Center for Reinventing Public Education (CRPE)

We originally partnered with CRPE's in-house web manager in 2012. He was familiar with the content management aspect of Drupal, but needed a bit of support with the more intricate ways that Drupal…

Visit Site Seattle Humane Society

We began working with Seattle Humane Society in February of 2016. They had reached out to us because they needed some emergency help with their website. For some reason, their site was reverting…

Visit Site Middle East Policy Council

The Middle East Policy Council (MEPC) is a 501(c)(3) nonprofit organization founded in 1981 whose mission is to contribute to American understanding of the political, economic and cultural issues…

Visit Site American College for Healthcare Sciences (ACHS)

ACHS originally partnered with Freelock in August of 2011 to perform some easy wins on their site. We started with a Freelock Site Assessment and code review of their Drupal 6 website. After that…

Visit Site Peninsula College and Athletics Sites

Peninsula College reached out to us in 2012 for some emergency work related to performance issues on their site and problems with their site crashing. Once we were able to jump in and diagnose, we…

Visit Site Appliance Standards Awareness Project - ASAP

Appliance Standards Awareness Project (ASAP) organizes and leads a broad-based coalition effort that works to advance, win and defend new appliance, equipment and lighting standards which deliver…

Visit Site Georgetown University Qatar

We were contacted by the Georgetown University team in Qatar in late November 2016 regarding several of their sites that were using the Drupal Domain Access module. They had several requirements…

Visit Site Seattle Children's Alliance

Another Drupal 8 site upgrade! In June of 2016 we were approached by Seattle Children’s Alliance for a Drupal 5 to Drupal 8 migration. Their main concern was that their Drupal 5 site modules…

Visit Site Fred Hutchinson Cancer Research Center – HANC

We started working with another team at Fred Hutch in early 2016. They contacted us after they were needing some local TLC with their HIV/AIDS Network Coordination global CRM database (…

Visit Site DIYZ.com

We were approached in late 2015 by a marketing/design agency to take over this project. Their Drupal developer was on her way out and they needed some Drupal expertise. The website was just in…

Visit Site Jim Ovia Foundation

We originally worked with the main Jim Ovia stakeholder on a separate project, when she worked with a different giving organization. She then reached out to us in September of 2015 to let us know…

Visit Site World Vision Knit for Kids

When the team at World Vision approached us in late 2015 to work on a few of their Drupal sites, they also had their Knit for Kids website that was in WordPress. They wanted to take that site and…

Visit Site Second Nature Sports

Second Nature Sports is owned and operated by the same folks over at Locker Soccer Academy, so when their previous developer was closing shop, it was just natural to have them roll this site into…

Visit Site Locker Soccer Academy

In December of 2013, our friends at Locker Soccer Academy reached out to us regarding their soccer academy sites, which were already developed by a shop in Columbus, Ohio. The development shop was…

Visit Site World Vision mEducation Alliance

The product owners of the mEducation Alliance website contacted us in September, 2015 to provide ongoing monthly Drupal security and module updates. The site is a partnership between…

Visit Site World Vision Chinese/Korean Websites

World Vision decided to partner with Freelock in September of 2015 for their Chinese and Korean websites. Headquartered in Federal Way, Washington – this was a perfect fit! We took…

Visit Site Queen City Yacht Club

Our friends at Queen City Yacht Club approached us in April of 2015 regarding their Drupal 5 website. It was at the end of life and they wanted to upgrade. Their motivations were that they wanted…

Visit Site National Center for Science Education

In June of 2015, our colleague recommended our Drupal maintenance services to the National Center for Science Education. Our colleague was looking forward to a large project and just didn't have the…

Visit Site Snoqualmie Tribe

Snoqualmie Tribe contacted us in December of 2014 in desperate need to secure their website. It turns out, they were susceptible to the Drupalgeddon attack and needed the Drupal 7.34 core…

Visit Site Lease Crutcher Lewis

In December of 2013, we were contacted by Lease Crutcher Lewis to take over their hosting. Their site is a great contender for Drupal 8! They were also interested in our Drupal maintenance plan…

Visit Site Bonavita World

In early 2014, our friends at Bonavita came to Freelock requesting that we help manage their main website Espresso Supply. Soon thereafter, they wanted to launch a website for one of their brands…

Visit Site Washington Housing Alliance Action Fund

The Washington Low Income Housing Alliance Action Fund first talked to us about building a new site last October, but they were not ready to proceed. This summer they were finally ready, and used…

Visit Site Makah Community Portal About the Makah Tribe

The Makah Tribe is located in Neah Bay, Washington. They had a custom portal for the Makah Tribe and the Neah Bay Community. Their website is the hub for the community,…

Visit Site Max Dale's Steak and Chop House

Max Dale's Steak House is a popular restaurant a few dozen miles up the road in Mt. Vernon, Washington. If you're not from this state, you might have heard of Mt. Vernon when a major Interstate…

Visit Site IslandWood

About Islandwood

IslandWood is a nonprofit educational…

Visit Site Northwest Wall & Ceiling Bureau

Freelock built an informational website for the Northwest Wall & Ceiling Bureau (NWCB), an international trade organization for the wall and ceiling industry. We delivered a snappy…

Visit Site Totem Ocean

Freelock teamed up with Eben Design to build a new site for Totem Ocean Trailer Express, a shipping company that has two voyages between Tacoma and Anchorage each week. In addition to being a…

I-TECH

The International Training and Education Center for Health (I-TECH), a non-profit collaboration between the University of Washington and the University of California, San Francisco, came to…

Bellingham School District

For the Bellingham School District, Freelock put together a large, multi-faceted Drupal site for district-wide information and standardized sites for schools within the district containing…

Visit Site Olympic Peninsula Tourism Commission

The Olympic Peninsula Tourism Board came to us in early 2009 to assist them in developing a visually stunning and highly functional website in a Content Management System. The site needed to be…

Latitudes in Learning

Latitudes in Learning creates personalized learning experiences that engage people and organizations in meaningful interactions with the world. Learning is one of the most difficult activities in…

Visit Site World Class Hunting The Project

The owner of World Class Hunting approached Freelock to launch their website. They had worked with Drupal on a previous project, so they were familiar and enjoyed Drupal. The project…

Visit Site Joey Klein The Project

Joey Klein's team approached Freelock to "rescue" their website. Through the course of their planning stages, the original developers found that they had reached a point where they did…

Visit Site LedgerSMB Web Site

LedgerSMB is an open source accounting and financial package. Freelock has used LedgerSMB for its bookkeeping practically since the project began, forking an earlier open source project called…

Visit Site Hydra Content Management System

Freelock worked closely with Consumer Media Network to create a content management system (Hydra) through which they can track assignments and submissions. We implemented a custom workflow system…

Visit Site DanceSafe

DanceSafe is a non-profit, harm reduction organization promoting health and safety within the rave and nightclub community. Local chapters consist of young people from within the dance culture…

Visit Site Answers for Elders

Freelock developed an informational website for Answers for Elders. Answers for Elders is an online resource for adults caring for elderly parents. Branded as the "Boomers' Online Community to…

Visit Site Geocko About Geocko

Geocko builds powerful tools that save time and increase engagement for nonprofit organizations. They simplify tasks, so organizations can stay focused on running programs and…

Visit Site I-TECH Drupal 7 Upgrade About I-TECH

The International Training and Education Center for Health (I-TECH) is a center in the University of Washington's Department of Global Health. I-TECH is headquartered in Seattle with…

Visit Site Nightingale-Bamford School About Nightingale-Bamford School

The Nightingale-Bamford School is an independent all-female university-preparatory school founded in 1920.  With grades K-12, NBS is one of the top-ranked private…

Visit Site Lindbergh Gallery About Lindbergh Gallery

Lindbergh Gallery is Erik Lindbergh's venue for selling aerospace sculptures, furniture and other custom art pieces he creates. Erik Lindbergh is the grandson of Charles…

Visit Site RevEquip About RevEquip

The RevEquip team is composed of foodservice consultants experienced in Revit and the creation of foodservice documents.  Revit Families are developed to meet the standards…

Visit Site Nia Technique, Inc. About Nia Technique, Inc.

Nia is a sensory-based movement practice that draws from martial arts, dance arts and healing arts. Nia Technique Inc is built…

Visit Site Crossfit Games/PLY Interactive

In December 2011, Ply Interactive came to Freelock for assistance theming the Crossfit Games site, because of our Drupal expertise and the tight schedule for the project. This site was built from…

Visit Site Alaska Fishing Jobs Center

Scott Coughlin, a 24-season veteran of Alaska's salmon, herring and halibut fisheries, came to Freelock for help getting his site done. We picked up the pieces of the development project, yet…

Visit Site West Seattle Family Zone

The folks at WSFZ needed a directory-based website that was easy to maintain and easy to use. Catering to the families of this popular Seattle-area neighborhood, West Seattle, WSFZ wanted a clean…

Visit Site Cool Day Trips

Freelock was approached to build a website focused on the “Top 50” Day Trips from Seattle, based on rankings and reviews from users. Cooldaytrips.com includes a description of each destination,…

Visit Site Maltby Produce Markets, CSA

Our friends at Flower World came back to us for their latest project, Maltby Produce Markets CSA. Maltby Produce Markets grow a wide variety of fruits and vegetables in their fields, orchards,…

Visit Site Littlestar Prints

Littlestar Prints is one of our favorite sites. It combines some powerful drag-and-drop photo editing in the browser with e-commerce. We got to use both of our favorite software packages--Drupal…

Visit Site Organic Materials Research Institute (OMRI)

Another large scale project by Freelock Computing, the Organic Materials Research Institute tested our abilities to work with a variety of sources and the Drupal system. The project brought their…

Visit Site TerraBella Flowers - Organic Florist in Seattle

TerraBella Flowers & Mercantile specializes in European garden-style designs, using an assortment of local, organic, and sustainably grown flowers and is based in the Greenwood neighborhood…

Visit Site RadioFrame Networks

RadioFrame Networks came to Freelock in late 2008 for a web development project aimed at bringing their existing corporate static site content into the Joomla CMS. We worked with their existing…

WestSide Baby

WestSide Baby, in partnership with the Puget Sound community, provides essential items to local children in need by collecting and distributing diapers, clothing, toys and equipment. They partner…

Visit Site Booktrope

Booktrope's goal of getting online books to as many people as possible (for free!) was a project right up our alley, fitting nicely with our open source foundation.

From the beginning, we…

Visit Site Robinson Newspapers

Robinson Papers hired us to build a centrally-managed web site with different personalities for each of their papers but shared control. We built out a sophisticated system in Drupal, with…

Visit Site Riverview Community Church

Riverview originally came to Freelock to design and develop a website that would serve the needs of their rapidly growing church community. Originally a simple Joomla deployment, we have since…

Visit Site Phytec America

In early 2008, Phytec America brought Freelock on to maintain its Linux server systems and add an additional server. We configured a new server as an internal mail and file server, and converted…

Visit Site BlueView Technologies

BlueView Technologies initially hired us for Linux server administration in 2007. We have worked extensively with BlueView, expanding their IT infrastructure from one server which did everything…

Visit Site Running Wild Spirit Zen Cart

Running Wild Spirit is one of our earliest Zen Cart projects and has greatly contributed to the success and growth of their business. Based out of the small town of Maltby an hour and a half away…

Visit Site Seattle Jobs Scraper System

SeattleJobs.org hired Freelock to implement a system to scrape job listings from their customers' web sites. SeattleJobs.org lists jobs from about 45 member companies. In order for Seattle Jobs…

Visit Site Outdoor Research

Outdoor Research, an outdoor gear manufacturer, hired us to build their main public web site, along with a full administrative back end. We implemented a system that imports product data from…

Visit Site P5 Group Website

When P5 Group was looking to create a dynamic website, they looked to Freelock Computing to help. Using Drupal as a CMS platform, we created a site with multiple access levels for their wide…

Visit Site An XML-based Report Browser

One of Freelock, LLC's ongoing customers needed a web front-end for a proprietary reporting tool. This reporting tool could be configured to generate reports that ended up in Microsoft Excel.…

Custom Client Extranet

One of Freelock, LLC's ongoing customers needed a password-protected site that actually provided different content to different users, depending on the company or user group the user is part of…

Single-Source Help System

Over the course of creating help systems for multiple clients, Freelock, LLC developed a set of scripts that manage browse sequences, tables of contents, and cross-references for help systems.…

Aug 22 2019
Aug 22

22 Aug

Nick Veenhof

1xINTERNET made a great post about calculating how much you are giving back and contributing to Open Source. Open Source makes our business model possible, and some people in the company, certainly yours truly, have made a career out of it. Our CEO even started Dropsolid because he thought we could approach this differently. If this makes a difference for us, why not for some of our employees as well, and could it even have a big impact on recruitment?

Delivering Open Digital Experiences with Open Source components on our own Dropsolid Platform is also very interesting for clients. We are able to deliver quality projects for a budget one could once only dream of. The downside is that you are not the only one with this magical ability, so it is a competitive landscape. The question then is how you can stand out, how you can make that difference.

Let's honor 1xINTERNET by doing our math.

How do we contribute

Our contribution to the Drupal project can be divided into the same 3 areas:

  • Community work
  • Events, sponsorships, and memberships
  • Source code

Community work

During the last year and even this year we spent a lot of time helping the Belgian Drupal community. We organised Drupal User Group meetings, helped organise Drupalcamp Belgium, and are currently actively involved in organising Drupal Developer Days. Next to that, yours truly is also active with the European Splash Awards as an organisational member and as Chair of the Belgian Drupal Community. We also had several speakers from Dropsolid at all these events, and I've volunteered as Drupalcon Track Chair or Co-Chair for the DevOps track, at both Drupal Europe and Drupalcon Amsterdam. Attending those meetings during the day or evening takes time, but it is well worth it. Next to that, we gave back to the community by going to Drupalcamp Kiev and Drupal Mountain Camp to share knowledge.

Taking all of the above into account, time spent on community activities adds up to around 1 FTE. This includes the time spent organising events, preparing talks, arranging those sponsorships, and facilitating our employees' attendance. For your information, in 2018 we had on average 65 people working for Dropsolid.

Sponsorships and memberships

Since 2018 we have invested in the following events and memberships:

  • Silver Sponsor for Drupal Europe 2018
  • Gold Sponsor for Drupalcamp Belgium
  • Gold Drupaljam XL 2019
  • Diamond Sponsor Drupalcon Amsterdam 2019
  • Organisation Member of Belgian Drupal Community
  • Organisation Member of the Drupal Association
  • Supporting Partner of the Drupal Association
  • Donation to the Promote Drupal Fund
  • Employing a provisional member of the Drupal Security Team

In total, we spent close to 1% of our total yearly spend on sponsorships, travel, and memberships related to the Drupal project.

Drupal.org contributions

Without a doubt, we actively contribute back to the Drupal.org source code. Some contributions also come from meetings at community events and other important milestones that matter to the community. All efforts matter, not just code.

We contributed our complete install profile, Dropsolid Rocketship, and all the paragraphs that come with it. Dropsolid has a dedicated R&D team that consists of 8 people: a Drupal back-end and front-end R&D engineer, a couple of DevOps engineers, 2 machine learning geniuses, and myself. Next to that, we supported one of our employees in his personal pet project, drupalcontributions.org. Let's take a look at that site to come up with some numbers.

We support over 30 projects and have over 207 credits, with 58 credits and 181 comments in the last 3 months alone. I dare say we are ramping it up.

We currently rank number 42 among the organisations that contributed the most back to Drupal in the last 3 months. Compared to 1xINTERNET we have a bigger headcount, but I am just as proud as they are to see this happen and to make an impact.

Not only do we contribute with our own agenda, we also contract Swentel to spend 1 day a month on Drupal contributions without any agreed agenda. Just supporting that 1 day a month accounted for 17 comments in the last 30 days alone. The point is that counting credits alone isn't fair: it's how you come together and find a common solution. Communication is key, and we support it with hard-earned money.

How much did we contribute since 2018?

So, of the 8 people in R&D, we contribute at least 1 FTE back to the Drupal ecosystem. Combined with the contributions that flow back from client projects by our development teams and the core contribution sponsorship for Swentel, we easily get to 2 FTEs contributed back on a total headcount of 65.

If we add up the efforts in community work, sponsorships, and memberships next to the payroll, Dropsolid contributed the equivalent of roughly 4.5% of our annual spend in 2018. With Drupalcon Amsterdam 2019 coming up and Dropsolid as a Diamond Sponsor, this will be even bigger. We're aiming to send more than 20 payroll employees to Drupalcon and have 3 selected sessions! And we're not cutting back on our efforts to contribute back, on the contrary!

War on Talent

Being visible in the ecosystem with code contributions, event contributions, and sponsorships helps make us a company people want to work for. Obviously this is not the only motivation, but given that around a quarter of our technical teams consists of remote developers, a good company brand is important. We see that this works and that it helps us attract and retain highly skilled employees. We now have employees in countries such as Portugal, Slovenia, Poland, Ukraine, Romania, Bulgaria, ...

Lead, don't just follow

By actively participating, whether through code, events, or anything else that shapes the future of Drupal, or of Digital Experiences for that matter, you stay ahead of the curve. This allows for more competitive offers to clients and helps you win deals that might not be possible as a follower of the pack. Dropsolid believes in taking the lead when it comes to being the best Drupal partner in Greater Europe, and is proud to believe we are. Up to you to believe us or not ;-)

Are you a client, or looking for a job, and interested in hearing more? Don't hesitate to contact us.

Aug 22 2019
Aug 22

The first part of this article described why and how the stakeholders of a project can contribute to Drupal. This developer-oriented article is a summary of the Drupal.org documentation for new code contributors. We will cover: how to work on the issue queue, how to publish a project, and how to approach this process with Drupal 9 in mind. 

Before we start, let’s just remember two things:

  • Contributing is easy and not only for senior developers. Even the smallest contribution matters; it is the sum of those contributions that makes the community and the codebase strong.
  • You are part of a community. A great way to get started with contribution is to find a mentor at a Drupal event, and there is always someone ready to help on the Drupal Slack #contribute channel.

The Drupal.org issue queue

While working on client projects, you might encounter core or contributed bugs or you may need a new feature.

Chances are that this issue has already been discussed in the issue queue, which covers Drupal core and all contributed projects. If your issue is not yet in the queue, use the issue summary template to file one. You might also check this page about creating good issues.

The main process for an issue is Active > Needs work > Needs review > Reviewed and tested by the community (RTBC) > Fixed > Closed.

When the issue is fixed or closed, the project maintainer will include the changes in the codebase (dev branch) and will most probably wrap it, together with several other closed issues, into a release. Read about the semantic versioning model.
If you need the patch before the release, you can include it in your client project with Composer.
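
For example, with the cweagans/composer-patches plugin, a patch from the issue queue can be pulled into your build. A minimal sketch, where the issue number, description, and patch URL are placeholders:

{
  "require": {
    "cweagans/composer-patches": "^1.6"
  },
  "extra": {
    "patches": {
      "drupal/core": {
        "Issue #1234567: Short description of the fix": "https://www.drupal.org/files/issues/2019-08-22/1234567-10.patch"
      }
    }
  }
}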

If the issue has not been reviewed/tested yet, just make sure that you are using the right version of the Drupal core or contributed project.

Project Version

From there, the main tasks you can contribute to as a developer are:

  • When the issue needs review: manual or automated testing
  • When the issue needs work: re-roll or create a patch.

We will summarize the process of creating a patch below:

You can also find novice issues and continue your reading with the Novice code contribution guide and the New contributor tasks: programmers.

Create a patch

Now that the codebase is on GitLab, there has been a proof of concept for fixing a trivial issue on Drupal.org using only the issue queue and GitLab.

For most cases, you still need to use the following flow.

Create the patch from the issue branch

The branch to use is the one indicated in the Version field on the issue.

Drupal core: https://www.drupal.org/project/drupal/git-instructions
Contributed project: click on the Version control tab

Version control tab

Then follow the git instructions on the page to create the patch. The basic steps are listed below, with a consolidated example after the list:

  • Create a new branch: git checkout -b [patch_branch]

  • Make your changes then review them with git diff

  • Add your changes with git add

  • Output them as a patch file
    git diff [base_branch] > [description]-[issue-number]-[comment-number].patch
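
Put together, for a hypothetical contributed module, the flow might look like this (project name, branch, issue number, and comment number are placeholders):

# Clone the development branch shown on the project's Version control tab.
git clone --branch 8.x-1.x https://git.drupalcode.org/project/examplemodule.git
cd examplemodule

# Work on a dedicated branch for the issue.
git checkout -b 1234567-fix-something

# ...edit files..., then review and stage the changes.
git diff
git add .

# Export the changes as a patch file named after the issue and comment number.
git diff 8.x-1.x > fix-something-1234567-8.patch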

Submit the patch

Once your changes have been made, upload the patch to the issue, then add a comment that describes your solution and anything that will facilitate the review, like screenshots or an interdiff.

If there are unit tests available, request a test run for your patch, then set the issue status to “Needs review”.

Needs review

Contribute a full project

Workflow

There are several ways to go, but the following procedure works well for us when working on client projects with a module (or theme) that could be contributed.

The first iteration is built straight in the client project repository, under a generic name, until testing of the related ticket is finished, so that:

  • reviewers can check all the code in one place and most likely they already have an environment running with exhaustive data for testing.
  • if the ticket is reopened, the process is simplified (there is no release to maintain or Drupal.org process involved).

Using a generic name still favours reuse in other client projects if we decide not to contribute it on Drupal.org yet: we can isolate the module in a dedicated repository, then include it via Composer, as sketched below.
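
As a sketch, assuming the module lives in a private Git repository with a main branch, and declares drupal/generic_module as its package name in its own composer.json, the client project could pull it in like this (URL and names are hypothetical):

{
  "repositories": [
    {
      "type": "vcs",
      "url": "git@git.example.com:our-team/generic_module.git"
    }
  ],
  "require": {
    "drupal/generic_module": "dev-main"
  }
}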

Publish it on Drupal.org

1. Make sure that your project contains the following files (a minimal sketch of the first two follows the list)

  • The composer.json file that contains the project metadata + system requirements (e.g. ext-json), third party libraries (e.g. solarium/solarium) and Drupal dependencies (e.g. drupal/search_api).
  • The [your_project].info.yml file with the Drupal dependencies, prefixed by their namespace (e.g. drupal:node or webform:webform_ui).
  • A README with the use case, the dependencies, related projects (and how this one differs from them), and a roadmap if the project is not feature complete.
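
As a minimal sketch for a hypothetical my_project module (names, versions, and dependencies are illustrative), the first two files could look like this:

composer.json:

{
  "name": "drupal/my_project",
  "description": "Integrates Search API with an external service.",
  "type": "drupal-module",
  "license": "GPL-2.0-or-later",
  "require": {
    "ext-json": "*",
    "drupal/search_api": "^1.14"
  }
}

my_project.info.yml:

name: 'My project'
type: module
description: 'Integrates Search API with an external service.'
core: 8.x
dependencies:
  - drupal:node
  - search_api:search_api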

2. Fix and check coding standards

  • Autofix with phpcbf: phpcbf --standard=Drupal --extensions=php,module,inc,install,test,profile,theme,css,info,txt,md /path/to/project
  • Check what needs to be fixed manually: phpcs --standard=Drupal --extensions=php,module,inc,install,test,profile,theme,css,info,txt,md /path/to/project (the Drupal standard itself ships with drupal/coder; see the note below)
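
A common way to make the Drupal and DrupalPractice standards used by these commands available in a project is:

composer require --dev drupal/coder
./vendor/bin/phpcs --config-set installed_paths vendor/drupal/coder/coder_sniffer
./vendor/bin/phpcs -i   # should now list Drupal and DrupalPractice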

3. Create the project page

Most likely, it will be a Module or a Theme: https://www.drupal.org/project/add

Give it a name and a machine name.
Add the key points: project classification, description (the project README is a good start) and organization. Also make sure to add other maintainers if you are working in a team.

4. Create a first, unversioned, dev release

  • Create a dev branch (e.g. 8.x-1.x) and push the code
  • Follow instructions on the Version control tab.
  • It is not mandatory, but it is recommended to do a clean installation with the latest release of Drupal, so you are sure of two things: all the Composer-based dependencies resolve properly, and there are no side effects caused by the client project.
  • At this stage, other developers might include your code in the project with composer require drupal/your_project:1.x-dev
  • On your client project, as there is no semantic version release yet (e.g. 8.1.0), it is a good habit to add the commit hash so that you keep control over the codebase that is actually used: composer require drupal/your_project:1.x-dev#commit

5. Create intermediate versioned releases

When the development is ready, you might create alpha then beta releases.
Beta releases should not be subject to API changes.

  • Check out the latest dev branch commit: git checkout 8.x-1.x
  • Tag it: git tag 8.x-1.0-alpha1
  • Push it: git push origin tag 8.x-1.0-alpha1
  • Create the release and document it. This article from Matt Glaman provides great information about creating better release notes.

6. Create a stable versioned release

When there are no open bugs and the project is feature complete, a first stable release will tell site builders that the module is ready to be used (e.g. 8.1.0). Follow the same steps as in step 5.

7. Apply for permission to opt into security advisory coverage

Follow the security review process.

8. Create a new major release

If there are backward compatibility changes, creating a new major release (e.g. 8.2.x) will indicate to site builders and developers that there are BC breaks.

Read more with Best practices for creating and maintaining projects.

Be prepared for Drupal 9

TL;DR: Drupal 9 should be released in June 2020. The first version will be about removing deprecated APIs and getting new versions of third-party libraries (Symfony 4 or 5, Twig 2, …). So, if a Drupal 8 project does not use deprecated code, it will work in Drupal 9 :)

Drupal 8.7 started the deprecation work and 8.8 will define the full deprecation list.

Code contribution is needed there as well: in core, with issues like deprecating legacy include files, or by helping contributed modules become Drupal 9 ready and fixing their Drupal 9 compatibility issues.

Codebase analysis tools

Several tools are here to help if you are contributing to issues or maintaining a project.

The Upgrade Status module scans the code of the contributed and custom projects you have installed, and reports on any deprecated code that must be replaced before the next major version.

Drupal-check, a static analysis tool based on PHPStan, will check for correctness (e.g. using a class that doesn't exist), deprecation errors, and more.
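
A typical way to install and run drupal-check on a project, assuming custom code lives under web/modules/custom, is:

composer require --dev mglaman/drupal-check
./vendor/bin/drupal-check -d web/modules/custom   # deprecation checks
./vendor/bin/drupal-check -a web/modules/custom   # static analysis checks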

Further reading: Drupal 9 and how to prepare for it, and an analysis of the top uses of deprecated code in Drupal contributed projects in May 2019.

Developer toolbox

Here is a brief recap of the tools that might help you while contributing: Git and each project's Version control instructions, Composer (for applying patches and requiring dev releases), phpcs and phpcbf with the Drupal coding standard, interdiff for reviews, the Upgrade Status module, and drupal-check.

Other ways to contribute than code

There are more ways to contribute, even if you are not a developer: the Drupal community is also looking for help with documentation, design, usability testing, translations, event organisation, marketing, and mentoring.

Read more about contribution on the Getting Involved Guide.

Aug 21 2019
Aug 21

Today we will learn how to migrate content from LibreOffice Calc and Microsoft Excel files into Drupal using the Migrate Spreadsheet module. We will give instructions on getting the module and its dependencies. Then, we will present how to configure the module for spreadsheets with or without a header row. There are two example migrations: images and paragraphs. Let’s get started.

Example configuration for Microsoft Excel and LibreOffice Calc migration.

Getting the code

You can get the full code example at https://github.com/dinarcon/ud_migrations The module to enable is UD Google Sheets, Microsoft Excel, and LibreOffice Calc source migration whose machine name is ud_migrations_sheets_sources. It comes with four migrations: udm_google_sheets_source_node.yml, udm_libreoffice_calc_source_paragraph.yml, udm_microsoft_excel_source_image.yml, and udm_backup_csv_source_node.yml. The image migration uses a Microsoft Excel file as source. The paragraph migration uses a LibreOffice Calc file as source. The CSV migration is a backup in case the Google Sheet is not available. To execute the last one you would need the Migrate Source CSV module.

You can get the Migrate Spreadsheet module using composer: composer require drupal/migrate_spreadsheet:^1.0. This module depends on the PHPOffice/PhpSpreadsheet library and many PHP extensions, including ext-zip. Check this page for a full list of dependencies. If any required extension is missing, the installation will fail. If your Drupal site is not composer-based, you will not be able to use Migrate Spreadsheet unless you jump through a lot of hoops.
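
As a quick sanity check before requiring the module, you can confirm from the command line that the most common required extensions are enabled (the list below is abbreviated; check the dependency page for the full set):

php -m | grep -iE 'zip|xml|gd|iconv'
composer require drupal/migrate_spreadsheet:^1.0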

Understanding the example set up

This migration will reuse the same configuration from the introduction to paragraph migrations example. Refer to that article for details on the configuration. The destinations will be the same content type, paragraph type, and fields. The source will be changed in today's example, as we use it to explain Microsoft Excel and LibreOffice Calc migrations. The end result will again be nodes containing an image and a paragraph with information about someone’s favorite book. The major difference is that we are going to read from different sources.

Note: You can literally swap migration sources without changing any other part of the migration.  This is a powerful feature of ETL frameworks like Drupal’s Migrate API. Although possible, the example includes slight changes to demonstrate various plugin configuration options. Also, some machine names had to be changed to avoid conflicts with other examples in the demo repository.

Understanding the source document and plugin configuration

In any migration project, understanding the source is very important. For Microsoft Excel and LibreOffice Calc migrations, the primary thing to consider is whether or not the file contains a row of headers. Also, a workbook (file) might contain several worksheets (tabs). You can only migrate from one worksheet at a time. The example documents have two worksheets: UD Example Sheet and Do not peek in here. We are going to be working with the first one.

The spreadsheet source plugin exposes seven configuration options. The values to use might change depending on the presence of a header row, but all of them apply for both types of document. Here is a summary of the available configurations:

  • file is required. It stores the path to the document to process. You can use a relative path from the Drupal root, an absolute path, or stream wrappers.
  • worksheet is required. It contains the name of the one worksheet to process.
  • header_row is optional. This number indicates which row contains the headers. Contrary to CSV migrations, the row number is not zero-based. So, set this value to 1 if headers are on the first row, 2 if they are on the second, and so on.
  • origin is optional and defaults to A2. It indicates which non-header cell contains the first value you want to import. It assumes a grid layout and you only need to indicate the position of the top-left cell value.
  • columns is optional. It is the list of columns you want to make available for the migration. In case of files with a header row, use those header values in this list. Otherwise, use the default title for columns: A, B, C, etc. If this setting is missing, the plugin will return all columns. This is not ideal, especially for very large files containing more columns than needed for the migration.
  • row_index_column is optional. This is a special column that contains the row number for each record. This can be used as unique identifier for the records in case your dataset does not provide a suitable value. Exposing this special column in the migration is up to you. If so, you can come up with any name as long as it does not conflict with header row names set in the columns configuration. Important: this is an autogenerated column, not any of the columns that come with your dataset.
  • keys is optional and, if not set, it defaults to the value of row_index_column. It contains an array of column names that uniquely identify each record. For files with a header row, you can use the values set in the columns configuration. Otherwise, use default column titles like A, B, C, etc. In both cases, you can use the row_index_column column if it was set. Each value in the array will contain database storage details for the column.

Note that nowhere in the plugin configuration do you specify the file type. The same setup applies to both Microsoft Excel and LibreOffice Calc files. The library will take care of detecting and validating the proper type.

Migrating spreadsheet files with a header row

This example is for the paragraph migration and uses a LibreOffice Calc file. The following snippets show the UD Example Sheet worksheet and the configuration of the source plugin:

book_id, book_title, Book author
B10, The definitive guide to Drupal 7, Benjamin Melançon et al.
B20, Understanding Drupal Views, Carlos Dinarte
B30, Understanding Drupal Migrations, Mauricio Dinarte
source:
  plugin: spreadsheet
  file: modules/custom/ud_migrations/ud_migrations_sheets_sources/sources/udm_book_paragraph.ods
  worksheet: 'UD Example Sheet'
  header_row: 1
  origin: A2
  columns:
    - book_id
    - book_title
    - 'Book author'
  row_index_column: 'Document Row Index'
  keys:
    book_id:
      type: string

The name of the plugin is spreadsheet. Then you use the file configuration to indicate the path to the file. In this case, it is relative to the Drupal root. The UD Example Sheet is set as the worksheet to process. Because the first row of the file contains the headers, header_row is set to 1 and origin to A2.

Then specify which columns to make available to the migration. In this case, we listed all of them, so this setting could have been left unassigned. It is better to get into the habit of being explicit about what you import. If the file were to change and more columns were added, you would not have to update the migration to prevent unneeded data from being fetched. The row_index_column is not actually used in the migration, but it is set to show all the configuration options in the example. The values will be 1, 2, 3, etc. Finally, keys is set to the column that serves as the unique identifier for the records.

The rest of the migration is almost identical to the CSV example. Small changes were made to prevent machine name conflicts with other examples in the demo repository. For reference, the following snippet shows the process and destination sections for the LibreOffice Calc paragraph migration.

process:
  field_ud_book_paragraph_title: book_title
  field_ud_book_paragraph_author: 'Book author'
destination:
  plugin: 'entity_reference_revisions:paragraph'
  default_bundle: ud_book_paragraph

Migrating spreadsheet files without a header row

Now let’s consider an example of a spreadsheet file that does not have a header row. This example is for the image migration and uses a Microsoft Excel file. The following snippets show the UD Example Sheet worksheet and the configuration of the source plugin:

P01, https://agaric.coop/sites/default/files/pictures/picture-15-1421176712.jpg
P02, https://agaric.coop/sites/default/files/pictures/picture-3-1421176784.jpg
P03, https://agaric.coop/sites/default/files/pictures/picture-2-1421176752.jpg
source:
  plugin: spreadsheet
  # Path is illustrative; point it to the Microsoft Excel file shipped with the example module.
  file: modules/custom/ud_migrations/ud_migrations_sheets_sources/sources/udm_photos.xlsx
  worksheet: 'UD Example Sheet'
  header_row: null
  origin: A1
  columns:
    - A
    - B
  row_index_column: null
  keys:
    A:
      type: string

The plugin, file, and worksheet configurations follow the same pattern as in the paragraph migration. The differences for files with no header row show up in the other parameters: header_row is set to null to indicate the lack of headers, and origin to A1. Because there are no column names to use, you have to use the ones provided by the spreadsheet. In this case, we want to use the first two columns: A and B. Contrary to CSV migrations, the spreadsheet plugin does not allow you to define aliases for unnamed columns. That means that you would have to use A and B in the process section to refer to these columns.

row_index_column is set to null because it will not be used. And finally, in the keys section, we use the A column as the primary key. This might seem like an odd choice. Why use that value if you could use the row_index_column as the unique identifier for each row? If this were an isolated migration, that would be a valid option. But this migration is referenced from the node migration explained in the previous example. The lookup is made based on the values stored in the A column. If we used the index of the row as the unique identifier, we would have to update the other migration or the lookup would fail. In many cases, that is neither feasible nor desirable.

Except for the name of the columns, the rest of the migration is almost identical to the CSV example. Small changes were made to prevent machine name conflicts with other examples in the demo repository. For reference, the following snippet shows part of the process and destination section for the Microsoft Excel image migration.

process:
  psf_destination_filename:
    plugin: callback
    callable: basename
    source: B # This is the photo URL column.
destination:
  plugin: 'entity:file'

Refer to this entry to know how to run migrations that depend on others. In this case, you can execute them all by running: drush migrate:import --tag='UD Sheets Source'. And that is how you can use Microsoft Excel and LibreOffice Calc files as the source of your migrations. This example is very interesting because each of the migrations uses a different source type. The node migration explained in the previous post uses a Google Sheet. This is a great example of how powerful and flexible the Migrate API is.

What did you learn in today’s blog post? Have you migrated from Microsoft Excel and LibreOffice Calc files before? If so, what challenges have you found? Did you know the source plugin configuration is not dependent on the file type? Share your answers in the comments. Also, I would be grateful if you shared this blog post with others.

This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors. Contact Understand Drupal if your organization would like to support this documentation project, whether it is the migration series or other topics.

Aug 21 2019
Aug 21

Today we will learn how to migrate content from Google Sheets into Drupal using the Migrate Google Sheets module. We will give instructions on how to publish them in JSON format to be consumed by the migration. Then, we will talk about some assumptions made by the module to allow easier plugin configurations. Finally, we will present the source plugin configuration for Google Sheets migrations. Let’s get started.

Example configuration for Google Sheets migration

Getting the code

You can get the full code example at https://github.com/dinarcon/ud_migrations The module to enable is UD Google Sheets, Microsoft Excel, and LibreOffice Calc source migration whose machine name is ud_migrations_sheets_sources. It comes with four migrations: udm_google_sheets_source_node.yml, udm_libreoffice_calc_source_paragraph.yml, udm_microsoft_excel_source_image.yml, and udm_backup_csv_source_node.yml. The last one is a backup in case the Google Sheet is not available. To execute it you would need the Migrate Source CSV module.

You can get the Migrate Google Sheets module and its dependency using composer: composer require drupal/migrate_google_sheets:^1.0. It depends on Migrate Plus. Installing via composer will get you both modules. If your Drupal site is not composer-based, you can download them manually.

Understanding the example set up

This migration will reuse the same configuration from the introduction to paragraph migrations example. Refer to that article for details on the configuration. The destinations will be the same content type, paragraph type, and fields. The source will be changed in today's example, as we use it to explain Google Sheets migrations. The end result will again be nodes containing an image and a paragraph with information about someone’s favorite book. The major difference is that we are going to read from different sources. In the next article, two of the migrations will be explained. They read from Microsoft Excel and LibreOffice Calc files.

Note: You can literally swap migration sources without changing any other part of the migration.  This is a powerful feature of ETL frameworks like Drupal’s Migrate API. Although possible, the example includes slight changes to demonstrate various plugin configuration options. Also, some machine names had to be changed to avoid conflicts with other examples in the demo repository.

Migrating nodes from Google Sheets

In any migration project, understanding the source is very important. For Google Sheets, there are many details that need your attention. First, the module works on top of Migrate Plus and extends its JSON data parser. In fact, you have to publish your Google Sheet and consume it in JSON format. Second, you need to make the JSON export publicly available. Third, you must understand the JSON format provided by Google Sheets and the assumptions made by the module to configure your fields properly. Specific instructions for Google Sheets migrations will be provided. That being said, everything explained in the JSON migration example is applicable in this case too.

Publishing a Google Sheet in JSON format

Before starting the migration, you need the source from where you will extract the data. For this, create a Google Sheet document. The example will use this one:

https://docs.google.com/spreadsheets/d/1YVJt9isPNjkUNHf3YgoTx38r04TwqRYnp1LFrik3TAk/edit#gid=0

The 1YVJt9isPNjkUNHf3YgoTx38r04TwqRYnp1LFrik3TAk value is the worksheet ID which will be used later. Once you are done creating the document, you need to publish it so it can be consumed by the Migrate API. To do this, go to the File menu and then click on Publish to the web. A modal window will appear where you can configure the export. Note that it is possible to publish the Entire document or only some of the worksheets (tabs). The example document has two: UD Example Sheet and Do not peek in here. Make sure that all the worksheets that you need are published or export the entire document. Unless multiple urls are configured, a migration can only import from one worksheet at a time. If you fetch from multiple urls they need to have homogeneous structures. When you click the Publish button, a new URL will be presented. In the example it is:

https://docs.google.com/spreadsheets/d/e/2PACX-1vTy2-CGzsoTBkmvYbolFh0UDWenwd9OCdel55j9Qa37g_earT1vA6y-6phC31Xkj8sTWF0o6mZTM90H/pubhtml

The previous URL will not be used. Publishing a document is a required step, but the URL that you get should be ignored. Note that you do not have to share the document. It is fine that the document is private to you as long as it is published. It is up to you if you want to make it available to Anyone with the link or Public on the web and potentially grant edit or comment access. The Share setting does not affect the migration. The final step is getting the JSON representation of the document. You need to assemble a URL with the following pattern:

http://spreadsheets.google.com/feeds/list/[workbook-id]/[worksheet-index]/public/values?alt=json

Replace [workbook-id] with the worksheet ID mentioned at the beginning of this section, the one that is part of the regular document URL, not the published URL. The [worksheet-index] is an integer, starting at 1, that represents the order in which worksheets appear in the document. Use 1 for the first, 2 for the second, and so on. This means that changing the order of the worksheets will affect your migration. At the very least, you will have to update the path to reflect the new index. In the example migration, the UD Example Sheet worksheet will be used. It appears first in the document, so its worksheet index is 1. Therefore, the exported JSON will be available at the following URL:

http://spreadsheets.google.com/feeds/list/1YVJt9isPNjkUNHf3YgoTx38r04TwqRYnp1LFrik3TAk/1/public/values?alt=json

Understanding the published Google Sheet JSON export

Take a moment to read the JSON export and try to understand its structure. It contains much more data than what you need. The records to be imported can be retrieved using this XPath expression: /feed/entry. You would normally have to assign this value to the item_selector configuration of the Migrate Plus’ JSON data parser. But, because the value is the same for all Google Sheets, the module takes care of this automatically. You do not have to set that configuration in the source section. As for the data cells, have a look at the following code snippet to see how they appear on the export:

{
  "feed": {
    "entry": [
      {
        "gsx$uniqueid": {
          "$t": "1"
        },
        "gsx$name": {
          "$t": "One Uno Un"
        },
        "gsx$photo-file": {
          "$t": "P01"
        },
        "gsx$bookref": {
          "$t": "B10"
        }
      }
    ]
  }
}

Tip: Firefox includes a built-in JSON document viewer which helps a lot in understanding the structure of the document. If your browser does not include a similar tool out of the box, look for one in their extensions repository. You can also use a file formatter to pretty print the JSON output.

The following is a list of headers as they appear in the Google Sheet compared to how they appear in the JSON export:

  • unique_id appears like gsx$uniqueid.
  • name appears like gsx$name.
  • photo-file appears like gsx$photo-file.
  • Book Ref appears like gsx$bookref.

So, the header names from the Google Sheet get transformed in the JSON export: they get a gsx$ prefix and are converted to all lowercase letters, with spaces and most special characters removed. On top of this, the actual cell value that you will eventually import is in a $t property one level under the header name. Now, you should create a list of fields to migrate using XPath expressions as selectors. For example, for the Book Ref header, the selector would be gsx$bookref/$t. But that is not the way to configure the Google Sheets data parser. The module makes some assumptions to make the selector clearer: the gsx$ prefix and the /$t hierarchy are assumed. For the selector, you only need to use the transformed name. In this case: uniqueid, name, photo-file, and bookref.

Configuring the Migrate Google Sheets source plugin

With the JSON export of the Google Sheet and the list of transformed header names, you can proceed to configure the plugin. It will be very similar to configuring a remote JSON migration. The following code snippet shows source configuration for the node migration:

source:
  plugin: url
  data_fetcher_plugin: http
  data_parser_plugin: google_sheets
  urls: 'http://spreadsheets.google.com/feeds/list/1YVJt9isPNjkUNHf3YgoTx38r04TwqRYnp1LFrik3TAk/1/public/values?alt=json'
  fields:
    - name: src_unique_id
      label: 'Unique ID'
      selector: uniqueid
    - name: src_name
      label: 'Name'
      selector: name
    - name: src_photo_file
      label: 'Photo ID'
      selector: photo-file
    - name: src_book_ref
      label: 'Book paragraph ID'
      selector: bookref
  ids:
    src_unique_id:
      type: integer

You use the url plugin, the http fetcher, and the google_sheets parser. The latter is provided by the module. The urls configuration is set to the exported JSON link. The item_selector is not configured because the /feed/entry value is assumed. The fields are configured as in the JSON migration with the caveat of using the transformed header values for the selector. Finally, you need to set the ids key to a combination of fields that uniquely identify each record.

The rest of the migration is almost identical to the JSON example. Small changes were made to prevent machine name conflicts with other examples in the demo repository. For reference, the following snippet shows part of the process, destination, and dependencies section for the Google Sheets migration.

process:
  field_ud_image/target_id:
    plugin: migration_lookup
    migration: udm_microsoft_excel_source_image
    source: src_photo_file
destination:
  plugin: 'entity:node'
  default_bundle: ud_paragraphs
migration_dependencies:
  required:
    - udm_microsoft_excel_source_image
    - udm_libreoffice_calc_source_paragraph
  optional: []

Note that the node migration depends on an image and paragraph migration. They are already available in the example. One uses a Microsoft Excel file as the source while the other a LibreOffice Calc document. Both of these migrations will be explained in the next article. Refer to this entry to know how to run migrations that depend on others. For example, you can run: drush migrate:import --tag='UD Sheets Source'.

What did you learn in today’s blog post? Have you migrated from Google Sheets before? If so, what challenges have you found? Did you know the procedure to export a sheet in JSON format? Did you know that the Migrate Google Sheets module is an extension of Migrate Plus? Share your answers in the comments. Also, I would be grateful if you shared this blog post with others.

Aug 21 2019
Aug 21

Do you like people who are warm and friendly, or cold and hostile? You’ve got it right! I’m comparing interactive to non-interactive (static) websites here. In this increasingly digital generation, it isn’t sufficient to place some content on your website and wait for it to work its magic. Providing a web user experience without interactivity is like opening a store filled with inventory without a salesperson to interact with.
When you create an interactive website, you are forming a connection with your audience. It enables two-way communication on a medium where you cannot directly interact with a user. Studies have shown that people are more likely to convert on, return to, or recommend websites that are interactive. The Drupal CMS offers a wide variety of interactive themes and modules that can be easily adapted to your website and further customized.

What is an interactive website?

Put simply, an interactive website is a website that communicates and allows for interaction with users. And by interaction, we don’t just mean allowing users to “click” and “scroll”. Offering users content that is amusing, collaborative, and engaging is the essential objective of an interactive website. An interactive website design will not just display attractive content, it will exhibit interactive content: content that compels users to communicate and deeply engage with the website.
 

Interactive Website Designs Communicate & Engage with users

Why do you need one?

Today, all businesses in the digital market are racing to expand their audience. Most of them, however, forget that increasing traffic is simply not enough: retaining and engaging users is what converts. Engaging your users should be your prime motive, and for this you will first need an interactive business website.

  • Drives more engagement. Interactive business websites can make your website less boring, thus garnering more action. 
  • Users will spend more time on a website that interacts with them. This increases your conversion rate, decreases bounce rate and can boost the SEO of your website.
  • Develops a more personalized user experience that can result in happy users. 
  • Engaged users are more likely to maintain a long-term relationship with websites.
  • Interactive website designs can create lasting effects in user’s minds. This improves your brand awareness and reach. 
  • Interactive websites encourage users to recommend your website and link back to it.
  • More conversions mean a better chance of making a sale!

How to make interactive websites? 

Creating an interactive website from scratch is easier and more effective, as you envision and plan the customer journey from day one. Nevertheless, if you already have a website that you think is static or needs more interactive features, it is never too late. The first step is to define your business objectives and then identify the various touch points from which you can interact with your customers.
If budgets and timelines are constraints, you could also look at HTML5 interactive website templates (not recommended if you need customizations).
There are various interactive website features that can increase user engagement, but you should pick the ones that suit your business goals. For example, if you sell financial services, having an interest calculator on your website can prove very useful. Nonetheless, the most essential interactive feature that you just cannot ignore is responsiveness. Users will respond to your website on various devices only when it looks and feels presentable.
So what kind of interactive website features or elements can you utilize for your benefit?

  • Social Media Applications

There is no denying that social media marketing can give you visibility like no other marketing program if done right. Provide your users with an option to like and share your content on social media platforms like LinkedIn, Twitter, or Facebook, or just to follow your page. You can also display a live feed from your social media page to keep users updated.

  • Simple Interactive Tools

Offer your users simple interactive tools like quizzes, short games, math tools, tax calculators, etc. connected to your business objectives. Integrating simple software tools that provide your users with instant results has proven to boost user engagement.

  • Interactive Page Elements

You can enhance your page elements by adding something interesting and attractive to it. For example, colourful and dynamic hover-states on links or images, on-scroll or on-click loading/animation, navigation with clicks on image stories, and much more. Add videos or animations to say more about your business in an interactive way.

  • Forms and Feedback

Allowing users to get in touch with you via a contact form is a great way to connect with them. Not only does it let you increase your database of leads, it is a nice way of saying “We care”. Feedback forms let you identify your strengths and weaknesses via the best source – your audience!

  • Chat Widgets

What’s better than a live person chatting with you, answering all your questions about the products or services being offered?! That’s probably the highest level of interactivity you can offer on an interactive business website. If live chat sounds like too much commitment, you could also opt for chatbots that can be configured to answer predictable questions.

  • User-generated Content

Letting users add their content to your website is a great way to improve interactivity. This can be done in the form of comments (in your blog/articles section), inviting them to write guest posts or submit images, or even creating a small discussion forum.

  • Other interactive website Features

You can get creative with the interactive features you want for your audience but here’s a short list of commonly used interactive elements –

  • Google Maps makes you a more trustworthy brand and provides a great way to improve interaction, especially when the map is clickable.
  • Newsletters can keep your users coming back to your website for more updates.
  • Polls, and showing users the results of previous ones, help increase engagement.
  • Search functionality saves users the pain of navigating through your website.
  • Ratings can be a quick and interactive method of getting instant feedback that can improve your products/services/work.
  • Slideshows offer a great way to engage users and can make them want to keep going to the next image.
Interactive Website Features and Elements

Drupal for Interactive Websites

When you build your website with Drupal, you will come across multiple options in the form of modules and features that can instantly turn your static website into an interactive one. With Drupal 8, responsiveness comes out of the box, which means you don’t need any additional modules to make your Drupal website look great regardless of the device. In addition, there are a variety of modules that encourage interactivity, like Search API, the Contact forms module, social media modules, slideshow modules, Simplenews (newsletters), and much more!

Aug 21 2019
Aug 21

JS frameworks have changed quite a lot in Drupal, especially with the API-first concept added to the scenario. It is only to be expected that developers are inclined to learn more about JS and the related possibilities.

Recently, I was tasked with rendering blocks on a page while keeping individual components encapsulated on their own. JavaScript has the potential to make web pages dynamic and interactive without hindering page speed, so I decided to opt for progressively decoupled blocks.

I came across the Progressive Decoupled Blocks Drupal module, which allows us to render blocks in a decoupled way and seemed a perfect fit for the situation.

Anatomy of Progressive Decoupled Blocks

The module is a JavaScript-framework-agnostic, progressive decoupling tool that allows custom blocks to be written in the JavaScript framework of one’s choice. What makes it unique is that you can do all this without needing to know any Drupal API.

It keeps individual components compact in their own directories, containing all the CSS, JS, and template assets necessary for them to work, and uses an info.yml file to declare these components and their framework dependencies to Drupal.

Did it work?

I decided to test the module in the new Umami demo profile, which comes out of the box with Drupal, with React and Angular blocks.

I used the Layout module and picked a page to place these blocks on. This is one of the parts I liked best about the module: every piece of block content that needs to be placed on a page can be placed without even visiting the block layout page and setting visibility.

Module Architecture

The module has a custom block derivative, which searches for all components that have been added and exposes them as blocks. This architecture makes it super easy to add a new block.

You can refer to this file for the block deriver: https://git.drupalcode.org/project/pdb/blob/8.x-1.x/src/Plugin/Derivative/PdbBlockDeriver.php

Refer to this file for component discovery: https://git.drupalcode.org/project/pdb/blob/8.x-1.x/src/ComponentDiscovery.php

Possibilities

For React JS there are 2 examples in this module.

The first uses a simple React createElement call, without JSX. This example renders simple text inside a React component. It is a very good starter example for anyone who wants to understand how to integrate a React component with Drupal in a progressive way.

The second example is a ToDo app, which allows you to add and remove to-do items from a list. It makes use of local storage to store any item that has been added. This app creates React components, composes them into one another, and renders a fully functional DOM inside the Drupal DOM in a progressive way.

This example comes with a package.json whose dependencies need to be installed before it becomes functional. However, the component did not render perfectly on the Umami theme, so I made certain changes to make sure it renders correctly. The patch is attached at the end.

progressive decoupled block

I decided to extend this module and added a new component to render a banner block (which comes out of the box with Umami), exposing the banner content as an API via JSON:API. As this was a very simple block, I decided to create it without JSX. I also decided to generate the URL with a static ID in the path; this can be made dynamic, which I plan to do later in my implementation. I also chose the same class names to save on styling. The classes can be found in the Umami theme.

(function ($, Drupal, drupalSettings) {
  Drupal.behaviors.pddemo = {
    attach: function (context, settings) {
      // Fetch the banner content exposed via JSON:API.
      $.ajax(drupalSettings.pdb.reactbanner.url, {
        type: 'get',
        success: function (result) {
          var image = result.included[0].attributes.uri.url;
          var title = result.data.attributes.field_title;
          // Rebuild the Umami banner markup with React, without JSX.
          ReactDOM.render(React.createElement(
            'div',
            {
              className: 'block-type-banner-block',
              style: {backgroundImage: 'url(' + image + ')'}
            },
            React.createElement(
              'div',
              {className: 'block-inner'},
              React.createElement(
                'div',
                {className: 'summary'},
                React.createElement(
                  'div',
                  {className: 'field--name-field-title'},
                  title
                )
              )
            )
          ), document.getElementById('reactbanner'));
        },
        error: function (XMLHttpRequest, textStatus, errorThrown) {
          console.log(textStatus || errorThrown);
        }
      });
    }
  };
})(jQuery, Drupal, drupalSettings);


Angular JS

Angular JS comes with many more examples, both easy and complex. Things didn’t work as smoothly as they did with React; a couple of changes were required to make it work.

The patch is attached at the end; you can refer to this gist for it.

You will have to install the node modules first to make it work. Also, all the JS is written in TypeScript, which needs to be compiled to JS before it will work.

Conclusion

The module gives you a great kickstart for moving any block rendered by Drupal to a progressively decoupled block, using React, Angular, or Ember as the JS framework. However, you may want to extend the module or create your own components to render the blocks.
