Dec 14 2018

We want to get you better acquainted with the kind of company Agiledrop is, the practices we employ and the team spirit we cultivate. So, we’ve decided to start a series of blog posts that tell the story of how the company has managed to make a name for itself and form a team that major global agencies can trust and depend upon. 

In the first post of the series, we’ll present our workflow and the advantages such an approach brings, both in producing satisfied clients and in motivating our team to help each of us with our personal and professional growth.
 

The nature of the work at Agiledrop dictates a different approach to resource management. With the help of online tools and at least 2 hours of overlap for daily standup meetings, we have been working successfully with customers from all over the world.

Our clients are mostly Drupal agencies who need to expand their existing teams with experienced Drupal developers or agencies who don’t have their own development department but would like to manage their projects entirely in-house. 

After onboarding, our developers adopt the client’s best practices, tools and workflows. They are present at meetings and daily standups, and work alongside existing teams while being entirely managed by the client’s project managers. As such, they essentially become members of the client’s team for the duration of the project. 
 

What This Means for Our Team ...

The opportunity to work on a variety of projects with teams from all over the world helps our employees with their personal growth and development. It broadens their knowledge and experience, builds their confidence, and, most importantly, helps them grow professionally.

They have the opportunity to encounter environments with a variety of workflows, practices and skill sets, and the chance to work with people outside our company keeps their work interesting, never monotonous.
 

… And for Our Clients

From the clients’ perspective, they are able to fill their resource needs at any given time without a long and costly recruitment process, which means the pipeline for new projects can be more adaptive.

The stress typically caused by meeting deadlines or by unpredictable events, such as in-house developers taking sick leave, can be managed more easily or even eliminated entirely.

Most importantly, the clients can always rely on the collective knowledge and skill sets of the entire Agiledrop team working side by side with the developer they’ve hired.
 

Granted, such an approach is not without its unique challenges. In the next post of the series, we’ll deal with the first major obstacle that arose from our desire to provide only the best for our employees and our clients, and how we managed to very efficiently solve it, even turning it to our advantage. 

Be sure to check back for the next chapter of the series!
 

Dec 14 2018

Referred to as the de facto standard of e-learning, the Shareable Content Object Reference Model (SCORM) was sponsored by the US Department of Defense to bring uniformity to the standards for procuring both training content and Learning Management Systems (LMSs).

Long gone but not forgotten are the days when learning was limited to books and classrooms. With the development of technology, virtual learning has become an approachable and convenient method.

Can Drupal, which is a widely popular CMS for education websites, conform to SCORM standards? How does it ensure that it remains SCORM compliant? 



In Detail: What is SCORM?

SCORM is a set of standard guidelines and specifications that tell programmers how to create LMSs and training content that can be shared across systems.

The aim behind SCORM was to create standard units of training and educational material that could be shared and reused across systems.
                           


Shareable Content Object refers to creating units of online training material that can be shared and reused across systems and contexts.

Reference Model refers to the existing standards in the education industry while informing developers on how to properly use them together.

Working with authoring tools to design and produce the content, e-learning professionals, training managers, and instructional designers are the ones who typically use SCORM packages.

Content used in courses and LMSs is exported as a SCORM package (a .zip folder) so that it can be uploaded to any compliant LMS seamlessly and smoothly.

The Evolution of SCORM

Since SCORM wasn’t built as a standard from the ground up and was primarily a reference to existing ones, the goal was to create an interoperable system that would work well with other systems.

To date, there are three released versions of SCORM, each built on top of the previous one and solving the problems of its predecessor.

SCORM 1.0 was merely a draft outline of the framework. It did not include any fully implementable specifications but rather contained a preview of work which was yet to come. 

SCORM 1.0 included the core elements that would become the foundation of SCORM.  

In other words, this version specified how content should be packaged, how it should communicate with systems, and how it should be described.

SCORM 1.1 was the first implementable version of SCORM. It marked the end of the trial implementation phase and the beginning of the application phase for ADL (Advanced Distributed Learning, the US DoD initiative that sponsors SCORM).

SCORM 1.2 solved many of the problems that came along with version 1.1. By providing robust and implementable specifications, it presented its end users with drastic cost savings.

It was, and still remains, one of the most widely used versions.

SCORM 2004 (1st - 4th editions)

The 2004 1st edition allowed content vendors to create navigation rules between SCOs. The 2nd edition covered the various shortcomings of the 1st; under the Advanced Distributed Learning initiative, it focused on developing and assessing distributed learning prototypes, enabling more effective, efficient, and affordable learner-centric solutions.

The 3rd edition removed any ambiguity, improving the sequencing specifications for greater interoperability.

The final, 4th edition focused on disambiguation and the addition of new sequencing specifications. These specifications widened the options available to content authors, making the creation of sequenced content even simpler.
 

[Infographic: the evolution of SCORM]


Why Should You Use SCORM?

Now that we have an idea of SCORM and its attempt to reduce chaos across the industry, let’s look at the benefits it brings along.

Here are some of the main reasons to use SCORM.

  • It is a pro-consumer initiative. Online courses can run on any compliant LMS; as long as you have the ZIP package, you can upload a course to the LMS of your choice.
  • All high-quality LMSs and authoring tools are SCORM compliant, so they can build on and be part of a great ecosystem of interoperability and reliability.
  • The introduction and evolution of SCORM have brought about a great reduction in the overall cost of delivering training, since there is no additional cost for integrating any type of compliant content.
  • SCORM helps standardize eLearning specifications by providing a set of technical specifications that gives developers a standard blueprint to work with.

How does SCORM Work?

Beyond guiding programmers, SCORM governs two main things, i.e. packaging content and exchanging data at runtime, to ensure interoperability.

  • Packaging content, or the content aggregation model (CAM), defines how a piece of content should be presented in a physical sense. It enables an LMS to import and launch content without any human intervention.
  • Runtime communication, or data exchange, defines how the content works with the LMS while it is actually being played. This is the part that describes the delivery and tracking of the content, and it ultimately covers things like “request the learner’s name” or “tell the LMS that the learner scored 95% on a test”.
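To make the runtime exchange concrete, here is a minimal sketch of those two calls using the SCORM 1.2 JavaScript run-time API. The `cmi.core.*` element names and `LMS*` function names come from the SCORM 1.2 specification; the stubbed `API` object below is an invented stand-in for the adapter a real LMS would expose (content normally locates it on `window.API`):

```javascript
// Stub of the API adapter a SCORM 1.2 LMS exposes to content.
// In a real course, the SCO searches the window hierarchy for window.API.
const API = {
  data: { "cmi.core.student_name": "Doe, Jane" },
  LMSInitialize() { return "true"; },
  LMSGetValue(key) { return this.data[key] || ""; },
  LMSSetValue(key, value) { this.data[key] = value; return "true"; },
  LMSCommit() { return "true"; },
  LMSFinish() { return "true"; },
};

// A typical SCO session: initialize, read learner data,
// report a score, persist the data, and end the session.
API.LMSInitialize("");
const learner = API.LMSGetValue("cmi.core.student_name"); // "request the learner's name"
API.LMSSetValue("cmi.core.score.raw", "95");              // "the learner scored 95%"
API.LMSCommit("");
API.LMSFinish("");
```

A real LMS implements these same functions (plus error-handling calls such as `LMSGetLastError`), which is exactly what makes a compliant SCO portable across systems.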


How SCORM Packages Work

SCORM recommends that content be delivered in a self-contained directory or a ZIP file. Such a file contains content defined by the SCORM standards and is called a Package Interchange File (PIF), or, in other words, a SCORM package.

It contains all the files needed to deliver the content package via the SCORM run-time environment.

The course manifest file is considered the heart of the SCORM content packaging system. The manifest is an XML file (imsmanifest.xml) that describes the content.
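As an illustration, a skeletal imsmanifest.xml for a one-lesson course might look like the following. The identifiers, titles, and file names are invented for this sketch; the element and attribute names come from the SCORM 1.2 content packaging specification:

```xml
<manifest identifier="com.example.course" version="1.0"
          xmlns="http://www.imsproject.org/xsd/imscp_rootv1p1p2"
          xmlns:adlcp="http://www.adlnet.org/xsd/adlcp_rootv1p2">
  <organizations default="default_org">
    <organization identifier="default_org">
      <title>Example Course</title>
      <item identifier="item_1" identifierref="sco_1">
        <title>Lesson 1</title>
      </item>
    </organization>
  </organizations>
  <resources>
    <resource identifier="sco_1" type="webcontent"
              adlcp:scormtype="sco" href="lesson1/index.html">
      <file href="lesson1/index.html"/>
      <file href="lesson1/lesson1.js"/>
    </resource>
  </resources>
</manifest>
```

Note how the manifest separates the hierarchical arrangement (organizations) from the physical files (resources), which is what lets an LMS import the package without human intervention.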

Some of the pieces involved in the packaging are:

  • Resources 

Resources are the list of parts that bundle up into a single course. There are two types of resources that contribute to a course.

The first, known as an asset, is a collection of one or more files that make up a logical unit presented to the user. The other is a SCO, or Sharable Content Object: a unit of instruction, composed of one or more files, that communicates with the LMS. It mostly contains the instructional or static part of the content that is presented to users via the course.

A resource should list all of the files required for it to function properly, so that it can be ported to a new environment and work the same way there.

[Animation: how SCORM and web content are put together]

 

  • Organizations

Organizations are logical groupings of the resources into a hierarchical arrangement. This is what is delivered to a particular learner when an item has been selected.

  • Metadata 

Metadata is used to describe the elements of a content package in its manifest file. It is important because it facilitates the discovery of learning resources across content packages or in a repository.

When a learning resource is intended to be reusable, it is a best practice to describe it with metadata. 

For describing learning content, the Learning Object Metadata (LOM) standard provides many predefined fields.

  • Sequencing

Sequencing is responsible for determining what happens next when a learner exits an SCO. With navigational control, it orchestrates the flow and status of the course as a whole. 

However, it doesn’t affect how SCOs operate and navigate internally, that is defined by the content developer.

Drupal With SCORM 

Drupal is best at managing digital content, but the task of planning, implementing, and assessing a specific learning process is best done by an LMS.

How can Drupal become a platform for an organization that delivers effective training, manages learners, tracks individual progress, and records results?

Since Drupal is not an LMS, its distributions and modules help it take on that role. When it comes to SCORM compliance, Drupal has the Opigno LMS distribution.

Opigno LMS is a Drupal distribution that integrates the H5P technology (an open-source, JavaScript-based content collaboration framework), which enables you to create rich interactive training content. It allows you to maintain training paths that are organized in courses and lessons.

This distribution includes the latest version of Opigno core that offers you effective and innovative online training tools.

Opigno LMS is fully compliant with SCORM (1.2 and 2004 v3) and offers a powerful editor for content management, in particular for creating course material. These courses can eventually be grouped into classes to provide easy and manageable training paths. It should also be noted that this distribution is the quickest way to stand up a functional e-learning platform out of the box, with users, courses, certificates, etc.

Based on this distribution, Opigno SCORM implements the SCORM feature in Opigno, which allows you to load and play SCORM packages within Opigno training; it is also responsible for handling and managing training paths that are organized in courses and lessons.

Opigno LMS comprises an app store that enables you to install the latest features easily, without requiring you to upgrade the current install.

Considering the requirements and expectations of learners, Opigno LMS can be summarized by the following specifications:

  1. Scalable enough to manage the demands of a dynamic and changing environment
  2. Safe and easy to update
  3. Supports further development of customized functionality, properly integrated with the core solution in a modular way
  4. Open, letting each client remain free and independent
  5. And most importantly, easy to integrate with other enterprise systems

The H5P JavaScript framework makes it easy to create, share and reuse HTML5 content and applications, allowing users to build richer content. With H5P, authors can edit and construct videos, presentations, games, advertisements, etc. To create an e-learning platform, the integration of the H5P framework and SCORM is essential.



The H5P SCORM/xAPI module allows you to upload and view SCORM and xAPI packages. It uses two H5P libraries (H5P libraries are used to create and share rich content and applications):

  1. the H5P SCORM/xAPI library, to view SCORM packages;
  2. the H5PEditor SCORM library, to upload and validate SCORM packages.

Using the H5P editor, you can then create a new content type by uploading the package prepared in the preceding step.

In a Nutshell

Different people adopt SCORM for different reasons. You and your team are the only ones that can decide whether sticking to SCORM is worthwhile or not. 

Depending upon the nature of your requirements and course of action, you can decide which platform is best for you. At OpenSense Labs, we have been providing fitting solutions to our customers. Contact us at [email protected] to make the right choice of platform.

Dec 14 2018

Drupal is one of the leading open source and secure content management systems used by businesses across the world. Drupal helps developers build the open web; marketers, with its templating approach; and agencies, with achieving customer goals faster.

Marketo, on the other hand, is a leading marketing automation platform that helps digital marketers engage customers and prospects and automate email marketing and other tasks. Its recent acquisition by Adobe is a testament to its popularity.


Integration between these two was not available before we created our own Drupal Marketo Integration Connector. The connector utilizes the known leads’ information by tracking cookies on your website to personalize the content on your website.

It comes with numerous features like real-time personalization, form pre-fill, and automated email notifications.

How Does Personalization on Drupal Pages Work Using Shortcodes?

We have implemented shortcodes in Drupal (they are not a Drupal standard) that help you target leads known to your website and show them personalized and localized content, thereby increasing user engagement on your website.

Listed below are a few examples of how you can personalize content using these shortcodes.

Suppose, the following lead values are stored in your Marketo database:

[Image: sample lead field values stored in Marketo]

1. Showing a Lead Value on Your Drupal Web Page

To personalize content by utilizing the known lead value (like FirstName, LastName, Email, Company or Industry) from Marketo on your Drupal basic pages, enter the below shortcode in the WYSIWYG editor.

[Image: shortcode for displaying a lead field value]

Using the shortcode, you will get the FirstName field value Tom and the LastName field value Taylor that are stored in your Marketo database. The result appears on the front end when the user visits your website.

[Image: personalized output on the front end]

2. Personalizing Content Based on the Matching Condition of Shortcodes

We can create targeted content using a matching condition shortcode. The shortcode helps check whether the Marketo field name has the value which you have defined in the conditional block. Based on whether the condition is true or false, we can personalize the content.

For example, if the company field name in Marketo is Industry and field value is Automobile, you can personalize the content for users from this industry. Enter the below matching condition shortcode in WYSIWYG editor in the Drupal back-end.

[Image: matching condition shortcode for the Automobile industry]

If the condition is true, the slider will show automobile-related images.

[Image: slider showing automobile-related images]

Similarly, if the field value is Healthcare, you can personalize the website content for users from this industry. Enter the below matching condition shortcode in WYSIWYG editor in the Drupal back-end.

[Image: matching condition shortcode for the Healthcare industry]

If the condition is true, then the slider will display healthcare related images and so on.

[Image: slider showing healthcare-related images]

3. Personalizing Content Based on Matching Condition Shortcodes with the AND and OR Operators

You can also personalize content based on a matching condition shortcode using the ‘AND’ and ‘OR’ operators. This means you can compare multiple fields and their values for personalization (note: you cannot compare more than three fields in a single conditional block). Based on whether the condition is true or false, you can personalize the content.

For example, if the FirstName field name in Marketo matches the field value Tom and the Industry field name equals field value Automobile, then you can personalize the content based on this true condition. In case the condition is false, you can personalize the content through the editor or leave it blank. To personalize content, enter the below matching condition shortcode in WYSIWYG editor in the Drupal backend.

[Image: matching condition shortcode with the AND and OR operators]

Want to Know More About Our Drupal Services? Contact Us.

Grazitti has over 10 years of experience in offering full-fledged Drupal web development services to companies across the globe. We also help automate processes using our custom plugin that integrates your website with Marketo to boost conversions and sales. Learn more about our web development services here or write to us at [email protected].
Dec 14 2018

A lot of people have been jumping on the headless CMS bandwagon over the past few years, but I’ve never been entirely convinced. Maybe it’s partly because I don’t want to give up on the sunk costs of what I’ve learned about Drupal theming, and partly because I’m proud to be a boring developer, but I haven’t been fully sold on the benefits of decoupling.

On our current project, we’ve continued to take an approach that Dries Buytaert has described as “progressively decoupled Drupal”. Drupal handles routing, navigation, access control, and page rendering, while rich interactive functionality is provided by a JavaScript application sitting on top of the Drupal page. In the past, we’d taken a similar approach, with AngularJS applications on top of Drupal 6 or 7, getting their configuration from Drupal.settings, and for this project we decided to use React on top of Drupal 8.

There are a lot of advantages to this approach, in my view. There are several discrete interactive applications on the site, but the bulk of the site is static content, so it definitely makes sense for that content to be rendered by the server rather than constructed in the browser. This brings a lot of value in terms of accessibility, search engine optimisation, and performance.

Admittedly, a decoupled system is almost inevitably more complex, with more potential points of failure.

The application can be developed independently of the CMS, so specialist JavaScript developers can work without needing to worry about having a local Drupal build process.

If at some later date, the client decides to move away from Drupal, or at the point where we upgrade to Drupal 9, the applications aren’t so tightly coupled, so the effort of moving them should be smaller.

Having made the decision to use this architecture, we wanted a consistent framework for managing application configuration, to make sure we wouldn’t need to keep reinventing the wheel for every application, and to keep things easy for the content team to manage.

The client’s content team want to be able to control all of the text within the application (across multiple languages), and be able to preview changes before putting them live.

There didn’t seem to be an established approach for this, so we’ve built a module for it.

As we’ve previously mentioned, the team at Capgemini are strongly committed to supporting the open source communities whose work we depend on, and we try to contribute back whenever we can, whether that’s patches to fix bugs and add new features, or creating new modules to fill gaps where nothing appropriate already exists. For instance, a recent client requirement to promote their native applications led us to build the App Banners module.

Aiming to make our modules open source wherever possible helps us to think in systems, considering the specific requirements of this client as an example of a range of other potential use cases. This helps to future-proof our code, because it’s more likely that evolving requirements can be met by a configuration change, rather than needing a code change.

So, guided by these principles, I’m very pleased to announce the Single Page Application Landing Page module for Drupal 8, or to use the terrible acronym that it has unfortunately but inevitably acquired, SPALP.

On its own, the module doesn’t do much other than provide an App Landing Page content type. Each application needs its own module to declare a dependency on SPALP, define a library, and include its configuration as JSON (with associated schema). When a module which does that is installed, SPALP takes care of creating a landing page node for it, and importing the initial configuration onto the node. When that node is viewed, SPALP adds the library, and a link to an endpoint serving the JSON configuration.

Deciding how to store the app configuration and make all the text editable was one of the main questions, and we ended up answering it in a slightly “un-Drupally” way.

On our old Drupal 6 projects, the text was stored in a separate ‘Messages’ node type. This was a bit unwieldy, and it was always quite tricky to figure out what was the right node to edit.

For our Drupal 7 projects, we used the translation interface, even on a monolingual site, where we translated from English to British English. It seemed like a great idea to the development team, but the content editors always found it unintuitive, struggling to find the right string to edit, especially for common strings like button labels. It also didn’t allow the content team to preview changes to the app text.

We wanted to maintain everything related to the application in one place, in order to keep things simpler for developers and content editors. This, along with the need to manage revisions of the app configuration, led us down the route of using a single node to manage each application.

This approach makes it easy to integrate the applications with any of the good stuff that Drupal provides, whether that’s managing meta tags, translation, revisions, or something else that we haven’t thought of.

The SPALP module also provides event dispatchers to allow configuration to be altered. For instance, we set different API endpoints in test environments.

Another nice feature is that in the node edit form, the JSON object is converted into a usable set of form fields using the JSON forms library. This generic approach means that we don’t need to spend time copying boilerplate Form API code to build configuration forms when we build a new application - instead the developers working on the JavaScript code write their configuration as JSON in a way that makes sense for their application, and generate a schema from that. When new configuration items need to be added, we only need to update the JSON and the schema.
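As a purely hypothetical illustration (the field names here are invented for this sketch, not taken from the SPALP module), an application's configuration JSON might look something like this, with each leaf value becoming an editable form field on the node:

```json
{
  "apiEndpoint": "/api/v1/search",
  "resultsPerPage": 10,
  "text": {
    "en": {
      "searchButton": "Search",
      "noResults": "No results found"
    }
  }
}
```

A matching JSON schema then tells the editor which fields exist and what types they take, so adding a setting means touching only the JSON and the schema.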

Each application only needs a very simple Drupal module to define its library, so we’re able to build the React code independently, and bring it into Drupal as a Composer dependency.

The repository includes a small example module to show how to implement these patterns, and hopefully other teams will be able to use it on other projects.

As with any project, it’s not complete. So far we’ve only built one application following this approach, and it seems to be working pretty well. Among the items in the issue queue is better integration with the configuration management system, so that we can make it clear if a setting has been overridden for the current environment.

I hope that this module will be useful for other teams - if you’re building JavaScript applications that work with Drupal, please try it out, and if you use it on your project, I’d love to hear about it. Also, if you spot any problems, or have any ideas for improvements, please get in touch via the issue queue.

Dec 13 2018

This is the first of many articles about why and how to give back to the community. The information can be used by individuals, agencies, and companies that want to increase their community contribution efforts. 

'Tis the season for giving. A time of giving thanks for the blessing of the harvest and of the preceding year. A time of light in the dark of winter; a season of reflection and thanks. I’d like to recognize the work and efforts of the Drupal Community and reflect on Hook 42’s historic contributions; additionally, I'd like to share how we contribute and how you can contribute more. 

But why listen to Hook 42? We are not one of the largest Drupal agencies; however, Hook 42 was ranked 18th in the list of the global contributors by Dries. We also strategically shaped and sponsored the community efforts of one of the top 30 contributors to the project. 

Hook 42 has a fairly humble approach when sharing our community contribution data. We usually let our team’s commit numbers, sessions, logos, and event organization speak for our continued dedication to fostering the Drupal community.

For full disclosure, Kristen Pol and Aimee Degnan (myself) are co-founders / owners of Hook 42, a full-service web development agency specializing in Drupal. We are based out of San Francisco and we have team members distributed across the United States.

Make a Commitment to Contribution

First of all, you must want to contribute. Everyone has different reasons to contribute and how you personally contribute is your own path. It is a deliberate decision that must be turned into action.

The first steps into contribution may be attending a Drupal users group, camp, or logging into Slack or IRC for support. At this point, you meet the community and learn how to further contribute. Hopefully, the Community has been inviting enough that you want to stay. :)

When Kristen and I started Hook 42, we chose “Contribute to Community” as one of our core values. This decision created the foundation for our culture of contribution. We both personally contributed to the project and community in different ways and found great value from giving back. All of our team members have the desire to contribute to community efforts; the passion and self-starting ability to contribute is vital.

Another one of our values, “Ongoing Education and Improvement”, is a strong motivator for our contribution efforts. The Drupal project, and its supporting events, provide a fantastic opportunity for professional development. Coding, documentation, speaking, organization, and volunteerism; there are so many opportunities for growth that are beyond our client work or formal training classes.

I want to thank those organizations and individuals that have made community contributions a deliberate part of their work and life. We are not the only company to do so. We are not the only individuals to do so.

Invest in Contribution

Once you decide contribution is important, a real investment must be made. Contribution takes time. It takes skill. It takes practice. And it also takes money. 

A patch isn’t going to write itself, test itself, or be committed back into the project itself. Documentation must be written and copy edited. Events need to be sponsored and organized.

Hook 42 sponsors 15% of the team’s overall work hours for community contribution. Doing the math, 15% is roughly 4,000 - 4,500 hours of work per year donated to the Drupal community. That is time during which Hook 42 is not making a profit; it is treated entirely as an operational cost. The work is performed within a normal 40-hour work week for our team members; another value of ours is "Strive for Work and Life Balance".

Our clients also sponsor contributions through the course of project work, session presentations, and case studies.

The 15% metric of sponsored community time does not include the amount of money spent for sponsorship, travel, or one-off support of community contributions like DUG dinners, code sprint snacks, and other community-focused spend. 

An individual’s personal level of investment will probably look quite different from a business’s. Personal investment may include patches, testing, sessions, and camp attendance, but is often constrained by personal budget and the time one is willing to commit.

Personally, Kristen and I contribute differently. Both of us organize events, speak, attend conferences, work on core initiatives, and mentor individuals within the community and the team. I contribute less code because my role is more related to business topics (although I’m highly technical); whereas, Kristen’s contributions are more commit and documentation related. This is just an example of how contribution profiles may differ.

From a business perspective, it is both of our jobs to provide a stable environment and work to provide our team members the opportunity to contribute. Consider us “Community Force Multipliers” that strategically align our team’s work with community efforts while we, personally, do less of the commit-centric work.

Many of our team members, including Kristen and myself, contribute additional personal time on top of work-sponsored hours. Those that do are passionate about the community and are active in event organization, preparing sessions and trainings, and participating in sprints. Plus, Kristen and I have to keep the business running regardless of our personal community efforts. Someone has got to keep the lights on. 

I want to thank those organizations and individuals that have made the investment in the community. We are not the only company to do so. We are not the only individuals to do so.

Ongoing Commitment to Contribution

But why do Kristen and I choose to contribute so much to the community? Why do we keep contributing? The two of us drive Hook 42’s budget and time invested in community efforts. Don’t we want more profit or more money in our pockets?

As business owners, we believe that supporting the Drupal project and community ecosystem is an investment in our ongoing business. All people using Drupal must contribute - at their own ability level - to Drupal to keep the platform viable and supported. That contribution can be as easy as active use of the platform.

Improving Drupal isn’t just about benefits to our company, or benefits to Drupal independent contractors, and other Drupal service providers. Improving Drupal benefits organizations using Drupal, so they can have confidence that their choice is a sound investment over time. Again, contribution as a business is not about us, it is about supporting people building their businesses on Drupal.

As people managers, we find that the community ecosystem and the Drupal and Drupal-adjacent technologies provide a great environment for ongoing learning and professional development for our team. Kristen, myself, and Adam help our team members and others in the community find the best fit of work for their goals.

As individuals, honestly, we love the people and the community environment. One of our other company values is “Be Ethically and Morally Good”. As good members of the community, we understand that our individual choices and actions can benefit the greater good of Drupal. 

But I have to admit, sometimes it is a challenge. For all who choose to contribute, you will also be more visible to a diverse, global group of individuals. You may receive criticism for your work or opinions that may not seem agreeable. Kristen and I have been told that the support we provide, both personally and through the business, is not enough. Perhaps without visibility to the actual data of our total contributions, others may not understand the amount of our personal investment in the project. Thankfully, that was an edge case and not the norm. Overall, most community members have provided positive feedback to our sponsorship efforts.

Why don’t we just give up? How much more are we supposed to do? How much more are we supposed to give?

Honestly, constructive criticism and differing opinions make the product and the people improve. No one improves if they are unaware there is room for change.

We constantly renew our commitment to community because we love the people we work with. It brings us great joy to work with such creative and enthusiastic people.

I want to thank those organizations and individuals that recommit their efforts to the community, even after heavy public criticisms. We are not the only company to do so. We are not the only individuals to do so.

Season of Giving Beyond Drupal

I want to thank those organizations and individuals that provide donations to charities throughout the year at different community events. This type of altruistic approach to charitable donations represents the quality of the people in the Drupal community.

Hook 42 has donated to the following charitable organizations chosen by our team members. Some of the organizations were selected by multiple members:

Learn how you can be a top contributor in our next Community Post: The How-to Guide to Successful Contribution.

Dec 13 2018
Dec 13
Mike and Matt talk with the team that helped implement content strategy on Georgia.gov.
Dec 13 2018
Dec 13

There are great opportunities with local, state and federal government contracting. That’s why we made a strategic decision over 10 years ago to build a government public affairs and marketing area of our company’s service offerings. It wasn’t easy, it was a long process, but it worked. Here’s how we did it and what it has meant to the growth of Texas Creative.

Get Your Certifications:

Our first stop was the South Central Texas Regional Certification Agency (SCTRCA), where we received our Small Business Enterprise (SBE) Certification and our Women-owned Business Enterprise (WBE) Certification. Most local entities, and certainly the State of Texas, regard this regional certification as gold. These are renewed every two years, and audits can be part of the process to ensure that the company is being run as indicated in the applications. Other certifications are available through the agency, so check all that apply to you.

Texas HUB (Texas Historically Underutilized Business) Certification is the statewide database of woman-owned, minority-owned, small, and veteran-owned businesses that are available for state agency and prime contractor HUB subcontracting. In our category of business, most state and local agencies target 26% of their spending with HUB companies. This certification can really lift your visibility with buyers and give your proposals a stronger appeal.

Get on the Schedule:

We tried getting on the Federal GSA (General Services Administration) Schedule by ourselves, but the paperwork and technical language barrier were too much to handle, so we hired a federal contracting specialist company to help us complete the process and get on the schedule. It took us nearly two years to complete that process, but we succeeded in getting onto the 541-5 schedule, which covers Integrated Marketing Services within the Professional Services Schedule (PSS) Advertising & Integrated Marketing Solutions (AIMS) schedules. Our GSA contract has led us to engagements with the Maine Air National Guard, The Peace Corps, Army Medical Department, and the US Air Force.

Once you are on the federal schedule you can apply to be on the comparable state schedule, the Texas Multiple-Award Schedule (TXMAS), through the Texas Comptroller’s Office. The TXMAS contract follows the same offerings as your GSA contract. This process took us another year, so plan on playing the long game if you want to play in this marketplace. Our work for the Texas Commission on Environmental Quality (TCEQ), including public affairs campaigns for Take Care of Texas and Galveston Back the Bay as well as website design, development and hosting, has been very rewarding for us and the taxpayers of Texas. We have helped food-insecure families in Texas find information about the Summer Food Program offered by the Texas Department of Agriculture, and we’ve helped the Texas Department of Transportation (Two Steps, One Sticker) change the behavior of millions when it comes to state inspection of their vehicles.

Our most recent accomplishment has been 5 years in the making: we were just awarded a Comprehensive Web Development and Management Services contract through the Texas Department of Information Resources (Texas DIR). All state agencies are required by law to procure web development projects through the DIR. Universities, local school districts, and local governmental agencies may optionally use the DIR contracting tools. We are able to offer our full-service, in-house web development and hosting services throughout the state. This is big news for our small company. Our previous TXMAS work for TCEQ, the Governor’s Office and the University of Texas can now fall into the DIR seamlessly.

These sales range from small task-oriented projects for Texas Department of Public Safety (DPS) and Employees Retirement System of Texas (ERS) to media planning and buying for Texas Tech University and upwards of multi-million dollar contracts with other state agencies to assist them in getting the word out about their amazing programs. It’s work we love and it’s work that makes the lives of Texans better. Can we help your agency successfully raise awareness and engagement for your programs?

For more information on our contracts please visit:

Texas Creative GSA Contract

Texas Creative TXMAS Contract

Texas Creative DIR Contract 

Check out a couple of our recent projects for our government clients. 

[embedded content]
[embedded content]

Dec 13 2018
Dec 13

Gabe, Mateu and I just released the third RC of JSON:API 2, so time for an update! The last update is from three weeks ago.

What happened since then? In a nutshell:

RC3

Curious about RC3? RC2 → RC3 has five key changes:

  1. ndobromirov is all over the issue queue to fix performance issues: he fixed a critical performance regression in 2.x vs 1.x that is only noticeable when requesting responses with hundreds of resources (entities); he also fixed another performance problem that manifests itself only in those circumstances, but also exists in 1.x.
  2. One major bug was reported by dagmar: the ?filter syntax that we made less confusing in RC2 was a big step forward, but we had missed one particular edge case!
  3. A broken edge case was discovered that is pretty obscure, but probably fairly common for those creating custom entity types: optional entity reference base fields that are empty made the JSON:API module stumble. Turns out optional entity reference fields get different default values depending on whether they’re base fields or configured fields! Fortunately, three people gave valuable information that led to finding this root cause and the solution! Thanks, olexyy, keesee & caseylau!
  4. A minor bug that only occurred when installing JSON:API Extras and configuring it in a certain way was fixed.
  5. Version 1.1 RC1 of the JSON:API spec was published; it includes two clarifications to the existing spec. We already were doing one of them correctly (test coverage added to guarantee it), and the other one we are now complying with too. Everything else in version 1.1 of the spec is additive; this is the only thing that could be disruptive, so we chose to do it ASAP.

So … now is the time to update to 2.0-RC3. We’d love the next release of JSON:API to be the final 2.0 release!

P.S.: if you want fixes to land quickly, follow dagmar’s example:

If you don't know how to fix a bug of a #drupal module, providing a failing test usually is really helpful to guide project maintainers. Thanks! @GabeSullice and @wimleers for fixing my bug report https://t.co/bEkkjSrE8U

— Mariano D'Agostino (@cuencodigital) December 11, 2018
Dec 12 2018
Dec 12

This blog has been re-posted and edited with permission from Dries Buytaert's blog. Please leave your comments on the original post.

At Drupal Europe, I announced that Drupal 9 will be released in 2020. Although I explained why we plan to release in 2020, I wasn't very specific about when we plan to release Drupal 9 in 2020. Given that 2020 is less than thirteen months away (gasp!), it's time to be more specific.

Shifting Drupal's six month release cycle

A timeline that shows how we shifted Drupal 8's release windows

We shifted Drupal 8's minor release windows so we can adopt Symfony's releases faster.

Before I talk about the Drupal 9 release date, I want to explain another change we made, which has a minor impact on the Drupal 9 release date.

As announced over two years ago, Drupal 8 adopted a 6-month release cycle (two releases a year). Symfony, a PHP framework which Drupal depends on, uses a similar release schedule. Unfortunately the timing of Drupal's releases has historically occurred 1-2 months before Symfony's releases, which forces us to wait six months to adopt the latest Symfony release. To be able to adopt the latest Symfony releases faster, we are moving Drupal's minor releases to June and December. This will allow us to adopt the latest Symfony releases within one month. For example, Drupal 8.8.0 is now scheduled for December 2019.

We hope to release Drupal 9 on June 3, 2020

Drupal 8's biggest dependency is Symfony 3, which has an end-of-life date in November 2021. This means that after November 2021, security bugs in Symfony 3 will not get fixed. Therefore, we have to end-of-life Drupal 8 no later than November 2021. Or put differently, by November 2021, everyone should be on Drupal 9.

Working backwards from November 2021, we'd like to give site owners at least one year to upgrade from Drupal 8 to Drupal 9. While we could release Drupal 9 in December 2020, we decided it was better to try to release Drupal 9 on June 3, 2020. This gives site owners 18 months to upgrade. Plus, it also gives the Drupal core contributors an extra buffer in case we can't finish Drupal 9 in time for a summer release.

A timeline that shows we hope to release Drupal 9 in June 2020

Planned Drupal 8 and 9 minor release dates.

We are building Drupal 9 in Drupal 8

Instead of working on Drupal 9 in a separate codebase, we are building Drupal 9 in Drupal 8. This means that we are adding new functionality as backwards-compatible code and experimental features. Once the code becomes stable, we deprecate any old functionality.

Let's look at an example. As mentioned, Drupal 8 currently depends on Symfony 3. Our plan is to release Drupal 9 with Symfony 4 or 5. Symfony 5's release is less than one year away, while Symfony 4 was released a year ago. Ideally Drupal 9 would ship with Symfony 5, both for the latest Symfony improvements and for longer support. However, Symfony 5 hasn't been released yet, so we don't know the scope of its changes, and we will have limited time to try to adopt it before Symfony 3's end-of-life.

We are currently working on making it possible to run Drupal 8 with Symfony 4 (without requiring it). Supporting Symfony 4 is a valuable stepping stone to Symfony 5 as it brings new capabilities for sites that choose to use it, and it eases the amount of Symfony 5 upgrade work to do for Drupal core developers. In the end, our goal is for Drupal 8 to work with Symfony 3, 4 or 5 so we can identify and fix any issues before we start requiring Symfony 4 or 5 in Drupal 9.

Another example is our support for reusable media. Drupal 8.0.0 launched without a media library. We are currently working on adding a media library to Drupal 8 so content authors can select pre-existing media from a library and easily embed them in their posts. Once the media library becomes stable, we can deprecate the use of the old file upload functionality and make the new media library the default experience.

The upgrade to Drupal 9 will be easy

Because we are building Drupal 9 in Drupal 8, the technology in Drupal 9 will have been battle-tested in Drupal 8.

For Drupal core contributors, this means that we have a limited set of tasks to do in Drupal 9 itself before we can release it. Releasing Drupal 9 will only depend on removing deprecated functionality and upgrading Drupal's dependencies, such as Symfony. This will make the release timing more predictable and the release quality more robust.

For contributed module authors, it means they already have the new technology at their service, so they can work on Drupal 9 compatibility earlier (e.g. they can start updating their media modules to use the new media library before Drupal 9 is released). Finally, their Drupal 8 know-how will remain highly relevant in Drupal 9, as there will not be a dramatic change in how Drupal is built.

But most importantly, for Drupal site owners, this means that it should be much easier to upgrade to Drupal 9 than it was to upgrade to Drupal 8. Drupal 9 will simply be the last version of Drupal 8, with its deprecations removed. This means we will not introduce new, backwards-compatibility breaking APIs or features in Drupal 9 except for our dependency updates. As long as modules and themes stay up-to-date with the latest Drupal 8 APIs, the upgrade to Drupal 9 should be easy. Therefore, we believe that a 12- to 18-month upgrade period should suffice.

So what is the big deal about Drupal 9, then?

The big deal about Drupal 9 is … that it should not be a big deal. The best way to be ready for Drupal 9 is to keep up with Drupal 8 updates. Make sure you are not using deprecated modules and APIs, and where possible, use the latest versions of dependencies. If you do that, your upgrade experience will be smooth, and that is a big deal for us.

Special thanks to Gábor Hojtsy (Acquia), Angie Byron (Acquia), xjm (Acquia), and catch for their input in this blog post.

Dec 12 2018
Dec 12

Webforms in Drupal 8 are configuration entities, which means that they are exportable to YAML files and this makes it easy to transfer a webform from one server environment to another. Generally, anything that defines functionality or behavior in Drupal 8 is stored as simple configuration or a configuration entity. For example, fields, views, and roles are stored as configuration entities. Things that are considered 'content' are stored in the database as content entities. Content entities include nodes, comments, taxonomy terms, users, and also webform submissions.

Configuration is exportable as YAML.
Content is stored in the database.
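Since a webform is a single configuration entity, its entire definition travels in one YAML file. As a rough, simplified sketch (a real export produced by Drupal contains many more settings, and the field values here are illustrative only), an exported webform looks something like this:

```yaml
# Illustrative sketch only — a real export is generated by Drupal and
# includes dependencies, access rules, handlers, and other settings.
langcode: en
status: open
id: contact
title: 'Contact Us'
elements: |
  name:
    '#type': textfield
    '#title': 'Your Name'
  message:
    '#type': textarea
    '#title': 'Message'
```

Note how the form's elements are themselves stored as a nested YAML string inside the configuration entity, which is what makes a whole webform portable as a single file.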

Managing Configuration

The core concept behind Drupal's configuration management is that you can export and import how a website is configured (aka how it works) from one environment to another. For example, we might want to copy the configuration from your staging server to your production server. Drupal 8 initially took the approach that all configuration from one environment needs to be moved to the new environment. The problem is that…

The imported configuration will clobber (aka replace) any existing configuration

In the case of webforms and blocks, this is a major issue because site builders are continually updating these config entities on a production website. The Drupal community is aware of this problem - they have provided some solutions and are actively working to fix this challenge/issue in a future release of Drupal.

Improving Configuration Management

Below is a summary of these three modules.

The Config Filter module provides an API for controlling what configuration is imported or not imported for different environments.

The Configuration Split module allows defining sets of configuration that will get exported for different environments.

The Config Ignore module allows specified configuration not to be imported (aka ignored) and overwritten.
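For example, with the Configuration Split module, a common pattern (sketched here with a hypothetical split named 'dev') is to activate a split per environment in settings.php, so that development-only configuration is imported locally but never on production:

```php
<?php

// settings.php (or settings.local.php) on the development environment.
// Activating the hypothetical 'dev' split tells Configuration Split to
// merge that split's configuration during import on this environment only;
// production, where this line is absent, never sees the dev-only config.
$config['config_split.config_split.dev']['status'] = TRUE;
```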

Geert van Dort's recipe for Configuration Management

Geert van Dort has written a great resource documenting how to exclude config from configuration management in Drupal 8 and he uses Webforms as his example. Geert van Dort's article is a must-read.

There is one somewhat obvious configuration management gotcha that I have not seen fully documented.

The importing of outdated configuration gotcha

In the Webform issue queue, I am repeatedly seeing tickets related to exported webform config that have not been properly updated. For example, someone exports a Webform from 8.x-5.0-rc10, updates the Webform module to 8.x-5.0-rc26, runs the database/config updates, and then imports the Webform configuration from 8.x-5.0-rc10, which is missing any new configuration properties and changes from 8.x-5.0-rc26. Usually, I am able to pinpoint the missing update hook and suggest that someone runs the missing update hook using a drush command like…

drush php-eval "module_load_include('install', 'webform'); webform_update_8144();"

How to prevent the outdated configuration import gotcha

Good configuration management comes down to deciding on an approach and following it.

For me, the above concept was the most important takeaway from Michael Anello's Drupal 8 Configuration System Basics.

The solution to prevent the outdated configuration gotcha is to define a process and follow it. Below is a general outline of the steps required to make sure your exported configuration is always up-to-date:

On your local or development environment:

When deploying core and contrib module updates:

I have deliberately avoided including what specific mechanism you should be using for updating your site, which generally should be a combination of Git, Composer, and Drush, because the most important step is…

Make sure when you update Drupal core and contrib modules, your exported configuration has also been updated.
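As an illustrative sketch (the exact commands depend on your Drush version, Composer setup, and deployment tooling; treat this as an outline rather than a definitive recipe), the update/export/import cycle might look like:

```shell
# Locally: update code, run database updates, then re-export configuration
# so the exported YAML matches the updated module versions.
composer update drupal/core drupal/webform --with-dependencies
drush updatedb -y
drush config:export -y
git add config composer.lock && git commit -m "Update modules, re-export config"

# On the target environment: deploy the same code, run updates, then import
# the already-updated configuration.
composer install
drush updatedb -y
drush config:import -y
```

The key property of this sequence is that configuration is always exported after `drush updatedb` has run, so update hooks have already been applied to what you export.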

The Distribution Configuration Gotcha

Distributions are full copies of Drupal that include Drupal Core, along with additional software such as themes, modules, libraries ​and installation profiles.

Distributions include predefined configuration files, which can quickly become out-of-sync with the distribution's modules. For example, a distribution's exported configuration could be expecting an older version of the Webform module. It is the distribution maintainer's responsibility to keep the configuration up-to-date which is challenging because they have to do a full installation of the distribution, update the distribution's modules and then export the updated configuration. This is a very tedious process.

One immediate thing that a module or theme maintainer can do to help distribution maintainers is to keep change records and tag any changes that might impact a distribution. Another best practice is to try to keep configuration breaking changes to a minimum.

Webform specific configuration management tools

Exporting webform configuration

The fact that an entire webform is exportable into one file has made it possible to create feature- and element-specific test webforms, which I use during automated testing. Currently, there are more than 200 test webforms included in the Webform module's test modules. The Webform module provides an 'Export' tab which allows you to easily and quickly export any webform.

Webform Export Tab

If you find an issue with a webform, the best way to get help is to isolate the problem to a simple example webform, export it, and then upload it to the Webform issue queue.

Tidying exported YAML

Another minor enhancement you will see in exported webforms is that the multiline strings in the YAML file are formatted in a more readable form. To learn more, see Issue #2844452: Export configuration YAML strings as multiline. If you need your exported YAML configuration file in a slightly more readable format, you can also use the Webform module's tidy Drush command.

Repairing admin configuration and webform settings

If you happen upon some webform configuration that is outdated, you can run the drush webform:repair command or click 'Repair configuration' in the admin UI under the 'Advanced' configuration tab (/admin/structure/webform/config/advanced).

Repair Webform Configuration

The future of configuration management

I have been using Drupal's configuration management since early alpha releases of Drupal 8 when configuration was not even stored in the database. The Config Filter, Configuration Split, and Config Ignore trifecta of contributed modules show how the Drupal community can work together to fix some challenging issues, paving the way for solutions to make their way in Drupal core.

The concept that we can quickly export an entire Webform into one shareable file is awesome and has made my life exponentially easier in maintaining the Webform module for Drupal 8.

The proposal for the Configuration Management 2.0 initiative is comprehensive and exciting. For example, if the Webform module could hook into the configuration import process, we should be able to repair outdated configuration as it is imported. For now, I want to say thanks to everyone involved in Drupal's Configuration Management Initiative.


Dec 12 2018
Dec 12

Your future awesome e-commerce website can be closer than you imagined. Drupal makes site creation especially quick and easy thanks to distributions. Ready-made distributions in the sphere of commerce are among the numerous reasons why Drupal is the best solution for e-commerce websites. They allow you to quickly create an online store and spend much less on it.

You can always rely on our Drupal team for any help in selecting the optimal distribution for you and brushing it up to meet your needs. And now let’s review top e-commerce Drupal distributions.

For a start, what are Drupal distributions?

Distributions are Drupal packages that contain the core, selected modules and a theme, predefined configuration, useful libraries, and so on. Drupal has 1200+ distributions, usually tailored for websites of a certain type or industry. Where nothing extraordinary is needed, distributions let you avoid reinventing the wheel and get the desired website sooner.

Great e-commerce Drupal distributions

Commerce Kickstart

Commerce Kickstart is more than number one among Drupal e-commerce distributions — it is also the most popular Drupal distribution of all. The number of its downloads exceeds a million. Why? Commerce Kickstart lets you quickly create an online store based on Drupal Commerce, the famous e-commerce platform for Drupal.

In addition to Drupal core, Drupal Commerce, and a set of helpful modules, Commerce Kickstart offers an attractive theme, preconfigured catalog of products, shopping cart, user login feature, search feature including filters for refined searches, product category page, social media integration, payment gateway integration, hero banner, and more.

Commerce Kickstart comes in 3 versions:

  • Commerce Kickstart 1.x for Drupal 7: the version with the minimum modules and configuration.
  • Commerce Kickstart 2.x for Drupal 7: the improved package for creating a full-featured store.
  • CommerceKickstart.com for Drupal 8: instead of the Drupal 8 version for the distribution, there is a browser-based tool to build a Composer file for your future online store.

Easy Booking

This distribution is primarily designed for hotel, hostel, or inn websites. However, it can be refurbished to fit many other kinds of websites that need a good booking system.

Easy Booking is available for Drupal 7 websites and has not yet been ported to Drupal 8. The distribution is powered by Drupal Commerce and Drupal Rooms, two famous solutions for websites. Among the interesting and useful features of this distribution are:

  • an online booking system with a checkout
  • email notifications to the administrator
  • a flexible system for managing rooms and their properties
  • an easy-to-manage room availability calendar
  • a front-page slideshow
  • “Our services”, “News”, and “About” sections
  • a colorized Google Map
  • responsive design

and more.

Presto! — Drupal 8 Starter Kit with Commerce Integration

“Presto!” is a distribution for Drupal 8 that contains optimally pre-configured e-commerce features based on Drupal Commerce. Note that its maintainers recommend installing it via Composer in order to make the e-commerce functionality work.

The distribution also has plenty of general configuration that will be very helpful for online stores. This includes Google Analytics integration, XML sitemap generation, automated human-readable URL creation via Pathauto, social sharing options, pre-configured Paragraph types (Promo bar, Carousel, Block etc.), and more.

Opigno LMS

The next on our list of Drupal e-commerce distributions is Opigno LMS, which is primarily associated with education thanks to the famous e-learning platform at its base, Opigno. However, online learning is a blend of education and e-commerce.

So the Opigno LMS distribution has strong features for online selling of learning courses that could also be useful to many other websites. It also offers quizzes, awards to successful students, certificates, forums, chats, and much more.

The distribution has a stable version for Drupal 7 and a release candidate version for Drupal 8, which means its full-fledged Drupal 8 version is underway.

RedHen Raiser

Here is an interesting distribution designed for Drupal 7 websites with fundraising campaigns. The RedHen Raiser distribution is based on the RedHen CRM with its useful RedHen Donation and RedHen Campaign modules.

This distribution is commerce-ready and allows for the easy adding of payment methods. It offers single-page donation forms, progress widgets, mini blog, automated start and end dates, and much more.

Let’s quickly create an online store for you with a distribution!

E-commerce Drupal distributions are definitely huge time and cost savers, so there are no more excuses for not getting your desired website today ;) Drupal e-commerce development is one of our areas of expertise, so contact our Drupal team — and we will help you select the optimal distribution and customize it to meet your ideas!

Dec 12 2018
Dec 12

At Drupal Europe, I announced that Drupal 9 will be released in 2020. Although I explained why we plan to release in 2020, I wasn't very specific about when we plan to release Drupal 9 in 2020. Given that 2020 is less than thirteen months away (gasp!), it's time to be more specific.

Shifting Drupal's six month release cycle

A timeline that shows how we shifted Drupal 8's release windowsWe shifted Drupal 8's minor release windows so we can adopt Symfony's releases faster.

Before I talk about the Drupal 9 release date, I want to explain another change we made, which has a minor impact on the Drupal 9 release date.

As announced over two years ago, Drupal 8 adopted a 6-month release cycle (two releases a year). Symfony, a PHP framework which Drupal depends on, uses a similar release schedule. Unfortunately the timing of Drupal's releases has historically occurred 1-2 months before Symfony's releases, which forces us to wait six months to adopt the latest Symfony release. To be able to adopt the latest Symfony releases faster, we are moving Drupal's minor releases to June and December. This will allow us to adopt the latest Symfony releases within one month. For example, Drupal 8.8.0 is now scheduled for December 2019.

We hope to release Drupal 9 on June 3, 2020

Drupal 8's biggest dependency is Symfony 3, which has an end-of-life date in November 2021. This means that after November 2021, security bugs in Symfony 3 will not get fixed. Therefore, we have to end-of-life Drupal 8 no later than November 2021. Or put differently, by November 2021, everyone should be on Drupal 9.

Working backwards from November 2021, we'd like to give site owners at least one year to upgrade from Drupal 8 to Drupal 9. While we could release Drupal 9 in December 2020, we decided it was better to try to release Drupal 9 on June 3, 2020. This gives site owners 18 months to upgrade. Plus, it also gives the Drupal core contributors an extra buffer in case we can't finish Drupal 9 in time for a summer release.

A timeline that shows we hope to release Drupal 9 in June 2020Planned Drupal 8 and 9 minor release dates.

We are building Drupal 9 in Drupal 8

Instead of working on Drupal 9 in a separate codebase, we are building Drupal 9 in Drupal 8. This means that we are adding new functionality as backwards-compatible code and experimental features. Once the code becomes stable, we deprecate any old functionality.

Let's look at an example. As mentioned, Drupal 8 currently depends on Symfony 3. Our plan is to release Drupal 9 with Symfony 4 or 5. Symfony 5's release is less than one year away, while Symfony 4 was released a year ago. Ideally Drupal 9 would ship with Symfony 5, both for the latest Symfony improvements and for longer support. However, Symfony 5 hasn't been released yet, so we don't know the scope of its changes, and we will have limited time to try to adopt it before Symfony 3's end-of-life.

We are currently working on making it possible to run Drupal 8 with Symfony 4 (without requiring it). Supporting Symfony 4 is a valuable stepping stone to Symfony 5 as it brings new capabilities for sites that choose to use it, and it eases the amount of Symfony 5 upgrade work to do for Drupal core developers. In the end, our goal is for Drupal 8 to work with Symfony 3, 4 or 5 so we can identify and fix any issues before we start requiring Symfony 4 or 5 in Drupal 9.

Another example is our support for reusable media. Drupal 8.0.0 launched without a media library. We are currently working on adding a media library to Drupal 8 so content authors can select pre-existing media from a library and easily embed them in their posts. Once the media library becomes stable, we can deprecate the use of the old file upload functionality and make the new media library the default experience.

The upgrade to Drupal 9 will be easy

Because we are building Drupal 9 in Drupal 8, the technology in Drupal 9 will have been battle-tested in Drupal 8.

For Drupal core contributors, this means that we have a limited set of tasks to do in Drupal 9 itself before we can release it. Releasing Drupal 9 will only depend on removing deprecated functionality and upgrading Drupal's dependencies, such as Symfony. This will make the release timing more predictable and the release quality more robust.

For contributed module authors, it means they already have the new technology at their service, so they can work on Drupal 9 compatibility earlier (e.g. they can start updating their media modules to use the new media library before Drupal 9 is released). Finally, their Drupal 8 know-how will remain highly relevant in Drupal 9, as there will not be a dramatic change in how Drupal is built.

But most importantly, for Drupal site owners, this means that it should be much easier to upgrade to Drupal 9 than it was to upgrade to Drupal 8. Drupal 9 will simply be the last version of Drupal 8, with its deprecations removed. This means we will not introduce new, backwards-compatibility breaking APIs or features in Drupal 9 except for our dependency updates. As long as modules and themes stay up-to-date with the latest Drupal 8 APIs, the upgrade to Drupal 9 should be easy. Therefore, we believe that a 12- to 18-month upgrade period should suffice.

So what is the big deal about Drupal 9, then?

The big deal about Drupal 9 is … that it should not be a big deal. The best way to be ready for Drupal 9 is to keep up with Drupal 8 updates. Make sure you are not using deprecated modules and APIs, and where possible, use the latest versions of dependencies. If you do that, your upgrade experience will be smooth, and that is a big deal for us.
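As a concrete example of what "not using deprecated APIs" means in practice, consider drupal_set_message(), which was deprecated in Drupal 8.5 in favour of the messenger service (a minimal sketch that assumes a Drupal bootstrap; the message text is invented):

```php
<?php
// Deprecated since Drupal 8.5.0 and removed in Drupal 9:
drupal_set_message(t('Settings saved.'));

// The Drupal 8.5+ replacement, which keeps working in Drupal 9:
\Drupal::messenger()->addStatus(t('Settings saved.'));
```

Code that has already switched to the new call needs no changes at all when Drupal 9 arrives.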

Special thanks to Gábor Hojtsy (Acquia), Angie Byron (Acquia), xjm (Acquia), and catch for their input in this blog post.

December 12, 2018

Dec 12 2018
Dec 12

Healthcare. When you hear this word, you get a feeling of ‘care’ and ‘improvement’, because that’s what the healthcare industry does: it cares for people and works to improve their health. And today’s growing number of healthcare providers, payers, and IT professionals need a HIPAA-compliant website that also ‘cares’ about how it processes, stores and transmits protected health information.



Open source software aligns tremendously well with the requirements of the healthcare sector. Drupal, an open source content management framework, aligns very well with Health IT interoperability and is a great fit for commercial applications at the enterprise level.

What is HIPAA compliance?

The Health Insurance Portability and Accountability Act of 1996 (HIPAA) is legislation that was implemented to make it easier for US workers to retain healthcare insurance coverage when they change or lose their jobs.

The Health Insurance Portability and Accountability Act of 1996 is a legislation that sets the standard for the protection of sensitive patient data

It sets the standard for the protection of sensitive patient data: any organisation that handles protected health information (PHI) has to make sure that all the required physical, network and process security measures are adhered to. According to Amazon Web Services, PHI includes a very wide set of personally identifiable health and health-related data, including insurance and billing information, diagnosis data, clinical care data, and lab results such as images and test results.

It also encourages the use of electronic health records (EHR) to improve the efficiency and quality of the US healthcare system through better information sharing.

It encompasses covered entities (CEs), meaning anyone who provides treatment, payment and operations in healthcare, and business associates, meaning anyone who can access patient information and supports treatment, payment or operations. Subcontractors, i.e. business associates of business associates, must also be in compliance.

How to make your website HIPAA compliant?

To make sure that your website is HIPAA-compliant, start by establishing new processes. Ensure that PHI is only accessible to authorised personnel, and establish processes for deleting, backing up and restoring PHI as needed. Emails containing PHI should be sent in an encrypted and secure manner.

Moreover, it is important to partner with web hosting companies that are HIPAA compliant and have processes for protecting PHI. Also, sign a business associate agreement with any third parties who have access to your patients’ PHI.

It is also of paramount importance to buy and implement an SSL certificate for your website and to ensure that all web forms on your site are encrypted and safe.
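For instance, if the site sits behind Nginx, a redirect along these lines ensures no form submission ever travels over plain HTTP. This is an illustrative fragment: the domain is a placeholder, and the actual TLS certificate configuration lives in the corresponding HTTPS server block:

```nginx
server {
    listen 80;
    server_name clinic.example.com;
    # Send all plain-HTTP traffic to HTTPS so that web forms
    # containing PHI are never submitted unencrypted.
    return 301 https://$host$request_uri;
}
```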

Use Case: Drupal ensures HIPAA compliance

Drupal, one of the most security-focused CMSs, comes with strong database encryption mechanisms. For high-security applications, Drupal can be configured for full database encryption. When whole-database encryption is not desirable, fine granularity is available for safeguarding more specific information: user accounts, particular forms, and even the values of particular fields can be encrypted in an otherwise plaintext database.

Drupal’s encryption system is configurable to adhere to the norms of PCI, HIPAA and state privacy laws constituting offsite encryption key management

Drupal’s encryption system is configurable to adhere to the norms of Payment Card Industry (PCI), HIPAA and state privacy laws constituting offsite encryption key management. Drupal is also a spectacular solution as an enterprise-grade healthcare system because it can be extended. It is possible to leverage Drupal as a content dissemination network, intranet, or even to incorporate several systems within a single platform.

Integrating Drupal’s data layer with an electronic medical record (EMR) system via a RESTful API connection can dramatically enhance interoperability, unlocking important data from proprietary systems and their data silos.

[Flowchart explaining the architecture of a HIPAA-compliant website. Source: Acquia]

Proprietary EMR systems excel at standardised approaches and at organising and storing enormous amounts of data, but they are not great for customisation or interoperability. The lack of interoperability in high-priced, single-vendor solutions, and the hurdles they create in an integrated healthcare delivery setting, result in HIPAA non-compliance. Even when files are electronic, the difficulty of moving them freely leads to frequent violations, such as missing legal authorisation or unencrypted emails.

Drupal and EMR integration is an excellent example of a Drupal-powered healthcare technology that empowers the staff to leverage critical and potentially life-saving data. Healthcare delivery systems, with the division of data silos, can witness the evolution from being a reactive diagnostic model to a proactive preventative model.

Drupal can be layered on top of multiple EMR systems within a medical group and the information can be compiled into one physician portal. The integration of Drupal and EMR systems can be made through numerous feeds from API calls, XML or JSON feeds and RESTful APIs.

Drupal offers granular user access control, that is, it gives site administrators complete authority over who can see and who can modify different parts of a site. It operates on a system of extensible user roles and access permissions. Thus, with the help of role-based provisioning, Drupal can protect critical data that resides behind a firewall in a HIPAA-secure environment.

Drupal can also be configured to query the database via web services integration on the basis of specific EHR authorisation requirements, while adhering to user access permission controls. The data therefore remains safe and secure at all times.

Conclusion

Drupal is a magnificent solution for enterprises in the healthcare sector to help them process, store and transmit protected health information.

We have been steadfast in our goals to deliver a great digital experience with our expertise in Drupal development.

Contact us at [email protected] to build a HIPAA-compliant Drupal website.

Dec 12 2018
Dec 12

After reading this from Ars Technica, which describes how a developer offered to 'help' the maintainer of an NPM module - and then slowly introduced malicious code to it - I can't help but wonder if the Drupal community is vulnerable to the exact same issue. Let's discuss!

Please, don't touch my package

NPM packages have been attacked before, and it's not pretty when it happens. Because of the way we use packages, it's a lot easier for nasty code to get sucked into a LOT of applications before anyone notices. Attacks on the code 'supply chain', therefore, have tended to be high-profile and high-damage.

NPM is used as a source for a huge number of code projects, many of which use other bits of code from other NPM packages. Even moderately sized applications for PC, mobile or web can have hundreds or thousands of NPM packages pulled in. It's common for packages to depend on other packages, which depend on other packages, which need other packages, which require... you get the picture? There are so many fragments, layers and extra bits pulled in through NPM that the developers of an application don't necessarily know all the packages being pulled into it. It's so easy to just type "npm install somefancypackageineed" without thinking and without vetting. NPM will just go and get everything for you, and you don't need to care.

That's how it should be, right? We should be able to just add code and know that it's safe, right? In a perfect world, that would be fine. But in reality there's an increasingly large amount of trust being given when you add a package to your application, and developers don't realise it. It's events like this that are making people aware again that they are including code in their projects that they either do not scrutinise or do not know exists.

Drupal's moment will come

Fortunately, Drupal is a little different from NPM. Whilst modules are often dependent on other modules, we tend to have far fewer layers going on. It's much easier to know what modules and dependencies you're adding when you include a new module. But that doesn't mean we're immune.

This particular incident came about when a tired, busy module maintainer was approached and offered help. It's a classic social engineering hack.

"Sure, I'll help you! [mwahaha]"

What struck me was that Drupal probably has hundreds of module maintainers in similar circumstances. Put yourself in those shoes, for a moment:
- You maintain an old Drupal 7 module
- It has a few thousand sites using it still
- You're busy, don't have time for it anymore

If somebody offered to sort it all out for you, what would you say? I'm pretty sure most would be ecstatic! Hurrah! But how would you vet your new favourite person in the whole world, before making them a co-maintainer and giving them the keys to the kingdom?

Alternatively, what of this:
- There is an old module, officially unmaintained
- It still has users
- The maintainer cannot be contacted

Drupal has a system for allowing people to be made maintainers of modules, when the original maintainer cannot be contacted. How are these people vetted? I'm sure there's some sort of check, but what if it's not enough?

In particular, I want to point out that as Drupal 7 ages, there will be more and more old, unmaintained and unloved modules still used by thousands of sites. If we forget them and fail to offer them sufficient protection, they will become vulnerable to attacks just like this. Drupal's moment will come.

This is an open source issue

It would be rather easy to run away screaming right now, having decided that open source technologies sound too dangerous. So I'll add some positive notes!

That Drupal should be increasingly exposed to the possibility of social engineering and malevolent maintainers is no new issue. There are millions of open source projects out there, all exposed to exactly these issues. As the internet grows and matures and ages, these issues will become more and more common; how many projects out there have tired and busy maintainers?!

For now, though, it must be said that the open source communities of the world have done what few thought possible. We have millions of projects and developers around the world successfully holding onto their trusty foundations, Drupal included. Many governments, enterprises and organisations have embraced the open source way of working on the premise that although there is risk in working differently, there is great merit in the reward. To this day, open source projects continue to thrive and to challenge the closed-source world. It is the scrutiny and the care of the open source community that keeps it clear and safe. As long as we continue to support and love and use our open source communities and contributions, they will stay in good repair and good stead.

If you were thinking of building a Drupal site and are suddenly now questioning that decision, then a read of Drupal's security statement is probably worthwhile.

Know your cattle by name

The key mitigation for this risk, it should be said, is for developers to know what code is in their application. It's our job to care and so it's our job to be paranoid. But it's not always easy. How many times have you installed a module without checking every line of code? How many times have you updated a module without checking the diff in Git? It's not always practicable to scan thousands and thousands of lines of code, just in case - and you'd hope that it's not necessary - but that doesn't mean it's not a good idea.

Using Composer with Drupal 8 makes installing new modules as easy as using NPM, and exposes the same problems to some extent. Add in a build pipeline, and it's very easy to never even see a single line of the new code that you've added to your project. Am I poking a paranoia nerve, yet? ;)

For further fun, think back to other attacks in the last year where sources for external JS dependencies were poisoned, resulting in compromised sites that didn't have a single shred of compromised code committed - it was all in the browser. How's THAT for scary!

In short, you are at risk if:
- You install a module without checking every line of code
- You update a module without checking every line of code / the diff
- You use a DEV release of a module
- You use composer
- Your application pulls in external dependencies

These actions, these ways of working all create dark corners in which evil code can lie undetected.

The light shall save you

Fortunately, it can easily be argued that Drupal Core is pretty safe from these sorts of issues. Phew. Thanks to the wide community of people contributing and keeping keen eyes on watch, Core code can be considered as well-protected. Under constant scrutiny, there's little that can go wrong. The light keeps the dark corners away.

Contrib land, however, is a little different. The most popular modules not only have maintainers (well done, guys!), but many supporting developers and regular release cycles and even official 'Security Coverage' status. We have brought light and trust to the contrib world, and that's a really important thing.

But what does 'Security Coverage' really provide? Can it fail? What happens if there is a malicious maintainer? I wonder.

When the light goes out

Many modules are starting to see the sun set. As dust gathers on old Drupal 7 modules and abandoned D8 alpha modules, the dark corners will start to appear. 'Security Coverage' status will eventually be dropped, or simply forgotten about, and issue lists will pile up. Away from the safety of strong community, keen eyes and dedicated maintainers, what used to be the pride of the Drupal community will one day become a relic. We must take care to keep pride in our heritage, and not allow it to become a source of danger.

Should a Drupal module maintainer be caught out by a trickster and have their work hacked, what would actually happen? Well, for most old D7 modules we'd probably see a few thousand sites pull in the code without looking, and it would likely take some time for the vulnerability to be noticed, let alone fixed.

Fortunately, most developers need a good reason to upgrade modules, so they won't just pull in a new malicious release straight away. But there's always a way, right? What if the hacker nicely bundled all those issues in the queue into a nice release? Or simply committed some new work to the DEV branch to see who would pull it in? There are loads of old modules still running on dev without an official release. How many of us have used them without pinning to a specific commit?

Vigilance is my middle name!

I have tried to ask a lot of questions, rather than simply doom-mongering. There's not an obvious resolution to all of these questions, and that's OK. Many may argue that, since Drupal has never had an issue like this before, we must already have sufficient measures in place to prevent such a thing happening - and I disagree. As the toolkit used by the world's hackers gets ever larger and ever more complex, we cannot afford to be lax in our perspective on security. We must be vigilant!

Module maintainers, remain vigilant. Ask good questions of new co-maintainers. Check their history. See what they've contributed. Find out who they really are.

Developers, remain vigilant. Know your cattle. Be familiar with what goes in and out of your code. Know where it comes from. Know who wrote it.

Drupalers, ask questions. How can we help module maintainers make good decisions? How can we support good developers and keep out the bad?

Some security tips!
- Always know what code you're adding to your project and whether you choose to trust it
- Drupal projects not covered by the Security Team should be carefully reviewed before use
- Know what changes are being made when performing module updates and upgrades
- If using a DEV version of a module in combination with a build process, always pin to a specific git commit (rather than HEAD), so that you don't pull in new code unknowingly
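On the last tip: Composer lets you pin a dev release to an exact commit, so a malicious push to the branch can never silently reach your build. A sketch, where the module name and commit hash are placeholders:

```json
{
    "require": {
        "drupal/example_module": "1.x-dev#0a1b2c3d4e5f"
    }
}
```

With the #commit suffix, composer update will keep checking out that exact commit until you deliberately change the reference yourself.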

Dec 12 2018
Dec 12

Using partial Twig templates is a great way to organise your frontend code because you can reuse code fragments in multiple templates. But what happens if you want to use the same partial template in multiple places and customise its content slightly?

You can do this by passing variables to the included partial template using the with keyword.

Here is an example of including a partial Twig template using the with keyword to pass it a variable:

 {% include 'paywall.html.twig' with { 'title': 'Sign in with your membership details to attend this event' } %}

Background: partial twig templates

Partial Twig templates help you reduce code duplication by adding code in a template and including it in multiple other templates. To find out more, check out my tutorial on partial Twig templates.

An example

I recently implemented a paywall for a membership site where people had to log in to view certain articles and book on events. For this to work, there is a box that consists of a login form, links to become a member (if you aren’t already) and some intro text to explain why they need to log in. Everything in this box was the same for articles and events except for the intro text.

All the code for the box is contained in a single partial Twig template, called paywall.html.twig. This template is then included where needed in other templates, such as the node template for articles and events.

The full snippet for paywall.html.twig is:

<div class="paywall">
  <h3>Please sign in</h3>
  {{ paywall_login_form }}
  <p><a href="https://befused.com/apply-membership">Apply for membership</a></p>
  <p><a href="https://befused.com/contact">Contact Us</a></p>
</div>

To include this partial template in the node template, I can use the following:

 {% include 'paywall.html.twig' %}

This will add the same markup when I include it in both the event and article node templates. To customise it, I can use the with keyword to pass in a variable. In this case, I want to customise the title, so that it can differ between the event and article templates.

On the event node template, I will use the following for the title:

 {% include 'paywall.html.twig' with { 'title': 'Sign in with your membership details to attend this event' } %}

And on the article node template, I can use the following for the title:

 {% include 'paywall.html.twig' with { 'title': 'Sign in with your membership details to continue reading this article' } %}

Then all you need to do is add title as a variable in paywall.html.twig.

This code snippet:

<h3>Please sign in</h3>

Will become:

<h3>{{ title }}</h3>

The full snippet for paywall.html.twig is:

<div class="paywall">
  <h3>{{ title }}</h3>
  {{ paywall_login_form }}
  <p><a href="https://befused.com/apply-membership">Apply for membership</a></p>
  <p><a href="https://befused.com/contact">Contact Us</a></p>
</div>

When this is rendered, the H3 for articles will become:

<h3>Sign in with your membership details to continue reading this article</h3>
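One refinement worth considering: if any template still includes the partial without passing a title, {{ title }} would render empty. Twig's standard default filter covers that case:

```twig
<h3>{{ title|default('Please sign in') }}</h3>
```

Includes that pass a title override the fallback; everything else keeps the original heading.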

Wrapping up

Using partial templates is a fantastic way to organise your code base and reuse fragments of template code. Using the with keyword takes this one step further by allowing you to customise your partial Twig templates depending on where they are used.

Dec 11 2018
Dec 11

It’s not easy to find a development partner you can trust. Particularly if you’ve never been immersed in the world of web development, it may take you some time to learn the language. That can make it even more difficult to know whether your partner is really staying on track with what you want to accomplish.

Luckily, knowing what to look for in a business partner can save you from all of the potential troubles later on. Ratings and reviews sites like Clutch can help you get there. This platform focuses on collecting and verifying detailed client feedback, and then using a proprietary research algorithm to rank thousands of firms across their platform. Ultimately, Clutch is a resource for business buyers to find the top-ranked service providers that match their business needs.

Luckily for us, users on Clutch will also find Kanopi Studios at the top of the list to do just that. Kanopi has been working with Clutch for a few months to collect and utilize client feedback to find out what we should focus on in the coming year. Through the process, we’ve coincidentally been named among the platform’s top digital design agencies in San Francisco.

Here are some of the leading client reviews that led us to this recognition:

“They were fantastic overall. We had great success communicating to their team via video conferencing, and they were able to answer every question we had. They also worked quickly and were very efficient with their time, so we got a great value overall.”

“Kanopi Studios’ staff members are their most impressive assets — extremely intelligent, experienced, and personable. Building a website is never easy, but working with people you both respect and like makes a huge difference.”

“Kanopi Studios successfully migrated our Drupal platform while preserving all the content that we’ve built up over the years. They worked hard to achieve a responsive design that works well on both mobile and large desktop displays.”

Not only have these kind words earned us recognition on Clutch, but we’ve also gained the attention of the how-to focused platform, The Manifest (where we are listed among top Drupal developers in San Francisco), and the portfolio-focused site, Visual Objects (where we are gaining ground among top web design agencies site-wide).

Thank you, as always, to our amazing clients for the reviews and the support.

Dec 11 2018
Dec 11

When it comes to ecommerce, a fast site can make a big difference in overall sales. I recently went through an exercise to tune a Drupal 7 Commerce site for high traffic on a Black Friday sales promotion. In previous years, the site would die in the beginning of the promotion, which really put a damper on the sale! I really enjoyed this exercise, finding all the issues in Commerce and Drupal that caused the site to perform sub-optimally.

FYI, We also have a Drupal 8 Commerce Performance Tuning guide here.

Scenario


In our baseline for this specific site, the response time was 25 seconds and we could handle only about 1,000 orders an hour, with a very heavy percentage of 500s, timeouts and general unresponsiveness. CPU and memory utilization on the web and database servers was very high.

Fast-forward to the end of all the tuning and we were able to handle 12K-15K orders an hour! The load generator couldn’t generate any more load, or the internet bandwidth on the load generators would get saturated, or something external to the Drupal environment became the limiter. At this point, we stopped trying to tune things. Horizontal capacity by adding additional webheads was linear. If we had added more webheads, they could handle the traffic. The database server wasn’t deadlocking. Its CPU and memory was very stable. CPU on the web servers would peak out at ~80% utilization, then more capacity would get added by spinning up a new server. The entire time, response time hovered around 500-600ms.

Enough about the scenario. Let’s dive into things.

Getting Started

The first step in tuning a site for a high volume of users and orders is to build a script that creates synthetic users, then populates and submits the forms to add items to the cart, register new users, and input the shipping address and any other payment details. There are a couple of options to do this. JMeter is very popular; I’ve used it in the past with pretty decent success. In my most recent scenario, I used locust.io because it was recommended as a good tool. I hadn’t used it before and gave it a try. It worked well. And there are other load testing tools available too.


OK, now you are generating load on the site, so start tuning it. I used New Relic's APM monitoring to flag transactions and PHP methods that were red flags. Transactions that take a long time or happen with great frequency are good candidates. If you don’t have access to New Relic, another option is Blackfire. Regardless of what you use for identifying slow transactions, use something.

Make sure that there’s nothing crazy going on. In my case, there was a really badly performing query called in the theme’s template.php, and it was getting run on every single page call, even when it wasn’t needed. Tuning that query gave us an instant speed-up in performance.

After that, we started digging into things. There are several core and contrib patches I’ll mention, explaining why and when you should consider applying them on your site.

In your specific commerce site, things might be different. You might have different payment gateways or external integration points. But the process of identifying pain points is the same. Run a 30-60 minute load test and find long-running PHP functions. Then fix them so they don’t take as long.

As a first step, install the Memcache (or Redis) module and set it up for locking. Without that one step, you’ll almost immediately run into deadlocks on the DB for the semaphore table. This is a critical first step. From my experience, deadlocks are the number one issue when running a site under load. And deadlocks on the semaphore table is probably the most common scenario. Do yourself a favor. Install Memcache and avoid the problem entirely.
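For reference, the Drupal 7 Memcache module ships a lock backend you can point settings.php at, along the lines of the following sketch. The paths assume the module lives under sites/all/modules; adjust them for your install:

```php
<?php
// settings.php: route Drupal's cache and lock systems through Memcache
// so the semaphore table never becomes a deadlock hotspot.
$conf['cache_backends'][] = 'sites/all/modules/memcache/memcache.inc';
$conf['cache_default_class'] = 'MemCacheDrupal';
$conf['lock_inc'] = 'sites/all/modules/memcache/memcache-lock.inc';
```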

Then see if you can disable form caching on checkout and user registration. This helped save a TON of traffic against the database for forms that really don’t need to be cached. More about that later in specific findings.

One last thing before diving into some findings...

SHOW ENGINE INNODB STATUS

...will become your favorite friend. Use it to find deadlocks on your MySQL server.
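Run it from any MySQL client and scan the output for the deadlock section:

```sql
SHOW ENGINE INNODB STATUS;
-- In the output, the "LATEST DETECTED DEADLOCK" section shows the two
-- transactions involved, the exact queries they ran, and the locks they
-- held: usually enough to identify the offending table (often semaphore).
```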

Specific Findings

The following section describes specific problems and links to issues and patches related to the problems.
  • Do not attempt field storage write when field content did not change
    Commerce and Rules run and reprocess an order a lot. And then blindly save the results. If nothing has changed, why re-save everything again? So don’t. Apply this patch and see fewer deadlocks on order saves.
  • field_sql_storage_field_storage_load does use an unnecessary sort in the DB leading to a filesort
    Many times it makes sense to use your database to process a query, until it doesn’t. In this case it leads to a filesort in MySQL (which you can discover using EXPLAIN) and to table locking and deadlocks. It is not that hard to do the sort in PHP instead, so do it.
  • Do not make entries in "cache_form" when viewing forms that use #ajax['callback'] (Drupal 7 port)
    This is a huge win, if you can pull it off. For transient form processing like login and checkout, disabling form cache is a huge relief to the DB. You might need to put the entire cart checkout onto a single page. No cart wizard. But the gains are pretty amazing.
  • If you are using captcha or anything with ajax on it on the login page, then you’ll need to make sure you are running the latest versions of Captcha and Recaptcha. See issues #2449209 and #2219993. Also, side note: if using the timing feature of recaptcha, the page this form falls on will not be cacheable and tends to bust page cache for important pages (like homepages that have a newsletter sign up form).
  • form_get_cache called when no_cache enabled
    You’ve done all that work to cut down on what is stored in cache. Great. But Drupal still wants to retrieve from cache. Let’s fix that. Cut down more DB calls.
  • commerce_payment_pane_checkout_form uses form_state values instead of input
    If your webshop is like most webshops, it is there to generate revenue. If you disable form caching on checkout, without this patch the values in your payment (including the ones for receiving payment) aren’t captured. Oops. Let’s fix that too.
  • Variable set stampede for js and css during asset building
    If you are using any auto scaling system and building out new servers when the site is under heavy load, you might already be using Advagg. But if you aren’t and are still using Drupal core’s asset system, spinning up a new system or two will cause some issues. Deadlocks galore when generating the CSS and JS aggregates. So either install Advagg or this patch.
  • Reduce database load by adding order_number during load
    Commerce and Rules really like to reprocess orders. An easy win is to reduce the number of one-off resaves and assign the order number after the first load.
  • Never use aggregation in maintenance mode
    While the site is under heavy load, the database sometimes becomes unreachable. Drupal treats this as maintenance mode. And tries to aggregate the JS/CSS and talk to the database. But the database isn’t reachable. It is a little ridiculous to aggregate JS/CSS on the maintenance page. And even more to try to talk to the database. So cut out that nonsense.
  • drupal_goto short circuits and doesn't set things to cache
    If you have any PHP classes you are using during the checkout, Drupal’s classloader auto loads them into memory. It then keeps track of where the files exist on the disk and this makes the next load of those classes just that much faster. Well, drupal_goto kills all this caching. And drupal_goto gets called when navigating through checkout.

Recap


Wow! That was a long list of performance enhancements. Here’s a quick recap, though: identify the critical flow of your application, generate load on that flow, and use a profiler to find pain points in that process. Then start picking things off: looking on drupal.org for existing issues, filing bugs, applying patches. Many of the issues discussed here will apply to your site; others won’t, and you’ll have different issues.

Surprisingly, or maybe not surprisingly, the biggest wins in our discovery process were the low hanging fruit, not the complex changes. That query in the template.php was killing the site. After that, switching to use Memcache for the semaphore table and eliminating form cache for orders also cut down on a lot of chatter with the database.

I hope you too can tune that Drupal 7 Commerce site to be able to handle thousands of orders an hour. The potential exists in the platform, it is just a matter of giving performance bottlenecks a little attention and fine tuning for your particular use case. Of course, if you need a little help we'd be happy to assist. A little bit of time spent can have you reaping the rewards from then on.

Contact Acro Media Today!

Dec 11 2018
Dec 11

Here are some performance tuning tips and instructions for setting up a very performant Drupal 8 Commerce site using Varnish, Redis, Nginx and MySQL. I’ve got this setup running nicely for at least 13,000 concurrent users and it should scale well past that.

FYI, We also have a Drupal 7 Commerce Performance Tuning guide here.

Varnish

Config

You’ll need some specific config for Drupal as well as some extra config to work nicely with BigPipe caching. These are standard for Varnish and Drupal and not specific to Commerce.
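As a sketch of the BigPipe side, the key is to let Varnish stream backend responses rather than buffer them, so BigPipe’s progressively rendered chunks reach the browser as Drupal emits them. A minimal VCL 4.0 fragment (whether you need it explicitly depends on your Varnish version, since streaming is the default in Varnish 4+):

```
vcl 4.0;

sub vcl_backend_response {
    # Stream the response to the client as it arrives from Drupal,
    # instead of waiting for the full body before delivering it.
    set beresp.do_stream = true;
}
```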

Drupal

You’ll want to set up the Purge and Varnish Purge modules to handle tag-based cache invalidation; nothing here is unique to Commerce, so you can follow the standard instructions. You will, however, want to make sure your pages actually are cached, as modules or small misconfigurations can often make a page uncacheable. To work nicely with Varnish, you want the entire page to be cacheable so your web server doesn’t even get hit. An underused module that I find very helpful is Renderviz, which will show you a 3D breakdown of which cache tags are attached to which parts of the page and can help you identify problem parts. I run

renderviz('max-age', '0')

to show me anything that can’t be cached. Usually the parts you find can be corrected and made cacheable.

For example: In a recent set of performance testing I was doing, I found a newsletter signup that appeared on the bottom of every page had an overly aggressive honeypot setting, which rendered the page uncacheable. Changing the settings to only apply to necessary forms, as well as correcting a language selector, turned tons of uncached pages into cacheable pages. Now these pages return <10ms and put zero load on my web servers or database.


Web Servers

PHP

Use the most modern version of PHP you can, preferably the latest stable. Never ever ever use PHP 5 which is terrible, terrible, terrible. Otherwise, make sure you have sufficient memory and allowed threads, and that will cover most of your PHP tuning. This is almost certainly the most resource-heavy part of your Drupal stack, but it is also easy to scale horizontally, pretty much indefinitely. Also, the more you can make use of Varnish, the less this will get used.
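As a rough illustration, that tuning mostly comes down to a few PHP-FPM pool and memory settings. The values below are assumptions to adapt to your hardware, not recommendations:

```ini
; PHP-FPM pool config (illustrative values; size pm.max_children to
; available RAM divided by your average PHP process size)
pm = dynamic
pm.max_children = 64
pm.start_servers = 8
pm.min_spare_servers = 4
pm.max_spare_servers = 16

; php.ini
memory_limit = 256M
```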

Nginx/Apache

Most of this is just making sure you can handle the number of connections. You may need to up the file limit...

ulimit -n

...of your web user to allow for more than 1024 connections per nginx instance.
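On a typical Linux setup, for example, the limit can be raised persistently for the web user and mirrored in the nginx config (the user name and numbers here are illustrative):

```
# /etc/security/limits.conf
www-data  soft  nofile  65536
www-data  hard  nofile  65536
```

```
# nginx.conf
worker_rlimit_nofile 65536;
events {
    worker_connections 8192;
}
```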

Database


A Commerce site is usually more write-heavy than your standard site, as your users create lots of "content" (aka carts and orders). This will usually change your MySQL config a bit, although the majority of your queries will still be reads. A pretty simple way to tune your site is to run...

mysqltuner

...against it after getting some real traffic data for at least a couple of days, or after simulating high traffic. Its recommendations will get you a pretty good setup.

There is one other VERY important thing you need to do: change your transaction isolation level from the default REPEATABLE-READ to READ-COMMITTED. REPEATABLE-READ is much too aggressive at locking to work with most Drupal sites, especially anything write-heavy. Without this change you will suffer constant deadlocks even at fairly low traffic levels. Frankly, I think this should be flagged on the status page, but my patch hasn’t gotten any traction.
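On a typical MySQL install this is a one-line change in the server config:

```ini
# /etc/mysql/my.cnf
[mysqld]
transaction-isolation = READ-COMMITTED
```

After restarting, you can verify the setting with `SELECT @@transaction_isolation;` (on MySQL 5.x the variable is named `tx_isolation` instead).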

Cache Server

Nothing special here, but you are going to want to use a separate caching option. It could be Memcache, Redis or even just a separate MySQL database. Redis is nice and fast, but the biggest gain is just splitting your cache away from the rest of your db so you can scale them easier.
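For Redis, the wiring is a few lines in settings.php. A minimal sketch, assuming the Redis contrib module and the PhpRedis extension are installed (the setting names come from that module; host and port are examples):

```php
// settings.php
$settings['redis.connection']['interface'] = 'PhpRedis';
$settings['redis.connection']['host'] = '127.0.0.1';
$settings['redis.connection']['port'] = 6379;
// Route the default cache backend to Redis, so only what you
// explicitly leave in MySQL stays there.
$settings['cache']['default'] = 'cache.backend.redis';
```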

Patches

There are a few specific patches that will be a great help to your performance.

_list cache tag invalidation

See: https://www.drupal.org/project/drupal/issues/2966607

    Every entity type has an entity_type_list cache tag, which gets invalidated any time an entity of that type is added or changed, signalling that any rendered lists of that entity type need to be rebuilt. This happens a LOT, but the invalidation itself is a relatively simple query:

update cachetags set invalidation=invalidation+1 where tag='my_entity_list'

This is an update, which is a blocking query: nothing else can edit this row while the query is running. That wouldn’t be so bad, except...

This query often gets run as part of larger tasks, in our case, such as when placing an order. A big task like this runs in a transaction, which basically means we save up all the queries and run them at once so they can be rolled back if something goes wrong. This means, though, that this row stays locked for the whole duration of the transaction, not just the short time it takes this little query to run. If the invalidation happens near the start of the transaction, it can take a query that would take 0.002 seconds and make it take 0.500 seconds, for example. Now, if more than two of these happen per second, we start to build up a queue of these queries, which keeps getting longer and longer until we just start returning timeouts. Since this query is part of the bigger order transaction, it stops the whole order from being processed and can bring your checkout flow to a halt.

Thankfully, the above listed patch allows these cache invalidations to be deferred so as to not block large transactions. I think the update query for invalidating cache tags is still a bottleneck as you could eventually reach it without these long transactions, but at this point that problem is more hypothetical than something you will practically encounter.

Add index to profiles

See: https://www.drupal.org/project/profile/issues/3017788

As you start getting more and more customers and orders, you will get more profiles. Loading them, especially for anonymous users, will really start to slow down and become a bottleneck. The listed patch simply adds an index to prevent that. Please note, this is a patch for the Profile module, not Commerce itself.

Make language switcher block cacheable

See: https://www.drupal.org/project/drupal/issues/2232375

This issue is unfortunately on hold pending some large core changes, but once it does land, this will allow the language switcher block to be used without worry of it blocking full page caching.

Conclusion

You should be able to scale well above 10,000 concurrent users with these tips. If you encounter any other bottlenecks or bugs, I’d love to hear about them. If you want help with some performance improvements from Acro Media and yours truly, feel free to contact us.

Contact Acro Media Today!

Dec 11 2018
Dec 11

Custom Post Types in WordPress

By Andrea Roenning

Custom post types are the key to taking a WordPress website from a simple blog to a robust system for managing many types of content and data. They help create a WordPress administrator experience which makes it easy for editors to add and edit repetitive content and gives the developer flexibility to create unique web pages to fit the site’s individual needs.

The Flexibility of Drupal 8

By Michael Miles

This article demonstrates six different methods of changing content and functionality in Drupal. Each method requires a different skill set and level of expertise, from non-technical inexperienced users to advanced Drupal developers. For each method, we describe the components, skills, knowledge, and limitations involved. The goal is to highlight Drupal’s flexibility as a Content Management framework.

How to Learn PHP Unit Testing With Katas

By David Hayes

Sometimes code is first tested when the unfortunate client or user feels like using the feature and tests the developer’s work. Hopefully, they’re not disappointed. That’s why you should be interested in unit testing and Test-Driven Development (TDD)—because it makes your life as a programmer better. It would have saved me hours of work in situations like those outlined in the last paragraph. TDD and automated testing let us focus on what we’re there for: solving complicated problems with code and leaving the computers to do more of the rest.

It’s About Time

By Colin DeCarlo

As applications scale and gain adoption, dates and time become much more of a concern than they once were. Bugs crop up, and developers start learning the woes of time zones and daylight saving time. Why did that reminder get sent a day early? How could that comment have been made at 5:30 a.m. if the post didn’t get published until 9:00 a.m.? Indiana has how many time zones?!

The Dev Lead Trenches: Creating a Culture

By Chris Tankersley

I have spent much time talking about creating and managing a working team, but there is one important piece I’ve left out of the puzzle until now—creating and crafting a culture that makes people want to work on your team and stay on your team. If you have a company culture which does not attract people, employees will be hard to find.

The Workshop: Producing Packages, Part Three

By Joe Ferguson

Over the past two months we’ve been building PHP Easy Math a purposely simple example library to demonstrate how to build a reusable package for the PHP ecosystem. Make sure to check the previous issues if you’re just now joining us! This article is the third and final installment in this series. We’re going to cover triaging and managing issues users may open as well as pull requests to your library.

Education Station: Interview Coding Challenges

By Edward Barnard

Meanwhile, the days where employees stay with the same company 20-30 years are long gone. We move around or move on from contract to contract. For many of us, this means formal job interviews. Many of those interviews include coding challenges; that can be a problem. Let’s talk about that!

Security Corner: Adventures in Hashing

By Eric Mann

Last month, the PHP community had the opportunity to come together for the excellent php[world] conference in Washington, D.C. As part of the event, we held a hackathon to work through some of the challenges posed by Cryptopals. Some of the cryptographic primitives we discussed were hashes, and it’s useful to take a more in-depth look at what they are and how to use them in PHP.

By James Titcumb

The final days of 2018 are looming on us, and I wanted to take a look back on some of the things, good and bad, that happened in and around the PHP community this year.

finally{}: The Seven Deadly Sins of Programming: Greed

By Eli White

As you are reading this magazine, we are in a time of winter holidays, typically associated with a spirit of goodwill towards others. That goodwill is often embodied in giving presents to people. Unfortunately, this can at times have the opposite effect of causing greed to form as people want more and more given to them. Alternatively, they may misinterpret it as greed, as a specific green-haired character learns in his holiday tale.

Dec 11 2018
Dec 11

Drupal 8 brought along with it many notable features which have made it easier to use and develop for the platform. One such feature was the incorporation of RESTful web services in Drupal 8 core for API calls. Using RESTful web services, a host of possibilities for customization of the platform open up; not to mention that these web services are the underlying principles which enable the concept of ‘headless Drupal’. In this post, I’ll start by performing a very basic Drupal function using these web services, i.e. creating a node.

Enable Modules

Start by enabling the following 4 core modules in Drupal:

  • HAL;
  • HTTP Basic Authentication;
  • RESTful Web Services;
  • Serialization.

Download the REST UI module as well, since it allows changing permissions and settings through a simple GUI, negating the need to go into the rest.settings.yml file to make the same changes.
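If you prefer the command line, the same modules can be enabled with Drush (assuming Drush is installed; the module machine names are shown):

```
drush en hal basic_auth rest serialization -y
drush en restui -y
```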

Create User and Set Permissions

I’m now going to create a new authenticated user for the site, to show which permissions need to be set. Note that if you log in as an admin, all the following permissions will already be enabled.

Now, create a new user and navigate to admin/configuration/web services/REST. Click edit for the content row, since that’s what we will be working with in this article, and then set the permissions as shown in the screenshot below:

Next, I’ll set the proper permissions for our new authenticated user so that they can create, edit and delete content. I do this by navigating to admin/people/permissions and setting the following permissions:

  • Basic Page: Create new content
  • Basic Page: Delete own content
  • Basic Page: Edit own content

Get User’s Token

Before we start creating a node, we need to get our new user’s token in order to pass authentication. This can be done by testing API calls. For testing API calls, I’ll use the Restlet Client – Rest API Testing extension for Chrome. Of course, if you prefer some other method, feel free to use that one instead. 

Now, to test my API calls and get the new user’s token, I’ll first log out of my site as an admin and log in with the new user account. Now, I’ll simply copy the URL of my site, add rest/session/token at the end of it and paste it in the Restlet client’s URL field. Next, I’ll select the ‘GET’ method from the dropdown and send the URL to get the token from the body field. Here’s a screenshot from an earlier call:

Create Node and Test API Call

Now that I’ve got my unique token, I can start creating a node. To do so, send a POST request to entity/node, with the content-type set to application/hal+json. The title and type fields should be declared in the body field like this:

{
  "_links": {
    "type": {
      "href": "http://example.com/rest/type/node/page"
    }
  },
  "title": [
    {
      "value": "My first page"
    }
  ]
}

The following headers have to be added for this call:
Content-Type: application/hal+json
X-CSRF-Token: the token we got in the previous step

Click ‘Add authorization’ and enter the credentials of the authenticated user to add authentication if required.
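The same call can be made from the command line. The sketch below writes the request body to a file, sanity-checks that it is valid JSON locally, and shows the equivalent curl invocation ("example.com", the user name and the password are placeholders for your own site and credentials):

```shell
# Save the request body locally and check that it is valid JSON
# before firing the call.
cat > node.json <<'EOF'
{
  "_links": {
    "type": { "href": "http://example.com/rest/type/node/page" }
  },
  "title": [ { "value": "My first page" } ]
}
EOF
python3 -m json.tool node.json > /dev/null && echo "node.json is valid JSON"

# The same POST as in the Restlet client, using curl (token and
# credentials are placeholders):
# curl -X POST 'http://example.com/entity/node?_format=hal_json' \
#   -H 'Content-Type: application/hal+json' \
#   -H "X-CSRF-Token: $TOKEN" \
#   -u myuser:mypassword \
#   -d @node.json
```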

This is what it should look like before firing the API call:

Next, go to your Drupal site and navigate to admin/content. Check to see if the node you created is visible on your site. If it’s there, your API call works.

Conclusion

That’s it! You’ve successfully created your first Drupal node using RESTful web services. Note that this is a very basic function of RESTful web services intended to give a demo of the feature.

Having trouble with your Drupal project? Stuck with customizing your Drupal site to your own liking? Lay aside your worries and hand them over to us at Agiledrop. 
 

Dec 11 2018
Dec 11

If you are looking to build your website on Drupal in 2019 but don’t know where to start, you need to know about distributions.

Here’s a list of top Drupal 8 distributions you could use in 2019. 



The Basics. What are Drupal Distributions? 

Distributions are full copies of Drupal that include core, in addition to themes, modules, libraries, and installation profiles. In layman’s terms, a distribution is a ready-made copy of a website tailored to the explicit requirements of a particular industry or niche.

“A Drupal distribution packages a set of contributed and custom modules together with Drupal core to optimize Drupal for a specific use case or industry”

Accordingly, there are distributions for Media and Publishing, Government, E-commerce, Education and Travel, among others.

There are two types of Drupal distributions:

  1. Full-featured distributions: complete solutions for specialized use cases (as discussed above).
  2. Other distributions that serve as starting points for developers and site builders.

Why Bother Yourself with Distributions?

  1. Out-of-the-box additional features: Remember "...distributions are a package of core, modules and themes"? This means zero time spent searching for Drupal modules and other required features; you don’t need extra work to set up your website. You get all the features your use case requires in your distribution.
     
  2. Easy website setup: Since you don’t need to hunt for additional features, setting up your website is faster. Launching your website now means installing the distribution and getting a preconfigured site in a single download!
     
  3. Less maintenance: Downloading a host of separate modules and features means more maintenance and more security updates. With a distribution, the whole website ecosystem comes as one package, so everything is updated in a single update.

Drupal Distributions for your Website

Drupal Starter Kit

  1. Varbase 

    Varbase is an enhanced Drupal distribution packed with adaptive functionality and essential modules that speed up your development process and provide you with standardized configuration, making your life easier.

    The essence of Varbase lies in the basic concept of DRY (Don’t Repeat Yourself). It is probably the best starter distribution because it relieves you from re-adding all the modules, features and configurations that go into every Drupal project.

  2. Panopoly

    Panopoly is another powerful base distribution. At its roots lies the power of Chaos Tools and Panels magic. Designed to be both a general foundation for site building and a base framework upon which other distributions can be built, Panopoly provides search, widgets and responsiveness at its core.

  3. Presto! - Drupal 8 Starter Kit

    Claiming to save development time by up to 20%, Presto! is a Drupal 8 starter kit with more than a few tricks up its sleeve!

    With intelligent content editing functionality, sensible configuration defaults, and optional eCommerce integration, Presto is ready to use right out of the box. Other functionality includes:

    • An Article content type with some pre-configured fields

    • A couple of pre-configured user roles: Administrator & Editor

    • The ability to share nodes on social media

    • Pathauto for automatic URL aliasing

    • A Basic Page content type with a Paragraphs-based body field

    • Some pre-configured Paragraph types:

      1. Textbox

      2. Image

      3. Promo bar

      4. Divider

      5. Carousel

      6. Block (allows the embedding of Drupal blocks)

Media and Publishing

  1. Thunder 

    Thunder is the distribution for professional publishing; its functionality is publisher-centric. At its core, it has the Paragraphs module, the Media Entity module, and various SEO features. As free and open-source technology, it is cost-effective and cuts down on development effort and time.
     

    Top 3 reasons why Thunder is popular

    Easy to install, deploy and extend, Thunder is SEO-friendly, responsive and Google AMP-friendly, and it allows editors to publish their content easily as Instant Articles on Facebook as well.

  2. Lightning

    Lightning enables developers to create a great authoring experience. It delivers a powerful set of tools to your editorial team and site builders right out of the box and is best suited to media and publishing websites. The distribution provides a lightweight framework, documentation and best-practice examples for building working solutions in Drupal.

    Built around four modules that enhance the editing experience (Workflow, Media, Layout and Preview), it takes advantage of Drupal 8 functionality to set a new standard for enterprise authoring.

Government

  1. aGov

    aGov is a Drupal distribution built for government websites and was developed for the Australian Government. It incorporates the Digital Transformation Agency's UI Kit, helping you adhere to the Digital Service Standard.



    aGov is ideal for any federal, state or local government agency wanting to move their websites quickly and easily to Drupal whilst retaining full control of their codebase and choice of hosting provider.

    • WCAG 2.0 Level AA compliance, independently audited by Media Access Australia

    • Editor Workflows (Workbench Moderation)

    • Responsive design 

    • Example content - to get you started quickly

    • Common content types: News, Events, Publications, Blogs

    • WYSIWYG editor with media library
       

  2. deGov 

    After aGov, deGov is another Drupal 8 distribution, this one focusing on the needs of German governmental organisations. It can be used to build government websites at all levels (federal, regional, local) to publish information. Built on Acquia Lightning, it extends it with functions that meet the use cases of different scenarios.
     



    It provides a service-oriented e-government portal to close the gap between citizens and your administration, with various citizen engagement portals for discussion.

    Features of the deGov distribution include:

    • Open311 portals for civic issue tracking

    • Open data portals to publish and create communities around data

    • Intranet/Extranet for government employees

Community

  1. OpenSocial

    OpenSocial is an online community and intranet solution for nonprofits and innovative companies. It is currently used by customers like the United Nations, Greenpeace and hundreds of smaller organizations to connect with their employees, volunteers and other stakeholders.

    It is an out-of-the-box solution for building online community platforms. Providing features such as notifications, timelines, events, follows and groups, it helps personalise your news feed around the people you want to connect with.

Learning Management System

  1. Opigno LMS

    A full-fledged Learning Management System based on Drupal, Opigno is the quickest way to get started with the e-learning framework.

    It allows you to:

    • Manage training paths organized in courses and lessons

    • Assess students with quizzes

    • Award certificates

    • Facilitate interactions with live meetings, forums, and chats

      SCORM (1.2 and 2004 v3) compliant, it integrates the innovative H5P technology, making it possible to create rich interactive training content.

  2. OpenLMS

    OpenLMS is an interactive LMS that makes learning fun and easy. It provides various content types, such as written content, video lectures, quizzes, calendars, and documents, to be used as course material by students.

    Modules such as Webform are part of its core, and it also provides interactive content powered by H5P which can be added to a course.

    Read about the top Drupal e-learning modules.

Headless Architecture

  1. Contenta

    Contenta helps you make the headless transition smoothly. Contenta (with Drupal in the backend) can drive front ends built with technologies like Python, PHP, Java, React, Vue and Ember.

    Powerful and complex, Contenta is API-first.

    It optionally installs all the content types and dummy content necessary to build an application.

OpenSense Labs has sound experience in building new pillars for organizations. We would love to hear your requirements, drop a mail at [email protected].

Dec 11 2018
Dec 11

If you are looking to build your website on Drupal in 2019 but don’t know where to start from, you need to know more about the Distributions. 

Here’s a list of top Drupal 8 distributions you could use in 2019. 

eight logos on a white background


The Basics. What are Drupal Distributions? 

Distributions are full copies of Drupal that include core, in addition to themes, modules, libraries, and installation profiles. For a layman’s understanding distributions are a website (dummy) copy for an industry with an explicit requirement for that niche. 

“A Drupal distribution packages a set of contributed and custom modules together with Drupal core to optimize Drupal for a specific use case or industry”

Accordingly, there are Distributions for Media and Publishing, Government,  E-commerce, Education, Travel among others. 

There are two types of Drupal distributions:

  1. Full-featured distributions: complete solutions for specialized use cases (as discussed above).
  2. Other distributions as the starting points for developers and site builders.

Why Bother Yourself with Distributions?

  1. Out-of-the-box Additional Features: Remember “...Distributions are a package of the core, modules, themes”? Well, this means zero time searching for the Drupal modules and other required features, you don’t need to work extra to set up your website. You get all the required features according to your requirement in your distribution. 
     
  2. Easy set up of the website: Since you don’t need to look for additional features, it saves your time to set up your website. Launching your website now means setting up the distribution and getting a preconfigured site in a single download!
     
  3. Less Maintenance: Downloading a host of modules and features means more maintenance and more security. With distributions, you get the whole website ecosystem into one, maintenance is not an issue since everything will be updated in a single update. 

Drupal Distributions for your Website

Drupal Starter Kit

  1. Varbase 

    Varbase is an enhanced Drupal distribution packed with adaptive functionalities and essential modules, that can speed up your development process, and provides you with standardized configurations, making your life easier.

    The essence of Varbase lies within the basic concept of DRY (Don’t Repeat Yourself). It is probably the best starter distribution because it can help relieve you from repeating all the modules, features, configurations that are included in a Drupal project.

  2. Panopoly

    six white boxes arranged in a square in black background Panopoly is another powerful base distribution. At its roots lies the power of Chaos Tools and Panels magic. Designed to be both a general foundation for site building and a base framework upon which one can build other distributions, Panopoly provides search, widgets, responsiveness at its core. 

  3. Presto! - Drupal 8 Starter Kit

    Claiming to save the development time by up to 20%, Presto! is a Drupal 8 Starter Kit with more than a few tricks up its sleeve!

    With intelligent content editing functionality, configuration defaults, and optional eCommerce integration, Presto is ready to use right out-of-the-box. Other functionalities include - 

    • An Article content type with some pre-configured fields

    • A couple of pre-configured user roles: Administrator & Editor

    • The ability to share nodes on social media

    • Pathauto for automatical URL aliasing

    • A Basic Page content type with a Paragraphs-based body field

    • Some pre-configured Paragraph types:

      1. Textbox

      2. Image

      3. Promo bar

      4. Divider

      5. Carousel

      6. Block (allows the embedding of Drupal block

Media and Publishing

  1. Thunder 

    The distribution for professional publishing, Thunder functionalities are publisher-centric. At its core, it has the Paragraphs module, media entity module, and various SEO features to keep it working. As a free and open-source technology, it is cost-effective, knocks down developmental efforts and isn't time-consuming.
     

    top 3 reason why thunder is popular

    Easy to install, deploy and add new functionality, Thunder is SEO friendly, responsive, Google AMP friendly, and allowing editors to publish their content easily as instant articles on Facebook as well. 

  2. Lightning

    Lightning enables the developers to create a great authoring experience. It delivers a powerful set of tools to your editorial team and site builders right out of the box and is best suited for the media and publishing websites. The distribution provides a lightweight framework, documentation and best practice examples for building working solutions in Drupal.

    Built using the top 4 modules to enhance the editing experience - Workflow, Media, Layout, and Preview - it has taken the advantage of D8 functionalities and is tightly coupled to make up the new standard for enterprise authoring. 

Government

  1. aGov

    aGov is a Drupal distribution built for government websites and was developed for the Australian Government. It incorporates the Digital Transformation Agency's UI Kit, helping you adhere to the Digital Service Standard.

    australian govenment website with blue banner and text written in blocks


    aGov is ideal for any federal, state or local government agency wanting to move their websites quickly and easily to Drupal whilst retaining full control of their codebase and choice of hosting provider.

    • WCAG 2.0 Level AA compliance, independently audited by Media Access Australia

    • Editor Workflows (Workbench Moderation)

    • Responsive design 

    • Example content - to get you started quickly

    • Common content types: News, Events, Publications, Blogs

    • WYSIWYG editor with media library
       

  2. deGov 

    After aGov, Drupal’s deGov is another Drupal 8 distribution focussing on the needs of (German) governmental organisations. One can build all levels of government websites (federal, regional, local) to publish information. With the base of Acquia Lightning, it extends the valuable functions to meet the use cases for different scenarios.
     

    six columns with text written on it


    It provides a service-oriented E-Government portal to close the gap between citizens and your administration with various citizen engagement portals to discuss. 

    Following are the features of deGov distributions:

    • Open311 portals for civic issue tracking

    • Open data portals to publish and create communities around data

    • Intranet/Extranet for government employees

Community

  1. OpenSocial

    OpenSocial is an online community and intranet solution for nonprofits and innovative companies. It is currently employed by customers like the United Nations, Greenpeace and hundreds of smaller organizations to connect with their employees, volunteers and other stakeholders.

    It is an out-of-the-box solution to build online community platforms. Providing with features such as notifications, timeline, events, follow, groups it can help personalise your news feed with the people you want to connect with.

Learning Management System

  1. Opigno LMS

    A full-fledged Learning Management System based on Drupal, Opigno is the quickest way to get started with the e-learning framework.

    It allows to:

    • Manage training paths organized in courses and lessons

    • Assess students with quizzes

    • Award certificates

    • Facilitate interactions with live meetings, forums, and chats

      SCORM (1.2 and 2004 v3) compliant, it integrates the innovative H5P technology, making possible to create rich interactive training contents.

  2. OpenLMS

    OpenLMS is an interactive LMS to make learning a fun and easier process. It provides various kinds of the content type such as written content, video lectures, quiz, calendar, and documents to be used as course material for students/users.

    Modules such as Webform are part of its core. It also provides interactive content powered by H5P, which can be added to a course.

    Read about the top Drupal e-learning modules.

Headless Architecture

  1. Contenta

    Contenta helps you make the headless transition smooth. With Drupal as the backend, Contenta can serve consumers built with languages and frameworks like Python, PHP, Java, React, Vue and Ember.

    Powerful and complex, Contenta is API-First.

    It optionally installs all the content types and dummy content necessary to build an application.  

OpenSense Labs has sound experience in building new pillars for organizations. We would love to hear your requirements; drop us a mail at [email protected].

Dec 11 2018
Dec 11

Like the display modes that allow you to display an entity in multiple ways, Drupal 8 allows you to create multiple form modes that can be used on entities, whether they are users, taxonomy terms, contents or any custom entity. Let's look at how to use these form modes, from creating one to using it to customize, for example, the input of a user's information.

The creation of form modes is quite simple and can be done in a few clicks, from the administration interface (from the path /admin/structure/display-modes/form).

Drupal 8 form modes

Let's add a new form mode that we will call for example Profil.

Form mode Profil

The User entity now has a new Profil form mode, in addition to the existing Register form mode (used for the registration form on a Drupal 8 site).

We find our new form mode on the form display configuration page for Drupal users (path /admin/config/people/accounts/form-display).
 

Profil form

We can activate it and then configure which fields will be rendered in this form mode. For example, we can configure this form mode to fill in only the First name, Last name, Organization and Picture fields that have been created for the User entity.

Form profil configuration

So far so good. But how do we use our new form mode? From which path?

To finalize this we will use a small module that we will call my_module.

This module will allow us to declare our new form mode for the User entity, and to create a route, as well as a menu, which will allow us to access and complete our form.

First, let's declare this form mode and associate a Class with it, from the file my_module.module.

/**
 * Implements hook_entity_type_build().
 */
function my_module_entity_type_build(array &$entity_types) {
  $entity_types['user']->setFormClass('profil', 'Drupal\user\ProfileForm');
}

Here we associate the User entity's default form class, ProfileForm, with our profil form mode. We could just as easily have used a class MyProfilCustomForm extending the AccountForm class.
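As a sketch of that alternative (the class name MyProfilCustomForm and the status message are hypothetical, not part of any existing module), such a class could live in src/Form/MyProfilCustomForm.php and be registered with setFormClass('profil', 'Drupal\my_module\Form\MyProfilCustomForm') instead:

```php
<?php

namespace Drupal\my_module\Form;

use Drupal\Core\Form\FormStateInterface;
use Drupal\user\AccountForm;

/**
 * Hypothetical custom form class for the profil form mode.
 */
class MyProfilCustomForm extends AccountForm {

  /**
   * {@inheritdoc}
   */
  public function save(array $form, FormStateInterface $form_state) {
    parent::save($form, $form_state);
    // Any business and/or complex logic specific to the profil
    // form mode would go here.
    $this->messenger()->addStatus($this->t('Your profile has been updated.'));
  }

}
```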

All we have to do now is create a route, from the file my_module.routing.yml, and we can then access our form.

my_module.user.profil:
  path: '/user/{user}/profil'
  defaults:
    _entity_form: 'user.profil'
    _title: 'Profil'
  requirements:
    _entity_access: 'user.update'
    _custom_access: '\Drupal\my_module\Access\MyModuleUserAccess::editProfil'
    user: \d+
  options:
    _admin_route: FALSE

In the route declaration, we specify the path (/user/{user}/profil), the form mode to be used for the User entity, and the route's access requirements (the right to modify a user, plus custom permissions if necessary). We can also specify whether or not the route is an administration route, which determines the theme (back office or front office) under which the form will be rendered.

To finalize our new form mode, we will create a dynamic menu entry in the user account menu, in order to give an access link to users or administrators. In the file my_module.links.menu.yml, let's add an entry to create the corresponding menu link.

my_module.user.profil:
  title: 'Profil'
  weight: 10
  route_name: my_module.user.profil
  base_route: entity.user.canonical
  menu_name: user-account
  class: Drupal\my_module\Plugin\Menu\ProfilUserBase

What is notable here, in this menu entry, is the class property that will allow us to define the dynamic {user} parameter of the route corresponding to this menu entry. 

This ProfilUserBase class simply returns the ID of the user whose page is being visited when the menu link is displayed on a user's page, and falls back to the ID of the currently logged-in user otherwise.

<?php

namespace Drupal\my_module\Plugin\Menu;

use Drupal\Core\Menu\MenuLinkDefault;
use Drupal\Core\Session\AccountInterface;
use Drupal\Core\Plugin\ContainerFactoryPluginInterface;
use Symfony\Component\DependencyInjection\ContainerInterface;
use Drupal\Core\Menu\StaticMenuLinkOverridesInterface;
use Drupal\Core\Routing\RouteMatchInterface;
use Drupal\Core\Entity\EntityTypeManagerInterface;

/**
 * Profile Menu Link
 */
class ProfilUserBase extends MenuLinkDefault implements ContainerFactoryPluginInterface {

  /**
   * The entity type manager.
   *
   * @var \Drupal\Core\Entity\EntityTypeManagerInterface
   */
  protected $entityTypeManager;

  /**
   * The current route match service.
   *
   * @var \Drupal\Core\Routing\RouteMatchInterface
   */
  protected $currentRouteMatch;

  /**
   * The current user.
   *
   * @var \Drupal\Core\Session\AccountInterface
   */
  protected $currentUser;

  /**
   * Constructs a new MenuLinkDefault.
   *
   * @param array $configuration
   *   A configuration array containing information about the plugin instance.
   * @param string $plugin_id
   *   The plugin_id for the plugin instance.
   * @param mixed $plugin_definition
   *   The plugin implementation definition.
   * @param \Drupal\Core\Menu\StaticMenuLinkOverridesInterface $static_override
   *   The static override storage.
   * @param \Drupal\Core\Entity\EntityTypeManagerInterface $entity_type_manager
   *   The entity type manager service.
   * @param \Drupal\Core\Routing\RouteMatchInterface $current_route_match
   *   The current route match service.
   * @param \Drupal\Core\Session\AccountInterface $current_user
   *   The current user.
   */
  public function __construct(array $configuration, $plugin_id, $plugin_definition, StaticMenuLinkOverridesInterface $static_override, EntityTypeManagerInterface $entity_type_manager, RouteMatchInterface $current_route_match, AccountInterface $current_user) {
    parent::__construct($configuration, $plugin_id, $plugin_definition, $static_override);
    $this->entityTypeManager = $entity_type_manager;
    $this->currentRouteMatch = $current_route_match;
    $this->currentUser = $current_user;
  }

  /**
   * {@inheritdoc}
   */
  public static function create(ContainerInterface $container, array $configuration, $plugin_id, $plugin_definition) {
    return new static(
      $configuration,
      $plugin_id,
      $plugin_definition,
      $container->get('menu_link.static.overrides'),
      $container->get('entity_type.manager'),
      $container->get('current_route_match'),
      $container->get('current_user')
    );
  }

  /**
   * {@inheritdoc}
   */
  public function getRouteParameters() {
    return ['user' => $this->getUserIdFromRoute()];
  }

  /**
   * {@inheritdoc}
   */
  public function getCacheContexts() {
    return ['user', 'url'];
  }


  /**
   * Get the Account user id from the request or fallback to current user.
   *
   * @return int
   */
  public function getUserIdFromRoute() {
    $user = $this->currentRouteMatch->getParameter('user');
    if ($user instanceof AccountInterface) {
      return $user->id();
    }
    elseif (!empty($user)) {
      $user = $this->entityTypeManager->getStorage('user')->load($user);
      if ($user instanceof AccountInterface) {
        return $user->id();
      }
    }

    return $this->currentUser->id();
  }

}

And now you can use your new form mode, customizing it at will, either from the graphical interface or from a custom form class that lets you easily introduce any business and/or complex logic.

Finally, I can't conclude this post on Drupal 8 form modes without mentioning the Form Mode Manager module, which can allow you to do all this without the need for a Drupal developer. Depending on your needs and the level of expertise you want, you can choose one or the other of these solutions. But that may be the subject of another post.

Dec 10 2018
Dec 10

This blog has been re-posted and edited with permission from Dries Buytaert's blog. Please leave your comments on the original post.

This week I was in New York for a day. At lunch, Sir Martin Sorrell pointed out that Microsoft overtook Apple as the most valuable technology company as measured by market capitalization. It's a close call, but Microsoft is now worth $805 billion while Apple is worth $800 billion.

What is interesting to me are the radical "ebbs and flows" of each organization.

In the '80s, Apple's market cap was twice that of Microsoft. Microsoft overtook Apple in the early '90s, and by the late '90s, Microsoft's valuation was a whopping thirty-five times Apple's. With a 35x difference in valuation, no one would have guessed that Apple would ever regain the number-one position. However, Apple did the unthinkable and regained its crown in market capitalization. By 2015, Apple was, once again, valued two times more than Microsoft.

And now, eighteen years after Apple took the lead, Microsoft has taken the lead again. Everything old is new again.

As you'd expect, the change in market capitalization corresponds with the evolution and commercial success of their product portfolios. In the '90s, Microsoft took the lead based on the success of the Windows operating system. Apple regained the crown in the 2000s based on the success of the iPhone. Today, Microsoft benefits from the rise of cloud computing, Software-as-a-Service and Open Source, while Apple is trying to navigate the saturation of the smartphone market.

It's unclear if Microsoft will maintain and extend its lead. On one hand, the market trends are certainly in Microsoft's favor. On the other hand, Apple still makes a lot more money than Microsoft. I believe Apple is slightly undervalued and Microsoft slightly overvalued; the current valuation difference is not justified.

At the end of the day, what I find to be most interesting is how both organizations have continued to reinvent themselves. This reinvention has happened roughly every ten years. During these periods of reinvention, organizations can fall out of favor for long stretches of time. However, as both organizations prove, it pays off to reinvent yourself, and to be patient product and market builders.

Dec 10 2018
Dec 10

Nowadays everyone has an API, and it's fairly common to want a website you're working on to fetch data from a 3rd party API. That's because pulling 3rd party data into your website can not only enrich your website's content, but also prevent the need to duplicate commonly needed data.

API-provided data could include displaying weather information, going through drupal.org projects, looking through census results, or even displaying Magic the Gathering card data. In fact, every WordPress site comes with an active JSON API out of the box.

There really is an API for almost anything. It's no surprise that you'll eventually want to consume some random API while developing a Drupal website. Enough of the sales pitch, let's get started consuming JSON APIs.

The Plan:

  1. Look at what it takes to fetch and consume JSON data in general.
  2. Explore the popular Guzzle PHP library.
  3. Create a Drupal 8 module that consumes an API and displays the data on your website.

Seems pretty straightforward huh? I think so too, but before we go any further let's define some of the terms that appear throughout this post.

  • API - Application Programming Interface. Literally, "a thing developers can use to interact with another program".
  • Request - An interaction with a web API. Very similar to visiting a website in your browser. When you visit a website in your browser, you are making a "request" to the website's server.
  • Base URI - Part of a URL that is the root of a web API request. This tends to be a domain name such as "api.mywebsite.com", but can also include a path such as "mywebsite.com/api".
  • Endpoint - Part of the URL for an API request that usually defines what type of data you are requesting. For example, one of the most common WordPress API endpoints is "posts", which retrieves Posts from the WordPress website.
  • Query Parameters - Part of the URL for an API request that further describes the specific data you are requesting. Query parameters look like key-value pairs within the URL for the API request.

Here is an example of an API request URL:
http://demo.wp-api.org/wp-json/wp/v2/posts?per_page=2

And here is how that API request URL breaks down into the terms defined above:

  • Base URI - http://demo.wp-api.org/wp-json/wp/v2/
  • Endpoint - posts
  • Query Parameters - ?per_page=2

Hopefully that helps clarify some of the terminology used throughout the rest of this post.
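Putting those parts back together is mechanical. Here is a small sketch in plain PHP (using only the standard http_build_query() function) that assembles the example request URL from its three pieces:

```php
<?php

// The three parts of the example API request defined above.
$base_uri = 'http://demo.wp-api.org/wp-json/wp/v2/';
$endpoint = 'posts';
$query_parameters = ['per_page' => 2];

// http_build_query() turns the key-value pairs into a URL-encoded
// query string ("per_page=2"); prepending "?" separates it from the path.
$url = $base_uri . $endpoint . '?' . http_build_query($query_parameters);

print $url; // http://demo.wp-api.org/wp-json/wp/v2/posts?per_page=2
```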

Getting Started: Fetching JSON in PHP

The first thing we want to look at are the basics of getting data from an API. We need the following things:

  1. A public API that will provide us with data.
  2. A way to visit the API and get its data.
  3. A method of converting the raw data the API gives us into an array so we can easily process and output it as we see fit.

Broken down into their tiniest pieces, each of the above needs is fairly straightforward. Out of the box, PHP provides us with a function, file_get_contents(), that handles both visiting an API and getting its data, and another function, json_decode(), for converting that raw string data into an array.

As for the public API… I'd like to introduce you to Cat Facts, a public API that will provide us with all the facts about cats we could ever want!

Let's put all these pieces together and write a simple PHP script that will consume the Cat Facts JSON data.

<?php
$data = file_get_contents('https://cat-fact.herokuapp.com/facts/random?amount=2');
$cat_facts = json_decode($data, TRUE);

foreach ($cat_facts as $cat_fact) {
  print "<h3>".$cat_fact['text']."</h3>";
}

Result:

There we go. Now we can use this approach to get as many random Cat Facts as we want (up to 500 per request). But before we continue, let's break down this script a bit and see what it's doing.

  1. First we use file_get_contents() to request the data from the API.
    (The response data comes back to us in the form of a string.)
  2. Next we convert it into an Array using the json_decode() function.
  3. Now that the data is an array, we can loop through it and output all of our brand new cat facts. Outstanding!
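To see step 2 in isolation, here is the same decoding applied to a made-up response body in the shape the Cat Facts API returns (a JSON array of objects with a "text" property); the facts themselves are invented for this example:

```php
<?php

// A made-up response body mirroring the Cat Facts API's shape.
$data = '[{"text":"Cats sleep a lot."},{"text":"Cats purr."}]';

// Passing TRUE as the second argument asks json_decode() for nested
// associative arrays instead of stdClass objects, which is why the
// script above can read $cat_fact['text'] with array syntax.
$cat_facts = json_decode($data, TRUE);

print $cat_facts[0]['text']; // Cats sleep a lot.
```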

Note: I didn't make up the URL shown in this example, I read the documentation. You can visit that API request URL directly and see what the response data looks like. Go ahead, I'll wait here…

… Welcome back!

Check this out: Most APIs you interact with will have their own documentation that you should use when planning your project. If you find yourself struggling to figure out how to get data from an API, look for more documentation. Documentation is king when dealing with APIs. The better the documentation, the better the API, in my opinion.

Object Oriented API Requests with Guzzle

Now that we have a decent understanding of the main points of requesting data from an API, let's take a look at how we might do that in a more practical and modern way. Let's use the very popular PHP HTTP client named Guzzle to do basically the same thing we just did above.

The main differences in using Guzzle are mostly around the abstractions provided by Guzzle's Client library. Rather than trying to describe each difference out of context, let's look at an example:

<?php
require 'vendor/autoload.php';

$client = new \GuzzleHttp\Client([
  'base_uri' => 'https://cat-fact.herokuapp.com/',
]);

$response = $client->get('facts/random', [
  'query' => [
    'amount' => 2,
  ]
]);

$cat_facts = json_decode($response->getBody(), TRUE);

foreach ($cat_facts as $cat_fact) {
  print "<h3>".$cat_fact['text']."</h3>";
}

Let's paws for a moment and review what has changed and why:

  1. The first thing to note is that we've created a new instance of Guzzle's Client object and passed in some parameters as an array. Rather than provide the entire URI for the API's endpoint along with query parameters in one string like before ('https://cat-fact.herokuapp.com/facts/random?amount=2'), we will instantiate the client with just the base URI for the API in general. This way we can easily make multiple requests to multiple endpoints with the same Client object.
  2. Next, we use the Client's get() method to request a specific endpoint of 'facts/random'. Internally Guzzle will combine the endpoint with the base_uri we provided during object instantiation.
  3. Additionally, we provide an array of query parameters to the get() method. Internally Guzzle will convert this array into a query string that looks like this '?amount=2' and append it to the URL before submitting the request to the API.
  4. Unlike file_get_contents(), the Guzzle client returns a Response object. This object contains much more information about the reply from the API, as well as the contents of the response.
  5. Finally, we access the contents of the response by using the getBody() method on the Response object.

It may seem like a lot has changed from our first example, but I highly recommend becoming more comfortable with this approach. Not only is it significantly more powerful and flexible than the file_get_contents() approach, but Drupal 8 itself uses the Guzzle library.

Ultimately, both this and the previous example are doing the exact same things. They are both visiting the Cat Facts API and fetching data.

Now that we are comfortable with requesting data from an API (aka, visiting a URL), I think we're ready to do this in Drupal 8.

Guzzle in Drupal 8

The plan is simple; create a Drupal 8 module that fetches cat facts from the Cat Facts API and displays those facts in a Block.

There are a few ways to accomplish this, but we'll start with the most basic. Have a look at this custom Block:

<?php

namespace Drupal\cat_facts\Plugin\Block;

use Drupal\Component\Serialization\Json;
use Drupal\Core\Block\BlockBase;

/**
 * Block of Cat Facts... you can't make this stuff up.
 *
 * @Block(
 *   id = "cat_facts_block",
 *   admin_label = @Translation("Cat Facts")
 * )
 */
class CatFacts extends BlockBase {

  /**
   * {@inheritdoc}
   */
  public function build() {
    /** @var \GuzzleHttp\Client $client */
    $client = \Drupal::service('http_client_factory')->fromOptions([
      'base_uri' => 'https://cat-fact.herokuapp.com/',
    ]);

    $response = $client->get('facts/random', [
      'query' => [
        'amount' => 2,
      ]
    ]);

    $cat_facts = Json::decode($response->getBody());
    $items = [];

    foreach ($cat_facts as $cat_fact) {
      $items[] = $cat_fact['text'];
    }

    return [
      '#theme' => 'item_list',
      '#items' => $items,
    ];
  }

}

Block Output:

At first this looks like a lot more code, but most of the new stuff is the code necessary to create a Drupal 8 block. Rather than focus on that, let's look at the important difference between this and the previous example.

  1. Drupal core provides a service designed to create HTTP clients: 'http_client_factory'. This service has a method named fromOptions() which accepts an array, just like the Guzzle Client constructor did before. We even passed in the exact same parameter.
  2. Instead of calling json_decode() function, we use the Drupal provided Json::decode() method.

Note: We could have instantiated our own Guzzle Client object, but this approach has the following added benefits.

  • First, Drupal is going to merge our options array into a set of its own options that provide some HTTP request best practices.
  • Second, using this method (somewhat) future-proofs us against major changes in the Guzzle library. For example, if Drupal improves its http_client_factory service, or even decides to switch libraries altogether, we'd like to believe that the core developers will take on the burden of ensuring the http_client_factory service still works the way we expect.
  • The same reasoning applies to using the Json::decode() method Drupal provides as opposed to the json_decode() function.

Generally, whenever Drupal provides a way to do something, you should use the Drupal way.

Now this is all very good, we're making API requests the "Drupal Way" and we can place it almost anywhere on our site with this handy block.

But, I know what you're thinking. You're thinking, "Jonathan, Cat Facts shouldn't be contained to just a block. What if I want to use them somewhere else on the site? Or what if another contributed module wants to make use of these awesome Cat Facts in their own module?"

And you're totally right to be thinking that. Hence...

Cat Facts as a Service (CFaaS)

There is no better way to make Cat Facts available to other non-block parts of your site and other contributed modules than to provide a Cat Fact service in our own module. I can see it now, with such a powerful feature our Cat Facts module will soon be a dependency of almost every other popular Drupal 8 module.

First thing we need to do is define our new Cat Facts service in our module's services file.

cat_facts.services.yml

services:
  cat_facts_client:
    class: Drupal\cat_facts\CatFactsClient
    arguments:
      - '@http_client_factory'

Purrfect.

Next, we need to write this new CatFactsClient class. It will look relatively similar to the work we've done already, but this time, instead of calling the Drupal::service() method, we'll use dependency injection to provide our CatFactsClient class with the http_client_factory core service automatically.

src/CatFactsClient.php

<?php

namespace Drupal\cat_facts;

use Drupal\Component\Serialization\Json;

class CatFactsClient {

  /**
   * @var \GuzzleHttp\Client
   */
  protected $client;

  /**
   * CatFactsClient constructor.
   *
   * @param \Drupal\Core\Http\ClientFactory $http_client_factory
   */
  public function __construct($http_client_factory) {
    $this->client = $http_client_factory->fromOptions([
      'base_uri' => 'https://cat-fact.herokuapp.com/',
    ]);
  }

  /**
   * Get some random cat facts.
   *
   * @param int $amount
   *
   * @return array
   */
  public function random($amount = 1) {
    $response = $this->client->get('facts/random', [
      'query' => [
        'amount' => $amount
      ]
    ]);

    return Json::decode($response->getBody());
  }

}

What we've done here is create a simple class that focuses solely on the Cat Facts API. It abstracts most of the work you would normally have to do when you need to request data from the API by providing a method named random() which will perform the HTTP request and return a decoded array of data back to the caller.
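For example, once the module is enabled, any other piece of code that can't easily use dependency injection (a hook implementation, say) could fetch facts through the container. This is a sketch that assumes the cat_facts module from this post is installed:

```php
// Fetch three random facts through the service registered above.
// (In classes, prefer injecting cat_facts_client as a dependency.)
$cat_facts = \Drupal::service('cat_facts_client')->random(3);

foreach ($cat_facts as $cat_fact) {
  // random() returns the decoded array, so each item has a 'text' key.
  print $cat_fact['text'];
}
```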

Finally, all we need to do is update our Cat Facts block to allow our new service to be injected into it as a dependency and we should be done.

src/Plugin/Block/CatFacts.php


<?php

namespace Drupal\cat_facts\Plugin\Block;

use Drupal\Core\Block\BlockBase;
use Drupal\Core\Plugin\ContainerFactoryPluginInterface;
use Symfony\Component\DependencyInjection\ContainerInterface;

/**
 * Block of Cat Facts... you can't make this stuff up.
 *
 * @Block(
 *   id = "cat_facts_block",
 *   admin_label = @Translation("Cat Facts")
 * )
 */
class CatFacts extends BlockBase implements ContainerFactoryPluginInterface {

  /**
   * @var \Drupal\cat_facts\CatFactsClient
   */
  protected $catFactsClient;

  /**
   * CatFacts constructor.
   *
   * @param array $configuration
   * @param $plugin_id
   * @param $plugin_definition
   * @param \Drupal\cat_facts\CatFactsClient $cat_facts_client
   */
  public function __construct(array $configuration, $plugin_id, $plugin_definition, $cat_facts_client) {
    parent::__construct($configuration, $plugin_id, $plugin_definition);
    $this->catFactsClient = $cat_facts_client;
  }

  /**
   * {@inheritdoc}
   */
  public static function create(ContainerInterface $container, array $configuration, $plugin_id, $plugin_definition) {
    return new static(
      $configuration,
      $plugin_id,
      $plugin_definition,
      $container->get('cat_facts_client')
    );
  }

  /**
   * {@inheritdoc}
   */
  public function build() {
    $cat_facts = $this->catFactsClient->random(2);
    $items = [];

    foreach ($cat_facts as $cat_fact) {
      $items[] = $cat_fact['text'];
    }

    return [
      '#theme' => 'item_list',
      '#items' => $items,
    ];
  }

}

And there we have it! Now our block has the cat_facts_client service injected into it during creation, and rather than make its own Guzzle Client and API calls, it uses our shiny new service.

Ya feline it?

View module code on GitHub.

Dec 10 2018
Dec 10

Zivtech is happy to be offering a series of public Drupal 8 trainings at our office in downtown Philadelphia in January 2019. 

Whether you consider yourself a beginner or expert Drupal developer, our training workshops have everything you need to take your Drupal skills to the next level. 

Our experience

The Zivtech team has many years of combined expertise in training and community involvement. We have traveled all over the world conducting training sessions for a diverse range of clients, including the United States Department of Justice, the Government of Canada, CERN, Howard Hughes Medical Institute, Harvard University and more. 

We pride ourselves on educating others about open source, and attendees will leave our trainings with the knowledge to build custom Drupal sites, solve technical issues, make design changes, and perform security updates all on their own. We also offer private, onsite trainings that are tailored to your organization's specific needs. 

Our public Drupal trainings for January 2019 include:

Interested in learning more about our upcoming trainings? Click here. You can also reach out to us regarding multi-training and nonprofit discounts, or personalized trainings. 

We hope to see you in January!
 

Dec 10 2018
Dec 10
Although JSON Web Tokens (JWT) is a younger specification than its more well-established cousin, OAuth 2.0 Bearer Token authentication, JWT has been adopted by quite a few in the Drupal community due to its relative simplicity. In this installment, we explore JSON Web Tokens and how this authentication mechanism can benefit your decoupled Drupal architecture.
Dec 10 2018
Dec 10

Over the last few months, I've been involved in a UX study to shed some light on what would make a good content editor experience in Drupal. I helped run a survey asking content editors for their feedback about the Drupal admin UI and got some interesting results. Then, we started looking around for examples of good content editor UX, which led to a comparative usability test of other CMS's, generating some ideas about patterns to follow and avoid.

I have a job where I get to train lots of people how to use Drupal: developers, site builders, and sometimes content editors. So I've gathered a lot of anecdotes of people's first impressions of editing content with Drupal.

As you've probably heard, there is work being done on a brand new Admin UI for Drupal, but this might take a while to build. So what are some things about content editor UX that we might be able to improve on before that?

Autosave

When you ask content editors what we should change about the admin UI, this one always comes up. All content authors have anxiety about losing their content. And when we ran our user testing with other CMS’s, autosave clearly reassured and delighted editors. For example, Contentful has a nice autosave message that helps users know that their content is saved.

"Last saved" screenshot

There is an open issue for adding autosave to Drupal. It would be great to get momentum behind implementing this!

A Content Editor Role

One of the challenges of using Drupal is that there are so many options. For site builders and developers, a UI that provides a lot of options is great because you can see that the platform is flexible. For content editors, seeing a lot of options that don’t relate to content editing is intimidating. From our usability testing, we can see that content editors love using a UI that is more streamlined, less complicated, and clearly designed for them.

Creating a content editor role as part of the Standard install profile might be a nice way to encourage the practice of limiting what content editors have access to by providing a set of default permissions to start with.

What should the content editor role look like? Of course, content editors need to be able to create and edit content. And they will need access to the content overview page. Should they have access to the Administer Content permission? What about working with files, taxonomy terms, revisions, and menus? Based on our content editor survey, it seems like giving all these permissions would align the role with a content editor’s typical tasks. But I think a more limited set of permissions would work as well, as long as it's clear that the permissions are a starting point that can be expanded on by an administrator.

Show Me the Content!

On a standard Drupal install, when you log into Drupal you're redirected to your profile page. I think for most content editors, this doesn’t make sense. Your home base is probably the content overview page. Maybe we should redirect content editors to the content overview page when they log in.

Move the Save Button

Currently, the Save button, along with the preview buttons and the interface to publish and moderate content is at the bottom of the node edit page. From our usability testing, it seems like content editors expect the Save button to be in the top right-hand corner. For long-form content, this would make those links more readily available and I think this would be a usability improvement for new content editors.

Current Drupal add content page screenshot

The key to success with all the save, preview, and content moderation options seems to be to put them all together. In the comparative usability testing, this feedback rang loud and clear.

One thing I like about moving the Save button to the top right-hand corner is that it puts those buttons in proximity to the “Revisions” section of the edit page, which is closely related to the task of saving or changing the status of a node.

That being said, I think it could really confuse existing Drupal editors if we just start moving buttons around. And I assume that there could be accessibility implications that we should take into account. So I leave this one as an open question. What do you think?

Modernizing the UI

One of the big pieces of feedback we got from the content editors survey was that the Drupal UI looks dated. Well, the idea of modernizing the UI already has a ton of momentum behind it. I’m super excited about the work Christina Chumillas and others have done to create new, improved designs for the existing Admin UI of Drupal 8. This is not an overhaul of the UX, but a new, modern style guide applied to the existing one. It's a big improvement that I think will bolster first impressions of Drupal. And it's something that is planned to be implemented in the near future.

What’s Next?

If you’re a UX person who wants to contribute to Drupal, there are ways to get involved! See the #ux and #admin-ui channels on Drupal Slack. If you have ideas for other content editor UX improvements, or know of existing UX issues in the queue that need some attention, I'd love to hear about them here. Or continue the conversation in those channels.

Dec 10 2018
Dec 10

Global leading enterprises, brands, governments, and universities are experiencing the power of Drupal to engage with their audience through their websites and beyond. In a quest to lead the digital transformation experience, Drupal helps them manage and deliver a host of content across channels and devices. 

Worldwide, organizations are looking for a solution that will support their active community, growing business, and user experience. With technological innovation speeding up, choose the CMS for your enterprise carefully. 

Enterprises’ demands, stakeholders’ demands, future requirements - choosing the right technology is a tough call. However, more and more organizations are leveraging Drupal for enterprise website development. 

Fulfilling business requirements as well as meeting the technical demands, Drupal is used by seven times as many top sites as its next two competitors combined (BuiltWith.com).



What is the Market Demand Like?

In one of its reports, Statista predicts the size of the enterprise content management (ECM) market worldwide to reach $67 billion by 2022.

[Bar graph: projected worldwide ECM market size. Source: Statista]

The possible spike in the graph is likely due to digital transformation, which has become a key strategy for every organization. To make the most of data and information, the focus is moving from managing content to using content to support process productivity with simple, intuitive backend technology. 

In this scenario, the information control and governance features of Enterprise Content Management move to the background, and content presentation comes to the fore. 

Reliable, scalable, secure, flexible, presentation-friendly, and future-proof enough to sustain your content without hampering the performance of your website: can Drupal manage this unending list of demands? 

Drupal is Fostering Billion Dollar Businesses…

Drupal can manage; in fact, it is doing very well in that regard, fostering billion-dollar businesses under the aegis of its brand. The list includes names like Tesla Motors, The Economist, Puma, Pfizer, Timex, L’Oréal, Honda, Johnson & Johnson, and many others. 

Drupal 8 rightly taps into the concentrated innovation from its open source community. 

Drawing on that community, Drupal derives the most value by staying flexible while keeping its core robust, never compromising on security, and continually providing new capabilities for successful digital experiences.

Acknowledging that enterprise solutions often come with complex requirements, Drupal has it sorted for you.

5 Unconventional Reasons to Choose Drupal For Your Enterprise

With Drupal, you can build a powerful website that’s optimized for every device, personalized for every visitor, and integrates with all your marketing tools.

Here are some of the solid reasons that make Drupal an excellent candidate for an enterprise of any scale or vertical.

With new technological developments, Drupal meets these unconventional market demands just right. 

  1. Content Presentation with a Headless Architecture
    Customer experience is the new standard for measuring the success of your online business, so much so that even commerce is going headless. Supporting a headless architecture, Drupal can integrate frontend technologies like React and Angular to give a new dimension to the digital experience. Today, headless is destined to take user experience to a whole new level, where the experience itself can be merchandised.
     
  2. Balancing the Hardware-Software Relationship
    With changing demands, enterprises are finding new ways to distribute content. The dramatic increase in smartphone usage and portable devices has triggered the emergence of stand-alone software apps. This requires an easy integration matrix between hardware devices (say, an Apple Watch) and a data management system (software) to present content in the desired format. This is especially important for, say, a medical device CMS.
     
  3. Personalization with Machine Learning and Predictive UX
    Bringing together advanced analytic capabilities, data mining, real-time scoring, and machine learning, enterprises are trying to discover patterns in data and forecast events to offer the best digital user experience. In fact, 65% of customers are more likely to shop at a store or online business that sends relevant and personalized promotions.

    For enterprises, machine learning has the ability to scale across a broad range of businesses. 

    Acquia Lift Connector, a Drupal module, offers integration with the Acquia Lift service and an improved user experience for web personalisation, testing and targeting directly on the front end of your website.

    It leverages machine learning to automatically recommend content based on what a user is currently looking at or has looked at in the past. Its real-time adaptive targeting refines segments, while A/B testing helps keep users engrossed with the content that resonates.

  4. Chatbots to Drive the Business Value
    A conversational UI that communicates with users is now vital for an organization, so you can't afford to leave chatbots off your website. They understand what users want to know, pass the information to the backend, and provide quick, better responses. The Chatbot API module helps integrate chatbots with your Drupal website. 

  5. Exploring Markets with AR and VR
    Drupal leads by example when it comes to innovating with technologies outside its periphery. By 2022, the market size of Virtual Reality and Augmented Reality is forecast to increase enormously, as can be seen in the graph below.

    [Bar graph: forecast AR/VR market size. Source: Statista]


    Several examples of VR applications built on top of Drupal 8 have been presented in the community. One famous example is the VR Drupal site of Massachusetts State University (a fictional university).



    Augmented Reality has a lot to offer to different industries, with transportation, sports, education, and healthcare reaping the most benefit. More recently, Google has added AR to its Maps for a street-view mode.


The Conventional Reasons to Opt for Drupal 

  1. It is Easy To Build

    With Drupal 8.6’s “quick-start” command, launching a Drupal site takes a single command and only one dependency: PHP. Setting up a demo website no longer requires you to set up a web server, a database, containers, or anything else.

    A fully functional Drupal demo application can be downloaded and installed in less than two minutes. 
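For reference, the flow looks like this (run from the root of a Drupal 8.6+ code base; the demo_umami profile ships with core):

```shell
# One command installs and serves a throwaway demo site. SQLite and
# PHP's built-in web server are used, so nothing else needs configuring:
php core/scripts/drupal quick-start demo_umami
# The site comes up at http://127.0.0.1:8888 with a one-time login link.
```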

    With easy-to-set-up distributions, development time is cut in half. 

    Enabling companies to deploy core features and functionality rapidly, Drupal allows easy customization to fit their business requirements. Choosing a layout for your Drupal website is simple, as themes and appearances are just a click away. With features simplified to make non-developers comfortable, the editorial experience is fluent and easy.

  2. Drupal is Secure

    Used by hundreds of thousands of websites, Drupal hashes and salts passwords and continually hardens its core code to strengthen the life of your website.

    Supported by experts and a large, continuously growing community, Drupal has a dedicated security team to patch any probable security violation.

    Frequent Updates

    In case of a security update, the community ensures that you get notified the day patches are released. Security release windows fall every Wednesday for contributed projects, and on the third Wednesday of every month for core.

    Even though a release window does not necessarily mean that a release will actually be rolled out on that date, it exists so that site administrators know in advance which days to watch for a possible security release.

    Security Modules

    In addition to the proven security of core, numerous contributed modules can strengthen the security of your website. These modules extend security by adding password complexity, login, and session controls, increasing cryptographic strength, and improving Drupal's logging and auditing functions. Modules like Security Kit, CAPTCHA, and Paranoia add further hardening, while Security Review acts as a checklist for many of the easy-to-make mistakes that leave a site vulnerable.  

    Security Team and Working Group

    The security team works closely with the Drupal Security Working Group (SecWG), which comprises dozens of experts from around the world, to validate and respond to security issues. The aim is to ensure that core and contributed projects provide world-class security, and that sound security practices reach community developers.

    Drupal core is designed to prevent security breaches wherever possible. Vulnerabilities in core are coordinated with branch maintainers, and those in contributed projects with the individual project maintainers.

    Drupal has proven to be a secure solution for enterprise needs and is used by top-tier enterprises.

  3. Drupal is Scalable and Flexible

    Scalability and flexibility form another salient feature that makes Drupal popular among businesses. When it comes to web technology, enterprises need the ability to handle considerable traffic around the clock, especially on media and entertainment sites.

    Drupal has stood both the test of time and the test of traffic spikes.

    The framework's extensibility via modules and distributions is at the heart of much of its success. While core sustains the bulk of the content, new industries can address their specific needs through custom modules and distributions, which has earned Drupal highly satisfactory customer reviews. 

    Another matter that addresses the worries of enterprises is the cost of maintenance. Many government and non-government organizations have migrated to Drupal to avoid the licensing and maintenance costs of proprietary systems.  

  4. Excels at Faster and Responsive Development

    According to Google’s official statement, more than 50% of search queries globally now come from mobile devices. People want to be able to find answers as fast as possible, and various studies have shown that people really do care about loading speed.

    That is why Google announced that page speed would become a ranking factor for mobile searches from July 2018. It’s high time to treat the combination of performance and mobile responsiveness as a serious factor for improving visibility and revenue from the web, and to have an AMP-friendly website. 

    Drupal 8 is built for a mobile-first world. Everything in version 8 supports mobile responsive design. Its admin and default designs are responsive for both developers and content authors providing a responsive front-end theming framework.

    Increasing the loading speed of your web pages opens numerous doors for business. And when users can view your Drupal website the same way on desktop and mobile devices, there is no room for second thoughts.

    Mobile responsiveness helps you deliver the optimal mobile visitor experience. It supports the best responsive design practices and ensures that your users get a coherent experience anytime and every time.  

  5. Supports Multi-site Functionalities

    If your organization runs more than one site, maintenance and management can demand serious money and time. But with the multi-site feature, you can share a single Drupal installation (including core code, contributed modules, and themes) among several sites.

    Enterprises can thus handle complex requirements from a single Drupal installation, which means less time and fewer resources are required to build your network of websites.

    One can manage any number of sites across their organization or brand, crossing geographies and campaigns from a single platform that allows swift and uncomplicated site creation and deployment.

    This is particularly useful for managing the core code since each upgrade only needs to be done once. While each site will have its own database and configuration settings to manage their own content, the sites would be sharing one code base and web document root.

    The multisite feature is best used for sites with the same features and functionalities. But if your sites have different functionalities, it is better to test each site independently.
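The mechanics behind this sharing are simple. Here is a minimal sketch (hostnames and paths are illustrative) of the layout and of sites/sites.php, the file Drupal uses to map incoming hostnames onto site directories:

```shell
# One code base, one directory under sites/ per site. Each directory
# carries its own settings.php (and therefore its own database),
# while core, modules, and themes are shared.
mkdir -p drupal/sites/default drupal/sites/shop.example.com
touch drupal/sites/default/settings.php
touch drupal/sites/shop.example.com/settings.php

# sites/sites.php maps incoming hostnames onto those directories:
cat > drupal/sites/sites.php <<'PHP'
<?php
$sites['shop.example.com'] = 'shop.example.com';
PHP

ls drupal/sites   # default  shop.example.com  sites.php
```

Upgrading core then happens once for every site in the network, since they all share the same document root.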

Drupal For Every Enterprise

The needs of every industry are different, and Drupal has something for everyone.

Media and entertainment

Editing and Scalability

Media and entertainment websites worldwide use Drupal for their online platforms for seamless editing and scalability. The list, drawn from the million-plus organizations using Drupal, includes The Economist, ET Online, MTV (UK), the Grammys, the Emmys, Weather.com, The Beatles, and Warner Bros. Music.

Scalability is all about quantity: how many requests and how much information you can handle at any given time without breaking or bending. Supporting some of the world’s most visited sites, Drupal is another name for scalability.

Drupal provides the easy content editing and management that media and entertainment websites look for, through WYSIWYG editing with CKEditor, without weighing the site down.

SaaS

Community solutions:

SaaS enterprises use Drupal to build the platform for their product as well as a community to engage clients and followers. It is easy to develop the platform first and keep adding features in later phases.

Community platforms are a key need of SaaS organizations: they give prospects a home and help the product and community grow alike. Distributions like Open Social offer great help here.

Zoho is one of the SaaS products using Drupal for its community platforms.

E-commerce

E-commerce functionalities

Providing easy payment gateways for online transactions, Drupal ensures that customer information passes seamlessly and remains safe.

Its core Commerce Payment module and distributions (Drupal Commerce and Commerce Kickstart) support the payment API for a smooth payment collection procedure through the checkout form.

Supporting PayPal Express Checkout and PayPal Credit along with Amazon Pay, it lets you reach a wider audience by letting shoppers check out with the payment and shipping information stored in their Amazon accounts.

Lush is a Drupal patron. 

Tour and travel

For a potential traveler, your site shouldn’t look like just-another-information-brochure on the web. The need for an end-to-end solution to integrate all the minute details (from hotel booking to landing back) has never been greater.  

Booking Engine:

Drupal provides two of the best booking solutions for your website:

  • EasyBooking - Distribution
  • BAT - Module

A complete solution for your vacation portal, BAT lets you build an exclusive booking engine for better customer relationship management. EasyBooking gives your visitors a set of options to make room reservations, contact the hotel administration, or simply sign up for the hotel’s newsletter to stay aware of special offers and discounts.

FMCG

Theming

A design that resonates with your brand and that interests and engages your visitors is what you should invest your resources in developing.

It’s the psychological effect that drives a visitor to make a transaction or to explore the possibilities offered throughout the interface. Every landing page matters.

Regardless of the products you showcase, Drupal themes provide sound navigation throughout your categories and sections, with built-in hero banner sections and fully customizable pop-ups.

Additional modules can be used to build an industry-specific theme. To cope with varied demands, Drupal offers more than two thousand free, easy-to-use themes.

Whole Foods is built on Drupal.

Government and Non-Government

Cost and Security:

In 2012, when the Georgian government shifted to Drupal, the first reason to dump its previous CMS (Vignette) was its rising maintenance costs. 

Running a total of 65 state websites on two different versions of this proprietary system had proved costly in the long run.

Uncompromised security, another decisive factor for government websites, is why government organizations are opting for Drupal; around 150 governments are already powered by it. Just like the Georgian government, many government and non-government agencies have found cost a significant factor in their choice.  

Higher Education

Distributions:

Distributions provide an easy way to build a higher education website quickly, halving development time and delivering features out of the box. Opigno and OpenEDU are two distributions widely used by higher-ed websites.

Drupal is the most widely used CMS in the education sector; no wonder top international universities like Harvard, Brown, Yale, Pennsylvania, and Columbia rely on it.

Harvard and Oxford are built on Drupal. 

HealthCare and Life Sciences

Content and User access control:

Drupal can conform to almost any programmable workflow with just a few configuration changes. You can identify different types of content, such as text, images, comments, file attachments, and any other information on your website, for easy content integration and management.

Pfizer, a leading pharma organization, is built on Drupal. 

Drupal As an Enterprise Management System

The need for an intranet system cannot be emphasized enough. For your business to grow by leaps and bounds, it is necessary to establish clear communication within your organization.

As your business expands, so does the need for an intranet system that can help store and share data. An ECMS differs from a web content management system in that the former is specifically designed for enterprise websites and is more dynamic.

Drupal allows building an ECMS in two ways: using its own modules and features, or through third-party configuration. Its integration capabilities let the website serve as a central content management system integrated with other necessary tools.

Drupal Is Easier To Manage

Drupal isn’t hard to use, but it can be hard to learn how to use. Even though it requires more technical experience, it is capable of producing exceptionally advanced sites. A WYSIWYG editor and drag-and-drop functionality ease the process and help you start straight away.

The release of version 8 has made the platform easier to use even for non-developers (including content authors). Managing your website is easy, as the community provides the necessary documentation and answers in case you get stuck.

Summary

Being one of the leading technologies in the market, Drupal gives your enterprise the features and flexibility to innovate as per your visitor behavior and preferences.

We’d love to hear your thoughts. To get in touch, drop a mail at [email protected] and let us know how we can enhance your statistics with Drupal.

Dec 10 2018
Dec 10

It's a fact: “the next generation” of web apps aren't just extremely fast, they're highly scalable, as well. Which brings us to the next question: “How do you scale a web application in Drupal?”

What tools, best practices, and latest techniques do you use for leveraging Drupal 8's scalability capabilities?

To ensure that your custom web app will keep on scaling to:
 

  • handle sudden spikes in traffic
  • avoid downtime 
  • withstand “surprise” content overloads
     

Well, here they come:
 

1. But Is Drupal Scalable? How Scalable? 

Let's just say that:

Drupal's built with scalability in mind and that Drupal 8 is... extremely scalable.

It's powering some of the world's most trafficked and content-rich websites (Weather.com, the Grammys, Princess Cruises...). It's designed to cope with heavy infrastructures involving thousands of content contributors, Drupal users, and site/app visitors.

And when gauging Drupal 8's scalability, you need to go beyond Drupal's unmatched modularity: 30,000+ free modules.

Instead, just think of:
 

  • Drupal turned into a central API 
  • all the improvements brought to Drupal 8's scalability till this day
  • Drupal 8 enabling you, right out of the box, to integrate it with a wide range of third-party apps, software, and systems
  • RESTful API now in core!
     

… and how all that empowers you, the Drupal web app developer, to easily serve JSON or HTML code.

And Drupal 8's unparalleled scalability comes down to this:

Empowering developers to create content and send it to any third-party app via JSON.
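Concretely, with Drupal 8's RESTful Web Services or the (at the time of writing, contributed) JSON API module enabled, a request like GET /jsonapi/node/article hands any third-party app your content in a standard envelope. A trimmed sketch of such a response, with the UUID and field values purely illustrative:

```json
{
  "data": [
    {
      "type": "node--article",
      "id": "8f3f4c1a-1111-2222-3333-444455556666",
      "attributes": {
        "title": "Hello from Drupal",
        "created": "2018-12-01T10:00:00+00:00"
      }
    }
  ],
  "links": {
    "self": "https://example.com/jsonapi/node/article"
  }
}
```

Any frontend, mobile app, or IoT device that can parse JSON can consume this, which is what makes the "central API" model work.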

Of course, its out-of-the-box scalability can get further optimized via:
 

  • an established set of best practices
  • additional support from various tools and technologies
     

2. How to Scale a Web Application in Drupal: Server Scaling Techniques

Let's say that... “it's time”:

You've applied all the optimization techniques on your web application so that it should seamlessly “accommodate” the increasing influxes of traffic and content load. And still, its server hardware has started to show its limitations.

So, it's time to scale your server hardware. And you have 2 options at hand:
 

2.1. You scale up your server vertically 

This is the handiest method, so to speak. It's the “emergency” technique to go for when:
 

  1. you don't have time to install a caching module
  2. there's no one in your team with the needed expertise for adding more servers
     

So, what do you do? You increase your existing server size. 

You boost its performance by adding more resources.

This way, it could keep up with all those new traffic challenges calling for more memory, more CPU cores...

Word of caution: there's no such thing as “the sky is the limit” here; when you scale up a web app in Drupal using this method, you'll still reach the hardware's limit at some point.
 

2.2. You scale up your server horizontally

The second best practice for scaling up your server is a bit more complex.

And it involves 2 approaches, actually:
 

a. You separate your database from your Drupal web app

Basically, your database gets its own server, so you split the load in two. Then you can vertically scale each of the two servers.
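In Drupal, the split itself is just configuration. A sketch of the relevant settings.php entry, where the host address and credentials are placeholders:

```php
<?php
// sites/default/settings.php (fragment) -- values are placeholders.
// Moving the database to its own server is just a matter of pointing
// Drupal at the new host instead of localhost:
$databases['default']['default'] = [
  'driver' => 'mysql',
  'database' => 'drupal',
  'username' => 'drupal',
  'password' => 'change-me',
  'host' => '10.0.0.12', // dedicated database server
  'port' => '3306',
  'prefix' => '',
];
```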
 

b. You add multiple servers and distribute the load between them.

This is the most complex way to scale a web app in Drupal. 

Just think about it:

How will the servers included in this whole “ecosystem” “know” which users to take over?

It goes without saying that you'll need a load balancer for properly “splitting up” the traffic load. And a database server, as well.
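Conceptually, the load balancer sits in front of the app servers and decides which backend serves each request. A minimal nginx sketch, where the addresses are placeholders and ip_hash is just one way to keep a user on the same backend:

```nginx
# Two Drupal app servers behind one nginx load balancer.
upstream drupal_app {
    ip_hash;              # keep a client on the same backend
    server 10.0.0.21;
    server 10.0.0.22;
}

server {
    listen 80;
    server_name www.example.com;

    location / {
        proxy_pass http://drupal_app;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```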

See? It already gets more complex than the other server scaling techniques mentioned above.

Nevertheless, this is the method that, when done properly, will dramatically reduce the load each server must handle.
 

3. “Juggling with” Multiple App Servers for Drupal

Let's say that you've opted for the last method of scaling up your server, so:

Now you find yourself facing the challenge of handling multiple app servers.

How will you deploy code to each of them simultaneously? That is the biggest question when you scale a web app in Drupal.

The best practice is to keep all your servers on the same local network. 

Having a single data center speeds up data transfer compared to having it travel across the internet.
 

The END! This is how you can leverage Drupal 8's scalability capabilities and easily “adjust” your web app to withstand unexpected surges of traffic.

Have you tried other techniques and best practices? 

Dec 09 2018
Dec 09

If you've adopted a Composer-based Drupal 8 workflow (hopefully using the Drupal Composer/Drupal Project template) where you're keeping dependencies in your project's repository, then you've no doubt experienced the annoyance of a rogue .git directory ending up in one of your project's dependencies. This will always happen when you're using the -dev version of a Drupal module. 

For example, as of the authoring of this Quicktip, the Field Redirection module does not yet have a stable release for Drupal 8. When added to a project using Composer, the results look like this:

Michaels-MacBook-Pro:dcoweek5 michael$ ddev composer require drupal/field_redirection
Executing [composer require drupal/field_redirection] at the project root (/var/www/html in the container, /Users/michael/sites/dcoweek5 on the host)
Using version 2.x-dev for drupal/field_redirection
./composer.json has been updated > DrupalProject\composer\ScriptHandler::checkComposerVersion
Loading composer repositories with package information
Updating dependencies (including require-dev)
Package operations: 1 install, 0 updates, 0 removals
  - Installing drupal/field_redirection (dev-2.x e1c30f2): Cloning e1c30f24f9 from cache
Writing lock file


Notice on the "Installing drupal/field_redirection..." line, it indicates that the project is cloned, not downloaded. This means that a .git directory has been created in the Field Redirection directory.

Note that I'm calling Composer as "ddev composer ..." - this is because I use DDEV as my local development environment and am utilizing its built-in Composer command.

If this goes unnoticed, and you attempt to do a normal "git add/commit" workflow for the new module, you'll end up with a somewhat-friendly Git message indicating that you now have a Git submodule.

Unfortunately, Git submodules aren't normally necessary nor wanted when you are committing dependencies to the project repository. So, the typical solution is to delete the .git directory of the dependency prior to performing the "git add/commit".
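That manual deletion can also be scripted. Here is a small sketch, assuming the drupal-composer/drupal-project layout with contributed modules under web/modules/contrib:

```shell
# Simulate a contrib module that Composer cloned rather than downloaded:
mkdir -p web/modules/contrib/field_redirection/.git

# Remove every nested .git directory before committing dependencies:
find web/modules/contrib -type d -name .git -prune -exec rm -rf {} +

find web/modules/contrib -type d -name .git   # prints nothing now
```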

Luckily, there's an easier way! Travis Neilans recently pointed me in the direction of the Composer Cleanup VCS Directories project. By adding this as a dependency of your project, any .git directories that result from adding project dependencies will be automatically removed! First, install the Composer Cleanup VCS Directories project using:

composer require topfloor/composer-cleanup-vcs-dirs

Then, anytime you use "composer require" to install a project dependency, if there's a .git directory, you'll see a message indicating that it has been automatically removed.

Deleting .git directory from /var/www/html/web/modules/contrib/field_redirection/.git

Dec 09 2018
Dec 09

Customer experience was elevated to a new high when storefronts and shopping carts first burst onto the scene. Then, e-commerce created a different dimension in the world of shopping. In this age of shoppable everything, brands and retailers need to seize the moment of truth as consumers look for shiny objects of inspiration. Today, headless commerce is destined to take shopping experiences to a whole new level where the experience itself can be merchandised.



Headless commerce is here to get the consumer’s attention and close the deal. Drupal Commerce allows you to implement headless commerce and helps you monetise that moment of heightened emotion, since inspiration that is potent in the moment often diminishes into a fleeting whim.

Headless Commerce: Explained

When the frontend of your shopping experience is decoupled from the backend, that is, the presentation is segregated from the commerce stack, it is referred to as headless commerce architecture.

In headless commerce, the frontend of your shopping experience is decoupled from the backend.

Such an architecture supports an e-commerce platform without a ‘head’, that is, without a front-end presentation layer, permitting additional flexibility, customisation, enhanced shopping experiences, and creative freedom.

In headless commerce, you do not have to worry about interfering with the back-end commerce setup. Marketers and merchants can experiment and make alterations to the front end with agility.

[Flowchart: headless commerce workflow. Source: Snipcart]

Headless commerce vs Traditional commerce

Following are the two key differences:



Presentation layer

In traditional commerce, design constraints abound, as alterations require you to invest an awful lot of time editing the database, the code, and the frontend platform.

In contrast, frontend developers in headless commerce do not have to modify databases in the backend, and can create a unique user experience that meets their business requirements with a simple API call. With no prescribed presentation layer, everything from product pages to landing pages is built from scratch.

More room for customisation

In traditional commerce, developers have to edit several layers of code for even a single customisation, from the front end down to the database layer dwelling in the back end.

Headless commerce offers more room for customisation and personalisation.

Headless commerce offers more room for customisation and personalisation. It gives you complete authority over the look and feel of your digital commerce platform.

Benefits of headless commerce

Following are some of the major benefits from different perspectives:

Magnificent shopping experience

It gives a magnificent shopping experience to developers, merchants, and consumers. For instance, adding a new field to a customer account, implementing a custom checkout flow, or adding ratings and reviews to the shopping experience can be done without redeploying the digital commerce platform.

With less duplication of functionality, it offers a better experience to the merchants while administering the shop. At the same time, end-users or consumers get content-rich experiences.

Leveraging portable backends

There is no dependency on large, restrictive infrastructures, which lets you stay competitive on the front end without getting bogged down by coupled solutions.

Choosing best front-end tools

Whether it is a web application, mobile application or Internet of Things, headless commerce lets you select the best front-end tools for every platform or device.

Utilising JAMstack

Headless commerce allows you to build with the JAMstack, which is faster, safer, and more cost-effective, providing a creative and strategic development experience.

Headless commerce with Drupal

The Drupal community, with its commitment to making Drupal even better, is continuously working on decoupling Drupal Commerce, and the results are incredible.

DrupalCamp Asheville 2018

A session held at DrupalCamp Asheville 2018 delineated the development of the Commerce Demo project. It covered the out-of-the-box capabilities of the demo store and an ecosystem update constituting the roadmap for decoupled Drupal Commerce. The demo store is available online.



The Commerce Demo project was built to create a product catalogue that can be easily removed without hampering the site's configuration. The design involved modern shopping cart interaction paradigms and required simple Views and Form API based solutions.

As a result, the design led to a standalone Commerce Cart API module, which anyone can use to develop a custom shopping cart widget, and to Commerce Cart Flyout, a reference implementation demonstrating what decoupling Drupal Commerce can look like.

Drupal Europe 2018

Another session, held at Drupal Europe 2018, showed a Cart API for progressively decoupled cart experiences. It discussed how decoupling Drupal Commerce enhances scalability and flexibility for organisations, and demonstrated the work done to develop a standard Cart API supporting progressively decoupled shopping carts in Drupal Commerce for Drupal 8. The work was not without hurdles: it involved reviews of the core RESTful Web Services and contributions to the JSON API project.

[embedded content]


The new Commerce Cart API project and its reference implementation, Commerce Cart Flyout, demonstrated the potential for creating unique user experiences. With this, Drupal Commerce can offer shopping experiences on par with leading e-commerce platforms, which in turn makes it simpler for Drupal agencies to sell Drupal Commerce to their customers.

Conclusion

Enterprises face increasing pressure to offer innovative customer experiences. An API-based headless architecture provides increased flexibility with its decoupled user interface (UI) approach. Technical professionals responsible for digital commerce can use Drupal Commerce to deliver fantastic shopping experiences.

We have been committed to providing a great digital experience with our expertise in Drupal Development.

Contact us at [email protected] to implement headless commerce for your enterprise.

Dec 09 2018
Dec 09

As Leicester City fans continued to rejoice at their team’s unlikely triumph in the 2016 edition of the English Premier League, winning the title for the very first time, some people were contemplating how the underdogs managed to beat such high odds. The way a team envisages itself performing is just as important as the physical strength of the players. And in this age of digitisation, imagining themselves performing well in the digital space is immensely significant. This is where Drupal comes in.

Three Leicester City players wearing blue coloured outfit running on the football ground


Be it a football club, a tennis open or a cricket world cup, any popular sports team or tournament wants to be an instant hit among sports lovers around the globe and build on that to create a unique brand identity. As one of the leading content management systems, Drupal can help a sports brand establish itself as an important entity in the digital arena with a unique and powerful website.

Benefits of Drupal

Drupal powers the websites of some of the biggest sports brands, such as the National Basketball Association, PGA Tour, Major League Soccer, Sevilla Fútbol Club, the Kentucky Derby and the New England Patriots, among others.

Drupal powers websites of biggest of the sports brands

Why do such great names choose Drupal? You have the reasons stated below:

Phenomenal Features

Drupal offers a large number of open source modules, themes and distributions to help build a sports-centric website with robust features.

For instance, the Sports League module helps manage content typically used for a sports club. It can handle multi-competition editions and their standings, as well as rosters, automatic statistics on players and teams, and match moments.

If you need a lightweight Drupal theme for a sports-related website, the Drupal 8 Premier League Theme is a great fit. It has features like slider functionality, a colour switcher and social media integration, among others.

For a complete starter pack, there is a Drupal distribution called Sportsleague, which helps build a sports-related website with robust features for managing content, user accounts, image uploading, and search.

Security

Among the leading open source CMSs, Drupal has made great inroads as one of the most secure frameworks, reporting the fewest vulnerabilities among the leading players in the industry. The Drupal Security Team actively validates and responds to security issues, making it one of the most security-focussed CMSs.

Scalability

Drupal can scale with your needs and help you handle the busiest of days by effectively coping with enormous amounts of traffic.

Drupal can scale with your needs

Multilingual

Drupal 8 has in-built support for language handling in the form of core modules and lets you deliver localised digital experiences.

Mobile-responsive

In this age of mobile devices, Drupal enables the development of responsive sites and web applications that would let you interact with your consumers on-the-go.

Speed

Drupal’s flexible platform ensures the continuous delivery of web development projects and supports running an agile team. Moreover, Drupal is great for implementing performance optimisation techniques and building a high-performing website.

Third-party integration

To get the best out of the tools that are outside the periphery of Drupal, you can integrate a variety of marketing technologies and business applications.

Content Workflow

Drupal has terrific tools that make it a loveable CMS for content authors creating and publishing content on the site. Its preview feature shows how your content will look across various devices. Also, authentication and permissions bring improvements to the editorial workflow.

A box containing text with a heading and a paragraph describing Drupal's capability in building a great sports website


 

Homepage of NBA with an image of three players wearing blue-yellow, blue-orange, and red-white coloured outfits


Content architecture

You can create the right content architecture and exhibit appropriate content for each context with the help of stupendous display mode tools, Views and a wide range of media types.

Content-as-a-service

Drupal’s content-as-a-service approach allows front-end developers to build engaging customer experiences from Drupal’s presentation-neutral content and RESTful API, using tools like Angular, Ember, Backbone and so on.
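As a rough illustration of consuming that presentation-neutral content, here is a TypeScript sketch of building a request URL in the style of Drupal 8's JSON API module. The base URL, the "article" content type and the query parameters are assumptions for illustration, not a specific site's API:

```typescript
// Build a JSON:API-style collection URL for a Drupal content type.
function jsonApiUrl(
  base: string,
  type: string,
  params: Record<string, string> = {},
): string {
  const query = new URLSearchParams(params).toString();
  return `${base}/jsonapi/node/${type}${query ? `?${query}` : ""}`;
}

// Ask for the five most recent articles, newest first.
const url = jsonApiUrl("https://example.com", "article", {
  "sort": "-created",
  "page[limit]": "5",
});
console.log(url);
```

A front-end framework would then `fetch()` this URL and render the returned JSON documents however it likes, which is exactly the decoupling this section describes.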

Multisite

Drupal helps you govern a multisite setup across your enterprise’s brands, geographies and promotional campaigns on a centralised platform.

Drupal helps you build a solution that adheres to your business requirements

Business-driven

Drupal helps you build a solution that adheres to your business requirements. It does things as your business demands.

Perfect tech stack

Drupal runs on a modern LAMP technology stack: Linux, Apache, MySQL, and PHP. This allows Drupal to fulfil the needs of fast-moving, flexible and agile organisations in pursuit of ambitious digital experiences and helps them build next-generation digital platforms.

Large community

One of the most beautiful things about Drupal is its huge community presence as thousands of organisations create solutions with Drupal and in the process build Drupal itself.

Case studies

Walking through the process of building an actual sports website with Drupal sheds light on how it powers a sports brand. To do so, let us look at the development work for two massive names in sports: Major League Soccer and Sevilla FC.

Development of Major League Soccer

Homepage of Major League Soccer with 6 different images showing players playing football and jam-packed football stadium


Major League Soccer (MLS) is a professional soccer league that primarily represents top-notch talent from both the United States and Canada. It engaged a digital agency to migrate each team’s site to the Drupal platform.

Drupal proved a remarkable solution for improving the fan experience, which was the top requirement. While the maintainers of each site could easily configure their team’s instance, a custom rich-multimedia platform allowed viewers to enjoy games in real time from any device. Moreover, switching to a multisite setup cut the deployed footprint.

The Drupal platform integrated with the majority of MLS properties, comprising the league’s videos and stats. Furthermore, integrating the DoubleClick for Publishers Drupal module optimised ad placement and attracted visitors with vibrant advertising media. Drupal also allowed faster editorial work and strong customisation.

Thus, Drupal helped build a stable infrastructure for MLS and reduced the severity of alerts.

Development of Sevilla FC

Homepage of Sevilla Football Club with two images of a man speaking before the press and some people posing for a photo in front of a statue


Sevilla FC, established in 1890, is one of the oldest clubs in the history of Spanish football. A digital agency used Drupal 8 to help it improve its online brand spectacularly.

Drupal 8’s multisite architecture helped build the main website and several satellite sites. The club was also able to edit and publish most of the app content from a single entry point.

Most importantly, multilingual capability was enabled easily with Drupal 8. Drupal’s open source security was also an important reason for choosing it for the website’s redesign. Drupal has also delivered better website performance, an enriched user experience, lower maintenance costs, and improved hardware utilisation.

Hence, Drupal helped increase monthly user acquisition and the global audience, in addition to providing stability and high availability for the website.

Conclusion

The mantra of Go Play And Get Fit is something we hear from our peers all our life. We also hear that mental strength and physical strength go hand in hand. That is absolutely true, and it applies to a sports-based website too, albeit in a different manner: the site must be ‘fit’ enough to provide great digital experiences and grow as a brand. Drupal powers innovation and can be a marvellous CMS for building a sports-related website.

We can assist you in growing as a sports brand with our suite of services.

Contact us at [email protected] to power your sports brand with Drupal.

Dec 09 2018
Dec 09

Thanks to a tutorial from @swentel, my attention turned back to webmentions again and I gave the first release candidate another try. What’s still confusing to me is that I have to enter a link class and a publishing target manually. The former might be a task for a link widget, and the latter may be down to some lack of understanding of the API on my part so far.

The Drupal integration looks pretty solid and feature-rich. Webmentions - the new thing on the web - make me hope that the dependencies on "social" networks like Twitter, Facebook, Reddit and others could be massively reduced, if not eliminated. It sounds like another major step towards de-centralising the web. Thanks a lot to all the contributors to Webmentions and the Drupal integration.

Dec 07 2018
Dec 07
 

In case you missed the news from September, Dries Buytaert announced the end-of-life dates for both Drupal 7 & 8. The date for both is slated for November of 2021, and that may seem strange, but it makes sense given the differences between 7 & 8 and the widespread usage of 7. Drupal 8 reaches end of life alongside Symfony 3, which powers a lot of Drupal 8’s underlying framework, so that makes perfect sense.

But why is Drupal 7 sticking around? Drupal 7 is the point in Drupal history where many large organizations bought into the CMS. It found a large user base with complicated government, education, and non-profit sites. Drupal was a web revelation for many of these large organizations, and they invested in the concept with time, money, and staff. In turn, the Drupal community benefited from having these organizations invest in the Drupal ecosystem. More developers learned Drupal and more agencies took on Drupal in order to service these organizations. This enriched the Drupal community with tons of contributed modules and core contributions.

An unfortunate side of large organizations is that they move at a glacial pace. Making a move from Drupal 7 to Drupal 8 is not something they can just plan and complete in a matter of months. Sometimes funding needs to be procured in a specific way, and other times non-web savvy board members need to be educated on why their perfectly good website needs to go through a costly overhaul. Not to mention the gargantuan planning task that comes with a migration of a complicated site. These things take time, often many months of time. This is why I believe the Drupal 7 end of life has been extended out so far, it's Drupal giving some slack to these large entities that helped grow it into the community and platform it is today.

Still Not Sure You're Ready to Upgrade?  Learn more about the benefits and our upgrade process. 

Start planning now

Starting with Drupal 8, full-rebuild major version upgrades are supposed to be a thing of the past. More on that here. Once the move to 8 is complete, major version updates are supposed to become smoother, as long as regular minor version updates are kept up on. The important takeaway is that there is now a target date by which everybody needs to have moved over; it’s three-ish years away, and that is really not a ton of time if you are one of these organizations moving at turtle speed. Now is the time to start securing funds, interviewing vendors, and making a plan to get over to Drupal 8.

It’s true that Drupal 8 is also going to hit its end of life on the same day as Drupal 7, but don’t let that stop you from making the move to 8. As I mentioned already, major version updates after 8 are going to be streamlined and not require complete re-architecting so a move to 8 now will be an easy pipeline to Drupal 9. If you wait for Drupal 9 to upgrade your Drupal 7 site, you may find yourself racing against the clock, and we all know that can be costly for a large web launch. There is no way around it, large sites take a lot of time to plan and migrate, and it’s not unheard of for a rebuild to be estimated out to 13 or 14 months of development. During this time you are also going to want new features and upgrades. Make sure you give your organization the needed time to plan and build the next version of your site.

Do things better

A move from Drupal 7 to 8 shouldn’t just be thought of as a migration of the same old site over to a new CMS. This is the time to update and make your site better, stronger, and faster. Take all of the things you learned from the last version of your site and make it into a better system for both end users and administrators. Starting the move now will give you the time to analyze data and usage; you can send surveys and interview users to make intentional updates that users will be excited about.

Making a plan early will allow you the time to figure out a new infrastructure that is faster and more secure. Maybe integrating a new CDN is the right move for faster page loads; perhaps using something like Solr will get your customers to the information or products they are seeking faster. This might be the time to explore moving your front end to React for greater performance. Take the time to research the latest and greatest security options for your site. Starting the plan now gives you the time to review the options. This will not only make the next version of your site much better, it will also save on costs in both time and money.

The Drupal updates keep coming

One key reason to get over to Drupal 8 as soon as possible is to take advantage of all of the latest development of Drupal core. Drupal 7 is only getting patch fixes at this point, and all new feature development is happening on Drupal 8. The same thing goes for contributed modules, the majority of new module development is for Drupal 8 only and many popular modules are only doing new development for 8. The past six minor updates for Drupal core have been delivered on time and have pushed the platform to new heights with each release. The next minor release for Drupal 8 (8.7) comes out on May 1st, 2019 and the next (8.8) will be released on December 4th, 2019 so now is a good time to get on board to take advantage of these upcoming updates. Check out the development roadmap for more details.

The end (future) is near

Now that there is a planned end date for Drupal 7, there is no reason not to start preparing the move to Drupal 8 now. If you are waiting for Drupal 9, you are just shortchanging your site the proper time to plan a thoughtful rebuild and migration. All sites still running on Drupal 7 are missing out on the current development efforts of Drupal. You have a date, you know when your site is going to fall out of support; now is the time to give it a new extension on life.

Offer for a free consultation with an Ashday expert

Dec 07 2018
Dec 07

At Drupal Europe in September, we were very pleased that project founder Dries Buytaert highlighted a visual prototype of our upcoming integration with GitLab in his keynote.

This video outlines the migration phases that we discussed in the announcement of our partnership with GitLab. Our migration window for Phase 1 is targeted for the first weeks of January, and we hope Phase 2 will be completed shortly after, at the beginning of 2019.

So what has it taken to get this integration working between September and now?

We're now in the midst of serious migration testing. Testing and re-testing the process in our staging environment, putting load testing in place to stress test our integration, and doing user-validation testing to ensure that the workflows affected by this integration are working as expected.

All in all, we're thrilled with the progress, and very thankful for GitLab's close collaboration. We're excited to be moving the Drupal project to its next generation tooling soon. Once Phase 1 of our migration is complete, it'll be time for Phase 2 and our community will start seeing some tremendous improvements in efficiency and collaboration.

Cheers!

Tim Lehnen

Drupal Association - Executive Director

Dec 07 2018
Dec 07

We’ve collected all of our blog posts from November 2018 to make them even more easily accessible to you! Check them out below.

Our first post from November was a detailed, step-by-step description of creating custom blocks in Drupal 8. It explains what ‘blocks’ are and provides the reader with thorough instructions for both creating blocks through Drupal’s own GUI and for doing it programmatically. Since blocks are a key component of a Drupal site, this post is a useful read for anyone who is just now getting familiar with Drupal as well as for those more experienced developers.

The next blog post was an interview with Agaric’s David Valdez. In this interview, he talks about his mixed early experiences with Drupal and presents Drutopia, Agaric’s project for nonprofits and other low-budget groups. To him, Drupal presented the perfect opportunity to learn a ton of new things, which made him eager to give back to the community. Discover what motivated him to become an active member of the Drupal community and which contributions he is proudest of.

The interview with Adam Bergstein aka Nerdstein provides a very personal aspect, which makes it extremely easy for the reader to relate to him. You can truly feel his passion and get genuinely excited reading about his achievements, and you can’t help but smile at his honest display of love when he talks about his family. In his view, Drupal core should put a greater emphasis on stability than on adding new features. He believes the future of Drupal depends as much on the community as it does on the technology and invites anyone interested in participating in his projects to reach out to him. 

Last but not least came our interview with John Piccozi, co-host of the weekly podcast Talking Drupal. As the Senior Drupal Architect at Oomph, he’s known in the company as the  resident Drupal enthusiast. He kicks things off with an impressive roster of Drupal projects that he’s currently involved in or that he’s worked on in the past, and spices things up with a personal anecdote or two. He concludes with a powerful quote by American scientist Margaret Mead that feels like it was written specifically about the Drupal community. Curious about what it is? Read the interview and you’ll see for yourself!

Well, that’s it for our November blog posts. Keep checking back to never miss an update or a new post!
 

Dec 07 2018
Dec 07

In this blog, we will talk about the best practices which will help you enhance your user experience from good to better and how Drupal 8 can be a game changer in this quest. 

Often we have heard that Drupal is not for beginners. While setting up a Drupal website has become a lot easier with Drupal 8.6, a good website is not just a set of web pages under a single domain name.

Google processes over 3.5 billion searches per day. A stale site that is unusable or loads very slowly risks leaving your potential customers frustrated and reflects poorly on your business. While maintenance should never be left out of the plan, investing in some good practices can extend the longevity of your website and business alike.

a man pointing finger on the screen in english clothes


Here are some of the best Drupal web development practices to ensure your website provides a great user experience while helping your content rank on Google page 1.

8 Best Drupal Practices

Creativity isn’t the only consideration; there are many other factors that can influence your reputation in the online market.

Drupal’s Architecture

A healthy architecture will not only ensure the demands of different stakeholders are met, but also that the site remains robust for future initiatives.

Make your architecture robust. 

On a basic level, your content structure must include all the fields and content types. A clean content architecture helps ensure not only good performance but also a great user experience and easier maintenance.

Although a well-maintained website is critical, you can’t do much if the base is full of errors. 

Less is always more. 

In your development plan, choose a limited set of content types and fields so as not to confuse your content creators. Drupal is a powerful tool for displaying content in different formats, languages, and screens.

Configuration plays an important role in architecture. 

In simple words, configuration is the collection of settings for how the default site functions for the admin, as opposed to the popular notion that it concerns the placement of content on the website. It includes the site name, content types and fields, taxonomy vocabularies, views and so on.

You can also check out this video by Pantheon on Drupal 8 architectural practices.

[embedded content]

Site configuration data in Drupal is kept consistent: everything from the list of enabled modules to content types, taxonomy vocabularies, fields, and views can be managed easily.

Your approach must be flexible. 

At a basic level, use a new entity type for each distinct type of data. For similar data types, use bundles of a single entity type. Bear in mind, however, that many contributed modules are designed to work with nodes rather than other entity types.

Using nodes is easy, as it allows you to create a new content type through the admin interface without much coding, whereas creating custom entities requires coding. As a developer, your approach should be flexible.

Bonus tip: Don’t make configuration changes on a live site without testing them locally.

Check the Codes

We all know that good coding helps improve quality and get better results. The basic rule is to start with simple code. Remember, it is always difficult to modify complicated code later, so keep your code simple for the long run.

Drupal coding standards are version-independent and "always-current". All new code follows the current standards, regardless of (core) version. If you want to update existing code to the current standards, always create separate, dedicated issues and patches instead of squeezing the changes into unrelated patches.

Remember to use US English standards for spellings in your code, which means it will be "color" not "colour".

Here are the top 6 coding practices by Drupal.org:

  1. Use an indent of 2 spaces, with no tabs, and lines should have no trailing whitespace.
     
  2. All binary operators (operators that come between two values), such as +, -, =, !=, ==, >, etc. should have space before and after the operator, for readability.
     
  3. Control statements should have one space between the control keyword and opening parenthesis, to distinguish them from function calls. Control structures include - if, for, while, switch, etc.
     
  4. All lines of code should not be longer than 80 characters. Lines containing longer function names, function/class definitions, variable declarations, etc are allowed to exceed 80 characters. Conditions should not be wrapped into multiple lines.
     
  5. Arrays should be formatted using short array syntax with a space separating each element (after the comma).
     
  6. When unconditionally including a class file, use require_once(). When conditionally including a class file, use include_once(). In either case, it will ensure that class files are included only once.

You can use Coder for coding standards validation without loading Drupal. 

Infrastructure

Infrastructure covers the stack your website lives on, including the server, the database, and any software layers, such as Varnish or Memcached, which ensure your visitors have a snappy experience. Planning the infrastructure from the start and developing in the same environment can greatly reduce variables and risk at launch time.

Having reliable multiple-environment configurations and a solid disaster recovery plan shouldn’t be left to last-minute decisions. When they are, mistakes start arising. Here are a few tips to avoid the most common errors.

Best Practice:

  • Size your stack correctly, not too large, not too small. This can ensure you’re economically prepared for anything.
     
  • Bottlenecks can arise from the hardware or from processes hogging memory.
     
  • Check logs for errors and prepare for growth and spikes. Your stack is only as fast as the slowest component. Focus your efforts there; you’ll probably find low hanging fruit.
     
  • In terms of security, it’s crucial to configure protection from internal attacks as well as external attacks.

Optimize the Frontend

The frontend is more than just theming. 

While the features and their functioning depend on the backend, the usability and aesthetics depend on how well the frontend is taken care of.

The website’s performance is shouldered equally by both.

“..powerful, adaptable, accessible, clear, concise, natural.”

Quickly brushing up on the basics, here are the best Drupal frontend practices:

  1. Define component elements (sub-objects) using their own classes. This is to avoid relying on markup structure and overly-generic class names, prefixing them with the component’s name followed by two underscores. 
     
  2. Thoroughly exercise and test your site and resolve any PHP errors that are displayed during theming development.
     
  3. Use a stable administrative theme during development.
     
  4. Use DRY CSS and group reusable CSS properties together. Name these groups logically. 
     
  5. Name components using design semantics. HTML elements already impart semantics on the content and machines cannot derive content-level semantics from class names.
     
  6. In order to reduce the load on the frontend performance of your website:
     
    • Minify JavaScript, CSS, and HTML
    • Aggregate JavaScript and CSS
    • Enable gzip compression
    • Use lazy loading for site assets
    • Keep Inline background images under ~4KB in size
    • Remove unused CSS
    • Use efficient CSS selectors
    • Download 3rd party scripts asynchronously
       
  7. Use SASS to keep your responsive design more organized
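Lazy loading from the checklist above deserves a quick illustration. The sketch below shows the core mechanic in TypeScript: the real image URL sits in a data attribute and is only promoted to `src` when needed. The simplified element shape is an assumption so the logic stays framework-agnostic; in a browser you would call this from an IntersectionObserver callback.

```typescript
// Simplified stand-in for an <img> element carrying a data-src attribute.
interface LazyImage {
  dataset: { src?: string };
  src?: string;
}

// Promote data-src to src; returns false when there is nothing to load
// or the image has already been loaded.
function loadLazyImage(img: LazyImage): boolean {
  if (!img.dataset.src || img.src === img.dataset.src) {
    return false;
  }
  img.src = img.dataset.src;
  return true;
}

const hero: LazyImage = { dataset: { src: "/files/hero.jpg" } };
loadLazyImage(hero);
console.log(hero.src);
```

Because the browser only requests an image once `src` is set, deferring that assignment until the image nears the viewport is what trims the initial page weight.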

Test, Error. Repeat. QA is Important

It is very important to make the quality of your website an utmost priority before you move on to other technical work. While this might feel like a lot of work for your team, it can make a real difference. A review by your peers is a must and will help you get an additional perspective on how things look and should work.

Successful regression testing gives you the much-needed confidence.

It is also very important to keep an eye on existing functionality while adding new features. The PHPUnit testing framework is built into Drupal 8. By setting up the testing environment, websites can be tested easily (sample test cases written by the community are already available).

Drupal has very active community support, with almost 100,000 active developers who write test cases (which are later merged into Drupal) and submit solutions. With this, you can say that Drupal is quick in providing solutions to your problems.

Aim for Google Page 1. Don’t Forget the SEO 

75% of users don’t even click past the first page! 

Starting with search, it is very important that the user gets the best results on her first search. As a user, you must have tried different keywords in an attempt to connect to the content of a website. Here comes the art (and science) of SEO. Drupal offers a suite of SEO modules. However, merely deploying the modules isn’t enough; it is important to configure and enable them as well.

The SEO best practices include: 

  1. Using robots.txt, so that the right pages and information are indexed.
     
  2. Ability to customize page titles and metadata. Also, it should be capable of automatically populating these respective fields as per SEO norms and best practices. 
     
  3. Navigational drop-down menus are crucial internal link structures, silently contributing to search engine optimisation. They establish relevancy and hierarchy across your website, helping search engines index it properly from the start. Easy customisation of navigation menus should also be provided.
     
  4. URL aliasing must be enabled with Pathauto, as it helps the search engine understand what the webpage is about.
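The kind of clean alias Pathauto derives from a node title can be sketched in a few lines. This is a rough TypeScript approximation of the idea, not Pathauto's actual algorithm, and the `/blog` prefix is an assumed pattern for illustration:

```typescript
// Turn a node title into a clean, lowercase, dash-separated URL alias.
function toUrlAlias(title: string, prefix = "/blog"): string {
  const slug = title
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-") // collapse spaces and punctuation into dashes
    .replace(/^-+|-+$/g, "");    // trim leading/trailing dashes
  return `${prefix}/${slug}`;
}

console.log(toUrlAlias("8 Best Drupal Practices!"));
```

A human-readable path like this tells both users and search engines what the page is about, which is exactly why the practice above recommends enabling aliasing.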

Security Practices

Security is a vast area of expertise and it changes quickly with time. While the list of dos and don’ts for keeping up with threats, vulnerabilities and mitigation strategies is extensive and exhausting, here are the best Drupal security practices to follow in order to maintain the health and security of your website.

  1. Keep your core updated: A key practice, keeping the core updated will always come first when listing healthy security practices. Always look out for core updates (including minor releases). In all of its advisories, the Drupal Security Team asks users to update the core version of the system.
     
  2. Use additional security modules: When it comes to security, there is nothing better than equipping yourself with extra layers. To keep the walls up high, you can use additional security modules like Security Kit, CAPTCHA, and Paranoia. Security Review can be used as a checklist to test and check for many of the easy-to-make mistakes that leave your site vulnerable.
  3. But use only security-team-approved modules: Your site probably uses a number of contributed modules, and that’s not an issue in itself. The key lies in using stable, approved modules. This is especially worth noting for contrib modules, which are more susceptible to vulnerabilities.

    Always look for the green badge when downloading a contrib module. Otherwise, as the advisory reads: use it at your own risk!

  4. Keep up your backups: As an administrator, you have to be prepared for all uninvited events.

Drupal’s open source foundations mean it is frequently updated with more and better security modules.

Maintenance Practices

The life cycle of a website begins from initial plans and extends to the end of the site. The site exists in three different phases: development, deployment, and maintenance. After the site is launched, your website lifecycle practices become critical to the success of changing and maintaining your site.

  • Keep your code under version control.
  • Maintain separate environments for the different stages of the site, and keep them up to date.
  • Restrict access to the production site for all but the most trusted users.
  • Review all logs periodically, including Apache, Drupal, and MySQL.
  • Review and assess your architecture periodically, and plan for the future.
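
The log review point above can start very small, for example by counting error responses in an Apache combined-format access log. A sketch follows; the sample log lines are fabricated for illustration, and in practice you would point the script at /var/log/apache2/access.log or wherever your logs live:

```shell
# Create a small sample access log (illustrative data only).
LOG=$(mktemp)
cat > "$LOG" <<'EOF'
203.0.113.7 - - [01/Dec/2018:10:00:00 +0000] "GET / HTTP/1.1" 200 512
203.0.113.7 - - [01/Dec/2018:10:00:01 +0000] "GET /admin HTTP/1.1" 403 128
203.0.113.7 - - [01/Dec/2018:10:00:02 +0000] "GET /broken HTTP/1.1" 500 64
EOF

# Field 9 of the combined log format is the HTTP status code;
# count the 4xx/5xx responses.
awk '$9 ~ /^[45]/ {count++} END {print count+0}' "$LOG"   # prints 2
```

A sudden jump in that number is often the first visible sign of a scanner probing your site.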

At OpenSense Labs, we understand how important your website is to you. Reach out to us at [email protected] to get a holistic view on how to enhance your user experience.

Dec 07 2018
Dec 07

Online stores open unlimited opportunities with no geographical boundaries. Behind their lines of code are successful purchases, great profits, and happy customers. So online stores should be reliable, efficient, and attractive in everything — from product catalog to e-commerce checkout. An awesome choice for building an online store is Drupal 8, particularly with one of its greatest treasures — Drupal Commerce 2.x. Let’s explore Drupal Commerce 2.x features for your Drupal 8 online store in more detail.

But first, why choose Drupal 8 for e-commerce websites?

Before we move on to Drupal Commerce 2.x features, we'll start by mentioning at least a couple of Drupal 8 characteristics that provide great support for a Drupal 8 e-commerce website.

  • Content + e-commerce = more conversions. Drupal is a powerful content management system, which means you are getting more than just an online store. You will have a website full of diverse content types to support your store with good content. Content-driven commerce is a sure way to engaged customers, better SEO, and increased conversions.
  • Mobile responsiveness. Since mobile sales are set to overtake desktop sales in the coming years, it is awesome that Drupal 8 is mobile-responsive out of the box. Drupal 8 has everything needed to make any theme responsive, so its elements adjust smoothly to mobile device screens. Let your mobile users shop with pleasure!
  • Better performance. Loading speed is especially important for online stores. Users want to browse your website, compare various products, and place orders without delay. Drupal 8 has powerful caching techniques for faster loading. One of them is the innovative BigPipe, which lets you serve static website elements to users instantly and load dynamic ones next.
  • Third-party integration. An e-commerce website usually relies on third-party services for payments, marketing analytics, online customer support, CRMs, and so on. Drupal 8 has built-in web services that make integrations incredibly smooth. 
  • Multi-language. Customers are more willing to buy when they see information in their native language. Drupal 8 has awesome multi-language support. It is very easy to add languages to websites and translate exactly what you need. Interface translations are available for around a hundred languages.

Drupal Commerce 2.x for your Drupal 8 online store

Drupal Commerce is a solution for creating e-commerce sites from scratch or adding e-commerce features of any complexity to existing sites. It is free and open-source, like any other contributed Drupal module. However, it works like a full-fledged e-commerce framework that covers all needs of an online store: product catalog, cart, checkout, shipping, orders, and so on.

Drupal Commerce 2 is a version for Drupal 8, and it has recently issued a new update — Drupal Commerce 2.10, which shows great advancements in product administration, third-party integration, and more. 
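
On a Composer-managed Drupal 8 site, Commerce is pulled in like any other contributed project. A composer.json excerpt might look like this; the version constraint here is illustrative, so pick whatever release fits your project:

```json
{
    "require": {
        "drupal/commerce": "^2.10"
    }
}
```

Running `composer require drupal/commerce` achieves the same thing and also fetches dependencies such as the Address and State Machine modules that Commerce builds on.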

There is also an ocean of free add-on contributed modules that extend Drupal Commerce 2.x features endlessly. However, Drupal Commerce 2.x out of the box comes with an impressive pack of submodules that show its comprehensive capabilities:

Drupal Commerce 2-x - submodules

So let’s see what your online store can enjoy!

Why use Drupal Commerce 2.x for your online store

  • Single or multi-stores available

One of the awesome Drupal Commerce 2.x features is that, instead of having one store, you can enjoy multiple stores in Drupal 8. Each of them can have its own product types, currencies, taxes, contact information, cart, checkout, and everything else.

  • Smart support for multiple currencies 

Drupal 8 lets you add multiple currencies from a predefined list, as well as custom ones by providing the currency code. 

Here, the true multilingual soul of Drupal 8 shows perfectly! Drupal Commerce 2.x takes into account the names of each currency in each language, their formatting, and other important details. 

This extensive information is provided by the commerceguys/intl library. It works in accordance with the internationally recognized CLDR standard.

Drupal Commerce - adding custom currency
  • Awesome product handling 

Your Drupal 8 online store can have whatever products you wish — from simple ones to products with attributes. Each unique combination of attributes is a product variation in Drupal Commerce 2.x, and it has its own SKU, or machine-readable ID. This is very convenient for managing your orders and stocks. 

Thanks to the Fancy Attributes module now included out-of-box, creative selectors like color swatch are available for product attributes. Multilingual products are easy to set up.

  • 70+ payment gateways & flexible payments features 

Customers are more likely to buy when they have their preferred payment gateways. One of the great Drupal Commerce 2.x features is the support for 70+ payment gateways! Authorize.Net, Braintree, PayPal, Square, Stripe, Vantiv are just the beginning of the list. There are add-on contributed modules for any provider you wish. Custom modules can be created for all others.

Drupal Commerce 2-x - adding payment gateway

The flexibility in payment types is also awesome:

  1. On-site payment gateways let customers enter credit card details directly on your site, and tokenization can be applied for data protection. 
  2. Off-site (redirect) gateways redirect customers to the third-party payment service and back to your site upon successful completion of the payment.
  3. Off-site (iframe) gateways do not redirect customers to the third-party service and handle the checkout process in an embedded frame. 
  4. There is also a built-in “Manual” gateway option for such payment methods as Cash on Delivery, Card on Delivery, Cheque, Bank Transfer, and so on. They are marked as pending payments and are moved to complete by your store admin. 
  5. It is also possible to set up an IPN (Instant Payment Notification) service and be instantly notified about events related to PayPal transactions.

Your Drupal 8 online store will have handy interfaces that let you authorize, void, or refund the payments.

  • Flexible promotions and discounts

Engage your customers with promotions! Your Drupal 8 online store offers a number of built-in offer types for fixed or percentage amounts, which can be combined with conditions that you set.

Drupal Commerce 2-x - adding promotion
  • Smart handling for taxes

There is no need to bother about specifying the product price with taxes. Drupal 8 can automatically calculate the taxes for you. It has predefined taxes for countries and also lets every merchant define special taxes. The system knows taxing specifics, for example, the difference in VAT calculation for physical and digital products in the EU. 

  • Multiple order types

Your Drupal 8 online store can have not one but multiple order types, with a special workflow for each. This allows for creating fine-tuned experiences for various product types. For example, selling tickets needs a different workflow than selling T-shirts. Order receipt emails are also very customizable.

  • Flexible checkout options

Checkouts are incredibly handy. They offer progress indication bars, the ability for users to check out as returning customers and reuse previously entered information, the option for you to allow or disallow guest checkout, and so on.

Moreover, there can be different checkout flows for different product types. They may vary in the number of steps to take, which customer information to require, and so on.

Drupal Commerce 2-x - checkout flows
  • Great shipping options

With Drupal Commerce 2.x, you can configure the shipping methods, define which products are shippable, and manage the shipments. 

The basic features are handled by an add-on module called Commerce Shipping. It lets you set up such shipping methods as flat rate per order and flat rate per item. 

For more specific and advanced shipping integrations, there is a plugin-based system. Plugins already exist for popular carriers, and developers can create one for any other carrier you prefer. It is also possible to provide shipping rates based on conditions.

Create an online store with Commerce 2.x on Drupal 8

This has been just a brief overview of Drupal Commerce 2.x features. This box of treasures can offer much more in good hands. In addition, customization miracles can fine-tune your e-commerce website in every detail to your liking. 

Entrust your Drupal 8 e-commerce website to our Drupal development team. Our guys have a lot of experience with e-commerce and know how to use the benefits of Drupal Commerce 2.x in the best ways for you.

Dec 07 2018
Dec 07

I don't find a lot of time to get on the tools these days and sometimes I miss getting into code. Recent projects have seen me focus on strategy, architecture, data and systems integration. Despite that, I am comfortable describing myself as an expert in Drupal 7, having spent years giving the D7 codebase a forensic examination. However, although Drupal 8.0.0 was released three years ago, on November 19, 2015, I have not yet looked at a single line of its code or even taken in a demo of its feature set.

Today that changes.

For starters I would like to see just how D8 differs from D7 when we start looking at the code used to build it and the database behind it. The areas I am interested in are:

  • Tech stack
  • Server Setup
  • Installing Drupal
  • File System Layout
  • Page Lifecycle
  • Creating a Theme
  • Site Building
  • Database
  • Deployment process
  • Drush

To that end I am going to create a small, Drupal 8 brochure site for my consulting business, https://fullstackconsulting.co.uk and every step of the way I will dig into the code to see how Drupal 8 differs from Drupal 7.

Tech Stack

It was very tempting to get up and running quickly with one of the Docker containers, but while I would ultimately like to move to a containerised solution my priority today is to take a look at all the nuts and bolts, starting with building the stack that Drupal will run on top of. Therefore, while it is great to see this advancement I am not going to dig into it today.

Instead I am going to create a Puppet profile to handle the server configuration, which will let me tweak and rebuild the server when I need to as I learn more about D8.

First job, select a database server. Many years ago MySQL was the only viable option when spinning up a new Drupal installation, but today the list of recommended database vendors is: MySQL, MariaDB or Percona Server.

We can also choose between versions, for instance MySQL 5.5, 5.6, 5.7 or 8.0, or the corresponding versions of MariaDB or Percona Server.

Of course, MariaDB and Percona Server represent forks of MySQL, so how do we choose which one to select? First a quick recap of the history. MySQL started life under the Swedish company MySQL AB, Sun Microsystems acquired MySQL AB in 2008 and Oracle acquired Sun in 2010. Oracle is already a big player in the database market and that caused concerns, although during acquisition negotiations with the European Commission, Oracle committed to keeping MySQL alive.

Some MySQL folk had already left MySQL and founded Percona in 2006, while others jumped ship to create the MariaDB fork in 2008. General opinion seems to be that the MySQL community has contracted, now mainly relying on Oracle employees for commits, while the MariaDB community is thriving. 

The Percona and MariaDB forks get a lot of credit for being significantly more performant and efficient with memory, and this is appealing after running some big Drupal 7 sites with many entity types and lots of fields. But equally, MySQL 8 claims to be 2x faster than 5.7 with up to 1.8 million queries/second.

Percona aims to stay closer to the MySQL code base, meaning updates to MySQL surface quicker for Percona than they do for MariaDB, but MariaDB will tend to be more ambitious in terms of adding new features.

Pantheon, a major managed Drupal host, has adopted MariaDB, which certainly gives a lot of confidence in that approach.

I am not going to benchmark this myself today as I am focussing on Drupal not the database engine it uses. That said, I would like to come back and revisit this topic to see which variant wins the performance contest with a heavyweight setup.

If you need to select a database server for a live project that you expect to be heavy on the database I would suggest you consider the following:

  1. Create some actual tests to establish performance supremacy in your own context.
  2. MariaDB's broader feature set could add more value in a custom-build project than in a Drupal project, which ought to adhere closely to standards for wide compatibility. But do you see a feature that could add value to your project that is not available in the others?
  3. Look at your neighbours. What are they using and why?
  4. People close to MySQL, MariaDB and Percona comment that Oracle is proving to be a good steward for the MySQL open source project, and so maintaining alignment is a positive thing.
  5. Does your OS have a preferred package? If not, would you be prepared to manage packages yourself in order to deviate? The ability to maintain your stack is paramount.

For starters, the stack for this project will look like this:

  • Ubuntu 18.04 LTS
  • Apache 2.4
  • MySQL 5.7 (because it is the one supported by Ubuntu 18.04 out of the box)
  • PHP 7.2

Server Setup

  1. Create a VM. I'm on Windows 10, so it will be Hyper-V
  2. Install Ubuntu 18
  3. Update package list
    1. apt update
  4. Upgrade all upgradable packages, updating dependencies:
    1. apt dist-upgrade
  5. Create a user account to own the files that will exist in the website docroot. This user should be in the www-data group, assuming a standard Apache install. This user will allow us to move files around and execute Composer commands - Composer will complain if you try to execute it as root, more on that later.
  6. Many VM consoles lack flexibility, such as copy/paste for the command line. It will save time if I set up keyless SSH access to the VM. But until that is set up I need an easy way to move data/files from our host machine to our VM. One easy way to achieve this is to create an iso. Most VM software will let you load this onto the VM via the virtual DVD drive.
    1. I will create a private and public key that will be used to:
      1. Access the VM from our host OS without password prompt
      2. Access the git repository holding our code
    2. To create the SSH key and add it to the iso I use the Linux subsystem in Windows 10 to execute the following commands:
      1. mkdir certs
      2. ssh-keygen -t rsa -b 4096 -C "[email protected]"
        1. When prompted, change the path to the newly created directory
      3. genisoimage -f -J -joliet-long -r -allow-lowercase -allow-multidot -o certs.iso certs
        1. In case you use non-standard names the flags in this command prevent the iso from shortening your filename.
    3. Via your VM software load the iso into the DVD Drive.
      1. Mount the DVD Drive
        1. mkdir /mnt/cdrom
        2. mount /dev/cdrom /mnt/cdrom
        3. ls -al /mnt/cdrom
          1. You should see your SSH key listed
    4. Copy the SSH private and public key to the relevant location
      1. mkdir /root/.ssh
      2. cp /mnt/cdrom/id_rsa /root/.ssh
      3. cp /mnt/cdrom/id_rsa.pub /root/.ssh
    5. Add the public key to the authorized_keys file, to facilitate login from your host OS. Using some output redirection makes this easy:
      1. cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
    6. Tighten up file permissions to keep the SSH client happy
      1. chmod 0600 -R /root/.ssh
        1. You'll need to do this on both the VM and the host
    7. Find the IP address of your VM
      1. ifconfig
    8. Test your login from your host OS
      1. ssh [email protected]_IP_ADDRESS -i /path/to/your/ssh_key
  7. Setup Bitbucket
    1. Add the public key to your Bitbucket/GitHub/GitLab profile.
    2. If you use a non-standard name for your SSH key, you can tell the SSH client where to find it by providing SSH config directives in /root/.ssh/config on the VM (on your host you will likely not be using root):
    3. host bitbucket.org
          HostName bitbucket.org
          IdentityFile ~/.ssh/YOUR_PRIVATE_KEY
          User git
  8. Deploy DevOps scripts to VM
    1. I want to get my Puppet profile onto the VM. One day maybe there will be a team working on this, so I follow a process that uses Git to deploy DevOps-related scripts, including Puppet, onto the VM. This means that any time the DevOps scripts are updated it will be simple for all team members to get those updates onto their own VMs. The Git repo is managed in Bitbucket, so that means I need to get an SSH key onto the VM and then register it on a relevant Bitbucket account.
  9. Now I can deploy the Puppet profile from the git repo and install it on the VM
    1. mkdir /var/DevOps
    2. cd /var
    3. git clone [email protected]_REPO.git DevOps
  10. Puppet is not contained in the official Ubuntu package repositories, but Puppet does maintain its own package repository, which is what I am going to use for this setup:
    1. https://puppet.com/docs/puppet/5.4/puppet_platform.html
  11. No Drupal development environment is complete without XDebug. Again, this is not present in the official Ubuntu package repositories, so I am going to enable the Universe repository by adding the following to /etc/apt/sources.list.d/drupaldemo.list:
    1. deb http://archive.ubuntu.com/ubuntu bionic universe
      deb-src http://archive.ubuntu.com/ubuntu bionic universe
      deb http://us.archive.ubuntu.com/ubuntu/ bionic universe
      deb-src http://us.archive.ubuntu.com/ubuntu/ bionic universe
      deb http://us.archive.ubuntu.com/ubuntu/ bionic-updates universe
      deb-src http://us.archive.ubuntu.com/ubuntu/ bionic-updates universe
      deb http://security.ubuntu.com/ubuntu bionic-security universe
      deb-src http://security.ubuntu.com/ubuntu bionic-security universe
    2. I manage these repositories via the Puppet profile
    3. I won't need this on the production system, so I can keep the production OS to official packages only, not using the Universe repository
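
The key-related steps above condense into a few commands. Here is a sketch, in which the directory is a stand-in for /root/.ssh and the email is a placeholder:

```shell
# Stand-in for /root/.ssh; use the real directory on your VM.
SSH_DIR=$(mktemp -d)

# Generate the key pair non-interactively (-N "" means no passphrase).
ssh-keygen -t rsa -b 4096 -C "demo@example.com" -N "" -q -f "$SSH_DIR/id_rsa"

# Append (never overwrite) the public key to authorized_keys.
cat "$SSH_DIR/id_rsa.pub" >> "$SSH_DIR/authorized_keys"

# Tighten permissions to keep the SSH client and server happy.
chmod 0600 "$SSH_DIR/id_rsa" "$SSH_DIR/authorized_keys"
```

Appending rather than redirecting with a single `>` matters: an overwrite would lock out any other keys already authorised on the machine.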

Installing Drupal

The options are:

  1. Download the code from the project page: https://www.drupal.org/download
  2. Use Composer - a PHP dependency manager: https://getcomposer.org/

The recommended option is to use Composer; that way, dependencies on libraries besides Drupal can all be managed together.

Composer will warn if it is executed as root. Fortunately I already created a user in the www-data group, so I can use that user to execute the Composer commands. The reason root is not recommended is that some commands, such as exec, install, and update, allow execution of third-party code on our system. Since plugins and scripts have full access to the user account which runs Composer, if executed as root they could cause a lot of damage should they contain malicious, or even just broken, code.

There is a kickstarter Composer template at drupal-composer/drupal-project. Not only will this install the core project, but it will also install utilities such as Drush, and it will configure Composer to install Drupal themes and modules into appropriately named directories, rather than installing everything into Composer's standard /vendor directory. Using that project, the Drupal codebase is installed with this command:

composer create-project drupal-composer/drupal-project:8.x-dev my_site_name_dir --stability dev --no-interaction

Another file layout point is that it will load the core Drupal files into a subdirectory named "web".

Because Drupal is being built with Composer there is a /vendor directory, which will host all of the libraries installed by Composer. This presents another choice to make, do I:

  1. Commit the contents of the vendor directory to git
  2. Add the vendor directory to gitignore.

The argument for committing it is that all the code our project needs is stored, versioned, and cannot change unless explicitly updated, making it stable. The argument against is that we significantly increase the size of the versioned code base and duplicate the history of the dependencies into our git repository. It is also possible to pin our project to specific versions of libraries via composer.json configuration, meaning we do not need to be concerned about stability.

I will follow the recommendation of Composer and ignore the vendor directory with the following in .gitignore, which is already added if using the Composer kickstarter project:
/vendor/

My Puppet profile has already taken care of creating the relevant MySQL database and user, so the next step is to run the installer in the browser. As with Drupal 7, since Drupal is not already installed I get automatically redirected to the installer:
https://DRUPAL/core/install.php

"Select an installation profile" - this feels very familiar from D7. I choose the Standard profile, which includes some fairly well used modules relating to administrative tasks, text formats etc. Minimal is a little too stripped back for most typical needs, but great for a barebones install.

I complete the install by heading to:
https://DRUPAL/admin/reports/status
This report will instruct us to tighten up file permissions, define trusted hosts, etc., in order to secure our installation.

After completing the config screen I am redirected to the front page and… it looks just like a fresh D7 installation. The admin menu looks a bit different and I see the layout responding to some screen width breakpoints. But that's enough about the user interface, let's see where the differences are under the bonnet.

File System Layout

The drupal-composer project has setup some directories:
/web - the document root as served by the web server is now a subdirectory of the project root.
/web/core - this is where the core Drupal project files are installed
/web/libraries - libraries that can be shared between core, modules and themes
/web/modules/contrib - modules maintained by the community
/web/profiles/contrib - profiles maintained by the community
/web/themes/contrib - themes maintained by the community
/drush/Commands - command files to extend drush functionality

If we write some code ourselves it should go here:
/web/modules/custom - modules we develop ourselves
/web/themes/custom - themes we develop ourselves

I can see that module and theme code location differs from the D7 standard of placing it at:
sites/all/modules/[contrib|custom]
sites/all/themes/[contrib|custom]

Composer will install non-Drupal dependencies into:
/vendor

Core system files have moved. Under D7 we had:
/includes - fundamental functionality
/misc - js/css
/modules - core modules
/scripts - utility shell scripts

Under D8 we have:
/web/core/includes
/web/core/misc
/web/core/modules
/web/core/scripts

Overall, this feels very familiar so far. But there are some new directories in D8:
/web/core/assets/vendor - js and css for external libraries such as jQuery

We have yaml based configuration scripts. The ones for core are here:
/web/core/config/install - only read on installation
/web/core/config/schema - covers data types, menus, entities and more

Modules can follow this same convention, defining both the install and schema yaml scripts.
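
For instance, a hypothetical custom module called mymodule could ship default configuration like this; the file name, keys, and values are all made up for illustration:

```yaml
# web/modules/custom/mymodule/config/install/mymodule.settings.yml
# Default values, imported once when the module is installed.
items_per_page: 10
show_teasers: true
```

A matching mymodule.schema.yml under config/schema would then describe the data types of those keys, so Drupal can validate and translate the configuration.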

There is a new directory for classes provided by Drupal core:
/web/core/lib

A functional test suite can be found at:
/web/core/tests

Page Lifecycle

URL routing

As in D7, .htaccess routes requests to index.php. However, there is another script that looks like it could play a part in URL routing:
.ht.router.php

Upon closer inspection though, .ht.router.php is used by PHP's built-in web server and does not typically play a role in menu routing.

Request Handling

A fairly standard Inversion of Control principle is followed, as was the case with Drupal 7 and earlier. Apache routes the request to index.php which orchestrates the loading of the Drupal Kernel and subsequent handling of the request, executing any custom code we may implement at the appropriate point in the request lifecycle.

It is apparent right away that Drupal 8 is object oriented, loading classes from the DrupalKernel and Request namespaces, the latter being part of the Symfony framework.

use Drupal\Core\DrupalKernel;
use Symfony\Component\HttpFoundation\Request;

We won't have to manually include class files, as long as conventions are followed, because there is an autoload script that will scan the vendor directory:
$autoloader = require_once 'autoload.php';

Now it is time to initiate an XDebug session so I can trace a request from start to finish and see exactly what this new DrupalKernel class does:

$kernel = new DrupalKernel('prod', $autoloader);

The first observation is that an environment parameter is being passed to the DrupalKernel constructor; this indicates that we can have different invocations for dev, staging, and prod environments.

The $kernel object is also initialised with an instance of the autoloader, which we can expect to be used frequently in the newly object oriented D8. Next step is to create a $request object:

$request = Request::createFromGlobals();

This is standard Symfony code storing GPCES data on $request. Now the DrupalKernel class handles the request:

$response = $kernel->handle($request);
Within the handle() method we find that site settings are handled slightly differently: the global $conf variable is gone, and the variable_get() function has been replaced with a Settings class exposing a static get() method to retrieve specific settings. But then things start to look familiar, with the inclusion of bootstrap.inc and the setup of some PHP settings. After some OO-based config loading and PageCache handling we come across more familiar territory with the DrupalKernel->preHandle() method, which invokes loadLegacyIncludes() to require_once many .inc scripts, such as common.inc and database.inc - this has a very familiar feel to it.

Module loading appears to have had an update, with the invocation of the Drupal\Core\DependencyInjection\Container class being used to load all enabled modules.

Where things have changed significantly is in processing the URI to identify the menu item that defines the callback function to invoke. The new approach is much more in keeping with object-oriented concepts, favouring an event dispatcher pattern, invoked from within the Symfony HttpKernel class. There are listeners defined in Symfony and Drupal classes handling such tasks as redirecting requests with multiple leading slashes, authenticating the user, etc.

We haven't got as far as looking at modules yet, but it looks like Drupal modules are now able to define listeners for these events, nice.

Once the kernel events have all been handled a $controller object is initialised. Interestingly, before the $controller is used a controller event is dispatched, giving the opportunity for modules to modify or change the controller being used.

The registered controller is responsible for identifying the class, such as ViewPageController, that will generate the render array that is used for rendering the response body.

An interesting observation, given that this debug session was initiated on a bog-standard Drupal front page where the path is equivalent to /node: the ViewPageController is invoked, and it has code that feels very similar to the very popular Views module from D7, with concepts such as view_id and display_id. This makes sense, because now that Views has been baked into Drupal core in D8, we would expect a page listing multiple nodes to be powered by Views rather than some case-specific database query.

Response handling has certainly had a refresh, no longer relying on drupal_deliver_html_page() to set the response headers and body, in favour of the Response class from the Symfony framework.

There are lots of areas to look at in more detail, such as how blocks are added to the page render array etc, but from this whirlwind tour of the codebase there are some very nice architectural improvements, while also retaining a high degree of familiarity.
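
Pulling the snippets above together, D8's front controller is remarkably compact. This is roughly the whole of index.php, simplified; check your own install for the exact file:

```php
<?php

use Drupal\Core\DrupalKernel;
use Symfony\Component\HttpFoundation\Request;

// Composer's autoloader finds classes by namespace convention.
$autoloader = require_once 'autoload.php';

// Boot the kernel for the 'prod' environment, handle the request,
// send the response, and fire any post-response termination work.
$kernel = new DrupalKernel('prod', $autoloader);
$request = Request::createFromGlobals();
$response = $kernel->handle($request);
$response->send();

$kernel->terminate($request, $response);
```

Compare that with D7's index.php, which called drupal_bootstrap() and menu_execute_active_handler() procedurally; the flow is the same, but every step is now an object with a defined contract.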

Creating a Theme

Before looking at how theming works in Drupal 8 I need to make a call over whether to use a base theme like Bootstrap, or a much lighter base theme such as Classy, one of the core themes added in D8.

Bootstrap would give more to play with out of the box, but my feeling is that it offers more value if you are building multiple sites, often using the various Bootstrap components, so you enjoy more benefit due to reuse which makes the effort learning the framework worthwhile. 

Since the motivation behind this exercise is to explore the capability of Drupal core in D8, I don't want to muddy the waters by adding a lot of additional functionality from contrib modules and themes unless I really need it. This approach will allow me to learn more quickly where D8 is strong and where it is deficient.

I am going to implement a responsive theme based on the core theme, Classy. For starters I create the theme directory at:
/web/themes/custom/themename

The .info script from D7 is gone and instead I start by creating:
themename.info.yml - configure the base theme, regions etc
themename.libraries.yml - specify CSS and JS scripts to load globally
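
A minimal themename.info.yml based on Classy might look like this; the description, region list, and library name are my own choices, not requirements:

```yaml
# web/themes/custom/themename/themename.info.yml
name: Themename
type: theme
description: 'A responsive theme based on Classy.'
core: 8.x
base theme: classy
libraries:
  - themename/global-styling
regions:
  header: Header
  content: Content
  sidebar: Sidebar
  footer: Footer
```

The matching themename.libraries.yml then defines global-styling, pointing at the CSS and JS files to load site-wide.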

At this point if I reload /admin/appearance I see my new theme listed as an uninstalled theme. If I go ahead and choose the option to Install and Set as Default, then the next time I load the home page it will be rendered in accordance with the new, empty theme.

By default I see that CSS aggregation is enabled, that is going to hinder my development efforts so I will turn that off for now.

Having defined the regions for my theme, adding blocks to those regions via the admin UI is very familiar. Altering the page layout is also very similar. Yes, the templating engine has changed, now using Twig, and variable embeds and control statements are noticeably different, using the {% %} and {{ }} markup. But while we are just setting up the HTML there is no significant difference at this stage.

One area I do need to pay attention to is laying out my theme resources. For instance, there is a much stricter convention for CSS scripts, in an attempt to bring more structure and readability; for full details take a look here:
https://www.drupal.org/docs/develop/standards/css/css-architecture-for-drupal-8

Another interesting detail is that the theme will by default look for the logo with an svg file extension. If you have a reason to use a png, you need to configure this theme setting in themes/custom/themename/config/install/themename.settings.yml:
logo:
  path: 'themes/custom/themename/logo.png'
  use_default: false

Site Building

Past D7 projects have used the Webform module extensively, but upon seeing there is no general release available for D8 I looked at other options. It was only then that I realised that the core Contact module has received a significant upgrade. When coupled with the Contact Storage and Contact Block modules, it should make Webform redundant in many, although probably not all, scenarios.

To kick things off I set up the contact form recipient, thank-you message, redirect path, and a couple of fields through the core admin UI:
https://DRUPAL/admin/structure/contact

I decided I wanted a contact form embedded in the footer of my page - for now I am going to ignore what that might mean for full page caching in D8, you know, the problem where a cached page contains a form that has expired.

This is the first point at which a new module needs to be installed. Considering that I am adopting Composer for managing dependencies this ought to be a little different to the D7 days. The steps I followed were:

Update composer.json to require the contact_block module:
"require": {
        ….
        ….
        "drupal/contact_block": "^1.4"
}

Ask composer to update dependencies and install the new module:
composer update

As per the "installer-paths" config in composer.json, the contact_block module installed into the appropriate directory:
web/modules/contrib/contact_block
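For reference, the relevant composer.json section in a drupal-composer/drupal-project based codebase looks roughly like this (the project's defaults; treat the exact mapping as an assumption and check your own file):

```json
"extra": {
    "installer-paths": {
        "web/core": ["type:drupal-core"],
        "web/modules/contrib/{$name}": ["type:drupal-module"],
        "web/profiles/contrib/{$name}": ["type:drupal-profile"],
        "web/themes/contrib/{$name}": ["type:drupal-theme"]
    }
}
```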

Now to install the module. Back in D7 I would use:
drush en contact_block

However, there is a problem:
Command 'drush' not found, did you mean:
command 'rush' from deb rush
Try: sudo apt install <deb name>

Given that our composer.json config does fetch drush, we know it is installed; the issue is that it is not in our PATH variable. We can confirm that by getting a successful result when we reference the drush binary by its absolute path:
/path/to/drupal/vendor/bin/drush status

The issue is that, in order to avoid package dependency problems when managing the codebase with Composer, it is recommended that drush be installed per project. There is a small program we can install in order to run drush in an unqualified context, which is perfect when combined with drush aliases to specify the Drupal root path:
https://github.com/drush-ops/drush-launcher
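A simpler alternative, if you mostly work inside a single project, is to put the project-local Composer bin directory on your PATH for the current shell session (a sketch; /path/to/drupal is the same placeholder used above):

```shell
# Prepend the project's Composer bin directory so `drush` resolves
# to the per-project copy installed by composer.json
export PATH="/path/to/drupal/vendor/bin:$PATH"
```

The drush-launcher is still the better fit when juggling several projects, since it locates the right per-project drush for you.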

As in D7, I can navigate through Admin > Structure > Block, but there are some differences once on the page. I scroll to my footer region and hit the "Place block" button, launching a modal that lists out the available blocks. The one I am interested in is "Contact block" and I press the corresponding "Place block" button.

The next form prompts me to select which contact form I would like to render along with standard block config such as Content types, Pages and Roles, which has a very familiar feel. After saving the block layout page and then reloading the frontend page the contact block appears as expected.

A cache clear is needed after playing around with the block placement and styling, so as a matter of habit I try:
drush cc all

I am almost surprised to see the accompanying error message:
`cache-clear all` is deprecated for Drupal 8 and later. Please use the `cache-rebuild` command instead.

That will take some getting used to!

I am not a fan of running a local mail server. Not only does it increase the management overhead, but deliverability rates are unlikely to be as high as with a proper mail provider, due to mail server trust scores, whitelists, etc. I have had good success on past projects routing all transactional emails through Sendgrid, and the free tier for low-usage projects is a great way to get started. What is more, there is already a stable contrib module available for D8. As before, I will start by adding the module reference to composer.json:
"require": {
        ….
        ….
        "drupal/sendgrid_integration": "^1.2"
}

Followed by:
composer update --dry-run

All looks good, so I kick off the update:
composer update

Then enable the Sendgrid module:
drush en sendgrid_integration

The sendgrid module tells us that it is dependent on this library:
https://github.com/taz77/sendgrid-php-ng

Because dependencies are being managed by Composer, I have no work to do here; Composer will take care of fetching that library and installing it into the vendor directory, because it is referenced in the composer.json of the sendgrid_integration module.

All that is left to do is login to Sendgrid, generate an API key and use that API key on the Sendgrid config form:
https://DRUPAL/admin/config/services/sendgrid

The Sendgrid module also depends on another contrib module, Mailsystem. This is very similar to D7: I can use the Mailsystem module to specify that Sendgrid should be the default mail sender:
https://DRUPAL/admin/config/system/mailsystem

Now I can fill in the contact form embedded in the page footer and have the results emailed to me by Sendgrid. That was a piece of cake.

Database

The first thing I notice when I list out the tables in the D8 database is that field handling seems to be different. In D7 the database would have field_data_* and field_revision_* tables for every field. These tables would contain the field values that relate to a particular entity. Those tables are absent, but so too are the tables that stored the field configuration: field_config and field_config_instance.

On closer inspection I can see that field data tables now seem to be entity type specific, for example:
node__field_image
node__field_tags

The field configuration points us towards where D8 has done a lot of work. Field configuration data has moved into a general config table. Looking at the names of the config settings it is apparent that the config table is responsible for storing config data for:
fields
field instances
node types
views
cron
text formats

..and more.

In other words, it looks like anything related to config will appear here. I am expecting that this, coupled with the Configuration API, is what will make promoting changes from dev through staging and into production much slicker than in D7.
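That promotion workflow can be sketched with Drush (these commands require a working Drupal 8 site with a configured sync directory, so this is illustrative rather than runnable here):

```shell
# On the source environment: export active config as YAML to the sync directory
drush config-export -y

# Commit the exported YAML to Git and deploy it, then on the target environment:
drush config-import -y
```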

The users table is a little cleaner: user data is no longer serialized as one single lump, but shifted to a separate table keyed on uid and module, and therefore stored in smaller chunks. Fields that were considered entity properties and stored directly on the users table in D7, such as pass, mail, timezone and created, have also been shifted to a separate table in D8.

It is similar with nodes: properties such as title, uid, created and sticky have been shifted to the node_field_data table.

On the whole though the database feels very similar. Field data is stored in a very similar fashion, retaining the bundle, entity_id, delta and one or more value columns.

Deployment process

I am going to avoid using a managed service such as Pantheon.io or Platform.sh for this project purely because I would like to see all of the moving parts, problems and general maintenance tasks that are required with a D8 install for the time being.

Instead I will use a plain VM. I will not be using one of the public clouds such as Azure, Google or AWS, because this Drupal project is very small and will have stable usage patterns, with no requirement for load balancers or elastic scaling capability in the near term. With that type of profile, using one of those cloud providers would be more expensive than a bog-standard VM from a provider such as Linode.

While writing the last two paragraphs my new VM has been provisioned, complete with an Ubuntu 18.04 LTS image. Fortunately, right at the start of this project I wrote all of the server configuration into Puppet manifests, so configuring this server, covering the entire tech stack and including firewall rules, will be a breeze.

Let's see how straightforward it is:

  1. SSH into new server
  2. Create a directory to hold our DevOps scripts
    1. mkdir /var/DevOps
  3. Create an SSH deploy key for the Git repo - the production environment should only need read access to the Git repo
    1. ssh-keygen -t rsa -b 4096 -C "[email protected]"
  4. Add the deploy key to the repo
    1. If you accepted the defaults from ssh-keygen this will be id_rsa.pub
  5. Clone the DevOps repo
    1. git clone [email protected]:reponame/devops.git /var/DevOps
  6. Set values in the Puppet config script
  7. Kick off the Puppet agent
  8. Update DNS records to point to the new server

That went without a hitch; all system requirements are now in place.

Puppet has taken care of checking out my Drupal Git repo into the web root, but as discussed earlier, this project does not commit the vendor libraries, contrib modules or core files since Composer is managing these for us. That means the next step is to ask Composer to update all of our dependencies:
composer update --no-dev

The --no-dev option is added to the update command because, when deploying into the production environment, we do not want libraries such as phpunit present, which could pose a security risk and needlessly bloat the size of the deployed code base.

Composer tells us that it has completed the installation of dependencies successfully, but we are not done yet because we don't have a database. Rather than complete the Drupal installation wizard to set up the database, I will promote the database used in the development environment into the production environment. Since the Drupal install wizard will not be used to set up the database, and since settings scripts containing database credentials are not committed to Git, a manual update is needed to /web/sites/default/settings.php

These are the settings that will be required at a minimum:
$settings['hash_salt']
$databases
$settings['trusted_host_patterns']
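Put together, the minimal additions to settings.php look something like this (a sketch; every value shown is a placeholder you must replace with your own):

```php
<?php
// web/sites/default/settings.php (excerpt) - placeholder values only.

// Random salt used for one-time login links, form tokens, etc.
$settings['hash_salt'] = 'REPLACE_WITH_A_LONG_RANDOM_STRING';

// Connection details for the database promoted from the dev environment.
$databases['default']['default'] = [
  'driver'   => 'mysql',
  'database' => 'DB_NAME',
  'username' => 'DB_USER',
  'password' => 'DB_PASS',
  'host'     => 'localhost',
  'port'     => '3306',
  'prefix'   => '',
];

// Guard against HTTP Host header spoofing.
$settings['trusted_host_patterns'] = [
  '^HOST_NAME$',
];
```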

The database export/import is a pretty straightforward task using mysqldump in the dev environment for the export and then the mysql cli in the production environment for the import.

But that approach would not have a great degree of reusability. Instead, I turn to drush.

Drush

Drush has a sql-sync command that allows us to specify a source and a target, and it will then take care of the relevant mysqldump and mysql commands. For maximum reusability it is possible to configure drush aliases so that all of the relevant server connection details are stored in a config script, resulting in the following simple command:
drush sql-sync @source @target

Up until Drush 8, which maintained support for D7, aliases were defined in a PHP script named according to this convention:
SITENAME.aliases.drushrc.php

But as of Drush 9 this has changed: the format is now YAML and the filename convention is:
SITENAME.site.yml

There are also some changes regarding where we place drush config scripts. The drupal-composer/drupal-project package that was used to build this project creates the following directory, where we can place drush scripts:
/path/to/drupal/drush/sites

However, I quite like the old convention of being able to store drush alias definitions in ~/.drush. This is because I tend to have Puppet manifests that configure all of this independently from the Drupal installation. This setup is still possible but first of all it is necessary to create the overall config script at ~/.drush/drush.yml, a path that drush will automatically scan. In that script we can specify that it will also be the location of site alias definitions with something like this:
drush:
  paths:
    alias-path:
      - "/home/USER_NAME/.drush"

Now drush will auto detect the alias script at ~/.drush/SITE_NAME.site.yml. The alias script will specify the connection properties for each version of the site:
dev:
  options:
    target-command-specific:
      sql-sync:
        enable:
          - stage_file_proxy
  root: /var/www/vhosts/DEV_HOST_NAME/web
  uri: 'https://DEV_HOST_NAME'
prod:
  host: HOST_NAME
  options: {  }
  root: /var/www/vhosts/HOST_NAME/web
  uri: 'https://HOST_NAME'
  user: USER_NAME
  ssh:
    options: '-o PasswordAuthentication=no -i ~/.ssh/id_rsa'

I test if this is working with these commands:
drush cache-rebuild
drush sa

Clearing the cache gives Drush a chance to find the new config scripts. The sa command is the abbreviated site-alias command and should list out the aliases that are defined in the site.yml script. Now check that the dev and prod config looks OK:
drush @SITE_NAME.dev status
drush @SITE_NAME.prod status

At this point I can execute a database sync:
drush sql-sync @SITE_NAME.dev @SITE_NAME.prod

Job done. Now the site loads correctly in the production environment.

Wrap Up

That is all I have time for in this walkthrough, and while there are some areas I would like to come back to, such as custom modules, querying the database, the Entity API, the Form API and configuration management, I have seen enough to confirm that Drupal 8 represents a great architectural step forward while feeling very familiar to those coming from a Drupal 7 background.
