Feb 27 2019

DrupalCamp London 2019 is approaching fast. Are you ready for another great time with Drupal? This year, 42 sessions on Drupal and related topics are scheduled. We hope that, with our help, you will choose the most promising lectures. Below are a few sessions you should definitely attend. We've picked topics for both experienced coders and beginners, as well as for business owners, editors, marketers and others.

1. Visual regression testing  

https://drupalcamp.london/session/visual-regression-testing-patterns

This talk will cover:

  • BackstopJS for visual regression testing on Pattern Lab patterns on a Drupal 8 theme. 
  • How to set up regression testing for each pattern and for the entire pattern library, and the problems you could run into when doing so. 
  • The benefits of using this approach.

To take part in this session, it’s best to have basic knowledge of how the integration of Drupal and Pattern Lab works. 

2. No Monkey Business Static Progressive Web Apps

https://drupalcamp.london/session/no-monkey-business-static-progressive-web-apps

This talk will cover:

  • An overview of the architecture used to deliver ii.co.uk.
  • How GatsbyJS was used to generate static content from Drupal and other dynamic sources.
  • How these pages were further hydrated with React for dynamic content after the initial page load.
  • The custom cache handling implemented for the content build pipelines.
  • Division of the responsibilities for content generation between GatsbyJS and Drupal.
  • Resolving the real-time preview issue without waiting for Gatsby's upcoming hosted paid preview service.

3. Layouts in Drupal: past, present and future

https://drupalcamp.london/session/layouts-drupal-past-present-and-future

This talk will cover:

  • The history of building layouts in Drupal. 
  • Using Node reference (CCK), Nodequeue and custom template to build newspaper and magazine style layouts in Drupal 5. 
  • Having a look at "page builders" like Panels, Context and Block Visibility Groups. 
  • Diving into CSS Grid layouts and using plugins like Masonry and GridStack for more advanced grid-style layouts. 
  • Alternative approaches like Paragraphs, ECK/IEF and Bricks to create custom layouts. 
  • The pros and cons of these layout approaches, and whether and why they are now outdated.
  • New Layout Builder and some possible new approaches for building layouts in Drupal.

4. Creating an enterprise level editorial experience for Drupal 8 using React

https://drupalcamp.london/session/creating-enterprise-level-editorial-experience-drupal-8-using-react

This talk will cover:

  • Recent project results where a decoupled application with React was created, allowing content to be edited directly in the frontend, using the possibilities of React to the fullest. 
  • Sharing an editorial experience with 'in-place' editing, 'context-sensitive' editing, 'drag-n-drop' content placement and creation, and much more.
  • Presentation of the application and vision of what an enterprise level editorial experience should look like.
  • What to expect when going fully decoupled with editorial experience and how this approach fits into the development of Drupal.

5. Migrate to Drupal

https://drupalcamp.london/session/migrate-drupal

The talk will cover:

  • Migrations in Drupal 7 and Drupal 8
  • Effective communication to project stakeholders
  • Writing and running efficient migrations

To take part in this session, it’s best to have basic PHP coding skills and an understanding of Drupal site building.

6. Droopler distribution - How can you save even 100 man-days during the development of a new website with Drupal

https://drupalcamp.london/session/droopler-distribution-how-can-you-save-even-100-man-days-during-development-new-website

Maciej Lukianski will show you that you don't need a budget of over ten thousand dollars to get a Drupal 8 website.

This talk will cover:

  • Droopler modules.
  • Which paragraphs Droopler offers to build your new page fast.
  • How simple it is to build a new landing page with Droopler in a live demo.
  • What ideas we have for the future functionalities of Droopler.

7. Drupal 8 SEO

https://drupalcamp.london/session/drupal-8-seo

This talk will cover:

  • Drupal modules, Google tools and external tools and how to use them to prepare an SEO strategy.
  • How to plan an SEO strategy for your website and how to compare your website with the competition.

8. Out of the Box Initiative Update

https://drupalcamp.london/session/out-box-initiative-update

This talk will cover:

  • The past and present state of the initiative.
  • Targets for inclusion in Drupal 8.7.0.
  • Ways to contribute to the project.
  • Plans for the more distant future.

9. Scrum everywhere - how we implemented Scrum not only in software development projects

https://drupalcamp.london/session/scrum-everywhere-how-we-implemented-scrum-not-only-software-development-projects

This talk will cover:

  • Using Scrum in the marketing team.
  • Using Scrum in the QA team to improve software testing across the whole company.
  • Using Scrum for company management.
  • Using Scrum for the training of junior developers.

10. Accessibility in UX Design: How we redesigned The University of West London’s website for everyone

https://drupalcamp.london/session/accessibility-ux-design-how-we-redesigned-university-west-londons-website-everyone

This talk will cover:

  • Importance of accessibility in design, showcasing examples from industry giants such as Microsoft.
  • Highlights of accessible design.

11. How to start contributing to Drupal without code

https://drupalcamp.london/session/how-start-contributing-drupal-without-code

This talk will cover:

  • Non-code contributions and impactful ways to get involved in the Drupal project.
  • How to get started.

Summary

The talks presented above are just a small sample of what you can learn during DrupalCamp London 2019. Undoubtedly, selecting the most important presentations out of 42 proposals is a real challenge; perhaps this list will help you choose. As always, the level of the conference will be deliberately high, so everyone should leave London with a huge dose of knowledge about Drupal innovations.

Feb 27 2019

If you’ve ever wished there was a way to easily import CSV data into your Drupal website then this post is for you. With the new Entity Import module, website admins can easily add and configure new importers that map CSV source files to Drupal 8 entities. Entity Import adds a user interface for Drupal 8’s core migration functionality. It lets you add importers for any entity in the system, map source data to entity fields, configure process pipelines, and of course, run the import.

Why Another Drupal Import Module?

In Drupal 8 there are already several good options for importing and migrating content using the admin interface.

The Feeds module is one approach for importing content, with a great user interface and huge community of support. Feeds has been around for 7+ years, has been downloaded 1.8 million times, and, according to its Drupal project page, is being used on more than 100,000 websites.

The Migrate module, a part of core for Drupal 8, is an incredibly powerful framework for migrating content into Drupal. With Migrate, developers can create sophisticated migrations and leverage powerful tools like rollback, Drush commands, and countless process plugins for incredibly complex pipelines.

We use both Migrate and Feeds extensively. (Check out this recent post from Joel about getting started with Migrate.) Recently, though, we needed something slightly different: First, we wanted to provide Drupal admins with a user interface for configuring complex migrations with dependencies. Second, we needed Drupal admins to be able to easily run new imports using their configured migrations by simply uploading a CSV. Essentially, we wanted to give Drupal admins an easy-to-use control panel built on top of Drupal core’s migrate system.

My First Use Case: Complex Data Visualizations

Here at Aten, we do a lot of work helping clients with effective data visualizations. Websites like the Guttmacher Institute’s data center and the Commonwealth Fund’s 2018 interactive scorecard are great examples. When I started working on another data-heavy project a few months ago, I needed to build a system for importing complex datasets for dynamic visualizations. Drupal’s core migration system was more than up for the task, but it lacks a simple UI for admins. So I set about building one, and created Entity Import.

Getting Started with Entity Import

Download and Install

Entity Import is a Drupal module and can be downloaded at https://Drupal.org/project/entity_import. Alternatively, you can install Entity Import with composer:

composer require drupal/entity_import

Entity Import is built directly on top of Drupal’s Migrate module, and no other modules are required.

Importer Configuration

Adding New Importers

Once the Entity Import module is installed, go to Admin > System > Entity Import to add new importers. Click “Add Importer.”

For each importer, you will need to provide:

  • Name - In my case I used “Dataset,” “Indicators,” and “Topics.” Generally speaking, I would name your importers after whatever types of data you are importing.
  • Description - An optional description field to help your Drupal administrators.
  • Page - Toggle this checkbox if you want to create an admin page for your importer. Your administrators will use the importer page to upload their CSV files.
  • Source Plugin - At the time of this writing, Entity Import provides just one source plugin: CSV. The system is fully pluggable, and I hope to add others – like XML or even direct database connections – in the future.
  • Plugin Options - Once you choose your source Plugin (i.e. CSV) you’ll have plugin-specific configuration options. For CSVs, you can choose whether or not to include a header row, as well as whether or not to support multiple file uploads.
  • Entity Type - Specify which entity type your data should be imported into.
  • Bundles - Once you pick an entity type, you can choose one or more bundles that your data can be imported into.
Field Mappings

Configuring Importers

Each Importer you add will have its own configuration options, available under Admin > Configuration > System > Entity Importer. Configuration options include:

  • Mappings - Add and configure mappings to map source data to the appropriate destination fields on your entity bundles. (Important side note: when creating mappings, the human readable name can be whatever you wish; the machine name, however, needs to match the column header from your source data.)
  • Process Plugins - When you add new mappings you can specify one or more process plugins directly in the admin interface. This is where things get interesting – and really powerful. Drupal 8 provides a number of process plugins for running transformations on data to be migrated (read more about process plugins in Joel’s recent migrations post). With Entity Import, you can specify one or more process plugins and drag them into whatever order you wish, building process pipelines as complicated (or simple) as you need. Your admins can even manage dependencies on other imports; for example, mapping category IDs for an article importer to categories from a category importer. Again, no coding necessary. Entity Import provides a user interface for leveraging some of Migrate’s most powerful functionality.
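The process pipeline idea can be sketched in a few lines. This is a conceptual illustration only, not Drupal's actual Migrate API: each mapped value flows through an ordered list of transforms before landing on its destination field, just like dragging process plugins into a sequence in the admin interface.

```javascript
// Conceptual sketch of a process pipeline (not Drupal's Migrate API).
// Each transform receives the output of the previous one, in order.
const slugPipeline = [
  (v) => v.trim(),                  // strip stray whitespace from the CSV cell
  (v) => v.toLowerCase(),           // normalize case
  (v) => v.replace(/\s+/g, '-'),    // collapse spaces into dashes
];

function runPipeline(value, steps) {
  return steps.reduce((acc, step) => step(acc), value);
}
```

For example, `runPipeline('  Course Catalog  ', slugPipeline)` yields `'course-catalog'`. Drupal's real process plugins work on the same principle, with many ready-made transforms and rollback support on top.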
Import Screen

Importing the Data

Once you’ve added and configured importers, go to Admin > Content > [Importer Name] to run a new import. You’ll see three tabs, as follows:

  • Import - This is the main import screen with upload fields to upload your CSV(s). If your import has dependencies specified in any of its process plugins, the import(s) it depends on will show up on this screen as well. (TODO: Currently, the interface for managing multiple, interdependent imports is a little complicated. I’d like to make it easier for users to visualize the status of all interdependent migrations at-a-glance.)
  • Actions - After you run an import, you can rollback using the options on the actions tab. (TODO: I’d like to add other actions as well; for example, the ability to change or reset import statuses.)
  • Logs - Migration logs are listed for each import, allowing admins to quickly see if there are any errors. You can quickly filter logs by message type (i.e. “notice” or “error”).
Import Actions

My Second Use Case: Importing Courses for a University

Soon after wrapping up a working prototype for the data visualization project I mentioned above, I was tasked with another project. A prominent university client needed to quickly import an entire course catalog into Drupal. Beyond running a single import, this particular organization needed the ability to upload new CSVs and update the catalog at any point in the future. The use case was a perfect match for Entity Import. I installed the module, spent a few minutes adding and configuring the course importer, and presto!

Next Steps for Entity Import

Writing clean, reusable code packaged as modules is a huge benefit for Drupal development workflows. Even better, Drupal.org module pages provide really great tools for collaboration with features like issue tracking, release management, and version control built into the interface. I have a few TODOs that I’ll be posting as issues in the days ahead, and I am excited to see if Entity Import fills a need for others like it has for me.

If you run a data- or content-intensive website and have trouble keeping everything up to date, Entity Import might be just the ticket. We’d love to give you a quick demo or talk through how this approach might help – just give us a shout and we’ll follow up!

Feb 27 2019

Drupal 8 websites can easily exchange data with third-party websites or apps. These can be iOS or Android devices, applications on Vue.js, React, Angular, or other JS frameworks, and so on. Web services in Drupal 8 core take care of the smooth interaction. To share Drupal data, developers often use REST export with Views in Drupal 8. Today, we will take a closer look at Views REST export.

REST export with Views in Drupal 8

The Views module is in Drupal 8 core, and lets us quickly create collections of content, users, taxonomy terms, and more. We can organize Drupal data into a View and prepare it for export in the JSON format needed by third-party apps. As a result, we will have a RESTful API endpoint that other resources can call to get our data.

Enabling the Drupal 8 modules for REST export

To provide REST export with Views in Drupal 8, we begin by enabling the core “Web services” modules. Two of the four will be enough for today’s example: RESTful Web Services and Serialization.

Creating a RESTful View

We are going to share Drupal content. For this, a custom content type or a default one (“Article” or “Basic page”) will be OK. Let’s use “Article.” We can create a couple of test articles (Article 1, Article 2, Article 3) and then move on to the View creation.

In the “Structure — Views” tab of our site’s admin, we click “Add new view,” and do 3 things:

  • Give our View a name.
  • Check “Provide a REST export” option (which we see next to the usual “Create a page” and “Create a block” options).
  • Write a REST export path (which other resources will call to get our content).

Configuring the RESTful View

Our RESTful View looks almost like the classic version, but the output is in JSON. It has standard Views capabilities, so we can:

  • choose to display fields, entities, or search results
  • do filtering and sorting
  • limit the number of results
  • create contextual filters

and much more.

Let’s filter our View to only show the “Article” content type.

Now we choose to display fields:

And we add a couple of fields to the View — “title” and “body.” It is also possible to specify a custom URL path for each field in the field settings, if we for some reason do not want to use their default ones.

An unusual but important element in the RESTful Views interface is the “Serializer” tool in the “Format” section. In its “Settings,” we can select the formats to output our content in.

In our case, we have a choice between JSON and XML (other modules would add more formats). If no format is chosen, the default output will be in JSON.

Let’s save our View. The data in JSON is available through the URL path that we provided (json/content). This is our REST API endpoint, and it has this structure:

website-example/json/content.

We receive the JSON output ASCII-encoded (non-ASCII characters are escaped) for security reasons. Our JSON is ready to be sent to third-party apps and decoded by them.
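As a hedged sketch of what a consuming app does with that endpoint's payload: the `title` and `body` field names match the View configured above, and the sample string mimics the escaped, ASCII-safe output, which any standard JSON parser decodes automatically.

```javascript
// Sample payload shaped like a Views REST export: an array of rows with the
// fields we added to the View. Non-ASCII characters arrive escaped (\uXXXX).
const payload = '[{"title":"Article 1","body":"Caf\\u00e9 content"}]';

// JSON.parse decodes the \uXXXX escapes back to readable text.
const rows = JSON.parse(payload);

console.log(rows[0].title);  // "Article 1"
console.log(rows[0].body);   // "Café content"
```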

To check whether the code validates, and to see it presented in a more “attractive” way, we can use a Chrome app or one of the online JSON formatters & validators.

Here’s how one of them processes our code:

Another JSON formatter decodes our JSON further and makes it even more human-readable:

JSON validates, so our View is ready to share content. We have successfully configured REST export with Views in Drupal 8.

Let’s create the desired export for your Drupal website

Although the REST architecture supports CRUD (create, read, update, and delete) operations, Views is only for R (reading).

So if you want mobile app users to be able to edit your Drupal 8 content, REST export with Views in Drupal 8 will not be enough. In that case, we could offer you a different solution, such as a full REST API in the JSON format, custom endpoints, and so on.

Whatever your ideas about third-party integration, our Drupal team will help you implement them!

Feb 27 2019

With the increased number of accessibility lawsuits for inaccessible websites, it's no wonder that offers for quick fixes are a hot commodity. Unfortunately, the saying, “You get what you pay for” may not apply to accessibility overlay solutions.

So, what do you do? First, let’s take a look at how quick-fix web accessibility overlay solutions actually work.

 

What are Web Accessibility Overlays?

Imagine your HTML web page on your web server. It can be an actual HTML file or a cached version of the page that your content management system creates on your behalf. Now, imagine a JavaScript file being added to the page, and that script making changes to the rendered HTML.

Tada! That JavaScript change is the fix you need, and your web page is now accessible. Or is it? That depends.

  • It depends on what’s wrong on your page.
  • It depends on whether JavaScript can actually fix the issue.
  • It depends on whether the cached HTML page has changed and whether the JavaScript still works.

 

Are Accessibility Overlays for You?

With all the depends-on-this and depends-on-that scenarios that can come up, we can’t say definitively one way or the other. Therefore, we are going to walk you through a process where you can decide what’s best for your site.

The following four-step process sounds simple, but in the long run, it might not be. We’re sorry, but it’s the nature of the beast.

  1. Find the issue.
  2. Identify the required fix.
  3. Implement a solution.
  4. Test the solution.

1. Find the Issue.

The short description of this step is to conduct an audit, using both automated tools and manual testing. If you don’t have the in-house skills to identify all your potential issues, Promet Source is here to help. 

Common web page accessibility issues that can be caught by automated testing include:

  • Missing Alt text for images
  • Insufficient color contrast
  • Label issues in your forms (there are other types of form issues as well)
  • Incorrect application of headings
  • Missing titles for iFrames
  • HTML elements missing landmark roles
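As a minimal illustration of what one automated check does (real audit tools such as axe or WAVE run dozens of such rules), here is a hypothetical rule that flags images whose alt attribute is missing entirely:

```javascript
// Hypothetical single audit rule. `images` stands in for data scraped from
// the DOM. Note that alt="" is valid for decorative images, so only a
// completely missing attribute is flagged here.
function findMissingAlt(images) {
  return images.filter((img) => img.alt === undefined);
}

// Example: only the second image would be flagged.
findMissingAlt([
  { src: 'logo.png', alt: 'Company logo' },
  { src: 'chart.png' },
]);
```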
     

Other issues that are better identified with manual testing include:

  • Insufficient alternative text for images
  • Lists that are not coded as lists
  • Attached non-web content, such as PDFs, are not accessible for one or many reasons
  • Missing media captions and/or transcripts

2. Identify the Required Fix.

So, how do you fix the issues? As you can imagine, given the list above, not all fixes are code based. Let’s look at a few examples.

  • Missing Alt text. It takes a human to see an image and describe it. You would need a JavaScript fix that could apply all the unique descriptions that are missing. 
  • Label issues in forms. Given the numerous issues associated with this topic, you would need a JavaScript fix for each type of issue. Of course, like Alt text, some labeling requires a human decision as to which fix renders the correct solution.
  • Keyboard focus not visible. To see where a user’s tab key has landed, the object needs to change; for example, a border is applied. Borders are managed in the CSS files, not the HTML file. You would need a JavaScript fix that could apply an inline style of your choosing to each tabbable object on the page.
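To make the keyboard-focus example concrete, here is a hedged sketch of the kind of script an overlay injects. The selector list, color and width are illustrative choices, not any product's actual code:

```javascript
// Pure helper: build the inline style an overlay would apply on focus.
function buildFocusStyle(color = '#1a73e8', width = '3px') {
  return `outline: ${width} solid ${color}; outline-offset: 2px;`;
}

// DOM wiring: apply the style whenever a focusable element gains focus,
// overriding the page's CSS without touching the source files.
function applyFocusFix(doc) {
  const focusable = doc.querySelectorAll('a, button, input, select, textarea, [tabindex]');
  focusable.forEach((el) => {
    el.addEventListener('focus', () => el.setAttribute('style', buildFocusStyle()));
    el.addEventListener('blur', () => el.removeAttribute('style'));
  });
}
```

Note that this patch lives only at runtime: if the page's markup changes under it, the fix can silently stop matching, which is exactly the maintenance risk discussed below.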

3. Implement a Solution.

Fix the source code, or go with an accessibility overlay solution. This is where a couple more “depends” questions come into play.

  • It depends on whether the cost to locate and configure the JavaScript fixes for your pages’ issues is less than the cost to fix the source.
  • It depends on your plans for updating your site.

And, let’s not forget the “depends” questions already asked. Is there a JavaScript solution for your issue? If not, fixing the source code might be your only option. 

This is where things can get complicated. Not all systems allow you access to resolve the issue. Therefore, you might have to switch to a system that supports accessibility or allows you to make the appropriate changes.

4. Test the Solution.

Just because a JavaScript solution looks like it will work doesn’t mean you can skip testing it. Remember, if your site runs on a content management system, you might find that the cached HTML page that gets “fixed” by the overlay solution no longer has the exact same broken code as it did a few hours ago.

So, test, test, test. And then, keep watch. 

Next, remember that new accessibility issues can creep in over time. For instance:

  • A content author introduces an inaccessible data table in a blog post.
  • Functionality from an external source changes, and not for the better.
  • A security patch is applied, and it’s just enough to confuse the overlay you have in place.

Continued monitoring is a must when dealing with quick-fix solutions.

 

Conclusion

Accessibility overlays can serve a purpose. However, before you commit to an overlay solution, perform a cost-benefit analysis. 

  • Is the cost of obtaining a quick fix higher or lower than the cost of fixing the source? 
  • Is monitoring and reapplying the overlay solution less costly than fixing the source? 
  • Is the risk of a lawsuit over an inaccessible page, where your overlay solution has stopped working, worth the quick fix?

At Promet Source, we are serious and passionate about accessibility. We’re here to help you evaluate your options and make decisions that are sustainable and strategically sound. Contact us today.

Feb 26 2019

Hi [person or company],

 

My name is Jacob Rockowitz and I am one of the proud maintainers of the Webform module. The Webform module is one of Drupal's most installed modules. It provides all the features expected from an enterprise proprietary form builder combined with the flexibility and openness of Drupal.

 

After three years of hard work, the Webform module for Drupal 8 now has a stable release. As with most open source projects, there is still work to be done. To help support our work, the maintainers of the Webform module have set up an Open Collective.

 

At [event] we spoke briefly about [company] becoming a backer of the Webform module’s Open Collective and other Drupal-related Open Collectives. Open Collective provides a transparent system to collect and distribute funds to help support the Drupal community’s ongoing efforts. We are going to use Open Collective to collect funds to help build, maintain, and promote the Webform module and its add-ons and integrations. Our goal is to help everyone have a successful experience using the Webform module. For us to help you succeed, we need to know what you need.

 

It is reasonable that organizations may have reservations about committing resources to open source projects. For large enterprise projects, providing some financial support will ensure stability and growth. Open Collective provides an easy way for organizations to get involved, and it lets us prevent technical hurdles from delaying your projects. For example, Open Collective can be used to push bug fixes, sponsor a feature, get support, provide training, improve documentation, and even provide mentorship.

 

Drupal, in its current form, is amazing because everyone has contributed to it in extraordinary ways. We collaborate on the code to build ambitious enterprise websites and applications. The biggest unknown when installing an open source module or theme is, "Is this software supported?" and "How can I get help?". Backing a Drupal-related Open Collective will help address both of these concerns and more.

 

Please consider becoming a $10 monthly backer of the Webform module’s Open Collective and other Drupal-related Open Collectives, including Simplytest.me and the Drupal Recording Initiative.

 

If you have ideas on how we could use Webform's Open Collective to help you and your clients succeed, please post your comments on this blog post.

 

Thanks for listening,

Jake Rockowitz and fellow Webform maintainers

Feb 26 2019

With the announcement of Drupal 9, we want to talk about how this affects our customers, what to expect when new versions come out, and what we do at Amazee Labs to ensure the transition will be painless.

“The big deal about Drupal 9 is … that it should not be a big deal.”

- Dries Buytaert, Drupal Founder
 

Background

The changes to Drupal between versions 7 and 8 were, quite frankly, enormous. Drupal previously had a justified reputation for doing its own thing and ignoring burgeoning standards and practices utilised elsewhere in the PHP community. So, when Drupal 8 was announced, one of the main goals of the release was to get off the Drupal island and start to utilise some of the millions of lines of open source code and documentation available elsewhere.

There were many great sides to this upgrade. The code was being built on a more solid and tested foundation, principally being based on the Symfony framework and leveraging numerous other systems and libraries. This helped Drupal become more enterprise focussed whilst opening the development field to engineers of other systems who were already familiar with the standards and practices now utilised in Drupal.

Unfortunately, the major technical upgrade to Drupal also introduced some headaches. Migrating between Drupal 7 and Drupal 8 can be time consuming and expensive. As a result of this, businesses who undertook such a migration can be forgiven for worrying about Drupal 9 being released just 5 years after Drupal 8. Some clients have expressed concern about using Drupal 8 when another expensive upgrade seems to be just around the corner.

Why Drupal 9 is different

In short, if you keep your Drupal 8 website up to date, there will be no major upgrade worries. The core maintainers want to make Drupal upgrades easy forever from now on, and the Drupal team has a plan to ensure that moving to Drupal 9 will essentially be a minor upgrade. This is possible because Drupal 9 will be built in the same manner as Drupal 8, with the same practices and core libraries. Unlike the move from Drupal 7 to Drupal 8, there will be no major architectural or structural changes to the codebase. 

The main changes, other than bug fixes, improvements and new features will be the upgrades to Drupal’s core libraries. For example, Symfony 3 (the library upon which Drupal is built) comes to its end-of-life in 2021, so it makes sense to have Drupal 9 running on Symfony 4 at that point.

End of support flow chart

How is this easy upgrade achievable? The Drupal team will continue its six-month release cycle until Drupal 9 is released. In these releases, old code will be deprecated and dependencies upgraded, bringing the codebase closer to the components and libraries Drupal 9 will use, so that when the time comes to upgrade, everything will be in place for an easy transition.

Maintenance is key 

Keeping up with new releases and updates ensures that your website stays relevant and secure, and also means that switching from Drupal 8 to 9 will be much more routine. By partnering with us even after your website is created, we can take proactive steps such as making sure there’s no deprecated code in your site before the newest release.

Feb 26 2019

Kind of stuck here? On one hand, you have all these software development technologies that are gaining momentum these days — APIs, serverless computing, microservices — while on the other hand, you have a bulky "wishlist" of functionalities and expectations for your future CMS. So, which types of content management systems are, and will remain, relevant for many years to come, and cover all your feature requirements?

And your list of expectations for this "ideal" enterprise-ready content infrastructure sure isn't a short one:
 

  • enabling you to build content-centric apps quickly and easily
  • multilingual support
  • user role management
  • a whole ecosystem of plugins
  • inline content editing
  • being both user and developer friendly
  • personalization based on visitors' search history
  • support for business agility
  • on-site search functions
     

... and so on.

Now, we've done our research.

We've weighed their pros and cons and their loads of pre-built features and plugin ecosystems, set them against their “rival” technologies, and selected the 3 content management systems worth your attention in 2019:
 

But What Is a Content Management System (CMS)? A Brief Overview

To put it simply:

Everything that goes into your website's content —  from text to graphics — gets stored in a single system. This way, you get to manage your content — both written and graphical — from a single source.

With no need for you to write code or to create new pages. Convenience at its best.
 

1. Traditional CMS, One of the Popular Types of Content Management Systems

Take it as a... monolith. One containing and connecting the front-end and back-end of your website: both the database needed for your content and your website's presentation layer.

Now, just turn back the hands of time and try to remember the before-the-CMS “era”. Back then, you would update your HTML pages manually, then upload them to the website via FTP, and so on...

Those were the “dark ages” of web development for any developer...

By comparison, the very reason why content management systems — like Drupal, WordPress, Joomla — have grown so popular so quickly is precisely this empowerment that they've “tempted” us with:

To have both the CMS and the website's design in one place; easy to manage, quick to update.
 

Main benefits:
 

  • your whole website database and front-end are served from a single storage system
  • they provide you with whole collections of themes and templates to craft your own presentation layer
  • all your content is quick and easy to manage
  • there are large, active communities backing you up
     

Main drawbacks:
 

  • they call for developers with hands-on experience working with that specific CMS
  • except for Drupal, with its huge ecosystem of modules, content management systems don't scale well
  • they require more resources — both time and budget — for further maintenance and enhancement
     

A traditional CMS solution would fit:
 

  • a small business' website
  • a website that you build... for yourself
  • an enterprise-level website
     

… if and only if you do not need it to share content with other digital devices and platforms.

You get to set up your website and have it running in no time, then manage every aspect of it from a single storage system.

Note: although more often than not a traditional CMS is used to power a single website, many of these content infrastructures come with their own plugins that fit into multi-site scenarios or API access for sharing content with external apps.
 

2. Headless CMS (or API-First Pattern)

The headless CMS “movement” has empowered non-developers to create and edit content without having to get tangled up in the build's complexities, as well. Or worrying about the content presentation layer: how it's going to get displayed and what external system will be “consuming” it.

A brief definition would be:

A headless CMS has no presentation layer. It deals exclusively with content, which it serves, via APIs, to external clients.

And it's those clients that are fully responsible for the presentation layer.
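To make this division of labour concrete, here is a minimal sketch in Python. The payload shape is hypothetical, not any specific CMS's API format: the headless "body" only hands over structured content, and the client alone decides how to present it.

```python
# Hypothetical JSON payload, as a headless CMS might serve it over its API.
# The field names ("title", "body") are illustrative, not a real CMS schema.
article = {
    "title": "Types of CMS in 2019",
    "body": "A headless CMS serves content only.",
}

def render_html(content: dict) -> str:
    """One possible 'head': a web client that turns raw content into HTML.
    A mobile app or an IoT device could consume the very same payload differently."""
    return "<article><h1>{title}</h1><p>{body}</p></article>".format(**content)

print(render_html(article))
```

The same `article` dict could just as well feed a native mobile screen or a voice interface; only `render_html` would change, never the content itself.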

Speaking of which, let me give you the most common examples of external clients consuming content via APIs:
 

  • single-page applications (SPA)
  • client-side UI frameworks, like Vue.js or React
  • a Drupal website, a native mobile app, an IoT device
  • static site generators like Gatsby, Jekyll or Hugo
     

A traditional CMS vs headless CMS comparison in a few words would be:

The first one's a “monolith” solution for both the front-end and the back-end, whereas the second one deals with content only.

When opting for a headless CMS, one of the increasingly popular types of content management systems, you create/edit your website content and... that's it. It has no impact on the content presentation layer whatsoever.

And this can only translate as “unmatched flexibility”:

You can have your content displayed in as many ways and “consumed” by as many devices as possible.
 

Main benefits:
 

  • front-end developers will get to focus on the presentation layer only and worry less about how the content gets created/managed
  • content is served, via APIs, to any device
  • as a publisher you get to focus on content only
  • it's front-end agnostic: you're free to use the framework/tools of choice for displaying it/serving it to the end-user
     

Main drawbacks:
 

  • no content preview 
  • you'd still need to develop your output: the CMS's “head”, the one “in charge” of displaying your content (whether it's a mobile app, a website, and so on)
  • additional upfront overhead: you'd need to integrate the front-end “head” with your CMS
     

In short: the headless CMS fits any scenario where you'd need to publish content on multiple platforms, all at once.
 

3. Static Site Generators (Or Static Builders)

Why are SSGs some of the most future-proof content management systems? 

Because they're the ideal intermediary between:
 

  • a modular CMS solution
  • a hand-coded HTML site
     

Now, if we are to briefly define it:

A static site generator will enable you to decouple the build phase of your website from its hosting via a JAMstack architectural pattern.

It takes in raw content and configuration (as Markdown files, JSON or YAML data structures) stored in a “posts” or “content” folder and, running it through the templates of an SSG engine (Hugo, Jekyll, Gatsby etc.), generates a static HTML website with no need for a CMS.

How? By transpiling content into JSON blobs for the front-end system to use; and that front-end system can be any modern front-end workflow.

And that's the beauty and the main reason why static site generators are still, even after all these years, one of the most commonly used types of content management systems:

They easily integrate with React, for instance, and enable you to work with modern front-end development paradigms such as componentization and code splitting. 

They might be called “static”, yet since they're designed to integrate seamlessly with various front-end systems, they turn out to be surprisingly flexible and customizable.
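The build step described above can be sketched in a few lines of Python. This is a toy stand-in for Hugo, Jekyll or Gatsby, using an in-memory dict instead of a real “content” folder, purely to show the shape of the transformation:

```python
from string import Template

# Toy stand-in for a "content" folder: entry name -> raw content.
# Real SSGs read Markdown/YAML from disk; plain dicts keep the sketch self-contained.
content = {
    "about": {"title": "About", "body": "We build static sites."},
    "contact": {"title": "Contact", "body": "Write to us."},
}

PAGE = Template("<html><head><title>$title</title></head>"
                "<body><h1>$title</h1><p>$body</p></body></html>")

def build(entries: dict) -> dict:
    """The 'build phase': turn every content entry into a static HTML page.
    The result maps output filenames to ready-to-host HTML strings."""
    return {name + ".html": PAGE.substitute(entry) for name, entry in entries.items()}

site = build(content)
print(sorted(site))  # ['about.html', 'contact.html']
```

Everything dynamic happens at build time; what gets deployed is just the resulting static files, which is exactly why a CDN can serve them so cheaply.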
 

Main benefits:
 

  • they're not tied to a specific theme or database, so they can be easily adapted to a project's needs
  • JAMstack sites generally rely on a content delivery network for serving requests, which eases performance, scaling and security concerns 
  • content and templates get version controlled right out of the box (as opposed to CMS-powered workflows)
  • since it uses templates, an SSG-based website is a modular one
     

And, in addition to their current strengths, SSGs seem to be securing their position among the most popular types of content management systems of the future with their 2 emerging new features:
 

  • the improvement of their interface for non-developers (joining the “empower the non-technical user” movement that the headless CMS has embraced); a user-friendly GUI is sure to future-proof their popularity
  • the integrated serverless functions; by connecting your JAMstack website with third-party services and APIs, you get to go beyond its static limitation and turbocharge it with dynamic functionality 
     

To sum up: since they enable you to get your website up and running in no time and to integrate it easily with modern front-end frameworks like Vue and React, static site generators have earned their place among the content management systems of the future.

The END!

What do you think now? Which one of these CMS solutions manages to check off most of the feature and functionality requirements on your wishlist?

Feb 26 2019
Feb 26

This post was created jointly by Michael Hess of the Security Working Group, and Tim Lehnen, Executive Director of the Drupal Association.

Last year, with the security release of SA-CORE-2018-002, the most significant security vulnerability since 2014, we heard the pain of site owners and development teams around the world staying up at all hours waiting for the complex security release process to complete and the patch to drop. We heard the pain of agencies and end-user organizations required to put teams on late shifts and overtime. We heard from some users who simply couldn't respond to patch their sites on the day of release, because of lack of resources or entrenched change management policies.

We've heard calls from the community for rotating the timezones for security advisories from release to release, or for having more on-call support from security team members across the globe, or simply for a longer horizon between the release of PSA and SA.

Yet at the same time, we're cognizant that these solutions would put increased burden on a security team composed of dedicated volunteers and contributors. There are a number of generous organizations who sponsor many of the members of the security team, but relying on their altruism alone is not a sustainable long-term solution—especially if we consider expanding the role of the security team to address the larger pain points above.

Last week, with the release of SA-CORE-2019-003, we heard these concerns for site owners and the sustainability of the security team echoed again.

The Security Team and the Drupal Association have been developing solutions for this issue for well over a year.

The goals are simple:

  • Provide a new service to the Drupal community, from small site owners to enterprise-scale end users, to protect their sites in the gap from security release to the time it takes them to patch.
  • Create a new model for sustainability for the Security Team, generating funding that 1) covers the operating costs of the program 2) can support security team operations and 3) can support additional Drupal Association programs.

Although the execution will take care and careful partnership, we are happy to announce that we've found a solution.

We're tentatively calling this: Drupal Steward. It is a service to be provided by the Drupal Association, the Security team, and carefully vetted hosting partners.

Drupal Steward will offer sites a form of mitigation through the implementation of web application firewall rules to prevent mass exploitation of some highly critical vulnerabilities (not all highly critical vulnerabilities can be protected in this fashion, but a good many can be - this method would have worked for SA-CORE-2018-002 for example).

It will come in three versions:

  • Community version - for small sites, low-budget organizations, and non-profits, we will offer a community tier, sold directly by the DA. This will be effectively at cost.
  • Self-hosted version - for sites that are too large for the community tier but not hosted by our vendor partners.
  • Partner version - For sites that are hosted on vetted Drupal platform providers, who have demonstrated a commitment of contribution to the project in general and the security team in particular, protection will be available directly through these partners.

Next Steps

The Drupal Association and Security Team are excited to bring this opportunity to the Drupal Community.

We believe that the program outlined above will make this additional peace of mind accessible to the broadest base of our community possible, given the inherent costs, and are hopeful that success will only continue to strengthen Drupal's reputation both for one of the most robust security teams in open source, and for innovating to find new ways to fund the efforts of open source contributors.

We will announce more details of the program over the coming weeks and months as we get it up and running.

If you are a hosting company and are interested in providing this service to your customers, please reach out to us at [email protected].

Please also join us at DrupalCon for any questions about this program.

If you are a site owner and have questions you can join us in slack #drupalsteward.

For press inquiries, please contact us at: [email protected]

Feb 26 2019
Feb 26

Recently, a client of ours shared with us their frustration that their website’s internal site search results didn’t display the most relevant items when searching for certain keywords. They had done their homework and provided us with a list of over 150 keywords and the expected corresponding search result performance. Their site was built with Drupal 7 and relied on Search API & Search API Solr for content indexing and content display.
 

Language Processing

Since the website was in Dutch, we started by applying our usual best practices for word matching. No instance of Apache Solr search will work properly if the word matching and language understanding haven't been appropriately configured (see also this blog post).

For this particular website, many of the best practices for Apache Solr Search hadn’t been followed. For example, you always need to make sure to store the rendered output of your content and additional data (such as meta tags) in one single field in the index. This way, it’s a lot easier to search all the relevant data. It will make your optimizations easier further down the road and it will render your queries a lot smaller & faster. Also make sure to filter out any remaining HTML code, as it does not belong in the index. Last but not least, make sure to index this data using as little markup as possible, and get rid of the field labels. You can do this by assigning a specific view mode to each content type. If you’re having trouble with these basics, just get in touch with us so we can give you a hand.
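Those two basics, aggregating everything relevant into one field and stripping leftover markup, can be sketched as follows. In a real Drupal/Search API setup this happens through field and view-mode configuration rather than custom code; the Python below only illustrates the transformation:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects only text nodes, dropping all HTML tags."""
    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        self.parts.append(data)

def strip_html(markup: str) -> str:
    """Remove markup and collapse whitespace; HTML doesn't belong in the index."""
    parser = TextExtractor()
    parser.feed(markup)
    return " ".join(" ".join(parser.parts).split())

def aggregate_for_index(rendered_body: str, extras: list) -> str:
    """Build the one aggregated field: rendered output plus additional data
    (such as meta tags), with labels and markup already removed."""
    pieces = [strip_html(rendered_body)] + [strip_html(x) for x in extras]
    return " ".join(p for p in pieces if p)

print(aggregate_for_index("<h2>Title</h2><p>Body text.</p>", ["<em>keywords here</em>"]))
# Title Body text. keywords here
```

The result is one compact text blob per node, which keeps queries small and makes later boosting experiments much easier to reason about.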
 

Boosting

Once this has been fixed, you can use the boosting functionality in Apache Solr to express hypotheses about how much more important a certain factor should be in the overall ranking of results. The client website that I mentioned earlier, for example, had some custom code written into it to boost content based on the content type at indexing time.

Apache Solr works with relevance scores to determine where a result should be positioned in relation to all of the other results. Let’s dig into this a bit deeper.

http://localhost:8983/solr/drupal7/select?
q=ereloonsupplement& // Our term that we are searching for
defType=edismax& // The eDisMax query parser is designed to process simple phrases (without complex syntax) entered by users and to search for individual terms across several fields using different weighting (boosts) based on the significance of each field while supporting the full Lucene syntax.
qf=tm_search_api_aggregation_1^1.0& // Search on an aggregated Search Api field containing all content of a node.
qf=tm_node$title^4.0& // Search on a node title
qf=tm_taxonomy_term$name^4.0& // Search on a term title
fl=id,score& // Return the id & score field
fq=index_id:drupal7_index_terms& // Exclude documents not containing this value
fq=hash:e3lda9&  // Exclude documents not containing this value
rows=10& // Return top 10 items
wt=json& // Return it in JSON
debugQuery=true // Show me debugging information

When looking at the debugging information, we can see the following:

"e3lda9-drupal7_index_terms-node/10485": "\n13.695355 = max of:\n  13.695355 = weight(tm_search_api_aggregation_1:ereloonsupplement in 469) [SchemaSimilarity], result of:\n    13.695355 = score(doc=469,freq=6.0 = termFreq=6.0\n), product of:\n      6.926904 = idf, computed as log(1 + (docCount - docFreq + 0.5) / (docFreq + 0.5)) from:\n        10.0 = docFreq\n        10702.0 = docCount\n      1.9771249 = tfNorm, computed as (freq * (k1 + 1)) / (freq + k1 * (1 - b + b * fieldLength / avgFieldLength)) from:\n        6.0 = termFreq=6.0\n        1.2 = parameter k1\n        0.75 = parameter b\n        248.69698 = avgFieldLength\n        104.0 = fieldLength\n",


This information shows that item scores are calculated based on the boosting of the fields that our system has had to search through. We can further refine it by adding other boost queries such as:

bq={!func}recip(rord(ds_changed),1,1000,1000)


This kind of boosting has been around in the Drupal codebase for a very long time and it even dates back to Drupal 6. It is possible to boost documents based on the date they were last updated, so that more recent documents will end up with higher scores.
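In Solr's function-query syntax, recip(x, m, a, b) computes a / (m*x + b), and rord(ds_changed) is the document's reverse ordinal by changed date (0 for the most recently updated document). A quick sketch of the resulting boost curve, for illustration only:

```python
def recip(x: float, m: float = 1, a: float = 1000, b: float = 1000) -> float:
    """Solr's recip(x, m, a, b) = a / (m*x + b)."""
    return a / (m * x + b)

# x is the reverse ordinal of the 'changed' date: 0 = most recently updated doc.
for rank in (0, 100, 1000, 10000):
    print(rank, round(recip(rank), 3))
# 0 -> 1.0, 100 -> 0.909, 1000 -> 0.5, 10000 -> 0.091: fresher docs get a bigger boost
```

The curve decays smoothly rather than cutting off, so older documents are demoted but never excluded.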
 

Solved?

Sounds great, right? Not quite for the customer, though!

As the client usually goes back and forth with the development company for small tweaks, every change you make as a developer to the search boosting requires a full check on all the other search terms. This needs to be done to make sure the boosting you are introducing doesn’t impact other search terms. It’s a constant battle - and it’s a frustrating one. Even more so because in the above scenario the result that was displayed at the top wasn’t the one that the client wanted to show up in the first place. In the screenshot you can see that for this particular query, the most relevant result according to the customer is only ranked as number 7. We’ve had earlier instances where the desired result wouldn’t even be in the top 50!

To tackle this, we use an in-house application that allows end users to indicate which search results are ‘Not Relevant’,  ‘Somewhat Relevant’, ‘Relevant’ or ‘Highly Relevant’, respectively. The application sends direct queries to Solr and allows the client to select certain documents that are relevant for the query. Dropsolid adjusted this so that it can properly work with Drupal Search API Solr indexes for Drupal 7 and 8.

We’ve used the application from the screenshot to fill in all the preferred search results when it comes down to search terms. In the background, it translates this to a JSON document that lists the document IDs per keyword and their customer relevance score from 0 to 3, with 3 being the most relevant.

This is a very important step in any optimization process, as it defines a baseline for the tests that we are going to perform.
 

Search results

Footnote: This image is a separate application in Python based on a talk by Sambhav Kothari from Bloomberg Tech at FOSDEM.
 

Learning To Rank

We quote from the Apache Solr website: “In information retrieval systems, Learning to Rank is used to re-rank the top N retrieved documents using trained machine learning models. The hope is that such sophisticated models can make more nuanced ranking decisions than standard ranking functions like TF-IDF or BM25.”

Ranker performance

Using the baseline that we’ve set, we can now calculate how well our original boosting impacts the search results. What we can calculate, as shown in the screenshot above, is the F-Score, the recall & precision of the end result when using our standard scoring model.

Precision is the ratio of correctly predicted positive observations to the total predicted positive observations.
The question that precision answers is the following: of all results labeled as relevant, how many actually surfaced to the top? High precision relates to a low false-positive rate. The higher, the better, with a maximum of 1.
Src: https://blog.exsilio.com/all/accuracy-precision-recall-f1-score-interpretation-of-performance-measures/

Recall is the ratio of correctly predicted positive observations to all observations in the actual class.

The question recall answers is: of all the documents that were labeled as relevant, how many actually came back in the results? The higher, the better, with a maximum of 1.
Src: https://blog.exsilio.com/all/accuracy-precision-recall-f1-score-interpretation-of-performance-measures/

If we just look at the top-5 documents, the combined F-Score is 0.35, with a precision of 0.28 and a recall of 0.61. This is quite bad, as only about 60% of our relevant documents appear in the top 5 of all search queries. The precision tells us that of the top-5 documents, only about 30% have been selected as relevant.
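The arithmetic behind those three numbers is straightforward. Here is how top-k precision, recall and the F-score can be computed per query from a labelled baseline (a generic sketch with made-up judgments, not the client's actual data):

```python
def precision_recall_f1(retrieved: list, relevant: set, k: int = 5):
    """Precision, recall and F1 over the top-k retrieved document ids."""
    top_k = retrieved[:k]
    hits = sum(1 for doc_id in top_k if doc_id in relevant)
    precision = hits / len(top_k)          # share of top-k that is relevant
    recall = hits / len(relevant)          # share of relevant docs that made the top-k
    f1 = 0.0 if hits == 0 else 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Made-up example: docs 1, 2 and 3 were labelled relevant for this query.
p, r, f = precision_recall_f1(retrieved=[1, 4, 2, 5, 6], relevant={1, 2, 3})
print(round(p, 2), round(r, 2), round(f, 2))  # 0.4 0.67 0.5
```

Averaging these per-query values over all 150+ keywords gives the combined scores quoted above.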
 

Training our model

Before all of this can work, we have to let Solr know what possible features exist, so that it can learn the importance of each feature from the feedback. An example of such a feature could be the freshness of a node (based on the changed timestamp in Drupal), or it could just as well be the score of the query against a specific field or data such as meta tags. For reference, I’ve listed them both below:

{  
   "name":"freshnessNodeChanged",
   "class":"org.apache.solr.ltr.feature.SolrFeature",
   "params":{  
      "q":"{!func}recip( ms(NOW,ds_node$changed), 3.16e-11, 1, 1)"
   },
   "store":"_DEFAULT_"
},
{  
   "name":"metatagScore",
   "class":"org.apache.solr.ltr.feature.SolrFeature",
   "params":{  
      "q":"{!edismax qf=tm_metatag_description qf=tm_metatag_keywords qf=tm_metatag_title}${query}"
   },
   "store":"_DEFAULT_"
}

Using the RankLib library (https://sourceforge.net/p/lemur/wiki/RankLib/), we can train our model and import it into Apache Solr. There are a couple of different models that you can pick to train - for example Linear or Lambdamart - and you can further refine the model to include the number of trees and metrics to optimize for.
You can find more details at https://lucene.apache.org/solr/guide/7_4/learning-to-rank.html
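RankLib consumes training data in the LETOR (SVMLight-style) text format: one line per query-document pair, starting with the relevance label (our 0-3 judgments) and a query id, followed by numbered feature values. A small sketch of how the JSON judgments plus extracted feature values could be flattened into that format (the document id and feature values here are made up):

```python
def to_letor_line(relevance: int, qid: int, features: list, doc_id: str) -> str:
    """One RankLib/LETOR training line: '<label> qid:<q> 1:<f1> 2:<f2> ... # <doc>'."""
    feats = " ".join(f"{i}:{value}" for i, value in enumerate(features, start=1))
    return f"{relevance} qid:{qid} {feats} # {doc_id}"

# Made-up judgment: doc node/10485 is highly relevant (3) for query 1,
# with two feature values (e.g. freshnessNodeChanged, metatagScore).
print(to_letor_line(3, 1, [0.82, 1.4], "node/10485"))
# 3 qid:1 1:0.82 2:1.4 # node/10485
```

One such file per fold is what RankLib's trainer reads when fitting a Linear or LambdaMART model.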
 

Applying our model

We can, using the rq parameter, apply our newly trained model and re-rank the first 100 results according to the model.

rq={!ltr model=<model-name>-2019-02-11-12:24 reRankDocs=100}

If we look at the actual result, it shows us that the search results that we’ve marked as relevant are suddenly surfacing to the top. Our model assessed each property that we defined and it learned from the feedback! Hurray!
 

Search results 2


We can also compare all the different models. If we just look at the top-5 documents, the combined F-Score of our best performing model is 0.47 (vs 0.35), with a precision of 0.36 (vs 0.28) and a recall of 0.89 (vs 0.61). This is a lot better, as almost 90% of our relevant documents appear in the top 5 of all search queries. The precision tells us that of the top-5 documents, 36% have been selected as relevant. This is a bit skewed, though, as for some queries we only have one highlighted result.

So, to fully compare, I did the calculations for the first result. With our original model we only see 46% of our desired results pop up as the first result. With our best-performing model, we improve this score to 79%!

Obviously we still have some work to do to turn 79% into 100%, but I would like to stress that this result was achieved without changing a single thing in the client’s content. The remaining few cases involve keywords that are missing from the meta tags, or Dutch words that somehow are not processed correctly in Solr.
 

Solr


Of course we wouldn’t be Dropsolid if we hadn’t integrated this back into Drupal! Are you looking to give back to the community and needing a hand to optimize your search? Give us the opportunity to contribute this back for you - you won’t regret it.
 

Drupal Solr

In brief

We compiled a learning dataset, we trained our model and uploaded the result to Apache Solr. Next, we used this model during our queries to re-rank the first 100 results based on the trained model. It is still important to have a good data model, which means getting all the basics of Search covered first.

Need help with optimizing your Drupal 7 or Drupal 8 site search?

Contact us

Feb 26 2019
Feb 26

Turmoil in the Drupal community?

Considering the fact that there are around 800,000 websites currently running on Drupal 7, upgrading to the latest installment will be a huge resource drain. Not only will it be an expensive feat to achieve, but also a time-demanding one. Quite frankly, there is not enough time to upgrade all of the Drupal 7 websites, and also not enough Drupal developers to take on the workload. So, what do you think: is it feasible for so many websites to upgrade to Drupal 9 in such a short period of time?

Drupal 9 will be released in the summer of 2020

Drupal 8 was released on November 19, 2015, which makes it over 3 years old already. Its successor, Drupal 9, is making its way towards a release. Drupal 9 is scheduled to be released on June 3, 2020. But what does this mean for Drupal 7 and 8?

For starters, Drupal 7 and 8 will stop receiving support in 2021. This is mainly because Symfony 3, one of the biggest dependencies of Drupal 8, will stop receiving support. Drupal 7 will no longer be backed by the official community and by the Drupal Association on Drupal.org. What this means is that the automated testing services for Drupal 7 will be shut down. On top of that, the Drupal Security Team will stop providing security patches. If you are not able to upgrade to Drupal 9, there will still be some organisations that provide Drupal 7 Vendor Extended Support, which will be a paid service. However, despite this, there will be approximately a year's worth of time to plan for and upgrade to the latest installment of Drupal.

Overview of the consequences for Drupal 7

What this means for your Drupal 7 sites is, as of November 2021:

  • Drupal 7 will no longer be supported by the community at large. The community at large will no longer create new projects, fix bugs in existing projects, write documentation, etc. around Drupal 7.
  • There will be no more core commits to Drupal 7.
  • The Drupal Security Team will no longer provide support or Security Advisories for Drupal 7 core or contributed modules, themes, or other projects. Reports about Drupal 7 vulnerabilities might become public, creating 0-day exploits.
  • All Drupal 7 releases on all project pages will be flagged as not supported. Maintainers can change that flag if they desire to.
  • On Drupal 7 sites with the update status module, Drupal Core will show up as unsupported.
  • After November 2021, using Drupal 7 may be flagged as insecure in 3rd party scans as it no longer gets support.
  • Best practice is not to use unsupported software, so it would not be advisable to continue building new Drupal 7 sites.
  • Now is the time to start planning your migration to Drupal 8.

Source: https://www.drupal.org/psa-2019-02-25

Drupal promises a smooth upgrade to Drupal 9

The good news is that the change from Drupal 8 to Drupal 9 will not be as abrupt as the change from Drupal 7 to 8 was. This is because Drupal 9 will be based on Drupal 8; in fact, the first release of Drupal 9 will be similar to the last release of Drupal 8. In short, some new features will be added, the deprecated code will be removed and the dependencies will be updated; however, the Drupal experience will not be reinvented. Now, in order to have a really smooth upgrade, the only thing necessary is to keep your Drupal 8 site updated at all times. This will ensure that your upgrade is as fluid as possible, without many inconveniences.

What is the best course of action to follow when upgrading to Drupal 9?

Well, at first, you have a couple of options at your disposal:

  1. You either wait for Drupal 9 to be launched and then make the change from 7 directly to 9.
  2. You first make the change from 7 to Drupal 8, which is going to be an abrupt change anyway, and then you prepare for the change to Drupal 9.
  3. Your final option would be to find a new CMS altogether, which would be the most resource-hungry option of all. 

So, considering the choices you have at hand, the best of the bunch would be to start preparing for the upgrade to Drupal 8 and then, when the time comes, to Drupal 9. By doing this, you will have enough time to plan your upgrade roadmap ahead, without having to compromise on the quality of the upgrade by rushing it. On top of that, this is an opportunity to review how good your website is at attracting leads and converting those leads into sales. Moreover, you can check whether your website is still in line with your enterprise vision and mission statement; if not, then here is an opportunity to make your site reflect the aforementioned aspects of your business.

Even though change might look scary to some of you, this is also an opportunity to evaluate and improve your digital presence. So make use of this chance to its fullest to create and provide a better online environment for your potential and current customers.

Feb 26 2019
Feb 26

This is an addition to the Bibliography & Citation module (BibCite), which enables users to automatically prepopulate bibliographic reference data by entering a DOI (Digital Object Identifier).

Let us remind you what bibliography and citation are. A bibliography is a list of all of the sources you have used (whether referenced or not) in the process of researching your work.
A citation is a borrowing of a text fragment, formula, provision, illustration, table or other element from another author's work. A citation can also be a non-literal, translated or paraphrased reproduction of a fragment of text, or an analysis of the content of other publications within the text of the work.

The Bibliography & Citation - Crossref module’s purpose is to save time and make the process of adding information easy and convenient for admins. With this module, the website’s admins no longer have to look up and type in by themselves: 

  • the authors' names
  • the titles of the works
  • the names and locations of the companies that published your copies of the sources
  • the dates your copies were published
  • the page numbers of your sources in special fields.

Everything will be filled out automatically. The only thing admins need to know and enter is the DOI (Digital Object Identifier) of books, conference proceedings, articles, working papers or technical reports.

The DOI is a modern standard for identifying information on the Internet, used by all major international scientific organizations and publishing houses. 

A DOI is a unique string consisting of two parts: a prefix and a suffix. For example, in 10.1234/789, 10.1234 is the prefix and publisher id (10 is the directory indicator, 1234 is a string pointing to a publisher), and 789 is the suffix, an object identifier pointing to a specific object.
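Since the prefix and suffix are separated by the first slash, and real DOI prefixes always begin with the directory indicator 10, decomposing an identifier before querying Crossref is simple. A sketch, not the module's actual code:

```python
def split_doi(doi: str):
    """Split a DOI into (prefix, suffix) at the first slash.
    Suffixes may themselves contain slashes, so only the first one separates."""
    prefix, sep, suffix = doi.partition("/")
    if not sep or not prefix.startswith("10."):
        raise ValueError(f"not a valid DOI: {doi!r}")
    return prefix, suffix

print(split_doi("10.1234/789"))  # ('10.1234', '789')
```

A module can run this kind of check before making any network request, so obviously malformed identifiers are rejected up front.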

The full information about the BibCite module created by ADCI Solutions can be found in this case study.

You can download both the Bibliography & Citation - Crossref and the Bibliography & Citation modules on the drupal.org website.

Feb 25 2019
Feb 25

By Gio Chacón, Marketing | February 25, 2019


So you are looking for help... Either you have a product idea to develop from scratch or you need to augment your IT team; basically, you are looking for the best fit. This read will help you set the right expectations and show you how to evaluate your future business partner effectively, so you can gain a lifetime relationship, a technical team or the best short-term service.

Let us explore both perspectives, yours and that of the agencies you hire; we urge you to filter not by price but by expertise and a better overall experience. 

Here are our 4 steps to finding the best suitable software agency: 

1. What is expected from you?

Do you know your scope and the objectives of your project? 
This will help you know what to delegate.  Use this general product development list as an idea:

  • UI/UX design
  • Defining functional requirements
  • Technical requirements
  • Project management
  • Product management
  • Quality assurance
  • Production and test infrastructure creation
  • Support

Having a rough plan is better than having no plan. Why is it important? It will help you set a realistic timetable and budget when negotiating your contract. Your software idea may not be completely formed, or you may be creating something from zero, but still, try to set realistic milestones so your team will have clearly defined deliverables. It is good to have something to hold on to, even if you are not planning to go to market yet.

2. Do you know how to work together?

From day one, articulate what is expected from each other. This is why Agile development frameworks are widely used in the industry: they produce highly collaborative teams and promote communication. We urge each side to actively try to understand the other's business model and the way both measure quality.

Let's put it this way: when developers are motivated to demo and interact frequently with you, then you (the client) have the same motivation to provide feedback or specifications too.

Another important aspect would be to ask about their project/communication management tools, the reason being to identify how accessible they will be (platforms like Asana, Trello, Jira, etc.), so you can ask yourself: would these fit with the way we do things too?

3. How well do you know them?

Do some research: look for reviews about them, ask for recommendations and for their portfolio. Also, if they have created any apps, download them and try them. How about open-source projects? Check if they meet your expectations. For example, weKnow loves to give back to the Drupal Community; click to meet our main contributors.

4. How much can you spend on the developer? 

The most cost-effective service is way more profitable than the cheapest one. Many times, low-cost offers are filled with blind spots you may regret in the future, like a lack of documentation, poorly written code, technology that is difficult to work with or maintain, lack of experience, or even a low level of English.

And contract-wise, make sure to clarify what the price actually includes, and specify the type of contract you are negotiating, such as a Fixed Price Model vs a Time & Material Model project.

Pro Tip: Request details on their after-sale or maintenance service.

Consider a partner that can advise you, not only write code, giving you better ideas and features that connect with you from a technical perspective. You want to buy services from someone who will make your business succeed.

Feb 25 2019
Feb 25
Date: 2019-February-25
Vulnerability: Drupal 7 will reach end-of-life in November of 2021
Description:

Drupal 7 was first released in January 2011. In November 2021, after over a decade, Drupal 7 will reach end of life (EOL). (More information on why this date was chosen.) Official community support for version 7 will end, along with support provided by the Drupal Association on Drupal.org. This means that automated testing services for Drupal 7 will be shut down, and there will be no more updates provided by the Drupal Security Team.

When this occurs, Drupal 7 will be marked end-of-life in the update manager, which appears in the Drupal administrative interface. Updates, security fixes, and enhancements will no longer be provided by the community, but may be available on a limited basis from select commercial vendors.

If you have a site that is running on Drupal 7, now is the time to start planning the upgrade. Note that the transition from Drupal 8 to Drupal 9 will not be the significant effort that the transition from 7 to 8 was. In fact, the first release of Drupal 9 will be identical to the last release of Drupal 8, except with deprecated code removed and dependencies updated to newer versions. (See Plan for Drupal 9 for more information on Drupal 9.)

What this means for your Drupal 7 sites is, as of November 2021:

  • Drupal 7 will no longer be supported by the community at large. The community at large will no longer create new projects, fix bugs in existing projects, write documentation, etc. around Drupal 7.
  • There will be no more core commits to Drupal 7.
  • The Drupal Security Team will no longer provide support or Security Advisories for Drupal 7 core or contributed modules, themes, or other projects. Reports about Drupal 7 vulnerabilities might become public, creating zero-day exploits.
  • All Drupal 7 releases on all project pages will be flagged as not supported. Maintainers can change that flag if they desire to.
  • On Drupal 7 sites with the update status module, Drupal Core will show up as unsupported.
  • After November 2021, using Drupal 7 may be flagged as insecure in 3rd party scans as it no longer gets support.
  • Best practice is not to use unsupported software, so it would not be advisable to continue building new Drupal 7 sites.
  • Now is the time to start planning your migration to Drupal 8.

If, for any reason, you are unable to migrate to Drupal 8 or 9 by the time version 7 reaches end of life, there will be a select number of organizations that will provide Drupal 7 Vendor Extended Support (D7ES) for their paying clients. This program is the successor to the successful Drupal 6 LTS program. Like that program, it will be an additional paid service, fully operated by these organizations with some help from the Security Team.

The Drupal Association and Drupal Security Team will publish an announcement once we have selected the Drupal 7 Vendor Extended Support partners.

If you would like more information about the Drupal release cycle, consult the official documentation on Drupal.org. If you would like more information about the upcoming release of Drupal 9, join us at DrupalCon Seattle.

Information for organizations interested in providing commercial Drupal 7 Vendor Extended Support

Organizations interested in providing commercial Drupal 7 Vendor Extended Support to their customers and who have the technical knowledge to maintain Drupal 7 are invited to fill out the
application for the Drupal 7 Vendor Extended Support team. The application submission should explain why the vendor is a good fit for the program, and explain how they meet the requirements as outlined below.

Base requirements for this program include:

  • You must have experience in the public issue queue supporting Drupal 7 core or Drupal 7 Modules. You should be able to point to a history of such contribution. One way to measure this is issue credits, but there are other ways. You must continue this throughout your enrollment in the program. If you have other ways to show your experience, feel free to highlight them.
  • You must make a commitment to the Security Team, the Drupal Association, and your customers that you will remain active in this program for 3 years.
  • As a partner, you must contribute to at least 20% of all Drupal 7 Vendor Extended Support module patches and 80% of D7ES core patches in a given year. (Modules that have been moved into core in Drupal 8 count as part of core metrics in Drupal 7.)
  • Any organization involved in this program must have at least 1 member on the Drupal Security Team for at least 3 months prior to joining the program and while a member of the program. (See How to join the Drupal Security Team for information.) This person will need a positive evaluation of their contributions from the Security Working Group.
  • Payment of a Drupal 7 Vendor Extended Support annual fee for program participation is required (around $3000 a year). These fees will go to communication tools for the Drupal 7 Vendor Extended Support vendors and/or the greater community.
  • Payment of a $450 application fee is required.
  • Your company must provide paid support to Drupal 7 clients. This program is not for companies that don't provide services to external clients.
Application review process:

  1. We will confirm that each vendor meets the requirements outlined above and is a good fit for the program.
  2. If the Security Working Group does not think you are a good fit, we will explain why and decline your application. If you are rejected, you are able to reapply. Most rejections will be due to Organizations not having enough ongoing contribution to Drupal 7 and Organizations not having a Drupal Security Team member at their organization.
  3. The Drupal Association signs off on your participation in the program.
  4. If you are accepted, you will be added to the Drupal 7 Vendor Extended Support vendor mailing list.
  5. The Security Working Group will do a coordinated announcement with the vendors to promote the program.

If you have any questions you can email [email protected]

Feb 25 2019
Feb 25

We all learned in our biology classes that genes are made up of DNA, which gives the body instructions to grow, develop, and live. In other words, it is like a blueprint or a recipe that guides an individual to perform a particular task.

Just as DNA is essential to a human body, Drupal distributions are essential for building a social impact platform for your projects and websites.

Image of a DNA with binary numbers on its string placed on a black background


Social impact is, and must be, the primary goal and measure for every social initiative. Measuring social impact urges organizations not to focus only on economic or financial factors, but to assess their influence across the environmental and social dimensions.

How Does Building a Social Impact Platform With Drupal Distributions Benefit a Project?

A distribution packages a set of contributed and custom modules together with Drupal core to optimize Drupal for a specific use and industry. Drupal distributions have evolved from expensive lead generation tools into something that offers a service at a large scale. Some notable Drupal distributions include:

OpenSocial 

Open Social is a free Drupal distribution for building private social networks and an out-of-the-box solution for online communities. It is built on Drupal 8 and wraps in an array of possibilities by leveraging Drupal 8's features.

In the Drupal community, Open Social is positioned as the successor of Drupal Commons, a Drupal 7 distribution that provides an out-of-the-box community collaboration website.
 

Image of a flower where open social is written below it with purple, orange and red background

 

  • A case study on Pachamama  

Pachamama empowers the indigenous people of the Amazon rainforest to protect their lands and culture, and educates and inspires people everywhere to bring forth a thriving and sustainable world.

Drupal was chosen for its flexibility and customizable features. For example, Pachamama offers an on- and offline 'Awakening the Dreamer' course. In the course module, the user can walk through a step-by-step course program and finish with video, text, or a way to keep track of their progress. To make this possible within the Pachamama Alliance platform, a course module was integrated into the Open Social platform.
 

Image of a laptop where the homepage of Pachamama Alliance is placed


Lightning 

A distribution developed and maintained by Acquia. It provides a framework, or starting point, for Drupal 8 projects that require more advanced layouts.

Developers are provided with hundreds of embedded automated tests that allow them to implement continuous integration pipelines. Lightning controls major functionality, essentially granting a safe environment for innovating with custom code additions on top of it.

Image of the Drupal logo. With a blue background in the drop where a lightning sign is inside

 

  • A case study on Higgidy 

Higgidy is a thriving business offering incredibly high-quality food sold in supermarkets. Drupal 8 was chosen for this project based on numerous factors.

The potential for future upgrades to build commerce into the platform was also an appealing benefit, assuring that they don't end up with a fragmented tech stack divided across many platforms.

Being mobile-driven was a core concern of the platform selection, and Drupal 8 delivers a seamless content experience every time.

One of the most important decisions was to use Lightning. This gave a great head start for a site of this nature right out of the box, providing some very important components and ensuring the team could get going quickly. The site is essentially powered by Views coupled with some custom serialization.
 

Image of the homepage of Higgidy website


Opigno 

Opigno is an open-source e-learning platform based on Drupal. It allows the user to manage online training and efficiently guarantee that student, employee, and partner skills remain up to date.

Opigno LMS is intended for companies, corporations, and universities looking for an e-learning solution that is flexible and easily scalable.
 

Drupal Commerce 

Drupal Commerce is open-source eCommerce software that augments the content management system Drupal. It helps in managing eCommerce websites and applications of all sizes.
This distribution also helps enforce strict development standards, leveraging the greatest features of Drupal 7 and major modules like Views and Rules for maximum flexibility.
 

Image of a cart with a blue Drupal logo in it. Where the text is beside it written as Drupal Commerce

 

  • Drupal Commerce helping the community: A case study on Sosense

Sosense supports entrepreneurs who address some of the most challenging social and environmental problems. Drupal was selected for this project because it was one of the most relevant frameworks for building a scalable platform.

Sosense's demand was to rebrand and redevelop their first, custom-developed platform to improve technical scalability, usability, and interaction design. The project was simple yet appreciably challenging. On one side, it drew from our expertise in creating community and fundraising solutions. On the other side, Sosense was one of the first complex sites to apply Drupal 7, when many important modules were still in dev status.

Testing and debugging modules like Organic Groups, Drupal Commerce and i18n, required many unexpected hours of work. The agile project management approach allowed us to tackle some of the unexpected issues with frequent releases and constant client interaction. The project was delivered on time and to the full satisfaction of our client.
 

Screenshot of the homepage of Sosense website


OpenChurch 

This distribution is for churches and ministries: a flexible platform with common church features that helps streamline website development. Some of its features are:

  • Blog - Includes a list page and an archive page; the blog content type is very simple and does not use the core blog module.
  • Bulletin - Includes a block for downloading the latest bulletin, plus a list page and a content type.
  • Events - Includes an event content type, filterable by ministry, and a responsive calendar.
  • Gallery - Integrates with ministry content and is an easy way to manage galleries.
  • Giving - Includes a list display for featured charities.
  • Homepage Rotator - A nice way to feature content on the homepage in a slideshow, a very common feature on sites today.
  • Ministry - Represents a church's core ministries (Missions, Youth, etc.) and integrates with other content on the site.
  • Podcast - An out-of-the-box sermon podcast page. Also includes a block showing the most recent podcast. It is labeled 'Sermons' but can be used for any kind of podcast.
  • Social - Social integration with Twitter, Facebook, Google+ and more! Enables visitors to share content with their social networks.
  • Staff - Includes a staff page and integrates well with ministries.
  • Video - Add third-party video from YouTube and Vimeo.

Presto

Presto includes pre-configured, ready-to-use, out-of-the-box functionality. It consists of an article content type with some pre-configured fields and a basic page content type with a paragraphs-based body field. Some pre-configured paragraph types in this distribution are:

  • Textbox
  • Image
  • Promo bar
  • Divider
  • Carousel

Not only this, but it also includes a block type that allows embedding Drupal blocks. The distribution has an article listing page which displays a paginated listing of articles, sorted by publish date.

Conclusion  

The advantages of working with a Drupal distribution continue well after launch. Maintenance is also a breeze: when you create a website born out of a distribution, all modules and features are integrated and tested together. When updates are required, it is a single update, as opposed to hundreds. Thus, a Drupal distribution is what you need for your social impact platform.

At OpenSense Labs we make full use of the functionality that comes with Drupal distributions. Contact us at [email protected] for more information. Our services will guide you with all the instructions and information you require.

Feb 25 2019
Feb 25

Click, tap, like, hit, post, tweet, retweet, repost, share, tag, comment - I am sure you are familiar with all these terms, use them daily, and even promote your business with them.

We live in a world where the boundaries of work and office space are changing. A new era of transformation has opened up, where collaboration and communication within the company are turning into a "digital system".

The heart of all this meaningful connection and real-time communication (which is integral to modern business) is the social intranet.

 Image of an iPad where a hand touches the screen and different dialogue clouds are coming out of it with different pictures


And nothing beats the performance of Open Social, a Drupal distribution used for building social communities and intranets.

OpenSocial is bringing power and the essence of pervasive social capabilities to the web.

You ask how?

Well, Let’s find out

Understand OpenSocial 

Open Social is an out-of-the-box solution for online communities. It is used for creating social communities, intranets, portals, and any other social project. It comes with a collection of features and functionalities that are useful for constructing a social framework.

In the Drupal community, Open Social is positioned as the heir of Drupal Commons (a Drupal 7 distribution that provides an out-of-the-box community collaboration website).

 Image of a white flower in an orange and purple background where open social is written at the bottom of it


OpenSocial and its out-of-the-box feature 

  • Content types and structure

The user is offered two content types: events and topics. This architecture lets Open Social be lightweight software that can easily be installed and used seamlessly. Blogs, news, etc. are all the same content type as a topic but have separate taxonomies.

  • Media Management 

With the help of media management, the user can efficiently arrange, resize, and add images wherever they want on a website. The file system, image styles, and all other media configurations needed to add, resize, and adjust images are built in.

  • Responsive and Multi-Lingual Support 

Open Social follows Drupal 8's "mobile-first" philosophy and is responsive by default. Not only this, but it also includes a translation module used for multilingual support.

  • SEO Optimization

The SEO strategy is based on a consultative approach. Adverts, SEO, social media, and conversion optimization are used to generate traffic. The out-of-the-box features in Open Social help users optimize their websites so that more people visit them.

OpenSocial Foundation and W3C Social Web Activity

“Social standards are part of the application foundations for the Open Web Platform” 
-Jeff Jaffe 

In other words, they will be used everywhere, in diverse applications that operate on phones, cars, televisions, and e-readers. In terms of OpenSocial, the W3C standard is defined as:

  • The Social Web Working Group determines the technical standards and APIs that facilitate access to social functionality as part of the Open Web Platform.
  • The Social Interest Group coordinates messaging around social at the W3C, with a strategy that enables social business.

Open source project at Apache Foundation

In addition to the many commercial enterprise platforms that build on OpenSocial, the Apache Software Foundation hosts two active, ongoing projects that serve as reference implementations of OpenSocial technology:

Apache Shindig: the reference implementation of the OpenSocial API specifications, versions 1.0.x and 2.0.x. It is the standard set of social network APIs covering profiles, relationships, activities, etc.

Apache Rave: It is a lightweight and open-standards-based extensible platform for managing, combining and hosting OpenSocial and W3C Widget related features, technologies, and services. 

How is OpenSocial contributing to society?

The Developers 

Social platforms are interactive and use notifications to provide alerts. Building numerous pieces of social software to control the social experience takes a lot of time and effort. Building a distribution is the answer: it allows developers to build the best things once, then re-use, extend, and improve on them.

Site Owner and Business 

If you are using open-source SaaS offerings, you have the ability to access the site's code and data at any time. Social media changed modern society and communications, especially in our private lives. The decentralized nature of social software is a huge opportunity for organizations to reinvent the way they communicate and collaborate.

End Users 

End users obsess over user-centered design. Without engaged end users, no project would go anywhere. Thus, providing users with tools that are appealing and easy to use is a must for a great user experience.

Why choose OpenSocial over any other software?

Freedom for the clients: if they need to download their SaaS platform and run or extend it as they want, they can easily do so.

Getting to this point from scratch takes longer, and the core modules give you the functionality you need from the ground up.

The above points clearly make it better software, with the Drupal community putting extra eyes on the code, making suggestions for design and development improvements, and hopefully adding new features, word-of-mouth marketing, and possibly some clients.
It also provides easy customization options.

OpenSocial giving tough competition to other community software in the market

The pace of digitization is steadily increasing, leaving a lot of old processes behind in the dust. The same applies to traditional methods of innovation. The internet has not just become a hub to share knowledge, but also to create knowledge together through crowd innovation.

Some of the other community software in the market, like Lithium, is being beaten hard by OpenSocial.

How?

Let’s find out 

  • Who uses it? Lithium: businesses of all sizes looking to attract new visitors. OpenSocial: a better way of connecting with your members, volunteers, employees, and customers.
  • Free trial: not provided by Lithium; provided by OpenSocial.
  • Free version: not provided by Lithium; provided by OpenSocial.
  • Starting price: Lithium starts at $235.00/month; OpenSocial is free.
  • Entry-level setup: not provided by Lithium; provided by OpenSocial.

What does Drupal Community Gain From Open Social?

Without Drupal distributions, we won't be able to successfully compete with commercial vendors. Drupal distributions have great potential.
-Dries Buytaert

With the help of the Open Social distribution, the Drupal community has been provided with a platform for its social projects and a more sustainable, widely adopted way of development. OpenSocial is better with Drupal because:

  • Users can use Open Social for their own projects and clients.
  • They can give back to the open-source community.
  • If the user is a Drupal freelancer or professional, they can improve their Drupal.org standing.

Case Study on Youth4Peace 

The UN Security Council acknowledges the positive role played by young women and men in preserving international security. The task force for Youth, Peace, and Security proposed an updated and expanded Youth4Peace platform, in order to give interested parties and partners a path to consistent and timely information.

The UNDP was already familiar with the features and functionalities of Drupal as the previous site was built on the same. The organization supports open-source mainly because of the reusability feature of modules. 

Moreover, the Drupal 8 community distribution Open Social matched several goals of the project, such as innovation and the use of technology. The distribution already included most of the features needed for the project, including blogs, events, profiles, information streams, a discussion engine, and moderation tools for community managers.

Therefore, the Youth4Peace portal was developed. It was built using an Agile method and mainly focused on:

  • A curated Knowledge Resource Library
  • Moderated e-Discussions & e-Consultations
  • Experts’ Profiles
  • News & Events and their overviews with filters

By being able to produce content for non-community members, the community was able to reach a global audience at an even faster pace.
 

Image of 7 colorful hands in circle position where the text is written “progress study on youth, peace, and security


In The End 

Now we know that Open Social has the right blend of features needed to build a social community. The distribution proves to be an appropriate platform for starting to build a community or intranet, with an immense set of features.

OpenSense Labs understands how important it is for every organization to stay connected with the world. Therefore, we are here to help you leverage all those facilities and services. Ping us at [email protected] now.

Feb 25 2019
Feb 25

Earlier we wrote about stress testing featuring Blazemeter, where you could learn how to crash your site without worrying about the infrastructure. So why did I even bother to write this post about the do-it-yourself approach? We have a complex frontend app, where it would be nearly impossible to simulate all the network activity faithfully over a long period of time. We wanted to use a browser-based testing framework, namely WebdriverI/O with some custom Node.js packages, on Blazemeter, and it proved quicker to manage the infrastructure ourselves and have full control of the environment. What happened in the end? Using a public cloud provider (in our case, Linode), we programmatically launched the needed number of machines temporarily, provisioned them with the proper stack, and executed the WebdriverI/O test. With Ansible, the Linode CLI, and WebdriverIO, the whole process is repeatable and scalable. Let's see how!

Infrastructure phase

Any decent cloud provider has an interface to provision and manage cloud machines from code. Given this, if you need an arbitrary number of computers to launch the test, you can have them for 1-2 hours (100 endpoints for the price of a coffee, how does this sound?).
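To sanity-check that "price of a coffee" claim, here is a back-of-envelope sketch. The $5/month (500 cents) entry-level plan and the ~730 hours per month are assumptions for illustration, not quoted prices:

```shell
# Rough cost estimate for a short-lived fleet, assuming an entry-level
# instance billed hourly at an assumed 500 cents/month.
MACHINES=100
HOURS=2
HOURS_PER_MONTH=730
TOTAL_CENTS=$(( MACHINES * HOURS * 500 / HOURS_PER_MONTH ))
echo "~${TOTAL_CENTS} cents for ${MACHINES} machines over ${HOURS}h"
```

Integer division keeps it approximate, but the order of magnitude is the point: tearing the fleet down right after the test is what keeps the bill tiny.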

There are many options to dynamically and programmatically create virtual machines for the sake of stress testing. Ansible offers a dynamic inventory; however, the cloud provider of our choice wasn't included in the latest stable version of Ansible (2.7) at the time of this post. Also, the solution below makes the infrastructure phase independent: any kind of provisioning (pure shell scripts, for instance) is possible with minimal adaptation.

Let’s follow the steps in the guide on installing the Linode CLI. The key is to have the configuration file at ~/.linode-cli with the credentials and the machine defaults. Afterwards, you can create a machine with a one-liner:

linode-cli linodes create --image "linode/ubuntu18.04" --region eu-central --authorized_keys "$(cat ~/.ssh/id_rsa.pub)"  --root_pass "$(date +%s | sha256sum | base64 | head -c 32 ; echo)" --group "stress-test"

Given the specified public key, password-less login will be possible. However, this is far from enough before provisioning: booting takes time, the SSH server is not available immediately, and our special situation is that after the stress test we would like to drop the instances immediately, together with the test execution, to minimize costs.

Waiting for the machine to boot is a slightly longer snippet; the delimited text output is robustly parsable:

## Wait for boot, to be able to SSH in.
while linode-cli linodes list --group=stress-test --text --delimiter ";" --format 'status' --no-headers | grep -v running
do
  sleep 2
done

However, the SSH connection is likely not yet possible; let’s wait for the port to be open:

for IP in $(linode-cli linodes list --group=stress-test --text --delimiter ";" --format 'ipv4' --no-headers);
do
  while ! nc -z $IP 22 < /dev/null > /dev/null 2>&1; do
    sleep 1
  done
done

You may realize that this overlaps with the machine boot wait. The only benefit of separating the two is that it allows more sophisticated error handling and reporting.
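As a sketch of that more sophisticated error handling, the port wait can be wrapped in a helper with a timeout budget. The `wait_for_port` name and the 120-second default are assumptions, not part of the original script:

```shell
# Timeout-aware variant of the SSH port wait. Returns 1 (with an error
# message on stderr) instead of looping forever if a machine never
# opens the port within the budget.
wait_for_port() {
  local ip="$1" port="$2" budget="${3:-120}" waited=0
  while ! nc -w 2 -z "$ip" "$port" < /dev/null > /dev/null 2>&1; do
    sleep 1
    waited=$((waited + 1))
    if [ "$waited" -ge "$budget" ]; then
      echo "error: $ip:$port still closed after ${budget}s" >&2
      return 1
    fi
  done
  echo "$ip:$port became reachable after ${waited}s"
}
# Usage inside the loop over IPs, e.g.:
#   wait_for_port "$IP" 22 || exit 1
```

With a non-zero return code you can abort the whole run early instead of paying for machines that never came up.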

Afterwards, deleting all machines in our group is trivial:

for ID in $(linode-cli linodes list --group=stress-test --text --delimiter ";" --format 'id' --no-headers);
do
  linode-cli linodes delete "$ID"
done

So after packing everything into one script, and putting an Ansible invocation in the middle, we end up with stress-test.sh:

#!/bin/bash

LINODE_GROUP="stress-test"
NUMBER_OF_VISITORS="$1"

NUM_RE='^[0-9]+$'
if ! [[ $NUMBER_OF_VISITORS =~ $NUM_RE ]] ; then
  echo "error: Not a number: $NUMBER_OF_VISITORS" >&2; exit 1
fi

if (( $NUMBER_OF_VISITORS > 100 )); then
  echo "warning: Are you sure that you want to create $NUMBER_OF_VISITORS linodes?" >&2; exit 1
fi

echo "Reset the inventory file."
cat /dev/null > hosts

echo "Create the needed linodes, populate the inventory file."
for i in $(seq $NUMBER_OF_VISITORS);
do
  linode-cli linodes create --image "linode/ubuntu18.04" --region eu-central --authorized_keys "$(cat ~/.ssh/id_rsa.pub)" --root_pass "$(date +%s | sha256sum | base64 | head -c 32 ; echo)" --group "$LINODE_GROUP" --text --delimiter ";"
done

## Wait for boot.
while linode-cli linodes list --group="$LINODE_GROUP" --text --delimiter ";" --format 'status' --no-headers | grep -v running
do
  sleep 2
done

## Wait for the SSH port.
for IP in $(linode-cli linodes list --group="$LINODE_GROUP" --text --delimiter ";" --format 'ipv4' --no-headers);
do
  while ! nc -z $IP 22 < /dev/null > /dev/null 2>&1; do
    sleep 1
  done
  ### Collect the IP for the Ansible hosts file.
  echo "$IP" >> hosts
done
echo "The SSH servers became available"

echo "Execute the playbook"
ansible-playbook -e 'ansible_python_interpreter=/usr/bin/python3' -T 300 -i hosts main.yml

echo "Cleanup the created linodes."
for ID in $(linode-cli linodes list --group="$LINODE_GROUP" --text --delimiter ";" --format 'id' --no-headers);
do
  linode-cli linodes delete "$ID"
done

Provisioning phase

As written earlier, Ansible is just one option, albeit a popular one, for provisioning machines. For such a test, even a bunch of shell commands would be sufficient to set up the stack. However, after someone tastes working with infrastructure in a declarative way, it becomes the first choice.

If this is your first experience with Ansible, check out the official documentation. In a nutshell, we just declare in YAML how the machine(s) should look and what packages they should have.

In my opinion, a simple playbook like the one below is readable and understandable as-is, without any prior knowledge. So our main.yml is the following:

- name: WDIO-based stress test
  hosts: all
  remote_user: root

  tasks:
    - name: Update and upgrade apt packages
      become: true
      apt:
        upgrade: yes
        update_cache: yes
        cache_valid_time: 86400

    - name: WDIO and Chrome dependencies
      package:
        name: "{{ item }}"
        state: present
      with_items:
         - unzip
         - nodejs
         - npm
         - libxss1
         - libappindicator1
         - libindicator7
         - openjdk-8-jre

    - name: Download Chrome
      get_url:
        url: "https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb"
        dest: "/tmp/chrome.deb"

    - name: Install Chrome
      shell: "apt install -y /tmp/chrome.deb"

    - name: Get Chromedriver
      get_url:
        url: "https://chromedriver.storage.googleapis.com/73.0.3683.20/chromedriver_linux64.zip"
        dest: "/tmp/chromedriver.zip"

    - name: Extract Chromedriver
      unarchive:
        remote_src: yes
        src: "/tmp/chromedriver.zip"
        dest: "/tmp"

    - name: Start Chromedriver
      shell: "nohup /tmp/chromedriver &"

    - name: Sync the source code of the WDIO test
      copy:
        src: "wdio"
        dest: "/root/"

    - name: Install WDIO
      shell: "cd /root/wdio && npm install"

    - name: Start date
      debug:
        var: ansible_date_time.iso8601

    - name: Execute
      shell: 'cd /root/wdio && ./node_modules/.bin/wdio wdio.conf.js --spec specs/stream.js'

    - name: End date
      debug:
        var: ansible_date_time.iso8601

We install the dependencies for Chrome, Chrome itself, and WDIO, and then we can execute the test. For this simple case, that’s enough. As I referred to earlier:

ansible-playbook -e 'ansible_python_interpreter=/usr/bin/python3' -T 300 -i hosts main.yml

What’s the benefit over shell scripting? For this particular use case, mostly that Ansible makes sure everything can happen in parallel, and we get sufficient error handling and reporting.

Test phase

We love tests. Our starter kit has WebdriverIO tests (among many other types of tests), so we picked it to stress test the full stack. If you are familiar with JavaScript or Node.js, the test code will be easy to grasp:

const assert = require('assert');

describe('podcasts', () => {
    it('should be streamable', () => {
        browser.url('/');
        $('.contact .btn').click();

        browser.url('/team');
        const menu = $('.header.menu .fa-bars');
        menu.waitForDisplayed();
        menu.click();
        $('a=Jobs').click();
        menu.waitForDisplayed();
        menu.click();
        $('a=Podcast').click();
        $('#mep_0 .mejs__controls').waitForDisplayed();
        $('#mep_0 .mejs__play button').click();
        $('span=00:05').waitForDisplayed();
    });
});


This is our spec file, which is the essence of the test, alongside the configuration.

Could we do it with a bunch of requests in JMeter or Gatling? Almost. The icing on the cake is that we stress test the streaming of the podcast: we simulate a user who listens to the podcast for 10 seconds. For any frontend-heavy app, realistic stress testing requires a real browser, and WDIO provides us exactly that.

The WebdriverIO test execution - headless mode deactivated

Test execution phase

After making the shell script executable (chmod 750 stress-test.sh), we are able to execute the test with the same simplicity either:

  • with one visitor from one virtual machine: ./stress-test.sh 1
  • with 100 visitors, one from each of 100 virtual machines: ./stress-test.sh 100

However, for very large-scale tests, you should think about some bottlenecks, such as the capacity of the datacenter on the testing side. It might make sense to pick a datacenter at random for each testing machine.
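The article does not show the body of stress-test.sh itself; as a minimal sketch, its core could be a function that launches one playbook run per machine in parallel. The `run_stress` name and the `hosts-N` inventory naming below are hypothetical assumptions, not the actual script:

```shell
# Hypothetical core of stress-test.sh: launch one run per virtual
# machine, in parallel, then wait for all of them to finish.
run_stress() {
  local count="$1"; shift   # number of machines, then the command to run
  for i in $(seq 1 "$count"); do
    "$@" "$i" &             # e.g. run_one 1, run_one 2, ... in parallel
  done
  wait                      # block until every background run has finished
}

# run_one N would wrap the playbook call from the article, e.g.:
# ansible-playbook -e 'ansible_python_interpreter=/usr/bin/python3' \
#   -T 300 -i "hosts-$N" main.yml
```

Invoked as `run_stress "$1" run_one`, this mirrors the ./stress-test.sh 1 and ./stress-test.sh 100 calls above.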

The test execution consists of two main parts: bootstrapping the environment and executing the test itself. If bootstrapping takes up too large a share of the total time, one strategy is to prepare a Docker image, and instead of creating the environment again and again, just use the image. In that case, it’s a great idea to look for a container-specific hosting solution instead of standalone virtual machines.

Would you like to try it out now? Just do a git clone https://github.com/Gizra/diy-stress-test.git!

Result analysis

For such a distributed DIY test, analyzing the results can be challenging. For instance, how would you measure requests per second for a browser-based test like WebdriverIO?

In our case, the analysis happens on the other side. Almost all the hosting solutions we encounter support New Relic, which can help a lot with such an analysis. Our test was DIY, but the result handling was outsourced. The icing on the cake is that it helps track down bottlenecks too, so a similar solution can be applied to your hosting platform as well.

However, what if you’d like to gather the results together after such a distributed test execution?

Without going into detail, you may study Ansible’s fetch module, which lets you gather a result log from all the test servers and have it locally in one central place.
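As a sketch, such a fetch task could look like the following; both paths are illustrative assumptions, not taken from the article:

```yaml
# Hypothetical task: copy each test server's log back to the control node.
- name: Gather WDIO result logs
  fetch:
    src: "/root/wdio/results.log"              # path on the test machine (assumed)
    dest: "results/{{ inventory_hostname }}/"  # one local directory per host
    flat: no
```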

Conclusion

It was a great experience: after we faced some difficulties with a hosted stress test platform, we were able to recreate a solution from scratch without much additional development time. If your application also needs special, unusual tools for stress testing, you might consider this approach. All the chosen components, such as Linode, WebdriverIO or Ansible, are easily replaceable with your favorite solutions. Geographically distributed stress testing, fully realistic website visitors with heavy frontend logic, low-cost stress testing – it seems you’re covered!

Feb 24 2019
Feb 24

Education is just like planting a seed. A seed passes through different stages, each with a significant and important role. If any stage is missed, the plant’s entire life cycle is destroyed. 

It is not what is poured into a student that counts, but what is planted
-Linda Conway  

It is no secret that education has the power to change lives. For a successful career, every student needs to go through stages of learning: knowledge, confidence, and academic and technical skills, so that they can grow efficiently. A college education is one element that contributes greatly to these steps of learning. 

Therefore, to achieve these steps of knowledge, campus management software has been introduced. 

picture of a laptop which has a book shelf in it. A degree is lying at the corner of the keyboard. A globe is lying beside the laptop.


The OpenSource Campus Management solution is one such management software that has made life easy for students, teachers, authorities and others down the chain. Such a system has brought standardization and steadiness to the organization. 

But what exactly is OpenCampus?

OpenCampus is a technical model that contributes greatly to the outlook and the network of universities. It was developed as the first open adoption campus management solution in Drupal. 

OpenCampus is designed to cover the life cycle of students.

In Germany and Austria, more than 30 universities are using this software, and it contributes greatly to their needs and requirements. 

Image that says OpenCampus in blue text with a circle logo that is on the left side of the text

With the help of the OpenCampus software, you can manage everything, from courses to recording achievements; the application does it all and is considered among the most versatile of its kind. It allows the mapping of complex procedures, such as the allocation of students into smaller classes in medical or dentistry programs. 

The Framework

The framework of OpenCampus is based on the open source technology Drupal, and it lets customers create their own applications with smooth integration of third-party products such as Moodle. 

Image of a blue platform where the text is OpenCampus framework on which threads are placed with social media logos

Features provided by OpenCampus
 

Application and Admissions
  • Transparent and multi-stage application process
  • Dynamic list views
  • Automatic e-mail notifications
  • Smart forms

Course and Event Management
  • Parallel small groups
  • Automation of complex course sequences
  • Uploading of documents, term papers, and personal calendars

Exam Management and Validation
  • Exam questions
  • Written tests and online evaluation
  • Seating plans

Records of Achievements
  • Easy modifications following revision of the examination
  • Automatic generation of course certificates
  • Synopsis lists

Evaluation
  • Evaluation via app
  • Flexible configuration
  • Automatic evaluation reports

Mobile Apps and Platforms
  • Integration of students and faculty
  • Forums and online discussions
  • Attendance


Application and Admission

The process of applying and being admitted has been made really simple with the help of OpenCampus. The software presents applicants with a simple tool that uploads and manages all the necessary information in one single place. 

Course and Event Management 

The OpenCampus software is one of the most powerful and flexible of its kind. The module handles simple seminars with their location, lecturer and appointments, and automates complex courses. It also supports multilevel architectures with multi-language pages and direct budget control. 

Exam management

The software is an innovative web-based solution that grants users extensive functionality for creating a multi-level architecture for any exam. All aspects of exam preparation are managed seamlessly with OpenCampus, from an online mock test to the seating arrangement.

Records of Achievements 

OpenCampus performance management presents the students' study achievements in a clear view. Data from other modules, such as "OpenCampus Exams" and "OpenCampus Event Management", is also stored in this location. Easy modifications following the revision of an examination, automatic generation of course certificates and synopsis lists are some of the features offered by OpenCampus.

Evaluation

Continuous and seamless evaluation is the key to ensuring the quality of the teaching and offerings of a university. Users can evaluate standardized courses and receive qualitative as well as quantitative feedback on different areas of teaching. They benefit from simple options for creating questionnaires and reports, as course management is fully integrated into the system. 

Mobile Application 

The OpenCampus software has special support for room management: users can manage their bookings of event and laboratory rooms and their equipment. As the software is mobile responsive, this is even more efficient and handy. 

OpenCampus logo with two concentric circles on a blue background and 14 corresponding pictures around it


The reason why customers choose OpenCampus 

Higher education institutes carry various responsibilities and hold data that has to be managed accurately and in complete synchronization. Here OpenCampus bags all the trophies by providing the administration of both students and faculty. There are many reasons why universities choose OpenCampus. Some of them are:

  • Unique process mapping: OpenCampus is the only software that manages the complex processes of universities. 
  • A comprehensive feature set: the OpenCampus software offers extensive functionality and features to its customers. 
  • An open, adaptive system: additional modules can easily be added at any time on the open source platform.
  • Established and experienced: more than 25 universities with at least 3,000 students each use OpenCampus. 

OpenCampus as a Research Data Management System for Clinical Studies

Research institutes need to manage multiple studies with individual data sets, processing rules and different types of permissions. But there is no "official" or "standard" technology that provides an easy-to-use environment for constructing databases and user interfaces for clinical trials or research studies. Thus, many software solutions have been used, ranging from tools made explicitly for a specific study to cost-intensive commercial Clinical Trial Management Systems (CTMS).

With OpenCampus Research, an open adoption software (OAS) solution provided users with a standard environment for state-of-the-art research database management at a very low cost.

The architecture of the open adoption software (OAS) allows users, after a brief instruction, to develop their own web-based data management systems.

The implementation provided the following features:

  • Basic Architecture

OpenCampus is built from three basic types: forms, trees, and containers. Any type of research project or clinical trial can be mapped with this model, and all of them are fully configurable through the graphical user interface. 
 

Image of a flow chart stating user, processes, trees, containers, and nodes. Source: National Center for Biotechnology Information

 

  • Interoperability

Taxonomies allow the user to classify content with terms gathered within vocabularies. With the help of taxonomies, field contents can be stored not just as text, but also as references linked to predefined values.

  • Multicenter 

The approach of the OpenCampus software works really well here. A single study administrator assigns permissions to center coordinators. Center coordinators then independently distribute access and permissions to the data managers responsible for entering the data.

The multicenter concept can be extended with various additional features, such as node state levels or individual data processing guidelines, which ensure that certain quality management actions are executed during data processing.

  • Meta-analysis

One core element of the data storage approach in the OpenCampus OAS concept is that it allows nodes to be connected to each other. The link between these nodes is called an entity reference. With the help of entity references, the data from many studies can be combined (merged), enabling a meta-analysis to be executed just by creating a new output view.

  • Data Security  

There are two major solutions in terms of security: the customer can fill in an online form, or the information can be submitted on premises, preserving doctor-patient confidentiality.

Thus, with the help of the OpenCampus system, a steady environment was provided to the research center and the people working in it, along with database design and pattern design. 

Conclusion

OpenCampus is not just software for small clerical tasks; it goes beyond that by offering a three-way interactive platform for students, teachers, and parents. It not only saves time for the administrative staff and their pupils, but also allows fees to be paid online and keeps users alert to important information around the university. 

Opensense Labs believes that this contemporary system of education will bring a new level of excellence to the education sector. Ping us at [email protected] to know more about OpenSource campus management. The services provided by our organization will help you solve all your queries.

Feb 23 2019
Feb 23

The story of a historical character acquires a plethora of accretions over the centuries. We have numerous incidents and episodes from his or her life, but not the complete picture, so representing historical characters on stage can lead to a fractured narrative. There has to be synchronisation with the pre-recorded dialogue, and it should not distract the actors from emoting.

black and white image of people acting in a play


Synchronisation is also of great significance in the digital scene. Content publishing in Drupal is of utmost importance with the expanding possibilities of content creation itself, and it is even more crucial when content has to be synchronised between Drupal sites. Why is content synchronisation needed in Drupal?

Need for Content Synchronisation

Drupal 8 offers numerous tools for streamlining content creation and moderation. For instance, the Content Moderation module lets you expand on Drupal’s ‘unpublished’ and ‘published’ states for content and enables you to have a published version that is live and have a different working copy that is undergoing assessment before it is published.

In case you need to share content or media across multiple sites, different solutions are available in Drupal that come with content synchronisation capabilities, helping you keep development, staging and production in superb sync by automating the safe provisioning of content, code, templates and digital assets between them. Multiple synchronisation tasks can be created and scheduled to occur in the future, targeting different destination servers and sites. Or, synchronisation tasks can be performed manually via the user interface.

Following are some of the major modules that are worth considering for content synchronisation necessities:

Deploy - Content Staging

Logo of Drupal Deploy module with icon resembling a cloud


The Deploy module enables users to easily stage content from one Drupal site to another and automatically governs dependencies between entities, like node references. Its rich, extensible API helps in different content staging situations. It is great for performing cross-site content staging: using Deploy with RELAXed Web Services helps in staging content between different Drupal websites. It also works with the Workspace module to offer a workspace preview system for single-site content staging, and the API offered by RELAXed Web Services is spectacular for building fully decoupled sites. With Multiversion, all content entities can be revisioned. Workbench Moderation ensures that when you moderate a workspace, content is replicated automatically on approval.

Entity Share

Logo of entity share drupal module with three drop shaped logos of drupal


The Entity Share module enables you to share entities like nodes, field collections, taxonomies, media etc. between Drupal instances. It lets you share entities with the help of JSON API and offers a user interface for leveraging the endpoints provided by the JSON API module.

CMS Content Sync

logo of content sync drupal module with the word content sync written in blue and black


The CMS Content Sync module offers content synchronisation between Drupal sites with the help of a Node.js-based Sync Core. The synchronisation of enormous amounts of data, consisting of content and media assets that can't be managed by Drupal itself, is possible with the help of this module. It is wonderful for content staging, as it enables you to test code updates with your content and publish code and content concurrently. It manages the publishing so that you can focus completely on the creation of content. It also offers content syndication functionality, and your entire content can be updated and deleted centrally. Moreover, it allows you to connect any of your sites to a Content Pool, to which you can push your content and media items and from which remote sites can be allowed to import that content easily.

Content Synchronisation

Efficaciously exporting one or more content items from one environment to another is possible with the help of the Content Synchronisation module. That means you can export and import full site content, or a single content item. The difference between the site content and the content in YAML files can be viewed, and entities can be imported with a parent/child relationship.

Acquia Content Hub

drop shaped logo of acquia content hub drupal module


The distribution and discovery of content from any source, in order to build fantastic multi-channel digital experiences, can be done using the Acquia Content Hub module. It enables you to connect Drupal sites to the Acquia Content Hub service, a cloud-based, centralised content dissemination and syndication solution that lets you share and enrich content throughout a network of content sources using extensible, open APIs.

Conclusion

These are some of the most significant solutions in the enormous list of Drupal modules that are specifically built for enhancing the synchronisation of content between Drupal sites.

We have been constantly working towards the provisioning of marvellous digital experiences with our expertise in Drupal development.

Let us know at [email protected] how you want us to help you build innovative solutions using Drupal.

Feb 23 2019
Feb 23

A scientist can be awarded a Nobel Prize for an amazing scientific breakthrough made in his or her research. More often than not, the leading scientist is backstopped in the research by a team of hugely talented assistants.

Lemur sitting near a wood


Talking about backstopping, you need a backstop in the digital landscape too. When you make an alteration to your website, you have to be sure that it is devoid of unintended side effects. This is where BackstopJS comes into the light. As an intuitive tool, it enables swift configuration and helps you get up and rolling quickly. Before we look at how it can be leveraged with Drupal, let's dive deeper into visual regression testing and BackstopJS.

Traversing visual regression testing and BackstopJS

Visual regression testing emphasises identifying visual alterations between iterations or versions of a site. Reference images are created for every component as it is built, which enables comparison over time for monitoring alterations. Developers can run this test in their local development environment after alterations are performed, to make sure that no regression issues transpire because of the changes.

Visual regression testing emphasises identifying visual alterations between iterations or versions of a site.

Visual regression testing is hugely beneficial in giving developers more test coverage than they could achieve manually, thereby ensuring that alterations do not cause regressions in other components. It provides a more detailed comparison than you would get while reviewing the site manually. And even without a full understanding of the project, developers know what the website should look like before their alterations.

Logo of BackstopJS with graphical image of lemur and backstopjs written below


BackstopJS, an open source project, is a great tool for performing visual regression testing. It is leveraged for running visual tests with the help of headless browsers to capture screenshots. It was created by Garris Shipon and originally ran on the PhantomJS or SlimerJS headless browser libraries. It now supports screen rendering with headless Chrome, and you can add your own interactions using Puppeteer and ChromyJS scripting.

BackstopJS is leveraged for running visual tests with the help of headless browsers to capture screenshots

It offers an excellent comparison tool for identifying and highlighting differences, and lets you set up several breakpoints for testing responsive sites. Moreover, it utilises simple CSS selectors to identify what to capture. It provides an in-browser reporting user interface with a customisable layout, scenario display filtering, and more. Furthermore, it comes with integrated Docker rendering and works superbly with continuous integration and source control. It also gives you CLI and JUnit reports.

Workflow and installation

It’s pretty easy to install and configure BackstopJS which involves:

Installation (global): npm install -g backstopjs
Installation (local): npm install --save-dev backstopjs
Configuration: backstop init (creates backstop.json template)

Following is the workflow of BackstopJS:

  • backstop init: a new BackstopJS instance is set up by specifying URLs, cookies, screen sizes, DOM selectors, interactions, and more.
  • backstop test: by creating a set of test screengrabs and comparing them with your reference screengrabs, you can check which alterations show up in a visual report.
  • backstop approve: if all looks fine after the test, you can approve it.
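To make the init/test/approve cycle concrete, here is a minimal backstop.json sketch; the URL, selectors and threshold are illustrative assumptions, not taken from any project mentioned here:

```json
{
  "id": "example_project",
  "viewports": [
    { "label": "phone", "width": 320, "height": 480 },
    { "label": "desktop", "width": 1280, "height": 1024 }
  ],
  "scenarios": [
    {
      "label": "Homepage",
      "url": "https://example.com/",
      "selectors": [".site-header", ".main-content"],
      "misMatchThreshold": 0.1
    }
  ],
  "paths": {
    "bitmaps_reference": "backstop_data/bitmaps_reference",
    "bitmaps_test": "backstop_data/bitmaps_test",
    "html_report": "backstop_data/html_report"
  },
  "engine": "puppeteer",
  "report": ["browser"]
}
```

Each scenario captures one page at each viewport, and only the listed selectors are compared against the reference screengrabs.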

BackstopJS with Drupal

A Drupal community event in 2018 featured a Drupal 8 module called Backstop Generator for creating backstop.json configuration files on the basis of a site's unique content.


This Drupal 8 module exposes an administrative configuration form to create a BackstopJS visual testing profile based on a Drupal website's content. It assists you in creating backstop scenarios from Drupal pages and defining random pages to include as scenarios, and you can toggle viewport sizes on and off. It produces a backstop.json file, which you place into your backstop directory to replace the existing backstop.json file. It requires the Serialisation, HAL and REST modules. It is worth noting that this module is not covered by Drupal's security advisory policy.

Backstop Generator can do a lot more. You can contribute towards building more interesting features and join the issue queue on Drupal.org to submit a patch or report a bug.

Conclusion

Deploying frontend alterations can be troublesome. Visual regression testing software like BackstopJS ensures that our changes are accurate and contained, and can be great for Drupal sites.

We have been offering a suite of services to help digital firms fulfil their dreams of digital transformation.

Contact us at [email protected] and let us know how you want us to be part of your digital transformation plans.
 

Feb 23 2019
Feb 23

R programming can be an astronomical solution for foretelling re-booking rates by leveraging previous guest ratings and, thereby, automating guest/host matching. That's exactly what analysts at Airbnb, an online marketplace and hospitality service provider, have done with it. Airbnb has leveraged R for driving numerous company initiatives with the help of an internal R package called Rbnb.

Letter R written on a wall


As a growing organisation, Airbnb's inclination towards R for enhancing its business operations speaks volumes about this programming language. R offers powerful analytics, statistics and visualisations. In combination with Drupal, the R programming language can be a spectacular option for an innovative solution. Let's take a look at the origins of R and its significance before heading over to the integration of Drupal and R programming.

A sip of history

Microsoft delineates that strolling along the history of R would take us back to the '90s, when it was implemented by Robert Gentleman and Ross Ihaka (faculty members at the University of Auckland). It commenced as an open source project in 1995, and the R Core Group has handled the project since 1997. The first version was released in the year 2000. R was closely modelled on the S language for statistical computing.

R programming: An exploration

Logo of R programming language with the letter R and a grey circle overlapping it.


R, as an open source programming language and software environment, is magnificent for statistical computing and graphics. Prominently, it is used by data scientists and statisticians alike. It has the support of the R Foundation for Statistical Computing and a huge community of open source developers. Those who are accustomed to GUI-focused programs like the Statistical Package for the Social Sciences (SPSS) and the Statistical Analysis System (SAS) can find the nascent stages of using R difficult, as it leverages a command line interface. In this case, RStudio can be meritorious.

R is a language and environment for statistical computing and graphics. - r-project.org

R offers a wide array of statistical techniques like linear and non-linear modelling, classical statistical tests, time-series analysis, classification, clustering among others. In addition to this, it also provides graphical techniques and is very extensible.

R has had remarkable growth, which can be seen in the following figure.

graphical representation with several dots to explain the growth of R programming. Source: Stack Overflow

Even in the RedMonk programming language rankings, which compare languages' appearances on GitHub (usage) and Stack Overflow (support), R maintains its position near the top.

A straight line graph with several dots to explain popularity of R programming


The academic, healthcare and government segments maintain preeminence when it comes to the industries that visit R questions the most in the U.S. and U.K.

Bar graph with horizontal bars in orange to explain different industries using R programming. Source: Stack Overflow

The Burtch Works survey shows that R is the second choice of data scientists, and its flexibility makes it great for predictive analytics. Research practitioners who want to do both sorts of analysis, and concurrently apply machine learning to their marketing research processes in the future, will find R a great option.

Pie charts with green and blue regions and bar graph with green and blue bars to explain R programming. Source: Nebu

Why choose R?

Following are some of the pivotal reasons that state the importance of R:

Documentation

Online resources for R like message boards are superbly supported and well documented.

Analysis

R helps in performing data acquisition, cleaning and analysis all in one place.

Data Visualisation

R has fantastic tools for data visualisation, analysis and representation.

graphical representation showing a greyish region formed by an uneven line. Figure: A ggplot2 result | Source: IBM

This is a good example of data visualisation provided by R package ggplot2. In this, it displays the intricate relationship between yards per game and touchdowns as well as first downs.

Easy to learn

You can quickly get up to speed, as R has an easy learning curve. For instance, with the simplicity of the language, you can easily create three samples and create a bar graph of that data.

Bar graph with blue, green, orange, red and yellow bars to explain R programming. Figure: Three random samples | Source: IBM

Machine learning

R makes machine learning much easier and more approachable with a superabundance of R packages for machine learning, like MICE (for taking care of missing values) and CARET (for classification and regression training), among others.

Drupal in the picture

A digital agency integrated Drupal with R for an insurance company that envisions offering a highly personalised matching service to assist people in selecting the right insurance program. The company leveraged R to build an algorithm that calculates a compatibility score. The main notion was to be able to efficaciously match prospective customers to insurance carriers based on the customers' needs and preferences.

The company leveraged R for building an algorithm that can calculate the compatibility score to be able to efficaciously match prospective customers to the insurance carrier based on customer’s needs.

There are intricacies involved in the calculation of the compatibility score, which is a function of elements like user persona and price sensitivity, among others. Numerous unique customer personas were found in the process, based on demographic factors (gender, age, education, etc.) and car ownership details (car type, mileage, etc.). Once a prospect is found to match a particular persona, it is then mapped to each of the insurance providers on the basis of a score and customer satisfaction survey ratings.
 
Moreover, different scores are calculated for tracking the customer's sensitivity to price, customer service levels, etc. This is done through a web service that links to the insurance carriers' information and offers a quote for the customer on the basis of numerous information points provided by the customer. Finally, the consolidation of all these scores into two different parameters gives a final score that helps the prospect select the right insurance carrier.

An insurance portal was built in Drupal containing the customer persona questionnaire, which prospects can fill in to search for the best carrier for their spending styles and other preferences. Once the prospect enters the information, it is passed from Drupal to R in JSON format. R ensures that the plethora of customer data that already exists is also leveraged by the algorithm. Quotes pulled from multiple carriers are processed via the web service. On the basis of the algorithm, R processes all the data and sends back the best carrier match options for the prospect, who can then select a carrier based on his preferences.
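To illustrate the Drupal-to-R handoff, the JSON payload could resemble the following; every field name here is a hypothetical example, as the article does not publish the actual schema:

```json
{
  "prospect": {
    "age": 34,
    "gender": "female",
    "education": "graduate",
    "car_type": "SUV",
    "annual_mileage": 12000,
    "price_sensitivity": "high"
  }
}
```

R would parse a payload like this, map the prospect to a persona, and return the ranked carrier matches in a similar JSON response.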

Conclusion

R, along with Drupal, is a marvellous option for data experts to do everything from mapping social and marketing trends online to building financial and climate models that give an extra push to our economies and communities.
 
We have been offering a suite of services for fulfilling the digital transformation endeavours of our partners.
 
Let us know at [email protected] how you want us to be a part of your digital innovation plans.
 

Feb 23 2019
Feb 23

iRobot, a leading global consumer robot company, had a spectacular debut on Amazon Prime Day as it sold thousands of its Roomba robotic vacuums. Eventually, with a greater focus on its central value proposition of offering leading-edge robots that relieve customers from humdrum chores, and with an increasing number of connected customers, iRobot decided to move its mission-critical platform to the Amazon Web Services (AWS) Cloud. By leveraging the powerful tools and integrations of AWS, it was able to use a serverless architecture offering an efficacious combination of scalability, global availability and breadth of services.

a humanoid standing on a rocky surface while the sun sets in the distance


Why is such a big organisation as iRobot opting for serverless computing? Using a serverless architecture powered by AWS IoT and AWS Lambda kept the cost of the cloud platform low, removed the need for subscription services, and allowed the solution to be run by fewer people. The need to maintain physical infrastructure and systems software also vanishes. Drupal, a leading player in the open source content management framework market, has long been a strong option for building innovative solutions, and it can be a great fit for a serverless architecture. Before diving right in, let's trace how the concept of serverless came into existence.

The emergence of serverless computing

To look at how serverless came into existence, freeCodeCamp has compiled an interesting story that takes us to the 1950s. This was the time period when the computing paradigm called mainframes burst onto the scene. Eventually, the mid-2000s witnessed the advent of a new paradigm called cloud computing.

Flowchart with arrows pointing to different words, explaining the history of serverless computing. Source: ResearchGate

It was in the 2010s that the concept of serverless architecture popped up. The report Serverless is more: From PaaS to present cloud computing offers an interesting compilation that distinguishes six main dimensions, seen in the figure above, of critical breakthroughs that paved the way for the emergence of serverless.

Serverless: In-depth

Flowchart showing icons representing servers, the cloud, the Microsoft Office 365 logo and AWS Lambda, explaining serverless computing. Source: freeCodeCamp

What is serverless computing? It lets you write and deploy code without being obstructed by the underlying infrastructure. Developers rely on cloud-based servers, infrastructure and operating systems. Although it is referred to as serverless, servers are still involved: as a fully managed service, the setup, capacity planning and server management are handled by the cloud provider.

"Serverless architectures are application designs that incorporate third-party 'Backend as a Service' (BaaS) services, and/or that include custom code run in managed, ephemeral containers on a 'Functions as a Service' (FaaS) platform. By using these ideas, and related ones like single-page applications, such architectures remove much of the need for a traditional always-on server component." - Martin Fowler

Martin Fowler notes that there is no single clear definition of serverless. On one side, the term was first used to describe applications that incorporate third-party, cloud-hosted applications and services to handle server-side logic and state, called (Mobile) Backend as a Service (BaaS). On the other, it can also mean applications in which the developer writes the server-side logic themselves, run in stateless compute containers that are fully managed by a third party, called Functions as a Service (FaaS).
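
To make the FaaS side concrete, here is a minimal sketch in the handler shape AWS Lambda uses for Python functions: a stateless handler that receives an event and returns a response. The event fields and greeting logic are illustrative assumptions, not part of any fixed AWS schema.

```python
import json

# A minimal FaaS-style function: stateless, with the platform managing
# setup, scaling and the servers underneath. The handler signature matches
# what AWS Lambda uses for Python; the "name" field is an illustration.

def lambda_handler(event, context):
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Locally you can call the handler directly; on a FaaS platform the
# provider invokes it for you and scales instances as needed.
response = lambda_handler({"name": "Drupal"}, None)
print(response["body"])  # {"message": "Hello, Drupal!"}
```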

Benefits of Serverless

Bar graph with blue-coloured horizontal bars to explain benefits of serverless computing


Following are some of the merits of Serverless stated by AWS:

  • Governance: There are no servers to manage, and no software or runtime to install, maintain or administer.
  • Scalability: Your application scales automatically, or you can adjust its capacity by toggling the units of consumption (such as throughput or memory) rather than units of individual servers.
  • Cost: You pay for consistent throughput or execution duration instead of paying by server unit.
  • Availability: Serverless offers built-in availability and fault tolerance.

Implementation

Flowchart showing logos of Amazon Aurora, CloudFront, S3, EFS to explain implementation of serverless computing with Drupal


A combination of Amazon CloudFront, a web service that speeds up the distribution of static and dynamic web content; Lambda@Edge, which extends serverless compute to the CloudFront network; and Drupal as a powerful headless CMS can be fantastic for building a serverless architecture. The seamless integration between CloudFront, Lambda@Edge and headless Drupal delivers the lowest latency and a personalised experience to users.


AWS has outlined how to accelerate Drupal content with Amazon CloudFront. It showed how to deploy CloudFront to cache and accelerate your Drupal content using a globally distributed set of CloudFront nodes. Every CloudFront distribution consists of one or more origin locations; an origin is where the Drupal content resides. Drupal 8 was deployed by running the supplied AWS CloudFormation stacks, with Amazon Elastic Compute Cloud (EC2), Amazon Elastic File System (EFS), Amazon Relational Database Service (RDS) and Amazon Aurora coming in handy as well. It was all wrapped up in a highly available design spanning several Availability Zones and configured to auto scale using Amazon EC2 Auto Scaling groups.
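
As a hedged sketch of how Lambda@Edge can personalise responses in front of a Drupal origin, the handler below rewrites the request URI per language so CloudFront can cache and serve localised variants. The /en and /nl path prefixes and the routing rule are assumptions for illustration; the event layout follows CloudFront's Records[0].cf.request shape.

```python
# Hedged sketch of a Lambda@Edge-style viewer-request handler: it rewrites
# the request URI per language so CloudFront can cache and serve localised
# variants without hitting the Drupal origin for every user. The /en and
# /nl prefixes and the routing rule are invented for illustration.

def handler(event, context):
    request = event["Records"][0]["cf"]["request"]
    headers = request.get("headers", {})
    accept = ""
    if "accept-language" in headers:
        accept = headers["accept-language"][0]["value"]
    prefix = "/nl" if accept.lower().startswith("nl") else "/en"
    if not request["uri"].startswith(prefix):
        request["uri"] = prefix + request["uri"]
    return request

# Synthetic CloudFront viewer-request event for local experimentation.
event = {"Records": [{"cf": {"request": {
    "uri": "/about",
    "headers": {"accept-language": [{"key": "Accept-Language",
                                     "value": "nl-NL,nl;q=0.9"}]},
}}}]}
print(handler(event, None)["uri"])  # /nl/about
```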
 
The Path module, available in Drupal 8, helped create URL aliases for the content. 'Aggregate CSS Files' and 'Aggregate JavaScript Files' were enabled by default within the Drupal 8 administration, reducing bandwidth requirements between the origin AWS infrastructure and the CloudFront edge nodes. Internal Drupal caching, which controls the maximum amount of time a page can be cached by browsers and proxies, was disabled by default. Drupal's CDN module was also suggested, as it can alter file URLs so that CSS, JavaScript, images, audio and video are cached within CloudFront. This was followed by creating a CloudFront distribution via the CloudFront console, which involved configuring the origin, default cache behaviour settings, and distribution settings.

What lies ahead?

A report by Markets and Markets estimates the market size of serverless architecture at USD 4.25 billion in 2018, expected to grow at a Compound Annual Growth Rate (CAGR) of 28.6% to reach USD 14.93 billion by 2023. Eliminating the need to administer servers, which minimises infrastructure costs and streamlines deployment, governance and execution, the shift from DevOps to serverless computing, and the burgeoning state of microservices architecture have all contributed to this growth.

Bar graph showing blue, green, indigo and maroon vertical bars, explaining serverless computing market share. Source: Grand View Research

A report by Grand View Research states that the automation and integration services segment plays a crucial role in establishing a serverless architecture. While the monitoring services segment is expected to see the highest CAGR of 28.8%, the small and medium enterprises (SME) segment will grow at a CAGR of 28.6% over the forecast period from 2015 to 2025. The banking, financial services and insurance (BFSI) segment is anticipated to retain its dominance of this market. Moreover, North America, with the powerful presence of the U.S., is growing at a great pace, and Asia-Pacific is also expected to grow at a CAGR of 26% over the forecast period.
 
Research and Markets states in a report that serverless computing does come with certain deployment challenges. Because Cloud Service Providers (CSPs) control the underlying infrastructure, users cannot customise or optimise it. Further, the organisation has no authority over the infrastructure, which raises the risk involved in placing several customers on the same platform. Consumers also have no control over penetration tests and vulnerability scanning of the infrastructure, which raises compliance concerns for adopters and acts as a restraint on market growth.

Conclusion

The Chief Technology Officer (CTO) of Amazon, Werner Vogels, said in his 2016 keynote, “Before, your servers were like pets. If they became ill you had to nurture them back to health. Then, with cloud, they were cattle, you put them out to pasture and got yourself a new one. In serverless, there is no cattle, only your application. You don't even have to think about nurturing back to health or getting new ones, all the execution is taken care of.”
 
Serverless computing can be a great solution in combination with Drupal. A serverless platform lets you distribute your web application globally, running in dozens of data centres across the globe with customers served from the one nearest to them.
 
Be mindful that serverless is not the right approach for every problem; consider your business requirements before taking the plunge.
 
We have been on a constant pursuit of offering a magnificent digital presence to our partners using our expertise in Drupal development.
 
To decide if serverless is the right fit for your business, let us know at [email protected] so that we can help you scale your digital business.

Feb 23 2019
Feb 23
Date: 2019-February-23
Vulnerability: SA-CORE-2019-003
Notice of increased risk and additional exploit path
Description:

This Public Service Announcement is a follow-up to SA-CORE-2019-003. This is not an announcement of a new vulnerability. If you have not updated your site as described in SA-CORE-2019-003 you should do that now.

There are public exploits now available for this SA.

Update, February 25: Mass exploits are now being reported in the wild.

In the original SA we indicated this could be mitigated by blocking POST, PATCH and PUT requests to web services resources. There is now a new way to exploit this using GET requests.

The best mitigation is:

This only applies to your site if:

  • The site has the Drupal 8 core RESTful Web Services (rest) module enabled, OR
  • The site has another web services module enabled, like JSON:API in Drupal 8, or Services or RESTful Web Services in Drupal 7, or custom code that allows entity updates via non-form sources.

What to do if your site may be compromised

Take a look at our existing documentation, "Your Drupal site got hacked, now what".
We’ll continue to update the SA if novel types of exploit appear.

Feb 22 2019
Feb 22

 custom theme pictured near a pre-built theme

When implementing Drupal or any CMS for that matter, you can jump-start development and save time with pre-built themes. However, these themes can also hamstring your efforts to deliver content in the way you envisioned. 

An advantage of a pre-built theme is that it is usually already tested and will work as designed right out of the gate. On the other hand, building a custom theme ties your web presence tightly to your brand identity and feels more like a singular experience. In this post we will try to determine whether you should build a theme to match a custom design, or use a pre-built theme and try to fit your needs into the pre-packaged solution.

What is a CMS theme?

First, let's talk about what a theme is and what role it plays in a website. Themes live, for the most part, in the presentation layer of your site. A theme generally consists of templates for markup, CSS, and JavaScript; it is the rendered interpretation of the content data returned by the backend CMS. The theme determines the HTML structure through its templates, and the look and feel of that HTML through its CSS and JavaScript files.


A pre-built Drupal theme may also include small pieces of custom code or custom views, depending on what is needed. Often they will also come with some predefined content types and usage examples. Similarly, pre-built WordPress themes can have custom code or dependencies on additional plugins. A pre-built theme can't accommodate many custom needs or integrations, and in many cases needs to be hacked (significantly altering the code in a way that will likely eliminate any hope of support) to do anything beyond its original design.

In the world of themes there are free themes and premium themes. Premium themes have a cost, usually a one-time fee or subscription, plus a license and terms of use. There is a cottage industry around creating premium themes: individuals build attractive-looking layouts, load them up with the entire kitchen sink of design patterns from sliders to parallax video backgrounds, and then provide varying levels of support. The marketing behind these themes makes them out to be the one-size-fits-all web solution. In reality, they often fall short of what you actually need and cause unnecessary bloat from included features you don't use. That's not to say they are a bad solution; sometimes one is exactly what's needed for a project with very loose requirements.

About Drupal Themes

For Drupal, themes fall into a few different categories. Base themes don't provide any visual design but do provide structure for HTML markup and some basic functionality; they require a custom child theme that extends them to finish out the look and feel of the site. Related to pre-built themes in Drupal are distributions, which include a theme along with a pre-configured suite of contributed and custom modules. Distributions are generally built around specific project types such as CRMs, e-commerce, education, or government, to name a few.

If you are considering a pre-built theme with Drupal, whether premium or free, you should consider whether you need Drupal at all. Pre-built themes bypass much of the flexibility Drupal provides, and you may be able to achieve your goals with a more streamlined SaaS website-builder solution. Pre-built themes make a little more sense for a CMS like WordPress that is designed for a specific task like blogging. If you are looking to do more than a basic brochure-style site or blog with WordPress, you should think about a custom theme.

Pre-built themes can save time up front during your initial build phase. They can also end up consuming more time than a custom theme would if anything needs to be modified or added to in any way. If you have plans to add an integration or custom feature in a future phase of your site, the pre-built theme could add more time in development and end up costing more in the long run. In short, if your project is a square peg that fits in the pre-built theme’s square hole, then the pre-built theme is the way to go, but this scenario is rare. In most cases you want your site to be tailored to your business and have a user experience that feels intentional and purpose-built. This is something that can only be achieved with a lot of planning and a custom theme.


Feb 22 2019
Feb 22

Whether you've just recently built a new site, or you are in charge of maintaining an existing one, it's critical to get the most you can out of that site on an ongoing basis. As the internet grows and audience needs change, your site needs to be maintained and adapted over time. Sites can also be expensive to upgrade if not properly cared for (think of skipping regular maintenance on your car for several years until it finally breaks in an expensive way).

And yet, most organizations don't have the money to redo a site more than once every three or four years. They often don't have the money to hire someone in-house to maintain the site beyond content updates, either. So who takes care of your security updates, or changes to modules or plugins, so that your site doesn't break?

That’s where quality website support and maintenance comes in. A good support agency can make your site last a long time past its creation date and keep it fresh until it’s time for the next rebuild and redesign.

Here are the top five things to look for when hiring outside website support:

  1. Make sure they have a dedicated support team or department. Don’t go with an agency that simply pulls people off of regular design or development build projects to do support tickets on the side. Your site won’t get the same attention or care, since they consider support more of a side gig rather than an important part of their business model. Make sure the agency has a dedicated team that is committed to and organized around supporting sites.
  2. Look for transparency in billing. Make sure you understand the billing options. Most companies will offer different levels of packages, each with a set number of hours. If you have a site with a lot of traffic and ecommerce for selling items to customers, you’re going to want immediate service if something goes wrong vs. a site that’s more informational and can wait a few hours before a fix is implemented. Understand the levels of service you’re getting and the differences in costs for the timeliness of the response. Also ask what happens with any unused hours paid for in advance: do they rollover to the next month, or are they “use it or lose it?”
  3. Ask if you can talk to a human if needed. All agencies use (or should use) a ticketing system in order to track support requests. Ticketing systems allow for transparency, accountability, and clarity on what is being addressed and when. While these systems are tremendous for tracking the progress of an issue as it gets fixed, using them exclusively can be frustrating if something is hard to explain via text. Ask the agency if you’re allowed to hop on a call with one of their support staff, or the Project Manager, for advice and guidance. Often you can save time and increase clarity to simply have a conversation with a human. Plus it’s nice to establish a relationship with the person in charge of keeping your site running smoothly.
  4. Check that there’s a diverse range of talent within the team. Most developers can do module, plugin and security updates. But can they do any front-end work? What if the theme breaks, or you need a new page design? You might need more than code updates. Go for a more diverse and creative team with experience in feature development as well as creative enhancements, to cover the full range of items you might need.
  5. Determine how important it is if they work in your time zone. Talented designers and developers are all over the globe, but it can be tough to get fast responses from people in time zones very far off from yours. What happens if you need something right away, but it’s the middle of the night for them? If you’re in Hawaii, for example, you may not want to have an east coast agency handle your support. Ask the agency what their hours are, and try to get serviced in as close to your time zone as possible.

Following these tips will help give you confidence that you are asking the right questions and finding the right support services to fit your organization.

If you’re interested in learning more about Kanopi’s support offerings, contact us. We have dedicated support teams for both Drupal and WordPress, with a diverse staff who can cover anything you need. We also do it very well. Our hours are 9:00 am to 5:00 pm your local time in North America . . . and that counts for Hawaii!

Feb 22 2019
Feb 22

Joris Snoek - Business Dev

+31 (0)20 - 261 14 99

It is good to keep abreast of available open source 'contrib' Drupal modules. 'There's a module for that' applies to many use cases within Drupal; it's a sin to build something that already exists. We track the latest module releases every month, and this is what we noticed over the last month:

1. Rabbit Hole

An ingenious module; I was surprised I had never come across it before. It solves an issue that has demanded attention in Drupal implementations for years: you want to use certain content, files or taxonomy terms as building blocks for other content pages, without making them individually accessible pages in their own right.

For example: you have a page where a slideshow with 10 images is shown. Those 10 images are 10 manageable Drupal nodes in the backend. If you do not do anything about it, then your slideshow probably works fine, but those 10 nodes are also individually accessible anonymously - and indexable by Google (╯°□°)╯︵ ┻━┻

That is what you want to avoid: your SEO suffers and people can end up on pages that are not really part of your website at all, and probably unstyled. Yikes!

We always solved this in custom code: accessing the relevant entities gives a 404 page not found. But this module makes it generically configurable, very nice!

Bonus
Oh yeah, if you use the XML sitemap module: do not forget to exclude items.

https://www.drupal.org/project/rabbit_hole

2. Quick Link

Staying on the performance track: this module provides a Drupal implementation of the Quicklink library from Google Chrome Labs. Quicklink is a lightweight JavaScript library (less than 1 kB compressed) that enables faster subsequent page loads by prefetching in-viewport links.

How Quicklink works
Quicklink attempts to speed up navigation to subsequent pages. It:

  • Detects links within the viewport (using Intersection Observer).
  • Waits until the browser is idle (using requestIdleCallback).
  • Checks whether the user is on a slow connection.
  • Prefetches the URLs of those links (using <link rel="prefetch"> or XHR).

Under construction
The module arrived only recently and is still under heavy construction, but it is absolutely one to watch.

https://www.drupal.org/project/quicklink

3. Image Effects

A popular evergreen module, known in the Drupal 7 era as ImageCache Actions: it contains a bundle of actions that you can apply to an image.

https://www.drupal.org/project/image_effects

4. Weight

If you want to set the order of content in a list (e.g. Drupal nodes), you need a field to facilitate this. Drupal does not offer one by default, but this module helps: after installation you can give content items (for example, Drupal nodes) a weight, making them appear higher or lower in a (non-chronological) list.

Interesting in this context is Comparison of Node Ordering Modules.

https://www.drupal.org/project/weight

5. Automatic User Names

This Drupal module can automatically generate a username from other user fields (e.g. first name and last name). Because the username is generated automatically, it no longer needs to be filled in manually, which is why this module also deactivates the username field. The Real Name module can be a good addition to this.

https://www.drupal.org/project/auto_username

6. Role Expire

This is a simple module that allows administrators to set expiration dates on user roles. A common application is implementing subscriptions, as for a magazine, where someone has access to protected content for a certain period of time.

https://www.drupal.org/project/role_expire

7. Twig Field Value (Drupal theming)

A popular Drupal module with which Drupal themers can pull partial data out of render arrays, giving more control over exactly which data ends up on the screen.

https://www.drupal.org/project/twig_field_value

8. Entity Browser

Developer module that provides a browser/selector for Drupal entities. It can be used in any context where a content manager has to select one or more entities and do something with them (content, image, video, audio, etc.).

Possible applications:

  • to produce an entity reference widget;
  • use in a wysiwyg editor.

An example is the File Entity Browser module, a kind of media browser, that uses this Entity Browser module.

https://www.drupal.org/project/entity_browser

9. Paragraphs Previewer

An extension for the popular Drupal Paragraphs module. By default, there is no way in the backend to preview a piece of content entered in Paragraphs: you first have to save the entire content article (the node), then refresh the frontend to see what it looks like. This module solves that by adding a preview option for Paragraphs in the backend.

You will have to make sure that the HTML/CSS styling of the Paragraph in question is fully included in that preview and does not depend on a global context (e.g. page, section or div classes).

https://www.drupal.org/project/paragraphs_previewer

10. Views Parity Row

In Drupal you can work with View Modes, because content from the same content type can look different in different places on a Drupal website. Some standard View Modes are:

  • Full Content
  • RSS
  • Search index
  • Teaser

You can add unlimited View Modes yourself. A list of all active View Modes can be found under DRUPAL_SITE_URL/admin/structure/display-modes/view.

Drupal core also contains the Views module, for making lists in the broadest sense of the word. In such a Drupal View you can specify which View Mode of the relevant content you want to show (teaser, full content, RSS, etc.). But you can only choose one per View, so you cannot alternate them. You might want that in some cases, and that is the issue this module solves: different View Modes in one View, phew :)

https://www.drupal.org/project/views_parity_row

11. Leaflet

Leaflet is an open source JavaScript library for mobile-friendly interactive maps. This module integrates Leaflet into Drupal. An alternative to Google Maps or MapBox for example.

https://www.drupal.org/project/leaflet

12. Background Images Formatter

This module offers an image formatter that allows you to set an image as the background of a CSS-targeted tag. The images come from a Drupal entity field rather than a configuration page or a custom Drupal entity, so it's very easy to set up and manage, as the project page describes.

The module also contains a sub-module to process responsive images.

https://www.drupal.org/project/bg_image_formatter

13. OtherView Filter

Within Drupal core you can easily create lists, for example of content or users, using Views. Sometimes, among the many standard options, you just fail to build in that one exception: for example, one or more specific content items that you do not want to appear in the list.

If you install this module, you can exclude the results of one View from another. This sounds like a Rube Goldberg machine, and perhaps a custom query is better, but that depends on the use case, budget, system scale, future wishes and your development knowledge.

https://www.drupal.org/project/other_view_filter

14. Taxonomy Formatter

Drupal Taxonomy is a powerful, flexible system for categorizing content. The standard formatters merely wrap the terms in a lot of divs in the frontend. This module adds a new formatter that gives you more influence over the layout.

https://www.drupal.org/project/taxonomy_formatter

15. Copy Prevention

This module applies a number of techniques that make it more difficult to copy content from your Drupal website:

  • Disable text selection.
  • Disable copying to clipboard.
  • Disable right mouse button for all site content.
  • Disable right mouse button for images only.
  • Protect/hide images for search engines so that your images are not shown in search results.

If you really do not want information copied, you should not put it on the internet; technically savvy people can always copy content from a public Drupal website, but this module makes it harder for non-technical people.

https://www.drupal.org/project/copyprevention

16. Consumers

A developer API module used in Contenta CMS, a headless Drupal distribution. The module itself contains no end-user functionality, but provides an API for other modules to build on.

In this case, consumers can be registered (similar to https://developers.facebook.com), allowing decoupled Drupal installations to offer variations based on who makes the request. All these options are managed under one umbrella so that other modules can use them.

https://www.drupal.org/project/consumers

17. Nagios

This module integrates Drupal with the Nagios monitoring system, giving you instant, central insight in Nagios into:

  • Is the Drupal core up-to-date?
  • Are the Drupal contrib modules up-to-date?
  • Are the Drupal site settings correct?
  • Many other security aspects of Drupal

https://www.drupal.org/project/nagios

18. Autoban

A Drupal security module that analyzes visitor behavior: when suspicious actions are detected, the relevant IP address is added to a blacklist.

Various settings are available, so you can adjust how strictly the module acts.

https://www.drupal.org/project/autoban

Wrap up

OK, that's it for now. Next month I expect new module updates, so stay tuned!

Feb 22 2019
Feb 22

SEO is an integral part of any website. The same holds true for Drupal as well. Fortunately, due to Drupal’s prolific community and consequently module-rich nature, getting started on SEO with it is somewhat easier as there are loads of SEO modules available for it. 

However, sifting through loads of modules can get overwhelming, and this is why I’m going to help you out by highlighting the ones our team has found the most useful. 

Drupal SEO Checklist

First up on our list, we’ve got the Drupal SEO Checklist module; an all-in-one SEO dashboard that checks if your site is optimized for search engines and gives you an overview of various SEO functions for the site. 

Apart from that, it also breaks down the SEO tasks for you, recommends various SEO modules that further improve functionality and even keeps track of what has already been done with a date and time stamp. If you like to keep things organized, then the SEO Checklist is a must-have for you!

SEO Checklist

Pathauto

Pathauto is an immensely useful module that most Drupal developers swear by. Where SEO is concerned, having proper URLs to a site’s pages is essential if you want your content to rank high on SERPs. 

By using Pathauto, the need to manually create proper URLs for each new node is eliminated. Instead, the module automatically generates URLs based on specific set patterns that can be customized by the user. The URLs thus generated are concise and structured in the way that the user wants. As such, the Pathauto module is a really simple solution that can do wonders for your SEO.

Pathauto module

Redirect

Imagine a situation where you make a change to an article, which also means changing the context of the URL. The thing is, that URL is already ranked on search engines - this means that if any user were to click on the ranked URL, they would be directed to a link that is no longer available. 

Very likely, this would cause the user to leave your site and look elsewhere for the desired information, resulting in a detriment to your site's SEO ranking. With the Redirect module, you can easily redirect the users to the new URL. This contributes greatly towards your SEO efforts as it eliminates dead links from your site.

Redirect module

XML Sitemap & Simple XML Sitemap

A website's XML sitemap is like the directory of that website. It defines the website's structure, such as its URLs and the relationships between them, which makes it easier for Google's search engine bots to crawl the site and rank it. It is highly recommended that you create a sitemap for your site in order to boost its SEO ranking.

The XML sitemap and the Simple XML sitemap modules for Drupal create such a sitemap for your Drupal site. The major difference between these two modules is that the latter was made specifically for Drupal 8; see this post for more differences between the two.
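
To see what such modules produce, here is a hedged sketch that builds a minimal sitemap document by hand. The URLs and dates are made up; real modules add further fields (priority, change frequency) and discover your content automatically.

```python
import xml.etree.ElementTree as ET

# Illustrative sketch of the document a sitemap module emits: a <urlset>
# of <url> entries, each with a location and a last-modified date.
# The URLs and dates below are made up for the example.

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    ("https://example.com/", "2019-02-22"),
    ("https://example.com/blog", "2019-02-21"),
])
print(sitemap)
```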

XML sitemap module

Simple XML sitemap module

Metatag

The Metatag module gives you the ability to provide more metadata to your website. This includes tags, page titles, descriptions etc. Google’s search engine uses this metadata to rank the website in search engine results.

Drupal doesn’t natively provide editable meta tag fields, but all of that can be done with the Metatag module. Using this module, the user can set meta tags for users, taxonomy terms, nodes, views etc. 
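For illustration, the kind of markup Metatag ends up rendering into a page's head looks roughly like this (hypothetical values):

```html
<!-- Hypothetical example of meta tags rendered by the Metatag module -->
<meta name="description" content="A short summary of this article." />
<meta property="og:title" content="My First Post" />
<meta property="og:image" content="https://example.com/images/teaser.jpg" />
```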

A new release of the module was made just a few days ago, resolving the issues described in SA-CORE-2019-003.

Metatag module

Google Analytics

While not an SEO module in and of itself, Google Analytics is a powerful tool for all aspects of a website. It helps you monitor traffic and keep tabs on an extensive analysis of your site. 

Whenever you perform SEO-related changes to your site, you might be curious to see what results it yields. Using the Google Analytics module for Drupal, you can integrate your Drupal site with Google Analytics and find out what results your practices bring about. 

Google Analytics

Conclusion

There you have it - some of the most useful modules for Drupal 8 which encompass a wide variety of SEO actions. These are also the most accessible modules for anyone wanting to get started on optimizing their Drupal site for search engines. We hope you make good use of them and succeed in upping your website’s SEO game!

Are you struggling with your Drupal site’s SEO? Need a helping hand? Contact us at Agiledrop - our extensive experience of working with Drupal enables us to provide top-notch Drupal services to our clients.
 

Feb 21 2019
Feb 21

“But I don’t want to think about Drupal 9 yet!”

Adulting sometimes means doing things we don’t want to do, like thinking about CMS version upgrades. 

We can help.

For now, here’s what you need to know about Drupal 9.

1. Drupal 9 is targeted for a June 2020 release.  

Eighteen months following the release of Drupal 9, in November of 2021, Drupal 7 and 8 will both hit EOL status. This means that you will have a little more than a year to move your site from Drupal 7 or 8, following the June 2020 release date of Drupal 9. 

The normal schedule would dictate that Drupal 7 hit EOL shortly after the Drupal 9 launch. However, because so many sites are still on D7, and at this point, many may just skip D8 completely, the decision has been made to extend D7 support for an additional year. As a result, D7 and D8 will both lose support at the same time.

The November 2021 date is significant because that is also when Symfony 3, which is a major dependency for Drupal 8, will lose support.

What this means for you as a D7 or D8 site owner is that you should have a plan to be on Drupal 9 by the summer of 2021.

2. The transition from Drupal 8 will not be painful.

Unlike Drupal 8, which for practical purposes was a new CMS sharing the Drupal name, Drupal 9 is being developed on Drupal 8. That means if you are up to date with Drupal 8.9, the move to Drupal 9 should be relatively easy. It probably won’t be Drupal 8.4 to 8.5 easy, but it should be nothing close to the level of effort of moving from Drupal 7 to Drupal 8. In a lot of ways, Drupal 9 is just the next six-month update that comes after Drupal 8.9, with the added complication of being the version where deprecated code and APIs are removed from the codebase.

My recommendation if you are still on Drupal 7: migrate to Drupal 8 when it's convenient. As stated above, Drupal 9 is really just the release that follows Drupal 8.9. So you kind of are migrating to Drupal 8, even if it’s called Drupal 9 at that point. You won’t save any effort by waiting until Drupal 9 is out; you’ll just be on the outdated D7 codebase longer.

Concerning modules: as long as they don’t depend on any APIs that are deprecated in D9, contributed modules should continue to work with both D8 and D9. 

3. Promet Source can help with a readiness audit

The good news is that if you are on an up-to-date version of D8, the transition to D9 should be smooth. If you are on D7, you are essentially doing the same work whether you migrate to Drupal 8 or Drupal 9.

We’re here to help! If you want to talk in more depth about what Drupal 9 has in store, and start to make a plan for the transition, contact us for information on our Drupal 9 readiness audit.
 

Feb 21 2019
Feb 21

This is Part 3 of a three-part series about choices you can make with the news of Drupal 9’s release. Part 1 is an overview. Part 2 is what to do if you choose to stay on Drupal 7. Part 3 is what to do if you choose to upgrade to Drupal 8. 

If you’re following along in our series about the release of Drupal 9, you understand that there are options for upgrading, each with its pros and cons. At Kanopi, we know it’s not a one-size-fits-all decision. We want to provide you with as much information as possible to help you decide what’s right for your site.

To recap, we shared an overview of all the options in part one, and a deep dive for our clients who plan to stick with Drupal 7 in part two. Here in part three, we share a bit of wisdom for those who are considering moving to Drupal 8.

At Kanopi we support more than 100 Drupal 7 sites. Many of them are well optimized and built to last, which can make it difficult to pull the trigger on a rebuild.  

When we talk to our Drupal 7 clients about migrating to Drupal 8, we typically hear one of three things:

  1. We don’t have the budget.
  2. We don’t have the capacity.
  3. The site looks and works perfectly fine.

Below, I’ll dig a bit deeper into each of these objections.

Budget

An average website lasts 3-5 years. However, many stakeholders aren’t aware that they need to budget for a new site that often, so they are caught off guard when the time comes. There are a few ways to bridge this gap:

Build the business case. A business case compares the challenges of sticking with your current site with the new opportunity and ROI that could be gained by making a change.

To get started, we recommend a site audit and creative strategy session to help identify what’s not working and what might be needed to get back on target. You should also take a look at your organic search performance (SEO), accessibility, speed, and overall usability. All of these factors can reveal where your site may be missing the mark and help to justify an upgrade.

When building your case, make sure that you think through the total cost of ownership for your site so that you can reserve enough budget to get the work done right. For example, if you spent $25,000 on your website in 2013, then made incremental updates over the last five years at $10,000 per year, the cost of your site is $75,000. If you want to preserve all features in your rebuild, you should ask for at least $75,000. While you’re at it, it’s a good idea to ask for 25 percent more than the amount it would take to preserve existing features. The redesign process will inevitably generate new ideas and site improvements that will require additional budget to implement. In this example, we would recommend asking for $100,000 and justifying the cost with a breakdown that takes your total cost of ownership into account.

Here’s another example: if you built your Drupal 7 site in house and worked on it for 24 months using a resource who makes $75,000 per year, the site cost your organization $75,000. Knowing this can help you build a rationale that hiring an agency to build your Drupal 8 site at $75-100,000 within six months is a great deal since the work will have similar costs and take far less time to complete.

Demonstrating where and how a new website could show direct ROI can make all the difference when convincing stakeholders to approve the budget for an updated site.

Consider the costs of doing nothing. It’s also helpful to think bigger than the cost of an upgrade and consider the costs of not improving your website. Lost customers, damaged reputation and missed opportunities can be hard to quantify, but should be considered.

For example, say your website’s contact form currently gets completed an average of 10 times a month, 10 percent of those who complete the form convert to a sale, and each deal is worth $10,000 - that’s one sale, or $10,000 in revenue, per month. What if, through a smart redesign and upgrade, you were able to increase form completions to 15 per month and add content and features that support the sales team, resulting in 20 percent sales conversions? That would be three sales, or $30,000, per month.

As you can see, there are many ways to frame your case to support budget requests. Use the approach that will work best to help your stakeholders understand the value of your website project and its potential to make a meaningful impact on your organization’s bottom line. Once they see the value, the budget will come much more easily.

Capacity

Today’s working world moves at lightning speed. Most of us end up doing far more than what’s included in our job descriptions, and those full plates can make a website rebuild feel impossible to tackle.

If your stakeholders are concerned about your team’s capacity to handle a rebuild, talk to them about approaching the work in smaller phases. Many of our clients tackle rebuilds one phase at a time, often signing on for smaller, more digestible bites that make up a larger endeavor. This can help make the process feel more approachable and easier for stakeholders to wrap their heads around. Try getting started with a bit of user research. Then tackle design. You can continue from there in small steps until the work is complete.  

Alternatively, this is where an agency like Kanopi Studios comes in. Rebuilding your site on Drupal 8 or WordPress is a lot of work, but an experienced agency can take much of that work off your plate by making the process as smooth and straightforward as possible and keeping the project’s momentum at full swing. That keeps your team concentrating on their day to day work while the rebuild happens simultaneously.

The site looks and works fine

The most common objection we hear from our clients is that their stakeholders don’t see a need to change or understand the point of doing things differently through a rebuild.  

Maybe you already have a beautiful website that is driving strong results. If so, that’s wonderful! However, as time goes on, you’ll find you need to mix things up a bit to keep up with the pace of change and stay competitive. Trends shift, customer behavior changes, and Google likes to keep us guessing with their algorithm updates. Change is constant in all things, and even more so online.

Most websites have room for improvement, even if they are doing well. To ensure your site stays current, keeping your CMS up to date should be part of your roadmap. If you’re planning to make any updates this year, consider upgrading to Drupal 8 as part of your solution.

Remember, the safety zone may feel warm and comforting, but it will never give you the insight and growth that exploring the unknown can provide. Who knows what wonderful things could be in your future?

Need help?

We can help you strategize and build your case for an upgrade to Drupal 8, 9, or even WordPress. When in doubt, get in touch! We can work out the best approach together.

Feb 21 2019
Feb 21

A PHP code execution bug affecting the Drupal core CMS has been discovered. The Drupal team has released a highly critical new security patch as a fix, urging all system administrators to update affected modules and configurations immediately.

drupal security patch

Not all Drupal sites are affected by the bug. According to the security advisory issued on drupal.org, sites that meet one of the following conditions are affected: 

  • The site has the Drupal 8 core RESTful Web Services (rest) module enabled and allows PATCH or POST requests, or
  • the site has another web services module enabled, like JSON:API in Drupal 8, or Services or RESTful Web Services in Drupal 7.

This bug arose because some field types do not properly sanitize data from non-form sources, which enables arbitrary PHP code execution in certain cases. If left unpatched, a malicious user could exploit the bug to invade a Drupal site and take control of the affected server. Given that Drupal is such a popular website publishing CMS worldwide, with over 285,000 sites running on Drupal 8 and more than 800,000 sites running Drupal 7, a security issue of this magnitude warranted a swift response.

Per the advisory released on Wednesday, Drupal recommends that affected sites update to the latest patched release of Drupal core immediately.

If a site cannot be updated immediately, the Drupal Security Team recommends mitigating the issue instead: site admins can disable all web services modules, or configure their web servers to not allow PUT/PATCH/POST requests to web services resources.
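As a sketch of the web server mitigation (my illustration, not from the advisory), an nginx rule along these lines could reject the risky methods on web services paths; the path pattern is an example and depends on which modules and routes a site actually exposes:

```nginx
# Hypothetical mitigation sketch: reject PUT/PATCH/POST on web services
# paths until the site can be patched. Adjust the path regex to match
# your site's actual web services routes.
location ~ ^/(jsonapi|rest)($|/) {
    limit_except GET HEAD {
        deny all;
    }
}
```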

The security patch was widely released on Wednesday, but Drupal considered the issue serious enough to let admins know about the release a day in advance. Duo was one of the agencies informed and our team got to work as soon as the patch was released. The advance warning gave our developers enough time to block out time on Wednesday afternoon to ensure that every site we’re responsible for was updated immediately. By the end of the day, we had patched all affected Duo client sites. 

While working with an agency helps guarantee a proactive solution to security fixes, Drupal’s in-house security team also plays a major role in maintaining vigilance. A team of over 30 volunteers, the Drupal Security Team recruits only experienced members of the Drupal community. As such, these active and committed watchdogs are typically able to identify problems before they become widespread. The current bug, for instance, was identified by the Drupal Security Team.

A strong security apparatus means nothing, however, if users don’t put in the necessary effort to keep their sites safe. In 2018, the Drupalgeddon 2 bug affected over a million Drupal sites. While many users subsequently patched their servers, an analysis conducted a couple of months after that bug was discovered revealed that over one hundred thousand sites remained vulnerable. All the foresight in the world can’t protect a site if security concerns aren’t addressed as soon as they’re brought to light.

While major security issues like the one this week and Drupalgeddon 2 are rare, all code has bugs. The best thing a site can do to defend against potential vulnerabilities is to be proactive, actively searching for and fixing issues. Duo offers these protections on a continuous basis and, as part of the Drupal community, is able to discover and implement solutions as soon as they become available. In an increasingly digital world, it’s critical that companies find a partner who can handle their cybersecurity needs promptly and without complication.

Explore Duo

Feb 21 2019
Feb 21

Learn how to implement lazy loading images to improve the performance of your website.

Recently, I've been spending some time making performance improvements to my site. In my previous blog post on this topic, I described my progress optimizing the JavaScript and CSS usage on my site, and concluded that image optimization was the next step.

Last summer I published a blog post about my vacation in Acadia National Park. Included in that post are 13 photos with a combined size of about 4 MB.

When I benchmarked that post with https://webpagetest.org, it showed that it took 7.275 seconds (blue vertical line) to render the page.

The graph shows that the browser downloaded all 13 images to render the page. Why would a browser download all images if most of them are below the fold and not shown until a user starts scrolling? It makes very little sense.

As you can see from the graph, downloading all 13 images takes a very long time (purple horizontal bars). No matter how much you optimize your CSS and JavaScript, this particular blog post would have remained slow until you optimized how images are loaded.

A WebPageTest diagram that shows page load times for the blog post before making performance improvements

"Lazy loading" images is one solution to this problem. Lazy loading means that the images aren't loaded until the user scrolls and the images come into the browser's viewport.

You might have seen lazy loading in action on websites like Facebook, Pinterest or Medium. It usually goes like this:

  • You visit a page as you normally would, scrolling through the content.
  • Instead of the actual image, you see a blurry placeholder image.
  • Then, the placeholder image gets swapped out with the final image as quickly as possible.
An animated GIF of a user scrolling a webpage, with a placeholder image being replaced by the final image

To support lazy loading images on my blog I do three things:

  1. Automatically generate lightweight yet useful placeholder images.
  2. Embed the placeholder images directly in the HTML to speed up performance.
  3. Replace the placeholder images with the real images when they become visible.

Generating lightweight placeholder images

To generate lightweight placeholder images, I implemented a technique used by Facebook: create a tiny image that is a downscaled version of the original image, strip out the image's metadata to optimize its size, and let the browser scale the image back up.

To create lightweight placeholder images, I resized the original images to be 5 pixels wide. Because I have about 10,000 images on my blog, my Drupal-based site automates this for me, but here is how you create one from the command line using ImageMagick's convert tool:

$ convert -resize 5x -strip original.jpg placeholder.jpg
  • -resize 5x resizes the image to be 5 pixels wide while maintaining its aspect ratio.
  • -strip removes all comments and redundant headers in the image. This helps make the image's file size as small as possible.

The resulting placeholder images are tiny — often shy of 400 bytes.

Large metal pots with wooden lids in which lobsters are boiled: the original image that we need to generate a placeholder for.

An example placeholder image showing brown and black tones: the generated placeholder, scaled up by a browser from a tiny image that is 5 pixels wide. The size of this placeholder image is only 395 bytes.

Here is another example to illustrate how the colors in the placeholders nicely match the original image:

A sunrise with beautiful reds and black silhouettes, and an example placeholder image that shows red and black tones

Even though the placeholder image should only be shown for a fraction of a second, making it relevant is a nice touch, as it suggests what is coming. It's also an important one, as users are very impatient with load times on the web.

Embedding placeholder images directly in HTML

One not-so-well-known feature of the <img> element is that you can embed an image directly into the HTML document using the data URL scheme:

<img src="data:image/jpg;base64,/9j/4AAQSkZJRgABAQEA8ADwAAD/2wB
  DAAEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQICAQECAQEB
  AgICAgICAgICAQICAgICAgICAgL/2wBDAQEBAQEBAQEBAQECAQEBAgICAgICA
  gICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgL/wA
  ARCAADAAUDAREAAhEBAxEB/8QAFAABAAAAAAAAAAAAAAAAAAAACf/EAB8QAAM
  AAQMFAAAAAAAAAAAAAAECAwcEBQYACAkTMf/EABUBAQEAAAAAAAAAAAAAAAAA
  AAIG/8QAJBEAAQIFAgcAAAAAAAAAAAAAAQURAAIDBDEGFBIVQUVRcYH/2gAMA
  wEAAhEDEQA/AAeyb5HO8o8lSLZd01Jz2nbKoK4yxDVvZqYl7uaV4CWdmZQSSS
  ST86FJBsafEJK15KD05ioNk4G6Yeg0V9bVCmZpXt08sB2hJ8DJ2Tn7H/2Q==" />

Data URLs are composed of four parts: the data: prefix, a media type indicating the type of data (image/jpg), an optional base64 token to indicate that the data is base64 encoded, and the base64 encoded image data itself.

data:[<media type>][;base64],<data>

To base64 encode an image from the command line, use:

$ base64 placeholder.jpg

To base64 encode an image in PHP, use:

$data =  base64_encode(file_get_contents('placeholder.jpg'));

What is the advantage of embedding a base64 encoded image using a data URL? It eliminates HTTP requests as the browser doesn't have to set up new HTTP connections to download the images. Fewer HTTP requests usually means faster page load times.
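Putting the pieces together, here is a minimal Node.js sketch (my own illustration; the placeholderImgTag helper is hypothetical, not part of the site's code) that wraps image bytes in a data-URL <img> tag:

```javascript
// Sketch: wrap base64-encoded image bytes in a data-URL <img> tag.
// `bytes` stands in for the contents of placeholder.jpg; on a real
// site you would read the file, e.g. fs.readFileSync('placeholder.jpg').
function placeholderImgTag(bytes) {
  const base64 = Buffer.from(bytes).toString('base64');
  return '<img class="lazy" src="data:image/jpg;base64,' + base64 + '" />';
}

// JPEG files start with the bytes FF D8 FF, which encode to "/9j/" —
// the same prefix visible in the embedded image above.
console.log(placeholderImgTag(Uint8Array.from([0xFF, 0xD8, 0xFF])));
// → <img class="lazy" src="data:image/jpg;base64,/9j/" />
```

Because the placeholder is only a few hundred bytes, the base64 overhead (about 33%) is negligible compared to the saved HTTP request.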

Replacing placeholder images with real images

Next, I used JavaScript's IntersectionObserver to replace the placeholder image with the actual image when it comes into the browser's viewport. I followed Jeremy Wagner's approach shared on Google Web Fundamentals Guide on lazy loading images — with some adjustments.

It starts with the following HTML markup:

<img class="lazy" src="placeholder.jpg" data-src="original.jpg" />

The three relevant pieces are:

  1. The class="lazy" attribute, which is what you'll select the element with in JavaScript.
  2. The src attribute, which references the placeholder image that will appear when the page first loads. Instead of linking to placeholder.jpg I embed the image data using the data URL technique explained above.
  3. The data-src attribute, which contains the URL to the original image that will replace the placeholder when it comes in focus.

Next, we use JavaScript's IntersectionObserver to replace the placeholder images with the actual images:

document.addEventListener('DOMContentLoaded', function() {
  var lazyImages = [].slice.call(document.querySelectorAll('img.lazy'));

  if ('IntersectionObserver' in window) {
    let lazyImageObserver = new IntersectionObserver(
      function(entries, observer) {
        entries.forEach(function(entry) {
          if (entry.isIntersecting) {
            let lazyImage = entry.target;
            lazyImage.src = lazyImage.dataset.src;
            lazyImageObserver.unobserve(lazyImage);
          }
        });
    });

    lazyImages.forEach(function(lazyImage) {
      lazyImageObserver.observe(lazyImage);
    });
  }
  else {
    // For browsers that don't support IntersectionObserver yet,
    // load all the images now:
    lazyImages.forEach(function(lazyImage) {
      lazyImage.src = lazyImage.dataset.src;
    });
  }
});

This JavaScript code queries the DOM for all <img> elements with the lazy class. The IntersectionObserver is used to replace the placeholder image with the original image when the img.lazy elements enter the viewport. When IntersectionObserver is not supported, the images are replaced on the DOMContentLoaded event.

By default, the IntersectionObserver's callback is triggered the moment a single pixel of the image enters the browser's viewport. However, using the rootMargin property, you can trigger the image swap before the image enters the viewport. This reduces or eliminates the visual or perceived lag time when swapping a placeholder image for the actual image.

I implemented that on my site as follows:

const config = {
    // If the image gets within 250px of the browser's viewport, 
    // start the download:
    rootMargin: '250px 0px',
  };

let lazyImageObserver = new IntersectionObserver(..., config);

Lazy loading images drastically improves performance

After making these changes to my site, I did a new https://webpagetest.org benchmark run:

A diagram that shows page load times for dri.es after making performance improvements

You can clearly see that the page became a lot faster to render:

  • The document is complete after 0.35 seconds (blue vertical line) instead of the original 7.275 seconds.
  • No images are loaded before the document is complete, compared to 13 images being loaded before.
  • After the document is complete, one image (purple horizontal bar) is downloaded. This is triggered by the JavaScript code as the result of one image being above the fold.

Lazy loading images improves web page performance by reducing the number of HTTP requests, and consequently reduces the amount of data that needs to be downloaded to render the initial page.

Is base64 encoding images bad for SEO?

Faster sites have an SEO advantage, as page speed is a ranking factor for search engines. But lazy loading might also be bad for SEO, as search engines have to be able to discover the original images.

To find out, I headed to Google Search Console. Google Search Console has a "URL inspection" feature that allows you to look at a webpage through the eyes of Googlebot.

I tested it out with my Acadia National Park blog post. As you can see in the screenshot, the first photo in the blog post was not loaded. Googlebot doesn't seem to support data URLs for images.

A screenshot that shows Googlebot doesn't render placeholder images that are embedded using data URLs

Is IntersectionObserver bad for SEO?

The fact that Googlebot doesn't appear to support data URLs does not have to be a problem. The real question is whether Googlebot will scroll the page, execute the JavaScript, replace the placeholders with the actual images, and index those. If it does, it doesn't matter that Googlebot doesn't understand data URLs.

To find out, I decided to conduct an experiment. Yesterday, I published a blog post about Matt Mullenweg and me visiting a museum together. The images in that blog post are lazy loaded and can only be discovered by Google if its crawler executes the JavaScript and scrolls the page. If those images show up in Google's index, we know there is no SEO impact.

I'm not sure how long it takes for Google to make new posts and images available in its index, but I'll keep an eye out for it.

If the images don't show up in Google's index, lazy loading might impact your SEO. My solution would be to selectively disable lazy loading for the most important images only. (Note: even if Google finds the images, there is no guarantee that it will decide to index them — short blog posts and images are often excluded from Google's index.)

Conclusions

Lazy loading images improves web page performance by reducing the number of HTTP requests and data needed to render the initial page.

Ideally, over time, browsers will support lazy loading images natively, and some of the SEO challenges will no longer be an issue. Until then, consider adding support for lazy loading yourself. For my own site, it took about 40 lines of JavaScript code and 20 lines of additional PHP/Drupal code.

I hope that by sharing my experience, more people are encouraged to run their own sites and to optimize their sites' performance.


Feb 20 2019
Feb 20

Hi everyone,

We are excited to share a few program updates on Drupal Mountain Camp, as the team behind the scenes is working hard preparing the last bits before the conference in just 2 weeks.

We are extremely grateful for all the quality sessions people have submitted. The full schedule over 4 days includes 9 workshops, 2 keynotes, 4 featured sessions and 42 regular sessions across 3 different tracks. 

Besides the already promoted keynotes, we would like to highlight the following featured sessions:

Thanks to the collaboration with the Drupal Recording Initiative by Kevin Thull, we'll be able to provide video recordings for you after the conference.

Contribution is a key topic for Drupal Mountain Camp. Make sure to sign up for one of the 7 different initiatives or propose your own using our contribution sign-up sheet.

We also updated our social events page so you can start preparing for some fun in the snowy Swiss mountains.

So far, more than 95 tickets have been sold. Regular tickets are available for CHF 120 until the 1st of March; after that, tickets cost CHF 140.

We are looking forward to seeing you at Drupal Mountain Camp in Davos, 7-10 March 2019.

Josef / dasjo on behalf of the Drupal Mountain Camp team.

Feb 20 2019
Feb 20

by David Snopek on February 20, 2019 - 2:21pm

As you may know, Drupal 6 has reached End-of-Life (EOL) which means the Drupal Security Team is no longer doing Security Advisories or working on security patches for Drupal 6 core or contrib modules - but the Drupal 6 LTS vendors are and we're one of them!

Today, there is a Critical security release for the Link module to fix a Remote Code Execution (RCE) vulnerability.

The Link module provides a field for storing links.

The module didn't properly validate the field data.

This is mitigated by the fact that the issue is only known to be exploitable via the Services module.

See the core security advisory for Drupal 7 & 8 for more information. Drupal 6 core is not affected, only the Link module.

Here you can download the Drupal 6 patch or the full release.

If you have a Drupal 6 site using the Link module, we recommend you update immediately! We have already deployed the patch for all of our Drupal 6 Long-Term Support clients. :-)

If you'd like all your Drupal 6 modules to receive security updates and have the fixes deployed the same day they're released, please check out our D6LTS plans.

Note: if you use the myDropWizard module (totally free!), you'll be alerted to these and any future security updates, and will be able to use drush to install them (even though they won't necessarily have a release on Drupal.org).

Feb 20 2019
Feb 20

Over the years, as Drupal has evolved, the upgrade process has become a bit more involved; as with most web applications, Drupal's increasing complexity extends to deployment, and whether you end up running Drupal on a VPS, a bare metal server, in Docker containers, or in a Kubernetes cluster, you should formalize an update process to make sure upgrades are as close to non-events as possible.

Gone are the days (at least for most sites) where you could just download a 'tarball' (.tar.gz) from Drupal.org, expand it, then upload it via SFTP to a server and run Drupal's update.php. That workflow (and even a workflow like the drush up of old) might still work for some sites, but it is fragile and prone to cause issues whether you notice them or not. Plus, the old drush up command is no longer available in modern versions of Drush!

So without further ado, here is the process I've settled on for all the Drupal 8 sites I currently manage (note that I've converted all my non-Composer Drupal codebases to Composer at this point):

  1. Make sure your local codebase is up to date with what's currently in production (e.g. git pull master).
  2. Reinstall your local site in your local environment so it is completely reset (e.g. blt setup or drush site-install --existing-config). I usually use a local environment like Drupal VM or a Docker Compose environment, so I can usually just log in and run one command to reinstall Drupal from scratch.
  3. Make sure the local site is running well. Consider running behat and/or phpunit tests to confirm they're working (if you have any).
  4. Run composer update (or composer update [specific packages]).
  5. On your local site, run database updates (e.g. drush updb -y or go to /update.php). This is important because the next step (exporting config) can cause problems if you're dealing with an outdated schema.
  6. Make sure the local site is still running well after updates complete. Run behat and/or phpunit tests again (if you have any).
  7. If everything passed muster, export your configuration (e.g. drush cex -y if using core configuration management, drush csex -y if using Config Split).
  8. (Optional but recommended for completeness) Reinstall the local site again, and run any tests again, to confirm the fresh install with the new config works perfectly.
  9. If everything looks good, it's time to commit all the changes to composer.lock and any other changed config files, and push it up to master!
  10. Run your normal deployment process to deploy the code to production.

All done!
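The happy path above can be condensed into a script. This is a sketch only: the run wrapper echoes each command in dry-run mode so the sequence can be inspected without a Drupal site (set DRY_RUN=0 to execute for real), it assumes Drush 9+ with core configuration management, and the paths in the final commit are illustrative. The manual verification steps (3, 6 and 8) are deliberately omitted:

```shell
#!/bin/sh
# Condensed update workflow (dry-run by default; DRY_RUN=0 to execute).
set -e
DRY_RUN=${DRY_RUN:-1}

run() {
  if [ "$DRY_RUN" = 1 ]; then echo "would run: $*"; else "$@"; fi
}

run git pull origin master                    # 1. sync local code with production
run drush site-install --existing-config -y  # 2. reset the local site from config
run composer update                          # 4. update core and contrib packages
run drush updb -y                            # 5. apply database updates
run drush cex -y                             # 7. export configuration changes
# (steps 3, 6 and 8 - running tests and re-verifying - are manual)
run git add composer.lock config/            # paths are illustrative
run git commit -m "Update Drupal core and contrib"
```

Swap drush cex for drush csex if you use Config Split, as noted in step 7.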

That last step ("Run your normal deployment process") might be a little painful too, and I conveniently don't discuss it in this post. Don't worry, I'm working on a few future blog posts on that very topic!

For now, I'd encourage you to look into how Acquia BLT builds shippable 'build artifacts', as that's by far the most reliable way to ship your code to production if you care about stability! Note that for a few of my sites, I use a more simplistic "pull from master, run composer install, and run drush updb -y" workflow for deployments. But that's for my smaller sites where I don't need any extra process and a few minutes' outage won't hurt!

Feb 20 2019
Feb 20
Project: Drupal core
Date: 2019-February-20
Security risk: Highly critical 23/25 AC:None/A:None/CI:All/II:All/E:Exploit/TD:Uncommon
Vulnerability: Remote Code Execution
CVE IDs: CVE-2019-6340
Description: 

Some field types do not properly sanitize data from non-form sources. This can lead to arbitrary PHP code execution in some cases.

A site is only affected by this if one of the following conditions is met:

  • The site has the Drupal 8 core RESTful Web Services (rest) module enabled and allows GET, PATCH or POST requests, or
  • the site has another web services module enabled, like JSON:API in Drupal 8, or Services or RESTful Web Services in Drupal 7.

(Note: The Drupal 7 Services module itself does not require an update at this time, but you should still apply other contributed updates associated with this advisory if Services is in use.)

Updates

  • 2019-02-22: Updated risk score given new information; see PSA-2019-02-22. The security risk score has been updated to 23/25 as there are now known exploits in the wild. In addition, any enabled REST resource end-point, even if it only accepts GET requests, is also vulnerable. Note this does not include REST exports from Views module.
Solution: 

Versions of Drupal 8 prior to 8.5.x are end-of-life and do not receive security coverage.

To immediately mitigate the vulnerability, you can disable all web services modules, or configure your web server(s) to not allow GET/PUT/PATCH/POST requests to web services resources. Note that web services resources may be available on multiple paths depending on the configuration of your server(s). For Drupal 7, resources are for example typically available via paths (clean URLs) and via arguments to the "q" query argument. For Drupal 8, paths may still function when prefixed with index.php/.
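If you go the web-server route, a mitigation for Apache might look like the following sketch. It assumes the JSON:API module's default /jsonapi path prefix; adjust the pattern to whatever web services resources your site actually exposes, and note the advisory's warning that Drupal 8 paths may also respond when prefixed with index.php/:

```apacheconf
# Temporarily deny all requests to JSON:API endpoints, including the
# index.php/ prefixed variants, until the site is patched.
<LocationMatch "^/(index\.php/)?jsonapi">
  Require all denied
</LocationMatch>
```

This blocks the endpoints entirely rather than filtering methods, which is the safer stopgap until you can update.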

Feb 20 2019
Feb 20

Agiledrop is highlighting active Drupal community members through a series of interviews. Now you get a chance to learn more about the people behind Drupal projects.

This week we talked with Amber Matz, Production Manager and Trainer at Drupalize.Me. In addition to these two important roles, Amber is actively involved in a number of projects in the Drupalverse, the most notable one currently being the Builder Track at DrupalCon Seattle. Have a read if you’d like to find out more about her journey with Drupal and her insights on its future.

1. Please tell us a little about yourself. How do you participate in the Drupal community and what do you do professionally?

My username on drupal.org is Amber Himes Matz. I participate in the Drupal Community in a number of ways. The bulk of my volunteer time lately has been consumed by the program team for the Builder Track at DrupalCon Seattle, where we review, select, schedule, and support speakers and sessions for the upcoming ‘con. I’m working on (as I am able) moving two issues forward, Add experimental module for Help Topics and new Draft "Getting Started" Outline & Guide. I’m also part of the Community Cultivation Grants Committee and like to keep tabs on what’s happening amongst Drupal camp organizers in Slack. (In February 2019, I was the lead organizer for the Pacific NW Drupal Summit in Portland, OR.) Professionally, I work on Drupalize.Me as Production Manager and Trainer for the platform, which features Drupal tutorials in both written and video format.

2. When did you first come across Drupal? What convinced you to stay, the software or the community, and why?

I was a web developer for an organization for many years working mostly with PHP and MySQL on the backend and HTML/CSS on the frontend. I coded a LOT of forms and form processing scripts. I discovered Drupal as an escape from that tedium. I stuck with it because I needed work and wanted a better job, which I eventually got. I stay with the Drupal community because of a rewarding and satisfying job, great people (local, global, and online), and the opportunity to travel.

3. What impact has Drupal made on you? Is there a particular moment you remember?

My career benefited greatly and singularly from showing up to a local Drupal user group meeting. From that first meeting, I made a connection which led only weeks later to a job interview and my first job as a dedicated Drupal developer (which ended up being Developer + Client Manager + Project Manager). After this job experience, I was hired at Lullabot as a trainer for Drupalize.Me. (Drupalize.Me is now part of a sister company to Lullabot.)

4. How do you explain what Drupal is to other, non-Drupal people?

Drupal is a platform to structure and present loads of content in a scalable way.

5. How did you see Drupal evolving over the years? What do you think the future will bring?

I think the challenge for the Drupal community is to provide straightforward and accessible means for anyone to install, use, and customize Drupal. The great “tout” of Drupal is its scalability. And it certainly has and does scale. This presents a great challenge. How do we provide functionality, tools, documentation, and training for a platform that can be used for such a wide range of use cases? How do we make it easier to use the kinds of tools that are necessary for such a complex platform? I know that a lot of people are hard at work on these kinds of problems. I think the future of Drupal will mean gaining a better understanding of our user base and not assuming that everyone falls into an “enterprise” category or whatever.

6. What are some of the contributions to open source code or to the community that you are most proud of?

At the moment, I’m most proud of the line-up of speakers for the Builder Track for DrupalCon Seattle. The program team worked really hard choosing speakers amid a lot of changes to the ‘Con.

7. Is there an initiative or a project in Drupal space that you would like to promote or highlight?

Register for DrupalCon Seattle!

8. Is there anything else that excites you beyond Drupal? Either a new technology or a personal endeavor.

I’ve been a web developer since about 2001 or so. That has added up to a lot of raging against the screen. I have discovered open source hardware and the “maker” community and have discovered the joy and pleasure of coding on a variety of microcontrollers and single-processor boards in a variety of applications from breadboarding to learn concepts in electronics to sewing with conductive thread to making a variety of fun and whimsical projects. Working with physical computing objects has brought back a level of sanity to the otherwise (come on, admit it) insane world of web development.

Feb 20 2019
Feb 20

With growth comes changes and today we're introducing changes to our legal terms and pricing. The Basic subscription remains the same at $78 USD per year and the Professional subscription was bumped from $249 to $360 USD per year. The Enterprise subscription now starts at $3000 USD with a $1000 USD set-up fee, which covers the one-time job of collecting brand logos and brand names and setting up our scripts to produce the white-labeled/re-branded products automatically. The Enterprise subscription will now be charged per month rather than per year.

New terms of service: https://www.sooperthemes.com/legal/terms
Services catalog:  https://www.sooperthemes.com/legal/services-catalog

Drupal is becoming more valuable but more expensive

3 years after Drupal 8's release the results are in: Drupal is still relevant but Drupal 8 is more expensive to implement and Drupal's adoption curve has tapered off. I don't think this is necessarily bad. Drupal's leadership made a decision to make Drupal the best Enterprise-grade CMS, not the best everyman's CMS. The result is that Drupal's steep learning curve became steeper yet, and costs of training and hiring Drupal developers increased accordingly. Our price bump is not merely a reaction to a decrease in the volume of Drupal websites in need of our solutions, it is also part of our learning process. 

Sooperthemes is growing but it is not growing enough

Since our Drupal 8 launch last year business at Sooperthemes is better than ever. But with our growing popularity comes a big increase in workload from the customer support forum, customer success tasks, and managing simple tasks like account administration, taxes, sales questions. It adds up to a lot of work. Currently our prices are too low for the increase in customers to pay for new staff to take on the additional workload. We have been investing a lot of effort in training interns but the time has come to move to a more sustainable solution.

Without changes Sooperthemes is not ready for the future. This price increase in the Professional subscription is one part of our strategy for sustainable growth.

Another change is getting better at charging big clients more than small clients. We want to keep our products accessible to the entire Drupal community. While we love our enterprise clients, we don't want to develop an amazing product just for the Drupal elite who can afford hundreds or thousands of dollars per month per site. Therefore we're introducing new licensing terms to charge users based on the scale of their usage of our flagship product Glazed Builder.

We updated our terms so that we can charge websites fairly, not just by the number of domain (site) licenses but also by the number of users working with our Glazed Builder product. Some examples to illustrate why I think this is fair:

  1. Freelance Music teacher's website with 1 domain license: $78 USD per year including updates and support.
  2. A Drupal agency with currently 10 clients on our products: $360 USD per year.
  3. A fast-moving consumer goods enterprise with 40 enterprise domain licenses: ~3000 USD per month.
  4. If Tesla.com used our products for their marketing content, job portal, community forum, online stores, and online tools, in 37 languages, a domain license would cost them just $78 USD per year, or 6 dollars and 50 cents per month.

I think the last example illustrates why it makes sense to introduce this new lever to help Sooperthemes grow sustainably. To learn exactly how our new licensing terms work, make sure to read our services catalog.

Provide More Value To Enterprise Clients

In order for Sooperthemes to be successful in the future we will need to work on signing on more Enterprise clients. We're going to work on adding more features that are important to enterprise clients. Starting today we offer better support options and dedicated support developers for cases in the Enterprise tier. If you want to share ideas on what you think should differentiate the Enterprise subscription tier from the other tiers, don't hesitate to send me an email here: https://www.sooperthemes.com/contact

I would be especially interested in hearing what it would take for your business to purchase an enterprise subscription.

Feb 20 2019
Feb 20

A month ago, Matt Mullenweg, co-founder of WordPress and founder of Automattic, visited me in Antwerp. While I currently live in Boston, I was born and raised in Antwerp, and also started Drupal there.

We spent the morning together walking around Antwerp and visited the Plantin Moretus Museum.

The museum is the old house of Christophe Plantin, where he lived and worked around 1575. At the time, Plantin had the largest printing shop in the world, with 56 employees and 16 printing presses. These presses printed 1,250 sheets per day.

Today, the museum hosts the two oldest printing presses in the world. In addition, the museum has original lead types of fonts such as Garamond and hundreds of ancient manuscripts that tell the story of how writing evolved into the art of printing.

The old house, printing business, presses and lead types are the earliest witnesses of a landmark moment in history: the invention of printing, and by extension, the democratization of publishing, long before our digital age. It was nice to visit that together with Matt as a break from our day-to-day focus on web publishing.

An old printing press at the Plantin Moretus Museum

Dries and Matt in front of the oldest printing presses in the world

An old globe at the Plantin Moretus Museum

Feb 20 2019
Feb 20

You don’t want people to treat your website as an outcast. You don’t want to be the ugly duckling in this sharp, serious and extremely competitive world. 

Correct?

Thus, owning a professional-looking website is important for every sort of business. It doesn’t really matter whether you are planning to make money or not: treating your website just like your employees is a must.

Why?

Well, because it creates an impression of your business, a place where people come to see who you are and what you want. Whether it is a big e-commerce site or a one-pager, a good website will always bring value to you and your company.


As important as the website is for you, the theme contributes greatly to the user experience and functionality of a particular site.

Your theme is the overall look, feel and styling of your website, and the very first thing the audience sees. And nothing can beat Drupal in this area.

Beginning with the Zurb Foundation 

So here is Zurb foundation for you.

Zurb Foundation is a prototyping theme, used to prototype in the browser. It allows you to rapidly create a website or application while leveraging well-tested mobile and responsive technology.

The front-end framework is a collection of HTML, CSS, and JavaScript design patterns. These design patterns save users time by helping them dodge boring boilerplate code. Sites built on Foundation work well across multiple screens, including laptops, mobile phones, PCs and iPads, because Zurb Foundation is a responsive framework that uses CSS media queries and a mobile-first approach.

Image of the Zurb foundation logo which has a white yeti holding a phone and a laptop


Different versions of Zurb 
 

 Infographic of different versions of Zurb in a linear form


Zurb Foundation 3

If the primary goal for your website is rapid prototyping, then Zurb Foundation 3 is for you. This theme is developed in Sass, a powerful CSS preprocessor that helps you write clean, organized CSS that can be maintained easily over time.

One of the biggest advantages of Zurb Foundation 3 was the shift of the framework's development to Sass and Compass rather than pure CSS.

Sass grants the user variables, functions and powerful mixins that speed up development of the framework and make the code more concise.


Zurb Foundation 4 

This version of Zurb Foundation brought many new functionalities and changes to the framework. Starting as a mobile-friendly theme, Zurb Foundation 4 supported complex layouts, grids, and media queries.

This version brought flexible and powerful built-in tools, accessible across different screen sizes, with a new, simpler grid syntax that better reflects how the grid works.

Apart from this, Zurb Foundation 4 is all about semantics. Users were granted the power to remove all the presentational classes from the markup and write their own with built-in Sass mixins and extensions.

Zurb Foundation 4 also presented users with some splendid plugins that work well with AJAX.


Zurb Foundation 5

Fast, strong and a better foundation, Zurb Foundation 5 is great for designers and developers that use it in their workflow. This version specifically focused on smart coding, introducing users to better coding practices, and for teams it offers a common starting point. The advantage: it helps them put together all interactions and workflows in a shorter period of time.


Zurb Foundation 6

Foundation 6 has been designed to bring production efficiency to your project. It includes a wide range of modular and flexible components that are not only lightweight but easy to maintain as well.

Foundation 6 is also half the size of Foundation 5; in other words, there was a 50% reduction in code. The code comes with ARIA attributes and roles, alongside instructions.

The base styles in Foundation 6 act as a coded framework, which makes the user's work easier and more flexible. Simpler CSS styles allow users to easily modify them to fit their needs.


Zurb Foundation or Bootstrap?

Any talk of Zurb Foundation and its different versions tends to bring up Bootstrap.

How?

Well, because Bootstrap and Zurb Foundation are the major players when it comes to web design frameworks. Designers and developers often seem to get lost in the mist while choosing between the two.

  • Community: Foundation's community is smaller compared to Bootstrap's, but it is growing, with decent technical support. Bootstrap has the larger, more established community.
  • CSS Preprocessor: Foundation supports Sass. Bootstrap also supports Sass, though it historically sat in the Less camp.
  • Customization: Foundation takes a minimalist approach to its pre-built UI components, giving designers room to create new things. Bootstrap consists of a basic GUI customizer which most of the time doesn't allow users to create something new.
  • Browser Support: Foundation supports Chrome, Mozilla Firefox, Safari, Opera and Android. Bootstrap supports Chrome, Mozilla Firefox, Safari, Opera, Android, and IE8.
  • Grid System: Foundation was the first to go mobile-friendly and adopt the grid system. Bootstrap has had ample time to bridge the gap in terms of features and functionality.
  • Advanced Feature Elements: Foundation offers the X-Y grid, responsive tabs, off-canvas and more. Bootstrap is customizable with a variety of designs.
  • Releases: Zurb has more releases, tracking development requirements: six so far, with the seventh yet to come. Bootstrap has had four releases; its first was on August 19, 2011.

Why choose Drupal for Zurb foundation?

When building a website what is the very first thing which the user sees?

Content? Functionalities? Information? Or Layout?

Selecting the theme and functionality of a website is one of the most important decisions a website owner has to make, and Drupal is a CMS that can help you achieve this task. It has about 1,316 themes, where each theme has a variety of strengths.

Among them, the Zurb Foundation theme is grid-based and mobile-first. When used with Drupal, Zurb Foundation makes the front-end developer more efficient and helps you achieve various functionalities.

  • The Foundation XY grid allows the user to easily create and lay out Drupal content.
  • Foundation's top bar is combined with Drupal's menu navigation and works well with submenus.
  • The off-canvas regions in Zurb Foundation are available as Drupal blocks, and menu bars placed in these regions are automatically themed.

Creating Sub Themes 

It is recommended that the user create a sub-theme, which allows Zurb Foundation to be applied to any website without modifying the base theme.

There are two ways of creating a sub-theme:

Drush: Drush provides command-line access to common Drupal administrative tasks and configuration. The user can generate a sub-theme from the starter kit with a single Drush command (check the theme's documentation for the exact command name).

Manually: The user can also create a sub-theme manually, by copying the theme's starter folder into the custom themes directory and then renaming the starter files.
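A minimal sketch of the manual route, assuming the theme ships a STARTER starter kit (the folder layout and file names vary between zurb_foundation releases, so the starter files are mocked here to keep the script self-contained):

```shell
#!/bin/sh
# Sketch of manual sub-theme creation; paths and names are illustrative.
set -e

THEME_SRC="themes/zurb_foundation/STARTER"   # starter kit shipped with the theme (assumed path)
SUBTHEME="themes/custom/mytheme"

# Mock starter kit so this sketch runs without the real theme present.
mkdir -p "$THEME_SRC"
echo "name: STARTER" > "$THEME_SRC/STARTER.info.yml"
echo "global-styling: {}" > "$THEME_SRC/STARTER.libraries.yml"

# 1. Copy the starter kit into your custom themes directory.
mkdir -p "$(dirname "$SUBTHEME")"
cp -r "$THEME_SRC" "$SUBTHEME"

# 2. Rename the starter files after your sub-theme's machine name.
for f in "$SUBTHEME"/STARTER.*; do
  mv "$f" "$(echo "$f" | sed 's/STARTER\./mytheme./')"
done

# 3. Replace occurrences of STARTER inside the copied files.
sed -i 's/STARTER/mytheme/g' "$SUBTHEME"/mytheme.*

ls "$SUBTHEME"
```

After this, enable the new sub-theme in Drupal's appearance settings and make all customizations there rather than in the base theme.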

Contributed Drupal modules that can be used with Zurb

The Zurb foundation aims to do theming without the help of any dependencies but there are many modules that help in theming better. 

Modules like panels, block class, display suite, special menu items. 

Panels: The Panels module allows a site administrator to create customized layouts for multiple uses. It provides an API that allows the configuration and placement of blocks into regions.

Block Class: Block class allows the user to add classes to any block through the block’s configuration interface. 

Display Suite: Display suite allows the users to take control over content fully using drag and drop interface. 

Special Menu Items: This module allows the user to create dropdown dividers and all sorts of headers in the navigation menu.

Case Study on MIT Press CogNet 

MIT Press CogNet is an essential online resource for students and scholars in the cognitive sciences.

The objective of this project was to provide a completely responsive design on all devices. The organization worked closely with the CogNet team to develop a basic wireframe, and a custom Drupal theme based on Zurb Foundation was built.

Zurb foundation theme and sass for CSS preprocessing were used to rework the already existing theme. To guarantee a seamless experience on any type of screen the developers used jQuery to construct slick navigation, scrolling, and content exploration. 

The results were eye-catching. From being a desktop-only website, MIT Press CogNet was transformed into an accessible site that users could view on any device. The biggest achievement was that the whole project was completed under the budget provided by the organization.

Future of Zurb Foundation

Zurb is yet to launch another version of the framework, built on a new architecture (ITCSS + SMACSS).

Zurb Foundation 7 separates the view layer from the logic layer to make it easy and reliable.

In short, there are two major changes that would take place in Zurb Foundation 7.

The first is a super-powerful pluggable JavaScript architecture that would dramatically improve your freedom to shift between JavaScript frameworks. UI frameworks today tend to either commit to one framework or maintain different, independent ports for different JS frameworks.

The second major change is the ITCSS-based architecture with the usage of SMACSS. This would make it easier to build and maintain scalable sites and themes.

Conclusion 

Remember that a theme is not just the visuals: it also shapes the journey and experience users have while going through your website.

At OpenSense labs, we understand how important it is to create a website that matches your goals and objectives. Ping us at [email protected] so that we can arrange services to make your website what you have always hoped for. 

Feb 19 2019
Feb 19
Performance and scale for all levels of digital commerce


Drupal Commerce is a fantastic open source ecommerce platform, but there is a common misconception that it is lacking when it comes to performance and scalability. This is not true! Drupal Commerce is extremely fast and is more than capable of scaling from small business all the way to enterprise level ecommerce. We have proof and it’s right here for you to view.

Download the Drupal 8 Commerce Performance Benchmarks Report (PDF)

About the report

Shawn McCabe, Acro Media’s CTO, put Drupal Commerce to the test to see how it performed on a number of different AWS configurations, ranging from single server setups all the way up to multi-server configurations.

He ran simulated traffic through Drupal Commerce, mimicking actual traffic as closely as possible, testing concurrent users, site speed, transactions per second, and a number of other useful technical metrics.

The smallest server configuration tested was capable of handling 130 concurrent users flawlessly, with a throughput of 13.59 transactions per second. On the other hand, the largest configuration could handle 52,000 concurrent users with a throughput of 1,305.85 transactions per second.

The report goes further and includes how the tests were set up, their limitations and methodology, all of the server configurations details and, of course, the test results. This testing puts the performance and scalability question to rest, backed by hard data that anyone can reproduce. Drupal Commerce is a viable option for ecommerce that businesses of any size can use and grow with in the future.

About Drupal Sun

Drupal Sun is an Evolving Web project. It allows you to:

  • Do full-text search on all the articles in Drupal Planet (thanks to Apache Solr)
  • Facet based on tags, author, or feed
  • Flip through articles quickly (with j/k or arrow keys) to find what you're interested in
  • View the entire article text inline, or in the context of the site where it was created

See the blog post at Evolving Web

Evolving Web