Aug 02 2019

A few days ago, on Wednesday, July 31st, Acquia held a webinar on digital experience titled “Think Bigger: Your Digital Experience is More Than Your Website”. 

The two presenters were Justin Emond, CEO of Third & Grove, and Tom Wentworth, SVP of Product Marketing at Acquia. 

They talked more generally about the experience economy and the recent important changes in digital experiences, and more specifically about digital experience platforms (DXP); namely, why an open DXP is the best solution and how Acquia’s services can serve as the foundation for an open DXP.

As with all Acquia webinars, a recording will be publicly available soon for anyone who wasn’t able to attend it or who wants to revisit certain points. In the meantime, we hope this recap will fill in enough gaps to make the wait easier or maybe even compel you to start rethinking your digital strategy today in preparation for the future.

Experience is everywhere

As Tom states, we are now in the “experience economy”, with 1:1 personalization a necessity for brands that plan to win in this economy. 

Today, everything is essentially an experience; we’re surrounded and bombarded by them. Competition among brands, too, works mostly on the basis of customer experience, which means brands need to constantly focus on delivering the best possible experience if they want to stand out. 

The physical world is full of amazing, memorable experiences (Disney, for example, has decades of them under its belt and is hence able to focus on all their minor details). But what about the digital world? What are our most memorable experiences in the digital sphere?

In both worlds, it takes a lifetime of great experiences to build an iconic brand. In the digital world, however, a single bad experience can undo a lot of positive ones and even destroy a brand, and coming back from that is extremely difficult.

Why is it so hard to create great digital experiences?

The recent explosion of channels has made user journeys hard to predict: customers interact with brands through channels that didn’t even exist a few years ago, and channels that haven’t yet been invented will become brand touchpoints in their turn.

Current martech systems are siloed. Each focuses on a different part of the customer journey and, as a consequence, each has its own view of that journey. But not only are the tools siloed - the teams themselves are organized in silos as well.

This kind of organization sometimes makes it impossible to deliver an integrated customer experience. And the problem becomes even worse at scale, where the technological and organizational barriers to delivering a great 1:1 customer experience are even greater.

So, how can you tackle this and win out in the experience economy?

Well, the most important thing is breaking down the silos, on both the technological and the organizational level. In order to deliver an integrated digital experience, you need one common view of the customer that is consistent across all channels.

This brings obvious advantages: the ability to get to market and take advantage of new channels faster, more consistent user experiences, reusable content, automated decision-making, better governance, and more.

In the “old” internet, every brand needed a website - this is also why the CMS was created, as a better way to manage those websites. Today, however, a website alone isn’t enough; every brand needs a digital experience platform - an open DXP.

Planning your optimal DXP

But isn’t a DXP essentially the same thing as a CMS? It’s true that a DXP is a product, a platform, a solution - but, at bottom, it’s a strategy for how you’re going to interact with your customers to achieve your goals.

So, a DXP is a strategic perspective on how to approach this problem, whereas a CMS is a tactical solution. 

Web content management

The web CMS is still the basis for any DXP (“content is king”). The focus, then, should be on specific use cases from which you can work. Some of the most common of these are:

  • Multichannel delivery: this use case rests on the perception of content as a service, content in the sense of enabling people and making their lives easier. An API-first strategy is vital for this, as you need to be open with distributing and sharing content with other platforms.
  • Cross-channel strategy: a bit more complex than the previous point, here the focus is more on mapping the customer journey and figuring out how the customer moves through multiple touchpoints of interaction and what the entire integrated story then is.
  • Campaign management: the most important thing here is to be aware of how the CMS, personalization and marketing tools all interact. They need to work really well together in order to get the most out of the campaign.
  • Commerce: the recent emergence of cloud commerce platforms, such as BigCommerce and Shopify Plus, has made it possible to invest less into the backend (since it’s in the cloud) and allocate a bigger part of your budget to other areas, such as marketing. 
  • Customer data: what you do with data is more important than how you collect it or store it. The question here is: how are you going to extrapolate the insights and how can you best leverage them?
  • Work backwards: the future is uncertain and unpredictable. If you acknowledge that, you can work backwards from it, starting with the realization that your DXP will have to be adaptive to change and new tools; we are in an era of unprecedentedly fast digital innovation, after all.

Trends

1. If you want agile marketing, you need high developer velocity.

In software development, agile has completely replaced the waterfall approach. Now we’re starting to see this as a marketing trend as well: small releases, continuous iteration, better insights on the performance of a campaign and consequently the ability to adapt faster. But the catch is - successful agile marketing demands high developer velocity.

2. If you need cutting-edge commerce, you need to be disruption-ready.

With e-commerce becoming the most popular form of shopping, innovations in this sphere will be particularly important for brands, so they will have to be especially adaptive in this area. The cloud commerce platforms mentioned earlier are an example of these very recent breakthrough technologies.

3. If you need a decoupled or headless approach, don’t go with a technology that wants to do several different things at the same time. 

Very likely, such a tool won’t do any of those things as well as you need it to. Because of this, a microservices approach is becoming more and more popular - for example, a JavaScript framework on the front end in combination with one (or more) CMSes.

Open DXP is the only DXP that has it all

Because of all the considerations and trends just discussed, you need to embrace an open architecture for your DXP, one without the restrictions of a lock-in.

Unified content and data create a seamless 1:1 customer experience. Acquia helps its clients bring together all the data obtained from their customers, connect that data into a single, unified view of the customer, and deliver content to the customer through whichever channels they use to interact with a brand.

Acquia Open Experience Platform

The Acquia Open Experience Platform consists of two parts: the marketing hub and the experience factory. The latter is built on the Drupal CMS and then extended with preconfigured features that are ideal for mid-market organizations. 

So, with all the advanced integrations such as Mautic or Acquia Lift, how can you achieve better business outcomes? In what way do they empower you? The answer is: they enable you to connect the right person at the right time with the right content on the right channel.

The “open” refers to more than just open source; it’s about being an open platform. In this context, that means Acquia’s open DXP can be used alongside competing products; whatever technologies its clients need, Acquia wants to make them all work better together.

In this sense, Acquia’s DXP is positioned as an open alternative to proprietary platforms such as Adobe Experience Manager or Salesforce’s Lightning Platform.

Some additional resources

Q&A session

Q: Can an organization get started with only Acquia Lightning and then add on other services later?
A: Absolutely; there are some foundational investments you really need, such as Lightning. Then you can add on Lift to extend your Drupal site with personalization, then Mautic for marketing, etc. Think of your DXP as a journey, not just as a touchpoint on that journey.


Q: Can Acquia Lift be integrated with other CMS platforms or does it only work with Drupal?
A: Yes, it does work with other platforms; it was designed as CMS-neutral.


Q: What’s the biggest mistake you’ve encountered when helping your customers move to a DXP?
A: There were two crucial mistakes, actually. Firstly - not accepting that the future is unknowable and that things change; and, secondly - a lack of discovery (the discovery checklist linked above is an excellent starting point).


Q: What does a digital experience look like in 2025?
A (Justin): It’s going to be similar, in the sense that there will still be a website, but also different in terms of the way people will interact. There will be an even greater focus on mobile experience, but voice is more limited in its use cases, so it likely won’t be as important as the hype predicts.
A (Tom): Because technology has never advanced faster, it’s hard to predict what the digital experience will look like even next year. New platforms are emerging every day and we’ll likely continue to see this; the winners will be the organizations that successfully reach their customers with personalized content across all channels. The most important thing will be constant innovation - it will need to happen on a monthly basis, for the platforms themselves as well as for the organizations behind them.

Conclusion

We hope this recap has given you a better understanding of what an (open) DXP is and why a focus on the digital experience will continue to be more and more important thanks to technological advancements. 

A lot of brands already demand a multichannel and cross-channel experience for their customers, but the only integrated solutions are expensive and limited proprietary tools. 

Now, Acquia’s positioning of itself as the only open provider of these services has the potential to completely change the DXP game. We’re excited to see how its upcoming tools, e.g. Content Cloud, will further disrupt the industry.

We conclude with the one major takeaway from all this: because the future is uncertain, you need to set a strategy that will allow you to adapt to any new technologies in order to stay in the game.
 

Aug 01 2019

Palantir defined personas for the various site audiences. The site needed to be able to surface relevant content for teens, parents, physicians, and people in crisis situations. We tested wireframes in-depth and performed chalkmark tests around the menu structure, both of which helped make sure pathways were simple and straightforward for all audiences.

For general site search, we implemented Solr-based Acquia Search, which provided more advanced capabilities than the standard Drupal search functionality. Palantir added recommended results, so if there is something our client wants to bump to the top of search results based on a specific keyword, they now have that ability. For example, if a user were to search for the term “cancer,” our client can now make sure that results for the oncology department get bumped to the top of the results list.

Aug 01 2019

Written by Sven Berg Ryen, Leader of the GDPR audit team at Ramsalt Lab

EU Cookie Compliance, one of the top 100 Drupal modules, offers a cookie consent banner with various features, making it easier for your site to become GDPR compliant. GDPR is the data privacy regulation that came into effect on 25 May 2018; it sets out to bolster the rights EU citizens have over the data that companies hold about them. Ramsalt Lab is currently supporting the module’s development as part of our GDPR audit services.

According to GDPR, if you have any traffic from EU citizens on your site, you need to ask for consent before you, or third party scripts, process any of their personal data.

This is all very well: you can ask for consent first and then only use the visitor’s private data if they consent. But under GDPR, you’re only required to do so when the visitor is an EU or EEA citizen. That still leaves billions of residents outside the EU whose privacy laws may not require consent (one could argue whether this is good or bad) for storing cookies that identify individuals. Wouldn’t it be nice if you could comply with EU regulations and at the same time not pester those outside the area where GDPR is enforced?

Luckily, EU Cookie Compliance has a feature that comes to the rescue. It can first check whether the user resides in one of the countries that GDPR affects, and then display the banner only when applicable.

So the technical parts

To achieve this, you need an additional add-on: either the Smart IP or GeoIP module, or the GeoIP PHP library. The module route may be easiest, since adding the PHP library may not be feasible on your hosting or cloud solution.

We will use Smart IP here, since that’s the only module that the Drupal 8 version of EU Cookie Compliance supports. There is now also a beta version of GeoIP available for Drupal 8, so at some point EU Cookie Compliance may support GeoIP in the 8.x version as well. You can follow this issue for the progress.

The option to show the banner only to EU countries can be found near the bottom of the module settings page. A notification can be seen when the Smart IP module is not enabled.

EU Countries configuration section when no geolocation utility has been selected.

Enabling and setting up the Smart IP module

Install and enable the Smart IP module, using your preferred technique (such as composer/drush or direct download from drupal.org). In Drupal 8, you also have to enable a Smart IP data source module.

The Drupal 8 module gives you the following geolocation lookup options:

  • Free and licensed geolocation files from ip2location.com (signup required to get access to the free version). For the purposes of this module, you only need the DB1 database, with coverage of countries. Attribution is required when you use the free database.
  • Geolocation service from ipinfodb. A free API is available. You are however limited to 2 requests per second, and will be blacklisted if you exceed that limit. Also, the service limits you to lookup requests from one server IP only, which may not be ideal if you’re planning to test the service from your localhost. Note that the module utilizes the ip-city endpoint, and not the faster ip-country one. Sign up to get an API key.
  • MaxMind GeoIP2. A free database is available, updated on the first Tuesday of each month. No signup is required to use the free version, though attribution is required.
  • MaxMind GeoIP2 Precision API service offering lookup at the country level at $0.0001 per request. A free trial is available.

Some fallback options are available, and will be accessible if the headers exist in the web page query when you open the configuration page (which means they may not be available on your localhost, but could be available on your server).

  • Cloudflare headers
  • The mod_geoip module in Apache
  • Nginx headers

The Drupal 7 version of Smart IP offers all of the above and in addition some legacy lookup services.

We will be using the Smart IP MaxMind GeoIP2 binary database, because it has a free version of the database that is automatically updated once a month on cron runs. In other words, you need to enable the smart_ip_maxmind_geoip2_bin_db submodule (part of smart_ip).

Configuring Smart IP for GDPR

After having enabled the required modules, head over to /admin/config/people/smart_ip.

Select the “Use MaxMind GeoIP2 binary database” option to see the configuration for the service. Choose the Lite database version, the Country level edition and make sure that Yes is chosen under Automatic updates.

Further down, in the second pane, configure your settings to allow geolocation lookup for all desired user roles. Then, since I guess you care about privacy, either opt to not save the user’s geolocation on account creation, or enable the feature to prevent storing location details from GDPR countries.

Scroll all the way to the bottom and press “Save configuration”. If you get an error at this point, you need to set a private file path ($settings['file_private_path'] in settings.php).

Smart IP configuration section with recommended configuration highlighted.

After having configured Smart IP, you need to head over to MaxMind’s website and download the GeoLite2 Country file in DB format. Then, expand the archive, grab just the file labeled “GeoLite2-Country.mmdb” and drop it into “[PATH_TO_PRIVATE_FOLDER]/smart_ip”. After you add this file manually once, the Smart IP module will take care of the automatic monthly updates.

Note: In Drupal 7, the GeoLite 2 country database is downloaded automatically when configuring the module, so there’s no need for a manual download.

Configuring EU Cookie Compliance

Next, head back to the settings for EU Cookie Compliance at admin/config/system/eu-cookie-compliance and enable the “Only display banner in EU countries” option. If your site uses any caching at all, you’ll want to enable the JavaScript-based option.

EU Cookie Compliance configuration section for limiting the display of the banner to only show up in EU countries.

After enabling this feature, you will need to rebuild the Drupal cache (for example with drush cr), so that Drupal picks up the new path that is used to determine whether the user is in the EU.

Testing

Note: If you’re on an EU Cookie Compliance version prior to 8.x-1.7, you need the patch from this EU Cookie Compliance issue in order for the debug feature in Smart IP to work. The Drupal 7 version of EU Cookie Compliance doesn’t have this problem (though you should always make sure that your version is up-to-date to get the latest bug fixes and features).

This feature involves a few moving parts, so to ensure everything has been set up correctly, there’s a handy debug feature in Smart IP that can be used. This way, you can check that you are indeed displaying the banner only in European countries where GDPR legislation applies. The easiest way to check that the settings are correct is to temporarily set up debugging in Smart IP for the Anonymous user and open an Incognito window. This ensures that no existing cookies give false assurance that the feature is working.

Try using a value such as 151.101.2.217 (which at the time of this article is one of the IPs for the drupal.org server, situated in the US). Notice that no banner is shown when you debug Smart IP with this value.

Try 185.91.65.150 (the IP for the server where drupalnorge.no is hosted, which is in Norway) and the banner should appear.

Section of Smart IP configuration showing an IP number has been configured for debugging purposes.

After testing is completed, remember to disable debugging for the anonymous user by clearing the value on the Smart IP configuration page.

Conclusion

A little work is required to set up EU Cookie Compliance so that it displays the GDPR cookie banner only in countries and territories where the law requires one. As a result, you will hopefully have happier users.

If you need help setting up your GDPR cookie banner, or have questions about how your site can become GDPR compliant, you can always get in touch with us at Ramsalt Lab through our contact page.

Written by Sven Berg Ryen
Developer and Leader of the GDPR audit team at Ramsalt Lab

Sven Berg Ryen

Aug 01 2019

For the sixth year in a row, Acquia has been recognized as a leader in the Gartner Magic Quadrant for Web Content Management.

Acquia first entered the Web Content Management Magic Quadrant back in 2012 as a Visionary, and since then we've moved further than any other vendor to cement our leadership position.

As I've written before, analyst reports like the Gartner Magic Quadrant are important because they introduce organizations to Acquia and Drupal. As I've put it before: "If you want to find a good coffee place, you use Yelp. If you want to find a nice hotel in New York, you use TripAdvisor. Similarly, if a CIO or CMO wants to spend $250,000 or more on enterprise software, they often consult an analyst firm like Gartner."

In 2012, Gartner didn't fully understand the benefits of Acquia being the only WCM company who embraced both Open Source and cloud. Just seven years later, our unique approach has forever changed web content management. This year, Acquia moved up again in both of the dimensions that Gartner uses to rank vendors: Completeness of Vision and Ability to Execute. You'll see in the Magic Quadrant graphic that Acquia has tied Sitecore for the first time:

The 2019 Gartner Magic Quadrant for Web Content Management: Acquia recognized as a leader, next to Adobe, Sitecore and Episerver.

In mature markets like Web Content Management, there is almost always a single proprietary leader and a single Open Source leader. There is Oracle and MongoDB. Splunk and Elastic. VMware and Docker. GitLab and GitHub. That is why I believe that next year it will be Acquia and Adobe at the very top of the WCM Magic Quadrant. Sitecore and Episerver will continue to fight for third place among companies who prefer a Microsoft-centric approach. I was not surprised to see Sitecore move down this year as they work to overcome technical product debt and cloud transition, leading to strange decisions like acquiring a services company.

You can read the complete report on Acquia.com. Thank you to everyone who contributed to this result!

Aug 01 2019

Time to Vote graphic

Voting is now open for the 2019 At-Large Board positions for the Drupal Association! If you haven't yet, check out the candidates’ profiles. Get to know your candidates, and then go vote.

Cast Your Vote!

Voting is open to all individuals who had a Drupal.org account by the time nominations opened and who have logged in at least once in the past year. While a Drupal Association membership is not currently required, it is strongly encouraged.

To vote, you will rank candidates in order of your preference (1st, 2nd, 3rd, etc.). The results will be calculated using an "instant runoff" method. For an accessible explanation of how instant runoff vote tabulation works, see videos linked in this discussion.

Election voting is from 1 August, 2019 through 16 August, 2019. During this period, you can continue to review and comment on the candidate profiles.

Have questions? Please contact me: Rachel Lawson.

Aug 01 2019

The Migrate API is a very flexible and powerful system that allows you to collect data from different locations and store it in Drupal. It is, in fact, a full-blown extract, transform, and load (ETL) framework. For instance, it could produce CSV files. Its primary use, though, is to create Drupal content entities: nodes, users, files, comments, etc. The API is thoroughly documented, and its maintainers are very active in the #migration Slack channel for those needing assistance. The use cases for the Migrate API are numerous and varied. Today we are starting a blog post series that will cover different migrate concepts so that you can apply them to your particular project.

Understanding the ETL process

Extract, transform, and load (ETL) is a procedure where data is collected from multiple sources, processed according to business needs, and the result stored for later use. This paradigm is not specific to Drupal. Books and frameworks abound on the topic. Let’s try to understand the general idea with a real-life analogy: baking bread. To make bread, you need to obtain various ingredients: wheat flour, salt, yeast, etc. (extracting). Then, you combine them in a process that involves mixing and baking (transforming). Finally, when the bread is ready, you put it on shelves for display in the bakery (loading). In Drupal, each step is performed by a Migrate plugin:

  • The extract step is provided by source plugins.
  • The transform step is provided by process plugins.
  • The load step is provided by destination plugins.

As it is the case with other systems, Drupal core offers some base functionality which can be extended by contributed modules or custom code. Out of the box, Drupal can connect to SQL databases including previous versions of Drupal. There are contributed modules to read from CSV files, XML documents, JSON and SOAP feeds, WordPress sites, LibreOffice Calc and Microsoft Office Excel files, Google Sheets, and much more.

The list of core process plugins is impressive. You can concatenate strings, explode or implode arrays, format dates, encode URLs, look up already migrated data, among other transform operations. Migrate Plus offers more process plugins for DOM manipulation, string replacement, transliteration, etc.
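As a hypothetical sketch (the field and source property names here are invented for illustration), a process section combining two of these core plugins might look like this:

```yaml
process:
  # Join two source columns into a single title, separated by a space.
  title:
    plugin: concat
    source:
      - first_name
      - last_name
    delimiter: ' '
  # Convert a date from the source format to the storage format.
  field_date:
    plugin: format_date
    from_format: 'm/d/Y'
    to_format: 'Y-m-d'
    source: date
```

Process plugins can also be chained under a single field, each one receiving the output of the previous, which is how more elaborate transformations are built up.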

Drupal core provides destination plugins for content and configuration entities. Most of the time targets are content entities like nodes, users, taxonomy terms, comments, files, etc. It is also possible to import configuration entities like field and content type definitions. This is often used when upgrading sites from Drupal 6 or 7 to Drupal 8. Via a combination of source, process, and destination plugins it is possible to write Commerce Product Variations, Paragraphs, and more.

Technical note: The Migrate API defines another plugin type: `id_map`. They are used to map source IDs to destination IDs. This allows the system to keep track of records that have been imported and roll them back if needed.

Drupal migrations: a two step process

Performing a Drupal migration is a two-step process: writing the migration definitions and executing them. Migration definitions are written in YAML format. These files contain information about how to fetch data from the source, how to process it, and how to store it in the destination. It is important to note that each migration file can only specify one source and one destination. That is, you cannot read from a CSV file and a JSON feed using the same migration definition file. Similarly, you cannot write to nodes and users from the same file. However, you can use as many process plugins as needed to convert your data from the format defined in the source to the format expected in the destination.
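To make the structure concrete, here is a minimal, hypothetical migration definition. It assumes the contributed Migrate Source CSV module; the file path, column names, and field mappings are invented, and the exact source options vary between versions of that module:

```yaml
id: example_articles
label: 'Import articles from a CSV file'
source:
  plugin: csv
  path: /tmp/articles.csv
  # Column(s) that uniquely identify each source row.
  ids:
    - id
process:
  # Map source columns to entity fields.
  title: headline
  body/value: body_text
destination:
  plugin: 'entity:node'
  default_bundle: article
```

With the Migrate Tools module installed, a definition like this can typically be executed from the command line with drush migrate:import example_articles and undone with drush migrate:rollback example_articles.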

A typical migration project consists of several migration definition files. Although not required, it is recommended to write one migration file per entity bundle. If you are migrating nodes, that means writing one migration file per content type. The reason is that different content types will have different field configurations. It is easier to write and manage migrations when the destination is homogeneous: a single content type will have the same fields for all the elements processed in a particular migration.

Once all the migration definitions have been written, you need to execute the migrations. The most common way to do this is using the Migrate Tools module, which provides Drush commands and a user interface (UI) to run migrations. Note that the UI only detects migrations that have been defined as configuration entities using the Migrate Plus module. This is a topic we will cover in the future. For now, we are going to stick to Drupal core’s mechanism for defining migrations. Contributed modules like Migrate Scheduler, Migrate Manifest, and Migrate Run offer alternatives for executing migrations.

 

This blog post series is made possible thanks to these generous sponsors. Contact Understand Drupal if your organization would like to support this documentation project, whether the migration series or other topics.

Aug 01 2019

We’re thrilled to be attending Drupal Camp Pannonia from the 1st to 3rd August!

Ratomir and Dejan from the CTI Serbia office will be attending Drupal Camp to discover the biggest breakthroughs of the Open Source platform in 2019.

Grand-terrace-palic

The Grand Terrace in Palić, where Drupal Camp Pannonia is held.

 

Held in Palić, Serbia, near the Hungarian border, Drupal Camp Pannonia brings together some of Europe's top Drupal experts for a weekend of knowledge sharing and Open Source collaboration.

CTI Digital has long worked with a global roster of clients serving international customers. In early 2019, we made a permanent investment in Europe and opened a Serbia office. 

We selected Serbia as the Drupal community is growing quickly and filled with world-class talent. Supporting Drupal Camp Pannonia is a crucial part of our investment in the European Drupal community. If you’d like to chat with Dejan or Ratomir about CTI Digital or Drupal, please email [email protected] to set up a meeting.

Ratomir-and-Dejan-1-1

Dejan (Left) and Ratomir (Right)

We're pleased to announce that Dejan (@dekisha007) is also conducting a session on Day 1, Friday 1st August, at 15:45. He'll be delving into a new front-end practice we have developed at CTI Digital using Pattern Lab.

Here's what to expect:

  • Pattern Lab and Atomic Design in Drupal
  • Tailwind CSS
  • Advantages of Vue.js over React in Drupal
  • Wrapping all of the above up with CTI’s custom base theme

Be sure to catch Dejan’s talk on day 1, along with a host of brilliant sessions and workshops by checking out the online schedule.

Or follow @dcpannonia and @CTIDigitalUK for all the action on Twitter.

Aug 01 2019

What is a land acknowledgement?

Here’s the gist: in America (North and South), as well as Australia and other colonized nations, if you’re not an indigenous person, you live on stolen land. We start our meetings by, among other things, asking where people are from, and asking them to acknowledge the indigenous history of the land they live on. 

Why do we do them?

In order to remember, and recognize that the land we live on has been colonized. However, simply stating this isn’t enough - let’s do more!

What are we doing?

A blog post where members of our community each share a couple of paragraphs of research they have done on the land where they live (and please share your sources).

Alanna Burke, Lenni Lenape Land | Pottstown, Pennsylvania, USA

The Lenni Lenape people lived in an area that covered Delaware, Eastern Pennsylvania, Southeastern New York, and New Jersey when colonizers came in the 1600s. While most Lenape now reside in Oklahoma [1], there is still a tribe in New Jersey - the Nanticoke Lenni-Lenapes. The Nanticokes are ancestors of the Lenni-Lenapes, and the tribes are interconnected. “Our compound tribal name, a practice not uncommon among modern tribes, honors our ancestors from the two dominant ancient tribes which comprise our tribal nation [2].”

Despite living in their homeland, they are not federally-recognized. Federal recognition gives them tribal sovereignty - the ability to govern themselves. (In my research, there appeared to be some dispute over whether they were even state-recognized or eligible for any federal monies [3]).

There are still Lenape in Pennsylvania, too, though I struggled to find an organized presence online. One local woman, Uhma Ruth Py, speaks regularly at events and works with local museums to help preserve her heritage [4]. The Lenape Nation of Pennsylvania is a non-profit dedicated to increasing awareness of Lenape culture and history, and it recently helped to move the Lenape artifacts from The University of Pennsylvania Museum of Archaeology and Anthropology to the Lenape Nation’s Cultural Center in Easton, PA [5].

[1] https://en.wikipedia.org/wiki/Lenape#Oklahoma

[2] https://nanticoke-lenape.info

[3] https://www.gao.gov/products/GAO-12-348

[4] https://www.readingeagle.com/life/article/lenni-lenape-woman-keeps-her-native-american-heritage-alive

[5] https://www.lenape-nation.org/2nd-project

Alex Laughnan, Ohlone Lands | San Francisco, CA, USA

As a resident of the Bay Area, specifically San Francisco, I wanted to better understand the indigenous people who were displaced during the colonization of America, forced to give up the original lands they had lived on, and historically marginalized.

San Francisco and most of the Bay Area is on the land of the Ohlone people. “Today’s Ohlone/Costanoan people are the descendants of speakers of six related Costanoan languages that were spoken in west central California, from San Francisco Bay to Monterey Bay, when Spanish missionaries and settlers arrived in the 1770s.” (Milliken) While many thousands of Ohlone people were present upon the initial settlement from Spanish explorers, they were either converted via the missions or exiled and pushed into southern California.

In 1769, Spanish explorers entered the present-day San Francisco peninsula. “By 1801 all of the native San Francisco Peninsula people had joined Mission Dolores. Over the next few years, speakers of other languages — Bay Miwoks from east of San Francisco Bay and Coast Miwoks, Patwins and Wappos from north of the bay — joined Mission Dolores, swelling its population to over 1,200 people. They intermarried with its San Francisco Bay Costanoan speakers and with one another. Although most of the northerners returned home when missions San Rafael and San Francisco Solano were opened in the northern part of the San Francisco Bay Area, some remained at Mission Dolores. When the process of closing the missions began in 1834, the 190 members of the Mission Dolores Indian community included only 37 descendants of the original San Francisco Peninsula local groups.” (Milliken) Unfortunately, due to disease and colonization, many members of the Ohlone/Costanoan tribes were wiped out. Historical policies and acts from the United States federal government, as recent as the 1950s, continued to disenfranchise the tribes through insufficient recognition of and compensation for the land taken from them.

Today, the term “Ohlone” is not universally accepted in all indigenous communities on the San Francisco peninsula. Many, such as the Ramaytush, continue to push for better clarity around the different groups of people that existed in the area before Spanish settlement. “Today’s Ohlone/Costanoans are not a single community in either the social sense or the political sense. They do not gather as a united body for holidays or traditional ceremonies. They do not recognize a single Ohlone/Costanoan leadership or corporate organization.” (Milliken) Instead, the Ohlone people are often organized into groups, each with a sense of community that their ancestors developed from shared experiences in the various Bay Area Spanish missions. “Most of the tribes continue to preserve and revitalize their cultural history through education, restoration of their native languages, and the practice of cultural storytelling.” (Cogswell)

References

Alex McCabe, Seminole lands | Orlando, FL, USA

The Seminole people are the result of unbowed resistance to hundreds of years of murder, war, and theft. Once part of the Mississippian culture that spanned much of the eastern portion of what is today known as the United States, invasions by Spain, England, and the United States killed or forcibly removed tribes from their ancestral homelands. The survivors fled south into Florida’s swamps, and mingled together with other tribes, eventually undergoing the process of ethnogenesis to become one tribe that the United States would come to know as the Seminoles [1][2]. It was not until the middle of the 19th century that the government of the United States ceased its efforts to remove the Seminole people from Florida [3].

At this point, Seminoles began to trade with white people. In 1907, the first Seminole reservation was created [3], but it wasn’t until 1957 that the Seminole tribe was officially recognized by the federal government, and even then it was at gunpoint - the Seminoles were forced to organize politically along the lines set forth in the 1934 Indian Reorganization Act [4].

Today, the Seminole tribe is very much alive, well organized, and financially successful. Tobacco and gaming bring in income [5], and in 2007, the tribe acquired the international Hard Rock brand [6]. They also operate museums, such as the Ah-Tah-Thi-Ki Museum [7], in order to preserve Seminole culture.

[1] https://www.semtribe.com/STOF/history/indian-resistance-and-removal
[2] https://en.wikipedia.org/wiki/Seminole#History
[3] https://www.semtribe.com/STOF/history/timeline
[4] https://www.semtribe.com/STOF/history/survival-in-the-swamp
[5] https://www.semtribe.com/STOF/history/seminoles-today
[6] https://en.wikipedia.org/wiki/Hard_Rock_Cafe#Acquisition_by_the_Seminole...
[7] https://www.ahtahthiki.com/

Tara King, Pueblo Lands | Albuquerque, New Mexico

Albuquerque is on the land of several Pueblo peoples. There are 19 sovereign Pueblo nations in New Mexico, as well as three Apache tribes and the Navajo Nation.  

New Mexico was first colonized by the Spanish in 1540, when Coronado claimed the land on behalf of Spain. Centuries of colonization, slavery, and genocide followed: everything from forced labor to "boarding" schools where children were separated from their families and their cultural traditions destroyed. Juan de Oñate led the Acoma Massacre of 1599 in response to rebellion at Acoma. During the massacre, over 800 Acoma people died, many others were enslaved, and men over the age of 25 had one foot amputated.

In 1680, the Pueblo Revolt succeeded against the Spanish, killing 400 Spanish and driving 2,000 settlers from the land. The revolt was coordinated by Popé (Ohkay Ohwingeh) from Taos Pueblo, though people from most Pueblos participated.  Pueblo control of New Mexico continued until at least 1692, and some Pueblos (such as Hopi Pueblo) never re-entered Spanish control. In 1706, Albuquerque was founded as a trading post between the Pueblo peoples and the Hispanos (descendants of the Spanish settlers).

In 1848, New Mexico became part of the United States after the Mexican-American War, though New Mexico was not granted statehood until 1912. Between 1965 and 1975, the US Government built Cochiti Dam, an engineering project to control the Rio Grande. The dam not only altered the natural flow of the river, but also destroyed sacred lands of the Cochiti people and flooded (then salinated) the historic fields that are central to Cochiti culture and survival.

Thousands of Pueblo people live and thrive in New Mexico today.  To name just a few, Deb Haaland (Laguna) is one of the first two Native women elected to the US Congress.  Rebecca Roanhorse (Ohkay Ohwingeh) is the author of a very fun series of fantasy novels called The Sixth World.   If you visit Albuquerque, you can shop at Red Planet, possibly the world's only indigenous comic book store: https://redplanetbooksabq.com/ or visit the Indian Pueblo Cultural Center, a museum dedicated to allowing Pueblo people to tell their own story: https://www.indianpueblo.org. Albuquerque is also home to the Gathering of Nations, North America's largest pow-wow: https://www.gatheringofnations.com/.

I work toward dismantling colonialism in Albuquerque and New Mexico by learning about Native history and sharing it widely, by respecting, learning about, and advocating for Native land-use practices, and by supporting Native-owned businesses.

Lauren Maffeo | Bethesda, Maryland, USA

Paleo-Indians inhabited the Chesapeake Bay region as early as 9500 BC. In 1608, John Smith sailed up the Potomac River after co-founding the Jamestown settlement in Virginia. After sailing to the Little Falls of the Potomac (north of the present-day Chain Bridge), Smith and his peers met the first inhabitants of what's known today as Montgomery County, Maryland.

Members of the Piscataway and Nacotchtank tribes arrived in the region up to 10,000 years before Smith. Members of those tribes had crossed the Alleghenies before reaching the Potomac River Valley.

Due to an abundance of game, fruit, nuts, and other natural resources, members of both tribes traversed the area. And due to an abundance of fish, they were able to net thousands of shad at once.

Debris like broken arrowheads has been found on modern sites in Bethesda, including the National Institutes of Health. This suggests that the tribes built hunting camps throughout the modern Bethesda region.

2,000 years before Smith's arrival, the Piscataway and Nacotchtank tribes settled in small agricultural villages near the Potomac River. Choosing to live communally, they harvested the original succotash (a Native American word for “broken corn kernels”). They also buried their dead in the area.

I live in the area of modern-day Bethesda where Western Avenue and River Road meet. Native Americans and Europeans both used trails across this area in the 17th century. Although trade between the two groups was prosperous at first due to European demand for fox fur, the Piscataway and Nacotchtank tribes were eventually pushed back across the Alleghenies.

A few decades after John Smith arrived in Bethesda, Henry Fleet sailed up the Potomac River and stayed with the Piscataway tribe from 1623 to 1627. After returning to England and sharing the resources he had found, Fleet won funding for another expedition. At that time, members of tribes living on the land known as Bethesda were forced into reservations and died due to lack of immunity to communicable illness.

References

Want to contribute?

This post is intended as a living resource. If you would like to contribute information or have feedback, please let us know on the drupal.org issue so we can update and improve. 

If you’d like to learn more about our efforts to support the Drupal community by dismantling racism, please join our meetings on Drupal slack in #diversity-inclusion. 

Jul 31 2019
Jul 31

Pantheon is an excellent hosting service for both Drupal and WordPress sites. But to make their platform work and scale well, they have built a number of limits into the platform. These include process time limits and memory limits that are large enough for the vast majority of projects, but that can run you into trouble on large jobs from time to time.

For data loading and updates their official answer is typically to copy the database to another server, run your job there, and copy the database back onto their server. That’s fine if you can afford to freeze updates to your production site, set up a process to mirror changes into your temporary copy, or take on some other project overhead that can be limiting and challenging. But sometimes that’s not an option, or the data load takes too long for that to be practical on a regular basis.

I recently needed to do a very large import of records into a Drupal database, and so I started to play around with solutions that would allow me to ignore those time limits. We were looking at needing to do about 50 million data writes, and the running time was initially over a week to complete the job.

Since Drupal’s batch system was created to solve this exact problem, it seemed like a good place to start. For this solution you need a file you can load and parse in segments, like a CSV file, which you can read one line at a time. It does not have to represent the final state: you can use this to actually load data if the process is quick, or you can serialize each record into a table or a queue job to process later.

One quick note about the code samples, I wrote these based on the service-based approach outlined in my post about batch services and the batch service module I discussed there. It could be adapted to a more traditional batch job, but I like the clarity the wrapper provides for breaking this back down for discussion.

The general concept here is that we upload the file and then progressively process it from within a batch job. The code samples below provide two classes to achieve this. First is a form that provides a managed file field, which creates a file entity that can be reliably passed to the batch processor. From there the batch service takes over, using a bit of basic PHP file handling to load the file into a database table. If you need to do more than load the data into the database directly (say, create complex entities or other tasks) you can set up a second phase to run through the values and do that heavier lifting.

To get us started the form includes this managed file:

   $form['file'] = [
     '#type' => 'managed_file',
     '#name' => 'data_file',
     '#title' => $this->t('Data file'),
     '#description' => $this->t('CSV format for this example.'),
     '#upload_location' => 'private://example_pantheon_loader_data/',
     '#upload_validators' => [
       'file_validate_extensions' => ['csv'],
     ],
   ];

The managed file form element automagically gives you a file entity, and the value in the form state is the ID of that entity. This file will be temporary and have no references once the process is complete, so depending on your site setup the file will eventually be purged. Which all means we can pass all the values straight through to our batch processor:

$batch = $this->dataLoaderBatchService->generateBatchJob($form_state->getValues());

When the data file is small enough, a few thousand rows at most, you can load them all right away without the need for a batch job. But that runs into both time and memory concerns, and the whole point of this is to avoid those. With this approach we can ignore those limits, and we’re only constrained by Pantheon’s upload file size. If the file size is too large you can upload the file via SFTP and read it directly from there; so while the form is an easy way to load the file, you have other options.

As we set up the file for processing in the batch job, we really need the file path, not the ID. The main reason to use a managed file is that Drupal can reliably give us the file path on a Pantheon server without us needing to know anything about where they have things stashed. Since we’re about to use generic PHP functions for file processing, we need to know that path reliably:

$fid = array_pop($data['file']);
$fileEntity = File::load($fid);

if (empty($fileEntity)) {
  $this->logger->error('Unable to load file data for processing.');
  return [];
}
$filePath = $this->fileSystem->realpath($fileEntity->getFileUri());
$ops = ['processData' => [$filePath]];

Now we have a file, and since it’s a CSV we can load a few rows at a time, process them, and then start again.

Our batch processing function needs to track two things in addition to the file: the header values and the current file position. So in the first pass we initialize the position to zero and then load the first row as the header. For every pass after that we need to find the point where we left off. For this we use generic PHP file functions for loading and seeking to the current location:

// Old-school file handling.
$path = array_pop($data);
$file = fopen($path, "r");
...
fseek($file, $filePos);

// Each pass we process 100 lines; if you have to do something complex
// you might want to reduce the run.
for ($i = 0; $i < 100; $i++) {
  $row = fgetcsv($file);
  if (!empty($row)) {
    $data = array_combine($header, $row);
    $rowData = [
      'col_one' => $data['field_name'],
      'data' => serialize($data),
      'timestamp' => time(),
    ];
    $row_id = $this->database->insert('example_pantheon_loader_tracker')
      ->fields($rowData)
      ->execute();

    // If you're setting up for a queue you include something like this.
    // $queue = $this->queueFactory->get('example_pantheon_loader_remap');
    // $queue->createItem($row_id);
  }
  else {
    break;
  }
}
$filePos = (float) ftell($file);
$context['finished'] = $filePos / filesize($path);

The example code just dumps this all into a database table. This can be useful as a raw data loader if you need to add a large data set to an existing site that’s used for reference data or something similar.  It can also be used as the base to create more complex objects. The example code includes comments about generating a queue worker that could then run over time on cron or as another batch job; the Queue UI module provides a simple interface to run those on a batch job.
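For reference, a queue worker for that second phase could look something like the following minimal sketch. The plugin ID matches the queue name used in the commented-out code above; the namespace, class name, and processing logic are placeholders, not part of the original example:

```php
<?php

namespace Drupal\example_pantheon_loader\Plugin\QueueWorker;

use Drupal\Core\Queue\QueueWorkerBase;

/**
 * Processes rows staged by the batch loader.
 *
 * @QueueWorker(
 *   id = "example_pantheon_loader_remap",
 *   title = @Translation("Example Pantheon loader remap"),
 *   cron = {"time" = 60}
 * )
 */
class ExampleLoaderRemap extends QueueWorkerBase {

  /**
   * {@inheritdoc}
   */
  public function processItem($row_id) {
    // Load the serialized row from example_pantheon_loader_tracker by
    // $row_id and do the heavier lifting (entity creation, etc.) here.
  }

}
```

With `cron = {"time" = 60}`, cron works the queue for up to a minute per run, which keeps each pass safely inside Pantheon's process limits.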

I’ve run this process for several hours at a stretch. Pantheon does have issues with system errors if left to run a batch job for extreme runs (I ran into problems on some runs after 6-8 hours of run time), so a prep into the database followed by running on a queue, or something else that’s easier to restart, has been more reliable.

View the code on Gist.

Jul 31 2019
Jul 31

It's that time of the month again! The time when we express our thanks to those Drupal teams who've generously (and altruistically) shared valuable free content with us, the rest of the community. Content with an impact on our own workflow and our Drupal development process. In this respect, as usual, we've handpicked 5 Drupal blog posts, from all those that we've bookmarked this month, that we've found most “enlightening”.

From:
 

  • Drupal 8 site-building best practices
  • to valuable built-in tools for image optimization in Drupal
  • to altruistically shared tips on how to enforce coding standards in Drupal
  • to best modules for implementing social login functionality into one's Drupal website
  • to... specific metrics that one should prioritize when evaluating his/her website's marketing efforts
     

… the July edition of our “OPTASY favorites” series is, again, packed with high quality, and (most of all) useful content.

But, let's get straight to our selection: here are the 5 blog posts that we've added to our list of resources this month.

Integrating social login functionality into Drupal websites must be one of the most common tasks that we deal with as a development team.

So, having a list of modules designed precisely for this is so reassuring:
 

  • we save precious time
  • we know that it's a Drupal solution (not a different, third-party software component) that we integrate into our projects
     

So, the InternetDevel team has shared their list of Drupal 8 modules for social login just in time to... add to our bookmarks and resources & tools list.

Their blog post highlights 5 modules to evaluate first — to see if they fit a specific Drupal 8 web project's particularities — whenever we need to enable social login on the websites that we work on.

Our list of best practices for developing websites in Drupal 8 remains an open one. We keep on adding valuable suggestions on:
 

  • life-saving modules that we should be using, that we might have overlooked or underrated
  • new approaches to Drupal development that help us save valuable time and avoid embarrassing mistakes
     

That's why the ADCI Solution's team blog post on the best practices to adopt and to stick to when building a website in Drupal 8 couldn't have escaped our “radar”.

From:
 

  • module recommendations for SEO, security, maintenance and performance issues
  • to useful tips on how to address the editorial experience on the websites that we build
  • to what modules we should remove once we move a website from the dev environment to the live environment
     

.. we've found a whole bunch of reasons why this post deserves its place in our top 5 Drupal blog posts from July.
 

Another one of those articles that we've “bumped into” precisely when we were looking for the solution that it presents.

In this case, we were looking to set up, document, and enforce coding policies and procedures to ensure that our developers comply with the Drupal coding standards.

In their “revelatory” blog post, the Lullabot team puts the spotlight on GrumPHP, a tool that checks your Drupal code before you commit it and detects any violation of the well-known Drupal coding standards.

Then they go on to anticipate and detail each possible challenge that you might face when using this tool, and they even share their solutions for overcoming them.

Overall, GrumPHP makes such an interesting discovery that we can't wait to leverage it in our next Drupal project. 
 

Disclaimer: this is not a Drupal blog post, yet the tips included there are highly applicable to websites running on Drupal, as well.

For us, it's been more of a recap of all those key metrics to use for evaluating our marketing strategies.

And it's convenient to have all 6 of them listed in one post that we can keep at hand and use whenever we need to check whether our marketing efforts do have an impact.

From:
 

  • time on site
  • to number of pages visited
  • to traffic sources
     

… plus 3 other crucial metrics for us to turn into the main criteria to use when assessing our marketing strategy’s efficiency, the article’s such a welcome “reminder”.

A handy resource that we've added to our list of 5 favorite Drupal blog posts from July.
 

Optimizing images is at the top of our list of performance-boosting techniques. Therefore, the WishDesk post on those Drupal 8 built-in tools designed precisely for this purpose came in more than handy.

And not only do they outline all the image optimization features that Drupal 8 provides us with, right out of the box, but they:
 

  • go into details of the whole process and its particularities in this version of Drupal
  • stress the crucial importance of optimizing one’s images for SEO and performance purposes
     

The END!

These are the 5 Drupal blog posts of the month which, in our opinion, shared the most useful and usable content.

Do you have your own list of favorites, too?

Image by Gerd Altmann from Pixabay

Jul 31 2019
Jul 31

Drupal 8 is built on PHP, but using new architecture paradigms that can be difficult to grasp for developers coming from a Drupal 7 background. The Typed Data API lies at the core of Drupal 8, and provides building blocks used throughout the Drupal 8 architecture. In this presentation, Jay Friendly, Morpht's Technical Director, dives into the Typed Data API, what it is, how it works, and why it is so awesome!

31 July 2019

Jul 30 2019
Jul 30

Testing integrations with external systems can sometimes prove tricky. Services like Acquia Lift & Content Hub need to make connections back to your server in order to pull content. Testing this requires that your environment be publicly accessible, which often precludes testing on your local development environment.

Enter ngrok

As mentioned in Acquia’s documentation, ngrok can be used to facilitate local development with Content Hub. Once you install ngrok on your development environment, you’ll be able to use the ngrok client to connect and create an instant, secure URL to your local development environment that will allow traffic to connect from the public internet. This can be used for integrations such as Content Hub for testing, or even for allowing someone remote to view in-progress work on your local environment from anywhere in the world without the need for a screen share. You can also send this URL to mobile devices like your phone or tablet and test your local development work easily on other devices.

After starting the client, you’ll be provided the public URL you can plug into your integration for testing. You’ll also see a console where you can observe incoming connections.
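Getting a tunnel up is a single command. As a sketch (the port here is an assumption; use whatever port your local web server actually listens on):

```shell
# Tunnel public traffic to a local web server listening on port 8080.
# ngrok prints the generated public URL on startup, along with a local
# inspection console (typically at http://127.0.0.1:4040).
ngrok http 8080
```

The generated URL is what you plug into integrations like Content Hub, or share for remote review and device testing.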

Resources

Jul 30 2019
Jul 30

I posted my analysis of top uses of deprecated code in Drupal contributed projects in May 2019, two months ago. There is some great news since then. First of all, Matt Glaman of Centarro implemented support for deprecation messages directly in reports, so we can analyse reports much better in terms of actionability. Second, Ryan Aslett at the Drupal Association implemented running all-contrib deprecation testing on drupalci.org (Drupal's official CI environment). While showing that data on projects themselves is still in the works, I took the data to crunch some numbers again, and the top deprecated API uses are largely the same as two months ago. However, we now have a full analysis of all deprecation messages and their fixability, which gets us two powerful conclusions.

Stop using drupal_set_message() now!

If there is one thing you do for Drupal 9 compatibility, make it getting rid of using drupal_set_message(). As the API documentation for drupal_set_message() explains you should use the messenger service and invoke the addMessage() method on it.
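The change is mechanical. A before/after sketch (the message text is illustrative):

```php
<?php

// Deprecated since Drupal 8.5.0:
drupal_set_message(t('Settings saved.'));

// Replacement: the messenger service.
\Drupal::messenger()->addMessage(t('Settings saved.'));

// Or, preferably, via the injected messenger service in a class:
$this->messenger->addMessage($this->t('Settings saved.'));
```

In classes, inject `messenger` through the service container rather than calling the static `\Drupal::messenger()` helper, so the dependency stays testable.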

Of the total of 23,750 deprecated API use instances found in 7,561 projects, 29% were instances of drupal_set_message(). So statistically speaking, if you stop using this single function, you are already 29% of the way to Drupal 9 compatibility (likely more for most projects).

Dezső Biczó already built automated deprecation fixers covering drupal_set_message() and more, based on rector.

Figure visualising the data explained in the text.

76% of deprecated API use can already be resolved

On top of the 29% of drupal_set_message(), there is another 47% of various other API uses that can be fixed now. Those deprecated APIs were marked before Drupal 8.6.0 was released, and are therefore only deprecated in Drupal core versions that are no longer supported. So stopping the use of them would still keep your code compatible with all currently supported Drupal core versions. In other words, as of today, drupal.org project maintainers can resolve three quarters of the outstanding code changes for Drupal 9 compatibility. Considering we are still ten months before Drupal 9's planned release date, this is quite good news!

Upgrade Status is a nice visual tool to explicitly list all the errors in the projects you scan. It provides instructions on how to fix them and immediately fixable issues are highlighted.

Work with project maintainers

You are not a drupal.org project maintainer but want to help? Yay! Based on the above, it may be tempting to run to the issue queue and submit issues for fixing all the things. Good plan! One thing to keep in mind, though, is to work with the maintainers of projects so your help benefits the project most effectively. Drupal.org project owners may specify Drupal 9 plan information that should help you engage with them the best way (use the right meta issue, know their timeline plans, and so on). Check the project pages of the projects you are interested in being involved with to make sure you contribute the best way.

https://t.co/hf2ENvlZSo projects can now specify Drupal 9 porting information, so *you* can direct *your* contributors to provide the most valuable help on the way to Drupal 9, fund the process or just step back (for now). Edit your project to help your contributors help you! pic.twitter.com/l1OWwOllBK

— The Drop is Always Moving (@DropIsMoving) May 21, 2019

More to come

I think the data is super-interesting and I plan to do more with it, for example cross-referencing with project usage. Stay tuned for more information in the future. In the meantime all the source data is available for your mining as well.

Correction: An earlier version of this post said there were 43% additional fixable deprecations for a total of 72%. Thanks to Ryan Aslett for corrections, the correct numbers are 47% and 76% respectively. Images and text fixed.

Disclaimer: The data is based on the state of contributed projects on July 29, 2019 based on Drupal core's 8.8.x development branch on July 29, 2019. As contributed module maintainers commit fixes, the data will get better. As core introduces more deprecations (until Drupal 8.8.0-alpha1), the data could get worse. There may also be phpstan Drupal integration bugs or missing features, such as not finding uses of deprecated global constants yet. This is a snapshot as the tools and state of code on all sides evolve, the resulting data will be different.

Jul 30 2019
Jul 30

Today, user experience (UX) is not just about how a user feels when interacting with your website. In this world of rapidly growing interfaces and APIs, content plays a supreme role in offering your users exceptional UX. To keep up the pace, you need to adopt fast-moving front-end technologies like AngularJS and ReactJS that can deliver your content at application-like speed. Headless Drupal (or decoupled Drupal) is one such approach that is gaining much popularity because of its ability to deliver outstanding digital experiences. Bigwigs like Weather.com, The Tonight Show, Great Wolf Resorts, Warner Music Group and many more have taken the headless Drupal route, offering their customers interactive, unique front-end designs and fast-loading websites.

What is Headless Drupal?


To go headless or not is a rather tricky decision to make in this digital world. So what’s the whole buzz about going Headless? Simply put, in a headless Drupal architecture, the front-end (consumers of content) of the CMS is detached from the back-end (provider of content). 

Conventionally, Drupal websites are meant to multi-task: Drupal manages both the back-end content management and the front-end rendering of content. There is no doubt that Drupal on its own can deliver a rich user experience to the end user, but when it comes down to instantaneous responses to a request and delivering content seamlessly across different interfaces, it does fall short. In a decoupled Drupal architecture, instead of Drupal’s theme layer, a client-side framework like AngularJS, React or Backbone.js is used. A user request does not have to be processed by the server every time, which can drastically improve the speed and UX of your Drupal website.

Technically speaking, a headless Drupal website sends out data in JSON format instead of HTML. A powerful front-end UI framework renders this data in JSON format and delivers the web page.
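As a minimal, hypothetical sketch of the front-end side, here is a function that maps a JSON payload shaped like a Drupal JSON:API response into the flat data a UI framework would render; the payload structure and field names are assumptions for illustration:

```javascript
// Map a JSON:API-style payload into a flat list a UI component can render.
function extractArticles(response) {
  return response.data.map(item => ({
    id: item.id,
    title: item.attributes.title,
  }));
}

// Example payload shaped like a Drupal JSON:API response.
const payload = {
  data: [
    { id: '1', type: 'node--article', attributes: { title: 'Hello, headless' } },
  ],
};

console.log(extractArticles(payload));
```

In a real application the payload would come from a fetch against the site's JSON:API endpoint, and the mapped list would feed an Angular, React, or Vue component instead of console.log.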

Headless Drupal Architecture

Categorizing Decoupled Drupal

In a traditional Drupal CMS architecture, the browser invokes a request that is processed by PHP logic, which then renders the HTML and sends it back to the browser. Of course, the developer can embed JavaScript for some client-side improvements, but this can result in a situation where different client-side frameworks are being used for different modules, making it an extremely complex system.

Progressive Decoupling

If you are looking to preserve your Drupal Theme layer and yet be able to provide immediate responses to the browser, the Progressive Decoupling approach is your best move. Here you can have your cake and eat it too! The initial application state is rendered by Drupal which can be then manipulated by client-side coding. Modules can be written in PHP or Javascript while you can avail the powerful performance benefits of Drupal.

This version of decoupled Drupal allows for contextualized interfaces, content workflow, site preview, and other features to remain usable and integrated with Drupal as a whole. While content editors and site assemblers feel at home with this decoupled Drupal version, it also allows front-end developers to pursue their own velocity while keeping site assemblers unblocked, by dedicating a portion of the page to a JavaScript framework.

In short, a progressively decoupled Drupal offers an approach that does a great job in striking the perfect balance between editorial needs like layout management and developer desires to use more JavaScript.

A graph illustrating the progressive decoupling spectrum for these examples (Source: Acquia)
 

Fully Decoupled Architecture

And then there's full decoupling, where Drupal's presentation layer is completely replaced with a client-side framework. This version of the decoupled CMS allows an uninterrupted workflow, as the client-side framework can also act as a server-side pre-renderer. Drupal is used purely as a content repository that takes care of all the back-stage jazz. However, when you bypass Drupal's theme functionality entirely, you are also letting go of some effective performance benefits that Drupal provides. A lot of rebuilding is also needed to fully decouple the administrative interface and front-end of a Drupal website, and using JavaScript on the server side complicates the infrastructure requirements.

While a fully decoupled Drupal architecture has gained more attention in recent years, with the growth of JavaScript showing no signs of slowing down, it involves a separation of concerns between your content structure and its presentation. In a nutshell, creating a fully decoupled Drupal system means treating your web experience as a separate application that needs to be served content.

Is it a good idea?

Traditionally, Drupal CMS is meant to do both content management and rendering the front-end for the whole website. A lot of pressure, don't you think? Drupal experts believe that Drupal's strength lies in the power and flexibility of its back-end and that it needs to be service-oriented first instead of HTML-oriented. Decoupling Drupal means letting some other system manage the front-end while Drupal takes care of the back-end. Why is it a good idea to decouple Drupal, you ask?

If you want to adopt cutting-edge, modern front-end technologies that Drupal cannot provide, you will need a powerful front-end framework like React or Angular. With a headless Drupal approach, you can have all of this and still maintain your robust Drupal back-end.

  • With the Headless Drupal architecture, you can “Write once and publish everywhere”. This system allows content editors, marketers and business owners to create content once and deliver it to multiple interfaces seamlessly.
  • With a decoupled CMS, detaching the front-end from the back-end content management system will allow for more flexibility and efficiency of the Drupal content model. Just like how delegating work decreases your workload and increases productivity.
  • A layered architecture promotes a more secure system. Site admins can restrict access to different areas of the infrastructure. 
  • Headless Drupal allows front-end developers to have full control over the presentation, UI design and UX of the website. The combination of a great client-side framework and a seasoned front-end developer can get you a rich, fast, application-like user experience with seamless interactivity.
  • Integrating with third party applications is easier and more efficient with a headless Drupal model.
  • Both the front-end and back-end developers can work independently which can lead to efficient and speedy delivery of a project.
  • If you want to redesign your website, you won’t have to re-implement your Drupal CMS. Likewise, revamping your back-end system can be accomplished without having to alter your front-end.

Is headless Drupal for everybody?

Although decoupling Drupal can help you achieve an uninterrupted, application-like user experience, it might not be a good fit for everyone. Here's why:

  • Websites like news sites or blogs, which don't really need much user interactivity, will not benefit from decoupling their Drupal website.
  • When you opt for a fully decoupled Drupal architecture, you are letting go of some of the top (and free) functionalities that come with the Drupal theme layer, like block placement, layout and display management, content previews, UI localization and built-in protection against attacks like cross-site scripting (XSS). Some of these cannot be replicated by a client-side framework.
  • If budget is an issue, keep in mind the price you will have to shell out for experienced front-end developers, as well as the cost of rebuilding from scratch any missing (otherwise freely available) Drupal feature.

Who uses Headless Drupal?

Many top enterprises have taken the headless Drupal approach, and successfully so! Their websites load fast and offer interactive experiences to their users on all devices and interfaces. Some examples are:

  • The Tonight Show with Jimmy Fallon – uses Backbone.js and Node.js for the front-end
  • Weather.com – uses Angular.js for the front-end
  • Great Wolf Resorts – uses CoffeeScript and Spine framework
  • EC Red Bull Salzburg – uses Angular.js for the front-end
  • Warner Music Group – uses Angular.js for the front-end

…And many more on this list here.

Jul 30 2019

Ever since cloud computing proliferated through enterprise digital transformation, new cloud platform services have been thronging the scene. The cloud ride is burgeoning even faster in 2019, the pace of cloud vendor innovation is sky-high, and revenue is soaring. Amazon Web Services (AWS), one of the giants in this space, has witnessed a 45% rise in revenue year over year, reaching $7.43 billion for the fourth quarter of 2018.



AWS has been a force to be reckoned with when it comes to buying storage space for holding a colossal database, bandwidth for hosting a website, or processing power to run intricate software remotely. With AWS, the need to buy and run your own hardware is eliminated, and organisations or individuals pay for only what they actually use. Netflix, the leading media services provider and a go-to option for streaming movies and web series, leverages AWS for almost all its backend infrastructure, storing and streaming its online content.

Netflix has come a long way from being a DVD-by-mail service to one of the most sought-after media streaming services in the world. The number of Netflix subscribers has grown multifold (approximately 150 million in 2019). With 37% of internet users around the globe binging movies and web series on Netflix, the power of AWS has massively helped the company keep up with its growing customer base and scale on demand. If AWS can play such an influential role for a big enterprise like Netflix, you can hope for wondrous things to transpire when another magic pearl is added: Drupal can do miracles alongside different AWS products, and there are various ways to leverage Amazon Web Services with Drupal for your web development solution. But first, let's take a quick look at AWS and the plenitude of products it offers.

AWS in a nutshell

Before getting acquainted with its various offerings, it is imperative to understand where AWS stands in the market today. In comparison to other big cloud service providers like Microsoft Azure and Google Cloud, AWS maintains a clear lead in the market.

A bar graph showing AWS's cloud market share | Source: Canalys

Amazon Web Services is definitely one of the most sought after cloud solutions in the market. So what is it? It is a comprehensive cloud platform by e-commerce giant Amazon that provides software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS) offerings. AWS offers cloud services from multiple data centres and availability zones that are spread across different regions of the world.

AWS also provides a Well-Architected Framework, built to assist cloud architects in developing a safe, high-performing, resilient and efficient infrastructure for their applications. This framework is based on five pillars: operational excellence, security, reliability, performance efficiency and cost optimisation.



AWS provides a huge set of cloud-based services comprising categories like analytics (Amazon CloudSearch, Amazon Athena etc.), application integration (Amazon Simple Notification Service, Amazon MQ etc.), robotics (AWS RoboMaker), compute (AWS Elastic Beanstalk, AWS Lambda etc.), database (Amazon Aurora, Amazon Redshift etc.), and satellite (AWS Ground Station) among others for helping organisations move rapidly, lessen IT costs, and be highly scalable.

An overview of AWS service categories | Source: AWS

Different ways to leverage AWS services with Drupal

Drupal, an open-source content management framework, is an astounding digital experience platform that helps you disseminate the right content to the right person at the right time on the right devices. Its great content authoring capabilities, provision of stupendous web performance, multilingual features, high scalability, easy integration with the best tools that are available outside of its periphery, mobile-first approach, multisite offering, and immense security make it one of the leaders in the content management system (CMS) market. No wonder its usage has continuously risen to new heights.

Drupal usage statistics | Source: BuiltWith

Whether you need to deploy a production-ready Drupal website or build innovative solutions with Drupal 8, many of the products from AWS can be of magnificent use in Drupal development. Let’s take a look:

Production grade Drupal configuration

You can deploy a highly available Drupal architecture on the AWS cloud using a quick start guide. This allows you to leverage AWS services and further improve the performance and extend the functionality of your CMS. AWS’ flexible compute, storage and database services make it a top-notch platform for running Drupal workloads.

The core AWS components that are used for this implementation involve the AWS services like EC2 (Elastic Compute Cloud), EFS (Elastic File System), RDS (Relational Database Service), VPC (Virtual Private Cloud), Auto Scaling, CloudFormation, Elastic Load Balancing, IAM (Identity and Access Management), ElastiCache, CloudFront and Route 53.

AWS Regions help govern network latency and regulatory compliance. Regions are designed with availability in mind and comprise at least two Availability Zones. Most AWS services support regional endpoints, which minimise data latency by providing an entry point for service requests in that region.

Architecture of a Drupal deployment on AWS | Source: AWS

For the deployment of a production-grade Drupal configuration, AWS CloudFormation gives you an automated, simple way to create and handle a collection of related AWS resources. The main template builds the network-related resources first, and the launch of separate templates for Drupal and Amazon Aurora follows. Other templates modularise the CloudFormation code, and an additional template that uses AWS Lambda creates an Amazon Machine Image (AMI) for Drupal. The AMI is an effective way to install Drupal on all the instances in the Auto Scaling group, as it avoids repeated downloads.

There are optional templates that can be leveraged for deploying an ElastiCache cluster, building a CloudFront web distribution and creating DNS (Domain Name System) records in a Route 53 public hosted zone. If you use ElastiCache or CloudFront, Drupal is configured with the requisite default settings. Drupal's caching and content delivery network settings can be optimised once the Drupal stack is deployed. And when you delete the main template, the entire stack is deleted.

This quick start’s highly available reference architecture for Drupal deployment requires an HTTP(S) load balancer, two or more Drupal servers on Apache web server, shared file storage, shared ElastiCache for Memcache cluster, CloudFront distribution, and Route 53. Deployment of Drupal can be done into a new Virtual Private Cloud (VPC) which involves building a new AWS environment comprising VPC, subnets, NAT gateways, security groups, bastion host and a lot of other infrastructure components. Or, the deployment of Drupal can also be done into an existing VPC that enables Drupal in your existing AWS infrastructure.
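As a rough sketch, launching and later tearing down such a stack from the AWS CLI looks something like the following (the stack name, template URL and parameter values here are placeholders, not the quick start's real ones):

```shell
# Launch the Drupal stack (placeholder template URL and parameters).
aws cloudformation create-stack \
  --stack-name drupal-prod \
  --template-url https://example-bucket.s3.amazonaws.com/drupal-main.template \
  --parameters ParameterKey=KeyPairName,ParameterValue=my-keypair \
  --capabilities CAPABILITY_IAM

# Deleting the main stack tears down the nested stacks with it.
aws cloudformation delete-stack --stack-name drupal-prod
```

Because the main template creates nested stacks and IAM resources, the `--capabilities` flag is needed to acknowledge the IAM changes.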

Alternative to deploying and hosting production-ready Drupal

For deploying a high-availability Drupal website, this AWS documentation offers another route. It exhibits the process of deploying and hosting Drupal with an architecture that hosts Drupal for a production workload while requiring minimal governance responsibilities from you.

AWS Elastic Beanstalk, Amazon RDS and Amazon EFS can be leveraged here. Once the Drupal files are uploaded, Elastic Beanstalk automatically governs the deployment process, which involves application health monitoring, load balancing, capacity provisioning and auto-scaling, among others. RDS offers cost-effective, resizable capacity while managing time-consuming database administration tasks for you.

Serverless implementation using Lambda@Edge

Flowchart of a serverless implementation using Drupal and AWS | Source: AWS

Drupal can also be a fantastic solution for implementing a serverless architecture. The union of Amazon CloudFront, Lambda@Edge and headless Drupal can offer the lowest latency and a personalised experience to users. Deploying CloudFront allows you to cache and accelerate your Drupal content with the assistance of a globally distributed set of CloudFront nodes. Every CloudFront distribution constitutes one or more origin locations, an origin being where the Drupal content resides. Drupal 8 is deployed by running the supplied AWS CloudFormation stacks, and AWS services like EC2, EFS, RDS and Aurora are of great use here as well. It is all wrapped in a highly available design spanning multiple Availability Zones, configured so that auto-scaling can be carried out using EC2 Auto Scaling groups.

URL aliases for the content are created using the Path module that ships with Drupal 8. Within the Drupal 8 administration, ‘Aggregate CSS Files’ and ‘Aggregate JavaScript Files’ are enabled by default, which reduces the bandwidth needed between the origin AWS infrastructure and the CloudFront edge nodes. Internal Drupal caching, which controls the maximum amount of time a page can be cached by browsers and proxies, is disabled by default. To alter file URLs and easily cache CSS, JavaScript, images, audio and video within CloudFront, it is also suggested to enable Drupal's CDN module. Subsequently, a CloudFront distribution is created via the CloudFront console, which involves configuring the origin, default cache behaviour settings and distribution settings.

Interactive screens using AWS IoT

Drupal is an incredible option for building a scalable digital signage solution for a variety of organisations and can reduce costs, speed up time to market, and help create engaging experiences. The Metropolitan Transportation Authority (MTA), which runs the largest public transit system in the United States of America, has benefitted from leveraging Drupal and AWS IoT services.

A digital signage board displaying a station name at a railway station | Source: Acquia

Drupal, which powers the MTA's website, also helped them serve content and data to thousands of digital signs across hundreds of stations in New York City. Putting digital signage to use in station countdown clocks has allowed the MTA to offer a great customer experience.

Content is built inside Drupal, and data is pulled from external feeds so that the countdown clocks can be supplied with data. The data can be pulled from transit information, weather and message providers because Drupal is equipped with the provider APIs; once the data is given context via the Drupal content model, it is pushed to the digital signs. This is done with the assistance of a data pipeline implemented using AWS's IoT service.

Cross-channel experience with Amazon Alexa

The amalgamation of Amazon Alexa and Drupal can be great for allowing content to be accessed both via the web and voice assistants. The Alexa Drupal module helps with the integration; for this, the Drupal website must be available online over HTTPS. To begin with, the Alexa module is installed and enabled on the Drupal site, followed by the creation of a new Alexa Skills Kit skill. Subsequently, the application ID provided by Amazon under ‘Skill Information’ is copied and submitted to the Drupal site's configuration. The Alexa skill can then be configured in the Alexa Skills Kit, and a customised handler module can be built for handling custom Alexa skills.



A digital agency used this process to build a solution that leveraged both Alexa and Drupal, demonstrated through a fictional grocery store called Freshland Market. A user picks a food recipe from Freshland Market's Drupal site and gets all the ingredients required to cook the dish. The recipe the user asked for was for 8 people, but the site only holds the quantities for 4, so the Freshland Market Alexa skill adjusts the ingredient quantities for 8 people by itself. Guided through a series of questions, the relevant ingredients and the cooking steps, the user finds the food preparation very simple and never has to look at a laptop or mobile phone at any stage.

Open source photo gallery using Amazon Rekognition and Amazon S3

Amazon Rekognition’s powerful face and object recognition capabilities can be leveraged with Drupal to a great extent. Its deep learning feature assesses a plethora of images and then utilises all of that data to label objects and detect faces in separate photos. Amazon S3 can help in storing all the photos on a website in one S3 bucket.
 
In a bid to create an open and powerful solution for building galleries and sharing images, a digital agency integrated S3, Rekognition and AWS Lambda with Drupal 8. The main objectives behind this implementation of an open source photo gallery were that it should be self-hosted, be able to easily upload a plenitude of photos, use Drupal as a content store, leverage S3 for file storage and utilise Rekognition for automatic face and object recognition. The expected outcome was to make Drupal even better for photo sharing.

Flowchart of the image processing workflow between Drupal 8 and AWS | Source: Acquia

They succeeded by developing an automated image processing workflow. A user uploads a single picture or a set of pictures to Drupal 8 with the help of the Entity Browser Drupal module. Using the S3 File System module, Drupal then stores each of the pictures in an Amazon S3 bucket. For every new picture copied into the S3 bucket, an AWS Lambda function is triggered, and the Lambda function sends the image to Rekognition. The function then receives back facial and object recognition data and calls a REST API resource on the Drupal 8 site to deliver the data as JSON. The Rekognition API Drupal module helps parse the data, storing labels and recognised faces in Drupal taxonomies and then relating the labels and faces to the Media Image entity for each of the uploaded pictures.

Conclusion

Drupal 8 keeps setting the bar higher when it comes to ease of use, offering limitless new ways to tailor and deploy your content to the web, easily customise data structures, listings and pages, reap the benefits of new capabilities for exhibiting data on mobile devices, build APIs and adapt to multilingual needs. Digital innovation is the forte of Drupal 8, and when AWS services are used along with Drupal, there is no stopping you from building exciting solutions.
 
We believe in open source innovation and are committed to offering great digital experiences with our expertise in Drupal development. Talk to our Drupal experts at [email protected] and let us know how you want us to be a part of your digital transformation endeavours.

Jul 30 2019

We are well aware that the Drupal Cache API is a remarkable feature introduced in Drupal 8. Still, the topic remains unexplored by many developers, who consider caching a critical aspect of a website. In one of our earlier posts, we exemplified cache tags: https://www.innoraft.com/blogs/how-does-entity-cache-work-drupal-8. Here is a guide that helps you easily grasp some basic concepts of cache contexts.

Cache context is basically a service that helps create multiple cached versions of something depending upon the context/request, be it a view, block or any other section of the page.

For instance, let us consider a block displaying a list of tutorial links on a D8 instance. Authenticated users are given access to all the links, while anonymous ones are provided only with the free tutorials. This data completely depends upon the role of the user, hence ‘user.roles’ can be used as a cache context in such a scenario. For simplicity, let us assume that there exist only two roles: authenticated and anonymous. When an authenticated user hits the page, the version of the block with access to all links is displayed. If another authenticated user then visits the page, the cached version of the block is served, thereby enhancing site performance. When an anonymous user comes to the same page, the entire request is carried out and the display with limited access to links is shown. In this way, we can explicitly decide when a separate cached version of an element is created based on the context.
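In code, varying output by role comes down to attaching that context to the render array. A minimal sketch of a block's build() output (buildTutorialList() is a hypothetical helper, not a real API):

```php
$build = [
  // Hypothetical helper that assembles the list of tutorial links.
  '#markup' => $this->buildTutorialList(),
  '#cache' => [
    // One cached copy per combination of roles the visitor holds.
    'contexts' => ['user.roles'],
  ],
];
```

Drupal's render cache then keeps one variation of this block per set of roles, rather than one copy for all visitors.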

D8 core provides a few predefined cache contexts, which are listed at https://www.drupal.org/docs/8/api/cache-api/cache-contexts.

Our main focus here is how to define and use a custom cache context according to our requirements.

Let us consider a simple example: varying the cache of a block that displays a personalised message based on the summary the user has provided on the user edit page.

Prerequisite: add a field for filling in a summary on the user edit form ([base_url]/user/[user_id]/edit) that Drupal provides by default.

Cache context can be registered as any other service in the module.services.yml file:

services:
  cache_context.user_summary:
    class: Drupal\example_cache_context\CacheContext\UserSummaryCacheContext
    arguments: ['@current_user']
    tags:
      - { name: cache_context }

   example_cache_context.services.yml

The trick here is to understand the naming convention for a new cache context. The name of the service should be of the format cache_context.*, i.e. it should start with ‘cache_context.’ followed by an appropriate name, hence cache_context.user_summary. Similarly, we can define further levels of hierarchy as well. As per the above snippet, the code for the cache context goes into src/CacheContext/UserSummaryCacheContext.php. The service takes the current_user service as an argument; details on how to pass one service as an argument to another are available at https://www.drupal.org/docs/8/api/services-and-dependency-injection/structure-of-a-service-file. We also need to tag this service with cache_context.

The summary field should be used to create this cache_context logic as per the following code snippet:


<?php

namespace Drupal\example_cache_context\CacheContext;

use Drupal\Core\Cache\CacheableMetadata;
use Drupal\Core\Cache\Context\CacheContextInterface;
use Drupal\Core\Session\AccountProxyInterface;
use Drupal\user\Entity\User;

class UserSummaryCacheContext implements CacheContextInterface {

  /**
   * The current user.
   *
   * @var \Drupal\Core\Session\AccountProxyInterface
   */
  protected $user_current;

  /**
   * Constructs the cache context; receives the current_user service.
   */
  public function __construct(AccountProxyInterface $user_current) {
    $this->user_current = $user_current;
  }

  /**
   * {@inheritdoc}
   */
  public static function getLabel() {
    return t('User Summary cache context');
  }

  /**
   * {@inheritdoc}
   */
  public function getContext() {
    // The returned string becomes part of the cache key, so every distinct
    // summary value gets its own cached variation.
    $user = User::load($this->user_current->id());
    return $user->get('field_summary')->value ?? '';
  }

  /**
   * {@inheritdoc}
   */
  public function getCacheableMetadata() {
    return new CacheableMetadata();
  }

}



UserSummaryCacheContext.php

Here the UserSummaryCacheContext class implements the CacheContextInterface interface. The variable $user_current, an instance of AccountProxyInterface, is declared protected and populated as per the arguments mentioned in the services.yml file. The function getContext() contains the code that determines the cache variation based on the context; we can implement any other logic according to our requirements here.

The cache context code is now ready to use. In order to test it, create a block within src/Plugin/Block and place it on any page:


<?php

namespace Drupal\example_cache_context\Plugin\Block;

use Drupal\Core\Block\BlockBase;
use Drupal\Core\Cache\Cache;
use Drupal\user\Entity\User;

/**
 * Provides a block for a particular user's summary.
 *
 * @Block(
 *   id = "user_summary_block",
 *   admin_label = @Translation("User Summary Block")
 * )
 */
class UserSummary extends BlockBase {

  /**
   * {@inheritdoc}
   */
  public function build() {
    $build = [];
    // Render the summary field of the currently logged-in user.
    $user = User::load(\Drupal::currentUser()->id());
    $build['user_summary'] = [
      '#markup' => $user->get('field_summary')->value ?? '',
    ];
    return $build;
  }

  /**
   * {@inheritdoc}
   */
  public function getCacheContexts() {
    // Vary this block's cache by our custom user_summary context.
    return Cache::mergeContexts(parent::getCacheContexts(), ['user_summary']);
  }

}

 UserSummary.php

To verify that the context has been added properly, inspect the response headers in the Chrome DevTools.


If these headers are not visible, you might need to configure your settings to enable them, as per https://www.drupal.org/docs/8/api/responses/cacheableresponseinterface#debugging
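In a standard Drupal 8 setup, these debug headers are switched on with a parameter in sites/default/services.yml:

```yaml
# sites/default/services.yml
parameters:
  http.response.debug_cacheability_headers: true
```

Once enabled, each response carries X-Drupal-Cache-Contexts and X-Drupal-Cache-Tags headers, and user_summary should appear in the contexts list on pages containing the block.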

This eliminates the need to clear the entire site's cache after any data update, which slows down the website. We hope this blog gave you better insight into the usage of cache contexts in Drupal 8. You can now vary and invalidate caches per context, keeping the site's performance in mind while still showing users every new piece of information appearing on the website.

Jul 30 2019

Agiledrop is highlighting active Drupal community members through a series of interviews. Now you get a chance to learn more about the people behind Drupal projects.

We had an amazing talk with the super friendly Maria Totova, a driving force behind the Bulgarian Drupal community, organizer of various educational events, avid speaker and co-founder of Drupal Girls. Have a read and learn more about her numerous interesting projects and her love for Drupal. 

1. Please tell us a little about yourself. How do you participate in the Drupal community and what do you do professionally?

My name is Maria Totova and I am a back-end developer from Bulgaria. I have been using Drupal for the last 4 years and I absolutely love it! I work as a Drupal developer at trio-group communication & marketing gmbh, a leading German brand and communication agency, where we create individualized marketing and business solutions.

I am also a board member at Drupal Bulgaria, a non-profit NGO and the official Drupal foundation in my country, as well as an education manager, community leader & instructor at Coding Girls, a non-profit NGO and an international movement. Last, but not least, I am very happy to be a co-founder of Drupal Girls, a subdivision of Coding Girls, devoted especially to raising the interest towards Drupal and growing a strong and diverse community.

Being part of all these amazing institutions, I have the great pleasure to organize and conduct different kinds of events: meetups, workshops, courses and camps. I do my best to spread some Drupal love in high-schools and universities as well by teaching and mentoring students there. I especially love being a speaker at Drupal conferences and I always try to contribute and share what I have learnt. 

2. When did you first come across Drupal? What convinced you to stay, the software or the community, and why?

When I came across Drupal, I was a freelancer using WordPress to build rather tiny websites for small companies. So, I can say that I discovered Drupal at a stage of my life when I was searching for a change, for something more. And I found it. I started working with big brands on larger, more complex projects for great companies.

What I particularly like about Drupal is that it brings many challenges and opportunities. It's never boring. I learn a lot and do different things every day, develop all kinds of various functionalities all the time.

But most of all, thanks to Drupal I have met and continue meeting so many exciting people! I've got amazing colleagues, so smart and really crazy! :) I have found great mentors who have been helping me grow as a developer and I have made friends for life.

Indeed, the Drupal community is full of awesome people, inspiring folks, so open-minded and always ready to help. I love that! I have found the place where I fit in and feel safe and comfortable.

3. What impact has Drupal made on you? Is there a particular moment you remember?

I remember my first encounter with Drupal. :) It was not really love at first sight... When a friend of mine, who, funny enough, hates Drupal, mentioned it, I decided to take a look. I visited drupal.org, went briefly through the D7 docs, which were quite strange and full of unknown terminology, thought the themes were not so appealing, but still decided to install it and dive in a little deeper.

Then, I encountered the content types and modules, and I was like: “Gee, I want to use this!” :D Of course, I went on learning Drupal, built my portfolio website in the process and a few months later I applied for a Drupal job.

Guess what? They called me and hired me on the very next day! I was over the moon! Since then, I have been absolutely enjoying my work every day at every company! How has Drupal changed my life? Phew, it has turned it upside down and inside out but in a very, very good way. I love it and I am happy. Thank you, Drupal! :)

4. How do you explain what Drupal is to other, non-Drupal people?

I always enjoy explaining what Drupal is to my friends and students. I start by underlining the fact that Drupal is not only a CMS but also a powerful framework. On one hand, you have the full capacity to structure your data and become a great content modeler without even realizing it.

On the other hand, you can build various complex custom solutions via Drupal APIs. I tell them how easy it is to install it for less than 10 min. Then, you receive a solid base that you can build on with only the functionalities you need, depending on the type of your project and without any unnecessary stuff. I describe what an impressive technology Drupal is and focus on its main features: modularity, security, performance, reliability, flexibility, multilingual support, mobile-first approach and so on.

Of course, I don’t forget to highlight the significance of the Drupal community: all the contributions, support and the amazing events that it brings along. In the end, what persuades them best is simply seeing my enthusiasm and understanding that Drupal brings real fun. :)

5. How did you see Drupal evolving over the years? What do you think the future will bring?

When I started with Drupal, it was version 7. Previously, I had experience in writing object-oriented PHP using CodeIgniter (they have the best docs ever!) and I loved the MVC pattern. It took me some time to understand the drupalisms but soon I grew fond of the hook system and everything.

However, the changes in Drupal 8 brought pure delight. The OOP paradigm and Symfony have made a huge difference. I am eager to see what the future brings, especially in terms of decoupled Drupal and consumer applications. Having in mind our great community, I am pretty sure that Drupal will continue evolving and shining!

6. What are some of the contributions to open source code or to the community that you are most proud of?

The Drupal Girls project is one of the things I am quite proud of. The idea behind it is to promote Drupal among ladies and bring more diversity to the Drupal community. We do this by organizing workshops, events & courses and inviting girls to join us in a safe, supportive and inclusive environment. Our main target group is high-school & university students, but we are also happy to work with teachers, instructors and developers using other technologies.

Since our vision is based on integration, we are always happy to have men at our events as well. We all know that men and women think in a different way and this is actually a very good thing! We find out different aspects while working on projects and complement each other.

In fact, more and more companies are starting to realize how important diversity is and how beneficial it is to their organizations. I am very happy that Trio, the company I work for, supports our mission and provides us with the space and everything we need for our events. I hope that more people and organizations will consider joining our initiative by establishing a local community in their city.

Since we are part of the Coding Girls family, a non-profit & non-government organization, all our work is completely volunteer. Thus, we are constantly looking for more mentors and instructors willing to educate and encourage girls to get started with Drupal.
 
The Drupal 8 Companion Guide is another project that is part of Drupal Girls and which I presented at Drupal Europe in Darmstadt last year. It is still a work in progress, but I will do my best to publish it soon. It is a structured and portable reference manual to various Drupal materials which both learners and instructors can adopt anytime, anywhere.

It aims to help beginners focus on the important concepts without losing too much time in prolonged research before they give up. I have been using it for conducting our workshops and courses as well as for building a curriculum for our Trio internship programs for university students. We are planning to provide it to high schools this autumn, too. Of course, the guide can also be used in a self-paced & self-study manner by newcomers on their journey through the Drupal realm.
 
In the meantime, I enjoy being a speaker at Drupal conferences and sharing my knowledge, experience or lessons learnt with the folks there. I particularly like the lively discussions at the end of the sessions, and I am always looking forward to them. One of the local camps that is especially important for me is Drupal Bootcamp Plovdiv and I am very proud to be among its organizers.

It is a two-day conference for total beginners that consists of various presentations, discussions, quizzes and workshops. At the end of the conference, every participant has their own project and a good basic understanding of Drupal.

We have been doing it for the third year in a row and I absolutely love to see new eager-to-learn eyes every time! In addition, thanks to the latest changes to community projects on drupal.org, I am also happy to give credit to our great speakers and mentors!

7. Is there an initiative or a project in Drupal space that you would like to promote or highlight?

Ah, there are so many great Drupal initiatives and projects that I simply cannot list them all. Of course, the first three that come to mind are the Drupal 9, Admin UI & JavaScript Modernisation and Documentation strategic initiatives. These folks are doing a wonderful job and they deserve our respect.

As a developer, I am deeply interested in the D8DX: Improving the D8 developer experience community initiative. Since I come from a Drupal 7 world, and I remember the multi-language combinations and struggles there, I cannot forget to mention how impressed and grateful I am to the Multilingual initiative!

Finally, the Promote Drupal initiative is really important to us all and should definitely be highlighted!

As for projects, I am particularly fond of Thunder: we have been using it as a foundation for developing our own distribution and I enjoy being one of the devs working on it. I also like Drupal Commerce and I am always happy to see new e-shops built on it. Most definitely, every project on drupal.org deserves recognition for all the efforts of its maintainers and contributors!

8. Is there anything else that excites you beyond Drupal? Either a new technology or a personal endeavor. 

When I am not busy with Drupal, I volunteer the rest of my time to Coding Girls as an education manager, instructor and mentor. Coding Girls is an international organization promoting an increased presence of girls and women in technology, leadership and entrepreneurship. We have communities in different cities around the world and are constantly growing.

I am the community leader of Coding Girls Plovdiv in my hometown, where we have been organizing free meetups, courses, workshops and all kinds of tech events for more than two years. Apart from the summer break, we are quite busy as we have an event almost every Thursday. This is how I have gained solid experience in organizing events and I enjoy it a lot!

Besides, now I have the chance to do the thing I love as much as programming – teaching. I know how important mentorship is and I am happy to do it for other people, to pay it forward. :)

Jul 30 2019
Jul 30

The Intense Drupal 8 module provides a nice full-screen zoom of the images on your site. Keep reading if you want to learn how to install and use this module, with a practical example.

Step #1. Download and Install the Required Modules

  • Open the terminal application on your PC
  • Go to the root of your Drupal installation (the composer.json file is inside this directory)
  • Type the following command:

composer require drupal/intense

Enter the composer command

  • Click Extend
  • Scroll down, search and check the following modules:
    • Blazy
    • Blazy UI
    • Intense images
  • Click Install
  • The System will install the core Media module, which is required
  • Click Continue

Click Continue

After installing the modules, you need to download, unzip, and put the required libraries in place.

  • Create a libraries directory on the root of your Drupal installation (the core directory is there)
  • Download the Intense library from this GitHub page.

Download the Intense library

  • Place the zip file inside the libraries folder
  • Extract it
  • Rename the extracted directory to intense
  • Repeat the process with the Blazy library
  • Rename it to blazy
  • At the end you should have the following file structure:

At the end you should have the following file structure
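The library layout the modules expect can be sketched in the shell. The block below is a minimal, illustrative dry run in a temporary directory; on a real site, the Drupal root is your actual installation and the two directories come from unzipping the downloaded Intense and Blazy archives and renaming the extracted folders.

```shell
# Illustrative dry run of the expected layout, created in a temp dir.
# On a real site, DRUPAL_ROOT is your Drupal installation and the two
# directories come from unzipping and renaming the downloaded archives.
DRUPAL_ROOT=$(mktemp -d)
mkdir -p "$DRUPAL_ROOT/libraries/intense" "$DRUPAL_ROOT/libraries/blazy"
# The modules look the libraries up by these exact paths:
test -d "$DRUPAL_ROOT/libraries/intense" \
  && test -d "$DRUPAL_ROOT/libraries/blazy" \
  && echo "layout ok"
```

The directory names matter: if the extracted folders keep their archive names, the modules will report the libraries as missing.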

Step #2. Add a Field to the Article Content Type

Our site is promoting fishing trips, so a price field is required.

  • Click Structure > Content types > Article
  • Click Manage fields

Click Manage fields

  • Click Add field
  • Select Number (decimal)
  • Give the field a proper label
  • Click Save and continue

Click Save and continue

Click Save field settings

Click Save field settings

  • Check this field as Required
  • Add the prefix ‘$ ‘ (don’t forget the quotation marks)
  • Click Save settings

Click Save settings

Step #3. The Intense Image Configuration

  • Click Structure > Content types > Article > Manage display
  • Look for the Image field Format and select Blazy from the dropdown
  • Click the cogwheel on the right
  • Select the Media Switcher option and choose Image to Intense
  • Click Update

Click Update

  • Place the Price field right over the Comments field
  • Click Save

Click Save

If you want to have this effect on the teasers of your articles, just edit the Teaser view mode with the same configuration options.

Edit the Teaser view mode with the same configuration options

Step #4. Create Content

  • Click Content > Add content > Article
  • Write proper content
  • Upload an image
  • Click Save

Click Save

If you hover over the image, you should see a cross over it, and your cursor will turn into a cross too.

You should see a cross over the image

Click the image and it will zoom to cover the whole screen.

Click the image, it will zoom and cover the whole screen

You can pan over the image by moving your mouse. The image closes when you click it again.

I hope you liked reading this tutorial and putting it into practice (of course)!

Thank you and stay tuned for more Drupal content.


About the author

Jorge lived in Ecuador and Germany. Now he is back in his homeland, Colombia. He spends his time translating from English and German to Spanish. He enjoys playing with Drupal and other Open Source Content Management Systems and technologies.
Jul 30 2019
Jul 30

Sites with long pieces of content or with a long landing page often have a little arrow at the bottom, which helps you get back to the top of the site instead of scrolling the whole way back.

The Back To Top Drupal 8 module helps site-builders who are not yet ready to work with templates or JavaScript to place this kind of button on their sites.

Keep reading to find out how.

Step #1. Download and Install the Required Module

  • Open the terminal application on your PC
  • Go to the root of your Drupal installation (the composer.json file is inside this directory)
  • Type the following command:

composer require drupal/back_to_top

Type the composer command

  • Click Extend
  • Scroll down, search for the Back to top module and check it
  • Click Install

Click Install

Step #2. The Module Configuration

  • Click Configuration > User Interface > Back to Top
  • Check Prevent on administration pages and node edit
  • Change the Button text
  • Leave the default PNG 24 image button
  • Click Save configuration

Click Save configuration

Step #3. Replace the Button Image

The image file is called backtotop.png and is located at /modules/contrib/back_to_top.

  • Rename the file backtotop.png to backtotop1.png
  • Paste a new image file (70px wide, 70px high) called backtotop.png into this directory
  • Paste another image file with the same dimensions called backtotop3.png.

This file will be used to achieve a hover effect.

This file will be used to achieve a hover effect

The backtotop2x.png file is there for retina displays. You can replace it using the same method; just make sure this file is 140px wide and 140px high.
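The renaming steps above can be sketched in the shell. The block below is an illustrative dry run using placeholder files in a temporary directory; on a real site you would run the same mv/cp commands from the module's directory with actual 70x70 PNG images.

```shell
# Illustrative dry run of the image swap, using placeholder files in a
# temp dir; on a real site, run these commands from the module's
# directory with real 70x70 PNG files.
dir=$(mktemp -d)
cd "$dir"
printf 'shipped-image' > backtotop.png   # stands in for the module's image
mv backtotop.png backtotop1.png          # keep the original as a backup
printf 'new-70x70' > backtotop.png       # your replacement button image
printf 'hover-70x70' > backtotop3.png    # the image shown on hover
ls backtotop.png backtotop1.png backtotop3.png
```

Keeping the original as backtotop1.png means you can undo the swap if the new images don't look right.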

Step #4. Edit the CSS Code

To display the green arrow when hovering over the yellow arrow you have to edit the CSS code of the module.

The CSS file is located at /modules/contrib/back_to_top/css.

  • Open this file in the code editor of your liking
  • Edit the #backtotop:hover selector
  • Add the following code:
#backtotop:hover {
  background: url(../backtotop3.png) no-repeat center center;
  bottom: 20px;
  cursor: pointer;
  display: none;
  height: 70px;
  position: fixed;
  right: 20px;
  text-indent: -9999px;
  width: 70px;
  z-index: 300;
}

Add this code

  • Click Configuration > Performance > Clear all caches
  • Important! Also clear your browser's cache

Head over to a long article on your site and scroll down. The yellow arrow should appear.

Head over to a long article on your site and scroll down. The yellow arrow should appear

Hover over it to see how the other image gets pulled.

Hover over it to see how the other image gets pulled

If you click on this button the page will scroll back to the top.

The Text/CSS Button

I was not able to edit the colors of the button through the user interface. However, you can tweak the look and feel of it by editing the CSS file located at /modules/contrib/back_to_top/css/back_to_top_text.css.

You can tweak the look and feel of the button

You can tweak the look and feel of the button

I hope you liked reading this tutorial. Stay tuned for more Drupal content. Thanks!


About the author

Jorge lived in Ecuador and Germany. Now he is back in his homeland, Colombia. He spends his time translating from English and German to Spanish. He enjoys playing with Drupal and other Open Source Content Management Systems and technologies.
Jul 29 2019
Jul 29

In the coming weeks, you can expect a series of changes going into the development pipeline to support the CiviCRM-Drupal 8 integration. Individually, these will seem unrelated and disjoint - they may not explicitly reference “D8”. I wanted to spend a moment to discuss the concept which ties them together: the clean install process, which will make Civi-D8 an equal member of the Civi CMS club and a good base for continued development and maintenance.

This work on D8 stabilization has been made possible by the generous funders of the Civi-D8 Official Release MIH. If you’d like to see more topics addressed, please consider contributing to the MIH.

What do you mean by "clean" install process?

A "clean" install process is a set of steps for building a site in which CiviCRM comes in a direct fashion from source-code with a bare minimum of intermediate steps.

To see this concept in action, we can compare CiviCRM's integrations with Drupal 7 and Joomla:

  • CiviCRM-D7 is amenable to a (comparatively) clean install process. There are three Github projects (“civicrm-core”, “civicrm-packages”, and “civicrm-drupal”) which correspond directly to folders in the web-site. If you copy these projects to the right locations, then you have an (almost) valid source-tree.

  • CiviCRM-Joomla is not amenable to a clean install process. You might think it's similar -- it uses a comparable list of three projects (“civicrm-core”, “civicrm-packages”, “civicrm-joomla”). The problem is that “civicrm-joomla” does not correspond to a singular folder -- the install process requires a scattered distribution of files. The install script which handles this is tuned to work from the “civicrm-X.Y.Z-joomla.zip” file, and producing that file requires a tool called ”distmaker”. “distmaker” is fairly heavy - it requires more upfront configuration, is brittle about your git status, runs slower, and produces 200+ MB worth of zipfiles. In short, building a CiviCRM-Joomla site from clean source is more difficult.

Why does a "clean" process matter?

It's easier to develop and maintain software when the build is clean and intuitive. Specifically:

  • It's easier to climb the ladder of engagement from user/administrator to contributor/developer.

  • It's easier to lend a hand - when someone submits a proposed patch, it's easier to try it out and leave a friendly review.

  • It's easier to set up automated QA processes for evaluating proposals and upcoming releases.

  • It's easier to set up sites for RC testing, issue triage, pre-release demos, and so on.

  • It's easier to pre-deploy a bugfix that hasn't been officially released yet.

Anecdotally, more experts with stronger collaborations have grown up and stayed around in the Civi-D7 and Civi-WP realms than in the Civi-Joomla realm. And that does not feel like a coincidence: as a developer who watches the queue, I'm generally intimidated by a Civi-Joomla patch -- even if it looks simple -- purely on account of the difficult workflow. I believe that a reasonably clean/intuitive build process is a prerequisite for a healthy application and ecosystem.

Moreover, a clean build of Civi-D8 is important for Civi's future. Civi-D7 cannot be the reference platform forever - if we expect Civi-D8 to take that mantle, then it needs to be on good footing.

What kind of changes should we expect?

From a civicrm.org infrastructure perspective: Expect automatic setup of D8 test/demo sites - in the same fashion as D7, WordPress, and Backdrop. This means PR testing for "civicrm-drupal-8". For bug triage, it means normalized and current test builds. For release-planning and QA, the test matrices will provide test-coverage for D8 (similar to the other CMS integration tests). These services are blocked on the need for a clean process.

From a site-builder perspective: Expect the recommended template for `composer.json` to be revised. This should improve support for backports and extended security releases. Early adopters may eventually want to update their `composer.json` after this settles down; however, the details are not set in stone yet.
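As a purely illustrative sketch (the package names and version constraints below are assumptions for illustration only; as noted, the actual recommended template is still being settled), a composer-based site's `composer.json` might require CiviCRM along these lines:

```json
{
    "require": {
        "civicrm/civicrm-core": "~5.16",
        "civicrm/civicrm-packages": "~5.16",
        "civicrm/civicrm-drupal-8": "~5.16"
    }
}
```

Do not copy this into a production site; wait for the revised template before updating your `composer.json`.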

From a developer perspective: Expect the process for setting up `git` repos to become simpler. Instead of using the bespoke `gitify`, you'll be able to use the more common `composer install --prefer-source`.

From a code perspective: Expect changes to code/use-cases which (directly or indirectly) require auto-generated files. For example, when initializing Civi's database, the current Civi-D8 installer relies on “.mysql” files (which have been pre-generated via ”distmaker”); we can replace this with newer function calls which don't require pre-generated files -- and therefore don't depend on ”distmaker” or “setup.sh”.

Jul 29 2019
Jul 29

I joined Liip for an internship in POing. To get the most out of this opportunity, I prepared for the Scrum Product Owner certification, which I passed. Here are my key takeaways.

Why a Scrum Product Owner certification?

The first step to getting certified, I would say, is to be ready to dedicate time to the preparation and training, and to commit to it. This wasn’t a problem for me, as I was highly motivated to improve my skills and knowledge about Scrum. And here is why.

I worked more than 10 years in international corporations, using traditional models for product development. And I reached the point where I sensed that something wasn’t quite right.

As part of the product development team, we were asked to deliver a set of features based on specifications defined at the beginning of a project. All the specifications had to be delivered by a mid- to long-term deadline. Needless to say, it wasn’t possible to modify them. The sign-off was only possible if the defined specifications were delivered. I often experienced a big rush at the end of the project because neither the scope nor the deadline could be changed. The team members suffered, sometimes almost burning out.

During the latest projects I worked on, feedback from end-users was given during the testing phase. It was already too late, though. End-users requested features that were not part of the specifications defined at first. As the project team, we were told: “end-users can make an enhancement request later on, after the go-live”. Like my former colleagues, I had this sad feeling. We worked really hard on delivering these features. But we weren’t proud of them, as end-users were complaining.

There had to be something better. A way where user needs are at the core of the product. A way where input from the team members really counts. That’s when I came across the Scrum Guide. It became clear to me that I wanted to be part of this. That I wanted to be part of a Scrum team.

Given my experience and skills, Product Owner was the role that appealed to me. To get there, I set myself two objectives: gaining experience and getting certified.

Collaboration with Liipers

I had the chance to join Liip for a three-month training in POing. Witnessing practical POing and being immersed in the Scrum philosophy was part of the deal. I was onboarded on different projects too, working with different teams and clients. This helped me internalize how the Scrum Guide should be applied in practice. I got a deeper understanding of the Scrum events, the Scrum artifacts, as well as the roles of the Scrum Master, the Product Owner and the development team.

Yes, self-organized teams work! I was strongly motivated by this environment where all team members are equally responsible for the success or failure of the product. It really makes everyone commit to the work to be done, and brings up new ideas and solutions.

What about the Product Owner?

This role is the one which always keeps the end-user in mind, not only for the team but also for the client. In my opinion, one of its biggest challenges is to convince the team and the client that real end-user feedback is a must. The PO is the one prioritizing the features to be developed, expressing user stories in the most intuitive manner and identifying the needs of the client and end-users.

I believe that as a Product Owner you need to be empathetic, able to synthesize, a good listener and a good communicator.

During my training, I was inspired and empowered by my fellow PO colleagues. I loved their responsiveness and the way they reorganized work to seize each business opportunity. User needs are evolving, and as the Scrum team we have to adapt our developments to them. The Scrum framework allows us to do so.

Sprint after sprint, I was amazed how the product was developed, refined and improved to perfectly meet the evolving user needs.

Becoming a Product Owner myself was definitely the right choice for me.

Training with Léo – a certified Scrum Master and Agile Coach

At Liip, I had the chance to meet Léo – a Scrum Master and Agile coach. He guided me through the different steps, advising me to several really interesting readings such as Scrum, A Smart Travel Companion by Gunther Verheyen. Thanks to his coaching, I gained a deeper understanding of Scrum’s essence. He challenged me and made me think about Scrum principles and why Scrum has to be fully implemented – not just bits and bites of this amazing framework.

Getting certified, what’s next?

At the beginning of July, I felt ready for the Scrum certification. And I nailed it!

Actually, I applied the Scrum principles to my own training. The vision I have for my product is “to become an efficient Product Owner.” My first iteration was “to understand the role of a Scrum Product Owner”, and the certification was the test. I gave myself three weeks to reach that goal (sprint goal set, sprint duration set, commitment of the team, self-organization). At the same time, I was open to failure. After all, we never fail; we only learn.

This first iteration was a team effort too. I even had a Scrum Master – you know, Léo – on my team ;-). I built my knowledge on my colleagues’ experience (empiricism). My minimum viable product evolved from “understanding the role of a Scrum Product Owner” to “being a certified Product Owner”.

I am proud to announce that my first product works! And I’m already working on the next, improved iteration, so that my (own) product fits the market needs.

I feel fully equipped to embrace a new working life as a Scrum Product Owner. I want to help people who are used to working with traditional models evolve toward the Scrum framework.

Last but not least, I will carry on learning and adapting fast. I will be Agile and help others to achieve this goal as well.

Jul 29 2019
Jul 29

Conversations concerning accessibility of digital assets tend to fall into one of two categories: the carrot or the stick. 

The carrot is the host of business benefits associated with ensuring a great and accessible experience for all users. All too often, though, it’s the stick -- the threat of a lawsuit or actual legal action filed in federal court under Title III of the ADA -- that drives organizations to begrudgingly take steps toward getting their digital assets into compliance with WCAG 2.1 and ADA Section 508. 
 

Accessibility Claims Climb

Let there be no doubt: the stick is real and gaining momentum at a rapid pace. Lawsuits based on claims that a disabled person could not access a website because it was not coded to work with screen readers, or other assistive technologies, continue on a sharp upward trajectory. In 2018, we witnessed a threefold increase in accessibility lawsuits over 2017 -- from 814 to 2,285. Accessibility lawsuits filed during the first quarter of 2019 were up more than 30 percent over the first quarter of 2018.

The Southern and Eastern Districts of New York are handling the majority of these claims, followed by Pennsylvania and Florida, but the current geographic concentration cannot be viewed as any indicator of what’s next. The fact is, any organization that has a consumer-facing website that is not accessibility compliant risks legal action.

There is no shortage of statistics such as these pertaining to the “stick” -- the need for urgent and in-depth action to ensure accessibility compliance. I find conversations concerning the carrot to be far more fruitful, though.
 

Accessibility is Good for Business

It should come as no surprise that the recent surge of accessibility-related lawsuits parallels the radical shift toward ecommerce. People of all physical and cognitive abilities are now counting on websites to buy what they need online, and retailers who are proactive about the accessibility of their sites are at a significant advantage for more reasons than simply staying out of court.

Any time a client is unable to complete a transaction or has a sub-par online shopping experience, that’s a lost opportunity. Chances are slim that a frustrated shopping experience will lead to a lawsuit, but there’s a significant likelihood that the client will look elsewhere -- possibly never returning to the site that was perceived to be problematic. This is among the reasons why it is so essential that we take a holistic view of online accessibility -- looking beyond the legal mandates and sharpening the focus on the needs and expectations of the users of your site. 

Vast and Varied Market

Recent Census Bureau statistics reveal that 56.7 million Americans, close to 20 percent of the population, have some form of disability.

Specifically:  

  • 3.3 percent of the population are visually impaired, which can include color blindness, or may require them to use a screen magnifier or a screen reader.
  • 3.1 percent have a hearing impairment for which they need to rely on captions for audio and visual media. 
  • Another 6.3 percent of the population have some sort of cognitive, mental, or emotional impairment which could impede their ability to complete a transaction without clear and consistent navigation and prompts.
  • And 8.2 percent have difficulty lifting or grasping, which could challenge their ability to use a mouse or navigate a keyboard. 

Accommodations and Awareness

Remediating a site to ensure accessibility will enhance the experience and enable commerce for differently abled users, while also attracting more users to the site.  A huge and seldom discussed advantage of online accessibility is its impact on SEO. Consider the analogy of how difficult it can be to find what you need on a cluttered desk stacked up with unmarked files. A well-ordered site that is tagged appropriately, with images that are accurately described, is not only key to accessibility compliance, it’s essential for modern browser searches. And don’t overlook PDFs. Information that is locked in an inaccessible PDF won’t be found by search engines -- a critical competitive disadvantage in the current market. 
 
A holistic and heartfelt approach to accessibility results in a blurring of the lines between user experience and ADA mandates. Much of this is accomplished during Promet Source Human-Centered Design workshops, which take a deep, consensus-building inquiry into user needs and opportunities for growth. A holistic approach to accessibility is about ensuring a great user experience. It’s an investment in your brand and a profound opportunity for growth.

Looking for help with defining your audience and ensuring an accessible user experience that exceeds expectations? Contact us today.

Jul 29 2019
Jul 29

This year, Drupal GovCon had an amazing turnout. So great that registration reached full capacity and attendance had to be capped at 1,500 people!

Our team had an amazing time exploring Bethesda, from the conference grounds to the thriving community around it. While there we enjoyed conversing with the community, and forging new relationships. We learned many things, and shared our experiences with many others as well. Overall, it was a great environment for exploration and growth.  

From Hook 42, a first-time-ever Drupal event attendee, an experienced Drupal veteran, and a goldilocks community member somewhere in the middle joined forces. With such a variety of community involvement from our team alone, each of us came to this event with different expectations, and each person had their own goals and takeaways from the experience. We’re excited that an event can bring so many different people together and provide an experience that resonates with many different community members. Here's what we took away from GovCon this year.

Aimee's Thoughts

GovCon 2019 delivered another great experience for me. I am a huge advocate of GovCon. I have found great value in my time spent here and look forward to returning each year. This camp is such a well-run event that consistently delivers an inclusive, supportive and rewarding experience. I very much enjoyed my time there as a speaker, member and participant. I appreciated the structured content as well as the well-attended after events, where I got a chance to catch up with old friends and meet many new people.

The session lineup at GovCon 2019 was packed with useful information from talented speakers. There was so much I couldn’t attend that I’m especially grateful for the YouTube recordings. A big shout out to Kevin Thull and the team for getting these up!

It was an honor to be able to deliver my keynote, CommYOUnity: Fostering Healthy Contribution, at GovCon. Community is such a vital, core value to Drupal and that community is built on you and me. When we can build healthy relationships and make time to take care of ourselves we’re actively working on taking care of the community.

Thanks again for the wonderful opportunity and great experience GovCon!

Ryan's Thoughts

First, I wanted to give a big thank you to the event staff and volunteers who made GovCon 2019 possible. I am a first-time GovCon attendee and had a great time. I knew when and where the sessions I was interested in were and found my way around NIH easily. The great planning and coordination that went into GovCon made my experience very enjoyable.

Janessa Worrell and Brianna McCullough, from Acquia, gave a great presentation on governance, "Winter is Coming: How to Use Good Governance to Prevent your Organization from Becoming a Game of Thrones". They brought awareness to some of the harder to reach topics that help an organization align to make better project decisions and did it in an entertaining and informative way.

It was a pleasure to meet David Minton and Stephen Pashby from DesignHammer and then later see their presentation on estimation, "Successful Project Estimation: How To Get Your Weekends Back". Their model of historical throughput with detailed task tracking is an engaging idea I think we could put to use.

I really appreciate the community that came together to share their thoughts and experiences and I look forward to my next opportunity to attend.

Lindsey's Thoughts

I’m semi-local to DC and so familiar with the area that I used to come here as a child on field trips. I wasn’t coming with the excitement of exploring the city, but with the intention of bonding with the Drupal community itself. Making connections with people experiencing similar situations as I evolve within the Drupal world is very important to me.

I spent most of my time holding down the booth, talking with many people about the great things our team has to offer. It’s a great way for a new Drupal community member to get to know a lot of people. For the first time ever, I experienced the joy of recognizing connections I had made at another Drupal event and seeing their faces in the community again. Those bonds are already strengthening, and I couldn't be happier! I am getting to understand just how connected our community really is.

When I wasn’t at the booth, I attended sessions about accessibility. One common theme in all of them was that the speakers shared empowering experiences that resonated with the audience. They weren't necessarily technical or Drupal-related, but they provided a very humanistic connection to an issue that is usually backed by threatening legal concerns.

I enjoyed hearing Shawn Brackat from Forum One sharing a story of a blind couple receiving a 3D model of an ultrasound, and Catharine McNally from Phase2 sharing her stories about growing up deaf and receiving the first ever study on cochlear implants. These are aspects of accessibility that people really bond with. Being inclusive is more than just avoiding a lawsuit. It’s about making things easier for everyone, and providing ways to accommodate those that do things differently. Everyone experiences things differently, disability or not, and it's important that we keep that in mind as we build digital experiences for the masses.

I enjoyed participating in my first GovCon in a variety of ways. Whether that was sitting at the booth, attending sessions, or dinner with great friends - I have had so many wonderful experiences. I’m already looking forward to next year.

Wrapping Up

The community always goes above and beyond, providing experiences that everyone can resonate with. GovCon did not fall short of those high standards. We’re so thankful to all of the organizers, and happy to have been one of many great sponsors that can help support events like these. All of us had a wonderful time, and we’re happy to be so involved with a community that has a great ecosystem. We’re already planning our next event... we’ll see you in Denver!
 

Jul 29 2019
Jul 29

Good images can be your website’s best friends if you treat them well. But they can be your website’s enemies if you have never heard about image optimization.

Fortunately, there are many useful optimization tools, and Drupal 8 ships with a number of great ones out of the box. This is one of the many Drupal 8 benefits that make website owners want to upgrade.

So let’s talk about the reasons to optimize images and the ways to handle image optimization in Drupal 8 via the core features.

Why image optimization is vital

  • Optimized images do not overload your website and significantly improve its performance compared to serving “raw” ones. Optimization saves precious seconds of your users’ browser loading time.
  • Conversions potentially increase because users do not leave a slow website. That’s why so much attention is paid to e-commerce product image optimization. However, the ability for users to reach their goals is vital for every type of site and business.
  • Image optimization with mobile devices in mind significantly broadens your audience.
  • It is beneficial for SEO because search engines consider website speed a ranking factor. Properly formatted images also have a better chance of showing up in image search results.
  • Automatic optimization saves your editorial staff a lot of work on content moderation.
  • Finally, it goes without saying that customer satisfaction and brand reputation benefit from fast-loading, trimmed visual assets.

Image optimization in Drupal 8 core at a glance

Drupal 8 has an image styling feature out of the box that allows you to trim original images for various scenarios. All uploaded images will be automatically formatted in the specified ways.

For example, if all users upload profile pictures, there will be no “chaos” in their dimensions. You can “ask” Drupal to show their thumbnails on the list of users, the pictures of specified sizes on the account page, and so on. 

To trim the original pictures as part of optimization, Drupal 8 applies effects such as scale and crop, resize, rotate, and more. 

It is also possible in Drupal 8 to create responsive designs and show different image styles according to device dimensions. You can have high-resolution Retina images and small ones for mobile devices. All this is provided by Drupal 8’s built-in Responsive Image and Breakpoint modules.

You can always entrust image optimization in Drupal 8 to our Drupal developers, so all your visual assets are optimized for all use cases and all devices.

Drupal 8 image optimization at a closer look

Creating image styles

Let’s imagine someone uploads a profile picture with large dimensions and plenty of extra space.

large image to be optimized in Drupal 8

In terms of optimization, the scaling and cropping could be perfect here. And we can apply it automatically to all our Drupal users’ pictures. 

In Configuration — Media — Image Styles, we see Drupal 8’s default ones (“thumbnail,” “medium”, and “large”) and can add any others. So we:

  • create an image style and call it “User picture”
  • select the “scale and crop” effect for it
  • click “add” and “save”

Creating an image style in Drupal 8

When we edit this style, we can choose the width and height, as well as the part of the picture to retain during the crop. A good default is the center.

Specifying dimensions in Drupal 8 image style
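Under the hood, each image style is stored as configuration. A hedged sketch of what the exported config (image.style.user_picture.yml) might look like for a 400×400 scale-and-crop style; the effect key is a generated UUID, shown here as a placeholder, and the machine name is an assumption:

```yaml
# Hypothetical export of the "User picture" image style.
langcode: en
status: true
dependencies: {  }
name: user_picture
label: 'User picture'
effects:
  effect-uuid-placeholder:
    uuid: effect-uuid-placeholder
    id: image_scale_and_crop
    weight: 1
    data:
      width: 400
      height: 400
```

Exporting the style this way (e.g. with configuration management) lets you deploy the same optimization settings across environments instead of re-creating them in the UI.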

Assigning image styles to use cases

So let’s use the newly created “User picture” style as a standard for user accounts. The styles can be assigned to an image field wherever it appears — in content types or their particular view modes (e.g. article teaser), in Views (e.g, a collection of articles), and so on. 

In our case, this is the “Picture” field in the user account (Configuration — People — Account settings — Manage display). Next to the field, we click on the cogwheel and select the “User picture” style.

Assigning Drupal 8 image style to image field

And now our user Spiderman will have his picture scaled and cropped to 400×400, and the same automatically applies to all other user pictures, old or new. A nice optimization!

Image optimization for user pictures in Drupal 8

Providing responsive design

Trimming the pictures to 400×400 looks pretty nice, but all devices are different. To have different styles for different device dimensions, developers enable the Responsive Image module and use the Breakpoint module in Drupal 8 core.

We specify the breakpoints (the “turning points” in device dimensions) at which Drupal 8 will switch to a different image style. We do this in the theme’s *.breakpoints.yml file. Thanks to this, all devices show the picture well.
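As a sketch, a theme’s breakpoints file could look like the following; the theme machine name "mytheme" and the exact media queries are illustrative assumptions:

```yaml
# Hypothetical mytheme.breakpoints.yml.
mytheme.mobile:
  label: Mobile
  mediaQuery: ''
  weight: 0
  multipliers:
    - 1x
mytheme.tablet:
  label: Tablet
  mediaQuery: 'all and (min-width: 560px)'
  weight: 1
  multipliers:
    - 1x
mytheme.desktop:
  label: Desktop
  mediaQuery: 'all and (min-width: 960px)'
  weight: 2
  multipliers:
    - 1x
    - 2x
```

The multipliers (1x, 2x) are what make high-resolution Retina variants possible: each breakpoint can serve a pixel-density-specific image style.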

When the breakpoints are ready, we create a responsive image style in Configuration — Media — Responsive Image Styles, select the name of the theme in “Breakpoint group,” and attach image styles to breakpoints. The final optimization accord is assigning the responsive image style to the image field. 

Let’s optimize your Drupal 8 images

Image optimization in Drupal 8 is well organized and equipped with user-friendly interfaces, but at the same time the process can look a bit tricky to non-developers, especially the responsive design part.

So our Drupal team is ready to do all the necessary optimization on customers’ websites. To do this, we can use Drupal 8's core capabilities or other tools from our developers’ large arsenal.

With us, you can also migrate to Drupal 8, which has image optimization and responsiveness as a priority. 

Enjoy the benefits of having your images optimized!

Jul 29 2019
Jul 29

Decoupled Days is a conference for anyone who works with decoupled Drupal technology: developers, architects, executives, and even marketers like me. It’s been around since 2017 and focuses on sharing knowledge about back-end CMS development as a content service as well as front-end development for applications that consume that content.

Amazee Labs has specialized in innovative decoupled development for years, and we were excited to sponsor and speak at this annual event. Amazees travelled from all over the world to present, learn, share, and enjoy the chance to hang out face-to-face with coworkers we usually see only on computer screens. As a non-developer attendee, I was there to help man the booth, share what was happening on our blog and social media, and make sure our usually-distributed team got plenty of chances to enjoy the city and each other’s company. 

DD Team Working

After getting settled at our charming hotel in the middle of Manhattan, I met up with Amazees from all corners of the world for drinks at a rooftop bar with beautiful views of the city. I sat at a table with colleagues from Cape Town, Zürich, the UK, Taiwan, Spain, and even Upper Manhattan. We talked shop, of course, but also got to have some much-needed social time. Amazee has such a rich distributed culture that I often forget I haven’t met certain team members in person until I’m surprised by how tall they are. On Zoom, I suppose, we’re all the same height.

DD Table

On the first day of the conference, we set up our table, featuring some cool swag, including custom headless horseman stickers we made specifically for the event. The Drupal community is uniquely tight-knit, and at every event I’m struck by people’s passion and excitement to share their knowledge across companies and specialities. What a difference an open-source perspective makes.

DD Jamie

DD Fran

DD Bryan

There were some great technical sessions over the course of the conference, including Stew West’s Intro to GraphQL and Twig, Jamie Hollern’s presentation on Storybook and Drupal, Fran Garcia-Linares talking about GraphQL V4, and John Albin Wilkins’ presentation about Gatsby and GraphQL Schema Stitching.

From a business perspective, Pascal Kappeler and Stephanie Lüpold discussed an interesting case study and Bryan Greenburg talked about how to Decouple Your Team.

Michael Schmid shared two sessions from the amazee.io perspective, one was a look at best practices from over three years of building and hosting decoupled sites, and one about caching decoupled websites to improve speed.  

DD Amazee Ladies

It was a lot to cover in a couple of days, but everyone at the conference seemed energetic and excited to be there. On breaks, we got coffee and found cute lunch places to meet up with people we knew, or people we’d just met. For me, the highlight of the week was our team dinner, where we sat down as a group to eat, drink, and laugh together as an Amazee family. 

Jul 29 2019
Jul 29

The Drupal Cache API is used to store data that takes a long time to compute. Caching can be permanent or valid only for a certain time span, and the cache can contain any type of data. To make websites faster, Drupal stores rendered web pages in a cache.

The Drupal cache has three properties

  • Cache contexts create variations when render arrays are generated. If we have user as a context, for example, cached output varies per user, since users may not have the same permissions or language.
  • Cache tags define what data the cached object depends on: dependencies on data managed by Drupal, like entities and configuration.
  • Cache max-age is the maximum time the cached item may be served from the cache.

Here is an example from a custom block in Drupal 8:

use Drupal\Core\Cache\Cache;

return [
  '#theme' => 'user_profile_template',
  '#user_data' => $user_data,
  '#cache' => [
    // Tags: invalidated when node 5 or user 3 changes.
    'tags' => ['node:5', 'user:3'],
    // Contexts: output varies by language and time zone.
    'contexts' => ['languages', 'timezone'],
    'max-age' => Cache::PERMANENT,
  ],
];

Use of cache tags in Entity Caching

The cached data in different bins becomes stale at some point and must be removed from these bins to accommodate the latest changes. Before Drupal 8, there was no way to identify individual pieces of expired data stored in different cache bins.

Cache tags provide a way to track which cache items depend on data managed by Drupal. If renderable output (say, the output of a controller or a custom block) depends on content provided by some entity, we use cache tags to invalidate that output when the entity changes. For example, suppose a node is updated, and that node appears in two views and three blocks: without cache tags, we wouldn't know which cache items to invalidate.

The syntax for a cache tag is thing:identifier. It must be a unique string and cannot contain spaces.

Entity cache tags take the form <entity type ID>:<entity ID>:

'tags' => ['node_list'], // Invalidated when any node is added, updated, or deleted.
'tags' => ['node:1', 'taxonomy_term:2'], // Invalidated when node 1 or taxonomy term 2 is updated.
 

We can also cache our own computed data under a custom cache ID:

  • Request a cache object through \Drupal::cache().
  • Define a cache ID (cid) value for your data. A cid is a string, which must contain enough information to uniquely identify the data.
  • Call the get() method to attempt a cache read, to see if the cache already contains your data.
  • If your data is not already in the cache, compute it and add it to the cache using the set() method.

use Drupal\Core\Cache\Cache;
use Drupal\node\Entity\Node;

$nid = 9;
$cid = 'my_module:' . $nid;

// Check if the cache already contains the data.
if ($item = \Drupal::cache()->get($cid)) {
  return $item->data;
}

// The data to be cached.
$node = Node::load($nid);
$data = [
  'title' => sprintf('## %s', $node->getTitle()),
  //...
];

// Cache the data permanently; the node's cache tags invalidate it when $node changes.
\Drupal::cache()->set($cid, $data, Cache::PERMANENT, $node->getCacheTags());

A cache item can have multiple cache tags (an array of cache tags), and each cache tag is a string. Drupal associates cache tags with entities and entity listings. It is important to invalidate listing-based caches when an entity is deleted or a new entity is created. This can be done using EntityTypeInterface::getListCacheTags(), which enables code that lists entities of a type to ensure that newly created entities show up immediately and deleted ones disappear.

Cache::invalidateTags() is used to invalidate all cache items tagged with any of the given cache tags:

// Invalidate all cache items with certain tags.
\Drupal\Core\Cache\Cache::invalidateTags(['node:1', 'user:7']);
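In practice, invalidation is typically triggered from the code path where the underlying data changes. As a hedged sketch (the module name my_module and the 'my_module:&lt;nid&gt;' tag are illustrative assumptions, not part of any API), an update hook could invalidate a custom tag like this:

```php
<?php

use Drupal\Core\Cache\Cache;
use Drupal\node\NodeInterface;

/**
 * Implements hook_ENTITY_TYPE_update() for node entities.
 *
 * Hypothetical my_module.module sketch: when a node is saved, every cache
 * item tagged 'my_module:<nid>' is marked stale and rebuilt on next request.
 */
function my_module_node_update(NodeInterface $node) {
  Cache::invalidateTags(['my_module:' . $node->id()]);
}
```

This keeps the rest of the cache intact: only items that declared a dependency on that exact tag are invalidated.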

Explaining with an Example

Create a file custom_plugin/custom_plugin.services.yml in your custom module.

services:
  custom_plugin.my_cache:
    class: Drupal\Core\Cache\CacheBackendInterface
    tags:
      - { name: cache.bin }
    factory: cache_factory:get
    arguments: [my_cache]

This declares a custom cache bin whose identifier is my_cache.

Create a route for your controller in custom_plugin/custom_plugin.routing.yml:

custom_plugin.cache:
  path: '/my-cache'
  defaults:
    _controller: 'Drupal\custom_plugin\Controller\CacheController::content'
    _title: 'Cache'
  requirements:
    _permission: 'access content'

Then in our controller, custom_plugin/src/Controller/CacheController.php, we store a dynamic cache item under a custom cache ID, tagged so that it is invalidated when the underlying value is updated.

<?php

namespace Drupal\custom_plugin\Controller;

use Drupal\user\Entity\User;
use Drupal\Core\Cache\CacheBackendInterface;
use Drupal\Core\Controller\ControllerBase;
use Symfony\Component\DependencyInjection\ContainerInterface;
use Drupal\Core\Cache\Cache;

/**
 * Class CacheController.
 */
class CacheController extends ControllerBase {

  /**
   * The cache backend service.
   *
   * @var \Drupal\Core\Cache\CacheBackendInterface
   */
  protected $cacheBackend;

  /**
   * Constructs a new CacheController object.
   */
  public function __construct(CacheBackendInterface $cache_backend) {
    $this->cacheBackend = $cache_backend;
  }

  /**
   * {@inheritdoc}
   */
  public static function create(ContainerInterface $container) {
    return new static(
      $container->get('custom_plugin.my_cache')
    );
  }

  /**
   * Build the user dynamic data.
   *
   * @return array
   *   Return the render array of the user dynamic data.
   */
  public function content() {
    $user = User::load(\Drupal::currentUser()->id());

    // Build a custom cache ID (also used below as a custom cache tag).
    $cid = 'custom_plugin:' . $user->id();
    // Check if there is a cache item stored under this cache ID.
    $data_cached = $this->cacheBackend->get($cid);

    if (!$data_cached) {
      // Build the user dynamic data.
      $data = $user->getAccountName() . ' last accessed at ' . date('H:i', $user->getLastAccessedTime());

      // Merge the user entity's cache tag with our custom tag.
      $tags = Cache::mergeTags(['user:' . $user->id()], [$cid]);

      // Store the data in the cache.
      $this->cacheBackend->set($cid, $data, CacheBackendInterface::CACHE_PERMANENT, $tags);
    }
    else {
      $data = $data_cached->data;
      $tags = $data_cached->tags;
    }

    // Return a renderable output.
    $build = [
      '#theme' => 'user_data',
      '#user' => $user->id(),
      '#data' => $data,
      '#cache' => [
        'tags' => $tags,
        'contexts' => ['user'],
      ],
    ];

    return $build;
  }

}

Declare the theme hook and the variables used in the template in the custom_plugin/custom_plugin.module file:

/**
 * Implements hook_theme().
 */
function custom_plugin_theme() {
  return [
    'user_data' => [
      'variables' => [
        'user' => [],
        'data' => [],
      ],
    ],
  ];
}

Create a template, custom_plugin/templates/user-data.html.twig, to render the cached data:

<div class="custom-plugin-block">
  <p>User ID: {{ user }}</p>
  <p>{{ data }}</p>
</div>

Most developers and development teams have a single cache invalidation strategy: clear all caches. That is not a good idea for complex websites and applications with huge amounts of content. A targeted invalidation strategy like this one lets you clear only the required cache items and keep the rest intact. This can boost performance considerably and, it goes without saying, gives you better control of your site's cache.

Jul 28 2019
Jul 28

PHP 7.3.0 was released in December 2018, and brings with it a number of improvements in both performance and the language. As always with Drupal you need to strike a balance between adopting these new improvements early and running into issues that are not yet fixed by the community.

Why upgrade PHP to 7.3 over 7.2?

PHP 7.3 is slightly faster than 7.2 in most benchmarks and adds a number of language niceties, including flexible heredoc/nowdoc syntax, trailing commas in function calls, array_key_first() and array_key_last(), is_countable(), and a JSON_THROW_ON_ERROR option for json_encode() and json_decode(). It is also the newer branch, so it will keep receiving active support and bug fixes after 7.2 winds down.

What hosting providers support PHP 7.3?

All the major players have support, here is how you configure it for each.

Acquia

Around April 2019, the option to choose PHP 7.3 was released. You can opt into this version by changing a value in Acquia Cloud, and this can be done on a per-environment basis.

The PHP version configuration screen for Acquia Cloud 

Pantheon

Pantheon has had support since April 2019 as well (see the announcement post). To change the version, you update your pantheon.yml file (see the docs on this).

# Put overrides to your pantheon.upstream.yml file here.
# For more information, see: https://pantheon.io/docs/pantheon-yml/
api_version: 1
php_version: 7.3
Example pantheon.yml file

On a side note, it is interesting that PHP 5.3 is still offered on Pantheon (it reached end of life nearly 5 years ago).

Platform.sh

I am unsure when Platform.sh released PHP 7.3, but the process to enable it is very similar to Pantheon: you update your .platform.app.yaml file (see the docs on this).

# .platform.app.yaml
type: "php:7.3"
Example .platform.app.yaml file

Dreamhost

PHP 7.3 is also available on Dreamhost, and can be chosen in a dropdown in their UI (see the docs on this).

The 'Manage Domains' section of Dreamhost

Dreamhost also wins some kind of award for allowing the oldest version of PHP that I have seen in a while (PHP 5.2).

When can you upgrade to PHP 7.3?

Drupal 8

As of Drupal 8.6.4 (6th December 2018), PHP 7.3 is fully supported in Drupal core (change record). I have been running PHP 7.3 with Drupal 8 for a while now and have seen no issues, and this includes running some complex installation profiles such as Thunder and Lightning.

Any Drupal 8 site that is reasonably up to date should be fine on PHP 7.3 today.

Drupal 7

PHP 7.3 support is slated for the next release of Drupal 7, Drupal 7.68 (see the drupal.org issue); however, there are a number of related tasks that seem like deal breakers. There are also no daily PHP 7.3 tests running for Drupal 7 yet.

It seems that, for the meantime, it is probably best to hold off on the PHP 7.3 upgrade until Drupal 7.68 is out the door and contributed modules have had a chance to upgrade and cut new stable releases.

A simple search on Drupal.org yields the following modules that look like they may need work (more are certainly possible):

  • composer_manager (issue)
  • scald (issue) [now fixed and released]
  • video (issue)
  • search_api (issue) [now fixed and released]

Most of the issues seem to be related to this deprecation - Deprecate and remove continue targeting switch. If you know of any other modules that have issues, please let me know in the comments.
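That deprecation is easy to reproduce in isolation. A minimal, self-contained sketch of the pattern that now warns, and its fix:

```php
<?php
// Since PHP 7.3, a plain "continue" that targets a switch emits a warning,
// because it behaves like "break". To skip to the next loop iteration from
// inside a switch, use "continue 2".
$kept = [];
foreach ([1, 2, 3, 4] as $value) {
    switch ($value % 2) {
        case 0:
            continue 2; // Skip even values; plain "continue" warns on 7.3.

        default:
            $kept[] = $value;
    }
}
// $kept is [1, 3]
```

Since "continue" inside a switch has always acted like "break", the warning usually flags a genuine bug: the author typically meant "continue 2" (next loop iteration) rather than "break" (leave the switch).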

Drupal 6

For all you die hard Drupal 6 fans out there (I know a few large websites still running this), you are going to be in for a rough ride. There is a PHP 7 branch of the d6lts Github repo, so this is promising, however the last commit was September 2018, so this does not bode well for PHP 7.3 support. I also doubt contributed modules are going to be up to scratch (drupal.org does not even list D6 versions of modules anymore).

To test this theory, I audited the current 6.x-2.x branch of Views:

$ phpcs -p ~/projects/views --standard=PHPCompatibility --runtime-set testVersion 7.3
................................................W.W.WW.W....  60 / 261 (23%)
................................E........................... 120 / 261 (46%)
...................................................EE....... 180 / 261 (69%)
............................................................ 240 / 261 (92%)
.....................                                        261 / 261 (100%)

3 errors in Views alone, and the errors are showstoppers too:

Function split() is deprecated since PHP 5.3 and removed since PHP 7.0; Use preg_split() instead
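The fix is mechanical, since preg_split() is a close replacement; the main difference is that the delimiter becomes a regular expression:

```php
<?php
// split() was deprecated in PHP 5.3 and removed in PHP 7.0:
//   $parts = split(',', 'a,b,c'); // Fatal error on PHP 7+.
// preg_split() does the same job with a regex delimiter:
$parts = preg_split('/,/', 'a,b,c');
// $parts is ['a', 'b', 'c']
```

The mechanical nature of the fix is the frustrating part: someone still has to patch, review, and release each affected module.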

If this is the state of one of the most popular modules for Drupal 7, then I doubt the lesser known modules will be any better.

If you are serious about supporting Drupal 6, it would pay to get in contact with My Drop Wizard, as they are at least providing support for people looking to adopt PHP 7.

Jul 26 2019
Jul 26

Rain logo updated

The Mediacurrent team is excited to formally introduce Rain, an enterprise-grade Drupal install profile. Two years in the making, our team has spent countless hours developing these tools internally and deploying them to our client projects.

Our goal with this post is to share with the broader Drupal community, explain why we created Rain, and share how it can benefit your organization. We welcome feedback.

What is Rain? 

The Rain installation packages the best solutions the Drupal community has to offer so that organizations can build sites faster. We have used Rain internally for the last two years, making improvements along the way prior to its release as an open-source project.

Made by Mediacurrent

We designed the Rain install profile with the goal to capture features that overlap all verticals and projects of every size and scope. As part of the development process, we examined features and content patterns from successful client projects over the past three years. Through our research, we found many content features and module configurations did, in fact, apply to projects across the spectrum. Our focus has been to build Rain with the features that create the best admin and authoring experience we can provide for our clients and the open-source community.

Mediacurrent believes strongly in open-source solutions, so we have released all of our project tooling to the community. For more information on the tools we have publicly released, please visit the Mediacurrent Development Tools page, which includes links to our projects.

Who is Rain for?

It’s a full-time job to simplify processes on the backend and deliver a consistent, high-impact customer experience. Mediacurrent created the Rain Install Profile to make these jobs easier. 

Clients using Rain today include large B2B Enterprise, Higher Education, and Non-profit organizations. The Rain distribution is flexible enough to serve large and small organizations alike. For large enterprise builds, the install profile will reduce overhead and take advantage of reusable features even if the project overall is a highly customized implementation. Smaller, more budget-conscious organizations can reuse even more out-of-the-box functionality for further savings. The end result is a fully branded, enterprise-grade Drupal solution for your organization.

What makes the Rain Install profile different?

At DrupalCon this year, my co-worker Justin Rent and I presented on the idea of “Content Patterns.” This presentation gives a window into the challenges we were facing and our approach to addressing them. In short, we believe that to have long-term success as an agency, we need to pool best practices and take advantage of repeatable patterns. While every project we work on is unique in some way, we also see many shared problems that could benefit from common-sense solutions. As an organization, we made a commitment to build tools that would address those problems and share them with the community.

We borrowed from other distributions, Acquia Lightning and Thunder CMS, which are both great projects in their own right. We did, however, add our own twist.

Our approach

1. Package and configure common modules

We package popular contributed modules like Metatag and many others that focus primarily on authoring and admin experience. Our installation comes with “sensible defaults” that offer a great out-of-the-box experience while saving developer and administrative time.

2. Offer flexible content features to jump-start development

Rain offers a variety of content features, including content types and paragraphs, not found in most distributions. These features are all optional and offer the opportunity to jump-start development. 

We believe our approach to content architecture gives editors a consistent, flexible interface for managing content. At the same time, it helps developers get going faster while being fully configurable. 

3. Ship with an extendable base theme and style guide

Rain ships with an excellent starter base theme. This offers several advantages but is not required (see our GatsbyJS + Rain webinar for a “decoupled” option).

The theme ships with a base style guide that includes the most common components you’ll find on a website. These components provide basic styling and markup that save development time, and they are intended to be branded to match the design requirements of a given project. Additionally, all Paragraph features are pre-integrated with the base theme and style guide to once again save development time and reduce costs. The net result: the development team gets a running start on a new project instead of constantly reinventing the wheel.

Next steps

We would love to hear feedback from you. What are your pain points? What features would you like to see? Interested in working with Mediacurrent? Contact us for a live Rain demo or for more information about our services.

Jul 26 2019
Jul 26

Our lead community developer, Alona Oneill, has been sitting in on the latest Drupal Core Initiative meetings and putting together meeting recaps outlining key talking points from each discussion. This article breaks down highlights from meetings this past July. You'll find that the meetings, while providing updates on completed tasks, are also conversations looking for community member involvement. There are many moving pieces as things ramp up for Drupal 9, so if you see something you think you can provide insights on, we encourage you to get involved.

Out of the box meeting  07/16/19

At this meeting we discussed the following issues:

Drupal 9 readiness

Meetings are for core and contributed project developers as well as people who have integrations and services related to core. Site developers who want to stay in the know to keep up-to-date for the easiest Drupal 9 upgrade of their sites are also welcome.

  • Usually happens every other Monday at 18:00 UTC.
  • Is done over chat.
  • Happens in threads, which you can follow to be notified of new replies even if you don’t comment in the thread. You may also join the meeting later and participate asynchronously!
  • Has a public Drupal 9 Readiness Agenda anyone can add to.
  • Transcript will be exported and posted to the agenda issue.

Meeting Highlights from 07/08/19

Multiple branch compatibility, semantic versioning, and composer.json proposal:

Schema version and database updates hooks in Drupal 9.

Split off SchemaInstaller from schema.inc; the question is how long the 8xxx update hooks should be kept. They will need to be updated for D9, but should we have people update to 8.last first and run all of those updates before allowing them to update to D9 and start the 9xxx series of hooks?

We discussed that, at least for now, we will leave the minimum at 8000 for the sake of running updates, or have some special handling for when one is running the D9 codebase with an 8xxx schema to force an update before one's site works, or something along those lines.

Deprecation cleanup status -  blockers to D9 branch opening

Examples module for D9 is looking for maintainer(s)

Module renames, action and menu blockers

Drupal core cross-initiative meeting 07/24/19 

Media

Status of Media

  • Focus is on WYSIWYG, working on three things.
  • Representing media in WYSIWYG is going quite well: well tested, with some CSS issues outstanding, following the discussion between Cristina and Lauri.

CMI 2

Status of CMI 2

  • Big API patch landed, not blocker anymore.
  • The ball is in their court, as the module is not yet ready to review.

Workspaces

Status of Workspaces

  • On track for stable workspaces. Currently working on:
    • fixing the bugs,
    • sub-workspaces, and
    • merging (not written yet because it needs the hierarchy).

Blocked Items In Workspaces

  • Reviews for RTBC and Needs review issues in workspaces
    • Remove workspace association entity type (tracking entity revisions vs. workspace) and replace with custom storage
    • Revisionable entity types cannot be uninstalled if their code does not exist anymore. Issues blocked are:

Claro

Status of Claro

  • General components need to be unblocked for beta / Drupal 8.8.
    • The Media library is the most important.
      • Currently doing research there.
    • Status report page UI.
    • Some other small components.
  • There is a new designer joining full time.

Blocked Items in Claro

Auto Updates

Status of Auto Updates

  • Lucas can now move forward as he was unblocked by DA last week.
  • The PSA feed is the first phase; once it is there, we will display a Drupal message on every page.
    • This will go to core once/if possible.
    • There will be a contrib release first to figure out bugs.
  • Checking for update eligibility would stay in contrib; there are various readiness checkers that are tagged services. (Is the site read-only? Are there pending database updates? Are there local modifications?)
    • All readiness checks are committed for both Drupal 8 and 7.
  • The next step will be an in-place update with a quasi-patch zip file; there is an issue with passing tests. (It has a manifest to know which files were deleted; the other files are drop-in replacements.)
    • FTP will make this available.
    • It will not work for dev branches or hacked sites.
    • Using Composer will not be supported in the first version; only tarballs (tgz) will be.

Migrate

Status of Migrate

  • Ideally there would be an audit step; Mike proposed a framework for this, which is very high priority.

Blocked Items in Migrate

Drupal 9

Status of Drupal 9

  • Deprecation fixes going into core are going very well; we are down to 29 individual deprecations, all of which have issues. Once the Drupal::l() fix lands, we will be down to double digits of instances.
  • Symfony 5 and 4.4 are in heavy development; we don't know what to adapt to right now, so we are in a holding pattern waiting to see what happens there.
  • Symfony 4.3 has some fails.

Composer

Status of Composer

  • Moving along pretty well.
  • The Wikimedia merge plugin (the original solution for the core directory and product scaffold files) is to be removed, so we can move closer to the unofficial Composer setup on its way to becoming official.
    • We don't know if anyone is relying on it.
  • Core vendor cleanup plugin should be reusable.
  • Scaffolding introduced a random fail, looking at that.
  • Templates are the last part; once they are in, we can change drupal.org packaging so that the tarballs are eventually created by Composer.
  • Spreading out to solving "all things wrong with Composer."
    • E.g. a subtree split so you only get the core modules you need, not all of them.
    • Lightning should be able to be built on drupal.org.
Jul 26 2019
Jul 26

We’re excited to be hosting the August 2019 TC Drupal monthly meeting, the first in the new "Lunch and Learn" format, which will rotate venues and feature a myriad of speakers. See Allie's post about the change on the Twin Cities Drupal.org page. We look forward to these meetings each month!

August Talk

TEN7’s Project Lead Les Lim will be giving a talk titled "Latest Paradigms in Content Editing: Paragraphs, Layout Builder, and Gutenberg" where he'll review the different available paradigms and where they are useful. 

Lunch and Learn Details

There's no registration required, just show up! Space is limited to 50 people though, so show up early!

When: Friday August 16, 12-1 p.m.
Where: Bde Maka Ska Room, Walker Library, 2880 Hennepin Ave, Minneapolis
Lunch: Pizza! (provided by TEN7)
Parking: The Walker Library has a paid parking garage (for a minimal fee). We recommend using it as Walker Library is in the heart of Uptown where street parking is seriously challenging.

More Info

Jul 26 2019
Jul 26

We’ve recently wrapped up another great Drupal event at Drupal Camp Asheville. The sponsorship our team provided, in conjunction with the support of many others, helped the camp achieve many goals that we are proud to have been a part of.

The experiences our team had lived up to the expectations set by all Drupal events.

From Hook 42, three people were able to participate this year; each member of our team had different expectations going in and learned many valuable things. Here’s what they have to say.

Danita’s Experience

Although it was only my second year of attending Drupal Camp Asheville, it already feels like “home.” One of the highlights of any Drupal camp or con or meetup is getting to see old friends and meeting new ones, and this camp doesn’t disappoint. An added bonus was getting to hang out with two Hook 42 colleagues (including Jonathan, one of the camp organizers), something that doesn’t happen often enough with a remote team.

On Friday, I joined a fairly packed room for a full-day training on CSS Grid and Flexbox with Wes Ruvalcaba, Senior Front-end Developer at Lullabot. Along with how to use Grid and Flexbox, Wes shared some tips and tricks he uses when theming Drupal websites. Even though I’ve used both Grid and Flexbox before, I came away from that all-day training with a better understanding of when and how they are “supposed” to be used - and to quit using “floats” for layout!

Sessions I attended on Saturday included one on contributing back to Drupal presented by AmyJune Hineline, Kanopi’s Community Ambassador. It was a good reminder that anyone (not just super-experienced coders) can do something to contribute and that our contributions make the community stronger.

In addition to the trainings and sessions and “hallway track” conversations, the social events and just the location of the camp in Asheville make this a not-to-miss camp. I know I can’t wait until next year. 

Jonathan’s Experience

Another great year at Drupal Camp Asheville. On Thursday we all gathered to play games and hang out at the Wedge. On Friday I gave my full-day “Essential Drupal 8 Developer Training”, and it went really well. I had a full class with lots of great questions and comments, and like last year, my voice started to give out. I didn’t record the training this year, but it was basically the same as 2018. Videos, slides, and code examples for the training are all available online.

Saturday we had an amazing group of presenters show off some of their best work and discuss topics they are passionate about. As organizers, we put a lot of effort into ensuring that there were great sessions for almost any Drupal or Drupal-adjacent topic. We had everything from improving client interactions to an overview of JSON:API, from understanding Drupal 8’s caching mechanisms to a discussion of emotional labor in communities, and from working effectively as a remote team to data analysis and visualization.

All sessions were recorded and uploaded to YouTube; see the full playlist: Drupal Camp Asheville 2019 sessions.

As an attendee, it was really great to see the old guard and meet some of the new. I had a great chat with a new Drupal developer who is implementing a chat bot for his university, enjoyed tacos and beverages with the creators of Backdrop CMS, and talked about video games with friends as we hiked up to Catawba Falls. It’d be difficult to list each great interaction I had because there were good people and fun conversations happening practically the whole time. 

Biased opinion: I can’t recommend Drupal Camp Asheville enough. Thanks to all the hard work by our Director April Sides, the camp is well managed and a wonderful, casual, and considerate event for everyone. Hope to see you there next year! 

Group photo of Drupal Camp Asheville attendees waving

Jul 26 2019
Jul 26

The expanding data landscape is feeding the demand for higher operational agility. This calls for a more responsive, reliable IT infrastructure that doesn’t rack up millions, minimizes delays and downtime, improves security, and makes infrastructure more agile.

Where capacity constraints and unpredictable pricing models hold businesses back, AWS offers workloads suited to growing infrastructure needs, with a host of services (IaaS, PaaS, SaaS) for Drupal enterprises. 

Here’s how you can run your Drupal up to 40% cheaper.

Keeping the Business Innovative and Agile

Demands for performance, scalability and agility have never been higher, and business success and growth depend on them. At the same time, the changing landscape is forcing businesses to opt for lower costs, greater use of cloud resources and better customer service.

While these changes have implications on infrastructure, compute and networking resources, they also impact storage. 

A lack of sufficient database storage, for example, can adversely impact application performance. Fast-growing applications may need more storage than expected, or may need storage resources immediately.

Capacity and storage issues can hinder business agility.

The continuous need for speed and efficiency is driving businesses to opt for the storage as a service (STaaS) model. But there is more to it when it comes to the benefits. Businesses get:

  • Better insights at reasonable cost: STaaS provides a highly scalable, low-cost environment capable of handling massive data volume and velocity, letting organizations shift between the two available models (CapEx and OpEx) for more predictable costs.
  • Better collaboration: Cloud-based business solutions accelerate innovation, delivering business analytics at the point of impact and enabling collaboration by creating and linking business networks.
  • Innovation and variety of solutions: Forward-thinking enterprises adopt STaaS to speed up business innovation, improve overall data-centre efficiency, and achieve integrated and innovative business results.
  • Proven results: Organizations achieve their desired business outcomes by improving the responsiveness of their IT infrastructure without increasing risk or cost.

Capacity and storage issues can hinder your business agility. 

To avoid such challenges in the future, Drupal-powered enterprises need to constantly understand and adapt to the changing landscape.

The Azim Premji Foundation, Georgia Technical Authority, and the Department of Homeland Security (USA) are powered by Drupal and supported by AWS.

While Drupal helps balance the rapid data growth, the right cloud storage solution needs to offer security and robust scalability without constraining the budget and prepare IT and marketing for what comes next.

Run your Drupal 40% cheaper

Choosing the right technology is crucial to avoiding equipment failures and the costs of upgrading hardware. Small to medium enterprises and non-profits especially need sustainable solutions for future needs, so they can run their operations without overcommitting budgets today. 

Finding the perfect match, organizations such as the Azim Premji Foundation, Georgia Technical Authority, UCAS, and the Department of Homeland Security (USA) are powered by Drupal and supported by AWS.

Enterprises need sustainable solutions without overcommitting budgets today.

AWS offers cloud web hosting solutions that provide businesses, non-profits, and governmental organizations with low-cost ways to deliver their websites and web applications.

The pay-as-you-go approach lets you pay only for the individual services you need, for as long as you use them, without long-term contracts or complex licensing, similar to how you pay for utilities like water and electricity. You only pay for the services you consume, and once you stop using them, there are no additional costs or termination fees.

The pricing models give your enterprise the flexibility to grow your business unencumbered by IT:

  • Pay-as-you-go

With AWS you only pay for what you use, helping your organization remain agile, responsive and always able to meet scale demands. You can easily adapt to changing business needs without overcommitting budgets, improving your responsiveness to change and reducing the risk of overprovisioning or missing capacity.

By paying for services on an as-needed basis, you can redirect your focus to innovation and invention, reducing procurement complexity and enabling your business to be fully elastic.

  • Save when you reserve

By using reserved capacity, organizations can minimize risks, more predictably manage budgets, and comply with policies that require longer-term commitments.

For certain services like Amazon EC2 and Amazon RDS, enterprises can invest in reserved capacity. With Reserved Instances, you can save up to 75% over equivalent on-demand capacity.


When you buy Reserved Instances, the larger the upfront payment, the greater the discount.

  • Pay less by using more

Providing volume-based discounts, AWS lets organizations save more as usage increases. For services such as S3 and data transfer OUT from EC2, pricing is tiered, meaning the more you use, the less you pay per GB.

In addition, data transfer IN is always free of charge.

As a result, as your AWS usage needs increase, you benefit from the economies of scale that allow you to increase adoption and keep costs under control.

As your organization evolves, AWS also gives you options to acquire services that help you address your business needs. For example, AWS’s storage services portfolio offers options to lower pricing based on how frequently you access data and the performance you need when retrieving it.


To optimize savings, choosing the right combination of storage solutions can help reduce costs while preserving performance, security and durability.


Case Study: Reducing cost & improving operational efficiency for Drupal application with AWS

Our client, a legal firm, helps provide jurisdictions and litigants simple, seamless, and secure access to the record of legal proceedings. They built a SaaS-based workflow management application on Drupal to manage and track digital recordings of legal proceedings and transcripts, including appeals, for stakeholders.

The goal was to build a robust, cloud-based server to effectively handle the processing and access to a large volume of text, audio and video files.

Since the business model depended on frictionless uploading and downloading of text and media files, an AWS cloud-based server was the unanimous choice. 

Business benefits

  • Simplified integration of the client's Drupal application with AWS S3, to enable flexible, cloud-native storage
  • As a result of going all-in into the AWS Cloud, the client reduced costs by 40% and increased operational performance by 30-40%
  • Dynamic storage and pay-as-you-go pricing enabled the client to leverage a highly cost-effective cloud-storage solution

Read the complete case study: Cloud-Native Storage for Drupal Application with AWS.

Get no-cost expert guidance

Designed to help you solve common problems and build faster, Amazon Web Services provides a comprehensive suite of solutions to secure and run your sophisticated and scalable applications.

Srijan is an AWS Advanced Consulting Partner. Schedule a meeting with our experts, at no cost and with no sales pitch, and get started on your cloud journey.

Jul 26 2019
Jul 26

You know how important accessibility is, but now what? There are a lot of well-intentioned sites on the internet that aren’t accessible.

Is your website accessible?

How do you find out?

Well, it’s not as hard as it seems—and we’re here to help! Here are a few quick ways to measure the accessibility of your website.

1. Automated accessibility tests

While automated tools will only catch about 30% of accessibility bugs, they will give you a general idea of your site’s accessibility and show you some ways to make improvements.

Lighthouse: Chrome’s Accessibility Reporting Tool

Lighthouse is a free tool available right in Chrome. You can use it via Chrome’s testing website, in your development tools when you inspect a page, or with a browser plugin. Keep in mind that manual testing is also required to get a full picture of accessibility; we’ll cover that in just a moment.

To use the tool by going to a URL: Visit https://web.dev/measure and paste the URL of the page you want tested into the form field, then click “Run Audit” to see results.

To use the tool through inspect

  1. Right-click on the webpage you want to test and select “Inspect” from the dropdown, or press Command+Option+I on your keyboard. This will open the inspect tool and bring up the last tool you used, so if the last thing you did was run an audit, it will bring you back to the Audits panel.

    Dropdown menu: Inspect

  2. In the inspection window at the top right, click on the button with a double arrow, or expand the window until you see “Audits.” Select “Audits.” Dropdown menu: Audits selected
  3. Select your device size (mobile or desktop), and select “Accessibility” from the Audit Type options.
  4. Click “Run Audits.”
  5. A report will pop up in the inspect window with your overall score and information about the results. Scores are out of 100, and 100 does not mean that a site is completely accessible; it means that it passed all automated tests. Lighthouse score display
  6. Below the score are details about accessibility errors. Toggle open these errors to see what element is failing and how to make fixes. Lighthouse error details

WAVE: Firefox and Chrome Extension

WAVE is a browser extension that allows you to run an automated accessibility test on a page of your website. It’s very thorough and one of our favorites for testing and fixing accessibility bugs.

To use WAVE:

  1. Install the WAVE Extension
  2. Go to the webpage you want to test, and click on the WAVE icon in the tools portion of your browser window. A report will pop up and your page will be marked up with the results of the review.

    Website in browser, with WAVE icon in toolbar

  3. A Summary will show up by default listing the number of Errors, Warnings, and other details on the page.

    WAVE tool summary screen

  4. Click on the Flag icon to see more details. This will include information about which errors are on the page.

    Wave tool details tab

  5. Clicking on the Tab at the bottom of the page that says “< code >” will show you the code marked up with the errors found.
  6. With the “< code >” tab open, you can click on the errors and warnings in the panel on the left to jump to the errors in the code. In the image below, clicking on the yellow rectangle “Redundant Link” icon in the report panel makes the code jump to the offending code.

    WAVE tool with code drawer open below website

2. Manual accessibility tests

A manual test will catch things automated tests can’t quite figure out. Robots are good for some things, but they can’t figure out human intention, so things like tab order, visual theming and good alt tags should be manually tested.

A toy robot from the 1950's

Keyboard testing makes sure that the site works for folks who are blind, who have low vision, who have limited mobility, or the person whose trackpad is broken. Conduct the following tests to see if your site is accessible to those using a keyboard to navigate:

  1. Go to the page you’d like to test. Start with your cursor in the address bar, and hit the “tab” button to navigate through the page. Each time you press tab, you should be moved to the next button, link or form input.
  2. Ideally, the first link you get to on the page is a “skip to main content” link that allows users to skip repeated navigation items.

    Webpage showing 'Skip to Main Content' link

  3. As you continue to tab through the page, you should be able to see where the focus is as it lands on each button, link and form field. Pro tip: if you lose track of where the focus is because there’s no visual indication, that’s an accessibility issue.
  4. Check the order: Does pressing the tab key follow the natural flow of the page, or does it jump around? A good tab order follows the natural flow.
  5. Can you operate all menus, pop-ups, buttons, and forms?
  6. Can you press shift tab and navigate backwards?
  7. Are there items that are clickable that don’t receive focus?

Important Note: Keyboard testing needs to be done on mobile as well as desktop. Why? Some users who are blind don’t use full-sized computers or laptops because they don’t actually need a large display. Other users have low vision and magnify their screens. Which leads us to testing with zoom…

3. Testing with zoom

If you zoom a desktop screen to 400% on a responsive site you get…the mobile site! This is why testing on mobile and desktop is important.

Now that you’ve increased the screen to 400%, browse the page. As you browse ask yourself:

  1. Does text content get cut off?
  2. Do buttons get pushed off of the page?
  3. Is the functionality intact?
  4. Is there key functionality on desktop that’s no longer available on the mobile version?

4. Testing with a Screen Reader

Using a screen reader is a more advanced testing approach, and very helpful in identifying accessibility bugs on a site. If you use a Mac, VoiceOver is the built-in screen reader. To turn VoiceOver on or off, press Command+F5. Here’s a quick video tutorial on how to test your page using VoiceOver. The video description includes the full text of the captions as a quick reference.

[embedded content]

You can also turn on VoiceOver and tab through the page again to see if icon buttons are labeled properly, if the form labels you’ve applied make sense, and if alt tags on images are useful. If you press Control+Option+A, VoiceOver will start reading every element from where you are on the page. If you tab, it will read the buttons, links and form inputs.

To sum it up:

Learning about different testing methods can help inform and add clarity to the process of making your site accessible. This is one of the most critical steps in your journey to making a website that everyone can experience. If you want to know how to transform these errors into a site that reads and navigates smoothly for all users, ThinkShout is here to help! Contact us to learn more about how we can partner to make your website more accessible.

Get In Touch

Questions? Comments? We want to know! Drop us a line and let’s start talking.

Learn More Get In Touch
Jul 26 2019
Jul 26

Is it possible to enhance your Drupal experience?

Drupal is a favorite content management system among professionals. It has proven time and time again that it is reliable and scalable and can turn any website into a magical digital experience that your customers will love. For these reasons, Drupal has gathered a passionate community that constantly wants to see it improve. Here at Sooperthemes, we are also driven by our passion for Drupal: we take Drupal and address its shortcomings through our products. In other words, we enhance your Drupal experience with our framework theme and easy-to-use Drupal 8 & 7 visual content builder.

What are some examples of real-life organizations using Sooperthemes products?

It’s time to show you the results of using our easy-to-use drag-and-drop builder and our framework theme. Here is a list of websites built entirely on Drupal using Glazed Builder and Glazed Theme:

1. The U.S. Senate

Senate enhance your Drupal experience

The U.S. Senate is a core part of the legislative process of the United States. Such an important part of the U.S. needed a really good website platform, and Drupal was chosen because it can handle large and complex websites. On top of that, the Senate chose to build all websites for newly inaugurated senators in 2019 with our Glazed Builder and Glazed Theme products. This resulted in modern-looking governmental senator websites that provide a great experience at low cost to the Senate, because much of the page-building work can be done in-house thanks to our easy-to-use page builder. 

2. Swarco

Swarco enhance your Drupal experience

Swarco is a company that offers traffic technology for better and safer transportation. It is based in Innsbruck, Austria, and has an international network of production facilities that are sure to meet the needs of its clients. Swarco decided to improve its online presence by overhauling its website with Glazed Builder. This resulted in an unforgettable digital experience that leaves a long-lasting impression. Well done!

3. Body Worlds

Body Worlds enhance your Drupal experience

Body Worlds is the biggest traveling exhibition of dissected human bodies. The exhibition has attracted more than 37 million visitors, which makes it one of the hottest tourist attractions to date. Such a successful exhibition had to have an online presence that reflected its success. That's why Body Worlds built its website with Glazed Builder. This resulted in a gorgeous website that attracts clients from all over the world. 

4. Monterrey Institute of Technology and Higher Education

Monterrey enhance your Drupal experience

The Monterrey Institute of Technology and Higher Education is one of the most prestigious universities in Latin America. With its headquarters in Monterrey, Mexico, Tec offers the finest education to its students. Such a successful university required a beautiful website that could convince prospective students to join its ranks. That's why Tec decided to go for the combination of Drupal and Glazed Builder. This resulted in a beautiful website that can handle the multilingual needs of the university while also attracting a large number of students.

5. Open Medical

Open medical enhance your Drupal experience

Open Medical is a company that wants to improve the delivery of healthcare services to the general public. In order to do this, they partner with various companies that help them reach their goal. Such an initiative needed a good website that could showcase their mission and values. That's where Glazed Builder came into play. The result was a practical website that showcases the trustworthiness of Open Medical to potential customers, leading to an increased number of clients and leads generated. 

What are Sooperthemes' products?

Sooperthemes bases its products on the Drupal architecture. This means that you get the best Drupal has to offer without any of its drawbacks, making it possible to enhance your Drupal experience. The Sooperthemes portfolio includes a large number of turn-key demo websites that can be used to quickly set up a gorgeous website that converts leads to customers right out of the box. There is a wide selection of demos to choose from based on the industry your company conducts business in.

Here are a couple of examples of our demos, built entirely with our drag-and-drop Drupal content editor and our framework Drupal theme.

Marketing Drupal Theme Demo:

marketing enhance your Drupal experience

This theme is perfect for any marketing agency that wants a gorgeous website that looks professional and attracts high-caliber clients. The theme is highly customizable and can be adapted to the needs of every marketing agency.

Business Drupal Theme Demo:

business enhance your Drupal experience

Sooperthemes also provides a business website theme, perfect for people who want a new and astonishing website for their clients. The business theme focuses on a more professional look that conveys trust to your prospects. It is the perfect choice for any business owner who wants to provide a great online experience for their customers.

Agency Drupal Theme Demo:

agency enhance your Drupal experience

Our agency theme is the perfect choice for any agency that wants to create or improve its digital presence. It is designed to fit the needs of any agency that wants to impress its audience, with an intuitive design that can surely make a great website for your agency, especially if you want to enhance your Drupal experience.

Logistics Drupal Theme Demo:

logistics enhance your Drupal experience

Sooperthemes has the perfect website theme for any logistics company that wants an impressive online presence. The layout and design are specially adapted to convey the fluidity and speed with which logistics companies drive business. Moreover, these themes can be further customized to reflect your brand. 

Photography Drupal Theme Demo:

photography enhance your Drupal experience

Are you passionate about photography and don't know how to monetize your hobby? The Glazed Photography theme is the right answer for you. You can easily set up your website to show your clients your finest material. Glazed Photography is the right answer if you want an edge over your competition.

Construction Drupal Theme Demo:

construction enhance your drupal experience

Any construction company has to have a jaw-dropping online presence in order to be successful. This is what you get by building your website with the Glazed construction theme. It is perfectly adapted to reflect the seriousness and commitment of the construction industry. Whether you want to showcase your team or your portfolio, this theme is the perfect choice to make a lasting impression on any potential client.

Powerful content capabilities with Sooperthemes' easy-to-use visual content builder

These themed demo sites are further customizable to suit your needs with our Glazed Builder module, which makes it easy to turn your dream website into reality. Glazed Builder is a powerful Drupal-based drag-and-drop visual builder that can make any Drupal website shine. One of the struggles Drupal users seem to have at first is the steep learning curve, which can require a large number of hours and essentially bottleneck the workflow. To bypass this struggle, Sooperthemes designed Glazed Builder, effectively helping website designers and marketers save countless hours and money when working with Drupal. The hours saved can be used for other important tasks. One of the great points about Glazed Builder is that it makes designing a Drupal website seem effortless.

Why enhance your Drupal experience with Sooperthemes?

This is a great question that everybody should ask themselves before making a purchase decision. Well, let me explain.

Sooperthemes is driven by its passion for Drupal, and our main goal is to enhance your Drupal experience. To do so, we address the most common pain points that Drupal has, such as long development time, a steep learning curve and a difficult user interface. Sooperthemes has developed its products to accommodate these needs. With the Glazed theme, users can quickly have a template for their Drupal website that can be easily customized and deployed. On top of that, Glazed Builder overcomes Drupal's powerful but complex native user interface with its drag-and-drop capabilities and intuitive user interface. Glazed Builder also incorporates a large number of elements that can be used to further customize your website; examples are sections, panels, jumbotrons, wells, collapsibles, Drupal blocks, Drupal views and much more. 

As you can see, imagination is the only limiting factor when it comes to web design with Glazed Builder and its capabilities to enhance your Drupal experience.

Conclusion

If you want to enhance your Drupal experience, then Sooperthemes is the right answer for you. Not only does it offer the best of what Drupal has to offer, but it also turns Drupal's weak points into strong points. If you’re not convinced yet, no problem: try Sooperthemes for free here.

Jul 26 2019
Jul 26

Sometimes we just want to see if a thing works

Recently I ran into a situation while building out the Watson/Silverpop Webform Parser where I just wanted to test whether a few things worked without having to reload and bootstrap Drupal every time I refreshed a page. I also wanted to utilize some of the classes and methods made available by Symfony and Drupal. Can you feel my dilemma?

Let's be honest, running drush cr and refreshing a page takes time, and I'm an impatient person, so I wanted to see if there was a way to use some of these things without going through the pain of waiting for a Drupal bootstrap. Turns out, the solution wasn't that difficult and it made development on many methods of my module more pleasant than it could have been.

Here's the scenario

I wanted to test a few things that didn't require the database connection: specifically, using Drupal\Core\Serialization\Yaml to serialize an array I was building into YAML. So what I did was stub out what would become my class WatsonFormEntityForm in a file in my docroot that I creatively named test.php. Now I was able to navigate to local.docksal/test.php on my local machine and see things working.

Get to the good stuff, already!

I got to a point in my development where I was able to convert the submitted HTML, in this case from a call to file_get_contents('./test.html'); into an array that I could work with. Xdebug was going great, and so was the module, but I wanted to see if I could convert it into YAML using a native Drupal method. The solution came with one single line of code.

$autoloader = require_once 'autoload.php';

This tells PHP, "Hey, we got a file here that wants to use some of the methods and classes in the Autoloader. Let's go ahead and let it!" This variable doesn't need to be called anywhere in the file. It just needs to exist. Now I was able to update the file with a few friendly use statements from within the Drupal and Symfony ecosystem without having to wait for all the database connections to happen.

The end result was:

<?php

use Drupal\Core\Serialization\Yaml;

$autoloader = require_once 'autoload.php';

// Do all the things here, including:

$yaml = Yaml::encode($array);

var_dump($yaml);

It sped up development, and it made it so I didn't have to wonder if something wasn't working because I forgot to drush cr all the things or if it was just because I made some mistakes.

Gotchas

Be sure that any code you're running doesn't rely on database calls or the container. For instance, trying to run $nodes = \Drupal::entityTypeManager()->getStorage('node')->loadMultiple(); is going to throw a painful error.

Also, this is mainly for rapid prototyping or proving concepts. I don't recommend writing an entire module in procedural code and then trying to refactor later. Maybe take it one function at a time just to make sure it's doing what you want it to do.

Let me know if this helped you out or if you have better suggestions for rapidly testing some Drupal stuff without having to rely on a full bootstrap. As always, feel free to reach out to me on the Drupal Slack, where I'm always Dorf and always willing to help if I can.

Jul 25 2019
Jul 25

Cognitive overload requires more time and effort to complete a task. Learn more about how to reduce mental effort for users and how you can expand your knowledge of Cognitive User Experience Design (Cognitive UXD).

Based on my overview blog post Cognitive User Experience Design - Where Psychology meets UX Design, I'll give you a deeper insight into cognition and describe its role in design based on the Cognitive Load Theory. It's about how we consume information, how we think, how we learn or solve problems, and the strategies we use to make decisions.

In cognitive psychology, cognitive load is the total amount of mental effort being used in working memory. Although our overall brain capacity is huge, the capacity of working memory is limited. This approach is known as the Cognitive Load Theory (Sweller & Chandler).
Accordingly, a high cognitive load demands more mental effort from the brain. If several highly demanding processes are going on at the same time, this becomes even worse: more time and effort is needed to complete a task. This leads us to the first question.

What happens if the cognitive load is too high?

As already mentioned, our brain has only a limited amount of mental power. If the cognitive load is too high, users no longer read the content of a website; they only scan it.
Compared to adults, this capacity is much lower in children. For example, the normal attention span is about 20 minutes for 8-year-olds and about 30 minutes for 13-year-olds (Wöstmann et al., 2015). This should be taken into account when conducting an interview or a test with children.

How can a UX Designer support the user?

User support can be achieved by reviewing and optimizing each step. Here are some starting points that can help:

  • Show the user an overview of the entire setup and which step they are in at any time
  • Make clear that the entire process, and each individual step, is beneficial to the user and worth their time
  • Give clear instructions on what to do next
  • Check whether certain information/steps are really necessary and delete everything that is not important
  • Provide the information in a simple and understandable way

In order to make these approaches more concrete and thus more tangible and to reduce the cognitive overload of the users, a UX designer can apply different strategies. I will give a short overview of the most important ones and add a short example or description:

KISS (Keep It Stupid Simple)

  • Avoid unnecessary elements, less is more
  • Reduce the number of complicated graphics

Use different techniques

  • Provide information in different ways; these can be verbal, visual and/ or auditory techniques

Provide “bite sized” information

  • Break the content into smaller pieces

Remove unnecessary content

  • Reduce repetitions by checking whether a text is really required or whether a picture fulfills the same task

Reduce choices

  • Too many choices increase the cognitive load, especially for forms, dropdowns and navigation

Place words close to the corresponding graphics

  • Long distances force the user to scan the screen

Build on existing mental models

  • Use labels and layouts that users already know from other websites

Taking these recommendations into account when creating designs reduces the amount of brain capacity required. This has a direct impact on how easily the user finds content or performs tasks.
With this in mind: Happy designing!

Jul 25 2019
Jul 25

The ideas in this post were originally presented by Suzanne Dergacheva at Drupal North 2018.
 

If you've opted for Drupal, then you must be dealing with a large amount of content, right? Now, the question that arises is: how do you build out and structure a complex content architecture in Drupal 8?

After all, you definitely don't run out of options when it comes to organizing your content:
 

  • content types
  • paragraph types
  • (custom) block types
  • custom fields
     

And things get even more complex when you start to consider all the various relationships between these entities. 

Now, let me help you structure this huge “pile” of different options, approaches and best practices for setting up an effectively organized content structure.
 

What Makes Drupal Ideal for Creating a Flexible Content Architecture?

One of Drupal's key selling points is that it ships with tools and workflows designed specifically to support a flexible content architecture.

And I'm talking here about its:
 

  • WYSIWYG editor
  • all the tools that streamline the content creation and publishing process
  • access control system based on user roles and permission levels
  • ecosystem of Drupal 8 content types (blocks, nodes, paragraphs, terms)
     

All these tools combined empower you to:
 

  • create any type of content (survey, landing page, blog entry...) nice and easy
  • control where and how that piece of content should be displayed on your website
  • categorize and structure your large amount of content using different content entity types
     

In short: Drupal 8 is built, from the ground up, to support a well-structured, yet highly flexible content architecture.
 

Step 1: Plan Out Your Content Architecture in Drupal 8: Identify the Needed Content Types 

A well-structured content architecture is, above all, a carefully planned out one. 

Start by analyzing your content wireframe to identify your content needs and to... fill in the blanks:
 

  • decide what content you need on your website, how/where it should be displayed and to whom it should be accessible
  • identify the various content entity types for each piece of content
  • set out all the fields that each content entity type requires
  • define your taxonomy term entities
     

It's also the step where you gradually start to populate each category outlined in your content wireframe with the corresponding types of content.
 

Step 2: Set Out Your Well-Defined Content Types

I'm talking about those traditional, crystal-clear content types like articles or job postings, where the structure is pre-defined and it leaves no room for interpretation.

Those fixed content types that guarantee consistency across your website, that are easy to search and to migrate.

This is the step where you define each one of these content types' elements —  paragraphs, data, long text, images, etc. — and their order. 
 

Step 3: Set Out the Relationships Between Various Types of Content

Since you're dealing with a complex content structure, an intricate network of references between different pieces of content will be inevitable.

Now, it's time to set out all those explicit relationships: between your node references and their referenced nodes, between term references and terms...

Note: needless to add, the implicit relationships will form by themselves; you have no control over those.
 

Step 4: Define the Multi-Purpose Content and the Reusable Pieces of Content

While at this phase, where you identify the content types that you'll need, remember to add the multi-purpose and the reusable content types, as well.

Speaking of multi-purpose content, it's the type of content (e.g. the landing page) for which you don't yet know what content it should include, or what order its content elements should be displayed in.

Therefore, you need to keep it flexible enough for future additions and modifications. In this respect, the Paragraphs module is the flexible page builder that you can rely on. The “secret” is to build your paragraph types — call to action, webform, view — along with the fields that they incorporate and to... leave them flexible for future updates.

Now, as for the reusable type of content, the best example is, again, that of a landing page with multiple reusable blocks that you can move around to your liking.

What you can do at this stage is to define your block types: image, view, call to action.
   

Step 5: Create Your Custom Entities and Custom Fields

While structuring a complex content architecture in Drupal 8 you'll inevitably need to create some custom entities and fields, as well.

With a large pile of content to deal with, there will be cases when the options that Drupal provides aren't suitable. For instance, you might need to define some special rules for a specific piece of content.

In this case, creating a custom entity is a must, but make sure you've carefully thought through all its potential use cases and specific workflows. That you've invested enough time in prototyping it.

Also, you might find yourself in a situation where one of the fields needs to be stored or validated in a particular way. For instance, you might need to create a multi-value field. Since these scenarios call for a custom field, again, take your time to prototype it thoroughly.
 

The END!

These are the main steps to properly structure your complex content architecture in Drupal 8. The golden rule should be: always leave some room for flexibility.

Photo by Alain Pham on Unsplash 

Jul 25 2019
Jul 25

Submitted by karthikkumardk on Thursday, 25 July 2019 - 15:29:44 IST

file_scan_directory is deprecated and has been moved to the file_system service.

Before:

$files = file_scan_directory($directory, '/\.txt$/');

After:

if (is_dir($directory)) {
  $files = \Drupal::service('file_system')->scanDirectory($directory, '/\.txt$/');
}

When possible, you should inject the FileSystemInterface into your constructor.
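A minimal sketch of what that injection could look like (the module, class and method names below are hypothetical; only FileSystemInterface, the file_system service ID and scanDirectory() come from Drupal core):

```php
<?php

namespace Drupal\my_module;

use Drupal\Core\File\FileSystemInterface;

/**
 * Hypothetical service that scans a directory via the injected file system.
 *
 * Registered in my_module.services.yml with: arguments: ['@file_system']
 */
class MyFileScanner {

  /**
   * The file system service.
   *
   * @var \Drupal\Core\File\FileSystemInterface
   */
  protected $fileSystem;

  public function __construct(FileSystemInterface $file_system) {
    $this->fileSystem = $file_system;
  }

  /**
   * Returns all .txt files found in the given directory.
   */
  public function scanTextFiles($directory) {
    if (!is_dir($directory)) {
      return [];
    }
    return $this->fileSystem->scanDirectory($directory, '/\.txt$/');
  }

}
```

Injecting the service keeps the class testable and avoids the static \Drupal::service() call.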

Original source - https://www.drupal.org/node/3038437

Jul 24 2019
Jul 24

That's how we supported a team at PostLogistics in the transformation from hierarchy to Holacracy® - an interview with Lucien Ecoffey.

At the end of the kick-off meeting in January 2019, Lucien Ecoffey – the then Head of French- and Italian-speaking Switzerland’s PostLogistics Sales Support body – signed the Holacracy® Constitution. He gave up his hierarchical authority and transferred it to the process that would guide his entire team.

Six months after this transformation began, the former manager and now Lead Link of the PostLogistics Sales Support circle takes a look back at the successful transition.

How did this transition project to Holacracy® happen?

Lucien Ecoffey: We had an internal reorganisation in 2017. At this point, we established a service unit providing back-office services to various divisions of Swiss Post. This change affected how we worked. We needed a more flexible structure to meet the various demands of our different stakeholders. In addition, the members of my team expected more autonomy, more responsibility and less monitoring.

Why did you choose Holacracy®?

For three reasons:

  1. I was convinced that Holacracy® represents democracy applied directly to a company.
  2. The members of my team wanted to play a bigger role in decision-making.
  3. Another team within Swiss Post had already successfully implemented Holacracy®.

The team voted, and Holacracy® was the unanimous choice.

The Constitution and the clear tools and processes enable us to oversee the quality of management. Holacracy® also enjoys a high level of credibility. In addition, there are various pieces of associated software available, such as Holaspirit.

Organising work by roles rather than sets of specifications was an attractive prospect for all team members. These roles develop in line with the changes we face, something that is perfect for our need for flexibility. The members also appreciate that authority is now given by a set of rules and no longer by a single person.

*Lucien – the team’s former manager – transferred his authority over to the Holacracy® Constitution.*

What would you say are the benefits of implementing Holacracy®?

Each member of the team is more deeply involved, in particular at tactical and governance sessions. Our meetings have become increasingly efficient. All problems are presented at the meeting and tackled immediately: decisions are taken, and everyone is involved. This new decision-making process also means that I no longer have to decide everything, and it has given me more respect for my colleagues’ opinions.

We are still at the learning stage. Everyone requires an adjustment period to fully incorporate the new responsibilities of a role into their work and learn how to manage the authority that comes from this. Clarifying these responsibilities and grouping these within specific roles has already been of great benefit. This enables everyone involved to examine their activities and optimise their tasks. Team members’ ability to identify with their roles is also a key benefit of this transition.

Holacracy® has another benefit, namely transparency. For example, we can integrate a new member of staff into the team more easily and quickly. Processes, responsibilities and tasks are clear and transparent.

What did you think of Liip's coaching?

Your support – four days spread across six months – completely met our expectations. Laurent and Jonas did outstanding work.

The kick-off was intense but absolutely necessary, and it gave the team a new way of looking at hierarchy. I remember the rather formal moment when I transferred my authority over to the Constitution. I also enjoyed working on our circle's purpose and creating the first roles.

During this kick-off, you gave us the tools that would enable us to work with Holacracy®. The coaching showed us that we could view authority from a different angle. It took us out of our comfort zone and enabled us to optimise the roles within our team.

You have been working with Holacracy® since the beginning of the year. What are your plans for the future?

Our work with Holacracy® is only just beginning. It is great to see that it really works. The next step is to incorporate the idea that roles and people are separate from each other. Current roles will be developed (some will be wound up and others created) to bring them closer to our daily work. We still need to learn how to handle some tools so that this structure becomes a source of fulfilment, for each employee's personal benefit.

The transition taking place within our team has piqued the company’s interest and been viewed favourably. Numerous other teams are beginning a similar process.

I think this is fantastic, and hope that Swiss Post as a whole will move in the same direction.
I would therefore be delighted if Liip comes back to see the progress we have made, coach us on the aspects we need to improve, and share our experiences.

Jul 24 2019
Jul 24

This article assumes that you've already heard the big news about Acquia acquiring Mautic, the largest open-source marketing automation platform. Chances are that you've already run demos on your Drupal 8 site. If that is indeed the case and your Drupal 8 site uses the popular Webform module, you're probably wondering how you can send your existing Webform submissions to Mautic in order to convert them to contacts.

Well, the easiest way is to simply create a Mautic form and embed that form in your Drupal site. Mautic lets you create many different forms which can be easily embedded in your Drupal site using simple JavaScript and/or HTML embeds. However, you cannot simply let go of your current Webforms, and let's also be frank: there are no form systems that can outperform Drupal 8's Webform module.
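For reference, a Mautic JavaScript embed is a single script tag; the instance URL and form ID below are placeholders for your own:

```html
<script type="text/javascript" src="https://your-mautic-instance.example/form/generate.js?id=1"></script>
```

Mautic also offers a plain HTML copy of each form if you'd rather avoid loading remote JavaScript.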

The good news is that Drupal 8's Webform module can easily send submissions to Mautic forms - thus connecting your Drupal site with Mautic in a seamless way.

This integration also takes into consideration that Mautic tracks anonymous visitors and then converts them into contacts once they submit a form - yes, through Webform in this case.

Prerequisites: 

  1. You have a Drupal 8 website with Webform module installed
  2. You have a Mautic instance installed

Step by step guide:

  1. First, you'll need to create a Mautic form similar to your Drupal webform. You do not need to embed this form, you just need to create it so it receives the submissions by remote post from your Drupal webform.
     
     
  2. Download and install Webform Mautic module. This module adds a Webform handler to map submissions to Mautic forms.
  3. In your Drupal 8 site, go to the webform whose submissions you want to send to Mautic. 
    Navigate to the webform's "Settings", then go to "Emails / Handlers" to add a new handler.
    Create a new handler by clicking "Add handler" and choose "Mautic".
     
     
  4. Now you'll only need to configure the handler. Choose a meaningful title. I like to call this handler "Send submission to Mautic form".
        
     
  5. Save the handler and you're done.

Things to pay attention to:

On the Drupal side:

  • It's recommended that you have the Mautic tracking code installed in your Drupal site.
    This will leverage Mautic's contact tracking capabilities and link form submissions to the activities that were tracked while the contact was still an anonymous visitor.
  • Test the handler by going to "Test" in the handler's action menu. Make sure Mautic is receiving your submissions.

On the Mautic side:

  • Make sure your form is published.
  • Make sure your form is not set to "Kiosk mode" if you want to get the full contact tracking activities when the form is submitted.
