Jul 28 2020

Decoupled and multilingual web development are our specialities, and we’ve created this blog series to illustrate the advantages and applications of using the Translation Management (TMGMT) module as part of a comprehensive translation strategy, and how this can be applied to a decoupled architecture.

So far in the series, we’ve had an in-depth look into what can be achieved when using Drupal as a translation management system, with the use of core functionality and some great contributed modules, and what the translation workflow can look like. We’re now going to shift the focus to look at translations in a decoupled Drupal architecture. 

First, we’ll look at a solution in which we integrated a Gatsby app with a pre-existing custom Drupal 7 translation repository, and then how we would approach decoupled translations with either Drupal 8 or Drupal 9 using the existing translation management system. We’ll be focussing on interface translation strings in this blog post. 

The Challenge

The goal of the project was to build a frontend that integrated with the client's third-party and in-house APIs and applications. Part of this was integrating with an existing Drupal 7 instance that housed the client’s content and an interface translation repository built with custom entities. The application needed to be provided in 8 different languages and, due to its size and complexity, introduced nearly 1,000 interface translation strings. We needed to make sure that the translation team could continue using this translation repository as normal, and we’d need to be able to re-use some existing translations in the new Gatsby app. 

The Solution 

Initial setup

We started by creating a couple of custom endpoints in the client’s Drupal 7 instance that would allow us to read and write translations. We created an endpoint for listing out all of the translations in JSON format, keyed by langcode, and another endpoint for the creation of new translation entities.
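For illustration, the listing endpoint returned a structure along these lines (the exact field layout shown here is an assumption; the real repository used custom entities):

```json
{
  "en": {
    "form.address.state-label": "State/Province/Territory"
  },
  "fr": {
    "form.address.state-label": "État/Province/Territoire"
  }
}
```

Keying the payload by langcode makes it straightforward to split into one resource file per language on the Gatsby side.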

Once the API endpoints were in place, we started work on the translation framework for the Gatsby application. We chose the tried-and-tested i18next package, specifically react-i18next, for use with our React components. We had a large, distributed team on the project, so to get up and running quickly, we made sure there was a language switcher in place, and that the “t” function i18next provides was available for use before completing the entire synchronisation feature with Drupal 7. This ensured the other developers could identify strings that needed translating and that we wouldn’t have to retrospectively update any text to make it translatable. We also created developer documentation explaining usage in the project, demonstrating example usage and the translation system architecture. The usage documentation within the Gatsby app looked similar to the example below:

import { useTranslation } from 'react-i18next';

function MyComponent() {
  const { t } = useTranslation();
  const siteName = 'Amazee Translations';
  return <h1>{t('page.section.element-name', 'Welcome to {{siteName}}', { siteName })}</h1>;
}

Translation diagram

  • Translation key: a unique identifier for the translatable string
  • Default value: the English version of the translatable string. In our context, these matched our screen designs
  • Replacement variables and other options: some translatable strings will include dynamic variables

Synchronising translations

With the foundations set, we implemented the functionality to synchronise translations between the Gatsby app and Drupal 7.

This translation synchronisation happens when the Gatsby application is built, and also periodically via a scheduled cron job.

1) Scan the Gatsby files for translatable strings 

We included and configured the i18next-scanner package to parse all our TypeScript files and extract any usages of the “t” function. The extracted translations were all written to a file called dev.json, which is used to populate the custom translation entities in Drupal 7. As well as all the parameters that were passed to the “t” function, we include a link to a custom preview app that lists all Storybook stories for the given translation key, giving the translation team context around the string and how it’s used in the app.
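A minimal sketch of what such a scanner configuration could look like (the globs, paths, and “dev” language code are assumptions based on the description above, not the project's actual config):

```javascript
// i18next-scanner.config.js (sketch)
const config = {
  // Parse all TypeScript files in the project
  input: ['src/**/*.{ts,tsx}'],
  options: {
    // Extract usages of the "t" function
    func: {
      list: ['t'],
      extensions: ['.ts', '.tsx'],
    },
    // Write the extracted strings to a single dev.json file
    lngs: ['dev'],
    resource: {
      savePath: 'i18n/{{lng}}.json',
      jsonIndent: 2,
    },
    // Use the key itself when a string has no default value
    defaultValue: (lng, ns, key) => key,
  },
};

module.exports = config;
```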

// Translation key parameter from the t function
"form.address.state-label": {
  // Default value parameter from the t function (on build this will be prepended with the app's version number)
  "administrative_title": "State/Province/Territory",
  // Default value parameter from the t function
  "en_translation": "State/Province/Territory",
  // Link to a custom app that displays all components that use the given translation key
  "context": "http://web.app/i18n/form.address.state-label"
}

Example output from running the translation string scan.

2) Fetch existing translations

To get existing translations, we make a GET request to the custom endpoint we created in Drupal 7, and for each language, we create a JSON file that can be used when initialising i18next.

Example output from running the translation string fetching from Drupal 7.

3) Add any missing translations

Now that we have what should be available in Drupal 7 (all translatable strings from our Typescript files) and what actually exists in Drupal 7, we can do a quick diff between the dev.json file and the retrieved en.json (as we always create an English translation in Drupal 7 when syncing). 
This process means that when a new translation key is added to the Gatsby app, it will be pushed to Drupal on the next build, ready for translation into other languages. And vice versa, if a new translation is added to Drupal, it will be pulled into the app on the next build. 
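The diff itself can be sketched as a simple key comparison (the function and field names here are illustrative, not the project's actual code):

```javascript
// Compare the scanned strings (dev.json) with what exists in Drupal 7
// (en.json). Keys only present in dev.json need to be pushed to Drupal;
// keys only present in en.json are new translations to pull into the app.
function diffTranslations(devJson, enJson) {
  const enKeys = new Set(Object.keys(enJson));
  const devKeys = new Set(Object.keys(devJson));
  return {
    missingInDrupal: Object.keys(devJson).filter((key) => !enKeys.has(key)),
    missingInApp: Object.keys(enJson).filter((key) => !devKeys.has(key)),
  };
}
```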

In Drupal, each key's title is automatically prefixed with the app's version e.g. [1.1.0] Search (search.title). The version number is retrieved from the package.json file, which is updated when there's a new release. This helps identify translation batches for the translation team to work on.
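As a small sketch, building that versioned title could look like this (the function name is hypothetical; the format matches the example above):

```javascript
// Prefix a translation's title with the app version read from
// package.json, e.g. "[1.1.0] Search (search.title)".
function versionedTitle(defaultValue, key, version) {
  return `[${version}] ${defaultValue} (${key})`;
}
```

In the build, `version` would come from `require('./package.json').version`.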

Out in the wild

After the translation synchronisation feature was delivered, the translation process worked in this way:

  • Developers would ensure all strings used within the Gatsby app were wrapped in the t function and that a suitable key was provided. There’s error logging built into the synchronisation process to alert us to any missing mandatory data, and using TypeScript helps a lot with this too.
  • When features made it through the QA process, a deployment would be made to a pre-production environment where the translation sync would run and upsert the translations into Drupal 7. The translations would be tagged with the appropriate version number, and this would be communicated to the translation team at feature milestones.
  • Using the relevant version number, the translation team would then get to work on all the translations tagged with that version number, and subsequent builds would pull those into the Gatsby app. They would be able to use the preview links to confirm the context of the translation string and add the translation in Drupal 7.

Greenfield approach with Drupal 8 or 9

As the needs of our clients differ, we can’t always take a greenfield approach to solutions, and the solution above worked really well in the context of that project. In this example, we were only concerned with interface translations, so if we were to start afresh, here’s how we’d approach handling content and interface translation with Gatsby and Drupal 8 or 9.

Gatsby setup

We wouldn’t change a lot here in terms of localisation: we’d still use react-i18next in combination with i18next-scanner. It worked really well, and one of the maintainers was super responsive and helpful when we ran into any issues (yay for the open-source community!). 

To synchronise interface translations with Drupal, we’d create a Drupal i18next backend plugin. To retrieve multilingual content from Drupal, we’d opt for gatsby-source-graphql over gatsby-source-drupal, as gatsby-source-drupal is not multilingual-ready, and we are generally bigger fans of GraphQL and the flexibility it gives. Our very own John Albin presented the foundations for this kind of approach in his Decoupled Days 2019 talk.
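As a rough sketch, a custom i18next backend plugin only needs to implement the backend contract (`type = 'backend'`, `init`, and `read`); the Drupal endpoint URL and response shape below are assumptions:

```javascript
// Minimal i18next backend plugin that fetches interface translations
// from a (hypothetical) Drupal endpoint returning strings keyed by
// translation key.
class DrupalBackend {
  constructor(services, options = {}) {
    this.init(services, options);
  }

  init(services, options = {}) {
    this.options = options;
  }

  // i18next calls read() for every language/namespace it needs.
  read(language, namespace, callback) {
    fetch(`${this.options.baseUrl}/i18n/${language}/${namespace}`)
      .then((res) => res.json())
      .then((data) => callback(null, data))
      .catch((err) => callback(err, false));
  }
}

// Tells i18next this plugin is a backend.
DrupalBackend.type = 'backend';
```

It would then be registered with `i18next.use(DrupalBackend).init({ backend: { baseUrl: 'https://example.com' } })`.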

Drupal setup

We’d set up GraphQL and use this as the endpoint for the Gatsby app to source its content from. We’d love to try out Gatsby Live Preview and Incremental Builds with this approach - something we’re very excited about after using this functionality successfully on a Gatsby + Contentful project. Content translations would work in the normal way, being retrieved with GraphQL. For interface translations, i.e. the strings in the Gatsby app that Drupal isn’t aware of, we’d create a contributed module able to communicate with the i18next backend plugin, so the interface strings from the Gatsby app could be managed in Drupal as normal, as if they came from the usage of the t function in a Twig template.


With the progression of Drupal’s API-First initiative, from 7 to 8 and beyond, content and interface translations can now be retrieved from a Drupal instance and used in an entirely new frontend. This means there are lots of opportunities to level up the UX of traditional Drupal 8 websites without having to change the content and translation management process. Decoupled Drupal is a thriving topic and there are still many challenges to overcome; the important thing is to leave Drupal to do what it’s best at, and the translation system is a fantastic and fundamental example of this.

Do you have a project that would benefit from a comprehensive translation strategy? We can help. Check out our services or get in touch today! 

Jul 22 2020

Why Virtual?

Unfortunately due to the pandemic, the planned DrupalCon for North America had to be rearranged and revised. It was originally scheduled to take place in Minneapolis from Monday 18th May – Friday 22nd May. Instead, the Drupal Association acted quickly and resourcefully to transform the event into a virtual remote conference.

At first, others in the community and I were skeptical about how a virtual conference would compare to our vibrant in-person events; would it feel like a welcoming, inspiring, and collaborative environment?

Come for the code, stay for the people

Overall, I was thoroughly impressed with everything, from the organisation of the event and the diverse range of speakers to the fun social activities and networking opportunities. It exceeded my expectations, and I think there’s a unanimous sentiment: DrupalCon Global was a huge success! 


How was the experience?

Aside from a few hiccups in the beginning, the conference ran smoothly once the waves of connection issues flattened out.

There were some notable differences from an in-person conference: the line for coffee was nonexistent, and you would not be silently judged for entering or leaving a session half-way through. Instead, you could just click through to virtually jump between parallel sessions in other tracks, something that is more difficult to do in person. This allowed attendees to freely explore and find the right sessions throughout the conference.

Overall, DrupalCon Global was a fantastic alternative that really brought together the community from around the world. Since a virtual conference eliminates travel considerations, it was truly more global than the originally planned event. I was able to network with people ranging from Sydney, Australia all the way to California, United States and down to people in Johannesburg, South Africa.

I was excited to serve as a mentor to two recipients of the scholarship program. One of my mentees was based in Lagos, Nigeria and the other in Kerala, India. It was an honour to be given the opportunity to welcome new members to the Drupal community and guide them through the conference.

The Kickoff

We started off the conference with Vincenzo of amazee.io, cooking pasta live from his home in Catania, Italy, for over 1500 event attendees. I’m sure some of the people who tuned in may have wondered if they were attending a conference related to Drupal or to cooking, but either way, many people really enjoyed the chance to improve their “PastaOps” skills.

Vincenzo of amazee.io cooking pasta

After we all worked up an appetite admiring the culinary wonder of Pasta al amazee.io, we were ready to jump into the latest news and initiatives from the community, with Dries Buytaert opening up with the Driesnote. I was awe-struck to see photos and videos of the CelebrateDrupal.org platform used as part of the keynote. I was part of the community team that designed, built, and launched the website with the sole purpose of bringing people together to celebrate Drupal, and we’ve been overwhelmed with the positive responses leading up to the release of Drupal 9.

Dries laid out the five initiatives to work on as part of Drupal 10’s release cycle:

  1. Drupal 10 readiness
  2. An easier out-of-the-box experience
  3. A new front-end theme (Olivero)
  4. Automated updates for security releases
  5. An official JS menu component for React and Vue

You can read more about each of those initiatives in the State of Drupal presentation (July 2020).


With all DrupalCons, there comes a time to take the official DrupalCon Photo. However, with everything being remote, we needed something different. Tim Lehnen of the Drupal Association asked everyone to take a selfie and upload it to the CelebrateDrupal.org platform, while I spent the next day figuring out how best to display the images.

I came up with the idea of arranging the images in the shape of the new evergreen Drupal logo using CSS Grid, to represent that the people - the community - are what makes Drupal great. Check out the DrupalCon Global 2020 - Photo.

Exhibit Booths

As sponsors, both Amazee Labs and amazee.io had virtual exhibit booths. These booths allowed us to showcase our recent videos, featuring closer looks at our experience in maintenance as well as recent updates to Lagoon. We ran a few live broadcasts within our booths. During these events, a few Amazees and special guests used a virtual room to talk about life at Amazee, our day-to-day job, and technologies we’re excited about. A benefit of the online format was that we could invite attendees to join the video chat to ask or answer questions live as well as in the booth chat room. 

It was quite interesting to be able to virtually walk into other sponsors' exhibit booths and connect with them. They were all really friendly and welcoming with some sponsors giving away prizes if you took part in their virtual quiz or competed in a game, just like at the in-person conference.

Global connection

No DrupalCon is complete without BoFs (Birds of a Feather), and this time was no exception. We had various BoFs such as the resurgence of the #DrupalNapping corner, as well as a BoF about the upcoming DrupalCon Europe; and of course, the #DrupalPets BoF in which everyone brought their pet to the video call to virtually meet each other.

#DrupalPets BoF

Social Events

There were quite a few planned social events such as meditation by Rahul Dewan and multiple trivia nights to quench our thirst for fun interactive comedic gold. I took part in one of the three trivia nights and it felt just as fun and competitive as the in-person event.

Sessions and Summits

Aside from the keynote sessions which took place on the main stage, we also had multiple session tracks covering a wide range of topics. Accompanying the session tracks, there were several different summits which took a deep-dive into specific fields such as the Government Summit or the Healthcare Summit.

I attended the Performance and Scaling Summit which had four amazing speakers:

  • Mike Herchel - Front-end performance audit
  • Janna Malikova - Load testing your Drupal website before it's too late
  • Shane Thomas - Decoupled performance
  • Michael Schmid - How to survive COVID-19 as a hosting provider

I found the summit very informative, notably learning that the GovCMS platform recorded over 100,000 page views per minute with 187,000 concurrent users. In order to reach that level of web scale, you really need a performant website that returns the time to first byte (TTFB) in less than 0.3 seconds.

Furthermore, I attended the “Volunteer-led strategies for helping the Drupal community” group session by Alanna Burke, Elli Ludwigson, Ruby Sinreich, and Tara King. They discussed why diversity, equity, & inclusion is everyone’s responsibility. They highlighted the importance of intersectional feminism, anti-racism, and how to go about taking action.

Mitchell Baker gave an inspirational keynote about what we, the open-source community, mean by “open.” Mitchell shared insights from over two decades leading Mozilla and how the open-source community can contribute to a better Internet, with diversity being essential to making this happen.

Jacqueline Gibson talked about Digital Inequity for the Black community. A key learning was that company culture is step one in building an inclusive and diverse team. A welcoming, safe space starts with the team members, who then hold their managers' feet to the fire to sustain and nurture that culture.

Another talk that resonated with me was Leslie Gibson’s talk on building a flexible digital brand. Leslie took us through the digital event strategy of Black Womxn For, and explained how to empower local community members.

Local communities trust the people they know and vice versa

The last session I attended was John Albin’s talk on “Progressively decouple Drupal 8 with GraphQL and Twig”. I found it very informative, as it covered not only how to get started with the GraphQL Twig module but also touched on the history of Drupal and the difficulties of learning the render API with Twig for Drupal 8.

There were so many more sessions with amazing speakers - I highly advise checking out the on-demand content, which will become free to the public by September 2020.

So if you're as passionate about Drupal web development and the open-source community as we are, get in touch!

Jul 09 2020

DrupalCon Global 2020 is just a few days away, and we’re excitedly prepping up for what’s sure to be a virtual event like nothing the community has seen before. To say the least, it’s been a strange few months for everyone on the planet, but we at Amazee Labs think nothing exemplifies the resilient and persistent spirit of the open-source community like the innovative skills and organizational determination it took to bring DrupalCon Global 2020 and all its global attendees together.

Event organizers and volunteers have been tirelessly working to bring the community a virtual version of the DrupalCon experience that we all know and love.

The always popular “hallway track” has been reshaped into a virtual booth experience with plenty of space and time dedicated to promoting organic conversations with fellow attendees. A Virtual Library will give attendees access to presentations and featured speaker content, curated lists of the sessions selected for Meet the Speaker, on-demand event video, and special attendee-only content by stakeholder organizations.

Attendees focused on professional developments will have access to newly formatted sessions and program features, expanding everyone’s opportunities for inspiration and advancement in a myriad of fields and subjects.

At DrupalCon Global 2020, attendees will still be able to share and learn about the latest in thought leadership around open source and ambitious digital experiences, to enhance their careers and organizations, and add their strength and momentum to current and future Drupal projects.

Here’s how you can connect with the Amazee Labs team during the event:

Live Events

Our CEO Stephanie Lupold will join the How to maintain company culture with distributed teams Panel on July 15th at 16:00 UTC on the main stage to discuss how business leaders maintain their company culture in a virtual environment.

On July 16th catch our lightning talk on Automating your Web Maintenance using Drutiny at 19:15 UTC, presented by Blaize Kaye and Fran Garcia-Linares.

John Albin Wilkins’ presentation on Progressively decouple Drupal 8 with GraphQL and Twig will be live on July 17th 0:00 UTC. 

Virtual Booth

Don’t forget to stop by our virtual booth during exhibition hours to chat with a live Amazee and see presentations about our services and technology: 

  • Tuesday, July 14 14:00 - 15:00 UTC
  • Wednesday, July 15 21:00 - 22:00 UTC
  • Thursday, July 16 16:00 - 17:00 UTC 

You can also check out our video library anytime during the conference.

Don’t forget to register for the event. We hope to see you there!

Jun 23 2020

With a simplified translation flow and a balanced use of both human and machine translation, you can reduce your costs and reach even more markets.

In this blog, we’ll further explore a few user stories (introduced in part 1 of this series) and how we resolved their translation needs with the help of the Translation Management (TMGMT) module.

We will see how to:

  • Simplify and accelerate the translation process to empower content editors

  • Delegate the translation task to machine and human translators, and feed other translation systems (e.g. Trados or Memsource)

  • Prevent data loss or duplication by choosing the content model that fits your expectations, from the beginning

  • Identify possible translation processes in a publishing workflow

  • Set a deadline, word count, and allow non-Drupal users to receive translation files by mail

These user stories can be grouped into three topics: Content Moderation, Paragraphs translation, and UX experiments. The implementation described below is Drupal 9 ready.

Content moderation

The most basic requirement for a “translation flow” is being able to mark a translation as outdated.
This works perfectly fine for non-moderated content, and there is also work in progress in Drupal 9.1 to bring this feature back for moderated content.


It could also be a valid use case to set a state for a translation while making use of a publication workflow (e.g. content moderation). So even if the source is published, that is not necessarily the case for the translation.

There is work in progress with TMGMT to support pending revisions and accept translation as a specific moderation state, so this option is available while doing a Job review.


Additionally, we are experimenting with the following features with the TMGMT Content Moderation module:

  • Display the current moderation state close to the published status in the content translate form

  • Enable translation operation only when the source content reaches a specific state (example: “Published” state)

  • Exclude states from the translated entity (example: “For translation” state)

  • Redirect on Job completion to the latest revision of the entity

Combined with the Moderation State Columns module, it can also produce this kind of view.

Content Dashboard

Paragraphs asymmetric translation

Paragraphs can be configured to have asymmetric translations. In this case, the structure of a translation can differ from the source entity. It also makes it possible to avoid a translation fallback to the source on the frontend.


  • English source

    • Paragraph text 1 EN

    • Paragraph text 2 EN

  • French translation

    • Paragraph text 1 FR

While working with this setup, we need to be aware of several possible issues.

  • Data “loss” and dangling references can occur

    With existing content, switching between symmetric and asymmetric can cause data to not appear in the backend or the frontend, and can then produce dangling references, as the Paragraph entity ID will or will not be the same depending on the chosen setup. This needs to be taken carefully into account before giving content editors access.

  • While using TMGMT, the flow might not be the expected one when performing several translations

    Let’s continue with our minimal example.

  • For our English source, the first edit is
    • Paragraph text 1 EN

    • Paragraph text 2 EN

  • A first French translation Job via TMGMT produces the expected result
    • Paragraph text 1 FR

    • Paragraph text 2 FR

  • A second edit of the English source adds a 3rd paragraph, so we have
    • Paragraph text 1 EN

    • Paragraph text 2 EN

    • Paragraph text 3 EN

We translate the same content again with a TMGMT Job: the 3rd paragraph appears in the Job review, but it will never appear on the frontend or while editing the translation. This is probably not expected, and if it is still accepted in the flow, it consumes unnecessary translation resources.

If we manually edit the French translation, say, to remove the 2 originally translated paragraphs and then add 2 others, we end up with

  • Paragraph text 4 FR

  • Paragraph text 5 FR

If we translate the source again at this stage, the situation between the TMGMT Job and the frontend will become quite unclear.

  • Work in progress

    We are close to a solution but the integration of TMGMT with Paragraphs asymmetric translation (including content moderation) is still a work in progress.

Paragraphs asymmetric integration with TMGMT is feasible, but if you plan to translate the content several times or update translations with TMGMT, it will most likely not be the right fit for content editors.

UX experiments

A client requested that we adapt the TMGMT translation flow for content editors. The assumption was that the Job review would be done right after the translation, which allowed us to propose a simplification of the UI and skip some steps. A similar request came from another client, so we decided to start a proof of concept and gather some of these requirements to see how they could be generalised.

Here are a few examples of alternate flows and simplified UX that could be used by roles that do not need the whole TMGMT stack.

Original TMGMT flow: Machine translation with DeepL

[embedded content]

The alternate flows described below are combined with the TMGMT Content Moderation features that were previously mentioned.

Alternate flow 1: Machine translation with DeepL

Once a source reaches the translatable state (Published in this case), the translation occurs in 3 steps: Create the Job > Review the translation > View the result as the Latest revision.
The second step could even be skipped by accepting translations without review.

[embedded content]

Alternate flow 2: Send a file by mail to a translation service

Here we presume that the XLF file can be sent as an attachment.

[embedded content]



Combined flow


In some cases, we can also combine the 2 flows and first do a machine translation via e.g. DeepL, and then export the result in the XLF output to populate the initial translation in another translation solution (like Trados or Memsource).

List of other features provided by the Simple TMGMT module:

  • Translate with the usual translate operations

  • Disable operations links when the user selects multiple languages

  • Limit operation links to the supported translators

  • Optionally disable translation once already translated (edit only and do not translate again via a Job)

  • Per content translation flag to keep track of automatic translation

  • Optionally add a delivery date. A default delivery date is calculated based on a certain number of working days

  • Integration with the Swift Mailer module (HTML mail templates with attachments)

  • Integration with the Disable Language module (filter languages with permissions)

  • If a Job is left unprocessed, allow the user to continue or delete it via the translation form

  • On Job delete or Job item delete, redirect to the entity translation form

  • Re-route machine translation error messages to a specific email address (e.g. [email protected]) and simplify the error messages for content editors

Next steps

  • With the deadline, word count, and non-Drupal users in the flow, it opens the door to integration with other information systems and stakeholders. For example, a CRM might contain your translator contact details and translation skills while an accounting platform will get the cost reporting once the translation Job is finished.

  • Introduce a generic way to deal with notifications, as suggested in this issue.

If you need help with any of the above including adding translation functionality to your site - get in touch with us today!

May 27 2020

In this series about building multilingual sites, we will use Drupal as our central content management system. In this blog, we will dive into how we can implement translation for different features based on defined user stories with Drupal core and contributed modules.

Getting started: elaborate on a process

Having a good translation strategy starts at the very beginning of a project. Here are a few topics that can help in elaborating a process.

The translation strategy could be approached in several ways depending on the project:

With a minimum viable product (MVP) or a new project build

  • Fully translated, with several languages, for the first release

  • A proof of concept in a single initial language

  • A subset of languages first, then add more languages at a later phase

  •  ...

With upcycling an existing project or doing a project migration

  • The source and destination content model might not be the same

  • Some translations could be partial or outdated

  • ...

For both content creation and update, we should be able to answer the following questions:

  • Who are the personas? They could be represented by Drupal roles: content editor, translator, reviewer, publisher, …

  • What are their relations with the translation system, and which features can help their work?

  • How will access to content and features be handled?

  • How will the content flow between personas, is there a publication workflow? Example: when the source is ready for translation is a simple workflow enough (translation review then publication) or do we need a more elaborate process that allows back and forth between reviews?

  • Are revisions required? Would it be possible to roll back a previous revision and how does it affect translations? Also what happens when the source is updated?

  • What are the relations between languages? Are we able to translate from another language than the source?

  • For interface translation (locale), how can we make use of a contributed version first, then override it, handle changes, and provide flexibility when more context is needed?

  • What are the available resources? In house translators, one or several specific translation agencies that might be business-specific, automated translation services? How do we want to deal with deadlines and scheduled publications?

  • How does it integrate with third parties (if any)? Translators could be part of a CRM or translation jobs might need to be reported in an accounting system.

  • What is the translation scope? It could be frontend and/or backend, do we want URL aliases to be translated?

  • Do we have to deal with external entities (e.g. content being imported and updated from an external service)?

  • Do we need a different structure between the source and the translation?

  • How does the translation fallback behave? Do we want to display the source, display a message, redirect, …?

Drupal multilingual concepts

Let’s see what Drupal brings out of the box. Here is a quick recap of the Drupal translation concepts.

  • Interface translation: also known as “locale”, it can be handled via Gettext / .po files. It includes strings passed through the t() function or the trans Twig tag; these strings are also contained in YAML files (module name in the info file, schema labels, titles in routing and annotations). They are not synced between environments by default, but their import can be automated.
Example: the label of a button in a Twig template

  • Configuration entities: core, contributed and custom modules provide them via their schema. They can be synced between environments via the configuration export/import.
Example: strings set for Views, Blocks, settings, …

  • Content entities: their blueprints are made from configuration. They are specific to each environment by default.
Examples: Nodes, Terms, Users, Media, ...
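
To make the distinction concrete, here is a minimal sketch of where each kind of translatable string comes from in code. It assumes a bootstrapped Drupal site; the $node variable and the German langcode are illustrative.

```php
<?php
// Interface translation ("locale"): strings passed through t()
// are registered in the locale storage and translated per language.
$button_label = t('Read more');

// Configuration: values stored in configuration objects,
// translated via the Configuration Translation module.
$site_name = \Drupal::config('system.site')->get('name');

// Content: field values on content entities, translated per language.
// $node is assumed to be a loaded, translatable node.
$german_title = $node->getTranslation('de')->label();
```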


With these Drupal capabilities in mind, we can continue by refining parts of our process into common user stories, grouped by project stakeholder, then see how each can be covered by core or the contributed ecosystem. Here are some examples of possible expectations.

As a Product Owner, I can

  • have an overview of all the translations and their status

  • access each node translation without having the admin UI translated

  • edit translations that are defined by code

As a Translator, I can

  • be notified of new translation requests without having to access a Drupal site

  • receive structured files like .xlf or .po that I can use with other translation solutions than Drupal

As a Translation Reviewer, I can

  • let the Publisher know that a translation is ready for publication

As a Site Builder, I can

  • limit edit / view access for selected languages, so that content can be translated before being published as a whole for that language

  • limit the access to translation languages, by user

As a Developer, I can

  • add metadata to interface translation strings by specifying a context

  • have support for plural forms

  • update locales between environments

  • extract translations from custom code

  • make sure that every string is translatable

  • migrate content with translations and map them to a new content model

Translations in the wild

Content translation will be developed in the second part of this series, so let’s see first how some of these user stories can be covered by the contributed ecosystem and which tools are available for developers.

Drupal core (>= 8, 9)

Each core concept is basically covered by its own module, with the Language module as the common requirement: Configuration Translation, Content Translation and Interface Translation.

Make the source language editable

Since 8.5.0, the Interface translation module provides an easy way to enable interface translation for the site default language. Given that English is the default language: edit the English language and check “Enable interface translation to English”. In a sense, it can be regarded as a successor of String Overrides.


Multilingual migrations are stable (8.9.0, 9.0.0). The Migrate Drupal Multilingual module is no longer required.

Translation context

Context can be used to provide or identify variations of the same string.
It can also be used to group business-specific translations (e.g. from custom code) that might not fall within the scope of contributed translations, by adding metadata.

This issue aims to provide context filtering in core (> 9.x) so it will replace modules like Locale Translation Context.

Examples of context definition

  • PHP $this->t('May', [], ['context' => 'Long month name']);

  • Twig <span>{% trans 'May' with {'context': 'Long month name'} %}</span>

  • Javascript Drupal.t('May', {}, {context: 'Long month name'});

  • PHP annotation label = @Translation("Month", context="Date"),
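
In an exported Gettext file, the context shows up as msgctxt next to the source string. A sketch, with a German translation as an illustration:

```po
msgctxt "Long month name"
msgid "May"
msgstr "Mai"
```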

POEdit with context

The core API

The core provides great services to deal with translations, like the string_translation service (TranslationManager) and the StringTranslationTrait, which exposes the t() and formatPlural() helpers to custom classes.
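
As a sketch (the class name is hypothetical), a custom class can pull in the trait to get the t() and formatPlural() helpers:

```php
<?php

use Drupal\Core\StringTranslation\StringTranslationTrait;

// Hypothetical builder using core's string translation API.
class GreetingBuilder {
  use StringTranslationTrait;

  public function build(int $count): array {
    return [
      // Translatable string with a placeholder and a context.
      'greeting' => $this->t('Hello @name', ['@name' => 'world'], ['context' => 'Greeting']),
      // Plural form support.
      'count' => $this->formatPlural($count, '1 new message', '@count new messages'),
    ];
  }
}
```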

Contributed solutions

Translation Management Tool (TMGMT)

TMGMT facilitates the translation process and really shines if you need to export the data structure with the content while outsourcing a translation Job.
It supports a large variety of translation service providers (Lionbridge, DeepL, Google, ...) and provides support for content, configuration and interface translation. These three sources are displayed in a single overview, so it replaces solutions like Translation Overview.

The beauty of this unified solution is that you can outsource translations to any of the providers (like DeepL) and, for example, create a Job that groups each locale string as a Job item.

Translation Extractor (POTX)

The Drupal core interface translation export (admin/config/regional/translate/export) will export basically everything that is available in the interface translation UI. This module provides:

  • string extraction from code, instead of only strings that have previously been registered

  • filtered extractions as .pot (template) or .po files by language with optional inclusion of the translations.

The Drupal 8 version is still a work in progress, so make sure to review these issues:
String context not taken into account when retrieving a translation and Plural translation values are not exported.

TMGMT Source

Disable Language
Filters out disabled languages from the language switcher and sitemap, redirects users that don’t have the permission to view disabled languages, and much more.

Allowed Languages
Set restrictions on which content a user can edit by language.

Administration Language Negotiation
Its main use case is to allow displaying the frontend of the site in one language and still keep most of the backend in English (or another language of your choice).

Since 8.6.4, this behaviour can still be partially achieved for the backend, with core configuration, by using the Account administration pages as a detection method for interface text.

Paragraphs Asymmetric Translation Widgets
While using Paragraphs, it could be required to have a different structure between the source and the translations. Make sure to test carefully if you switch from the default Paragraphs widgets, as data loss might occur if you already have translations. Also, there might be some limitations while using TMGMT.

Drush supports locale check, update and import (see https://localize.drupal.org/translate/drupal8)
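
For example, with a recent Drush version, the locale commands look like this (the langcode and file path are illustrative; this needs to run against an installed Drupal site):

```shell
# Check localize.drupal.org for available translation updates.
drush locale:check

# Download and import the updated translations.
drush locale:update

# Import a custom .po file for German.
drush locale:import de /path/to/custom.de.po
```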

Quick tips

Having an overview of the content model (entities and relations) can help to answer some site-building questions accurately:

  • which entities are the subject of a content moderation

  • which ones are translatable (entity and field level)

  • which tools might answer the requirements (Paragraphs, Blocks, …)

  • how does it integrate with site search (e.g. while using Search API)

Troubleshooting common errors:

  • Interface: check if the string is wrapped in a t() function or trans tag. Reduce duplicates early (interface translation strings are case sensitive).

  • Content: check the translatability of your entities and fields, the Content language UI is a great tool for this (/admin/config/regional/content-language)

  • Configuration: the schema or the .config_translation.yml files might be missing.

Making use of context early for business-specific translations might help later, when you need to export or filter translations based on this metadata.


More to come on this topic

In the next blog, we'll adapt our content translation flow and delegate this to the Translation Management Tool (TMGMT), where we will tackle different approaches to solving complex issues such as using content moderation for symmetric and asymmetric translation.
Finally, we'll finish off the series with a third instalment to look into how we've approached and solved a complex Gatsby multilingual integration with Drupal 7, and how this can be applied to a decoupled Drupal 8 site. 

Questions? Comments? Want to learn more? Get in touch with us today!

May 20 2020
May 20

More likely, you have a content management system (CMS), which is specifically configured and customized to manage your content. In some cases, this same software will deliver your content to your audience. In more complex cases, the CMS exposes your content over an API to a front-end application and other channels, which present this content to your audience.


echo("It's software all the way down.");

When it comes to the software supporting your website or application, there will always be a tension between building something new and innovative and getting the most out of what you have already built. Twenty years ago, Joel Spolsky wrote about Netscape making the "worst strategic mistake... they decided to rewrite the code from scratch."


While I love this quote (I probably reference this thinking at least a few times a month) perhaps the more important point is made at the end of Joel's article:

"It’s important to remember that when you start from scratch there is absolutely no reason to believe that you are going to do a better job than you did the first time."

Now, this isn't to say that I am anti-innovation or that I am a technology conservative. In fact, I believe that iterations are the best way to make pieces of software more effective, elegant, or efficient. But, far more than this, I hate to see the waste created by under-investment in existing production software.

Talking about something new is way easier than executing maintenance plans on what you already have. And, as it turns out, executing a new development project is often far more difficult and way riskier than you can possibly imagine while you're dreaming up the new idea.

I believe that adopting a maintenance mindset is one of the best ways to delay the risky undertaking of a full website or web application rebuild. I come across a lot of engineers, communications professionals, and digital marketers who think that web maintenance is something that starts after a development cycle and ends just before a rebuild.

Web Maintenance Graph

When talking to customers and teams, I try to encourage us to give the software being created the upper hand by employing what I refer to as "Engineering for Maintenance". You can think of this as "building it right". Once the software has been built and launched in a way that it can be maintained, you can execute a systematic maintenance plan, which I refer to as "Maintenance Engineering".

Engineering for maintenance

ALGM Web Maintenance

In the best case, "building it right" should be the first step taken towards adopting a maintenance mindset. Building and launching web software in a way that it can be maintained covers multiple interrelated disciplines. Starting from the user experience (UX) and design process, where we uncover the best way to communicate with your audience, through to the content modelling, where we plan a logical structure for your content, to picking the technology stack and developing the custom features required to present the specifics of your brand, organization, or mission.

By building something in a maintainable way, we lay the groundwork for getting the return on our investment for the longest possible time.

We frequently come across maintenance cases where "Engineering for Maintenance" was not considered. There are multiple reasons why this could be the case, ranging from inexperience, to unexpected success, to pivot pressure, through to simply negligent practices. In these cases, "Engineering for Maintenance" often takes the shape of an audit and change implementation plan to lay the foundation of what we need to take the project into an appropriate maintenance plan.

Maintenance engineering

ALGM Web Maintenance Engineering
In the web development context, "Maintenance Engineering" covers all the disciplines of keeping the website delivering the specific value it was created to deliver. This could be from simply communicating a concise brand message, through to an enterprise brand presentation, collecting donations, or selling products.

Maintenance Engineering is not just keeping the software secure and up to date - it is also about keeping features working well, iterating on features that can be changed, and building out new features which can launch on the same software. So Maintenance Engineering also covers the creation of new features to deliver new value as you iterate and grow to further understand your audience.

Consequently, this phase doesn't only involve engineers. It equally involves UX specialists, designers, content specialists, analytics consultants, and project managers.

As you may notice, there is a significant overlap in the activities that are undertaken during a new development stage and those undertaken in a maintenance stage. However, I've found that "Maintenance Engineering" has a somewhat different flavour to that of the engineering undertaken when building new software. This is due to the fact that maintenance engineering looks after a production site - it necessarily takes a more conservative approach, and moves at a different pace.

I don't mean to suggest that things move more slowly in a maintenance phase - in fact, this is often quite the opposite. I rather mean to underline the point that there are often more things to consider with a site that is in production. This is the reason that, at Amazee Labs, we split our team into a new build team and a maintenance & extension team.

Get more from what you have already built

Adopting a maintenance mindset can be a challenge. It sometimes requires a shift in thinking and taking a longer view than what you may have intended. But I believe it can be valuable, and that you can often get more out of what you have already built.

Find a partner that gets it

If you have an internal team looking after your web software, find a product owner, manager, or member of the engineering team that also believes there is merit in maintenance as a mindset, rather than a milestone.

If you outsource your web development, have a chat with your project manager or customer liaison and check that they share your view on maintenance as a mindset. At Amazee Labs, this happens by default in the regular alignment calls between the Project Manager and client Product Owner.

Take inventory

Spend some time with your chosen partner taking an inventory of where you are at with your current site or application. How aligned is your website look and feel with your brand? How well is it telling your story? Is it still effectively selling your products, or collecting your donations? Essentially ask yourself, from a functionality and communication perspective, how far off are you?

Next, take inventory of the software and infrastructure. Is the software out of date, and can it be brought up to date easily? How performant is the site? Does it feel snappy or sluggish?

Take action

Remember, the goal is to get more out of what you already have. There is no getting without doing, so the most important part of trying it out is to take action! The sooner you try to adopt the maintenance mindset, the sooner you'll really know how much more you can get out of what you have already built. In some cases, you may find that the relative cost of maintenance is so high that new software should be considered. In this case, you have learned something, so you're already better off than you were.

But in my experience at Amazee Labs, 9 times out of 10, with some planning and some creativity, we have been able to extend the lifetime of our customers' sites significantly.

Reach out

The team and I at Amazee Labs are passionate about helping you get more out of what you have already built. If you are interested to find out what we do, and how we do it, get in touch. Looking forward to hearing from you.

May 11 2020
May 11

Join us on May 28th for an Amazee Labs Webinar about Test-Driven Development with Storybook and Cypress

Test-driven development is central to any agile development process, and navigating to a browser manually for testing is time that can be spent on more important things. Mastering the testing tools that accompany different languages and frameworks might seem daunting, but it’s one of the most rewarding and efficient things a developer can learn, and we’re here to help.

In our last webinar, we talked about applying Test-Driven Development with Cypress.io to Drupal modules and Gatsby Websites. 

Now we’ll take the next step on our journey to total confidence with testing by teaching you how to mix Storybook into your workflow in order to: 

  • Improve functional and visual test coverage for user interface components 

  • Speed up development 

  • Test without ever touching a browser

Join us on May 28th, 2020 at 04:00 PM -- Register online now! 

Want to review before you join? Check out the resources and full video recording of our last webinar on TDD with Gatsby.

Watch our previous Webinars:

Feb 15 2019
Feb 15

The recent post on Dries’ blog about REST, JSON:API and GraphQL caused a bigger shockwave in the community than we anticipated. A lot of community members asked for our opinion, so we decided to join the conversation.

Apples and Oranges

Comparing GraphQL and JSON:API is very similar to the never-ending stream of blog posts that compare Drupal and Wordpress. They simply don’t aim to do the same thing.

While REST and JSON:API are built around the HTTP architecture, GraphQL is not concerned with its transportation layer. Sending a GraphQL query over HTTP is one way to use it, and unfortunately, one that got stuck in everybody’s minds, but by far not the only one. This is what we are trying to prove with the GraphQL Twig module. It allows you to separate your Twig templates from Drupal’s internal structures and therefore make them easier to maintain and reuse. No HTTP requests involved. If this sparks your interest, watch our two webinars and the Drupal Europe talk on that topic.

So GraphQL is a way to provide typed, implementation agnostic contracts between systems, and therefore achieve decoupling. REST and JSON:API are about decoupling too, are they not?

What does “decoupling” mean?

The term “decoupling” has been re-purposed for content management systems that don’t necessarily generate the user-facing output themselves (in a “coupled” way) but allow clients to get the stored information using an API exposed over HTTP.

So when building a website using Drupal with its REST, JSON:API or GraphQL 3.x extension and smash a React frontend on top, you would achieve decoupling in terms of technologies. You swap Drupal’s rendering layer with React. This might bring performance improvements - our friends at Lullabot showed that decoupling is not the only way to achieve that - and allows you to implement more interactive and engaging user interfaces. But it also comes at a cost.

What you don’t achieve is decoupling, or loose coupling, in the sense of software architecture. Information in Drupal might be accessible to arbitrary clients, but they still have to maintain deep knowledge of Drupal data structures and conventions (entities, bundles, fields, relations…). You might be able to attach multiple frontends, but you will never be able to replace the Drupal backend. So you have reached the same state of coupling that Drupal has offered for years by being able to run different themes at the same time.

The real purpose of GraphQL

Back when we finished the automatically generated GraphQL schema for Drupal and this huge relation graph would just pop up after you installed the module, we were very proud of ourselves. After all, anybody was able to query for any kind of entity, field, block, menu item or relation between them, and all that with autocompletion!

The harsh reality is that 99.5% of the world doesn’t care what entities, fields or blocks are. Or even worse, they have a completely different understanding of it. A content management system is just one puzzle piece in our client's business case - technology should not be the focus, it’s just there to help achieve the goal.

The real strength of GraphQL is that it allows us to adapt Drupal to the world around it, instead of having to teach everybody how it thinks of it.

Some of you have already noticed that there is a 4.x branch of the GraphQL module lingering, and there have been a lot of questions about what it is. This new version has been developed in parallel over the last year (mainly sponsored by our friendly neighbourhood car manufacturer Daimler) with an emphasis on GraphQL schema definitions.

Instead of just exposing everything Drupal has to offer, it allows us to craft a tailored schema that becomes the single source of truth for all information, operations, and interactions that happen within the system. This contract is not imposed by Drupal, but by the business needs that have to be met.
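
To illustrate what such a tailored contract could look like (all type and field names here are hypothetical, not the module’s actual API), the schema describes the business domain instead of Drupal internals:

```graphql
# A contract expressed in business vocabulary, not in entities and bundles.
type Recipe {
  title: String!
  summary: String
  ingredients: [Ingredient!]!
}

type Ingredient {
  name: String!
  amount: String
}

type Query {
  recipe(id: ID!): Recipe
}
```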

A bright future

So, GraphQL is not a recommendation for Drupal Core. What does that mean? Not a lot, since there is not even an issue on drupal.org to pursue that. GraphQL is an advanced tool that requires a certain amount of professionalism (and budget) to reap its benefits. Drupal aims to be used by everyone, and Drupal Core should not burden itself with complexity, that is not to the benefit of everyone. That's what contrib space is there for.

The GraphQL module is not going anywhere. Usage statistics are still climbing up and the 3.x branch will remain maintained until we can provide the same out-of-the-box experience and an upgrade path for version 4. If you have questions or opinions you would like to share, please reach out in the #graphql channel on drupal.slack.com or contact us on Twitter.

Feb 12 2019
Feb 12

Amazee Labs is proud to sponsor Drupal Mountain Camp in Davos, Switzerland 7-10 March 2019.

Come by and see us in the exhibit area or at one of the social events, and be sure to check out these Amazee sessions: 

On Friday, from 14:40 till 15:00, join Maria Comas for GraphQL 101: What, Why, How. This session is aimed at anyone that might have heard or read about “GraphQL” and is curious to know more about it. The session will give a basic overview and try to answer questions like:

  • What is GraphQL?

  • Is GraphQL only for decoupled projects?

  • Advantages to using GraphQL with Drupal

  • Getting started with GraphQL

Follow this up on Friday from 15:00 till 16:00, with Daniel Lemon who will present Mob Programming: An interactive session. The basic concept of mob programming is simple: the entire team works together on one task at a time. That is one team – one (active) keyboard – one screen (projector of course). It’s just like doing full-team pair programming. In this session you’ll learn:

  • What are the benefits to a team?

  • How could this be potentially integrated into your current workflow

  • The disadvantages to Mob Programming and why it might not work for certain types of companies (such as a web agency).

Additionally, don’t forget to check out this talk from Michael Schmid of amazee.io Best Practices: How We Run Decoupled Websites with 110 Million Hits per Month. This session will lift the curtain on the biggest Decoupled Websites run by amazee.io and will cover:

  • How the project is set up in terms of Infrastructure, Code, Platform and People

  • How it is hosted on AWS with Kubernetes, and what we specifically learned from hosting Decoupled within Docker & Kubernetes

  • Other things we learned running such a big website

Hope to see you in Davos soon! 

Jan 23 2019
Jan 23

What is “Open Source”? Is it really free?

Publishing software under an open source license means that you grant people the right to use, study, modify and distribute it freely. It does not imply that this process is free of charge. The legal framework just ensures that the operator - at least in theory - has full control over what the software is doing.

That being said, charging for open source isn’t common. The simple reason is that it's hard or in some cases even impossible to track where the software is used. Even if the maintainer added some kind of license check, the open source license grants the user the right to remove it, so any effort in that direction is futile.

Most open source developers generate revenue either by relying on donations or charging for support and maintenance. Since they don’t have to provide a warranty for each installation of their code, these strategies can often at least cover their expenses. In some cases, it’s even enough to make a living.

Should my code be open source?

Writing a piece of code that does something useful can lead down three different paths. These three options could be called lazy, crazy and safe. And that makes the decision a lot easier.

1. Lazy: Just keep that piece of code within the project

In the best case scenario, you will remember it if you stumble upon a similar problem four months down the road and copy it over to project B. You will probably find some bugs and make some improvements, but chances are not high that they will make it back to project A.

In the worst case, the lines of code are just left and forgotten, and the problem will be solved once again at the cost of the next project B, while project A keeps the full maintenance costs.

2. Crazy: The solution is super-useful and so fleshed out that you decide to sell it under a proprietary license model

Going down this road means serious marketing to achieve a critical mass, providing guarantees and warranty to customers, and paying a host of lawyers to make sure nobody steals the intellectual property or uses it in unintended ways.

This all boils down to starting a high-risk business endeavour, and in most cases, it doesn’t make sense.

3. Safe: The solution is moved into a designated package

In the worst case, the code just stays in this package and is never re-used. More commonly, it can be picked up for project B, and all improvements immediately are available for project A. The maintenance costs for this part are shared from now on.

And in the best case, this package is made publicly available and somebody else picks it up and improves it in some way that directly feeds back into project A and B.

Advantages of Open Source in an agency

Client Value 

From our perspective as an agency, there is hardly ever a case where open source is not the best option. Our business model is to get the best possible value out of our clients' investment. We achieve that by contributing as much as we can since every line of code gets cheaper if it can be reused somewhere else. Some clients actively encourage us to share projects even in their name and some don’t care as long as we get the job done.

External Collaboration

Our core business value is our knowledge and experience in providing software-based solutions, not the software itself. And as long as our clients agree, we use our position to spark collaboration where it wouldn’t happen without us. If we see requirements that pop up across different projects, we can align these and share the effort, which ultimately helps our customers save money.

Internal Collaboration

Another reason for us investing into open source is our own setup. As a heavily distributed team, information flow and structure is even more important than for co-located companies.
I often see code not being published openly due to tightly coupled design, missing tests, or insufficient documentation.

The investment to increase quality is often billed against “contribution costs” and is therefore the first thing to fall off the edge. But it actually is part of “doing your job properly”, since software should also work reliably and stay maintainable if it’s only used once.

Since proper architecture and documentation become vital as soon as different timezones need to cooperate on a single codebase, contributing has to become the standard process instead of the exception.

Apart from that, threatening developers with publishing their creations has proven to be a terrific instrument for improving code quality.

Open source products

If the produced software, or - more general - produced knowledge, itself is the product or would expose business critical information then it might not make sense to go open source. But even in such cases, interesting exceptions have happened.

Tesla’s heavily discussed move to release all its patents for electric cars to the public back in 2014 is not exactly the latest news. Some praised Elon Musk’s goodwill, while others called it a marketing stunt. The fact is, Toyota cancelled its partnership with Tesla around the same time and released its first hydrogen fuel cell car. A behemoth like Toyota focusing on hydrogen cells could have become a serious threat to the electric car industry as a whole. Releasing the patents was a way to strengthen the technology enough to overcome this obstacle. I wouldn’t dare to judge whether the undertaking was successful, or whether we would be better off with hydrogen cell cars. But this case illustrates how sharing knowledge can be as powerful as keeping it for oneself.

Another example is our sister company, amazee.io, who decided to open source their hosting platform “Lagoon” some time ago. Full transparency on how applications are hosted is a huge deal for technical decision makers, and it becomes a lot easier to gain their trust if they can see what’s going on. Sure, you could just grab the code, try to get your hands on some amazee.io-grade engineers, and strap them in front of their computers 24/7 to get the same level of reliability and support. But I doubt there is a legal way to do this with less money than just hiring the creators themselves.

Should everything be open source?

This might ignite some discussions, but I don’t think so. The open source community has suffered a lot from being associated with pure goodwill and altruism. And this has led to serious problems like developer burnout and subsequent oversights that shook platforms as a whole.

The “no license fee” bait did a lot more damage than it helped. There might be no fee, but that doesn’t mean work is for free. Compensation just works through other channels. And if this is not possible, it’s sometimes better to pay for a license than relying on an unsustainable open source solution. 

I personally see open source as a business model that embraces the fact that distribution of information is free. Instead of wasting resources on artificially locking down intellectual property, it focuses on creating actual value. And since I'm making a living off creating this value, I consider this a good thing.

Open Source as a model is one tool that gives us the ability to create innovative and ambitious projects for our clients. Get in touch with us today!

Jan 18 2019
Jan 18

March 2019 sees the return of Drupal Mountain Camp, in the picturesque town of Davos in Switzerland. The call for sessions closes at midnight CET on Monday, 21 January, so be sure to submit your talk today.

We’re proud to be part of the organising team as well as a Gold sponsor for this awesome community run event. We’ve submitted several talks and hope you do the same.

About Mountain Camp

The camp is designed to combine the beauty of the snow-covered Swiss Alps, with the warmth of the Drupal community. It's a perfect combination of fresh tracks for those who ski or snowboard, with inspirational talks by amazing people. This will, of course, be accompanied by some world famous Swiss cheese and chocolate.
The camp takes place from 7 - 10 March 2019, at the Davos Congress Centre.

Davos Congress Centre

Final call for sessions

Call for sessions close on Monday, 21 January, so don’t delay, be sure to submit yours today! 

Along with the great sessions, there will be 2 confirmed keynotes. The first on Friday, entitled "The Future Of Drupal Communities", by Drupal community leaders Nick Veenhof and Imre Gmelig Meijling, and the second on Saturday by Matthew Grill, about the "Drupal Admin UI & JavaScript Modernisation Initiative."

If that sounds interesting and you want to know more about the topic submission process, read on:

How can I submit a session?

So you've got something you'd like to talk about? Awesome, here's how you can submit a session:

  1. Head on over to the submit a session link.

  2. Think of a catchy title and fill in the Session Title.

  3. What is your talk about? Try to write 4-5 lines about what you'd like to talk about in the Description textbox.
    Note: you can add images if it helps to portray your talk.

  4. Select what kind of Session Type (how much time) you'd like.

  5. Input the appropriate Tracks - you may select multiple if your talk covers various topics.

  6. Select the Level of Expertise - is it more of a beginner talk or does it become quite advanced with technical terms?

  7. Don't forget to add your Speaker Name and Contact Email.

Session Talk

Why should I submit a session?

Preparing and then presenting helps to entrench your knowledge on the topic. You'll also learn from your peers who attend your talk, through feedback and questions.

Be sure to take note of the following, when considering your topic and submission:

  • Giving a talk will require a lot of work and preparation, but don't let that put you off. It will pay off in the end.

  • People who attend your talk are generally looking for help in your specific topic, so this will be a great time for networking.

  • You'll be noticed and people will tell you that you're cool.
  • Ok, maybe you don't want to be noticed, and maybe you're fine with not being called cool, but you'll definitely have fun talking.

  • You'll feel way more confident afterwards, which might be a good enough boost for you to jump on a snowboard and hit the slopes on the weekend.

Check out some of these great proposed sessions for inspiration:

I hope this has inspired you! Now go ahead and submit your talk and we'll see you in March in Davos, Switzerland. Till then, follow the Camp on Twitter.

Swiss Alps

Nov 28 2018
Nov 28

I once got a task to fix a bug. While the bug itself was easy to fix, I first had to find the commit where it was introduced. To explain why, I have to describe our development process a bit.

Our branching model

The exact branching and deployment workflow may differ from project to project, but we have two mainstream versions. One is for legacy amazee.io hosting and one is for Lagoon.

Here is the common part. The production instance always uses the latest prod branch. When we start to work on a new task, we create a new branch from prod. When the task is tested and demoed, we deploy it. Separately from other tasks.

We do this to speed up the delivery process, and to make our clients happy.

If a project lives on the legacy hosting system, it usually has PROD and DEV environments. For a task to be tested and demoed we have to deploy it to DEV first.

With Lagoon, we have a separate environment for each task, and this is awesome!

The bug I had to fix was on a project hosted on the legacy system. It was found on the DEV environment and was not present on PROD, so one of the active tasks had introduced it (and at that time we had lots of active tasks). I had to find which one.

The bug

An element was appearing on a page where it should not have.

The project

The backend is built with Drupal. The frontend is also Drupal, but we used progressive decoupling to embed dynamic Vue.js elements. In between sits our beloved GraphQL. No test coverage (nooooooooooooooo.com) yet, but we have a plan to add it with an end-to-end testing framework. Most probably it will be Cypress.


It's a modern e2e testing framework with lots of cool features; some of them, like time traveling, help you not only to write tests but to develop in general. Just watch the 1-minute video on the Cypress website and you'll love it.

Git bisect

This is a very easy and very powerful Git tool. To make it work, you just need to give it three things:

  • a commit where things are good
  • a commit where things are bad
  • a command to test if things are good or bad

The result would be the first bad commit.

Docs: https://git-scm.com/docs/git-bisect

The search

Finally, I can share my experience in combining these two tools.

Since we don't use Cypress on the project yet, I installed it globally on my machine with npm i -g cypress and created a cypress.json in the project root containing just {}. That's all Cypress needed.

To run Git bisect, I used the following commands:

The my_test.sh script looked like this:
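The script itself is not preserved here either; a plausible sketch, based on the cache-clear and Cypress steps described in this post (the drush and Cypress invocations are assumptions), writes the probe out like this:

```shell
# Write out a sketch of my_test.sh (drush/Cypress calls are assumptions).
# git bisect treats exit 0 as "good", 1-124 as "bad", and 125 as "skip".
cat > my_test.sh <<'EOF'
#!/bin/sh
drush cr || exit 125
cypress run --spec path/to/vue/cypress/integration/test.js
EOF
chmod +x my_test.sh
```

A failing Cypress run exits non-zero, which is exactly the "bad commit" signal git bisect expects.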

(I was actually lucky that for Drupal I only had to clear caches after each Git jump. If, for example, there had been Drupal core updates between the good and bad commits, running drush cr would not have worked. In that case I could have installed Drupal from the existing configuration each time, which would have been a bit slower.)

And here is the Cypress test which I put into the path/to/vue/cypress/integration/test.js file:

It took a little time to set this all up. The result was good - I was able to identify the commit in which the bug was introduced.

Sum up

Modern e2e testing frameworks are easy to set up and use. They can do more than just automated testing. All it takes is some imagination.

For example, a colleague of mine once had a task to do a content update on a project using an Excel file as a source. One way was to do everything by hand, copy-pasting the data. Another was to write a one-time importer. Instead, he turned the Excel file into JSON data and used TestCafe to do the click-and-paste job. This was faster than the first two options. And it was quite cool to see the visualization of the automated task - it's so nice when you can see the result of your work.

Nov 27 2018
Nov 27

Over 300 people attended this year, many of them backenders but also frontenders, designers, business strategists, and other stakeholders all coming together to share learnings, experience, and excellent local beers in the city of Ghent.

DrupalCamp Ghent was organised by the Drupal community, and we want to say thanks to all the organisers for making all of this possible, with a special mention to Peter Decuyper, who enlightened us with his sketch notes of the sessions.

It is the essence of camps to make the (difficult) choice between the sessions you will attend, so here are the highlights of the ones that we attended.

The organisers paid extra attention to the relationship between sessions, so many talks nicely complemented each other.

Decoupling and the future of Drupal: about UX, code, design and humans

The position of Drupal is constantly being re-evaluated. One of the values of the Drupal community is paying attention to people, and the work of the last few months proved this value once more by covering a wide variety of personas.

Authors and site builders

UX was covered in many ways. Clément Génin debunked the myths about user-centric design, and he explained the what by describing a mindset, not a magic formula that can be applied to an existing project. I perceived his session as a way to build a love story between the designer and the end user.

Cristina Chumillas demonstrated the how by showing us the path that was followed for the Drupal Admin UI since 2017 and what we might expect for Drupal 8.7. If you want to help or just know more about this work, head to the Admin UI & JavaScript Modernisation strategic initiative.


Preston So gave us even more perspective. He started his keynote with the history of the Drupal frontend and continued with the emergence of wearables, digital signage, augmented reality, and conversational UI. He then introduced the concept of contextless / universal editing with a multipolar Drupal that can reduce the custom work needed for decoupling. A good example of this trend is GraphQL. Content is like water: when the shape changes, it should adapt to its context rather than being context specific.

When it is about content, the editor is one of the most important stakeholders. Ruben Teijeiro provided a few answers to problems like page refresh, too much site building, or keeping the link between content editing and decoupling. Among other solutions, he mentioned modules like Elementor, Content Planner, Glazed Builder or Editable.


Dries Van Giel gave us an introduction to Sketch, a fully vector-based tool suited for web design, that leverages features like components (symbols), shared styles among documents and element export in multiple formats. This meets the current approach of component-based design (like Pattern Lab or Fractal does) and reusability.


GraphQL is all the rage nowadays. Peter Keppert talked about:

  • When to use decoupling: multiple frontends for one CMS, Single Page Apps, …
  • The benefits of using GraphQL for that purpose: a self-documented schema that is strongly typed and allows caching queries in the database.
  • The points that need attention compared to other solutions: possible information disclosure and the complexity it introduces for the team.
  • The integration in the Drupal contrib ecosystem with Paragraphs and Box.


Fabian Bircher explained how Configuration Management (CMI) has evolved since Drupal 8.0. At the time, it was designed to cover the basic flow of deploying without modifications. Contributed modules have implemented several other use cases, like configuration split or ignore; Drupal 8.6 added the installation of a site from a given configuration, and Drupal 8.7 will introduce the new ConfigTransform service. Using Drupal as a product can also be implemented with the Config Distro module.

With his typical sense of humour, Branislav Bujisic gave us an introduction to Functional Programming. The foundation of his session was a comparison between Alan Turing's states and Alonzo Church's functions. He introduced concepts like immutability, static typing, and the elimination of side effects to improve testing and caching (memoization), with control over complexity and more performant code. Even if PHP is not a functional language, a few of these principles can still be applied. Truly inspiring!

Testing and code quality

If you are looking for a way to contribute back to Drupal, a lot of core and contributed projects need manual testing. Just have a look at the 'Needs review' status in the Drupal issue queue. Automated testing is also welcome; Brent Gees gave us all the keys to get started seamlessly with Unit, Kernel or Functional tests in his presentation How to get started with writing tests for contrib.

When it comes to client work, the time that can be spent on tests may be more limited, and the approach is more about testing the assembly of components, so a pragmatic solution is fast Functional Testing with solutions like Behat. Tom Rogie showed how to configure Behat for several environments and browsers in a Continuous Integration workflow but, more importantly, what to test.

Easily improve the quality control in your projects tomorrow: Yauhen Zenko provided a nice way to run tools like PHP Linter, coding standards compliance and mess detection, wrapped in a Composer-based solution.


Joris Vercammen covered the best practices for Search API configuration, demonstrating in the meantime that most common use cases can be covered by a plain database server.

For a live demo, head to http://drupalsear.ch, which exposes most Search API features with the new Drupal Umami profile.

Advanced topics like machine learning and AI were illustrated by Markus Kalkbrenner, maintainer of the Search API Solr Search module and the Solarium library, with streaming expressions, graph queries and the inner workings of Solr. Sweet!


Serverless is a buzzword that can lead to confusion. Robert Slootjes explained it with Functions as a Service (FaaS) and the action of removing the hassle of server provisioning and scaling.

Thijs Feryn, the author of a Varnish book, adopted the perspective of caching by diving deep into the HTTP protocol. It was nice to get detailed explanations about the foundations of the web and the Symfony framework. The session also demonstrated that Drupal already implements most of the best practices regarding caching.

It was awesome to see how many things can be learned in such a small amount of time, and we are already looking forward to the next edition!

Dinner closing

Nov 21 2018
Nov 21

For the past ten years, the Drupal community has organised a yearly DrupalCamp held in various cities of Belgium. This time, it will take place in the lovely city of Ghent.

As usual, the organisers are broadening the audience of this event with content aimed at developers, designers, site builders, and business strategists. They also contribute to this goal by maintaining low ticket prices.

The sessions are raising the bar too, with hot topics such as search, accessibility, functional programming, chatbot, testing, GraphQL, and serverless.

I’m excited to take this opportunity to enjoy the community, expand upon my knowledge of the Drupal ecosystem, and prove once and for all to my fellow Amazees, Dan and Vijay, that there is no comparison between Belgian and Swiss chocolate.

View the full programme here.

Oct 31 2018
Oct 31

Join us on November 5th for the Zurich Drupal Meetup at the Amazee Labs Zürich office.


  • The File Management Module for Drupal 8 - Lightning talk + Q&A by David Pacassi Torrico
  • Outlook Drupal Switzerland Activities 2019 - Discussion by Josef Dabernig (Amazee Labs)
  • Propose your topic in the comments!

General Information 

The Zurich Drupal Meetup is dedicated to people interested in the Content Management System & Framework Drupal.

We welcome everybody from beginners to Drupal ninjas and would be happy to see you present a recent project of yours or talk about any other Drupal-related topic.

Talk Formats

  • Lightning talk (max. 10 minutes)
  • Short talk (max. 25 minutes)
  • Full talk (max. 45 minutes)

If you would like to join us, sign up here: https://www.meetup.com/Zurich-Drupal-Meetup/

Oct 23 2018
Oct 23

A full rebuild of a website can be a time-consuming and expensive process. Upcycling is an incremental approach to relaunching existing websites. This blog will explain more about what upcycling is and why it might be the right choice for your website.

Why upcycle?

Most websites will be rebuilt every three to six years to keep up with online trends, because of technical debt, or simply to refresh their appearance. At Amazee Labs, we have helped many clients transition from their legacy web systems onto Drupal 8, but not everyone is ready to make the move all at once. This is where upcycling can come into play.

As upcycling is intended to be an incremental approach, it might not be suitable for every use case or every client. Upcycling de-prioritizes the “one-big-bang-launch-wow-effect” and allows us to partner with our clients to meet one primary goal: reducing time to market for big website improvements and maximising the value of time spent.

When to upcycle?

If you have a well-established web system that has been operational for several years, and you aren’t ready to spend the time and money to do a full rebuild, upcycling might be the answer.

Upcycling Process

As you can see, upcycling can be performed at any stage of an existing web project. Depending on the size of the upcycling project, we might transition from the maintenance and extension mode back to implementation. Alternatively you might do a smaller upcycling project within the maintenance & extension cycle. Large upcycling projects will often mean moving all the way back into a conceptual consulting & discovery mode before we start implementing new features or functionality.

What to upcycle?

We’ve designed an upcycling questionnaire to guide the conversation with the customer with regard to different aspects of the website. Although these are common areas for upcycling, we use this questionnaire as a starting point to discuss what will be the best fit for each project.

Upcycling Areas

For each of these upcycling areas, we have a set of questions to validate the potential and need for upcycling. For example, when we talk about design we would ask if the look and feel of the website is perceived as outdated or if there are any inconsistencies within the current design implementation.

If we identify an area that could benefit from upcycling, we will provide a set of recommended steps for improvement. In this case that might be a design refresh, establishing a design system, or rebuilding the frontend.

We also provide upcycling case studies to show our clients what is possible with upcycling, and help build on their ideas to improve their website without starting from scratch.

How to upcycle?

Upcycling demands that we are in a position to split things up.

An example is Sonova.com. The main website has been running on Drupal 7 since 2014. Last year, we started relaunching individual country pages using Drupal 8. These new pages allow the content managers on the client’s side to benefit from the better editorial features of Drupal 8 early on without needing to wait for a relaunch of the entire website. Gradually we keep relaunching country page by country page on Drupal 8.

Upcycling Sonova Drupal 7

Sonova Country Page Version in Drupal 7

Upcycling Sonova Drupal 8

Sonova Country Page Version in Drupal 8

The next step in upcycling this site will be a relaunch of the main website on Drupal 8. When we are ready for that step we can build upon the incremental steps we started for the country pages.

As well as the additional editorial features, we also worked with the client to choose a different Drupal theme. This means sites running on Drupal 7 feature a different design than the sites running on Drupal 8. So instead of merely optimizing for consistency across all country pages, together with the client we chose to innovate and bring newer design versions to the local markets without waiting for the relaunch of the whole site.

How does upcycling relate to decoupling?

If your site has some complex backend logic that you don’t want to rebuild but you are eager to relaunch the frontend, upcycling could be the solution. Usually, we would relaunch the frontend within Drupal’s theme layer. But in certain cases, it makes sense to relaunch the frontend as a decoupled site and then integrate the existing backend. We recently did this for a customer that wanted to get started with Drupal 8 but had some complex Drupal 7 Backend logic that needed to be maintained.

On the other hand, if the backend really needs an overhaul and you want to keep the existing frontend without rebuilding it, upcycling could work for that too, after decoupling the backend.

Decoupling your architecture will enable you to upcycle individual parts and bring value to the end user faster, but it also comes at the price of added complexity. In the end, it’s important to compare the advantages and disadvantages.

Pros of upcycling:

  • Get the most out of your existing website infrastructure.

  • Benefit from user experience, design or frontend performance improvements without the need to wait for a big relaunch.

  • See the results of your investments as quickly as possible.

Cons of upcycling:

  • Potentially added complexity when maintaining two systems at once.

  • Potential inconsistencies in appearance if sections are upgraded separately.

  • You partly need to invest in the legacy platform rather than spending everything on the new one.

More details on upcycling can be found in this presentation.

What’s your experience & challenges when it comes to upcycling? Do you have an existing project that you would like to improve? Let us know in the comments or reach out via the contact form.

Oct 11 2018
Oct 11

The second Amazee Labs webinar took place last Friday, 28th September 2018. Philipp Melab gave a stunning presentation on “Atomic Design in Drupal with GraphQL & Twig”. Here's a short recap of what we learned together.

We kick-started the webinar with a summary of what we learned in the first webinar; in case you missed it, you can read up on it here. This time our focus was to build a real-world example website for a fictional web agency called Amazing Apps.

Amazing Apps design

Philipp wanted to pack as much information as possible into the webinar, so he set up a GitHub repository with everything you need to get started. We were shown a brief design of the end goal, then jumped straight into the meat of the presentation by dissecting the git history of the repository, commit by commit.

A clean, concise, & well-structured frontend.

Fractal is a tool to help you build and document web component libraries and then integrate them into your projects. We were led through the basics of what Fractal provides as a starting point, then jumped ahead in the repository to a point where we had a couple of components built, with colours defined using CSS variables and some demo text content.

As part of Atomic Design, we explored and learned the use of atoms, molecules, and organisms. Atoms demonstrate all your base styles at a glance, such as a logo or a button. Molecules are UI elements containing two or more atoms functioning together as a unit, such as a menu. Organisms are relatively complex UI components containing multiple molecules, atoms, or other organisms, such as the header or footer.

fragment Menu on Menu {
  links {
    url {
      path
    }
    label
  }
}
Once we got to the menu component, we were treated to the first GraphQL fragment. From here we could navigate up the templates from the molecule to the header organism, and then to the page layout template, which called the Twig block named header. We can then override these blocks using the Twig extends tag, injecting our Fractal-based templates as necessary along with our GraphQL fragment.
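This override pattern can be sketched in plain Twig; the template names below are illustrative, not the ones from the webinar repository:

```twig
{# A hedged sketch of overriding the header block in a page template. #}
{% extends "@templates/page-layout.twig" %}

{% block header %}
  {# Swap in the Fractal header organism; `menu` stands for the data
     the GraphQL fragment made available to the template. #}
  {% include "@organisms/header.twig" with { menu: menu } %}
{% endblock %}
```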

GraphQL Twig should be used to decouple things where it makes sense; building a fully decoupled solution still costs a lot in terms of development. GraphQL Twig is therefore the right solution to enhance and modernise a site feature by feature.

Learnings as a webinar host

It was our second webinar, so we incorporated a few learnings from the first edition into the new session. We made sure to start the marketing campaign earlier to ensure a good turnout and, ideally, a larger audience; we ended up with over a 40% increase in the total audience!

Check out the GitHub repository and accompanying videos:

Amazee Labs would like to thank everyone who attended the live session, we enjoyed being able to share this with you, and we look forward to hosting another Amazee Labs webinar in the future.

You can watch the entire webinar here:

[embedded content]

Sep 19 2018
Sep 19

So here we are, post-Drupal Europe 2018. Talks have been given, BOFs attended, way too much coffee and cake have been consumed, and now I’m tasked with summarizing the whole thing.

The problem faced by anyone attempting to wrap up the whole of an event as momentous as Drupal Europe is that you have two options. On the one hand, you can give a fairly anemic bullet-point summary of what happened and when. The advantage of approaching a summary like this is that everyone who was at Drupal Europe 2018 can look at the list and agree that, “yes, this is indeed what happened”.
Fair enough. Maybe that would be a better blog?

But that’s not quite what I’m going to be doing since (as you’ll find in the links below) my colleagues have done a stellar job of actually covering each day of Drupal Europe in their own blogs. What I’m going to do, rather, is tell you about my Drupal Europe. And my Drupal Europe was far less about talks and BOFs (and coffee and cake) than it was about the people in the Amazee Group and the Drupal community in general.

Reasons to get off the Island

For background, I live in a smallish town (we have a mall and everything) down here on the South of the North Island in New Zealand. Getting myself to Darmstadt involved nearly 30 hours in those metal torture tubes we commonly call “airplanes”. Under most circumstances I’d avoid this kind of travel, but Drupal Europe was an exception because it presented me with the one opportunity I had this year to spend time with and around my teammates in Amazee Labs Global Maintenance specifically, and the rest of the Amazees at the conference in general.

I came to Drupal Europe in order to have the kind of high-bandwidth conversations that (very) remote work almost never allows. It allowed me to meet some of my colleagues in person for the first time, in some cases people who I’ve been speaking and interacting with online for more than a year. Outside of the hours of strategic meetings we all had, it was a joy spending time sharing screens IRL and looking at code, eating kebab (so much kebab), and (wherever we could) doing a bit of real work in-between.

And while my reason to get off my island was really my colleagues at Amazee -- being present, alongside, and with them -- the importance of the wider Drupal community is not lost on me and attending Drupal Europe highlighted to me, once again, just how special that community is.

Beer hall at dusk

We’re hiring, by the way.

In her deeply moving talk about her journey from being a freelancer to being the Head of Operations for ALGM, Inky mentioned the principle of Ubuntu. This ethical and metaphysical principle is often rendered in English as “I am because we are”. In one interpretation, at least, it suggests that our existence as individuals is inextricably intertwined with the existence of others. I think that something like Ubuntu is true of both Amazee and the wider Drupal community.

What makes Amazee special is the remarkable individuals that comprise it, indeed, I doubt I would’ve been as enthusiastic as I was to travel so far if they weren’t remarkable individuals. But I have to wonder whether those individuals would shine quite as brightly in any other company? Amazee gives us the space to be the best we can be and whatever shine we have as individuals makes Amazee glow that much brighter.
Zooming out a little, Amazee, as an organization, would not exist as it does without the wider Drupal community. And the Drupal community would be poorer, at least in my opinion, without the work that Amazee does.

It’s circles within circles within circles, each strengthening the other.

Showing your work.

This was a theme in the Amazee talks at Drupal Europe. Stew and Fran, in their discussion of Handy modules for building and maintaining sites, ended things off with a note encouraging everyone who manages to solve a Drupal problem to consider how they might contribute it to the wider community. Indeed, Basti made this the theme of his entire talk, discussing the benefits of open sourcing your work and the material advantages the IO team has experienced by open sourcing their platform, Lagoon. And in terms of open sourcing code, Stew’s talk on Paragraphs has already led to the creation of a brand new Drupal.org module from an internal Amazee project. Is this an example of upcycling, hmm, Josef?

Stew and Inky, showing their work.

We’re off the Island now, time to go farther.

Speaking of circles, in some respects the move in the Drupal community in the past few years has been to expand our circles even further into the wider programming communities. Drupal 8 adopted much “external” code from the supporting PHP communities. But to some extent, we’re moving even further away from the Drupal island than simply playing nicely with the PHP community. Decoupling Drupal, a major research topic right now, is at least in part about getting Drupal to be less monolithic, for it to serve content to systems and in contexts that aren’t necessarily Drupal specific. It’s no exaggeration to say that Amazee is ahead of the curve on this, as was evidenced by Michael's and Philipp's talks. Michael discussed the “implications, risks, and changes” that come from adopting a decoupled approach, while Philipp simply dazzled a packed room with his demonstration of staged decoupling with GraphQL integration into Twig.

Drupal Europe art installation

This was Drupal Europe.

This was Drupal Europe. Not just talks, or coffee, or BOFs, or the (delicious) lunches. Rather, it was the opportunity to really dive in, experience, and behold the interlocking circles of individuals, friends, companies, and community that holds this sprawling structure we call the Drupal ecosystem in place. To get a sense where we are and where we’re going.

Previous Drupal Europe Blogs

Sep 14 2018
Sep 14

Vijay tells us about the fourth day's highlights in Darmstadt, Germany.


The 4th day of Drupal Europe began with a panel discussion about the future of the open web and open source, with Dries Buytaert, Barb Palser, Heather Burns, DB Hurley (Mautic), and Timothy Lehnen. Some interesting points were made, especially about how we have the responsibility of making open source better, and how we can better protect the four software freedoms.

First session

Decoupled Drupal: Implications, risks and changes from a business perspective

Next up was our very own Michael, who gave a presentation on Decoupled Drupal. Some interesting points were made in this presentation. As a developer, I love the fact that we can experiment with technology; however, I never really gave a second thought to how this can have an impact, both for the company and potential clients. Decoupling brings both successes and failures that we are all going to experience: for example, the time it takes to train the team to stay up to date with the latest technology, and the costs that come with it. In the end, however, it is an investment. One clear message I took from this presentation was that we should expect failure, and we should not get discouraged by it but rather learn from it. We should also celebrate the successes.

JavaScript Modernisation Initiative

The third presentation I went to was the JavaScript Modernisation Initiative, presented by Lauri Eskola, Matthew Grill, Cristina Chumillas, Daniel Wehner, and Sally Young. As a contributor to this initiative, it was great to hear how the idea came about, as this was something I didn't really know. I learned that it all began at DrupalCon Vienna, where the idea came up of creating a decoupled backend with a redesigned, modern administration experience in Drupal. As of now, the product is clearly in the prototype stage, with plans to remove the current implementation of Material UI and update it using the design created by Cristina, which is in the early concept stages. If you would like to get involved in this initiative, you can find out more on the Drupal website.

Improving the Editor Experience: Paragraphs FTW

After lunch, it was time for Stew to give his second presentation of the week, this time on his own. His presentation was all about Paragraphs: a beginner's overview of using the module to make the editor's experience more fun. Stew went on to explain how to give more control over content layout, and the pros and cons of some of the contrib modules that support Paragraphs. Even though this presentation was about Paragraphs, Stew did mention that there are alternatives to this great module. Way to go Stew, two presentations in one week!

Stew's session

Decoupling Drupal with GraphQL & Twig

The final presentation I attended was by Philipp. He explained what GraphQL is and what it is not, and how much more it can do, such as Search API indexing and feeding Twig templates. One exciting part of this session was the reuse of fragments, meaning you can write one fragment and reuse it across many templates. It is clear to see why GraphQL is very popular; however, one interesting point that was brought up was whether this isn't the same as injecting SQL into Twig. Philipp responded by saying that a GraphQL query is not something that is executed directly; it is a definition of requirements, which you request from the implemented backend. Philipp also thanked Sebastian Siemssen, who happens to be both a core maintainer of the GraphQL module and an ex-Amazee.

Philipp's session


After the conference, we headed back to the hostel to refresh and then went out to eat for our final night in Darmstadt. After that, we headed back to the venue for trivia night. It was my first time at trivia night, and it was full of fun: great people, atmosphere, food and drink, and great questions. After six rounds of questions, lots of laughter, and a small hiccup with the Google doc, the scores were tallied, and team 16, which included Stew and Mostfa, had won first prize.


You could also say that Day 4 was pretty “Amazee-ing” with lots happening with our team. Congratulations to all from everyone at Amazee, both at the conference and those left behind.

I would also personally like to thank the Drupal Association for giving me a diversity ticket without which I would not have been able to attend this great conference and have a week of both excellent presentations and being able to continue to contribute to great initiatives.

Sep 13 2018
Sep 13

Mustapha tells us about the third day's highlights in Darmstadt, Germany, and some exciting announcements!

Drupal Europe 2018 - Wednesday 

The third day of Drupal Europe was a big day, we had the prenote and the Driesnote with some exciting announcements, the group photo, and a lot of interesting sessions.

The Prenote:

Our big day started at 8:15 with the prenote, which is very important because it shows you how awesome this community is. We sang together and laughed loudly at some "geek" jokes that would seem strange to others, but not to us, because we live those jokes every day. The prenote is important because it makes you feel that you're not alone; you have this whole family from around the world.



At every Drupal conference, Dries Buytaert, the leader of the Drupal project, shares updates on the current state of Drupal and makes some announcements on the way forward.


He first spoke about the Drupal 8.6 release, which includes some great content management improvements that can be discovered here. Then the announcement party started, and here are some of the highlights: 

- The adoption of React and JSON API to build a new decoupled administration UI.

- Drupal 9 will be released in 2020.

- Drupal 7 will reach end of life by 2021.

- Drupal 8 will also reach end of life by 2021, but it will be an easy upgrade to Drupal 9.

- Drupal.org <3 GitLab: drupal.org code will be moved to GitLab.

- There will be a DrupalCon next year, organized by the Drupal Association, and it will be held in Amsterdam.


After those exciting announcements, everybody went outside the Darmstadtium for the group photo, which was taken by our very own Josef Dabernig.

Speaking of Josef, he gave a great session entitled "Upgrading vs. Upcycling - How to stay ahead of the curve". It covered the life cycle of a Drupal project, how to audit your Drupal website, and which improvements you can propose to clients.


Last but not least, after such an exciting day, we went to do our Amazeeng "Team Dinner" and finished off our big day with lots of fun.

Team dinner

Thursday's Program: 

Thursday's speakers:

Sep 12 2018
Sep 12

Maita tells us about the second day's highlights in Darmstadt, Germany!

Tuesday, Day 2: Drupal Europe 2018.

From day one I was blown away by Darmstadt and seeing so many of the Drupal community all in one place. This is my first big Drupal event and I'm so glad to have this opportunity. 

One of the things I struggled with, was choosing which talks to attend. How do you know which one is most beneficial? I did my best to choose sessions I found interesting, and listening to Benjamin talking about Dynamic Virtual Reality apps with decoupled Drupal did not disappoint.

Drupal Europe

After the first session, we had a long lunch break and I can happily say that the food was amazing. We shared some Drupal stories over lunch and found that everybody has a different story. That's what makes the whole event and the Drupal community, so special.

Drupal Europe

After lunch, Tim and I attended Willy Wonka and the Secure Container Factory, where Dave Hall gave us great tips on how to make our containers more secure and how to shorten our build steps. We also got to indulge in a small chocolate treat. We have recently started using Docker, so I felt attending this session would shed more light on it, especially when there's a lot of chocolate involved.

Drupal Europe

The next session after lunch was Fran and Stew's, on choosing the right modules to install when building and maintaining Drupal websites. They gave a list of must-install modules, along with a detailed explanation of their benefits.

Handy modules when building and maintaining your site 

Drupal Europe

Another session that happened around the same time was Basti's. He talked about the benefits of open sourcing code. He mentioned how last year they (amazee.io) open-sourced all the code that runs in their production environment for everyone to see and contribute to, how it made a huge difference for their team, and how it sped up development. His main point was to encourage people to be open to the idea of open sourcing their code.


The last session of the day was from Inky. She gave a very intimate talk on her journey to becoming the Head of Operations for the Amazee Labs Global Maintenance team. She talked about the good, the bad and the ugly and how all that made her the most suitable person for the job.

Drupal Europe

There were a lot of other happenings throughout the day that I, unfortunately, didn't manage to attend. Next time I am going to get myself a time machine so I can be in two places at once and not miss out on anything. Until then, I'm grateful that each session is recorded. See you tomorrow, Drupal Europe!

Wednesday's Program: 

Wednesday's speakers

Sep 11 2018
Sep 11

Stew gives an account of the first day's events in Darmstadt, Germany: team bonding, furniture assembling, and community.

Monday, Day 1: Drupal Europe 2018.

Picturesque buildings from history surrounded me, as I looked out the window of our hotel room at Hessenpark Open Air Museum, just north of Frankfurt. The Global Maintenance team workshop weekend was about to draw to a close and a week of Drupal Europe was about to begin. This would be my first international Drupal event, ever!

Hessenpark Museum

Our morning started with a delicious continental breakfast and strong coffee. The team went for a productive 'walk and talk', discussing ideas about our international and growing team, multiple projects, and how we will manage maintenance and new features.

Hessenpark Natural amphitheatre

From there, we all took off for Darmstadt, saying goodbye to Kathryn and Kristy (who were headed back to Austin, Texas) and Jason and Ltisch (heading to Zurich).

Walking into the enormous conference venue was a treat, seeing all the sponsors and companies putting up their stands and displays. I even got to see Dries walk by. I didn't think I'd be such a fanboy, but my smile was brimming. While Tuesday is the official starting day, there were several workshops already occurring when we arrived.

Darmstadt stadium

We constructed our Amazee lounge, and it almost became a team-building exercise. We got to use the trusty tools we brought all the way from South Africa to open the boxes, but found that, well, everything was included. Of course, reading the IKEA manual helps...

The Amazee Global Maintenance team setting up the Lounge

After that, we got served a delicious lunch at the venue and then caught up with some work and conference preparations.

Amazee Lounge

The end result: the Amazee lounge couch.

We found our DJH Youth Hostel and got checked in, only to quickly go out again to share a Thai dinner which was very, very good.

As I complete this blog I am sitting next to Amazees from around the world: Mustapha (Tunisia), Michi (USA), and Basti (Switzerland). It's a great feeling when the team gets together from around the globe and I can't wait to see what tomorrow will bring. Check out some Amazee sessions and stop by our lounge to kick back and relax.

Tuesday's Program: 

Tuesday's Speakers:

Aug 31 2018
Aug 31

Agile Lean Europe is an unconference event that visits a different country each year and brings together people from across the continent. Agile practitioners and thought leaders come together for a 3-day event to share ideas around the topic of power transformation.

Being a regular DrupalCamp attendee, I was excited to visit a more Agile-focused conference. Agile Lean Europe uses Open Space Technology to facilitate an environment where ideas are created and shared in real-time, and which allows everyone to contribute to the program. Steve Holyer, a familiar face at Amazee, facilitated or "held" the open space.

The Kraftwerk Innovation Space that hosted the event served as an excellent environment for up to a dozen parallel sessions. In contrast to my usual conference experience, the unconference format was much more interactive in the way the sessions were presented. Let me walk you through a few of my highlights.

Agile Lean Europe Crowd

Manuel Küblböck from Gini did a session about “Decision-making when there are no bosses”. His view is that efficiency can be gained by focusing on consent over consensus. Rather than trying to have all parties agree on a proposed change, you emphasise working out strong objections and then test the idea. Decisions are always made by those who carry them out and people who are impacted can give input. Check Manuel’s tweet for further information.

Momchil Brashnayanov from FFV presented a session on agile for service agencies, which was a great opportunity to discuss common challenges such as how best to integrate customers into our workflows, how to focus on value instead of features, and what contract frameworks best support agile collaboration. Like a couple of other sessions, Momchil applied the lean coffee format, where we all gathered topics, clustered them, and then iteratively discussed them, deciding when to move on to the next topic.

Agile Lean Europe 2018

Peter Stevens shared his insights and tools around personal agility. It was refreshing to see agile principles being applied to one's personal life. As in a project, life goals get mapped to tasks, prioritized, and moved along a kanban board. More information can be found on the related website.

Nicola-Marie O'Donovan discussed her experience working in a scaled agile environment and enabling teams to plan a roadmap together socially. Each team defines their vision and team projects upfront, and then everyone gathers to discuss the dependencies they have across teams. The teams then use the space to resolve all dependencies and get ready for implementation. She has also written a Medium article on the subject.

Agile Lean Europe 2018

I particularly enjoyed the Clean Language session by Olaf Lewitz. In a very interactive way, we experienced how different and difficult it can be to apply such a tool. Clean Language enables you to have a conversation while injecting as little of your own views into the questions as possible, focusing fully on your conversation partner. In other words, instead of reflecting things back, the focus is on getting a better understanding of the topic from the perspective of your interview partner. More information is available.

In addition to the talks mentioned above, ALE18 had a lot more to offer. The keynote “Going where no one has gone before” on Yle’s agile transition by Mirette Kangas and Antti Kirjavainen from Finland, was full of inspiration and good advice, and so too was the [email protected] session by Paolo Sammicheli.

Agile Lean Europe Participant

I particularly appreciated the side program that was put together. Conference attendees' partners and kids organized a city program, and random attendees were able to connect in the evenings by signing up for "Dinner with a Stranger".

The 3-day conference left me with tons of insights. A few notable ones: tools like Powerful Questions or Minimum Viable Bureaucracy, models such as the Satir Change Curve, and books such as The Preservation of the Agile Heart.

Agile Lean Europe 2018

Interested in more visuals from the conference? The ALE18 Official Album contains a good number of impressions. Also, check out the #agilekaleidoscope by Gaël Mareau and my album on Flickr.

Agile Lean Europe 2018 was all about power transformation. One key insight for me was that Agile is less about following a strict process, and more about you being able to adapt a variety of tools to your current challenges and requirements.

Thank you to everyone who participated and contributed and for making me feel transformed. Special thanks to the organizing team and sponsors who supported the event!

Curious about the next one? Follow Agile Lean Europe on Twitter.

Aug 30 2018
Aug 30

Drupal Europe is coming up in Darmstadt September 10-14. Let me walk you through my favorite aspects of the conference.

This large-scale conference is entirely community-organized by a very dedicated team of volunteers. Drupal Europe will feature 187 sessions across a wide variety of topics. Diversity was also a goal the program team worked toward: 30% of the sessions have a speaker who identifies as part of an underrepresented group.

The organizers decided to give an interesting twist to the program. In contrast to traditional categories like “Development”, “Design” or “Project Management”, industry tracks are in focus. Drupal Europe, therefore, should be much more appealing to the decision makers who want to evaluate or share their experience using Drupal.

Publishing + Media focuses on the media industry and what Drupal can bring to the table there. Expect case studies around Burda's Thunder distribution and content-editing related sessions. I'm looking forward to seeing Building high-performance Thunder sites by former colleague Wolfgang Ziegler.

From Amazee our colleagues Stew West and Fran Garcia will talk about Handy modules when building and maintaining your site as well as Improving the Editor Experience: Paragraphs FTW.

Drupal Europe 2018

Digital Transformation + Enterprise is all about large-scale clients and projects leveraging Drupal. Thursday's keynote brings together Dries Buytaert, the original creator and project lead of Drupal, Barb Palers, Global Product Partnerships at Google, and Leslie Hawthorn, Developer Strategy at Red Hat, to talk about the future of the open web and open source. Along with many other sessions in this track, I will share my approach in Upgrading vs. Upcycling - How Existing Web Sites Stay Ahead of the Curve for those interested in agile, architecture, and web strategy.

Drupal + Technology is the classic DrupalCon-style track, focusing on Drupal technology topics that don't fit into one of the industry tracks. Amongst many others, I am looking forward to seeing Out of the Box is Out of the Box! to get an initiative update from Marc Conroy, Keith Jay, and Elliot Ward. Join Amazee's Philipp Melab for his session on Decoupling Drupal with GraphQL & Twig, where he will dive into APIs, GraphQL, theming, and atomic design.

Drupal Community discusses everything that keeps our ecosystem running. I am particularly excited to join the two-hour workshop by Shyamala Rajaram, Parth Gohil, and Donna Benjamin on Building Local Communities - foster Drupal adoption. From Amazee, make sure to see Ingrid Talbot in Learning to lead, after a life of going solo, for some inspiring lessons on her transition into leadership, as well as Bastian Widmer, who will explain how to open source your daily work in Opensource your daily work - Docker, Drupal, Open Source.

Agency + Business touches on all the advice we need and want to share about running a Drupal business. How to boost your team members' performance by Marina Paych sounds like a very promising session exploring talent development and team culture. I'm humbled to host a panel discussion with Nick Veenhof, Ela Meier, Suzanne Dergacheva, Andre Baumeier, Steve Parks, and Ashraf Abed. Together we'll discuss Hiring Drupal Talent - A Recruiter's Dilemma Panel.

Drupal Europe 2018

Make sure to check out all the other industry tracks on the program page. The Open Web Lounge also serves as a great opportunity to connect with like-minded open source communities, thanks to partners such as CMS Garden, Open Source Initiative, WordPress, Joomla, Contenido, and TYPO3. Got something to share?

Remember to schedule an informal gathering (BoF) or sign up for Contribution, mostly focused on Monday and Friday, where you can experience the real power of open source: being part of it.

We are looking forward to seeing you in Darmstadt. Enjoy the conference!

Aug 07 2018
Aug 07

With GDPR in full effect, sanitization of user data is a fairly hot topic. Here at Amazee we take our clients' (and our clients' clients') privacy seriously, so we have been investigating several possible approaches to anonymizing data.

In the Drupal world, and the PHP world more generally, there are several options available. Here, though, I’d like to discuss one we think is particularly cool.

At Amazee Labs’ Global Maintenance, we work with several different projects per day. We move data from our production to staging and dev servers, and from our servers to our local development environments. Especially on legacy systems, site-specific configuration details often exist only in the databases, and even if that weren’t the case, the issues we’re investigating routinely require that we dig into the database as it (more or less) is on the production servers. Anonymization is crucial for our day to day work.

So our considerations here are, how do we balance productivity while keeping things anonymous?

One way of achieving this is to make anonymization transparent to the developer. Essentially, we want our developers to be able to pull down the live database as it exists at the moment they pull it down, and have it be anonymized.

How can we achieve this?

Well, one way is to analyse the daily workflow to see if there are any points through which the data has to flow before it reaches the developer.

It turns out that, if you're working with MySQL, this "final common path" that the data flows through is the mysqldump utility.

If you’re running backups, chances are you’re using mysqldump.

If you're doing a drush sql-sync, there's a call to mysqldump right at the heart of that process.

Mysqldump is everywhere.

The question is, though, how do we anonymize data using mysqldump?

The standard mysqldump binary doesn’t support anonymization of data, and short of writing some kind of plugin, this is a non-starter.

Fortunately for us, Axel Rutz came up with an elegant solution: a drop-in replacement for the mysqldump binary, which he called gdpr-dump. A few of us here at Amazee loved what he was doing and started chipping in.

The central idea is to replace the standard mysqldump with gdpr-dump so that any time the former is called, the latter is called instead.

Once the mysqldump call has been hijacked, so to speak, the first order of business is to make sure that we are actually able to dump the database as expected.

This is where mysqldump-php comes in. It’s the library on which the entire gdpr-dump project is based. It provides a pure PHP implementation of mysqldump as a set of classes. On its own, it simply dumps the database, just as the native mysqldump cli tool does.

A great starting point, but it only gets us part of the way.

What we've added is the ability to describe which tables and columns in the database being dumped you would like to anonymize. If, for instance, you have a table of user data with names, emails, telephone numbers, etc., you can describe the structure of this table to gdpr-dump, and it will generate fake, but realistic-looking, data using the Faker library.

This requires some upfront work, mapping the tables and columns, but once it is done you’re able to call mysqldump in virtually any context, and it will produce an anonymized version of your database.
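To make the idea concrete, here is a minimal sketch of that mapping concept: a per-table map from columns to fake-value generators, applied to each row as it is dumped. This is purely illustrative; gdpr-dump itself is PHP and its actual configuration format differs, and the table and column names below are hypothetical.

```python
import random

# Hypothetical anonymization map: table -> column -> generator.
# gdpr-dump's real configuration format differs; this only
# illustrates the row-rewriting concept.
ANONYMIZE = {
    "users": {
        "name": lambda: f"user{random.randint(1000, 9999)}",
        "mail": lambda: f"user{random.randint(1000, 9999)}@example.com",
    },
}

def anonymize_row(table, row):
    """Return a copy of `row` with mapped columns replaced by fake values.

    Columns (and tables) without an entry in the map pass through untouched,
    so non-personal data survives the dump unchanged.
    """
    fakers = ANONYMIZE.get(table, {})
    return {col: fakers[col]() if col in fakers else val
            for col, val in row.items()}
```

In gdpr-dump itself, the same idea is expressed in PHP on top of mysqldump-php, with the Faker library supplying the realistic-looking replacement values.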

There is still a lot of thinking and work to be done, but we think it’s worth investing time in this approach. The fact that it can be used transparently is its most compelling aspect - being able to simply swap out mysqldump with gdpr-dump and have the anonymization work without having to change any of the dependent processes.

If any of this piques your interest and you’re looking for more details about how you might be able to use gdpr-dump in your own workflow, feel free to check out the project (and submit PRs): https://github.com/machbarmacher/gdpr-dump.

Jul 24 2018
Jul 24

We're very excited to share some details with you about our next Amazee Webinar where we'll discuss atomic design in Drupal with GraphQL & Twig.

The notion of atomic design systems made its way to Drupal quite some time ago. Tools like Pattern Lab and Fractal are part of our everyday workflow, but their integration with Drupal's theme system still offers challenges.

Our speaker, Philipp Melab, will take the lead and make sure we take an in-depth look at this. In this webinar, we will build upon the first chapter and leverage the power of GraphQL to build a clearly structured and truly decoupled component library. 

Date: Friday, 24 August 2018

Time: 4-5pm CEST

Below is a screencast of our previous webinar to give you a good understanding of what to expect in our second edition at the end of August.

[embedded content]

Jul 24 2018
Jul 24

We're very excited to share some details with you about our next Amazee Webinar where we'll discuss Atomic Design in Drupal with GraphQL & Twig.

The notion of atomic design systems made its way to Drupal quite some time ago. Tools like Pattern Lab and Fractal are part of our everyday workflow, but their integration with Drupal's theme system still offers challenges.

Our speaker, Philipp Melab, will take the lead and make sure we take an in-depth look at this. In this webinar, we will build upon the first chapter and leverage the power of GraphQL to build a clearly structured and truly decoupled component library. 

Date: Friday, 28 September 2018

Time: 4-5pm CEST

Sign-up: You can register here and join us for the discussion.

Below is a screencast of our previous webinar to give you a good understanding of what to expect in our second edition at the end of September.

[embedded content]

Jul 19 2018
Jul 19

Day three

Today, my friends, we’re going to Change the World...

Rachel Lawson presented day three's keynote. It was a really good session, as it showed how everyone who attended has contributed in some way to Drupal, as well as how "Drupal changes the world". It started by "Meeting Sami", a 10-year-old boy from Mosul, Iraq, who was captured (along with his brother) by ISIS. He was held captive for three and a half years, after which he was sent to a refugee camp. While in the camp, it was the War Child charity that provided support, activities, and education, and, most importantly, ended up reuniting Sami and his brother with their family.

Now, you're probably wondering what any of this has to do with Drupal? I know, I did too, but it became apparent that War Child recently switched to using Drupal, making use of several modules. Rachel asked the audience to stand up if they had made a contribution to modules used by War Child, including Paragraphs and Media. Almost half the room did, but I didn't. She then went on to ask about other contributions that people in the audience had made. This time, it related to anything from documentation, to hosting meetups, and even attending camps.

By the end of the session, everyone in the room was standing, including me. It felt good to know that I had contributed in some way. During the question and answer session, the issue of becoming a member of the Drupal Association was raised, as well as the importance of doing so. Membership empowers the Drupal community to be able to do more things that are requested by users, which in turn makes a transformational difference.

Rachel Lawson presenting her session

“If you don’t push yourself and just go with things, then you’ll never get the amazing things.” - Rachel Lawson

Watch session

Drupal 9: Decoupled by design?

Both Preston So and Lauri Eskola gave a session on decoupling Drupal, as well as the direction in which it is going. Anyone who has been working with Drupal will know that the idea of decoupling Drupal has been around for some time. Among the reasons for doing this is that developers are free to choose any technology they want for the frontend. It's clear that Drupal 9 will continue to use Twig, but with support for client-side rendering with an API-first approach. Another point was that editors prefer the non-decoupled approach, which raises the questions: "Who is requesting this? Is it the clients or the developers?"

Watch session

The future of Drupal in numbers

One of the most interesting and debatable sessions I attended was presented by Nemanja Drobnjak. Similar to the first keynote session, this session compared Drupal from 18 months ago with its current state. The presentation could be perceived as very pessimistic, especially when seeing the numbers compared to other major CMSs like WordPress. He also referred to the compare PHP frameworks blog.

All the data in the presentation had clearly been researched, so it was rather shocking to hear Nemanja predict that Drupal could fall out of use within 15 years if the current trends continue. A few suggestions to prevent this were made, from improving documentation to Drupal directly targeting the education sector. This session drew a lot of questions. Firstly, "Why compare Drupal to WordPress?" I agree completely; it's about who is using it and benefiting from it. It reminded me of a blog post I read about Vue.js passing React.js in the number of people who have starred it on GitHub. Basically, it doesn't mean that React is dying and Vue is now the norm. Both have different purposes and uses, just like, for example, Drupal and WordPress.

Another question raised was, with decoupled sites becoming more popular, "Can a crawler detect the backend?" Maybe the data wasn't 100% correct.

Day four

An update on Drupal 8.6

The day four keynote session was presented by Gábor Hojtsy, who gave a short speech about the upcoming Drupal update. He then moved on to how we could help with several initiatives, both at Drupal Dev Days and in general, including helping with the Admin UI and documentation.

Watch session

Contribute, contribute, contribute! Yes!!!

Having put my Windows issues on the back burner, it was time to get the Admin UI demo to work. I went over to the Admin UI initiative table, where I met Lauri Eskola, Daniel Wehner, and Volker Killesreiter, all of whom helped me get the site working. It turned out to be an outdated module, so I updated the module, created a pull request, and boom, my first ever contribution to Drupal was made. I then spent the rest of the day looking at the code and getting to grips with how it worked.

I was then assigned my first issue, which took some time to complete as I was still getting used to the code base. But nonetheless, I was able to fix the issue and contribute some more to the initiative. I really like how everything is broken into small issues, meaning that a single person isn't completing a large issue by themselves. It is clear that Drupal can only be maintained if people contribute back to the project and/or community.

It is never too late to contribute! Even though Drupal has been around for almost 20 years, it still relies heavily on people contributing and coming up with innovative ideas. If you are looking to contribute but don't know how, I suggest you take a look at the Drupal development and strategic initiatives.

Having heard the word "contribute" several times, it would have been great to hear someone repeatedly chant the word, as Steve Ballmer famously did with "developers".

Day five

Quo Vadis, Free Software?

The final keynote session, by Rui Seabra, was about free software. He shared thoughts on how we should have the freedom to run software as we wish, make changes to the software to fit our purposes, and distribute both the original and modified versions. It was clear that, as users of so-called "free software", we have misconceptions about what we think is free. Rui also went on to talk about how we can help protect the internet, especially from the EU's copyright directive. I did find the "[fill in] sucks" joke referencing Windows very amusing.

Rui Seabra presenting his session

Free software is everywhere, and people are forgetting that the freedom of sharing is a quintessential part of evolution and of moving forward together. "If we didn't share, we wouldn't have the knowledge, technology, and hardware we use today." - Rui Seabra

Watch session

Progressive decoupling - the why and the how

The final session I attended was my colleague Blazej Owczarczyk's talk, where he explained everything about progressive decoupling. One of his key points was that you should only decouple where it makes sense. Blazej showed some cool and interesting new features available in ECMAScript 6/7. We also learnt about async/await in ECMAScript 8, which I found to be very cool and cannot wait to start using. It was then time to move on and discuss how we could use these new features in our current Drupal sites.

By installing dependencies, defining a dynamic library, and running a web server, you are able to create a decoupled environment for any technology of your choice. Two things I really liked about the session were: 1) Blazej asking the audience to tweet a thanks to our very own Philipp Melab for the GraphQL module, and 2) the bonus question, which resulted in more questions from the audience. Way to go Blazej, we're very proud of you here at Amazee Labs.

Blazej Owczarczyk presenting his session

Watch session

The rest of the day I spent contributing more to the Admin UI initiative.

Many thanks

I would like to take this opportunity to thank:

Ruben Teijeiro for being so helpful throughout the week and introducing me to several people.

Christophe Jossart for not only helping me with my installation issue but for being great company and showing me around Lisbon.

Lauri Eskola, Daniel Wehner, and Volker Killesreiter for the introduction to Admin UI, which helped me find the issue as to why I couldn’t set up the site on my machine and finally allowing me to help contribute to the great initiative.

Finally, to all the sponsors, speakers, organisers, and volunteers, a huge thank you for a spectacular week, great evening social events, and for making my first ever Dev Days an amazing one. I hope to see you all at the next one.


Jul 13 2018
Jul 13

This year’s Drupal Dev Days took place in sunny (well, for the most part) Lisbon, Portugal. Over 400 people attended this year’s Dev Days, and I was one of them. I am fairly new to Drupal, and this was my first conference dedicated to Drupal. This was a week-long event, something that was unusual to me as I am used to attending one or two-day events.

Day one

Day one was all about Contribution at ISCTE. Let’s just say my first day didn’t get off to a great start. It took me over 30 minutes to find the entrance to the University, where the event was taking place, and I wasn’t the only one.

It also didn't help that it was raining, but luckily I had my umbrella (being British, I never leave the house without it). Once I found the place, I realised I'd left my adapter at home, so I headed out to get one. Luckily, there was a shop nearby. Purchase in hand, I headed back to help contribute to some of the Drupal initiatives.

There, I met Ruben Teijeiro, who introduced me to several people and showed me how to get started. I really wanted to contribute to the frontend space, especially the new admin UI. Setting up wasn't the easiest (which I expected), as I use Windows for all my development. There I also met Christophe Jossart, a long-time contributor to Drupal, who tried to help me set up the site on my machine, but to no avail.

By the end of the day, with the help of installing WSL on my machine, I had managed to install the site and get it up and running, only to be met with several errors, at which point I called it a day. It might sound like all doom and gloom, but I made a lot of progress throughout the day.

Day two


The first keynote session of the week was given by Bojan Zivanovic. He spoke about the evolution of Drupal (version 7 to 8) and how Drupal has made several changes to Core using modern practices, making it a much better framework.

One interesting part was the adoption of Drupal 8, which was up (40%), against the loss of Drupal 7 sites (11%); taken together, however, Drupal actually lost over 35,000 sites in the last year. There were several possible reasons for this, including the time spent installing Drupal and setting up the development environment, especially for those on Windows (like me :sigh:). It relies on the community to help improve what is already there, to make it easier for both new and current users.

A boat sinking in the sea

“Sometimes our Drupal websites end up looking like this.” - Bojan Zivanovic (in reference to the above image).

Watch session

Overview of GDPR modules for Drupal

Another session I attended was a comparison of the top GDPR modules and how you can make your Drupal site GDPR compliant.

The talk covered many aspects - from rights of the user to form checking and security. It was apparent that there are several modules that help do this, so having a short list of the best ones made it easier. It also became apparent that to become compliant, you require more than one module.

One thing that came out of this session, which I totally agree with, was that site security was often neglected in the past but is now at the forefront of all sites.

Watch session

How to delight content editors with UX when building Drupal 8 websites

This session, given by Chandeep Khosa, was the last one I attended and was the highlight of the day for me. Adding features like the Admin Toolbar, an admin theme to make things look nice, or even some help text may sound rudimentary, but how many of us actually do so? (No, really?) Most content editors are not used to the Drupal admin the way developers are, so it was nice to hear what we can do to make it easier for them.

One specific module I found very interesting was the Tour module, which provides guided tours of the site interface via tooltips and is available in core today. I didn’t even know it existed. One thing I took away from this: if you don’t need it, hide it. Why show users something that doesn’t get used?
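As a rough illustration, a tour is defined in a configuration file named tour.tour.*.yml; a minimal sketch might look like this (the id, route, and tip text below are hypothetical, and the exact schema may differ between core versions):

```yaml
# tour.tour.example.yml — hypothetical tour with a single tooltip.
id: example
label: 'Example tour'
module: mymodule
routes:
  - route_name: entity.node.canonical
tips:
  introduction:
    id: introduction
    plugin: text
    label: 'The node page'
    body: 'This tooltip greets editors when they start the tour.'
    weight: 1
```

Once imported, the tour appears as a “Tour” button in the toolbar on the matching route.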

Watch session

Part 2 of my Drupal Dev Days Lisbon 2018 Recap will follow soon. Thanks for reading.

Jul 05 2018
Jul 05

We will host the next Drupal Meetup at our Amazee Labs offices in Zurich on 11 July.

We'll focus our discussions on progressive decoupling, GraphQL, and Drupal.

So, if these topics interest you make sure to join us for an evening of great talks and collaboration.

We hope to see you there!

Date: Wednesday, 11 July 2018

Time: 6:30 PM - 9:00 PM

Venue: Amazee Labs, Förrlibuckstrasse 30, Zürich

Jun 27 2018
Jun 27

Join us for Drupal Dev Days in Lisbon! 

If Drupal development is your thing, then the upcoming Drupal Developer Days in Lisbon is the place to be.

The programme promises to keep the conversations going with code sprints, workshops, sessions and BoFs.

Amazee Labs is proud to be a Gold Sponsor and we look forward to catching up with you during, in-between and after the event.

See you there!

Dates: 2-6 July 2018

Venue: ISCTE-IUL University

If you want to know more about what's happening when, you can view the full programme here.

Jun 21 2018
Jun 21

Drupal Europe will be taking place soon – different organization, different structure, same great community. There are a few things you need to know about this year’s edition of Europe’s largest Drupal conference, so let me tell you about them. But before I start, your case studies, as always, are more than welcome!

DrupalCon Europe or Drupal Europe?

The first thing you might have noticed is that the event is called Drupal Europe and not DrupalCon Europe as in previous events. In order to find a sustainable approach to the conference in Europe, DrupalCon decided to take a year off in 2018 in order to come back stronger than ever, so the community decided to weigh in and organize this year’s event - Drupal Europe. This is the best possible example of one of the Drupal community mottos, “Come for the code, stay for the community”.


Main changes

Did I just mention that the event is fully organised by the community? Ah, yes, so here are some other changes:

  • Industry verticals - The program and session selection process will focus on real-life scenarios and industries and how Drupal is linked to those. These might be familiar as they’re likely to be the projects you work on on a daily basis, so you’ll be able to share insights into how you or your company approached one of these projects. Here is a list of the verticals:
  • Expertise topics - Not familiar with the verticals? Not to worry, we still have an easy way for you to find out what your interests are (back-end, front-end, site building…). All sessions are tagged with both industry verticals and expertise topics.

Additional time for contribution - this will be a constant topic throughout the event. There will be mentors and self-organised contribution teams, so make sure you come prepared if you want to contribute and if you’d like to volunteer as a mentor then register here as well!

Date and place

The event will take place from 10 to 14 September in Darmstadt, Germany, which is just 20 minutes from the Frankfurt airport. It’ll be a fun week, so clear your calendar! There will be plenty of opportunities to attend talks, mentoring sessions, sprints, etc.

The week’s schedule is as follows:

  • Monday - Contribution with some mentors
  • Tuesday to Thursday - Sessions, panels, workshops, BoFs and contribution lounge
  • Friday - Traditional mentored contribution and self-organized contribution spaces

Get involved

Just attending the conference is a great experience, but there are more ways to get involved:

  • Volunteering - Help with the organization of the event, find out more here.
  • Submit a session - Submitting and preparing a session is a great way to give back to the community. Sharing your knowledge and experience with others is what makes these conferences so great. So, make sure to submit your sessions here.

Josef, our Agile Consultant, is part of the conference program committee, so if you see him during that week, make sure you give him kudos for putting his energy, as well as that of all the other volunteers, into the event.

A number of Amazees are already working hard on submitting sessions, ranging from client case studies to more technical talks. We love these gatherings and want to contribute as much as possible, so you’ll be sure to see a lot of us there in that week.


So here is your to-do list:

  • Sign-up for Drupal Europe (if you haven’t already).
  • Submit your session(s); you have until the end of this month to do so.
  • Enjoy!
Jun 19 2018
Jun 19

Only a month has passed since DrupalCamp Transylvania, and already another Drupal Camp has come and gone in Romania. This time it was Drupal HackCamp, organised in the Romanian capital, Bucharest. It was a Drupal Camp with a very specific theme: Security.


Throughout the sessions presented at the Camp, one was able to find out what security issues Drupal had experienced in the past, how the Drupal Security team, as well as the Community in general, had dealt with them, what Drupal did to improve the security of the platforms that were developed using the CMS and what can (and should) be done to have a more secure application.

Since I first heard of it, a Camp focused on Drupal security sounded really interesting to me. This is the type of camp every Drupal developer should attend at least once in their career - actually, any web developer for that matter. As we know, security is a very important topic on the web. Even for experienced developers, some things can be very tricky, as an application's security does not depend only on the code. It also depends on how the web server is configured and on what third-party libraries your code depends on - including the libraries you use in development, whether they pack or bundle your code or touch it in any other way.

One of the sessions which focused on how Drupal improved its security with each new version, was Peter Wolanin's - 10 Ways Drupal 8 Is More Secure.

10 Ways Drupal is More Secure

In this session, Peter Wolanin first gave a brief introduction to the OWASP Top 10, a list of the top 10 critical security risks affecting web applications. This is not Drupal-specific; it applies to any application accessible via the web. Next, he pointed out 10 things Drupal 8 implemented that help developers avoid those risks. Among the points he mentioned were the autoescaping implemented in Twig (everything output by Twig is now escaped by default), the automatic CSRF tokens in route definitions (making it easier to create links which are valid only for the current user session), the removal of the PHP input filter (which was very dangerous if misused), and the enforcement of trusted host patterns for requests (so that your application responds only when requested via a host you actually trust).
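For instance, the automatic CSRF protection mentioned above is enabled per route in a module's routing.yml file (the module, path, and controller names below are hypothetical):

```yaml
# mymodule.routing.yml — hypothetical route protected by a CSRF token.
mymodule.toggle:
  path: '/mymodule/{node}/toggle'
  defaults:
    _controller: '\Drupal\mymodule\Controller\ToggleController::toggle'
  requirements:
    _permission: 'administer nodes'
    # Drupal 8 appends a session-bound token to links to this route
    # and validates it on each request.
    _csrf_token: 'TRUE'
```

Any link to this route generated without a valid token for the current session is rejected with an access denied response.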

As previously mentioned, having a secure app doesn't guarantee that your Drupal is secure. Nowadays, there is a growing interest in decoupled apps. This means you have a backend which is usually used for content management only (which can be a Drupal site) and a frontend, which is a modern JS application, optionally implemented using a framework like React, Vue.js, and so on. But then you also need to use npm for installing the additional JS libraries you need, webpack for creating the JavaScript bundles for your app, and Babel for transpiling your JavaScript code. Suddenly you start to introduce a ton of other dependencies, each of which depends on a lot of other packages. Alexandru Badiu gave a presentation called “JS and Security”, which covered some of those aspects.

JS and security

So, you do the best you can to write secure code, try to evaluate the dependencies of your project, and make sure that they don't introduce critical security issues, but is that enough? There could still be several security issues you’re unaware of, which will only be discovered while the application is in use. It would be awesome if we were able to do something to proactively protect ourselves against common security risks.

Bastian Widmer (@dasrecht) presented a talk on this subject, entitled “How Open Source will help you to survive the next Drupalgeddon”, where he showed us a few tips that we can use in advance, in order to respond to potential security issues in future. Besides ensuring you do regular updates for all your app’s dependencies, you could also take some measures at the web server level. For example, only allow index.php to be executed, use a web application firewall or make sure that your operating system is configured properly.

How open source will help you to survive the next Drupalgeddon

Of course, at a Camp focusing on security, there had to be a session about the last Drupalgeddon(s). The event’s keynote was given by Jasper Mattsson, who actually discovered Drupalgeddon 2. He shared some tips with us on how to find security breaches. He said that there is no secret 'recipe' for that, but a good starting point is to look for functions which output data, which can do multiple things depending on how they are invoked (in which context or with which parameters), or which can trigger code execution.

Finding Drupalgeddon

There is one very important thing to keep in mind if you discover a security breach: do not post it in the regular Drupal issue queue. Instead, follow the instructions on how to report a security issue when you find one. Reporting a security issue in the regular Drupal issue queue can be very dangerous, as attackers would then have plenty of time to craft an attack before the issue is fixed.

Being in a city with such a rich history, we could certainly not miss the walking tour that the organisers had prepared for us on the Saturday afternoon. During the tour, we saw Bucharest’s most iconic buildings, which have survived all the great historical periods over the last 200 years - the monarchy, two world wars, communism and now democracy.

Atheneul roman

Old Church

Old Monastery

Drupal HackCamp Bucharest was a really great event, and I hope it takes place next year. It is of great value to all web developers, especially those at the beginning of their careers, as it prepares them for the dangers of the wild world wide web and equips them with the required knowledge to guard against any that may pop up along the way.

Jun 18 2018
Jun 18

I’ve been running events since college, for work and for fun, and for groups of 3 to 3,000. You’d think there’d be a difference, but the amount of energy it takes to run an event, surprisingly, is the same. It’s crazy how well these things scale.

Regardless of size, an event planner goes through a very predictable flow from event conception to event end.

We started planning Texas Camp in September of 2017. Knowing we were going to organize the event again for 2018, we scrambled to finalize the venue and update the sticker. By the time BADCamp rolled around, we had shiny new Texas Camp stickers to distribute at the nation’s largest gathering of Drupal people - all potential camp attendees.  

Because we knew when companies do their budget planning, we were ready with a brand new sponsor prospectus by December. By the second week, a cheerful call to sponsor was in many Drupal company inboxes.  

We worked to get the website launched in January, so attendees could plan ahead and to get everyone excited. Let me tell you this - when building a spankin’ new React + Drupal site, plan for extra time.

By the time we did launch in February, we had missed a few big camps, but still had plenty of time to get the word out on the call for sessions.

From February to April, we worked hard to get the word out about all the different ways people could get involved with camp. Sponsorship, speaking, volunteering, or simply just attending. Early-bird tickets were on sale and the sessions submissions were trickling in.

Texas Camp organizers attended DrupalCon Nashville and spread the good word of Texas Camp to anyone who would spare a few minutes. Those who promised to submit sessions were gifted a Texas Camp sticker, along with lavish promises of fame and glory.

Because we want Texas Camp to be known as an inclusive camp, we reached out to different groups, including the Drupal Diversity and Inclusion group, to help get the word out to a broader, and more diverse, audience. I’d like to think our efforts here helped us pick up more diverse speakers than we might have gotten through our usual channels.

At the end of April, the craziness began. Although I am a seasoned session selection overseer, this was my first time actively participating in the selection as part of a team. It’s not an easy task, if only for the length of time it takes to read all the sessions!

We had a few mandates: no repeat speakers, diverse topics, variety in experience levels, and, oh yeah, the selection was done fully blind to the presenter. All personally identifiable information (pronouns, speaker names, company names, etc.) was painstakingly struck from the submission pile.

At the end of the two-week selection process, the team gathered and made the final selection. Some speakers had multiple sessions ranked high enough to make the cut; in those cases, one of the two - either the weaker session or the one with the most topical conflict with other highly ranked sessions - was made a backup.

After session selection, things started moving really fast. We had one week to confirm speakers and another week to make a schedule. Once that newsletter went out announcing the final schedule, the official countdown to Texas Camp had begun.

Week 4: Guess what you’ll need and order everything. This gives you enough time to re-order if anything goes wrong. It’s too early for real attendance numbers, so any amount you order is the best guess.

Week 3: Things will start to arrive. Your office will be filled with an insane number of soda flats and bizarre equipment. We had a silver 4-foot metal trough we had to explain on a few client calls. Speakers will begin canceling. New sponsors will appear out of the woodwork - which is a GREAT thing. Last minute sponsors allowed us to blow the budget on breakfast tacos!

Week 2: You’ve printed everything you can think to print and pray the sizes match and the colors turn out right. The final “Texas Camp is next week!” notice has gone out to attendees. Speakers are thoroughly annoyed at the number of reminders to RSVP we’ve sent.

Week 1: The blessed “eye of the storm”. The week before the event. It’s too late to do anything meaningful. All you can do is hope you’ve done enough ahead of time and remembered everything. Especially if the week of ends in a 3-day weekend for Memorial Day. An unexplained spike in registrations. It looks like we’ll hit 150!

The week of: It’s time for final inventory audits, calling and confirming with all the venues and updating catering counts with vendors. Always add more vegan meals than you have data for! Rally the organizing team and caravan the soda flats and registration supplies to the venue.

Make eye contact and remind each other that you can do it and that there will be coffee in the morning. Charge the iPads. Remember to print the special diet food tents for the morning.  

During camp: Have a stupid amount of fun. See people you haven’t seen in a year. Celebrate the CMS that drew us all together. So many people, at Texas Camp we nearly hit 200! Eat an inordinate amount of food. Watch some amazing talks. Sing karaoke.

After camp: Go home. Swear to never do it again. Take a vacation. Get a sunburn. Reconsider.

The week after camp: Begin researching venues for the next year.  

Jun 12 2018
Jun 12

Join us on Wednesday, at Gridonic, for the upcoming Zurich Drupal user group meetup.

The gathering is dedicated to all those interested in Drupal. Everyone, from beginners to experts, are more than welcome.

Hope to see you there!

Date and time: Wednesday, June 13, 2018, from 6:30 PM to 9:00 PM

Venue: Gridonic - Ernastrasse 22, Zürich

May 29 2018
May 29

Retrospectives are an essential part of our team’s workflow. After each iteration, we get together to collect insights and feedback. By doing so, our teams ensure they have time to celebrate achievements, learn from mistakes and steer their efforts along a process of continuous improvement.

What are the steps of a retrospective?

Retrospectives will often be made up of 3 simple steps: a) What went well? b) What could we have done better? c) Action items for further improvements. More in-depth retrospectives can use the following model for deeper analysis:

1) Set the stage

A brief check-in allows everyone to get ready for the retrospective, i.e. we gauge how everybody is feeling about the past iteration.

2) Gather data - What?

The data gathering stage is all about collecting different viewpoints based on the metrics of how the sprint went, external feedback the team has received or things they have observed during the iteration. For retrospectives of longer time periods, we use a timeline to collect major milestones from participants and discuss them in a group.

3) Generate insights - So What?

Here we go into problem solving mode. Using brainstorming activities we are able to determine the reasons why things went well or not. For example, the 5 Whys can be used to identify root causes or by imagining The Worst We Could Do, our teams find out what they need to improve on.

4) Decide what to do - Now What?

Now it’s time for the team to create actions that will help them become even better in the next iteration. Practices like the Circle of Influence help focus them on what they can accomplish as a team. We find Divide the Dollar, as well as other dot-voting activities, useful when determining what we want to focus on.

5) The closing perspective

Finally, in the closing, we want to make sure that everyone gives their final input on how the retrospective went.

Things to keep in mind when running retrospectives

Retrospectives done right are a powerful tool to help your team open up and have meaningful conversations. As with any meeting, it’s important to ensure everybody is on board with the working arrangements, such as being on time and a willingness to contribute. As the facilitator of the meeting, you can do a great job at providing a space where participants feel encouraged to share what’s really on their mind.

Looking for ways to make your retrospectives more engaging? Retromat is a tool that helps you think of different ways to facilitate a retrospective. In terms of online collaboration, we found meeting on zoom.us with Realtime Board and collaborating on our retrospective notes in a shared Google Slides presentation to be most effective.

Thanks for reading our take on retrospectives. If you'd like to learn more about running retrospectives effectively, don’t hesitate to reach out in the comments section or get in touch using our contact form.

May 24 2018
May 24

Drupal is all about security  

The Drupal community is unique in many ways, and the Drupal Security Team is an example of this. They provide documentation about writing secure code and keeping your site secure. They work with the drupal.org infrastructure team and the maintainers of contributed modules, to look into and resolve security issues that have been reported.

When a security issue is reported, the Drupal Security Team mobilizes to investigate, understand, and resolve it as soon as possible. They use a Coordinated Disclosure policy, which means that all issues are kept private until a patch can be created and released. Public announcements are only made when the issue has a solution and a secure version is available to everyone. This communication is sent out through all of the channels possible so that everyone is made aware of what they need to do to keep their sites safe and secure.

This means that everyone finds out about the patches, and therefore the vulnerabilities, at the same time. This includes people who want to keep their sites secure, as well as those who want to exploit vulnerabilities. Security updates become a matter of speed, and the development teams at Amazee Labs, along with our hosting partner amazee.io, are always ready to make sure patches are implemented as quickly as possible.

Recent Drupal Security Releases

On March 28th, 2018, the Drupal Security Team released SA-CORE-2018-002. This patch fixed a critical security vulnerability that needed to be applied to every Drupal site in the world as quickly as possible. At the time of the patch release there were no publicly known exploits or attacks using the vulnerability, which was present in Drupal versions 6.x, 7.x, and 8.x and was caused by inadequate input sanitization on Form API (FAPI) AJAX requests.

On April 25th, 2018, SA-CORE-2018-004 was released as a follow-up patch. This release fixed a remote code execution (RCE) bug affecting any site running Drupal 7.x or 8.x. The vulnerability was critical, and both issues resulted from problems with how Drupal handles a “#” character in URLs.

What are the dangers?

There are a number of different kinds of attacks that could take advantage of vulnerabilities fixed in the recent security updates. One kind of attack that is becoming more common is the installation of cryptocurrency mining software. These attacks are both subtle and resilient and use the CPU of the site server to generate cryptocurrency for the attacker.

Amazee Labs is keeping your sites safe

The Amazee Labs team takes these security releases seriously and works quickly to prepare for these updates. We inform our clients as soon as possible about the upcoming release and organize the maintenance and development teams to be ready to run the updates at the time of the release. During these “patch parties” our global teams work together to solve problems and secure all sites by leveraging everyone’s expertise all at once.

Implementing these measures takes development time not allotted in our usual maintenance budgets. We will always let you know when additional work is needed and keep the communication channels open to address any concerns.

An additional layer of security is provided to our clients who host with our partner amazee.io. As soon as the security patch is released, the amazee.io team work to put an infrastructure level mitigation in place. This means that all Drupal sites that they host are immediately secured against initial attacks. You can read a detailed breakdown of how they accomplished this here.

May 14 2018
May 14

In this series we take a closer look at progressive decoupling in Drupal 8. We go through the project setup, check the tools and libraries and discuss potential applications. The first post of the series showed some new features that made it into JavaScript in the last few years. This time let’s see how to use it in a project.

JavaScript transpilation has been added to Drupal core in 8.4.0. The Core JS contribution workflow has been described in the change record Drupal core now using ES6 for JavaScript development. Unfortunately, the scripts mentioned there cannot be used to transpile contrib code yet. There’s an ongoing discussion about that in Use ES6 for contrib and custom JavaScript development. So we need to wait for that to be solved, right?

Not really. It turns out that it is enough to place the package.json file from core/ two levels up in the directory tree (in the case of a Composer project) and adjust the paths to the scripts to enjoy modern JS in contrib and custom code. With this file in the repository root we can run

to install dependencies, and we’re good to go with ES6.

will start the watcher, which monitors all .es6.js files and transpiles them to their .js counterparts whenever a change is detected.

The scripts can be invoked in one of 4 ways
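The commands referenced above were lost in extraction; assuming the Yarn-based workflow that Drupal core ships with since 8.4, they likely correspond to the scripts defined in core's package.json:

```shell
# Run from the repository root (where the copied package.json now lives).
yarn install       # pulls in Babel and the other transpilation dependencies
yarn watch:js      # watches *.es6.js files and transpiles them on change
yarn build:js      # one-off transpilation of all *.es6.js files
```

The exact script names may differ between core versions, so check the scripts section of the package.json you copied.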

To commit or not to commit?

Is it fine to commit the output (.js) files? That depends on the nature of the code. If it’s a contrib module or theme, it’s best to do so. Target users shouldn’t be forced to transpile themselves, and the build process of Drupal modules is not adjustable at the time of writing this post.

Source maps

Contrib modules would most likely provide just the optimized code (without source maps). The committed source .es6.js files can be used to overwrite the output files with dev features enabled for individual environments if needed.

Custom code

The choice here depends on the hosting platform. If it supports adjusting the build process based on the environment, then the .js files don’t have to be committed at all. The source files are enough, and the compilation can be done before each deployment. Source maps can be used for dev, and prod should get the optimized build. This is how it can look in an .amazee.io.yml file, for instance:
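The original file example was lost in extraction; here is a hypothetical sketch of environment-specific build steps (the schema and task names below are assumptions, not the actual .amazee.io.yml format):

```yaml
# Hypothetical per-environment build tasks.
tasks:
  development:
    # Dev build with source maps for easier debugging.
    - command: yarn install && yarn build:js-dev
  production:
    # Optimized build, no source maps.
    - command: yarn install && yarn build:js
```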

As with every artifact, ruling out the compiled versions of js files from the repository makes the development process smoother, mainly by reducing conflicts. On the other hand, it doesn’t have to be a big problem if the team is small enough.


Here’s a recipe for adding an example ES6 library to a theme.

  1. Add this package.json file to the root of your project
  2. Install dependencies
  3. Start the file watcher
  4. Add a library definition to package_name.libraries.yml in your module or theme.
  5. Create the index file (js/mylib/index.es6.js)
  6. Save the file and check the terminal window with the file watcher, js/mylib/index.es6.js should be mentioned there and the compiled version - index.js - should be created next to the source file. The library is now ready to be used in a module or theme.
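To make step 5 concrete, here is a minimal sketch of what an index.es6.js might contain. The names are hypothetical, and the example sticks to plain ES6 features (const, arrow functions, template literals) rather than Drupal APIs so that it stands alone:

```javascript
// js/mylib/index.es6.js — hypothetical example library.
// Babel transpiles this file into the index.js counterpart.

// Build a greeting with a template literal.
const greet = (name) => `Hello, ${name}!`;

// Build a greeting for every item in a list.
const greetAll = (names) => names.map((name) => greet(name));

console.log(greetAll(['Drupal', 'ES6']).join(' '));
// → Hello, Drupal! Hello, ES6!
```

In a real library you would wrap this logic in a Drupal.behaviors entry so it runs when the library is attached.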

That’s it for setting up ES6 in a project. In the next post we’ll take a look at polyfills, talk about the fetch API, and see how to use async functions - the undeniable game changer from ES8.

If you want to learn more about Drupal libraries check these out

May 14 2018
May 14

“Absolutely incredible!” - just one quote from our first Amazeenar, in which we explored the power of GraphQL Twig. Decoupling Drupal is the future; however, it may be a big leap to learn a whole new development stack. With GraphQL Twig, we can take baby steps with a soft-decoupled approach by writing GraphQL inside our Twig templates.


On Friday 11th May, Amazee Labs hosted its first Amazeenar - a live video training session presented by Philipp Melab who demonstrated some of the capabilities of GraphQL with the Drupal module GraphQL Twig.

We started the webinar while a crowd joined live from over 13 countries around the world, including Belgium, Brazil, Canada, South Africa, and as far east as Thailand.

It felt exciting to have a community of enthusiastic people connecting from so many different locations across the globe. This once again reinforced that Drupal is really about coming for the code and staying for the community.

Philipp dove into the talk by giving us a quick introduction to GraphQL, with an example query for us to better understand the concept:

query {
  node:nodeById(id: "1") {
    title
    related:relatedNodes {
      title
    }
  }
}
Running this example GraphQL query would give us the following JSON response:

{
  "node": {
    "title": "Article A",
    "related": [
      { "title": "Article B" },
      { "title": "Article C" }
    ]
  }
}

Inversion of control

Philipp then explained the need for decoupling, providing us with a good overview of the fundamental differences between standard Drupal and Decoupled Drupal, in which the control moves from a push approach to a pull approach.

React is great, but the inversion of control is crucial.

Enabling the template to define its data requirements allows us to achieve a clear data flow with significantly increased readability and maintainability. The GraphQL Twig module allows us to add GraphQL queries to any Twig template; the query is processed during rendering and used to populate the template with data.
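As a sketch of what this looks like in a template, based on the module's query-annotation convention (the exact comment syntax and variable mapping are assumptions here and may have evolved since):

```twig
{#graphql
query {
  node:nodeById(id: "1") {
    title
  }
}
#}
{# The query result is injected into the template as variables. #}
<h2>{{ node.title }}</h2>
```

The template itself declares the data it needs, and the module takes care of running the query and handing the result to Twig.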

Philipp entertained the audience with a live working demo in which, together, we learnt how to enhance the default “powered by Drupal” block to pull in the username of user 1. He then blew our minds with an additional surprise - pulling in the current number of open bug issues for Drupal Core via the GraphQL XML submodule.


Did you miss the webinar? Don’t fret; we recorded everything!

Amazee Labs would like to thank everyone who attended the live session, we enjoyed being able to share this with you, and we look forward to hosting another Amazeenar shortly.

May 11 2018
May 11

If you were in the city of Cluj-Napoca between 4 and 6 May 2018 and walked around The Office, you probably saw over 100 people from all over the world, wearing the same t-shirts, talking about Drupal. That's because DrupalCamp Transylvania was in town.

If you know a bit of Romanian history and have heard about Transylvania, you probably know about Vlad the Impaler. If not, then you've probably heard about Dracula. Either way, they're the same person. You may be asking yourself, "What has Dracula got to do with Drupal?". Well, the answer is in the picture below:

Immortal Drupal

We all want Drupal to be immortal, because we love developing awesome websites with it. That said, we must remember one thing: it's not all about work and making money, it's also about having fun using Drupal. That was one of the key points of Robert Douglass' keynote - "My Drupal Mid-Life Crisis".

One of the most interesting sessions was Larry Garfield's "The container is a lie!". On reading the title, you'll probably want to check that out, since you most probably use containers (not necessarily Docker containers, although Docker is probably the most used these days) in your everyday work. He spoke about how software runs on modern Linux systems, why we should not think of boats, whales, shipping, or even Docker when we hear the word container, and why it is actually useful that modern software is built on these "lies". These "lies" form part of our everyday work, and more importantly, they make deployment to different environments so much easier.

Larry Garfield Docker

Another very important topic, not only in the Drupal community but in technology in general, is GDPR (General Data Protection Regulation). Balu Ertl gave a great session entitled "Overview of GDPR modules for Drupal", in which he surveyed all the modules that can help your Drupal site achieve GDPR compliance.

Is Drupal ready for GDPR?

The conclusion was that we have quite a few modules (9) in this category, some of them available for both Drupal 7 and 8. Some implement only a small part of the regulations (such as consent for using personal data, the ability to delete or download all of a user's personal data, or anonymizing user information when dumping a production database), and many of them implement overlapping features.

Drupal GDPR modules list

But there seems to be one module, General Data Protection Regulation, which tries to bring all these modules together under one umbrella so that we can have a unified and clear solution for making a site GDPR compliant.

Another thing that came up during the discussions on this subject was that it is a really complicated topic for both technical and legal minds, and as such, you most probably won't be fined immediately if you're not 100% GDPR compliant on the 25th of May 2018. The most likely scenario is that the authorities will be there to help at first, and only fine you as a last resort. That said, this cannot be confirmed, and everything should still be done to be GDPR compliant by the deadline.

Wait, there's more! While attending Lenard Palko's presentation, we saw this:

Auditing PHP Apps

No, we did not watch an episode of Doc McStuffins. This was about Auditing PHP Applications, a session in which Lenard Palko showed us how his team deals with auditing PHP applications and what to look for when doing such an audit. He also shared some helpful tools you should use and how you should structure the report.

As you can see, it was a great DrupalCamp: a nice location, great presenters, lots of people, and a dedicated sprint room. So, did we have any time for anything other than coding and talking about Drupal? Yes, we did! We had some great parties each evening, and a brave few of us even went for a morning run on Saturday.

Morning Run

I'm already looking forward to the next DrupalCamp Transylvania in 2019. See you there!
