Nov 08 2018

If you are still using PHP 5, now is the time to upgrade to a newer version of PHP.

PHP, the Open Source scripting language, is used by nearly 80 percent of the world's websites.

According to W3Techs, around 61 percent of all websites on the internet still use PHP 5, a version of PHP that was first released fourteen years ago.

Now is the time to give PHP 5 some attention. In less than two months, on December 31st, security support for PHP 5 will officially cease. (Note: Some Linux distributions, such as Debian Long Term Support distributions, will still try to backport security fixes.)

If you haven't already, now is the time to make sure your site is running an updated and supported version of PHP.

Beyond security considerations, sites that are running on older versions of PHP are missing out on the significant performance improvements that come with the newer versions.

Drupal and PHP 5

Drupal 8

Drupal 8 will drop support for PHP 5 on March 6, 2019. We recommend updating to at least PHP 7.1 if possible, and ideally PHP 7.2, which is supported as of Drupal 8.5 (released in March 2018). Drupal 8.7 (planned for May 2019) will support PHP 7.3, and we may also backport PHP 7.3 support to Drupal 8.6 in the coming months.

Drupal 7

Drupal 7 will drop support for older versions of PHP 5 on December 31st, but will continue to support PHP 5.6 as long as one or more third-party organizations provide reliable, extended security support for PHP 5.

Earlier today, we released Drupal 7.61, which now supports PHP 7.2. This should make upgrades from PHP 5 easier. Support for PHP 7.3 in Drupal 7 is being worked on, but we don't yet know when it will be available.

Thank you!

It's a credit to the PHP community that they have maintained PHP 5 for fourteen years. But that can't go on forever. It's time to move on from PHP 5 and upgrade to a newer version so that we can all innovate faster.

I'd also like to thank the Drupal community — both those contributing to Drupal 7 and Drupal 8 — for keeping Drupal compatible with the newest versions of PHP. That certainly helps make PHP upgrades easier.

Nov 08 2018

Now that I’ve settled back down in Alaska after a fun trip to Berkeley for BADCamp, I’m finally digesting all of the info I gathered throughout the week. As always, it was cool to look over the schedule and see what topics were getting a lot of attention, and, without a doubt, GatsbyJS was the hot-ticket item this year. So here’s a primer on what GatsbyJS is and why the Drupal community seems so head-over-heels for this up-and-coming site generator.

What is GatsbyJS?

For the uninitiated, GatsbyJS is a static site generator for React that lets you compose a website using React components and JSX. Then, with a “gatsby build” command, the entire React app is compiled into a set of static HTML, JavaScript, and CSS files.

However, this is nothing new: static site generators like Jekyll and Hugo have been doing this for years. So what makes Gatsby so popular, especially within the Drupal community? Judging by the four sessions on Gatsby I attended at BADCamp, folks tend to gravitate toward a few essential features that make Gatsby appealing, not only as an integration with a Drupal backend but also more broadly.

Gatsby’s Drupal Source Plugin

The feature most commonly cited in BADCamp talks about Gatsby was the source plugin ecosystem, which lets a developer define a backend “source” from which Gatsby will build pages. There is a robust, well-developed plugin for Drupal 8 (gatsby-source-drupal), which allows Gatsby to programmatically create content from Drupal’s content types, taxonomies, blocks, and menus.

The plugin works like this: first, you enable the JSON API module, which exposes the routes and schemas that let Gatsby import your content into your Gatsby application. Then, on the Gatsby side, you query the data from your Drupal site using GraphQL and render the query results in various React components, such as page templates.
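As a rough sketch of that setup (the base URL below is a placeholder for your own Drupal site), the source plugin is registered in gatsby-config.js:

```javascript
// gatsby-config.js: register the Drupal source plugin.
// The baseUrl is a placeholder; point it at your own Drupal 8 site
// with the JSON API module enabled.
module.exports = {
  plugins: [
    {
      resolve: "gatsby-source-drupal",
      options: {
        baseUrl: "https://drupal.example.com",
      },
    },
  ],
};
```

With the plugin in place, a page component can query the imported Drupal content with GraphQL (for example, a query against article nodes) and render the results into React components.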

Build/Deploy Workflow

Secondly, the architecture around building and deploying a Gatsby site lends itself well to working with Drupal: using something like the Webhooks module, you can have your Gatsby site rebuild (and thus pull in new content) on every node create, update, or delete operation.

This is very important, considering that most static sites are perceived as being labor intensive (in the sense that every time content is updated, someone needs to recompile the site); but, to hear Gatsby co-founder Sam Bhagwat talk about it, the ready availability of on-demand build triggers and integrations makes your static site actually perform much more like a cache. However, it’s the sort of cache that doesn’t require maintaining a highly technical caching stack like Varnish or Memcache. After the minute or two it takes for the build step to complete, your changes have completely propagated and site visitors are being served the new content.

React

Love it or hate it, React is here to stay. It is rapidly becoming the most common front-end stack in web development. Although it may not be for everyone, most developers who have spent the time to learn React tend to fall hard for it. Heck, this year’s Stack Overflow Annual Developer Survey lists React as one of the most “loved” frameworks, with 69.4% of respondents reporting that they enjoy working with it. Clearly, React is doing something right in terms of developer experience. Being able to implement a React front-end without the architectural concerns of a fully decoupled solution certainly seems to be one of the bigger motivating factors behind Gatsby’s adoption among Drupal developers.

Performance

Last, but certainly not least, much of Gatsby’s appeal comes down to speed. As mentioned above, since everything is served from static files, Gatsby sites load really fast. Without any server-side code to execute, the only limits on performance are the size of the HTML/CSS/JS bundles being served over the network. When folks running a traditional Drupal site (or any other monolithic CMS) concoct solutions for better performance, there are a few usual suspects:

  • better caching
  • reducing bundle sizes (minifying JS and CSS, for example)
  • optimizing images
  • serving non-dynamic site files (such as images) from a CDN

Out of the gate, a Gatsby site effectively implements all of these features by default. Since the entire site is a set of static files, your content is effectively ‘cached’ at the state of your website as of the last executed build, and the flat files generated by the build perform much like a cache hit. Likewise, because Gatsby runs a build step, the entire HTML/CSS/JS bundle is minified from source using Webpack. No configuration required!

Gatsby also comes pre-wired with a bunch of image optimizations, including the Sharp image processing library and progressive image loading, to get a contentful, interactive page rendered as quickly as possible. And lastly, with a completely static site, the most common way of hosting is through a CDN; popular choices among Drupalers who use Gatsby include Netlify and Amazon S3. On top of all that, Gatsby has other nifty built-in features like link prefetching, which means that after the initial page load you get lightning-quick performance between internal pages on your site.

Limitations

Of course, most folks discussing Gatsby + Drupal implementations at BADCamp were quick to acknowledge Gatsby’s limitations. First, and most notable among them, is the difficulty of handling conventional user-submitted data. If you wanted to define a webform and render its results in a views-generated table, that would be pretty straightforward in a conventional Drupal 8 site; in a Gatsby site, however, it would be much more complicated. Likewise, many of the ‘nice to have’ things Drupal gives us out of the box (authentication, custom user roles, access restriction, and so on) need to be effectively re-engineered if we want to implement them on our Gatsby front-end. That said, we could certainly continue to do many of those things in Drupal, but in the emerging patterns of using Gatsby and Drupal together, it is unclear what the ‘best practice’ is for handling those use cases.

All in all...

I think it’s really interesting to see this convergence on Gatsby as a de facto solution for decoupled Drupal sites. While there are plenty of limitations, Gatsby looks like a compelling answer to some of the characteristic problems that many decoupled Drupal devs are all too familiar with by now.

And, as always, this year’s BADCamp was a great chance to see the Hook 42 team – it’s amazing that we “see” each other so often online, but once we’re all together I’m reminded of what a smart, funny, and fun group of people we are. Can’t wait to see everyone again!

Nov 08 2018

Now that the excitement of BADCamp has worn off, I have a moment to reflect on my experience as a first-time attendee of this amazing, free event. Knowing full well how deeply involved Kanopi Studios is in both the organization and thought leadership at BADCamp, I crafted my schedule for an opportunity to hear my colleagues while also attending as many sessions on Accessibility and User Experience (UX) as possible.

Kanopi’s sessions included the following:

The rest of my schedule revolved around a series of sessions and trainings tailored toward contributing to the Drupal community, Accessibility and User Experience.

For the sake of this post, I want to cover a topic that everyone who builds websites can learn from. Without further ado, let’s dive a bit deeper into the accessibility portion of the camp.  

Who is affected by web accessibility?

According to the CDC, 61 million adults in the US live with some kind of disability: that's 26 percent of all US adults. Issues range from temporary difficulties (like a broken wrist) to permanent conditions that affect vision, hearing, mental processing, and mobility. Creating an accessible website lets you communicate with the 1 in 4 adults you might otherwise have excluded.

What is web accessibility?

Accessibility is a detailed set of requirements for content writers, web designers and web developers. By ensuring that a website is accessible, we are taking an inclusive attitude towards our products and businesses. The Web Content Accessibility Guidelines (WCAG) are a globally acknowledged set of standards that help us publish content that fits within the established success criteria. These guidelines are organized into the following four categories.

WCAG Categories:

  • Is your website perceivable? This applies to non-text content, time-based media (audio and video), color contrast, text size, etc.
  • Is your website operable? This ensures that content is easy to navigate using a keyboard, that animations and interactions meet real-user requirements, buttons are large enough to click, etc.
  • Is your website understandable? This means that text content is easy to read for someone at a ninth-grade reading level, that interactions follow design patterns in a predictable manner, that form errors are easy to recover from, etc.
  • Is your website robust? This means that content should be easy to interpret for assistive technologies, such as screen readers.

The World Wide Web Consortium (W3C) is an international community whose mission is to lead the Web to its full potential. They have also published a checklist to aid our efforts in meeting WCAG success criteria.

How can we be successful in making the web accessible?

Industries have varied requirements when it comes to web accessibility. WCAG has three levels of conformance: A, AA, and AAA. Level A has the lowest set of requirements and AAA the strictest; so strict, in fact, that it may be impossible to achieve across an entire site.

Efforts to meet these standards fall on every individual involved in the process of creating a website. Although there are many tools that aid in our journey, we reach accessibility through a combination of programmatic and manual means.

The most important thing to keep in mind is that achieving success in the world of accessibility is a journey. Any effort along the way gets you one step closer to a more inclusive website and a broader audience.

Please Remember: Once Kanopi helps you launch an accessible site, it’s your job to maintain it. Any content you add moving forward must be properly tagged; images should have proper alt text and videos should have captions. Users come to your site because they love your content, after all! The more you can make your content accessible, the more you will delight your users.

Interested in making your site more accessible? Check out some of the resources I linked to above to join in learning from my peers at BADCamp. If you need more help getting there, let’s chat!

Nov 08 2018

MidCamp is returning for its sixth year, March 20-23, 2019. We’ll be back at DePaul University for four days of presentations, professional training, contribution sprints, and socials. Designers, developers, and users will be able to brush shoulders with Drupal service providers, hosting vendors, and other members of the broader web development community.

Agenda Overview

This year we have some changes to our general agenda. We’ll be adding summits for the first time! We’ve also moved our sessions to Thursday and Friday so that attendees get some of their weekend back. A high-level agenda is as follows:

  • Wednesday, Mar 20 - Summits, Training, and Contribution Sprints

  • Thursday and Friday, Mar 21-22 - Sessions

  • Saturday, Mar 23 - Contribution Sprints

Stay Tuned for these Upcoming Dates

Keep an eye on the website and our newsletter for these upcoming dates.

  • NOW! - Ticket sales are open on Eventbrite. Spread the word and get your tickets early: https://midcamp2019.eventbrite.com/

  • Nov 14, 2018 - Our website will be fully up and running, and our call for papers will open.

  • Dec 12, 2018 - Call for papers closes, and travel information will be available on the website.

  • Jan 9, 2019 - Registration opens for training and summits.

  • Jan 16, 2019 - Featured speakers announced on the website.

  • Jan 23, 2019 - Final schedule posted on the website.

Help us Make MidCamp!

It’s not too late to get involved with MidCamp 2019. We’re on MidCamp Slack. You can also contribute by telling us what topics you’re interested in seeing in the 2019 program.

Join the conversation

Nov 08 2018

This is a public update on the work of the Governance Task Force.

Drupal is one of the most successful open source projects in the world. Governance is fundamental to the project's success.

The community and the code have been built from the ground up. And as the code has grown, so has the community.

When communities are first emerging it's easy to bring newcomers along, but over time a community matures, changes, and needs to adapt. Challenges and opportunities emerge as this evolution occurs, and our community needs to navigate them strategically.

A Governance Task Force has been meeting weekly since May to put together the strategic proposal we now share with you. We've synthesized ideas, discussions, and experiences from the people we interviewed, and we've revisited the themes that emerged from the community listening project run by Whitney Hess and from previous governance discussions.

This Drupal Governance Task Force 2018 Proposal serves two purposes.

Firstly, it's clear that for community evolution to occur there needs to be broad agreement and buy-in. People are comfortable jumping in and building a new module, but community change and action are hard. People talked to us openly about the unclear processes and barriers holding back community progress.

We heard strong perceptions that support from Dries or the Drupal Association is needed before initiatives can be created or scaled; real or not, this perception is affecting community progress and action. People from the Drupal Association, the Community Working Group, and other initiative leaders also feel these limitations, but to change their terms of reference and priorities they too need a community directive.

The community is stronger and more influential than we sometimes assume --- when we are speaking together.

That's why at the heart of this proposal is a new community governance structure.

The second purpose of the proposal is to create a starting point --- a framework. We’ve been practical, highlighting a range of actions that form a backbone for community evolution. It’s not a defined roadmap, and it’s not a list of every idea we had or heard. We welcome the discussion, debate and idea generation that this document will spark. We want to hear your solutions on how to get change done, and what you would like to contribute.

We strived to make practical recommendations with the potential to make progress, lower barriers, and help our community to continue to evolve with time.

Throughout this process we have heard people say they believe change is necessary. Change is necessary for the longevity of Drupal the project and the code. Change is necessary to create a new generation of Drupallers — the people we want to help build ambitious things and to have the chance to build a career within our community.

It is hard to not feel the project is at a crossroads. We’ve climbed the mountain of Drupal 8, we sit at the peak and look to the valley below.

Where we go next, and who we take with us, is up to you.

We hope this proposal helps.

David, Ela, Stella, Lyndsey, Rachel, Hussain, and Adam

File attachments:  Drupal-Governance-Task-Force-Proposal-2018.pdf
Nov 08 2018

The real cost of creating and maintaining a website can be hard to estimate, even for the best Drupal professionals. By using the Total Cost of Ownership (TCO) methodology, organizations can ensure that both the direct and indirect expenses of operating a website are considered and calculated, rather than just the initial spending. In this article we take a look at the costs of owning a Drupal website versus using proprietary software.

There are some key considerations to decide on before diving into building a website:

  1. Open Source vs. Proprietary License
  2. Creating and Managing Web Content
  3. Re-designing and Updating Content
  4. Future Upgrades and Longevity
  5. Long Term Savings

Custom Code - a necessity of the past?

Owning the custom code for your website has its pros and cons, and in some cases it can still prove to be the most feasible choice, depending on your website's needs. If you had asked web developers a few years ago whether you should opt for a custom framework or an open source CMS when building your website, they would have told you that an open source CMS limits a website's capabilities and features, while custom code can adapt to any need and requirement. This has been changing at a fast pace as open source software rapidly evolves and covers more and more ground in the industry. Drupal's growth and near-limitless scalability, for example, have become rather impressive and hard for big organizations to ignore.

Today, Drupal serves as a direct competitor to the concept of custom code: people can integrate, tweak, and change the CMS to get the precise results they expect for their website. With the introduction of Drupal 8, new APIs were integrated into the CMS that support decoupled Drupal sites through the RESTful Web Services API, enhance user experience through faster page rendering and better caching, and provide a translation API that adjusts the language of your website depending on where the viewer lives - things that in the past would have been possible only through custom code. That is not to say that building on custom code is impractical, but in most cases the benefits Drupal brings outweigh a custom framework, saving organizations time and money and helping them avoid future complications that might arise from the custom code.

1. Open Source vs. Proprietary License

Choosing a content management system (CMS) for a website is one of the fundamental decisions that has to be made before moving on to other decisions. There are two directions a company can take: an open source CMS such as Drupal, or proprietary licensed software. If an organization opts for a proprietary license, it has to invest continuously in an IT department to implement new website features, updates, and support. An open source CMS, by contrast, is continuously evolving, and bugs are discovered and fixed more quickly thanks to a dedicated community of developers working to improve it. Companies have the freedom to run it on any number of websites, resulting in significantly reduced costs, and can adapt the software to meet their business requirements. To further increase the functionality and convenience of the CMS, organizations can opt for custom modules that enable drag-and-drop content management. For example, our Glazed Builder module is currently one of the most powerful Drupal editors and is seamlessly integrated with both Drupal 8 and 7. Compared to the cost of a proprietary license, Drupal's costs are practically minimal, the CMS is more versatile (you're not locked into a single piece of software and its attributes), and it gives organizations the power and freedom to build any kind of website.

2. Creating and Managing Web Content

Creating and managing content on a website is part of the daily routine for many stakeholders in large websites. Every open source CMS provides the tools and flexibility to create new pages or edit existing ones. In most cases this still requires assistance from a developer, but it doesn't have to. Thanks to the development of content editors and page builders such as Glazed Builder, managing and creating content is something anyone can do without relying on IT. They eliminate the need to go through the IT department every time a change needs to be made on the website, drastically reducing IT costs and saving precious time, both of which can be re-invested in other company assets to deliver more value to consumers. Managing content on a proprietary website, on the other hand, can prove to be quite a challenge, so there will always be a need for dedicated developers familiar with the software to systematically add and edit pages.

3. Re-designing and Updating Content

The web environment is continuously evolving and changing at a fast pace, and with it grow customers' expectations of their online experiences and the value a website can deliver for them. In order to keep up and stay ahead of the competition, businesses need to periodically refresh their website's overall look and implement the new features that regularly storm the web. It is also important to update content systematically in order to keep the information relevant, up to date, and interesting for visitors. When building a website, it is necessary to compare the long-term costs and the tools available for redesigning the website and updating content. Open source software shines at empowering organizations in this respect: Drupal, with its reputation as the most flexible CMS, makes it easy and convenient to create a new look for a website. Moreover, organizations can implement ready-made themes that suit their business model or create unique custom themes using page builders. Sooperthemes' Glazed Builder gives you control over every visual element on the website and makes it easy to modify them at any time in order to update and maintain content on main pages, product pages, landing pages, and so on.

4. Future Upgrades and Longevity

A website can last two to five years before it needs a fresh redesign or a complete rebuild. The incentives that motivate businesses to upgrade their digital assets include security, a dated look, meeting Google's SEO requirements, and the necessity of keeping up with the new technologies that emerge every year. Building organizational websites on a platform that can be extended and adapted to meet the latest trends in web technology will prove to be a lifesaver when the time comes to upgrade. It also eliminates the risk of having to rebuild the website from scratch just because the proprietary foundation it was built on does not allow for scalability. Drupal is known to be at the top of the industry when it comes to upgrades and room for future scalability, and it handles high-traffic websites with ease, maintaining minimal load times even at the highest peaks.

5. Drupal Costs & Long Term Savings

Analyzing the total cost of ownership of your website before starting to build it will save your organization a lot of money and time in the long run. These retained resources can be re-invested in other company facets that add value for both the business and the end consumer. The day-to-day costs of maintaining a website can be significantly reduced by using open source software like Drupal. A good example is the Georgia Technology Authority (GTA) case: when GTA migrated 65 websites to Drupal, platform operational costs decreased by 65% and support costs decreased by 75%. In the long term, the cost of operating a website on proprietary licensed software far exceeds the Drupal costs. Unless there are very specific requirements that can only be met with proprietary software, Drupal will play a big part in minimizing the total cost of ownership of an organizational website.

Nov 08 2018

What do you get when you put together Drupal 8 + AI + UX? Drupal 8's content management features and integration capabilities, AI for storing and interpreting data and building a predictive model, and UX for anticipating user behavior while adding a “human touch” to the equation? You get predictive UX in Drupal!

Is it possible? Can we implement predictive UX in Drupal and thus create anticipatory user experiences that:

  • deliver only meaningful content
  • simplify user choice
  • simplify users'... lives?

But how does machine learning actually power these predictive user experiences? What's the whole mechanism behind it?

And how is predictive analytics UX any different from... personalization? 

Are there any “traps” to be avoided when using the same event data to make informed decisions on the customer's behalf? 

And last but not least: what makes Drupal 8 the best fit for predictively serving content?
 

1. What Is Predictive UX More Precisely?

“Less choice, more automation.”

Or: Anticipating users' needs and delivering them precisely and exclusively the content they need (when they need it).

In other words: creating those predictive user experiences that anticipate and meet your customers' needs...

Which of these two possible definitions do you prefer?

Or maybe you'd like a more “elaborate” one:

Predictive UX means leveraging machine learning and statistical analysis to make informed decisions for the customer.

And if we are to turn this definition into a mathematical equation, it would go something like this:

machine learning (predicting) + UX design (anticipating) = predictive UX (based on a predictive or anticipatory design)
 

2. But Isn't This Just Another Word for “Personalization”?

As compared to personalization, predictive UX goes beyond tailoring content to users' past choices:

It actually makes decisions on their behalf.

It's not limited to leveraging data in order to deliver dynamic content. Which would automatically call for heavy manual work.

Instead, predictive UX is AI-driven, thus automating decision making on the user's behalf.
 

3. How Does Predictive Analytics Benefit You and Your Customers?

Here's an empathy exercise for you:

You're a mobile app user who's constantly “flooded” with heavy streams of disruptive information via push notifications, texts, and emails. Or you're an online customer faced with a discouragingly “beefy” set of options as you're about to order food for lunch. There are so many irrelevant options to wade through before you find the dish that really suits your preferences... that you just feel like closing the app and hitting the closest resto instead.

So, what if:

  • your app could... tell what you want to have for lunch and display the most relevant options only?
  • you would receive app alerts or push notifications at precisely the most appropriate moments (time of day, day of the month)?

It would:

  1. make your life so much easier
  2. improve your overall user experience 

As a company, by leveraging predictive analytics to deliver only relevant user experiences, you'd be winning your customers' loyalty.

You'd be simplifying their lives, after all...
 

4. Leveraging Machine Learning to Create Predictive User Experiences

What's the whole mechanism behind the creation of predictive user experiences?

How is the machine learning technology/tool leveraged to predict user behavior?

It's no more than a three-step sequence:
 

  1. you first define the problem (using machine learning terms)
  2. gather data in a suitable format
  3. put together a model 
     

For instance, here's a machine-learning-based recommendation system deconstructed:
 

  • content-based recommendation: recommending items based on similar characteristics
  • collaborative filtering: recommending items/services based on other customers' preferences (customers with similar past choices)
     

Note: more often than not, it's a mix of these two types of recommendation system that you'll find in practice.
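To make the collaborative-filtering idea concrete, here is a toy sketch (with made-up data and a simple Jaccard similarity; a real recommender engine is far more sophisticated than this):

```javascript
// Toy collaborative filtering: recommend items that similar users chose.
// The interaction data below is entirely hypothetical.
const interactions = {
  alice: new Set(["pizza", "salad", "pasta"]),
  bob: new Set(["pizza", "pasta", "burger"]),
  carol: new Set(["salad", "soup"]),
};

// Jaccard similarity between two users' item sets: |A ∩ B| / |A ∪ B|.
function similarity(a, b) {
  const inter = [...a].filter((x) => b.has(x)).length;
  const union = new Set([...a, ...b]).size;
  return union === 0 ? 0 : inter / union;
}

// Score each item the target user hasn't chosen by the similarity of
// the users who did choose it, then return items by descending score.
function recommend(user) {
  const own = interactions[user];
  const scores = new Map();
  for (const [other, items] of Object.entries(interactions)) {
    if (other === user) continue;
    const sim = similarity(own, items);
    for (const item of items) {
      if (own.has(item)) continue;
      scores.set(item, (scores.get(item) || 0) + sim);
    }
  }
  return [...scores.entries()].sort((x, y) => y[1] - x[1]).map(([item]) => item);
}
```

Here `recommend("alice")` ranks "burger" first, because bob (who also likes pizza and pasta) chose it, while "soup" comes from the less similar carol.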
 

5. Predictive UX: 4 Common Sense Principles to Consider
 

5.1. Simplify the UI: keep the most relevant design elements and meaningful content only.

Instead of forcing customers to make too many choices, to scan through chunks of content, go for a minimal interface! Trim down the “irrelevant fat” and keep the essential.

Leveraging machine learning and statistical techniques, you should already know what's essential and meaningful to your users in terms of information and functionality.
 

5.2. Disrupt the all-too-familiar patterns now and then.

In other words: don't get trapped in the “experience bubble”, where you keep recommending the same familiar options and encouraging the user to make the very same choices over and over again.

Consider adding disruptive layers now and then, “tempting” users to try something new, something different.
 

5.3. Avoid forcing those most relevant options on the user.

OK, so you have the data at hand, and you're leveraging a machine learning algorithm that anticipates:
 

  • what the user needs
  • what the user wants
  • what the user's going to do next
     

That doesn't mean you should overlook that:

It's always the customer who makes the final choice!

So give them enough options to choose from! Put them in full control of the final decision-making process!
 

5.4. Create predictive user experiences that are helpful, not annoying.

In other words: when it comes to push notifications, choose the most appropriate time (if you're a retailer, you can't reasonably expect anyone to read about your promotion during work hours).
 

6. Predictive UX in Drupal: What Makes Drupal 8 the Perfect Fit?

There are some particular characteristics that make Drupal the perfect “teammate” for a machine learning tool:
 

  • its content management features and its capabilities for storing and maintaining (huge amounts of) data
  • its API-first approach, which makes third-party integrations conveniently easy; you can integrate Drupal with any system providing an API and an interface
  • its “decoupled architecture” approach, which enables Drupal to serve content in various ways
     

Now, just think about it:

All that data stored on your Drupal website, analyzed and leveraged by a machine learning tool to create anticipated user experiences! Think of all the emerging possibilities of implementing predictive UX in Drupal!
 

7. And How Do You Implement Predictive UX in Drupal?

First of all: choose your machine learning tool.

Now, let's say you've chosen to go with Apache PredictionIO, for obvious reasons:

  • it's open source
  • it “spoils” you with a set of customizable templates
  • it provides a full machine learning stack
  • it's conveniently easy to deploy as a web service
     

Now, let's have a close look at the interaction between Drupal and the machine learning tool:

The Event Server collects data from your Drupal app/website and provides it to the Engine. The Engine reads that data and, leveraging machine learning, uses it to put together a predictive model. Then, upon a query via REST from your Drupal app/website...

Et voila! A predictive result is sent back to your Drupal website or application, one that will power a predictive user experience.

Now, since we've been talking about the event data that's being sent from Drupal to the machine learning tool and further “exploited” for building that predictive model, you should know that it comes in “2 flavors”:

  1. explicit: the user has already rated or bought an item, so you have explicit information about their preferences
  2. implicit: there's no past rating or direct feedback to analyze, so the behavioral information that's already available is leveraged to anticipate the user's needs
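To make the two flavors concrete, here's a minimal sketch of the event payloads a Drupal site might POST to the machine learning tool's event server. The field names loosely follow Apache PredictionIO's events API, but treat the exact shape (and the user/item IDs) as assumptions for illustration:

```php
<?php

// Build a JSON event payload for an event server; 'rate' carries explicit
// feedback, 'view' carries implicit feedback.
function build_event(string $event, string $userId, string $itemId, array $properties = []): string {
    return json_encode([
        'event' => $event,
        'entityType' => 'user',
        'entityId' => $userId,
        'targetEntityType' => 'item',
        'targetEntityId' => $itemId,
        'properties' => (object) $properties,
    ]);
}

// Explicit feedback: the user rated an item, so preference is stated outright.
$explicit = build_event('rate', 'user-42', 'article-7', ['rating' => 4]);

// Implicit feedback: the user merely viewed the item; the behavior itself
// is the signal.
$implicit = build_event('view', 'user-42', 'article-7');
```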
     

The END! What do you... predict:

Will we be witnessing more and more Drupal 8 websites leveraging predictive UX and, implicitly, machine learning technology, to create anticipated user experiences?
 

Photo by David Travis on Unsplash.

Nov 07 2018
Jay

Determining how content on your site should be cached isn't a simple topic. Last time, I covered cache contexts and tags. Today, I'd like to get into a couple more advanced topics: The use of custom cache tags and of max-age.

Custom Cache Tags

Drupal's built-in selection of cache tags is large, and some contributed modules add additional tags appropriate to what they do, but for really refined control you might want to create your own custom cache tags. This is, surprisingly, quite easy. In fact, you don't have to do anything in particular to create the tag – you just have to start using it and, somewhere in your code, invalidate it:

\Drupal\Core\Cache\Cache::invalidateTags(['my_module:my_custom_tag']);

As an example, to continue the scenario from part one about a page showing recent articles, there is one thing about this page that the tags we've already looked at don't quite cover. What if a new article gets created with a publish date that should be shown on your page? Or, maybe an article which isn't currently displayed has its publish date updated, and now it should start showing up? It's impractical to include a node-specific tag for every article that might possibly have to show up on your page, especially since those articles might not exist yet. But we do want the page to update to show new articles when appropriate.

The solution? A custom cache tag. The name of the tag doesn't matter much, but might be something such as my_module:article_date_published. That tag could be added on the page, and it could be invalidated (using the function above) in a node_insert hook for articles and in a node_update anytime that the Date Published field on an article gets changed. This might invalidate the cached version of your page a little more frequently than is strictly necessary (such as when an article's publish date gets changed to something that still isn't recent enough to have it show up on your custom page), but it certainly shouldn't miss any such updates.
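As a sketch of how that could wire together in a custom module (the my_module name and the field_date_published field are assumptions, not code from this article):

```php
<?php

// my_module.module — a hypothetical custom module. Invalidate the custom
// tag whenever a new article appears or an article's publish date changes.

use Drupal\Core\Cache\Cache;
use Drupal\node\NodeInterface;

/**
 * Implements hook_node_insert().
 */
function my_module_node_insert(NodeInterface $node) {
  if ($node->bundle() === 'article') {
    Cache::invalidateTags(['my_module:article_date_published']);
  }
}

/**
 * Implements hook_node_update().
 */
function my_module_node_update(NodeInterface $node) {
  if ($node->bundle() !== 'article') {
    return;
  }
  // $node->original holds the pre-save entity during an update.
  $old = $node->original->get('field_date_published')->value;
  $new = $node->get('field_date_published')->value;
  if ($old !== $new) {
    Cache::invalidateTags(['my_module:article_date_published']);
  }
}
```

The page's render array would then list 'my_module:article_date_published' among its '#cache' tags.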

This is a simple example of a custom cache tag, but they can be used in many other situations as well. The key is to figure out under what conditions your content needs to be invalidated, and then invalidate an appropriate custom tag when those conditions are met.

Rules for Cache Max-Age

Setting a maximum time for something to be cached is sort of a fallback solution – it's useful in situations where contexts and tags just can't quite accomplish what you need, but should generally be avoided if it can be. As I mentioned in a previous article, a great example of this is content which is shown on your site but which gets retrieved from a remote web service. Your site won't automatically know when the content on the remote site gets updated, but by setting a max-age of 1 hour on your caching of that content, you can be sure your site is never more than an hour out of date. This isn't ideal in cases where you need up-to-the-minute accuracy to the data from the web service, but in most scenarios some amount of potential "delay" in your site updating is perfectly acceptable, and whether that delay can be a full day or as short as a few minutes, caching for that time is better than not caching at all. 
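For instance, a render array for such remote content might set a max-age like this (a sketch only; my_module_fetch_remote_content() is a hypothetical helper):

```php
<?php

// A sketch: render a block from remotely fetched content and let it be
// cached for at most one hour.
function my_module_remote_block() {
  return [
    '#markup' => my_module_fetch_remote_content(),
    '#cache' => [
      // 3600 seconds; \Drupal\Core\Cache\Cache::PERMANENT would instead
      // mean "cache until explicitly invalidated".
      'max-age' => 3600,
    ],
  ];
}
```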

However, there is one big caveat to using max-age: It isn't directly compatible with the Internal Page Cache module that caches entire pages for anonymous users. Cache contexts and tags "bubble up" to the page cache, but max-age doesn't. The Internal Page Cache module just completely ignores the max-age set on any parts of the page. There is an existing issue on Drupal.org about potentially changing this, but until that happens, it's something that you'll want to account for in your cache handling.

For instance, maybe you have a block that you want to have cached for 15 minutes. Setting a max-age on that block will work fine for authenticated users, but the Internal Page Cache will ignore this setting and, essentially, cause the block to be cached permanently on any page it gets shown on to an anonymous user. That probably isn't what you actually want it to do.

You have a few options in this case.

First, you could choose to not cache the pages containing that block at all (using the "kill switch" noted in part one). This means you wouldn't get any benefit from using max-age, and would in fact negate all caching on that page, but it would guarantee that your content wouldn't get out of date. As with any use of the "kill switch," however, this should be a last resort.

Second, you could turn off the Internal Page Cache module. Unfortunately, it doesn't seem to be possible to disable it on a page-by-page basis (if you know a way, please drop us a line and we'll update this post), but if most of your pages need to use a max-age, this may be a decent option. Even with this module disabled, the Internal Dynamic Page Cache will cache the individual pieces of your page and give you some caching benefits for anonymous users, even if it can't do as much as both modules together.

My preferred option for this is actually to not use a max-age at all and to instead create a custom, time-based cache tag. For instance, instead of setting a max-age of 1 hour, you might create a custom cache tag of "time:hourly", and then set up a cron task to invalidate that tag every hour. This isn't quite the same as a max-age (a max-age would expire 1 hour after the content gets cached, while this tag would be invalidated every hour on the hour) but the caching benefits end up being similar, and it works for anonymous users.
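A minimal sketch of that approach, assuming your site's cron actually fires at least once an hour (the module name and state key are made up):

```php
<?php

// my_module.module — invalidate a custom time-based tag from cron.

use Drupal\Core\Cache\Cache;

/**
 * Implements hook_cron().
 */
function my_module_cron() {
  $now = \Drupal::time()->getRequestTime();
  $state = \Drupal::state();
  if ($now - $state->get('my_module.hourly_flush', 0) >= 3600) {
    // Everything cached with the 'time:hourly' tag becomes stale now.
    Cache::invalidateTags(['time:hourly']);
    $state->set('my_module.hourly_flush', $now);
  }
}
```

Content that should refresh hourly then lists 'time:hourly' among its cache tags instead of setting a max-age, which also works under the Internal Page Cache.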

Up Next

Now that we've gotten an overview of how to determine what rules you should use to cache content on your site, it's time to get a little bit more technical. Next time, I'll be taking a look at how Drupal stores and retrieves cached data, which can be immensely useful to understanding why the cache works the way it does, and it's also quite helpful to know when fixing any caching-related bugs you might encounter. Watch this blog for all the details!


Nov 07 2018

TEN7-Podcast-Ep-044-DrupalCamp-Ottawa.mp3

It is our pleasure to welcome once again Tess Flynn, TEN7's DevOps Engineer and DrupalCamp ambassador, to discuss the 2018 DrupalCamp Ottawa.

Here's what we're discussing in this podcast:

  • 2018 DrupalCamp Ottawa
  • Minnesota maple syrup
  • Camp format
  • Ottawa's move to Drupal open source
  • Award for travelling the farthest to attend
  • Camp without BOFs
  • Drupal 101
  • Keynote: “Building Accessible Experiences”
  • Accessibility is a core aspect of the entire design experience
  • Socketwench presents: "Healthcheck your site!"
  • Building software as a service
  • Privacy laws differences between Canada and the US

TRANSCRIPT

IVAN STEGIC: Hey Everyone! You’re listening to the TEN7 Podcast, where we get together every fortnight, and sometimes more often, to talk about technology, business and the humans in it. I am your host Ivan Stegic. In this episode of the Podcast, we’re talking to Tess Flynn about her visit to DrupalCamp Ottawa 2018, that happened on Friday, October 26. Tess, welcome back to the Podcast.

TESS FLYNN: Could you even use fortnight now? Isn’t that copyrighted?

IVAN: (laughing) Well, it’s spelled differently, so I think we might be ok. Yea, good point though. Let’s see, DrupalCamp Ottawa. You just got back from Canada. Did you bring back any maple syrup?

TESS: I did, but the problem is, that some of the maple syrup we get here locally actually tastes a bit better than the kind you get from the touristy travel shops that you get in Canada.

IVAN: Yea, we’re a little spoiled in Minnesota with maple syrup, I agree. So, DrupalCamp Ottawa is a little different in format than DrupalCorn that we talked about last. It’s one day of Camp, it’s a Friday, so 25% the length of the other Camps. How did that feel compared to the extended four days that we talked about last time?

TESS: I think that it actually felt rather appropriate. Mostly because you can’t really talk about this Camp without mentioning the fact it is doing head on comparison competition with BADCamp.

IVAN: Oh, that’s right. I’d forgotten that BADCamp was at the same time. What’s the format for BADCamp?

TESS: BADCamp’s a little bit more like TCDrupal. There’s a day of training, then two days of sessions, then a day of contributions.

IVAN: Do you think that affected attendance in Ottawa?

TESS: Well, I actually was wondering about this as well. The question is whether or not, if you actually had the choice between the two, you would go to one or the other. And I think that’s kind of a false dichotomy, because from another perspective Ottawa is in a completely different country. Even though it’s not very far from Minnesota, at the same time it is technically a different country. So, there are reasons to actually choose a date that even coincides with one of the biggest regional Camps in the United States.

IVAN: And it’s also on the complete opposite end of the Continent as well.

TESS: Yea, it’s on the Eastern time zone.

IVAN: And, how large was DrupalCamp Ottawa, in terms of number of people? Just sheer attendance. Just a guess.

TESS: They said that about 250 people registered. Some of those were going to be sponsors, and a fairly typical pattern is that they’ll register more people than actually show up. So, I would probably guess maybe 175 at least, probably more like 200 and change.

IVAN: Wow, that’s a whole lot for a regional Camp and only one day of programming.

TESS: Well, you know, it’s that other country factor, and there’s a lot to really unpack there, because it’s not just a DrupalCamp somewhere else. There are specific regional concerns that go along with having a DrupalCamp in Canada and using Drupal in Canada.

IVAN: So, let’s talk about that a little bit. Would you guess that most of the attendees were from Ottawa and from Ontario?

TESS: I would probably say so, because Ottawa, from what I recall, is the capital. So, there’s a lot of government in Ottawa. A lot. And, Ottawa is trying to pivot towards doing more Drupal open source, and more open source in general. So, the idea that a lot of people would attend this Camp to get more open source information makes perfect sense. And, to put it in the same city that a lot of people work in, also makes a lot of sense.

IVAN: It does make a lot of sense. Now, I heard you received a special award.

TESS: (laughing) There was kind of a joke about that. As a Camp speaker, there’s always kind of a little bit of a joke about, if you were the farthest one to attend the Camp. And, from my knowledge, I might’ve been one of the few Americans to attend the entire Camp, and probably the only one that really needed to take a flight to get there.

IVAN: (laughing) What was the prize? Or was it just a proverbial pat on the back?

TESS: It was more like, “oh, really, I am the furthest away one. Oh, that’s nice.” That was it. (laughing)

IVAN: (laughing) Now, I looked at the schedule and it looked like it was broken up into three tracks for the day, and it loosely seemed to be something along the lines of front end, back end and everything else. And, everything else was kind of like business, strategy, communications, content, which kind of makes sense. Did I get that right? Was that more or less how it was?

TESS: It certainly felt like that. I mean with only one day of Camp, and only about four different session periods, there’s not really that much need to break it up along too many different functional lines. There’s only so many slots available.

IVAN: And no BOFs from what I could tell.

TESS: No. I don’t think they had the room available at the venue in order to do that, but they might have.

IVAN: I see. Nice segue into the location of where the event was, it was at the University of Ottawa. The website says the SITE Building. Could you tell me more about the space?

TESS: That place has just got such an interesting personality. How can I explain this? Like if someone took material design and construction aesthetic and mashed them together, you get this combination of bright colors and metals and all sorts of interesting things. It was really, really, a nifty little venue. It was very visually interesting. And, because the Camp wasn’t particularly big, everything was in one building, so it was very easy to find everything.

IVAN: So, three rooms, all in one building. I would assume lunch in a central place, as we’ve come to expect?

TESS: That’s correct.

IVAN: Right. That’s great. That seems to make quite a cozy atmosphere for attendees. I really like those, when they’re all close together and bunched up. Let’s talk a little about the pre-keynote. It looked like there was a session on the schedule called Drupal 101, that seemed to be very inviting for beginners, kind of before the keynote happens, if you’re new to Drupal, not sure what a node is. The description says, “bring your coffee and get a quick course in Drupal terminology.” I love this idea of kind of giving an intro before the festivities or the keynote begins.

TESS: Yea, I rather liked how that went because it provides a nice bit of framing, that would’ve otherwise been taken out by a training session on the first day of a multi-day Camp. And, I think it was a nice compromise in order to allow people who have heard about this Drupal thing, and then get a nice introduction, so that they can get value out of the Camp. And, because the Camp was on a Friday, some people might be attending this on their work hours.

IVAN: Yea, I think that’s a great welcoming idea. It would be interesting to talk to the organizers to hear what their take on the motivation behind that was. So then, that rolled into the keynote and the keynote was titled, “Building Accessible Experiences.” And, it was from developer advocacy lead at Shopify. I can’t pronounce Tiffany’s last name, I’m going to try, Tiffany Tse. Any ideas if I got close, Tess?

TESS: No, I don’t think the coffee had quite kicked in, and I think I barely missed her last name too. So, I can’t quite remember the pronunciation either.

IVAN: (laughing) We’re sorry Tiffany, if you’re listening. Call us and let us know how we did. Yea, so Shopify, first of all, I love the fact that the keynote was from someone outside of the Drupal ecosystem.

TESS: I just really appreciated this particular keynote. A lot of keynotes lately, including one that I gave myself, tended to be a lot more broad-reaching, a lot more big ideas and directions and business policies. And this one was a lot more down to earth, a lot more practical, really put you into the pilot seat of, “okay, you’re going to be an accessibility designer. What’s wrong with this?” And, it was just a wonderful experience, because it really sat you down and made you think about what you were looking at, and it was nice to do that as the first thing in a Camp, because it felt very direct.

IVAN: Glad to hear it. So, what do you think your major takeaway from the keynote was?

TESS: Well, I think the general message that I took away from it was that accessibility is not something that you can just bolt on later. It is a core aspect of the entire design experience, and you should consider it very carefully from the very beginning, because a site can be a lot more versatile than, say, an application can be. And, it has a lot more audiences, and a lot more modalities in which it is presented to different users. And, it was really, really, well communicated.

IVAN: And, further to that, the thing that I always want to try to remind everyone we’re working with, and the people that we help with our sites is, not only is accessibility important to think about from the design aspect and right from the beginning, but it doesn’t stop after you’ve launched a site. It’s something that continues, that all members of the team that are responsible for the site have to be aware of and continue to build on. It’s not something that you just launch as a feature, and you’re done. So, I’m glad to hear that was a good keynote. And, it looks like your session was directly after the keynote, in the same room (laughing). So, did you luck out and have a whole lot of people stay?

TESS: I apparently did have a lot of people staying for that session. I was kind of surprised, actually, about the number of people that attended. I think it was some 50 people that I counted right before I started. And, I know that some people came in after I got started as well, that I didn’t get a chance to count.

IVAN: And you gave away all the TEN7 swag at your session.

TESS: Yea. We were running a little bit late because the keynote ran a little bit long. So, when I first set up, I basically put everything out, and anyone who was an early bird I said, “here, come take. Don’t make me take this back through American security.”

IVAN: (laughing) Yea, we were a little light on swag at this Camp, because of the fact that you were traveling internationally. But, I’m sure we had enough to make some people happy there.

TESS: It all vanished anyway.

IVAN: Yea, that’s what we want. Any particularly interesting questions that came up in your session, that maybe you haven’t heard before?

TESS: So, the thing with my sessions is that very rarely do people actually come up with questions, because once I tend to get started, it’s really hard to get a question in edgewise, because I just have (laughing) such a presentation that is just a firehose of nonstop rambling for almost an hour. And it’s really hard for people to just stop and ask questions. Sometimes people do, but my sessions tend not to get a lot of questions.

IVAN: I think you do a great job of explaining things so clearly with analogies and with detail that, that’s maybe why there aren’t any questions. I certainly appreciate attending those. So, just looking at the other sessions on the schedule, a few that piqued my interest, “The New Face of CiviCRM.” CiviCRM still makes me a little scared, so I’m glad that there’s a new face. “Building Software as a Service in Drupal,” another session I thought was something I might have attended had I been at the Camp. And then, “Drupal as the Base of an Inclusive Workplace,” which was Mike Gifford’s session. It’s an interesting idea. I kind of read the description of the session, the fact that Drupal is largely still known as a CMS, and people really don’t realize that it’s much more than that, especially when you think about accessibility and user experience. You went to that session, right?

TESS: I did. But I went in with the expectation, because I didn’t read the description very well, that it was going to be a little bit more culturally focused, and how to build a more diverse team as a result of using Drupal. And, so, when they started going on the technical merits I was like, “Ahhh”, and it’s totally my fault, I didn’t read the session description very well.

IVAN: Oh. So, what was your takeaway then from that session?

TESS: A lot of it reminded me of the keynote, but it also kept pointing out one thing that was really important: accessibility doesn’t just benefit those who are disabled, because accessibility is not just for those who have a permanent disability, but for those with a temporary or situational disability as well. And, there was a lot of focus on bringing that into the conversation as well.

IVAN: Mike does a great job of being inclusive, and I imagine that was a wonderful session to attend. Did you go to the “Building Software as a Service on Drupal” session?

TESS: I did go to that one. I also, kind of, was hoping this one was going to be a little bit more business focused. It actually was mostly a technical discussion about how to use Aegir, which has been around for the better part of 10 years in Drupal circles, and is still going, and is still a method to provide a Drupal solution as software as a service. And, the next version of Aegir is supposed to finally support more than just Drupal, and virtually any PHP application, and possibly any web application that can be deployed.

IVAN: So that’s how you say it?

TESS: What? Software as a service?

IVAN: No. Aegir. I always wondered about that.

TESS: No, I only remember that because I think I listened to. Was it a Drupal easy podcast, like years ago, half a decade ago, about Aegir? And that was like one of the first things that they were going to talk about was, “how do you pronounce this? It’s got a diphthong in it, why?”

IVAN: (laughing) I want to spend some time talking about this building software as a service session. So, from what I understand, Aegir’s basically a way for you to host your own site, and maybe even sell hosting to others as a service, particularly just Drupal sites. And you said that it would, in the next version, be supporting more than just Drupal sites, but PHP applications as well. Is this the basis for Pantheon? Is this where Pantheon started? I have no idea. How is it similar or different to Pantheon?

TESS: I don’t know if Aegir was actually used in Pantheon at the beginning. I do know that they were using their own home brewed containerized solution, possibly using Xen or KVM at some point, and that they recently transitioned to Google Kubernetes engine in order to run most of their container systems. And primarily the product that they have is a web front end and a pricing tier in order to better leverage all of that usage. And, I’m not sure if they ever really utilized Aegir for that or not.

IVAN: It looked like this was a session that was more in the style of a BOF, the way the description was written. It felt like it was going to be more discussion oriented. Did that turn out to be the case?

TESS: It did turn out to be the case. I was really hoping for a lot more perspective from the business perspective, because it felt like it was very technically focused, very capability focused, as in Aegir can do this, Aegir can do this, this is how you do this. Yes, you can run it on your own hardware, why would you want to do that? And, this is where one of the key things that I took away from the entire Camp started sitting in my mind, is, that, because I’m not in the West, there are different concerns for hosting, and a lot of Canadian companies do not want to rely on any US hosting. And I cannot blame them, considering our utterly lackadaisical privacy laws. And, I’m being generous when I describe it that way.

IVAN: So, what turned out to be the options for Canadian companies for doing hosting, if they’re not going to rely on US technology?

TESS: Well, I think that AWS is now involved, but that’s still a company that’s technically owned and operated from the US, and that might not be as comfortable for people. I actually haven’t had enough time yet, to really investigate the hosting market in Canada. It feels like it needs more development, honestly, is my initial impression. I could be wrong about that. I can guess that there’s probably a lot of on-premise hosting, but not nearly enough, like, cloud-based hosting. And, there might be a lot of shared hosting, as well, that is used by a lot of smaller sites. But I’m really concerned that there’s just not enough cloud hosting, that is also hosted in Canada, in order to make sure that the privacy laws still apply, that the local/regional laws still apply, and that these are actually utilized for Canadian sites. And, this may be a hollow argument if a lot of the Drupal market share is government, because they’ll be more likely to self-host than use cloud products. Although, that made me think the following day, why is it that the Canadian government itself doesn’t form a wholly-owned and operated company that does nothing but hosting and infrastructure providing in a cloud facility? They’ve got to have more than one data center under their ownership already.

IVAN: Yea, that’s a good point. It seems like a market opportunity, that a company like Pantheon or Acquia could certainly take advantage of. But then at the same time, they’re a US company that are operating in Canada, and so, maybe there’s a Pantheon Canada that gets formed, or a company that’s run and operated in Canada by similar or related people to the same US company, and yet they have their own privacy standards and use privacy protocols that are acceptable to the Canadian laws. I think Google has GKE zones that are available in Canada, so in theory, you could potentially do that. I suppose.

TESS: Yea, I think there probably are some GKE zones in Canada as well. I have to look into that to be sure.

IVAN: Maybe we should start a hosting company, Tess.

TESS: I’m all for that.

IVAN: (laughing)

TESS: Ottawa isn’t bad, but I like Toronto more. (laughing)

IVAN: (laughing) Ok. We could be wherever you want.

TESS: This is a thing of mine though. When I used to do business travel a lot, I noticed that I tend to get immediate impressions of places that I touchdown in. It’s really weird, because it doesn’t seem to make any logical sense to me either. But Toronto had a very familiar vibe to what I’m used to in Minneapolis, but there were certain rounded corners that I didn’t have from the same vibe in Minneapolis. And those were probably where a lot more of a Canadian cultural vibe was poking out. And, Ottawa felt very similar to that, but a slower pace. It’s a little bit hard to describe. I wanted to describe it using a music analogy. So, the thing that pops to my mind is that, there is a video game called Undertale that has been around for a while, and towards the end there’s this one area of the game that has very upbeat, fast paced music. But if you take an alternate story path in that game, that same music plays, but in a very slow, lumbering pace instead. And, I didn’t get that exact feeling, but it definitely made me think, “wow, this is the same song, but it’s slightly slower. That’s interesting.” (laughing) I know this is all ridiculously subjective, but that was something that just kept coming up when I was there.

IVAN: It felt familiar and accessible, I would argue. So, no direct flights to Ottawa from Minneapolis. Did you fly through Toronto?

TESS: I did fly through Toronto, and that was actually fairly easy. It’s only a two-hour flight from MSP and you could get a direct. The only problem is, once you’re in Toronto, you have to catch a one hour connecting flight to Ottawa. Now, I didn’t have any problems going to Ottawa, but coming back I just kept getting hit with delay after delay, and it was a little bit frustrating, because we left Ottawa probably 15 minutes later than we were supposed to. I didn’t mind so much, because I had enough time to account for the difference. But then once I landed in Toronto, I failed to remember that the international security procedures had changed since I last traveled internationally. And now you have to go through international customs as an American citizen on the international side, rather than the US side. And that was a Kafkaesque experience to say the least. I felt like I was reenacting the movie Brazil a little bit there.

IVAN: (laughing) Yea, we had that same experience flying through Toronto on the way back from Europe this year, and, it actually made me think of, kind of what laws apply on the Canadian side after you’ve cleared US customs. I know that it’s US law that applies, but that just feels wrong.

TESS: Someone explained to me that the Canadian transit agency, their equivalent of the TSA, is actually a superset of TSA law, which just makes me go, “oh geez.” (laughing) TSA law, if you know even a little bit about it, is already this nightmarish labyrinth of weird edge cases, and political meddling, and none of it makes any sense anymore, and it hasn’t since about 2007, honestly.

IVAN: Yea, it was pretty insane, certainly Kafkaesque as you said, going and clearing customs in Canada for the US, and then physically being in Canada, but technically being in the US after you’ve done that.

TESS: Well, the real hilarious part is the nature of how this works in the Toronto airport. When you actually go through Canadian security at the Toronto airport, and you first get cleared, you’re opened into this wide foyer and it’s got this giant flower sculpture thing, and the underside of each petal is actually your arrival and departure time screens. It’s really nice. And then afterwards I had to walk through there to go to a completely different concourse, and when you get to that concourse you have to go through security again, then you have to go through customs again, then you have to go through the customs waiting area, because they won’t let you go directly to your gate. Then after you go there then you walk to your gate, and by the time I got all the way to the end of my gate, I was on the other side of a glass window, and on the other side of that glass window that I was looking right through, was the giant flower. And that took an hour and a half.

IVAN: (laughing) Wow. Well, I think we have a little more time to talk about the other session that kind of piqued my interest, and you also I think went to it, was the “Journey through the Solr System.” And, the only reason it piqued my interest was because I thought the title of the session was amazing.

TESS: The slides were also great too. They had a really nice visual style that I really appreciated. It made it very fun, but at the same time it focused on information. And the talk itself was also different than I expected. Now, usually when you think of Solr and Drupal, you’re going to think of, well you’re probably going to use a search API implementation, and it’s going to be one site, and you’re going to configure which entities that you’re going to have going to which in that system, and then you’ll use views in order to make your search pages and yada, yada, yada. Well, they couldn’t do that with this solution. The problem is that they have some two hundred different sites, and they had to have a unified singular search mechanism. And it wasn’t a multi-site either, so you couldn’t kind of cheat and use some of that facility in order to populate a single index. So, either they had to come up with a completely custom solution in which any time content was posted for each individual site it went back to a standard search API server, or they’d have to do something completely different. What they used to use, they used to use a Google search appliance, and this was great because it was on premises, all of the data was local, they owned it. And then, suddenly, those yellow boxes stopped arriving from Google because Google deprecated the entire product line. Now you have to forward all of your search index information to some American server, and this is not comfortable for some people, and that is perfectly fair. 
So, they could’ve paid for a different solution, or they could’ve went, “well, we’ll just risk the privacy implications,” but instead they decided, “you know what, let’s see if we can try to build one of these ourselves.” So, the solution they came up with was, a high availability Solr configuration with an open source web crawler called Nutch, and it was just a fascinating combination of elements to make, basically, your own Google, but within your own organization, for your own sites, without having to have a direct backend connection.

IVAN: Nice. I really love that name, Nutch.

TESS: That was a really, really fascinating talk, and I wish that I could’ve captured more of the technical details of that, but I was coming right off of doing my session, so I still had a lot of adrenaline in me. (laughing)

IVAN: Yea, and I’m sure that the session video will be posted once it’s available. Yea let’s talk about that a little bit, and then I think we’ll wrap. So, it looks like there were sessions that were recorded again, courtesy of Kevin Thull and his equipment.

TESS: Well, not quite.

IVAN: Not quite?

TESS: Not quite. Kevin Thull was not there, he was at BADCamp.

IVAN: Oh. But his equipment was there.

TESS: Well, from my understanding what happened is that Kevin Thull trained the DrupalCamp Ottawa staff, and provided them a list of the hardware that he uses for his talks. So, they reimplemented all of that under his guidance, and then ran it themselves, independently. So, it was a very familiar experience. Everyone had the big red button that they had to press. So it was very, very familiar. I do know that they had a few gotchas with the session recording, but they generally had a fairly good capture ratio.

IVAN: That’s wonderful. I do see that on the DrupalCamp Ottawa website they published a playlist on YouTube, and I think there are about six videos on there right now, six sessions that are currently available, with the note that they’ll be adding the rest of the sessions in the coming week or so. So that’s great. We’re going to have a recording of your session, and you can probably go back to the Solr session and check out the details of that one as well. Well, all in all, a good Camp. Something that maybe I’ll consider going to next year, and maybe we’ll send you again next year. Tess, thank you so much for spending your time with me and talking through DrupalCamp Ottawa 2018.

TESS: No problem.

IVAN: You’ve been listening to the TEN7 Podcast. Find us online at ten7.com/podcast. And if you have a second, do send us a message. We love hearing from you. Our email address is [email protected]. Until next time, this is Ivan Stegic. Thank you for listening.

Nov 07 2018
Nov 07
Create Charts in Drupal 8 with Views

There are many ways to present data to your readers. One example would be a table or a list. Sometimes the best approach is to show data on a chart.

A chart can make large quantities of data easier to understand. In Drupal, you can build charts with the help of the Charts module and Views.

In this tutorial, you will learn the basic usage of the module in combination with the Google Charts library. Let’s start!

Step #1. Install the Charts Module and the Library

  • Download and install the Charts module.
  • Click Extend.
  • On the Modules page, enable the Charts module and its Google Charts submodule.
  • Click Install:

click install for Drupal charts

Installation Using Composer (recommended)

If you use Composer to manage dependencies, edit "/composer.json" as follows.

  • Run "composer require --prefer-dist composer/installers" to ensure that you have the "composer/installers" package installed. This package facilitates the installation of packages into directories other than "/vendor" (e.g. "/libraries") using Composer.
  • Add the following to the "installer-paths" section of "composer.json": "libraries/{$name}": ["type:drupal-library"],
  • Add the following to the "repositories" section of "composer.json":
            {
                "type": "package",
                "package": {
                    "name": "google/charts",
                    "version": "45",
                    "type": "drupal-library",
                    "extra": {
                        "installer-name": "google_charts"
                    },
                    "dist": {
                        "url": "https://www.gstatic.com/charts/loader.js",
                        "type": "file"
                    },
                    "require": {
                        "composer/installers": "~1.0"
                    }
                }
            }
    
  • Run:
    composer require --prefer-dist google/charts:45
    
  • You should find that new directories have been created under /libraries
  • Click Configuration > Content authoring > Charts default configuration. 
  • Select Google Charts as the default charting library.
  • Click Save defaults:

select google charts
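Putting the Composer pieces together, the relevant parts of your "composer.json" would look roughly like this (a sketch; merge these entries into your project's existing "require", "extra", and "repositories" sections rather than replacing them):

```json
{
    "require": {
        "composer/installers": "^1.0",
        "google/charts": "45"
    },
    "extra": {
        "installer-paths": {
            "libraries/{$name}": ["type:drupal-library"]
        }
    },
    "repositories": [
        {
            "type": "package",
            "package": {
                "name": "google/charts",
                "version": "45",
                "type": "drupal-library",
                "extra": {
                    "installer-name": "google_charts"
                },
                "dist": {
                    "url": "https://www.gstatic.com/charts/loader.js",
                    "type": "file"
                }
            }
        }
    ]
}
```

With this in place, running "composer install" should place the loader under "/libraries/google_charts".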

Step #2. Create a Content Type for your Drupal Charts

We need some kind of structured data to present in our charts. I’m going to compare the population of all the countries in South America. You can, of course, make your own example.

  • Go to Structure > Content types > Add content type.
  • Create a content type for your Drupal charts

create your content type

  • Add the required fields to match your data:

add required fields

  • At the end, you should have something like this:

the final result

  • Now that you have your content type in place, let's proceed to create the nodes. In this example, each node will be an individual country.

create countries

Step #3. Create the View for your Drupal charts

  • Click Structure > Views > Add view. 
  • Give your view a proper name. 
  • Choose the content type you want to present to your readers.
  • Choose to create a block with a display format Unformatted list of fields. You won’t be able to proceed in this step if you choose Chart due to a small bug in the logic of the module.
  • I’ve chosen 12 items per block because there are 12 countries I want to show in my chart.
  • Click Save and edit:

click save and edit

  • In the FIELDS section of Views UI click Add.
  • Look for the relevant field for your chart and click Add and configure fields.
  • Leave the defaults and click Apply:

add and configure fields

click apply

  • In the FORMAT section click Unformatted list.
  • Choose Chart.
  • Click Apply:

in the format section click apply

  • Select the Charting library in the drop-down. 
  • Select the title as the label field, if it hasn’t been selected already.
  • Check your relevant data field as provided data.
  • Scroll down and change the Legend position to None.
  • Click Apply. 
  • Feel free to play with all the configuration options available here to match the chart you want or need.

play with configuration options

  • Save the View.

Step #4. Place Your Block

  • Click Structure > Block layout.
  • Search for the region you want to place the block in.
  • Click Place block.
  • Search your block and click Place block once again.
  • Click Save blocks at the bottom of the screen and take a look at your site.

look at your site

There you have it - your Drupal chart is live. Of course, if you change the data in one of your nodes, the chart will adjust itself accordingly. If you want to change the chart display, just change it in the Chart settings of your view. 

You can also give the other charting libraries (C3, Highcharts) a try and see what fits your needs best.

As always, thank you for reading! If you want to learn more about Drupal, join OSTraining now. You'll get access to a vast library of Drupal training videos, plus the best-selling "Drupal 8 Explained" book!


About the author

Jorge has lived in Ecuador and Germany. Now he is back in his homeland, Colombia. He spends his time translating from English and German to Spanish. He enjoys playing with Drupal and other Open Source Content Management Systems and technologies.
Nov 07 2018
Nov 07

Every big Drupal release opens fantastic opportunities for websites. Three years ago, the eighth Drupal version came into the world — and the world fell in love with top-notch Drupal 8 improvements. Drupal 8 has been getting more and more awesome on its way from Drupal 8.1 to Drupal 8.5, and the latest version, Drupal 8.6, is cooler still. Drupal 8 is at its peak, but the cycles of development never stop. That’s why the Drupal community has already announced the expected release of Drupal 9 and the end-of-life dates for Drupal 8 and 7. Let’s see what this means for Drupal 8 and 7 website owners and what action they need to take. And, of course, our Drupal team is ready to help them take this action.

The planned release of Drupal 9 and end-of-life for Drupal 8 and 7

A little while after D8 was released, the most impatient and curious users began to ask questions about the future Drupal 9. When will it be released? What will it offer? Will websites need another upgrade?

Despite the fact that the development branch for Drupal 9 was started years ago, no one could know its release year for sure. There were even suggestions that Drupal 9 would never come at all, prompted by Drupal 8's totally new approach to updates and its mode of regular innovation.

However, the situation changed in September 2018 at the worldwide meetup for drupalers, Drupal Europe in Darmstadt. The “phantom” of Drupal 9 finally took shape. D9 will come, and the year is set! Drupal founder Dries Buytaert announced the Drupal community's plans and illustrated them with images:

  • Drupal 9 release: 2020
  • Drupal 8 end-of-life: 2021
  • Drupal 7 end-of-life: 2021

What the future release of Drupal 9 means for Drupal 8 and 7

End-of-life for Drupal versions: hey, what does it mean?

End-of-life, or EOL, is the moment when official support for a particular Drupal version ends. The Drupal team stops maintaining it and creates no more updates and patches for it, including security fixes. This means more vulnerability to hacker attacks and, of course, no new features.

For example, February 24, 2016, was the EOL for Drupal 6. Upgrades from Drupal 6 to Drupal 7 or 8 became a very popular service with our Drupal team, because many customers requested them. By the way, if you are still using Drupal 6, it’s high time to upgrade — better late than never!

Usually the two latest Drupal versions are supported: the newly released one and the previous one. However, in the case of D9, the EOL for D7 comes a little later and for D8 a little earlier than usual — in 2021 for both.

Although the end-of-life sounds a little scary, there is no need to worry. See the next chapters for more information.

What Drupal 9 release means for Drupal 8 websites


Drupal 8 is at the center of the community's attention and is getting valuable technological innovations all the time. Despite the EOL in 2021, the future looks particularly bright for D8 website owners. And here’s why.

Compared to Drupal 7, Drupal 8 is a technological breakthrough. That’s why upgrades from D7 to D8 are often lengthy (depending on the site complexity). But if you have moved to D8 from D7, that was your LAST cumbersome upgrade. No more of that from now on!

Drupal 8 websites will move to D9 quickly and smoothly. Lightning-fast upgrades will be possible for sites that use the latest Drupal 8 minor version and no deprecated APIs. This golden rule of keeping up to date and avoiding deprecated APIs even helps contributed and custom Drupal 8 modules be instantly compatible with Drupal 9! Our Drupal web studio is always ready to take care of this for you.

So an upgrade from D8 to D9 will be something you will barely notice. The secret: following the principle of continuous innovation, Drupal 8 releases a backwards-compatible minor version every six months. Drupal 9 promises to be almost identical to the latest minor version of Drupal 8, with deprecated code removed, as stated in Dries Buytaert's article “Making Drupal upgrades easy forever”.
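To make "avoiding deprecated APIs" concrete, here is one well-known example (a sketch of a single case, not an exhaustive check; the message text is invented):

```php
<?php

// Deprecated since Drupal 8.5 and scheduled for removal in Drupal 9:
drupal_set_message('Settings saved.');

// The Drupal 9-ready replacement uses the messenger service instead:
\Drupal::messenger()->addMessage('Settings saved.');
```

Replacing such calls as the minor releases deprecate them is exactly what keeps a module instantly compatible when Drupal 9 removes the old code.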

What Drupal 9 release means for Drupal 7 websites


We wrote that the future looks bright for Drupal 8 website owners, but it also looks good for Drupal 7 website owners! They just need to take more decisive action. So what should they do, considering that support ends in 2021?

  • They might hope for the commercial support program for Drupal 7 that the community is considering, but relying on it looks like clinging to the past.
  • They might also wait for Drupal 9 and jump directly to it. But they will need a big upgrade someday anyway — to Drupal 8 or 9, which are very close relatives. Meanwhile, time passes, and all that time they could be enjoying Drupal 8 instead of putting their success on the shelf.
  • The best option is to move to Drupal 8 in the near future. Just one upgrade will be their ticket to hassle-free upgrades in the future (to Drupal 9, 10, and beyond). And, of course, they will be in step with the times and have all the Drupal 8 innovations. Our migration experts are ready to smoothly move you to Drupal 8.

So there’s no need for a fortune-teller to predict your future in relation to the release of Drupal 9 ;) The future looks bright for you in any case! The only condition is that you have good drupalers on hand.

Drupal updates and upgrades, as well as Drupal support are among our areas of expertise.

Contact our Drupal team, and let’s choose the best action for your website!

Nov 07 2018
Nov 07

Installing Lando on a Windows machine is easy. Just follow these 30 (or more) simple steps:

  1. Review the directions.
  2. Figure out which version of Windows you are running.
  3. Realize that you need to upgrade to Windows 10 Professional, because apparently you have to pay extra to actually do work on a Windows machine.
  4. Open the Windows Store.
  5. Spend half an hour trying to figure out why the Windows store is only showing blank pages.
  6. Take a break, go vote, spend some time with your kids, and seriously consider buying a Mac so that you don't have to deal with this shit.
  7. Reboot your computer and finally get Windows store to respond.
  9. Pay $100, while updating your account information because everything is three years out-of-date. Do not pass Go.
  9. Reboot your computer twice.
  10. Go to the Lando releases page.
  11. Spend some time looking for the last stable release (note: there is no spoon stable release).
  12. Download and run the latest .exe.
  13. The installer will complain that you don't have Hyper V, which you just paid for.
  14. Find the obscure command you need to enable Hyper V.
  15. Find Powershell in the Start menu.
  16. Discover that you can paste into PowerShell just by right-clicking your mouse. This seems convenient, but it's a trap!
  17. Run the command. It doesn't work.
  18. Learn how to run PowerShell as an administrator.
  19. Run the command, again.
  20. Reboot your computer, again.
  21. Run the .exe, again.
  22. The installer wants to install Docker. Let it.
  23. The Docker installer wants to log you out. Let it.
  24. Log back in.
  25. Open Babun and try the lando command. It isn't found.
  26. Open Powershell and try the lando command. It isn't found.
  27. Open the Command Prompt and try the lando command. It isn't found.
  28. Re-run the Lando installer, for the third time. It turns out that it never finished because Docker logged you out.
  29. Open Powershell and try the lando command.
  30. It works! Congratulations, you are done!*

* Just kidding...

  1. Open PowerShell. Go to the directory where you have your Drupal site.
  2. Run lando init.
  3. Choose the drupal 7 recipe.
  4. Why is it asking for a Pantheon machine token? This isn't a Pantheon site! Hit Ctrl-C.
  5. Log into Pantheon and create a machine token for your Windows machine. Note: Terminus and Lando are notorious for asking for this machine token over and over, so make sure to paste this machine token into a file somewhere, which kind of defeats the entire point of having a machine token.
  6. Run lando init, again.
  7. Right clicking to paste doesn't work for the hidden machine token. So, learn a different way to paste the machine token into PowerShell.
  8. Congratulations, you are done!**

** Just kidding...

  1. Run lando start. Your terminal will proceed to spew error messages for several minutes.
  2. Spend an hour searching through the Lando issue queue trying to find the magic sequence that will fix these errors.
  3. Go to apple.com and start comparing the new MacBook Air to the new Mac Mini. Figure out if you can afford either one so that you don't have to deal with this shit.
  4. Your kids are picking up on your frustration, and everyone is melting down because it is bedtime (and you are anxious about the election).
  5. Give up for the night, and obsessively refresh the election results at fivethirtyeight.com until the results are clear at 11:00 PM.
  6. Get up the next morning and write a satirical article about installing Lando on your Windows machine.

I will let you know if I ever actually get it working.
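For reference, when lando init does succeed it writes a .lando.yml into the project root. A minimal one for the Drupal 7 recipe looks roughly like this (a sketch; the name and webroot are examples, and a Pantheon-based init adds more keys):

```yaml
name: mysite
recipe: drupal7
config:
  webroot: .
  php: '7.2'
```

lando start reads this file to build the matching Docker containers, which is why deleting it (and the ~\.lando directory) gives you a clean slate.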

Update November 19, 2018

I was finally able to get Lando working. Here is what I did:

  1. Deleted the "hidden" directory at ~\.lando.
  2. Uninstalled Docker and Lando with the Windows control panel.
  3. Downloaded and ran the latest version of Lando for Windows, which was lando-v3.0.0-rc.1.exe at the time I was writing this.
  4. The Lando installer also installed Docker, but Docker did not ask to log out this time. Also, Docker asked to install a newer version, which I did not allow.
  5. After a successful install, I used PowerShell to navigate to the directory where I had my Drupal files.
  6. I removed the old version of .lando.yml.
  7. I ran lando init, chose the Drupal 7 recipe, and provided the Pantheon machine token. I knew I was going to need to keep that!
  8. I ran lando start.
  9. During the start, Docker asked for permission to "share" with the C: drive, which I granted. I don't remember having to do that before, but I might have forgotten.
  10. Also, during the start Windows Defender asked what to do about a Docker sub-program that wanted internet access. I definitely don't remember that. So, I gave it permission on both private and public networks, since I suspect that it was crucial and I am on public networks somewhat regularly.
  11. The command ran cleanly, and the site responded in a browser. So, I am finally done. No kidding.
Nov 07 2018
Nov 07

Last September Dropsolid sponsored and attended Drupal Europe. Compared to the North American conferences, getting Europeans to travel to another location is challenging, certainly when there are so many competing conferences of such high quality, such as Drupalcamps, Drupal Dev Days, Frontend United, Drupalaton, Drupaljam, and Drupal Business Days. I’m happy for the team that they succeeded in making Drupal Europe profitable; this is a huge accomplishment and it also sends a strong signal to the market!

Knowing these tendencies, it was amazing to see the huge market fit that Drupal Europe filled. It is also a great sign for Drupal as a base technology and for the growth of Drupal. Hence, for Dropsolid it was a must to attend, help, and sponsor such an event, not only because it helps us gain visibility in the developer community but also because it lets us connect with the latest technologies surrounding the Drupal ecosystem.

The shift to decoupled projects is a noticeable one for Dropsolid, and even the Dropsolid platform is a decoupled Drupal project using Angular as our frontend. Next to that, we had a demo at our booth that showed a web VR environment in our Oculus Rift where the content came from a Drupal 8 application.

People trying our VR-demo at Drupal Europe

On top of that, Drupal Europe was so important to us that our CTO helped the content team as a volunteer, selecting the sessions related to DevOps & Infrastructure. Nick has been closely involved in this area and we’re glad to donate his time to help curate and select quality sessions for Drupal Europe.

None of this would have been possible without the support of our own Government who supports companies like Dropsolid to be present at these international conferences. Even though Drupal Europe is a new concept, it was seen and accepted as a niche conference that allows companies like Dropsolid to get brand awareness and knowledge outside of Belgium. We thank them for this support!

Flanders Investment and Trade logo

From Nick: “One of the most interesting sessions for me was the keynote about the “Future of the open web and open source”. The panel included, next to Dries, Barb Palser from Google, DB Hurley from Mautic, and Heather Burns. From what we gathered, Matt Mullenweg was also supposed to be there, but he wasn’t present. Too bad, as I was hoping to see such a collaboration and discussion. The discussion that struck me the most was about the “creepifying” of our personal data and how this could be reversed. How can one control access to one’s own data, and how can one revoke that access? Just imagine how many companies have your name and email address, and how technology could disrupt that into a world where individuals control what is theirs. I recommend watching the keynote in any case!”

[embedded content]

We’ve also seen how Drupal.org could look with the announced GitLab integration. Speaking as someone who knows this maintenance pain personally, I can’t recall ever being more excited. In-line editing of code is one of the most amazing features. More explanation can be found at https://dri.es/state-of-drupal-presentation-september-2018.

[embedded content]

From Nick: 
“Another session that really caught our eye, and is worthy of a completely separate blog post, is Markus Kalkbrenner's session about advanced Solr. Perhaps to give you some context: I’ve been working with Solr for more than 9 years. I can even prove it with a commit! https://cgit.drupalcode.org/apachesolr_ubercart/commit/?id=b950e78. This session was mind-blowing. Markus used very advanced concepts that I hardly knew existed, let alone had found an application for.

One of the use cases is a per-user sort based on the favorites of a user. The example Markus used was a recipe site where you can rate recipes. Obviously you could sort on the average rating, but what if you want to sort the recipes by “your” rating? This might seem trivial, but it is a very hard problem to solve, as you have to normalize a dataset in Solr, which is by default a denormalized dataset.

Now, what if you want to use this data to get personalized recommendations? This means we have to learn about the user and use this data on the fly to get recommendations based on the votes the user has applied to recipes. Watch how this works in Markus's recording and be prepared to have your mind blown.”

[embedded content]

There were a lot of other interesting sessions and most of them had recordings and their details can be found and viewed at https://www.drupaleurope.org/program/schedule. If you are interested in the future of the web and how Drupal plays an important role in this we suggest you take a look. If you are more into meeting people in real-time and being an active listener there is Drupalcamp Ghent (http://drupalcamp.be) at the 23rd and the 24th of November. Dropsolid is also a proud sponsor of this event.

And an additional tip: Markus’s session will also be presented there ;-)

Nov 07 2018
Nov 07

Last week I attended BADCamp and, as usual, I can confirm firsthand that BADCamp continues to be a blast. I will mention some of the reasons why.

The Summits

I had a chance to attend the DevOps and Front-end Summits, half a day each. During these summits, participants shared their experiences with the tools and techniques they use regularly while working with clients. At the DevOps Summit, it was great to hear that a lot of developers are interested in Kubernetes. It was also interesting to hear conversations about CI/CD workflows and the different tools used when building disposable instances per PR and deployments per branch, which is something we are already working on in weKnow’s client projects.

The sessions

It is safe to say that GatsbyJS stole the show. The event included three back-to-back sessions about GatsbyJS, and people were eager to learn more about the buzzword:

All the great minds behind the Gatsby presentations at @BADCamp https://t.co/waarKr3uZj pic.twitter.com/iwvGWRayPb

— Gatsby (@gatsbyjs) October 26, 2018

Other recurring and interesting topics mentioned in different sessions during the event:

  • Design systems and Pattern Lab.
  • The new Drupal Layout Builder.

My session

I had an opportunity to speak at the event. The title of my session was “How To Keep Drupal Relevant In The Git-based and API-driven CMS Era”. Yes, I was presenting one of the sessions related to GatsbyJS. Check out the slides here and feel free to watch the recording as well:

[embedded content]

Feel free to ask any questions using the comment section of this blog post, or mention me directly on Twitter at @jmolivas.

The party

As usual, the BADCamp vibes were incredible, the party was as great as the event itself, and the weKnow team had an amazing time.

badcamp-party

Thank you Platform.sh for sponsoring the event.

The after-party

Well, the first rule of the after-party is not to talk about the after-party. So next year jump into the bus at midnight and join the after-party (only if you want to have fun).

See you at BADCamp 2019 or maybe sooner at DrupalCamp Atlanta 2018

Nov 06 2018
Nov 06

I’ve been running a lot lately, and so have been listening to lots of podcasts! Which is how I stumbled upon this great episode of the Lullabot podcast recently — embarrassingly one from over a year ago: “Talking Performance with Pantheon’s David Strauss and Josh Koenig”, with David and Josh from Pantheon and Nate Lampton from Lullabot.

(Also, I’ve been meaning to blog more, including simple responses to other blog posts!)

Interesting remarks about BigPipe

Around 49:00, they start talking about BigPipe. David made these observations around 50:22:

I have some mixed views on exactly whether that’s the perfect approach going forward, in the sense that it relies on PHP pumping cached data through its own system which basically requires handling a whole bunch of strings to send them out, as well as that it seems to be optimized around this sort of HTTP 1.1 behavior. Which, to compare against HTTP 2, there’s not really any cost to additional connections in HTTP 2. So I think it still remains to be seen how much benefit it provides in the real world with the ongoing evolution of some of these technologies.

David is right; BigPipe is written for a HTTP 1.1 world, because BigPipe is intended to benefit as many end users as possible.

And around 52:00, Josh then made these observations:

It’s really great that BigPipe is in Drupal core because it’s the kind of thing that if you’re building your application from scratch that you might have to do a six month refactor to even make possible. And the cache layer that supports it, can support lots other interesting things that we’ll be able to develop in the future on top of Drupal 8. […] I would also say that I think the number of cases where BigPipe or ESI are actually called for is very very small. I always whenever we talk about these really hot awesome bleeding-edge cache technologies, I kinda want to go back to what Nate said: start with your Page Cache, figure out when and how to use that, and figure out how to do all the fundamentals of performance before even entertaining doing any of these cutting-edge technologies, because they’re much trickier to implement, much more complex and people sometimes go after those things first and get in over their head, and miss out on a lot of the really big wins that are easier to get and will honestly matter a lot more to end users. “Stop thinking about ESI, turn on your block cache.”

Josh is right too, BigPipe is not a silver bullet for all performance problems; definitely ensure your images and JS are optimized first. But equating BigPipe with ESI is a bit much; ESI is indeed extremely tricky to set up. And … Drupal 8 has always cached blocks by default. :)

Finally, around 53:30 David cites another reason to stress why more sites are not handling authenticated traffic:

[…] things like commenting often move to tools like Disqus and whether you want to use Facebook or the Google+ ones or any one of those kind of options; none of those require dynamic interaction with Drupal.

Also true, but we’re now seeing the inverse movement, with the increased skepticism of trusting social media giants, not to mention the privacy (GDPR) implications. Which means sites that have great performance for dynamic/personalized/uncacheable responses are becoming more important again.

BigPipe’s goal

David and Josh were being constructively critical; I would expect nothing less! :)

But in their description and subsequent questioning of BigPipe, I think they forget its two crucial strengths:

  1. BigPipe works on any server, and is therefore available to everybody. In other words: no infrastructure (changes) required!
  2. It works for many things out of the box, including for example every uncacheable Drupal block!

Bringing this optimization that sits at the intersection of front-end & back-end performance to the masses rather than having it only be available for web giants like Facebook and LinkedIn is a big step forward in making the entire web fast.

Using BigPipe does not require writing a single line of custom code; the module effectively progressively enhances Drupal’s HTML rendering — and turned on by default since Drupal 8.5!

Conclusion

Like Josh and David say: don’t forget about performance fundamentals! BigPipe is no silver bullet. If you serve 100% anon traffic, BigPipe won’t make a difference. But for sites with auth traffic, personalized and uncacheable blocks on your Drupal site are streamed automatically by BigPipe, no code changes necessary:

[embedded content]

(That’s with 2 slow blocks that take 3 s to render. Only one is cacheable. Hence the page load takes ~6 s with cold caches, ~3 s with warm caches.)

Nov 06 2018
Nov 06

By Jesus Manuel Olivas, Head of Products | November 06, 2018

During this year, at several events (SANDCamp, DrupalCamp LA, DrupalCon Nashville, and DrupalCamp Colorado), I had the chance to talk about and show how we at weKnow approach the development of API-driven applications. For those of you who use Drupal, this is something like decoupled or headless Drupal, but without the Drupal part.

This article outlines weKnow’s approach and provides some insight into how we develop web applications.

Yes, this may sound strange, but whenever we need to build an application that is not content-centric, we use Symfony instead of Drupal. What are those cases? Whenever we do not require the out-of-the-box functionality that Drupal offers, such as content management, content revision workflows, field widgets/formatters, views, and managing data structures from the UI (content types).

Why we still use PHP.

We know the language pretty well: we have extensive experience working with PHP, Drupal, and Symfony, and we decided to take advantage of that knowledge to build API-driven applications.

Why the API Platform.

API Platform is a REST and GraphQL framework that helps you build modern API-driven projects. It provides an API component that includes Symfony 4, Flex, and Doctrine ORM. It also provides client-side components and an Admin based on React, as well as a Docker configuration ready to start up your project with a single command, allowing you to take advantage of thousands of existing Symfony bundles and React components.

Wrapping up

Our developers' expertise across different technologies has given us the advantage of a great time to market while developing client projects. We also like sharing: if you want to see this session live, probably for the last time, you should join me at DrupalCamp Atlanta.

You can watch the video from DrupalCon Nashville on the Drupal Association's YouTube channel here:

[embedded content]

You can find the latest version of the slides from DrupalCamp LA here.

Nov 06 2018
Nov 06

Pattern Lab (PL) is a well-known open-source project for generating a design system for your site. Over the last two years it has gotten a lot of attention in the Drupal community. It's a great way to implement a design system in your front-end workflow.

The following post describes how our client (the City and County of San Francisco) began to implement a pattern library that will eventually be expanded upon and re-used for other agency websites across the SF.gov ecosystem.

USWDS.

Using the U.S. Web Design System (USWDS) until their own pattern library was ready for prime time was a client requirement.

USWDS uses a pattern library system called Fractal. I think Fractal is a great idea, but it lacked support for Twig, the template engine used by Drupal. Out of the box, Fractal uses Handlebars (a JavaScript templating engine), and though the template language in Fractal can be customized, I wasn’t able to make it work with Twig macros and iterations, even with the use of twig.js.

Creating the Pattern Lab

Ultimately, I decided to start from scratch. In addition to the USWDS requirement, the client also needed to be able to reuse this pattern library on other projects. I used the Pattern Lab Standard Edition for Twig, which among other things means that you need PHP on the command line in order to "compile" or generate the pattern library.

I added a gulpfile in charge of watching for changes in the PL source folders. Once a Twig, Sass, or JavaScript file changed, the pattern library was regenerated.

Generating the Pattern Library

I also needed Gulp to watch for file changes.

The following is a SIMPLE example of the Gulp setup that generates the PL by watching those folders. This code snippet shows the config object containing arrays of folder globs.


{
  "css": {
    "file" : "src/sass/_all.scss",
    "src": [
      "pattern-lab/source/_patterns/*.scss",
      "pattern-lab/source/_patterns/**/*.scss",
      "pattern-lab/source/scss/*.scss"
    ],
    "pattern_lab_destination": "pattern-lab/public/css",
    "dist_folder": "dist/css"
  },
  "js": {
    "src": [
      "pattern-lab/source/js/*.js",
      "pattern-lab/source/js/**/*.js"
    ]
  }
}
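For reference, a gulpfile can simply load that JSON and hand the glob arrays to the watch tasks. Here is a minimal, dependency-free sketch; the `config.json` filename and layout are assumptions, and the JSON is inlined so the snippet is self-contained:

```javascript
// In a real gulpfile you would load the file shown above with:
//   const config = require('./config.json');
// Here the same JSON is parsed inline so the snippet runs on its own.
const config = JSON.parse(`{
  "css": {
    "file": "src/sass/_all.scss",
    "src": [
      "pattern-lab/source/_patterns/*.scss",
      "pattern-lab/source/_patterns/**/*.scss",
      "pattern-lab/source/scss/*.scss"
    ],
    "pattern_lab_destination": "pattern-lab/public/css",
    "dist_folder": "dist/css"
  },
  "js": {
    "src": [
      "pattern-lab/source/js/*.js",
      "pattern-lab/source/js/**/*.js"
    ]
  }
}`);

// Collect every glob the watch task needs, regardless of asset type.
const allWatchGlobs = [...config.css.src, ...config.js.src];
console.log(allWatchGlobs.length); // 5 globs to hand to gulp.watch()
```

Keeping the globs in a single JSON file means new pattern folders only need to be added in one place.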

And the following is a typical watch task in Gulp:


gulp.task('watch', function () {
    gulp.watch(config.js.src, ['legacy:js']);
    gulp.watch(config.css.src, ['pl:css']);
    gulp.watch(config.pattern_lab.src, ['generate:pl']);
    gulp.watch(config.pattern_lab.javascript.src, ['generate:pl']);
});


The following task is in charge of generating the pattern library with PHP:


gulp.task('pl:php', shell.task('php pattern-lab/core/console --generate'));

Please NOTE that this is an oversimplified example.

Sass

Having generated the Pattern Library, I realized that in order to use this Pattern Lab in my Drupal theme, I needed to generate a single CSS file and a single JavaScript (JS) file.
The main Sass file imports all the Sass code from USWDS by using `@import` statements.
I required the USWDS source Sass with npm and imported the source files directly from the node_modules folder:



//pattern-lab/source/scss/components.scss

// Styles basic HTML elements
@import '../../../node_modules/uswds/src/stylesheets/elements/buttons';
@import '../../../node_modules/uswds/src/stylesheets/elements/embed';

Then I imported the scss files that were inside my pattern elements:

// Styles inside patterns.
@import "../_patterns/00-protons/*.scss";
@import "../_patterns/01-atoms/**/*.scss";
@import "../_patterns/02-molecules/**/*.scss";
@import "../_patterns/03-organisms/**/*.scss";
@import "../_patterns/04-templates/**/*.scss";
@import "../_patterns/05-pages/**/*.scss";


All the styles were dumped into a single file called components.css.

With this single CSS file created, I was able to use the USWDS CSS classes along with the new ones.
I had to add a /dist folder where the transpiled Sass would live and be committed for later use in the Drupal theme.

JavaScript

I did something similar for JavaScript. The biggest challenge was to compile the USWDS JavaScript files exactly as they were. I resorted to copying all the USWDS JavaScript source into the src folder of the pattern library, set a watcher specifically for the USWDS JavaScript, and added another watcher for the new Pattern Lab JavaScript:

Example:

In the following example I compile all the JS that lives inside the components into a single file.

The resulting file is copied to ./pattern-lab/public/js, which is the folder Pattern Lab reads when working on the Pattern Lab alone.
The other copy of the file goes to the distribution folder ./dist/pl/js, which is the one I use in my Drupal theme.


// Component JS.
// -------------------------------------------------------------------- //
// The following task concatenates all the JavaScript files inside the
// _patterns folder, if new patterns need to be added the config.json array
// needs to be edited to watch for more folders.

gulp.task('pl:js', () => {
    return gulp.src(config.pattern_lab.javascript.src)
        .pipe(sourcemaps.init())
        .pipe(babel({
            presets: ['es2015']
        }))
        .pipe(concat("components.js"))
        .pipe(sourcemaps.write())
        .pipe(gulp.dest('./pattern-lab/public/js'))
        .pipe(gulp.dest('./dist/pl/js'));
});

The resulting files:


--/dist/
--/dist/css/components.css
--/dist/js/components.js

were included in the HEAD of my Pattern Lab by editing pattern-lab/source/_meta/_00-head.twig.

I included the following lines:


<link rel="stylesheet" href="../../css/components.css" media="all">
<script src="../../js/dist/uswds.min.js"></script>
<script src="../../js/dist/components.js"></script>

Please refer to the repo if you need the details of the integration: GitHub - SFDigitalServices/sfgov-pattern-lab: SFGOV Pattern Lab

Integrating the Pattern Lab with Drupal.

Composer and libraries:

I used the following plugin:


composer require oomphinc/composer-installers-extender

This plugin allowed me to put the pattern library in a folder other than vendor.
Then I added some configuration to the composer.json.

Under extra, I specified where Composer should install repositories of type github:


"extra": {
        "installer-paths": {
            "web/libraries/{$name}": ["type:github"],
          }

Then under repositories I set the type:github


"repositories": {
        "github": {
            "type": "package",
            "package": {
                "name": "sf-digital-services/sfgov-pattern-lab",
                "version": "master",
                "type": "drupal-library",
                "source": {
                    "url": "https://github.com/SFDigitalServices/sfgov-pattern-lab.git",
                    "type": "git",
                    "reference": "master"
                }
            }
        }
    }

and required the package under require. As you can see, the name matches the name declared in the github repository above:


"require": {
   "sf-digital-services/sfgov-pattern-lab": "dev-master",
}

A composer update should clone the github repo and place the Pattern Lab in the following path, relative to the Drupal web folder:

/web/libraries/sfgov-pattern-lab

Components Libraries

The Component Libraries module was especially important because it allowed me to map the Pattern Lab components easily into my theme.

Then I had to map my Pattern Lab components with the Drupal theme:

The Drupal Theme:

I created a standard Drupal theme:

The sfgovpl.info.yml file:

In the following part of the sfgovpl.info.yml file I connected the Pattern Lab Twig files to Drupal:


component-libraries:
  protons:
    paths:
      - ../../../libraries/sfgov-pattern-lab/pattern-lab/source/_patterns/00-protons
  atoms:
    paths:
      - ../../../libraries/sfgov-pattern-lab/pattern-lab/source/_patterns/01-atoms
  molecules:
    paths:
      - ../../../libraries/sfgov-pattern-lab/pattern-lab/source/_patterns/02-molecules
  organisms:
    paths:
      - ../../../libraries/sfgov-pattern-lab/pattern-lab/source/_patterns/03-organisms
  templates:
    paths:
      - ../../../libraries/sfgov-pattern-lab/pattern-lab/source/_patterns/04-templates
  pages:
    paths:
      - ../../../libraries/sfgov-pattern-lab/pattern-lab/source/_patterns/05-pages

libraries:
  - sfgovpl/sfgov-pattern-lab

The sfgovpl.libraries.yml file:

In the last line of the previous code example, you can see that I required the sfgov-pattern-lab library, which includes the files compiled by Gulp in my Pattern Lab.


sfgov-pattern-lab:
  css:
    base:
      '/libraries/sfgov-pattern-lab/dist/css/components.css': {}
  js:
    '/libraries/sfgov-pattern-lab/dist/pl/js/components.js': {}

Using the Twig templates in our theme:

The following is an example of how to use a molecule from the pattern library into the Drupal theme:

You can include the @molecules/08-search-results/03-topic-search-result.twig template like this:

Pattern Lab twig:

node--topic--search-index.html.twig


<div class="topic-search-result">
<div class="topic-search-result--container">
<div class="content-type"><i class="sfgov-icon-filefilled"></i><span>{{ content_type }}</span></div>
<a class="title-url" href="{{ url }}"><h4>{{ title }}</h4></a>
<p class="body">{{ body|striptags('<a>')|raw }}</p>
</div>
</div>

Drupal template:

The following example calls the Pattern Lab molecule originally located at web/libraries/sfgov-pattern-lab/pattern-lab/source/_patterns/02-molecules/08-search-results/03-topic-search-result.twig, but thanks to the Component Libraries module we can call it simply as @molecules/08-search-results/03-topic-search-result.twig:


{# Set variables to use in the component. #}
{% set url = path('entity.node.canonical', {'node': elements['#node'].id()  }) %}
{% set description = node.get('field_description').getValue()[0]['value'] %}
{% set type = node.type.entity.label %} {# content type #}
{# Including the molecule from our Pattern Lab. #}

{% include "@molecules/08-search-results/03-topic-search-result.twig" with {
  "content_type": type,
  "url": url,
  "title": elements['#node'].get('title').getString(),
  "body": description
} %}

Recommendations

Initial development of the SFGOV Pattern Lab was done in large part by Chapter Three, and this post is intended to show the core concepts of decoupling your Pattern Lab from your Drupal theme.

You can find the full code implementation for the pattern library and Drupal at the following URLs:

SFDigitalServices/sfgov and the Pattern Lab here: SFDigitalServices/sfgov-pattern-lab

You should try the value module; it is great for extracting values in Drupal Twig templates and connecting them with your Pattern Lab Twig templates.

Also give the UI Patterns module a try; it looks promising and like a great solution for decoupled pattern libraries.

Nov 06 2018
Nov 06

This is part 2 in this series that explores how to use paragraph bundles to store configuration for dynamic content. The example I built in part 1 was a "read next" section, which could then be added as a component within the flow of the page. The strategy makes sense for component-based sites and landing pages, but probably less so for blogs or content heavy sites, since what we really want is for each article to include the read next section at the end of the page. For that, a view that displays as a block would perfectly suffice. In practice, however, it can be really useful to have a single custom block type, which I often call a "component block", that has an entity reference revisions field that we can leverage to create reusable components.

This strategy offers a simple and unified interface for creating reusable components and adding them to sections of the page. Combined with Pattern Lab and the block visibility groups module, we get a pretty powerful tool for page building and theming.

The image below captures the configuration screen for the "Up next" block you can find at the bottom of this page. As you see, it sets the heading, the primary tag, and the number of items to show. Astute readers might notice, however, that there is a small problem with this implementation. It makes sense if all the articles are about Drupal, but on sites where there are lots of topics, having a reusable component with a hard-coded taxonomy reference makes less sense. Rather, we'd like the related content component to show content that is actually related to the content of the article being read.

For the purpose of this article, let's define the following two requirements: first, if the tagged content component has been added as a paragraph bundle to the page itself, then we will respect the tag supplied in its configuration. If, however, the component is being rendered in the up next block, then we will use the first term the article has been tagged with.

To do that, we need three things: 1) we need our custom block to exist and to have a delta that we can use, 2) we need a preprocess hook to assign the theme variables, and 3) we need a Twig template to render the component. If you're following along in your own project, go ahead and create the component block now. I'll return momentarily to a discussion of custom blocks and the config system.

Once the up next block exists, we can create the following preprocess function:

function component_helper_preprocess_block__upnextblock(&$variables) {
  if ($current_node = \Drupal::request()->attributes->get('node')) {
    $variables['primary_tag'] = $current_node->field_tags->target_id;
    $variables['nid'] = $current_node->id();
    $paragraph = $variables['content']['field_component_reference'][0]['#paragraph'];
    $variables['limit'] = $paragraph->field_number_of_items->getValue()[0]['value'];
    $variables['heading'] = $paragraph->field_heading->getValue()[0]['value'];
  }
}

If you remember from the first article, our tagged content paragraph template passed those values along to Pattern Lab for rendering. That strategy won't work this time around, though, because theme variables assigned to a block entity, for example, are not passed down to the content that is being rendered within the block.

You might wonder if it's worth dealing with this complexity, given that we could simply render the view as a block, modify the contextual filter, place it and be done with it. What I like about this approach is the flexibility it gives us to render paragraph components in predictable ways. In many sites, we have 5, 10 or more component types. Not all (or even most) of them are likely to be reused in blocks, but it's a nice feature to have if your content strategy requires it. Ultimately, the only reason we're doing this small backflip is because we want to use the article's primary tag as the argument, rather than what was added to the component itself. In other component blocks (an image we want in the sidebar, for example) we could simply allow the default template to render its content.

In the end, our approach is pretty simple: Our up next block template includes the paragraph template, rather than the standard block {{ content }} rendering. This approach makes the template variables we assigned in the preprocess function available:

{% include "@afro_theme/paragraphs/paragraph--tagged-content.html.twig" %}

A different approach to consider would be adding a checkbox to the tagged content configuration, such as "Use page context instead of a specified tag". That would avoid us having an extra hook and template. Other useful configuration fields we've added for dynamic components include whether the query should require all tags or any tag when multiple are assigned, and the ability to specify whether the related content should exclude duplicates (useful when you have several dynamic components on a page but don't want them to include the same content).

As we wrap up, a final note about custom blocks and the config system. The approach I've been using for content entities that also become config (which is the case here) is to first create the custom block in my local development environment, then export the config and remove the UUID from it while copying the plugin UUID. You can then create an update hook that creates the block content before the config gets imported:

/**
 * Adds the "up next" block for posts.
 */
function component_helper_update_8001() {
  $blockEntityManager = \Drupal::service('entity.manager')
    ->getStorage('block_content');

  $block = $blockEntityManager->create(array(
    'type' => 'component_block',
    'uuid' => 'b0dd7f75-a7aa-420f-bc86-eb5778dc3a54',
    'label_display' => 0,
  ));

  $block->info = "Up next block";

  $paragraph = Drupal\paragraphs\Entity\Paragraph::create([
    'type' => 'tagged_content',
    'field_heading' => [
      'value' => 'Up next'
    ],
    'field_number_of_items' => [
      'value' => '3'
    ],
    'field_referenced_tags' => [
      'target_id' => 1,
    ]
  ]);

  $paragraph->save();
  $block->field_component_reference->appendItem($paragraph);
  $block->save();
}

Once we deploy and run the update hook, we're able to import the site config, and our custom block should render on the page. Please let me know if you have any questions or feedback in the comments below. Happy Drupaling.
 

Nov 06 2018
Nov 06

Drupal Modules: The One Percent — Admin Denied (video tutorial)

[embedded content]

Episode 51

Here is where we bring awareness to Drupal modules running on less than 1% of reporting sites. Today we'll investigate Admin Denied, a module which prevents you from accessing the super user's account.

Nov 06 2018
Nov 06
Jody's desk

Hardware

After a long run on MacBook Pros, I switched to an LG Gram laptop running Debian this year. It’s faster, lighter, and less expensive. 

If your development workflow now depends on Docker containers running Linux, the performance benefits you’ll get with a native Linux OS are huge. I wish I could go back in time and ditch Mac earlier.

Containers

For almost ten years I was doing local development in Linux virtual machines, but in the past year, I’ve moved to containers as these tools have matured. The change has also come with us doing less of our own hosting. My Zivtech engineering team has always held the philosophy that you need your local environment to match the production environment as closely as possible. 

But in order to work on many different projects and accomplish this in a virtual machine, we had to standardize our production environments by doing our own hosting. A project that ran on a different stack or just different versions could require us to run a separate virtual machine, slowing down our work. 

As the Drupal hosting ecosystem has matured (Pantheon, Platform.sh, Acquia, etc.), doing our own hosting began to make less sense. As we diversified our production environments more, container-based local development became more attractive, allowing us to have a more light-weight individualized stack for each project.

I’ve been happy using the Lando project, a Docker-based local web development system. It integrates well with Pantheon hosting, automatically making my local environment very close to the Pantheon environments and making it simple to refresh my local database from a Pantheon environment. 

Once I fully embraced containers and switched to a Linux host machine, I was in Docker paradise. Note: you do not need a new machine to free yourself from OSX. You can run Linux on your Mac hardware, and if you don’t want to cut the cord you could try a double boot.

Philadelphia City Hall outside Jody's office
A cool office view (like mine of Philly’s City Hall) is essential for development mojo

Editor

In terms of editors/IDEs I’m still using Sublime Text and vim, as I have for many years. I like Sublime for its performance, especially its ability to quickly search projects with 100,000 files. I search entire projects constantly. It’s an approach that has always served me well. 

I also recommend using a large font size. I’m at 14px. With a larger font size, I make fewer mistakes and read more easily. I’m not sure why most programmers use dark backgrounds and small fonts when it’s obvious that this decreases readability. I’m guessing it’s an ego thing.

Browser

In browser news, I’m back to Chrome after a time on Firefox, mainly because the LastPass plugin in Firefox didn’t let me copy passwords. But I have plenty of LastPass problems in any browser. When working on multiple projects with multiple people, a password manager is essential, but LastPass’s overall crappiness makes me miserable.

Wired: Linux, git, Docker, Lando
Tired: OSX, Virtual machines, small fonts
Undesired: LastPass, egos

Terminal

I typically only run the browser, the text editor, and the terminal, a few windows of each. In the terminal, I’m up to 16px font size. Recommend! A lot of the work I do in the terminal is running git commands. I also work in the MySQL CLI a good deal. I don’t run a lot of custom configuration in my shell – I like to keep it pretty vanilla so that when I work on various production servers I’m right at home.

Terminal screenshot

Git

I get a lot of value out of my git mastery. If you’re using git but don’t feel like a master, I recommend investing time into that. With basic git skills you can quickly uncover the history of code to better understand it, never lose any work in progress, and safely deploy exactly what you want to.

Once I mastered git I started finding all kinds of other uses for it. For example, I was recently working on a project in which I was scraping a thousand pages in order to migrate them to a new CMS. At the beginning of the project, I scraped the pages and stored them in JSON files, which I added to git.  At the end of the project, I re-scraped the pages and used git to tell me which pages had been updated and to show me which words had changed. 

On another project, I cut a daily import process from hours to seconds by using git to determine what had changed in a large inventory file. On a third, I used multiple remotes with Jenkins jobs to create a network of sites that run a shared codebase while allowing individual variations. Git is a good friend to have.

Hope you found something useful in my setup. Have any suggestions on taking it to the next level?
 

Nov 06 2018
Nov 06

VisualN provides an interface to check "how it works" for any drawer available on the site. To see the list of drawers, go to the VisualN -> Available Drawers Preview menu item.

Available drawers list

Though VisualN allows you to use any resource type as a data source (e.g. CSV or XLS files, or views), for demo purposes it is enough to have some dummy data. Such data can be obtained from data generators. Data generators are simple plugins that return an array of data (which is just another resource type) of a given structure that can be used by certain drawers (e.g. the Leaflet drawer uses lat, lon, and title data fields).

Available drawers list

Data generators may also provide info about the drawer or drawers that can use the generated data. Those drawers and data generators are considered compatible. Drawers highlighted in green have compatible data generators.

There are several use cases where you may want to use the Available Drawers Preview UI:

  • check drawer in action, examine configuration form settings
  • set configuration values to create a visualization style
  • use the preview UI to help with drawer development and to test changes
  • check data format used by drawers (e.g. using table drawer)

Linechart Basic drawer preview

Choose "preview" for any of the green-highlighted drawers. The preview will open:

Linechart drawer preview

In the example presented above, the visualization is an interactive area with line charts: values are shown for the point under the mouse, data series can be toggled on and off, and so on. These are basic capabilities of this drawer's logic, but the VisualN module doesn't limit the set or complexity of such capabilities.

The drawer exposes a set of configuration parameters to the user in a UI (the drawer config).

The preview also provides a UI for configuring data generator parameters. Different drawers can use different data generators. The VisualN module uses the term "resource" for an ordered set of data used to make a drawing; data generators are one kind of resource provider.

To explore a drawer's possibilities, try different combinations of configuration parameters and press the "redraw" button.

Creating Visualization styles

When you are done configuring, use the Create style button. VisualN uses drawing styles to keep a drawer's set of parameters for use across multiple drawings. Changing a style's configuration changes the parameters for all drawings with that style, wherever they are located in the project.

Once the style is created, name it and set its parameters:

Creating drawing style

More about styles can be found in the other articles in this series.

Using Table Html Basic drawer to inspect data format

Sometimes you may not be sure about the data format used by a drawer. This can happen when the format is not obvious by itself and no description is provided by the developer. If you are a developer yourself, that still shouldn't be a problem; otherwise, there should be some way to guess the required format.

In most cases, when a drawer uses data in the form of a plain array (a plain table) and has a compatible generator, the Table Html Basic drawer can be used to guess the format. It will output any data provided by the data generator.

By default, Table Html Basic drawer uses data generator that generates random strings.

HTML Table drawer preview

On the other hand, if it doesn't work for some drawer's data generator, the generator obviously returns data with a more complex structure, e.g. trees or nested lists.

Thus, the Drawer Preview functionality is useful for both content managers and developers.

Contents of the series

  1. Getting started
  2. Using available drawers preview
  3. Creating visualization styles
  4. Creating drawing (entity) types
  5. Creating interactive content in CKEditor with VisualN Embed
  6. Configuring VisualN Embed integration with CKEditor
  7. Using VisualN Excel module for Excel files support
  8. Sharing embedded drawings across sites
  9. etc.
Nov 06 2018
Nov 06

Agiledrop is highlighting active Drupal community members through a series of interviews. Learn who are the people behind Drupal projects.

This week we talked with David Valdez. Read about the impact Drupal has made on him, which contribution he is proudest of, and what Drutopia is.

1. Please tell us a little about yourself. How do you participate in the Drupal community and what do you do professionally?

I’ve been doing web development for fourteen years, and Drupal for the last eight.

I currently work for Agaric, which is a worker-owned cooperative. This allows us to make decisions about the cooperative democratically. Equally important is that we support one another, not just professionally but personally as well.

Agaric is involved in several Drupal projects, including Drupal Training Days, Sprint Weekends, and other local events. You can learn more here.

2. When did you first come across Drupal? What convinced you to stay, the software or the community, and why?

The first time I used Drupal, I faced the well-known steep learning curve. In the beginning, I disliked how difficult the CMS seemed, but later, when I started to understand why things were done the way they are, I began to appreciate all the cool things you can do with it, how well thought out the subsystems are, and how dramatically Drupal improves from one version to the next.

And later, when I had questions about specific problems or bugs, I found many talented people working on the project and giving support. It was amazing, and I felt motivated to contribute back to the community. In this way, I learned a ton of new things while helping other people at the same time.

3. What impact has Drupal made on you? Is there a particular moment you remember?

Drupal gave a new direction to my career. At the time, I was working with several different technologies and frameworks. Drupal motivated me to become a specialist, so I left my job and sought out an opportunity to work in a Drupal shop, where I could spend more time improving my Drupal skills.

With that in mind, I travelled to DrupalCon Austin in 2014 (it was my first time in the USA), and I was convinced that I wanted to work in a Drupal shop to be more involved in the project.

4. How do you explain what Drupal is to other, non-Drupal people?

Firstly, I usually try to explain what Free Software is about, how it allows projects like Drupal to become so good, and how it helps many people.

5. How have you seen Drupal evolve over the years? What do you think the future will bring?

Drupal has always been considered a Content Management Framework, and I believe Drupal 8 is following this path to become one of the most solid options for building any project.

6. What are some of the contributions to open source code or the community that you are most proud of?

There are a few contributions to core that allowed me to take part in the whole process of fixing a bug in Drupal 8.

For instance, in Drupal 8.1 the permalinks on comments were broken, so I helped write the patch, discussed the changes, and wrote the tests to make sure this bug won’t happen again.

I learned by reading the feedback from other, more experienced developers, and at the same time, I understood how Drupal works (at least in the parts related to the bug).

The same happened with a bug in the migrations and the REST module.

Learning from those issues helped me contribute fixes for other, smaller core bugs and for bugs in several contributed modules, from porting small modules such as Image Resize Filter to contributing to well-known modules such as Migrate Plus.

7. Is there an initiative or a project in Drupal space that you would like to promote or highlight?

Yes, we at Agaric have been working on Drutopia (https://www.drutopia.org), which is a series of Drupal distributions for nonprofits and other low-budget groups. 

8. Is there anything else that excites you beyond Drupal? Either a new technology or a personal endeavor.

I live in Mexico and I’m a member of a PHP group (https://phpmexico.mx), where we talk about good practices, help each other improve our skills, and stay informed about other cool technologies.
 

Nov 06 2018
Nov 06

The TWG coding standards committee is announcing two issues for final discussion. Feedback will be reviewed on 11/13/2018.

New issues for discussion:

Needs love

Interested in helping out?

You can get started quickly by helping us to update an issue summary or two or dive in and check out the full list of open proposals and see if there's anything you'd like to champion!


Nov 05 2018
Nov 05

By Manuel Santibanez, Front-end Developer | November 05, 2018


weKnow gave me the opportunity to attend my first BADCamp as part of the team that represented the company at this awesome event.

On the first day, I attended the Drupal Frontend Summit, a roundtable format that I had not experienced before. It was very rewarding to discuss my experience as a developer who has worked with accessibility guidelines, and to share the tools and strategies that I have used to implement such an important standard.

Lots of great sessions shared valuable knowledge that allowed me to leave BADCamp as a better developer!

My top picks:

Without a doubt, the new kid on the block was Gatsby, a piece of technology that takes the development of sites to a new level.

In my opinion, and I may be a little biased here, one of the best talks of the camp was "How to keep Drupal relevant in the Git-based and API-driven CMS era", given by Jesus Manuel Olivas. This session opened a great discussion about Drupal’s vision, touching on how Drupal can integrate with modern web development technologies and strategies to become a fundamental piece of that landscape, allowing it to focus on what it does best: managing content.

As a final note, I would like to highlight that the venue at UC Berkeley was great: an awesome lounge area with coffee to keep us energized all day, pinball machines as a great surprise, and a special mention for the waffles!

Thanks for everything! I hope to return next year, this time proposing a talk and sharing my own knowledge and experience!

Nov 05 2018
Nov 05

For the past two North American DrupalCons, my presentations have focused on introducing people to the Webform module for Drupal 8. First and foremost, it’s important that people understand the primary use case behind the Webform module, within Drupal's ecosystem of contributed modules, which is to…

Build a form which collects submission data

The other important message I include in all my presentations is…

The Webform module provides all the features expected from an enterprise proprietary form builder combined with the flexibility and openness of Drupal.

Over the past two years, between presentations, screencasts, blog posts, and providing support, the Webform module has become very robust and feature complete. So far, though, only experienced and advanced Drupal developers have been able to fully tap into the flexibility and openness of the Webform module.

The flexibility and openness of Drupal

Drupal's 'openness' stems from the fact that the software is Open Source; every line of code is freely shared. The Drupal community's collaborative nature does more than just 'share code'. We share our ideas, failures, successes, and more. This collaboration leads to an incredible amount of flexibility. In the massive world of Content Management Systems, 'flexibility' is what makes Drupal stand apart from its competitors.

Most blog posts and promotional material about Drupal's flexibility reasonably omit the fact that Drupal has a steep learning curve. Developers new to Drupal struggle to understand entities, plugins, hooks, event subscribers, derivatives, and more until they have an ‘Aha’ moment where they realize how ridiculously flexible Drupal is.

The Webform module also has a steep learning curve

The Webform module's user experience focuses on making it easy for people to start building fairly robust forms quickly, including the ability to edit the YAML source behind a form. This gives users a starting point for understanding Drupal's render and form APIs. As soon as someone decides to peek at the Webform module's entities and plugins, they begin to see the steep learning curve that is Drupal.
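As a rough illustration, the YAML source behind a simple form maps element names to '#'-prefixed properties (a minimal sketch; the element names here are invented, but the properties follow the Webform module's YAML source format):

```yaml
first_name:
  '#type': textfield
  '#title': 'First name'
  '#required': true
email:
  '#type': email
  '#title': 'Email address'
feedback:
  '#type': textarea
  '#title': 'Your feedback'
```

Each '#'-prefixed key corresponds to a Form API render array property, which is why editing a webform's YAML source doubles as a gentle introduction to Drupal's render and form APIs.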

Fortunately, most Webform-related APIs, including entities, plugins, and theming, follow existing patterns and best practices provided by Drupal core. There are some Webform-specific use cases around access controls, external libraries, and advanced form elements and composites, which require some unique solutions. These APIs and design patterns can become overwhelming, making it difficult to know where to start when it comes to customizing and extending the Webform module.

It is time to start talking about advanced webforms.

Advanced Webforms @ DrupalCon Seattle 2019

DrupalCon Seattle 2019 - April 8-12


DrupalCon Seattle is scheduled to take place in April 2019, but session proposals are closed. There are a lot of changes coming for DrupalCon Seattle 2019. The most immediate one affecting presenters is that proposals are limited to either a 30-minute session or a 90-minute training. I understand this change because a well-planned and focused topic can be addressed in 30 minutes, while training someone requires more time. I’m happy to say that my proposal for a 90-minute Advanced Webform presentation was accepted. Now I need to start putting together the session outline and materials.

Talking about advanced webforms for 90 minutes is going to be exhausting, so I promise everyone attending that there will be a break.

I’ve decided to break up this presentation into two parts. The first part is going to be an advanced demo. I’ll be walking through how to build an event registration system and maybe an application evaluation system. During these demos, we’re going to explore some of the Webform module's more advanced features and use cases.

The second part of the presentation is going to be a walkthrough of the APIs and concepts behind the Webform module.

Topics will include…

  • Creating custom form elements

  • Posting submissions using handlers

  • Altering forms and elements

  • Leveraging APIs

  • Writing tests

  • Development tips & tricks
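To give a flavor of what "posting submissions using handlers" involves, here is a minimal, hypothetical sketch of a webform handler plugin. The module, class, and endpoint names are invented for illustration; the sketch assumes the Webform module's `WebformHandlerBase` base class and annotation-based plugin discovery:

```php
<?php

namespace Drupal\my_module\Plugin\WebformHandler;

use Drupal\webform\Plugin\WebformHandlerBase;
use Drupal\webform\WebformSubmissionInterface;

/**
 * Posts webform submission data to a remote endpoint.
 *
 * @WebformHandler(
 *   id = "my_remote_post",
 *   label = @Translation("My remote post"),
 *   category = @Translation("External"),
 *   description = @Translation("Posts submissions to a remote URL."),
 * )
 */
class MyRemotePostWebformHandler extends WebformHandlerBase {

  /**
   * {@inheritdoc}
   */
  public function postSave(WebformSubmissionInterface $webform_submission, $update = TRUE) {
    // Collect the submitted values and forward them as JSON.
    // The endpoint URL is a placeholder.
    $data = $webform_submission->getData();
    \Drupal::httpClient()->post('https://example.com/endpoint', ['json' => $data]);
  }

}
```

Because handlers are ordinary annotated plugins, the same pattern (extend a base class, override a lifecycle method) carries over to custom elements and exporters as well.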

This presentation is going to require a lot of work. Luckily, I have six months to practice my presentation and work out some of the kinks.

Talking about code is hard

Fortunately, there are plenty of DrupalCamps before DrupalCon Seattle where I can rehearse my presentation. Two weeks ago, I presented the second part of my training at BadCamp and learned that scrolling through code while talking is challenging. I struggled with clicking through PhpStorm's directory trees and showing different interfaces for entities and plugins.

Being able to navigate and understand code is key to climbing Drupal's steep learning curve.

I enjoy giving live Webform demos because I really believe that physically showing people how easy it is to build a form makes them feel more comfortable getting their hands dirty. Even making mistakes during a live demo helps people see that clicking a wrong button is just part of the process. For me, the most successful and inspiring presentations are the ones where I walk out of the room with a new appreciation of the topic and a new perspective on the best way to learn more about the subject matter. My live demo of code at BadCamp did not go great, but now I have the opportunity to improve it, and I need to…

Figure out how to show the code.

Showing code during a presentation

A few years ago, my friend and coworker, Eric Sod (esod), gave an excellent technical presentation about Drupal Console, a command-line tool used to generate boilerplate code and to interact with and debug Drupal. He figured out how to get past the challenge of talking while showing people how to use a command-line tool: he recorded each console command. It was an inspiring presentation because he calmly stood in front of the room and explained everything that was happening in his recorded demo. He didn’t have to worry about misspelling a command.

Eric's approach to pre-recording challenging demos may be the solution that I am looking for. I am going to test it out at DrupalCamp Atlanta.

Going on the road and practicing this presentation

The biggest takeaway from this blog post should be:

The road to presenting at DrupalCon requires a lot of practice.

If your session proposal did not get accepted for this year's DrupalCon, keep on practicing. Make sure to record your presentations, even if it is a private screencast so that you can share this recording with next year's DrupalCon Minneapolis session selection committee. They want to know that you are comfortable getting up in front of a sizeable and curious audience.

There are plenty of upcoming local meetups and DrupalCamps where you can present.

Below are the DrupalCamps I am hoping to speak at before DrupalCon Seattle.

I hope to see you at a local DrupalCamp, or we can catch up at DrupalCon Seattle. And if you have any suggestions on how to improve my presentation, I’ll be working on it for the next six months and I’m all ears…

If you want to know where I am speaking next, please subscribe to my blog below or follow me on Twitter.


Nov 05 2018
Nov 05

By Harold Juárez, Full Stack Developer | November 05, 2018


BADCamp 2018 was the first really big event I attended, aside from actively participating in Drupal Camp Costa Rica for three years. Kindly enough, some co-workers who had already attended shared their experiences with me, which gave me great expectations. In addition, I was excited to sightsee in San Francisco and Berkeley.

After dedicating this year to front-end work, the BADCamp sessions left me more than satisfied, with refreshed knowledge and practices. I would like to share my experience and the content of the sessions I participated in:

The second day was a highlight: attendees were given challenges and tools, and the dialogue tables enriched my personal experience as I listened to others talk about ways to improve the applications we develop.


On Friday, the Pattern Lab sessions were quite interesting, practising the creation of themes without relying on a backend. Although I had already used this tool before, the sessions gave me new knowledge to improve its implementation at work.

The sessions also explored React + Gatsby’s potential to create static sites, and I learned compelling ways to take advantage of these new tools to improve an application's performance, using React to render the page and Drupal as an API to provide the data. This was presented by my co-worker Jesus in his session "How to keep Drupal relevant in the Git-based and API-driven CMS era".


On Saturday I attended an accessibility session that showed tools for people with different types of disabilities; some are paid and some free to implement on a site, depending on the needs of the specific project.

Another talk that caught my attention was about artificial intelligence in Drupal: using the Google Cloud Vision API on sites to provide image tagging and face, logo, and explicit content detection through machine learning.

It was a fantastic experience, and I am very grateful to weKnow for helping me attend. It was a great success that I hope to repeat in the near future!


Nov 05 2018
Nov 05

Last year, the Drupal Association decided to take a break to consolidate and not to organize an official DrupalCon Europe in 2018. Twelve community members stepped in and developed a plan to organize a DrupalCon replacement, named Drupal Europe. The final result was outstanding.

More than 1000 Drupalers from all over the world gathered in Darmstadt, Germany, from September 10th to 14th, 2018, to attend the biggest yearly European Drupal event. The new event brought a new concept: it featured 10 amazing tracks that guaranteed high-quality content for all of Drupal's target groups − developers, marketers, and agencies. Additionally, it created more room for contribution and collaboration outside the session slots.

Official Group Photo Drupal Europe Darmstadt 2018

We supported the event by giving three talks in two different tracks.

Miro Dietiker - Connecting media solutions beyond Drupal

On Tuesday, Miro Dietiker, founder of MD Systems, gave an interesting talk about integrating Digital Asset Management systems with Drupal. He compared the benefits of existing solutions and provided a practical guide to the kinds of solutions you could use to fulfill your media management needs.

More information about the session and the slides can be found at https://www.drupaleurope.org/session/connecting-media-solutions-beyond-drupal, while the session recording is available below.

[embedded content]

On Wednesday, Miloš Bovan held a session about the Paragraphs module and its best practices. The full room proved that people love and use Paragraphs a lot. The talk focused on answering frequently asked questions about working with Paragraphs, covering some of the lesser-known new features as well as ideas on how to easily improve the editorial experience.

Miloš Bovan - Enrich your Paragraphs workflow with features you didn’t know about

The session summary is available at https://www.drupaleurope.org/session/enrich-your-paragraphs-workflow-features-you-didnt-know-about.

The conference featured many interesting sessions that provided ideas and actual implementations for making the Paragraphs user interface better (Creating an enterprise level editorial experience for Drupal 8 using React, Front-end page composition with Geysir, Improving the Editor Experience: Paragraphs FTW). The discussions about Paragraphs continued throughout the conference and resulted in a BoF on Thursday where Drupalers discussed the future of the Paragraphs UI. We look forward to fruitful collaboration.

John Gustavo Choque Condori - Drupal PKM: A personal knowledge management Drupal distro

John Choque, our colleague, gave a talk about the personal knowledge management distribution he created as part of his bachelor's thesis. The talk gathered people interested in education looking for ideas on how to improve knowledge management in their organizations. The session summary as well as the slides are available at https://www.drupaleurope.org/session/drupal-pkm-personal-knowledge-management-drupal-distro.

[embedded content]

Social activities

Besides sessions, contributions, and coding, we enjoyed attending the social events as well. On Tuesday, the organizers brought participants to a Bavarian beer garden to taste traditional German bratwurst and beer. The place was located in the park, and there was no way to miss this great event.


On Wednesday, we joined our fellow Drupalers from the Swiss community for a dinner. It is always nice to catch up with local people in an international event such as Drupal Europe.

As usual, the event came to an end with the traditional and yet always exciting Trivia Night.

What’s next?

During Drupal Europe, the project founder, Dries Buytaert, announced that the Drupal Association had signed an agreement with Kuoni, an international company specialized in event organization. As a result, the official DrupalCon Europe event returns in 2019, and it is going to take place in the capital of the Netherlands − Amsterdam.

We hope to see you all there!
