Jun 02 2020

End of life (EOL) software is a very real problem. Whether your business is using ecommerce and customer relationship management systems across multiple platforms or relying on basic scheduling and accounting software, you will at some point reach a technological expiry date.

When a system reaches end of life, the creator/owner of the software/technology no longer delivers support services for the product. That can include technical support, hardware or software upgrades, bug fixes, security patches, or feature development. In short, the system gets abandoned by its owner. 

Software becoming obsolete can cause all sorts of problems. Here are five risks to your business in running EOL software:

1. Security

End of life technology receives no security updates. No bug fixes. No patches. No monitoring. Your technology is dead in the eyes of the creator. That means your security is completely compromised, not only for the platform that is EOL, but also potentially for any others that connect to it.

At minimum, your system can be accessed and your content or records edited, stolen, or deleted. If you have any user data, financial data, or sensitive information, you could have a major problem. The monetary and reputational cost could kill your business.

A survey of 2,600 CIOs across the U.S. found that the number one concern was keeping systems and information secure. If you take no action on EOL systems, you are essentially condemning yourself to failure in that regard. 

2. Maintenance

Just because the software maker is no longer supporting the software doesn't mean you have no options for supporting it going forward. There are agencies that specialize in supporting older technology. But that support doesn’t come cheap, and integrations into other systems require even more time-consuming and expensive workarounds. 

As a general rule, maintaining EOL software is complex and expensive. Is it really worth not moving due to fear of change?

3. Liability

If you hold people's information and data, you are responsible and liable for it. Using systems that are not properly supported to keep that data secure means you can be prosecuted for not complying with government or industry regulations. Fines, shutdowns, and even jail time are potential outcomes from not acting responsibly with the information you have been entrusted with. 

4. Reliability

If you were a taxi driver, would you willingly drive an old car that is no longer maintained and has sporadic issues? Of course not. Your livelihood depends on your vehicle being reliable and economical.

But that is what you are doing if you continue with EOL software. Old software is less reliable and more prone to failure. Even if you are able to find people who will work on it for you, it's going to cost a lot more money, because it takes much more time and expertise. 

And if it connects to other systems, be prepared for much more testing time to ensure that all the variables across all systems are working properly. Because there's no guarantee that they will.

5. Cost

EOL software costs more, whether it’s through lost/stolen data, updating and maintaining with third parties, legal liabilities, or lost revenue from downtime or issues. 

The sticker price on a new system can sometimes seem large. But the security gained from having well maintained and supported systems is critical. 

One side benefit of moving off EOL software is the opportunity to review your company's entire technology stack and architecture. We often see that when companies go through the cost and process of moving systems, they make other changes to improve workflows and processes. These changes often result in net savings for the business: it spends less, eliminates its security risks, and improves its workflows.

If you have software moving towards EOL, it's essential to look at not only replacing the single system, but also assessing your whole technology landscape for opportunities to make larger improvements.

Conclusion

Ultimately, EOL technology is costly to your business in multiple ways. Most technology providers give lots of notice when one of their products is going to be unsupported. That gives you time to assess your options and determine the path you should take. 

In some cases, it's migrating to a new version or equivalent product. In others, it’s reorganizing your company's whole technology structure and moving away from a system that was holding you back. 

To determine if your system has reached EOL and plan for your next move, check out our End of Life Playbook.

Download the End of Life Playbook (PDF)

Jun 01 2020

Many organizations face the challenge of managing content across multiple websites while keeping centralized control and ensuring that content flows securely between sites.

Taking a piece-by-piece approach and allocating teams to work on each site separately drives up maintenance and development costs, results in complex infrastructure, and introduces inefficiencies into the process.

While content, unlike configuration, cannot be shared and shipped using CMI tools, Drupal modules can be used to share content among different sites or different instances of the same site.

This blog sheds light on the features that enterprises should not overlook when leveraging Drupal modules, examines the benefits and limitations of the Entity Share module, and looks at the cases in which it makes the biggest difference.

A Cost-effective Solution to Manage Content Across Sites 

The Entity Share module helps enterprises achieve a workflow in which the subsites of a multisite architecture can share pieces of content without disrupting the editorial workflow at their respective ends, while keeping the UI experience and cost-effectiveness in check.

The module works for setups where each site has its own database. It provides an easy means of sharing entities such as nodes, taxonomy terms, and media on the basis of endpoints exposed by the JSON:API module, accessed via basic authentication.

 

Note: the websites sharing content with each other are designated by the terms server and client. The server site is the one from which content is shared, and the client site is the one that pulls in the shared content.
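
For illustration, here is a minimal sketch of the kind of authenticated JSON:API request that happens under the hood when a client pulls from a server channel. The host, bundle, and credentials are placeholder assumptions; once configured, Entity Share issues these requests for you.

$client = new \GuzzleHttp\Client();
$response = $client->get('https://server.example.com/jsonapi/node/article', [
  'auth' => ['channel_user', 'channel_password'],          // basic authentication
  'headers' => ['Accept' => 'application/vnd.api+json'],   // JSON:API media type
]);

$document = json_decode((string) $response->getBody(), TRUE);
// $document['data'] now holds the entities the server exposes.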

Installation and Configuration Process of Entity Share Module

Follow these steps to install and configure the module:

  1. Install the desired version of the entity_share module via Composer using “composer require drupal/entity_share”, or download it manually from Drupal.org.
  2. Thereafter, the module needs to be enabled using the drush command “drush en entity_share”.

  3. The next step is to create channels on the server site that define exactly which data is exposed from its end. Channels can be configured after enabling the entity_share_server submodule (shipped within entity_share) via the command “drush en entity_share_server”. Additional filtering and sorting rules can be set on these channels as required by navigating to

    Configuration -> Web Services -> Entity Share -> Channels.

    An authorized user must be specified in order to access the channel.
  4. The client site, on the other hand, stores the remote configuration: the remote/server URL it needs data from and the authorization details (username and password) used to connect to the server. This configuration becomes available after enabling the entity_share_client submodule using the command “drush en entity_share_client”. Note that this submodule needs to be enabled on the site that will pull shared content.

     

    Navigate to Configuration -> Web Services -> Entity Share -> Remote Websites and configure the remote settings. Ensure that the username and password in the Basic Auth section match the credentials of the user that has access to the entity share channels configured earlier on the server end.

  5. After successful authentication, all the content shared from the server end will be available to the client site at [client_base_url]/admin/content/entity_share/pull, ready to be pulled and displayed at its end.


    This gives the client side an added advantage: it can accept shared content only after fully verifying it. Shared content does not simply get created the moment the server shares it.

    Moreover, the interface the module provides for pulling entities is user-friendly and easy to understand. It clearly distinguishes newly created content from already pulled content, along with its synchronization status. If content is edited on either the server or the client side after being shared and pulled, the status is updated immediately.

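To recap, the commands used in the steps above:

composer require drupal/entity_share
drush en entity_share            # base module
drush en entity_share_server     # on the server site
drush en entity_share_client     # on the client site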

Use Case of Entity Share

We implemented the Entity Share module for a client project.

Recently, Srijan came across a requirement where one of its established clients had local websites in regional languages, distributed across many countries.

Our main objective was to provide them with a solution where the administrator or central authority could share content such as news updates and press releases from the main/corporate website without affecting the rest of the content at each end.

Additionally, central control over each piece of shared content was required: any change on the main site would be made available on the client end to be pulled again, i.e., re-synchronized.

Similarly, if a client site changed the content at its end, the changes would appear on the corporate/main site to be synced. The Entity Share module was best suited for such a scenario.

We configured channels and remote sites as described above, and the functionality was achieved. One of the custom additions was to set the default status of pulled nodes to the Draft state so that a content editor could review them before publishing.

Although the Entity Share module is not yet considered fully secure, since otherwise inaccessible data can be exposed through JSON:API endpoints, we implemented it for the client project. An extra layer of security can be added at the web server configuration level by blocking requests from unwanted sources and allowing only trusted sources to fetch data. No expensive third-party integrations were required. The module matched the client's requirements and also simplified our process of adding custom functionality.

Benefits of Entity Share Module

It offers the following benefits:

  1. Authorized access: the module exposes content only to authenticated sites. Without proper authentication, no site can access the channel data exposed from the server website.
  2. Enhanced security through content verification: the client site chooses what to pull from the list of content shared with it. This adds a layer of control, allowing the administrator/editor of the client site to verify data before synchronizing it. A link to the shared content/entity is available beside each item in the respective channel's entity list.
  3. Different versions to detect changes: the module lets you view the difference between the already pulled entity and the entity on the server end, in case either of them changes.

    To use this, you have to install the Diff module, which lets you view revisions of an entity. The module currently has issues depicting differences in reference fields; developers have an opportunity here to contribute to the community by finding an appropriate solution.

  4. Multilingual support: translated entities may be shared among sites provided the language is configured on both ends. The module is appropriate to use even when the server and client sites have different default languages.
    The client site may add appropriate translations based on the pulled content at its respective end.
  5. Auto-creation of referenced entities: when content is pulled, any referenced entities not already present on the client end are auto-created based on their UUIDs. Referenced paragraphs, images, and media therefore need not exist on the client end before pulling content; they are created and linked automatically.
  6. Clean and simple user interface: lastly, the interface that entity_share provides for pulling/synchronizing content is easy to use. Once configured properly, pull access can be given to a specific user/editor of the website without developer intervention.

Limitations of Entity Share Module

Like any module, entity_share has limitations too:

  1. An entity pulled to the client site is displayed in the same state, i.e., published or unpublished, as on the main/server website. This implies that the module doesn't respect a customized editorial workflow and moderation process. Editors can't pass content through workflow states such as draft, ready for review, approved, and published.

    For example, a published piece of content, when pulled, is directly assigned the published state from the pulled reference rather than starting in draft mode.

    However, it is possible to change this behavior by subscribing to the event

    \Drupal\entity_share_client\Event\EntityListDataAlterEvent

    provided by the entity_share_client module to alter the status of the content being pulled (see the sketch after this list).

    Likewise, other events are available in the module that can be used to tweak functionality as and when required.

  2. The revision history of a node is affected when you re-pull an entity that has also been edited on the client end. This is because the changed timestamp provided by the JSON:API endpoint is written to the client side as-is after synchronization.

    This also needs to be fixed in the module to allow pull operations without affecting revisions on either end. You can find other related issues in the module's issue queue too.
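
As referenced in the first limitation above, here is a minimal event subscriber sketch that forces pulled content to arrive unpublished. The event class name comes from the module, but the event-name constant and the accessor methods used below are assumptions; check the entity_share_client module's Event classes for the exact API. The subscriber must also be registered in your module's services.yml file with the event_subscriber tag.

<?php

namespace Drupal\my_module\EventSubscriber;

use Drupal\entity_share_client\Event\EntityListDataAlterEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

// Forces content pulled via Entity Share to arrive unpublished.
class PullStatusSubscriber implements EventSubscriberInterface {

  public static function getSubscribedEvents() {
    // Assumption: the event class exposes an EVENT_NAME constant.
    return [EntityListDataAlterEvent::EVENT_NAME => 'unpublishPulledContent'];
  }

  public function unpublishPulledContent(EntityListDataAlterEvent $event) {
    // Assumption: getEntityListData()/setEntityListData() wrap the raw
    // JSON:API document describing the entities about to be pulled.
    $data = $event->getEntityListData();
    foreach ($data['data'] as &$item) {
      // JSON:API exposes the published flag as the "status" attribute.
      if (isset($item['attributes']['status'])) {
        $item['attributes']['status'] = FALSE;
      }
    }
    $event->setEntityListData($data);
  }

}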

Instead of using expensive and ineffective solutions for managing content across your various sites, give the Entity Share module a try. It is a cost-effective solution that can be tailored to an enterprise's requirements.

Looking for a similar solution? Drop us a line and our team will get back to you.

May 28 2020

This month’s SC DUG meeting featured Will Jackson from Kanopi Studios talking about his virtual background and office.

Before everyone was learning to use Zoom virtual backgrounds, Will had built out a full 3D room for his background, including family pictures and other fun details. He talked about what he built, and it may inspire you to try something more personalized than swaying palm trees and night skies.


If you would like to join us, please check out our upcoming events on MeetUp for meeting times, locations, and remote connection information.

We frequently use these meetings to practice new presentations, try out heavily revised versions, and test new ideas with a friendly audience. So if some of the content of these videos seems a bit rough, please understand that we are all learning all the time, and we are open to constructive feedback. If you want to see a polished version, check out our group members' talks at camps and cons.

If you are interested in giving a practice talk, leave me a comment here, contact me through Drupal.org, or find me on Drupal Slack. We’re excited to hear new voices and ideas. We want to support the community, and that means you.

May 26 2020

Many costs are associated with developing a new ecommerce site or migrating from an antiquated setup to an upgraded version. And unless you work in the thick of ecommerce development every day, you likely don't know what questions to ask to ensure you’re getting the full picture.

This article explains what your typical expenses will look like and makes a few suggestions about how to approach budgeting for this undertaking.

Open Source vs. SaaS: A Comparison of Costs

You need to decide whether you will go with open source or a Software-as-a-Service (SaaS) platform to power your site. The cost of doing business is very different with each model.

An open source ecommerce framework has the expenses front-loaded. You pay for development time and configuration costs, and then the final product is yours to own and manage—license-free. 

A SaaS approach is quicker to get live and has lower costs up front. But then you pay an annual license fee and give a percentage of your revenue to the platform with each transaction made. 

Start by doing some easy math: calculate three percent of your average annual sales. With a SaaS approach, if you sell $50 million online each year, you'll pay $1.5 million in revenue share (on top of licensing fees). If that is an acceptable cost of doing business and allows you to “set it and forget it," then SaaS is likely the right way to go for you.

But if you're a business that needs or wants more control of the front- and back-end experiences, you can use that three percent as a starting point to decide how to shape and invest in your online architecture. With open source software, you’d invest this money up front in year one. In years two and beyond, expenses taper down to about 15 percent of this initial investment annually to keep operational. 

Complete this exercise in relation to your own revenue and figure out what your working budget would be to get started. If three percent leaves you with peanuts, I’d suggest searching out a DIY platform-first ecommerce tool and seeking the help of an independent contractor to start generating revenue online. Your year-one investment may look closer to 50 percent of your annual online revenue to get where you need to be. 
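
As a back-of-the-envelope sketch of the math above (the percentages are this article's rules of thumb, not quotes from any vendor):

$annualOnlineRevenue = 50000000;                       // $50M in online sales
$saasRevenueShare    = $annualOnlineRevenue * 0.03;    // $1.5M/year, plus license fees
$openSourceYearOne   = $annualOnlineRevenue * 0.03;    // the same 3%, invested up front
$openSourceOngoing   = $openSourceYearOne * 0.15;      // ~$225K/year in years two and beyond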

Try to avoid thinking of this as an expense. Instead, think of how much money you’re going to spend to get a return on investment. How long will it take you to earn that ROI? Are these expectations realistic?

How to Budget for an Open Source eCommerce Architecture

Moving from an existing platform (typically SaaS or home-brew) over to a fully open source, headless ecommerce architecture setup incurs costs like:

Planning

Planning is the backbone of a successful ecommerce development project. If you don’t spend the time and money to work out that foundational blueprint, you will get a half-assed outcome that will likely cost more than you were initially promised.

On average, the planning processes for building a substantial ecommerce site for businesses that generate $50 million or more in revenue take 10 weeks of work and cost about $50,000. 

Planning is the absolute MUST-DO on your list. If you skip it, you may save $50,000, but your project will spend it on the other end trying to figure out who meant what because you flew cheap and blind. 

Ask if your proposed agency completes the following activities in their planning phase: 

  • Visualization / live prototyping 
  • Conversion planning, persona development, user journeys 
  • API integration planning, platform and integration reviews and selections 
  • ERP / product mapping 
  • Server and dev ops planning, security, performance and scalability planning

If you’re being pitched the above list, and you can see working past examples of blueprints such as these, then you’re spending your money wisely and you have a shot at getting this project right the first time. 

TIP: This plan should be detailed enough that you can either take it and build out your new site in its entirety with your on-staff tech team, or take it to ANY agency and have a crystal-clear spec for execution. 

Planning is not conceptual. It is a fully operational blueprint that the engineers have stamped and approved. This is a one time cost and the most essential ingredient in your budget. 

If you can only afford to get through planning in year one, make it a priority and wait for the next round of capital expenditure funding to implement it.  

Creative Design

Designing a new eComm site is the fun part. This phase of the project should be done after planning is fully signed off on. That’s because planning allows ideas to flow and evolve. And changes in functionality dictate front-end experiences. 

Your design phase will vary in price depending on what you want to see mocked up versus just built by the team without your input. Set aside $25,000 to $45,000 to make sure your creative phase reflects the quality of your business accurately. This is a one-time cost.

Here are a few tips to ensure that you’re spending your money wisely:

  • Beware of agencies that propose mockups for 30 pages within your new ecommerce site. This is a waste, a cash grab, and a sign of an inexperienced development team.  
  • Limit mockups to the home page, catalog landing page, product details page, and a general content page. However, if you have some funky options in your cart and/or checkout process, design them, too. 
  • Don’t bother fully mocking up creative designs for responsive options. If you’re dead set on seeing the mobile experience, start with the homepage on phone only and evaluate from there. 
  • Don’t waste time or money creating full mockups for each page. You can always add more designs as you go, if needed, or designers can provide elements to improve designs on single pages.
  • Complete and approve the home page design fully first before moving onto any “internal” templates. You don’t want rework across multiple designs. 
  • Use a web design agency, not a design agency. There are specifics for designing to web standards that don’t apply to companies that deal in logos, brands, and print work.

Sprinting / Development

Your project team should work with you to break your planning into stories, block these stories into epics, and group these epics into sprints. You’ll then have an idea of how many sprints you’ll need to get live.

Typical costs for sprinting range from $20,000 to $60,000 a month for the lifetime of the build cycle, which is usually six to 12 months. After this investment, you have a feature-rich ecommerce setup to push live. (Remember: These expenses are front-loaded. After this one-time cost, you own the site and don’t have to pay licensing fees or share your revenue).

Sprinting costs depend on velocity. That is, how many bodies can we afford to put on this development until the sprints are done? If you have $20,000 a month to spend for six months, you’ll get through $120,000 worth of programming or about 600 hours (give or take per agency).

That’s a decent amount of programming time for a version one go-live. You can alter the velocity, or speed with which you move, by altering your spend. After you get to that first launch, you may have the option to taper down resourcing (i.e., output) and slow spending over the following months.

Additional Features or Ongoing Support

Your site is not a static business channel. You’ll need to budget for continued rollout of new ideas, features, integrations, and changes. We often work with companies to train an in-house developer and take the agency expense out. With an open architecture and open source ecommerce setup, the ongoing costs are fully in your control.

Plan out your monthly spend over 12 months to figure out what’s realistic to your ROI, and if you should start right away or take a break.

TIP: Budget for at least a year of ongoing expenses at whatever rate you deem suitable if you want to get a little consulting, training, advice, or coding from some experts. Just be sure to align your expectations of output with your willingness to spend.

Third-Party Expenses

Look past your site to see the full picture. What else does it need or plug into that has an annual contract? Account for these costs, too. A few typical additional expenses include:

  • Hosting
  • Server maintenance, security, updates and monitoring
  • Accounting software
  • ERP software / PIM 
  • CRM software
  • 3PL software (shipping, warehousing, labeling)
  • Programmers on staff
  • CSRs on staff 
  • Training and documentation

Conclusion

Your website is not an expense; it's a revenue channel that needs to be flexible and well architected. A substantial investment will be needed to compete online, so make sure you understand the costs involved. 

If you don’t know where to start, chat with a consultant to see if your math lines up with your goals, and then take this information to your internal team. You have options, and they should be clearly laid out for you up front, not presented to you with an invoice when you’re well into development with an agency’s team. 

Inform yourself on the process, not on the programming, and you’ll be in a better position to evaluate the best path forward.  

Click to contact one of our ecommerce consultants

May 19 2020

How amazing does it feel when you walk into a coffee shop and the barista greets you by name and asks if you’d like the usual? Or when you meet someone you haven’t seen in a long time and they ask about some obscure and specific hobby you once mentioned you had?

These personalized experiences give you the warm and fuzzies. You typically come away from those interactions a fan of the place or person. Heck, if someone were to criticize them, you'd speak up and say that's not your experience. And you wouldn't hesitate to recommend that place or person to others.

At an event a few years ago, I noticed someone who seemed a little hesitant. I introduced myself and invited them to join me at my table, and we chatted a little. We never spoke much after that. But on multiple occasions over the past few years, that person has given me a glowing reference when I came up in conversation. 

Personalization makes us feel valued and understood. And that's how you want your customers to feel. Because if they do, they will buy more and advocate for your brand.

Personalized Marketing Options to Consider

Broadly speaking, there are two ways to do web personalization: with real-time data or historical data.

Real-time data involves using location data to serve up a specific site, content, or offer. Here are a few examples:

  • Using device type or operating system to either manage how content is displayed or make assumptions on product needs
  • Using traffic source to tailor content (i.e., looking at where and when the user came from)
  • Basing promotions on products or services that have proven popular with others

Historical data goes deeper. This involves presenting personalized content, products, or offers based on users' previous interactions. You could look at factors like:

  • The number of orders they made
  • Their average order size
  • The total amount they spent
  • The products they looked at
  • The carts they abandoned
  • The time that has elapsed since their last transaction and/or visit

The options are as vast as the data you have collected. But through segmentation and rules, you can greatly increase the user's odds of converting.
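
As a toy illustration of how such segmentation rules might look in code (the thresholds and offer names here are invented for the example):

// Pick which offer a returning visitor sees based on order history.
function pick_offer($order_count, $total_spent, $days_since_last_order) {
  if ($order_count == 0) {
    return 'first-purchase-discount';
  }
  if ($days_since_last_order > 180) {
    // A lapsed customer gets a win-back incentive.
    return 'win-back-offer';
  }
  return $total_spent > 1000 ? 'loyalty-reward' : 'related-products';
}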

Why You Need to Tread Carefully

Many consumers are becoming increasingly concerned about privacy and data management. You need to ensure that the personalization you supply helps them make a conversion decision and doesn't simply show them how much you know about them.

For instance, your barista asking if you fancy trying the new mocha latte (because they knew you had recently bought one from another brand) is much less creepy than being greeted with, "I heard you’re now into chocolate, so try this new mocha latte." The difference is small, but crucial.

Choose the Right Tools

With the overwhelming array of personalization options, it's important to work with an experienced team that can help guide you. At Acro, we love Drupal, and it can handle many entry-level personalization functions within its platform (much more than most content management systems).

However, if you need to get very sophisticated, then you need a third-party platform. We love Acquia Lift. For features, usability and support, it is unparalleled. If you would like a personalized introduction to Acquia, hit me up and I’ll set you up, personally. 

The Bottom Line

Global research and advisory firm Gartner stated that the three key takeaways on personalization are:

  1. Consumers want to receive personalized help as they navigate the buying journey.
  2. Focusing solely on personalized recognition is potentially detrimental to a company’s commercial objectives.
  3. When it comes to help, consumers prioritize information, a simpler purchase process and saving time.1

Personalization isn't the ultimate goal. It's another tool to achieve whatever your actual goal is, whether that be increased sales, increased order value, increased frequency, or brand loyalty. Once you define what your goals are, you can explore whether personalization will give the required ROI.

If you would like to have a conversation about your business goals and see if personalization is an appropriate tool for you, give me a call. And if not, if we ever meet out and about, you’re always welcome to sit at my table.

1 - Source: Gartner, "Maximize the Impact of Personalization,” April 2019

May 12 2020

Many people researching Drupal Commerce 2.x for Drupal 8 (or the upcoming Drupal 9) are likely wanting to either remove the extra ecommerce shopping carts or allow checkout for multiple carts. This blog post will explain why we have multiple carts—and why being able to checkout with multiple carts is challenging, but possible.

Why you can have more than one Drupal Commerce cart

First, let's demonstrate what Commerce 2.x does out of the box for a single user, behavior that is often mistaken for a bug.

  1. Go to Acro Media’s demo store.
  2. Start out as anonymous and register as a user.
    1. Register here.
    2. Check your email/spam and click a link.
    3. Set your password because you’ll need to log back in shortly.
      Note: Acro doesn't use the email address from your account sign-up on this site to contact you for marketing purposes. You can opt into marketing materials by clicking the large red help question mark on the right.
  3. Once registered, add something to your cart, and log out.
  4. Add something to your cart and log in.
  5. Go to /cart.


If you are seeing two carts, then you have discovered, like many others, that Drupal Commerce 2.x shows multiple carts by design. Drupal Commerce 1.x created multiple carts like this as well, but would only show one cart at a time. In 1.x, you could follow the five steps outlined above, then check out, and your original cart would display.

Why? Because the system will not delete carts. We’re using a simple anonymous session to create two carts in a potentially common edge case.
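
For developers curious about what's happening under the hood, the cart provider service is what tracks these carts. A minimal sketch, run from custom module code on a Commerce 2.x site:

// List every active cart belonging to the current user.
$cart_provider = \Drupal::service('commerce_cart.cart_provider');

foreach ($cart_provider->getCarts() as $cart) {
  // Each cart is a commerce_order entity in the "draft" state; a
  // separate cart exists per store and order type, and carts are
  // never deleted automatically.
  printf("Cart %s (store %s)\n", $cart->id(), $cart->getStoreId());
}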

The pros and cons of multiple carts

Pros:
  • Customers never lose a cart, even if their use of the site means they have more than one.
  • You could have multiple sellers, enabling a marketplace feature to be built on top of the existing functionality.
  • You could enable different checkout workflows (one for digital services, one for recurring services, and another for physical items that require shipping).

Cons:
  • You could end up with a confusing user experience by making your customers check out multiple times.
  • Payment and fulfillment must be handled separately for different items or different vendors.
  • More than one cart presents a significant visual challenge for designers. In the cart dropdown, for example, how do you show more than one cart? On the cart page, how do you handle more than one checkout button?

Turning off multiple carts in Drupal Commerce 2.x

There are two relatively simple Drupal modules you can use to show a single cart to a user:

Commerce Combine Carts—If this module is turned on, the multi-cart demo above would not produce two carts.

Commerce Cart Advanced—This module packs a lot of features into it for the crowd of users who want management tools around their multiple cart experience, but it also includes the feature to display only one cart at a time. It was created and is maintained by Acro Media’s senior developer known as krystalcode (Dimitris Bozelos).

Checking out multiple carts, Etsy/Amazon style

The holy grail of marketplace commerce is multi-store and single-checkout. The idea is that you could have a site that features multiple stores and customers could check out once from more than one store. 

According to bojanz, the original author and former maintainer of Drupal Commerce 2.x, you can do this by coding a form that acts like a checkout flow form but changes more than one order simultaneously.

However, you also have to consider a number of other issues: 

  • Fulfillment—If the stores are selling physical products, how will these orders appear to the customer and to customer service for each store? Likely, each store would want to only see the products for which they are responsible.
  • Order management—Even Amazon does some weird things with orders for its customers. Often, orders are split up for seemingly no reason, changing order totals and making order management challenging for customers and for customer service.
  • Payment—If you, as the site owner, plan to pay stores from your own bank account, you’ll want to set up a single, site-wide payment gateway and manage disbursement payments to your store owners. If not, then you’ll require each store to have its own payment gateway credentials or some other even more complex setup.
  • Taxes—Assuming you have good solutions for all of the above, taxes will still likely make it very hard to move forward. Tax law is hard in the best of times, and depending on how you take payment, tax rules would need to be created and maintained per store. Solutions like Avalara AvaTax only work per store and can be overly expensive for small retailers.

The bottom line

Basically, you have a few contrib options if you want to manage carts for your customers. But if you want that elusive multi-vendor, single checkout, you'll have to plan well according to your business needs. Regardless, Drupal's flexible ecommerce cart functionality is capable of creating the best ecommerce shopping carts out there; you just need to know how to do it.

May 05 2020

Hands-On Machine Learning With PHP, Part Two

By Liam Wiltshire

Last month, we looked at how we can take our data and, using machine learning, categorize the data by groupings we’d already decided upon—spam, ham, fraudulent, good, etc. This usage is a great way to start, but you might quickly find you want to take things further. Perhaps this means analyzing the data differently, to come up with new relationships you hadn’t previously considered, or maybe you want to improve the accuracy of your results by avoiding some common pitfalls in machine learning. These are the questions we are addressing today, starting with a different approach to machine learning—unsupervised learning.

Decoupling Drupal From Its Frontend System to Use in an Existing Website

By Jack D. Polifka

The ability to create and publish content in real-time without knowing HTML or the ability to program is a common feature of many websites. This capability allows individuals to produce web content regardless of their technical ability. Producing web content is a common feature of many content management systems (CMS). Some websites don’t allow direct creation and publication of content, so a possible solution to address this shortcoming is integrating a CMS into these websites. This article describes how Drupal, a CMS, was added to an existing Symfony website to allow users to publish content in real-time. Implementation details about integrating Drupal into the existing website using Headless Drupal are shared.

Passwordless Authentication

By Brian Reterrer

Passwords are part of our everyday life. You may not even think about them most of the time, that is until you forget one. What if you never had to use a password again? What if I told you it was more secure not to have a password? Would you believe me? Find out why companies are ditching passwords and moving towards multi-factor authentication.

PHP Puzzles: Factorials

By Sherri Wheeler

Each installment of PHP Puzzles presents a small coding exercise typical of those we might encounter in a job interview, or on a coding challenge website. In the following month, we'll look at a couple of possible solutions for today's puzzle. Perhaps one of the most common coding puzzles, I recall this one from high school computer class—calculating a factorial.

Education Station: Anatomy of a Web Response

By Chris Tankersley

Last month, we looked at HTTP requests and how a user agent asks for a specific resource. How do we provide an answer? Web servers send it back in an HTTP response. Let’s look at the parts of a response, how to indicate success or failure, and how to build the response body.

The Workshop: Specification BDD with Phpspec

By Joe Ferguson

phpspec is a package which allows us to use Behavior-Driven Development (BDD), which comes from Test-Driven Development (TDD). When applying BDD, we write our tests first, then only enough code to pass the tests, and then refactor and verify the tests still pass, exactly as we would with TDD. This cycle continues and is often referred to as red-green development: write tests that fail, write enough code to make them pass, then restart the process.

History and Computing: Transcontinental Railroad

By Edward Barnard

We’re looking at the background behind the U.S. Department of Justice plan to consider antitrust action against the giants of high tech. We’ll see how ocean transportation gave way to transcontinental transportation. That’s the background we’ll need for seeing how transcontinental transportation became the antitrust action that’s setting a precedent for big tech.

Security Corner: Request Replay Protection

By Eric Mann

One of the most overused terms of security is “token.” It’s used in many different, often unrelated contexts to mean very different things. This month we’re going to discuss one form of tokens—replay prevention nonces—and how to use them.

By Eric Van Johnson

This month, we revisit our Canadian friends. This time we travel north of Toronto to York Region and the York Region PHP User Group.

finally{}: What’s in PHP Eight?

By Eli White

While much of the world shuts down, the PHP core developers have been hard at work preparing for the release of PHP 8.0 at the end of this year! The feature freeze is in just a few months (July 28th), so this is the exciting time when there is a push to get various features into this momentous release! Let’s take a look at a few of the bigger things currently planned for PHP 8.0.

May 05 2020

The ability to create and publish content in real-time without knowing HTML or the ability to program is a common feature of many websites. This capability allows individuals to produce web content regardless of their technical ability. Producing web content is a common feature of many content management systems (CMS). Some websites don’t allow direct creation and publication of content, so a possible solution to address this shortcoming is integrating a CMS into these websites. This article describes how Drupal, a CMS, was added to an existing Symfony website to allow users to publish content in real-time. Implementation details about integrating Drupal into the existing website using Headless Drupal are shared.

This article was originally published in the May 2020 issue of php[architect] magazine. To read the complete article please subscribe or purchase the complete issue.

Apr 23 2020

We’ve been making big websites for 14 years, and almost all of them have been built on Drupal. It’s no exaggeration to say that Four Kitchens owes its success to the incredible opportunities Drupal has provided us. There has never been anything like Drupal and the community it has fostered—and there may never be anything like it ever again.

That’s why it’s crucial we do everything we can to support the Drupal Association. Especially now.

The impacts of COVID-19 have been felt everywhere, especially at the Association. With the cancellation of DrupalCon Minneapolis, the Drupal Association lost a major source of annual fundraising. Without the revenue from DrupalCon, the Association would not be able to continue its mission to support the Drupal project, the community, and its growth.

The Drupal community’s response to this crisis was tremendous. For our part, we proudly joined 27 other organizations in pledging our sponsorship fees to the Association regardless of whether, or how, DrupalCon happened. I ensured my Individual Membership was still active, and I made a personal contribution.

But we need to do more.

You can help by joining us in the #DrupalCares campaign.

The #DrupalCares campaign is a fundraiser to protect the Drupal Association from the financial impact of COVID-19. Your support will help keep the Drupal Association strong and able to continue accelerating the Drupal project.

The Drupal Association

The outpouring of support has been… Inspiring. First, project founder Dries Buytaert and his partner Vanessa Buytaert pledged their generous support of $100,000. Then, a coalition of Drupal businesses pledged even more matching contributions. We are proud to count ourselves among the dozens of participating Drupal businesses.

Any individual donations, increased memberships, or new memberships through the end of April will be tripled by these matching pledges, up to $100,000, for a total of $300,000.

Please join us in supporting the Drupal Association. Your contribution will help ensure the continued success of the Association and the Drupal community for years to come.

Give to #DrupalCares through April to help the Association receive a 3:1 matching contribution. 

Apr 01 2020

Platform.sh, like any good PaaS, exposes a lot of useful information to applications via environment variables. The obvious parts, of course, are database credentials, but there's far more that we make available to allow an application to introspect its environment.

Sometimes those environment variables aren't as obvious to use as we'd like. Environment variables have their limits, such as only being able to store strings. For that reason, many of the most important environment variables are offered as JSON values, which are then base64-encoded so they fit nicely into environment variables. Those are not always the easiest to read.

That's why we're happy to announce all new, completely revamped client libraries for PHP, Python, and Node.js to make inspecting the environment as dead-simple as possible.

Installation

All of the libraries are available through their respective language package managers:

PHP:

composer require platformsh/config-reader

Python:

pip install platformshconfig

Node.js:

npm install platformsh-config --save

That's it, you're done.

Usage

All three libraries work the same way, but are flavored for their own language. All of them start by instantiating a "config" object. That object then offers methods to introspect the environment in intelligent ways.

For instance, it's easy to tell if a project is running on Platform.sh, in the build hook or not, or if it's in a Platform.sh Enterprise environment. In PHP:

$config = new \Platformsh\ConfigReader\Config();

$config->inValidPlatform(); // True if env vars are available at all.
$config->inBuild();
$config->inRuntime();
$config->onEnterprise();
$config->onProduction();

// Individual Platform.sh environment variables are available as their own properties, too.
$config->applicationName;
$config->port;
$config->project;
// ...

The onProduction() method already takes care of the differences between Platform.sh Professional and Platform.sh Enterprise and will return true in either case.

What about the common case of accessing relationships to get credentials for connecting to a database? Currently, that requires deserializing and introspecting the environment blob yourself. But with the new libraries, it's reduced to a single method call. In Python:

config = platformshconfig.Config()

creds = config.credentials('database')

This will return the access credentials to connect to the database relationship. Any relationship listed in .platform.app.yaml is valid there.
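
For example, here is a minimal sketch of turning those credentials into a PDO connection in PHP. It assumes a MySQL relationship named "database" in .platform.app.yaml; host, port, path (the database name), username, and password are the standard keys in a Platform.sh relationship definition.

$config = new \Platformsh\ConfigReader\Config();

if ($config->inRuntime()) {
  $creds = $config->credentials('database');

  // "path" holds the database name in a Platform.sh relationship.
  $dsn = sprintf('mysql:host=%s;port=%d;dbname=%s', $creds['host'], $creds['port'], $creds['path']);
  $db = new \PDO($dsn, $creds['username'], $creds['password']);
}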

What if you need the credentials formatted a particular way for a third-party library? Fortunately, the new clients are extensible. They support "credential formatters", which are simply functions (or callables, or lambdas, or whatever the language of your choice calls them) that take a relationship definition and format it for a particular service library.

For example, one of the most popular Node.js libraries for connecting to Apache Solr, solr-node wants the name of a collection as its own string. The Platform.sh relationship provides a path, since there are other libraries that use a path to connect. Rather than reformat that string inline, the Node.js library includes a formatter specifically for solr-node:

const solr = require('solr-node');
const config = require("platformsh-config").config();

let client = new solr(config.formattedCredentials('solr-relationship-name', 'solr-node'));

Et voila. client is now a solr-node client and is ready to be used. It's entirely possible to register your own formatters, too, and third-party libraries can include them as well:

config.registerFormatter('my-client-library', (creds) => {
  // Do something here to return a string, struct, dictionary, array, or whatever.
});

We've included a few common formatters in each library to cover some common libraries. We'll be adding more as time goes by, and, of course, PRs are always extremely welcome to add more!

But what about my language?

We wanted to get these three client libraries out the door and into your hands as soon as possible. But don't worry; Go and Ruby versions are already in the works and will be released soon.

We'll continue to evolve these new libraries, keeping the API roughly in sync between all languages, but allowing each to feel as natural as possible for each language.

Mar 16 2020

Category 1: Web development

Government organizations want to modernize and build web applications that make it easier for constituents to access services and information. Vendors in this category might work on improving the functionality of search.mass.gov, creating benefits calculators using React, adding new React components to the Commonwealth’s design system, making changes to existing static sites, or building interactive data stories.

Category 2: Drupal

Mass.gov, the official website of the Commonwealth of Massachusetts, is a Drupal 8 site that links hundreds of thousands of weekly visitors to key information, services, and other transactional applications. You’ll develop modules to enhance and stabilize the site; build out major new features; and iterate on content types so that content authors can more easily create innovative, constituent-centered services.

Category 3: Data architecture and engineering

State organizations need access to large amounts of data that’s been prepared and cleaned for decision-makers and analysts. You’ll take in data from web APIs and government organizations, move and transform it to meet agency requirements using technology such as Airflow and SQL, and store and manage it in PostgreSQL databases. Your work will be integral in helping agencies access and use data in their decision making.

Category 4: Data analytics

Increasingly, Commonwealth agencies are using data to inform their decisions and processes. You’ll analyze data with languages such as Python and R, visualize it for stakeholders in business intelligence tools like Tableau, and present your findings in reports for both technical and non-technical audiences. You’ll also contribute to the state’s use of web analytics to improve online applications and develop new performance metrics.

Category 5: Design, research, and content strategy

Government services can be complex, but we have a vision for making access to those services as easy as possible. Bidders for this category may work with partner agencies to envision improvements to digital services using journey mapping, user research, and design prototyping; reshape complex information architecture; help transform technical language into clear, public-facing content; and translate constituent feedback into new and improved website and service designs.

Category 6: Operations

You’ll monitor the system health for our existing digital tools to maintain uptime and minimize time-to-recovery. Your DevOps work will also create automated tests and alerts so that technical interventions can happen before issues disrupt constituents and agencies. You’ll also provide expert site reliability engineering advice for keeping sites maintainable and building new infrastructure. Examples of applications you’ll work on include Mass.gov, search.mass.gov, our analytics dashboarding platform, and our logging tool.

Mar 12 2020

by David Snopek on March 6, 2019 - 1:56pm

As you may know, Drupal 6 has reached End-of-Life (EOL) which means the Drupal Security Team is no longer doing Security Advisories or working on security patches for Drupal 6 core or contrib modules - but the Drupal 6 LTS vendors are and we're one of them!

Today, there is a Critical security release for the EU Cookie Compliance module to fix a Cross Site Scripting (XSS) vulnerability.

The module provides a banner where you can gather consent from the user when the website stores cookies.

The module doesn't sufficiently sanitize data for some interface labels and strings shown in the cookie policy banner.

This vulnerability is mitigated by the fact that an attacker must have a role with the permission "Administer EU Cookie Compliance banner".
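
As a generic illustration of this class of fix (not the actual patch), Drupal 6/7 code that outputs user-supplied strings should escape or filter them first:

// Escape everything when a label should be rendered as plain text.
$safe_label = check_plain($banner_label);

// Or allow only a small whitelist of harmless tags.
$safe_text = filter_xss($banner_message, array('a', 'em', 'strong'));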

See the security advisory for Drupal 7 for more information.

Here you can download the Drupal 6 patch or the full release.

If you have a Drupal 6 site using the EU Cookie Compliance module, we recommend you update immediately! We have already deployed the patch for all of our Drupal 6 Long-Term Support clients. :-)

If you'd like all your Drupal 6 modules to receive security updates and have the fixes deployed the same day they're released, please check out our D6LTS plans.

Note: if you use the myDropWizard module (totally free!), you'll be alerted to these and any future security updates, and will be able to use drush to install them (even though they won't necessarily have a release on Drupal.org).

Mar 12 2020

This post was created jointly by Michael Hess of the Security Working Group, and Tim Lehnen, Executive Director of the Drupal Association.

Last year, with the security release of SA-CORE-2018-002, the most significant security vulnerability since 2014, we heard the pain of site owners and development teams around the world staying up at all hours waiting for the complex security release process to complete and the patch to drop. We heard the pain of agencies and end-user organizations required to put teams on late shifts and overtime. We heard from some users who simply couldn't respond to patch their sites on the day of release, because of lack of resources or entrenched change management policies.

We've heard calls from the community for rotating the timezones for security advisories from release to release, or for having more on-call support from security team members across the globe, or simply for a longer horizon between the release of PSA and SA.

Yet at the same time, we're cognizant that these solutions would put increased burden on a security team composed of dedicated volunteers and contributors. There are a number of generous organizations who sponsor many of the members of the security team, but relying on their altruism alone is not a sustainable long-term solution—especially if we consider expanding the role of the security team to address the larger pain points above.

Last week, with the release of SA-CORE-2019-003, we heard these concerns for site owners and the sustainability of the security team echoed again.

The Security Team and the Drupal Association have been developing solutions for this issue for well over a year.

The goals are simple:

  • Provide a new service to the Drupal community, from small site owners to enterprise-scale end users, to protect their sites in the gap from security release to the time it takes them to patch.
  • Create a new model for sustainability for the Security Team, generating funding that 1) covers the operating costs of the program, 2) can support security team operations, and 3) can support additional Drupal Association programs.

Although the execution will require care and careful partnership, we are happy to announce that we've found a solution.

We're tentatively calling this: Drupal Steward. It is a service to be provided by the Drupal Association, the Security team, and carefully vetted hosting partners.

Drupal Steward will offer sites a form of mitigation through the implementation of web application firewall rules to prevent mass exploitation of some highly critical vulnerabilities (not all highly critical vulnerabilities can be protected in this fashion, but a good many can be - this method would have worked for SA-CORE-2018-002 for example).

It will come in three versions:

  • Community version - for small sites, low-budget organizations, and non-profits, we will offer a community tier, sold directly by the DA. This will be effectively at cost.
  • Self hosted version - for sites that are too large for the community tier but not hosted by our vendor partners.
  • Partner version - For sites that are hosted on vetted Drupal platform providers, who have demonstrated a commitment of contribution to the project in general and the security team in particular, protection will be available directly through these partners.

Next Steps

The Drupal Association and Security Team are excited to bring this opportunity to the Drupal Community.

We believe that the program outlined above will make this additional peace of mind accessible to the broadest base of our community possible, given the inherent costs, and are hopeful that success will only continue to strengthen Drupal's reputation both for one of the most robust security teams in open source, and for innovating to find new ways to fund the efforts of open source contributors.

We will announce more details of the program over the coming weeks and months as we get it up and running.

If you are a hosting company and are interested in providing this service to your customers, please reach out to us at [email protected].

Please also join us at DrupalCon for any questions about this program.

If you are a site owner and have questions you can join us in slack #drupalsteward.

For press inquiries, please contact us at: [email protected]

Feb 27 2020

by David Snopek on March 6, 2019 - 1:51pm

As you may know, Drupal 6 has reached End-of-Life (EOL) which means the Drupal Security Team is no longer doing Security Advisories or working on security patches for Drupal 6 core or contrib modules - but the Drupal 6 LTS vendors are and we're one of them!

Today, there is a Moderately Critical security release for the Ubercart module to fix a CSRF vulnerability.

The Ubercart module provides a shopping cart and e-commerce features for Drupal.

The taxes module doesn't sufficiently protect the tax rate cloning feature.

See the security advisory for Drupal 7 for more information.

Here you can download the Drupal 6 patch or the full release.

If you have a Drupal 6 site using the Ubercart module, we recommend you update immediately! We have already deployed the patch for all of our Drupal 6 Long-Term Support clients. :-)

If you'd like all your Drupal 6 modules to receive security updates and have the fixes deployed the same day they're released, please check out our D6LTS plans.

Note: if you use the myDropWizard module (totally free!), you'll be alerted to these and any future security updates, and will be able to use drush to install them (even though they won't necessarily have a release on Drupal.org).

Feb 25 2020

With the announcement of Drupal 9, we want to talk about how it affects our customers, what to expect when new versions come out, and what we do at Amazee Labs to ensure the transition will be painless.

“The big deal about Drupal 9 is … that it should not be a big deal.”

- Dries Buytaert, Drupal Founder
 

Background

The changes to Drupal between versions 7 and 8 were, quite frankly, enormous. Drupal previously had a justified reputation for doing its own thing and ignoring burgeoning standards and practices utilised elsewhere in the PHP community. So, when Drupal 8 was announced, one of the main goals of the release was to get off the Drupal island and start to utilise some of the millions of lines of open source code and documentation available elsewhere.

There were many great sides to this upgrade. The code was being built on a more solid and tested foundation, principally being based on the Symfony framework and leveraging numerous other systems and libraries. This helped Drupal become more enterprise focussed whilst opening the development field to engineers of other systems who were already familiar with the standards and practices now utilised in Drupal.

Unfortunately, the major technical upgrade to Drupal also introduced some headaches. Migrating between Drupal 7 and Drupal 8 can be time consuming and expensive. As a result of this, businesses who undertook such a migration can be forgiven for worrying about Drupal 9 being released just 5 years after Drupal 8. Some clients have expressed concern about using Drupal 8 when another expensive upgrade seems to be just around the corner.

Why Drupal 9 is different

In short, if you keep your Drupal 8 website up-to-date, there will be no major upgrade worries. The core maintainers of Drupal want to make upgrades easy forever from now on, and the Drupal team has a plan to ensure that moving to Drupal 9 will essentially be a minor update. This is possible because Drupal 9 will be built in the same manner as Drupal 8, with the same practices and core libraries. Unlike Drupal 7 to Drupal 8, there will be no major architectural or structural changes to the codebase.

The main changes, other than bug fixes, improvements and new features, will be upgrades to Drupal's core libraries. For example, Symfony 3 (the framework upon which Drupal 8 is built) reaches its end-of-life in 2021, so it makes sense to have Drupal 9 running on Symfony 4 at that point.

End of support flow chart

How is this easy upgrade achievable? Well, the Drupal team will continue its 6-month release cycle until Drupal 9 is released. In these releases, the code will be deprecated and upgraded to bring it closer to the components and libraries that will be used by Drupal 9, ensuring that when the time does come to upgrade everything will be in place for an easy transition.
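As a concrete illustration (a minimal sketch, not taken from the original article): a call to drupal_set_message(), which was deprecated in Drupal 8.5, can be swapped for the Messenger service so the same code runs unchanged on Drupal 9:

// Deprecated in Drupal 8.5 and removed in Drupal 9:
drupal_set_message(t('Settings saved.'));

// Drupal 9-ready replacement, available since Drupal 8.5:
\Drupal::messenger()->addMessage(t('Settings saved.'));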

Maintenance is key 

Keeping up with new releases and updates ensures that your website stays relevant and secure, and also means that switching from Drupal 8 to 9 will be much more routine. By partnering with us even after your website is created, we can take proactive steps such as making sure there’s no deprecated code in your site before the newest release.
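If you want to check for deprecated code yourself, a community static-analysis tool such as Matt Glaman's drupal-check can scan a codebase for deprecated API usage (assuming it is installed; the path below is an example):

drupal-check web/modules/custom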

Feb 20 2020

Drupal Camp London is a 3-day celebration of the users, designers, developers and advocates of Drupal and its community! Attracting 500 people from across Europe, it's one of the biggest events in the Drupal calendar after DrupalCon. As such, we're pleased to be sponsoring the event for the 6th time!

Drupalcamp weekend (13th-15th March) packs in a wide range of sessions featuring seminars, Birds of a Feather talks, sprints and much more. Over the weekend there are 3 keynotes addressing the biggest upcoming changes to the technical platform, its place in the market, and the wider Drupal community.

Check out all of the accepted sessions on the Drupal Camp London website here. Or keep reading to see our highlights…

CXO Day - Friday 13th of March

From Front Room to Front Runner: how to build an agency that thrives, not just survives - Talk from Nick Rhind

Few digital agency start-ups reach their first birthday, let alone celebrate over 16 years of success. Our CEO Nick Rhind will be sharing anecdotes and advice from 2 decades of building the right teams to help his agency group, CTI Holdings, thrive.

Catch up with Nick, or any of our team attending Drupal Camp by connecting with them on LinkedIn, or via our contact form.

Come dine with us - Agency Leaders Dinner London

Hosts Paul Johnson (CTI Digital), Piyush Poddar (Axelerant), and Michel Van Velde (One Shoe) cordially invite agency leaders to join them for a night of meaningful discussions, knowledge sharing, and of course great food, excellent wine, and the best company you could ask for. Details of the dinner can be found here.

DCL Agency Leaders Dinner 2020

Drupal Camp Weekend

Drupal in UK Higher Education - A Panel Conversation

Paul Johnson, Drupal Director at CTI Digital, will be hosting influential bodies from the Higher Education community as they discuss the challenges facing universities in a time of light-speed innovation and changing demand from students. In addition, they will explore the role Drupal has played in their own success stories and the way open source can solve problems for other universities. Drupal camp panel details available here.

The Panellists:

Adrian Ellison, Associate Pro Vice-Chancellor & Chief Information Officer University of West London - Adrian has been involved in Registry, IT and Library Services in higher education for over 20 years. He joined UWL in 2012 from the London School of Economics, where he was Assistant Director of IT Services. Prior to that, he was IT Director at Royal Holloway, University of London, and held several roles at the University of Leeds.

Adrian is a member of the UCISA Executive Committee, representing the voice of IT in UK education. He has spoken about information technology at national and international conferences and events and co-wrote the Leadership Foundation for Higher Education’s 'Getting to Grips with Information and Communications Technology' and UCISA’s ‘Social Media Toolkit: a practical guide to achieving benefits and managing risks’.

Billy Wardrop, CMS Service Support Officer at the University of Edinburgh - Billy is a Senior Developer with 15+ years' experience and the current technical lead for the migration to Drupal 8 at The University of Edinburgh. He has worked with many platforms, but his passion lies in developing websites and web applications using open source technologies such as Drupal, PHP, JavaScript and Python. Billy is an advocate for growing the open-source community. As part of his role in the university, he regularly mentors at events and encourages software contribution.

Iain Harper, Head of Digital, Saïd Business School, University of Oxford - Iain started his career at leading medical insurer MPS, developing their first online presence. He then ran digital projects at a leading CSR consultancy, Business in the Community, before joining the Civil Service. Iain worked with the Government Digital Service on Alphagov, the precursor to GOV.UK. He then joined Erskine Design, a small digital agency based in Nottingham, where he supervised work with the BBC on their Global Experience Language (GEL). He now leads the digital team at Oxford University's Saïd Business School.

Open source has won. How do we avoid dying from success? - A Panel Conversation

Drupal, founded on a philosophy of open source, has steadily grown into a global community, a feat some may label as having achieved ‘Success’. Drupal users and contributors will be discussing the sustainability of Drupal and the future of open source in an open panel session.

What are the challenges faced by different roles? How can we make the whole ecosystem fair and future proof? What does an open source business model look like? 

Join our very own Paul Johnson and Drupal panellists for this thought-provoking discussion on the future of open source. More details on the session are available here.

Why should you attend Drupal Camp?

Share useful anecdotes and up-to-date knowledge 

Discover the latest in UX, design, development, business and more. There’s no limit to the types of topics that could come up...as long as they relate to Drupal that is!

Meet peers from across the industry

From C-level executives and site managers to developers and designers, over 500 people attended last year. Meet the best and brightest in the industry at talks and breakouts.

Find your next project or employer

A wide variety of businesses and developers attend Drupal Camp; make the most of it by creating connections to further your own career or grow your agency team.

Mobomo’s Picks: Top 10 Drupal Websites

Jan 30 2020

Jan 23 2020

In the Drupal support world, working on Drupal 7 sites is a necessity. But switching between Drupal 7 and Drupal 8 development can be jarring, if only for the coding style.

Fortunately, I’ve got a solution that makes working in Drupal 7 more like working in Drupal 8. Use this three-part approach to have fun with Drupal 7 development:

  • Apply Xautoload to keep your PHP skills fresh, modern, and compatible with all frameworks and make your code more reusable and maintainable between projects. 
  • Use the Drupal Libraries API to use third-party libraries. 
  • Use the Composer template to push the boundaries of your programming design patterns. 

Applying Xautoload

Xautoload is simply a module that enables PSR-0/4 autoloading. Using Xautoload is as simple as downloading and enabling it. You can then start using use and namespace statements to write object-oriented programming (OOP) code.

For example:

xautoload.info

name = Xautoload Example
description = Example of using Xautoload to build a page
core = 7.x
package = Midcamp Fun

dependencies[] = xautoload:xautoload

xautoload_example.module

<?php

use Drupal\xautoload_example\SimpleObject;

function xautoload_example_menu() {
  $items['xautoload_example'] = array(
    'page callback' => 'xautoload_example_page_render',
    'access callback' => TRUE,
  );
  return $items;
}

function xautoload_example_page_render() {
  $obj = new SimpleObject();
  return $obj->render();
}

src/SimpleObject.php

<?php

namespace Drupal\xautoload_example;

class SimpleObject {

  public function render() {
    return array(
      '#markup' => "<p>Hello World</p>",
    );
  }

}

Enabling and running this code causes the URL /xautoload_example to spit out “Hello World”. 

You’re now ready to add in your own OOP!

Using Third-Party Libraries

Natively, Drupal 7 has a hard time autoloading third-party library files. But there are contributed modules (like Guzzle) out there that wrap third-party libraries. These modules wrap object-oriented libraries to provide a functional interface. Now that you have Xautoload in your repertoire, you can use its functionality to autoload libraries as well.

I’m going to show you how to use the Drupal Libraries API module with Xautoload to load a third-party library. You can find examples of all the different ways you can add a library in xautoload.api.php. I’ll demonstrate an easy example by using the php-loremipsum library:

1. Download your library and store it in sites/all/libraries. I named the folder php-loremipsum. 

2. Add a function implementing hook_libraries_info to your module by pulling in the namespace from Composer. This way, you don’t need to set up all the namespace rules that the library might contain.

function xautoload_example_libraries_info() {
  return array(
    'php-loremipsum' => array(
      'name' => 'PHP Lorem Ipsum',
      'xautoload' => function ($adapter) {
        $adapter->composerJson('composer.json');
      },
    ),
  );
}

3. Change the page render function to use the php-loremipsum library to build content.

use joshtronic\LoremIpsum;

function xautoload_example_page_render() {
  $library = libraries_load('php-loremipsum');
  if ($library['loaded'] === FALSE) {
    throw new \Exception("php-loremipsum didn't load!");
  }
  $lipsum = new LoremIpsum();
  return array(
    '#markup' => $lipsum->paragraph('p'),
  );
}

Note that I needed to tell the Libraries API to load the library, but I then have access to all the namespaces within the library. Keep in mind that the dependencies of some libraries are immense. You'll very likely need to run Composer from within the library and commit the result when you first start out. In such cases, you might need to make sure to include the Composer autoload.php file.
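As a rough sketch, that include could look like the following (the path is hypothetical and depends on where the library keeps its vendor directory):

// Hypothetical path - adjust to wherever the library's vendor directory lives.
require_once DRUPAL_ROOT . '/sites/all/libraries/php-loremipsum/vendor/autoload.php';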

Another tip: Abstract your libraries_load() functionality in such a way that if the class you want already exists, you don't call libraries_load() again. Doing so removes Libraries API as a hard dependency of your module and enables you to use Composer to load the library later on with no more work on your part. For example:

function xautoload_example_load_library() {
  if (!class_exists('\joshtronic\LoremIpsum', TRUE)) {
    if (!module_exists('libraries')) {
      throw new \Exception('Include php-loremipsum via composer or enable libraries.');
    }
    $library = libraries_load('php-loremipsum');
    if ($library['loaded'] === FALSE) {
      throw new \Exception("php-loremipsum didn't load!");
    }
  }
}

And with that, you’ve conquered the challenge of using third-party libraries!

Setting up a New Site with Composer

Speaking of Composer, you can use it to simplify the setup of a new Drupal 7 site. Just follow the instructions in the Readme for the Composer Template for Drupal Project. From the command line, run the following:

composer create-project drupal-composer/drupal-project:7.x-dev <YOUR SITE DIRECTORY> --no-interaction

This code gives you a basic site with a source repository (a repo that doesn’t commit contributed modules and libraries) to push up to your Git provider. (Note that migrating an existing site to Composer involves a few additional considerations and steps, so I won’t get into that now.)

If you’re generating a Pantheon site, check out the Pantheon-specific Drupal 7 Composer project. But wait: the instructions there advise you to use Terminus to create your site, and that approach attempts to do everything for you—including setting up the actual site. Instead, you can simply use composer create-project to test your site in something like Lando. Make sure to run composer install if you copy down a repo.

From there, you need to enable the Composer Autoload module, which is automatically required in the composer.json you pulled in earlier. Then, add all your modules to the require portion of the file or use composer require drupal/module_name just as you would in Drupal 8.

You now have full access to all the Packagist libraries and can use them in your modules. To use the previous example, you could remove php-loremipsum from sites/all/libraries and instead run composer require joshtronic/php-loremipsum. The code would then run the same as before.

Have fun!

From here on out, it’s up to your imagination. Code and implement with ease, using OOP design patterns and reusable code. You just might find that this new world of possibilities for integrating new technologies with your existing Drupal 7 sites increases your productivity as well.

Drupal 9: What’s New, What to Expect

Jan 17 2020

Nov 04 2019
A proactive approach for cleaner Drupal coding


So you are stuck in the cruft, struggling to create some semblance of sanity within a sea of code-rot. Code standards sound like a great idea for your project, but perhaps automated enforcement tools look like more of a pain than they're worth. This post is intended for Drupal developers using PhpStorm who need fast, flexible, standards enforcement tools.

Maintaining a stringent standard for your codebase is a battle. On one hand, your code is cleaner, more unified, and easier to maintain. On the other hand, these little formatting rules cause frustration and time loss - especially if a tiny slip causes you to waste a full pipeline cycle just to pass an automated standards check. As they say, the best defence is a strong offence, and the tools proposed here will help you find and fix standards violations before they reach a pipeline.

Drupal recommends a tool called PHP Code Sniffer, aka phpcs, to scan your files for Drupal Code Standards violations. Thankfully, it also comes with a companion tool called PHP Code Beautifier and Fixer, aka phpcbf, which fixes the small, tedious violations for you.

The goal of this post is to get phpcs and phpcbf under your fingertips and into your habits. Only once you have hotkeys set up to run these tools while coding will they become useful — instead of just annoying.

The steps are as follows:

  1. Install and set up phpcs
  2. Create a custom combination of rulesets
  3. Integrate with PhpStorm for hotkeys and syntax highlighting

1. Install and set up phpcs

It may seem straightforward to install phpcs globally via Composer or apt, or to simply require it in your current composer project. However, a global install is not easy to customize and share. Instead, I recommend using a standalone repo that is specifically for your code standards tools. When your standards stand alone, they are easier to edit, share with teammates, and transfer to new work environments.

Here’s a simple repo to get you started:
https://github.com/nilswloewen/drupal_code_standards

  1. If you currently have phpcs or phpcbf installed globally, uninstall them before proceeding.
  2. Quick install with example repo:
    git clone [email protected]:nilswloewen/drupal_code_standards.git
    cd drupal_code_standards
    composer install
  3. Once composer has installed phpcs for you, add it to your global path with:

    export PATH=$PATH:~/PATH/TO/drupal_code_standards/vendor/bin
    NOTE: Adjust accordingly for your shell and OS of choice.
  4. Next, you must tell phpcs which rulesets you have installed.

    The optional tool phpcodesniffer-composer-installer will automatically detect rulesets in your composer package and set the phpcs & phpcbf installed_paths for you. This is part of the example repo, and the next step should already have been done for you during "composer install".

    However, to set the installed ruleset paths manually, run:

    phpcs --config-set installed_paths vendor/drupal/coder/coder_sniffer,vendor/phpcompatibility/php-compatibility/PHPCompatibility
    

    Then confirm that phpcs knows about the rulesets within the installed paths with:

    phpcs -i

    You should see this list that confirms your rulesets:

    The installed coding standards are ... PHPCompatibility, Drupal and DrupalPractice
    

    You may need to set installed paths for phpcbf as well using the same process.
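    For example, a sketch using the same flags and paths as the phpcs command above:

    phpcbf --config-set installed_paths vendor/drupal/coder/coder_sniffer,vendor/phpcompatibility/php-compatibility/PHPCompatibility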

2. Create a custom combination of rulesets

Out of the box, phpcs can only run one standard at a time. This is a problem when working with Drupal because we have two standards to follow. For this post I have added a third standard, PHPCompatibility, which is helpful when upgrading PHP versions on legacy projects.

  1. To combine standards we first create a custom ruleset that references multiple rulesets. Note that this is already included in the example repo as phpcs-custom-standards.xml.
    <?xml version="1.0"?>
    <ruleset name="Custom code standards">
      <rule ref="Drupal"/>
      <rule ref="DrupalPractice"/>
      <rule ref="PHPCompatibility"/>
    </ruleset>
  2. Then set this standard as your default. Use an absolute path to ensure your standard will be found no matter what context phpcs is called from.
    phpcs --config-set default_standard ~/PATH/TO/drupal_code_standards/phpcs-custom-standards.xml
    
    See the example standard for a few other helpful settings.
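For illustration, here is a hedged sketch of what such extra settings can look like (the extensions argument and exclude pattern are standard phpcs ruleset options; the example repo may differ):

    <?xml version="1.0"?>
    <ruleset name="Custom code standards">
      <rule ref="Drupal"/>
      <rule ref="DrupalPractice"/>
      <rule ref="PHPCompatibility"/>
      <!-- Scan all the file extensions Drupal uses, not just *.php. -->
      <arg name="extensions" value="php,module,inc,install,test,profile,theme,css,info,txt,md,yml"/>
      <!-- Skip third-party code. -->
      <exclude-pattern>*/vendor/*</exclude-pattern>
    </ruleset>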

3. Integrate with PhpStorm for hotkeys and syntax highlighting

There are two levels of integration with PhpStorm: Passive and Active.

Passive

Passive code analysis with PhpStorm Inspections will give you syntax highlighting and hover-over explanations of the file you are currently working on.

PhpStorm passive integration

This is quite helpful when dealing with one file at a time, but when you need to get an entire directory to pass standards, you need a way to hunt for violations.

Active

Active analysis is when you use phpcs to scan many files at once. You can do this through the terminal with a command like:

phpcs ~/module # Scans all applicable files in dir.
phpcs ~/module/example.php # Scans only a specific file.

However, it’s a pain to open a terminal window, navigate to the file you are working on, and then type a command. You’ll probably forget or neglect to check your work because of these extra steps. A better way to run phpcs is to set up hotkeys within PhpStorm to scan your files instantly.

Configure PhpStorm inspections

  1. Register phpcs and phpcbf as PHP Quality Tools.

    Settings | Languages and Frameworks | PHP | Quality Tools | Code Sniffer

    PhpStorm PHP Quality Tools settings
  2. Enable the inspection.

    Settings | Editor | Inspection | PHP | Quality Tools

    PhpStorm PHP Inspections settings

  • Set the extension list to match what the Drupal standard sets:
    php,module,inc,install,test,profile,theme,css,info,txt,md,yml
    
  • DO NOT set the "Installed standard paths", as this overrides what you have previously set in the command line.
  • The refresh list button on "Coding Standard" should mimic what "phpcs -i" shows. Choose "Custom" Coding Standard and then click the ellipses to choose the path to your custom standards file (i.e. phpcs-custom-standards.xml).
  • Click OK and your inspections should be working!

Configure hotkeys

  1. Register phpcs and phpcbf as external tools.

    Settings | Tools | External Tools

    PhpStorm External Tools settings

    The "$FilePath" argument runs the tool against the file you are currently working on, or against a selected folder when in project view.
  2. Double check that this step works by running the tool from the main menu.

    Tools | External Tools | phpcs

    Running phpcs external tool

  3. This is the special sauce. Configure a keyboard shortcut for your new tools.

    Settings | Keymap | External Tools

    PhpStorm Keymap settings

  4. Right click on the external tool you just registered and add a keyboard shortcut. "Ctrl+Alt+Shift+Comma" was simply a combination that was not used anywhere else in my setup.

Bringing it all together

Now you can actively use phpcs and phpcbf while you code! I frequently use the phpcbf hotkey while writing new code to do the tedious stuff for me, such as creating doc blocks and pushing whitespace around. Here's an example:

Use phpcbf in PhpStorm with a hotkey

With phpcs and phpcbf now under your fingertips you are set to be much more assertive in your application of code standards!

Taking it to the next level

If you are using Gitlab for CI/CD, which I hope you are, another great strategy for enforcing standards is to create a pre-deployment job that scans your custom code for violations. This will keep your team (and you) in check by stopping standards violations from being auto-deployed.

After a few super annoying pipeline failures for minor syntax errors, you will want this next level of enforcement — git pre-commit hooks. I highly recommend using grumphp to manage this for you.

Best of luck keeping your code readable and up to snuff!

End-to-end Drupal services

As a full service Drupal agency, Acro Media has significant expertise in digital commerce architecture, ecommerce design, customer experience, software development and hosting architecture. If you’re looking for a Drupal agency dedicated to code and project quality, check us out. We would love the opportunity to talk.

View Our Drupal Commerce Services

How to start contributing to Drupal without code - DrupalCon Amsterdam Edition

Nov 01 2019

Aug 03 2019

Approaching 20 years old, the Drupal Community must prioritize recruiting the next generation of Drupal Professionals

by Kaleem Clarkson

Ferris Wheel in Centennial Olympic Park in Atlanta, Georgia

Time flies when you are having fun. One of those phrases I remember my parents saying that turned out to be quite true. My first Drupal experience was nearly 10 years ago and within a blink of an eye, we have seen enormous organizations adopt and commit to Drupal such as Turner, the Weather Channel, The Grammys, and Georgia.gov.

Throughout the years, I have been very fortunate to meet a lot of Drupal community members in person but one thing I have noticed lately is that nearly everyone’s usernames can be anywhere between 10–15 years old. What does that mean? As my dad would say, it means we are getting O — L — D, old.

For any thriving community, family business, organization, or even your favorite band for that matter, succession planning is a must. What is succession planning?

Succession planning is a process for identifying and developing new leaders who can replace old leaders when they leave, retire or die. -Wikipedia

That’s right, we need to start planning a process for identifying who can take over in leadership roles that continue to push Drupal forward. If we intend to promote Drupal as the solution for large and small enterprises, then we should market ourselves as a viable career option to lure talent to our community.

There are many different ways to promote our community and develop new leaders, one of which is mentorship. Mentorship helps ease the barrier to entry into our community by providing guidance around how our community operates. The Drupal community does have some great mentoring efforts taking place, such as the Drupal Diversity & Inclusion (DDI) initiative, the core mentoring initiative, and of course the code and mentoring sprints at DrupalCon and DrupalCamps. These efforts are awesome and should be recognized as part of a larger strategic initiative to recruit the next generation of Drupal professionals.

Companies spend billions of dollars a year in recruiting but as an open-source community, we don’t have billions so

… what else can we do to attract new Drupal career professionals?

This year, the Atlanta Drupal Users Group (ADUG) decided to develop the Drupal Career Summit, all in an effort to recruit more professionals into the Drupal community. Participants will explore career opportunities, career development, and how open source solutions are changing the way we buy, build, and use technology.

  • Learn about job opportunities and training.
  • Hear how local leaders progressed through their careers and the change open source creates for their clients and businesses.
  • Connect one-on-one with professionals in the career you want and learn about their progression, opportunities, challenges, and wins.

On Saturday, September 14, from 1pm-4:30pm. Hilton Garden Inn Atlanta-Buckhead, 3342 Peachtree Rd. NE | Atlanta, GA 30326 | LEARN MORE

Students and job seekers can attend for FREE! The Summit will allow you to meet with potential employers and industry leaders. We’ll begin the summit with a panel of marketers, developers, designers, and managers who have extensive experience in the tech industry, and more specifically, the Drupal community. You’ll get a chance to learn about career opportunities and connect with peers with similar interests.

We’re looking for companies that want to hire and educate. You can get involved with the summit by becoming a sponsor for DrupalCamp Atlanta. Sponsors of the event will have the opportunity to engage with potential candidates through sponsored discussion tables and branded booths. With your sponsorship, you’ll get a booth, a discussion table, and 2 passes! At your booth, you’ll get plenty of foot traffic and a fantastic chance to network with attendees.

If you can’t physically attend our first Career Summit, you can still donate to our fundraising goals. And if you are not in a position to donate, invite your employer, friends, and colleagues to participate in the Drupal Career Summit.

Aug 01 2019

Adam stood in the middle of the garden, enveloped in exquisite beauty. The world was there to delight him: succulent fruit, dignified trees, green meadows, sprinkling pools and species of all kinds. Yet as he stood contemplating nature, he felt a certain loneliness, and thus the Lord said

It is not good that man is alone. I shall make him a compatible helper.

With the creation of other species, both male and female sprang up at the same time. If the beginning of the entire universe was chosen to be this way, how can business be any good without clients and a strong relationship with them, right?

 Image of two hands where the upper one is offering an apple to the lower one


A productive and enduring relationship not only provides consistent value to clients but also builds a healthy connection in every business venture.

Though there are times when you get stuck in a rut with clients and the relationship starts to rot. 

So, how do you change it? 

Maybe with some strategies, or maybe with the help of some plan. Whatever it may be, here are some approaches you can adopt to sweeten up that sour relationship and add more productivity to a particular project.

But how can a perfect client relationship get ruined?

Under perfect circumstances, organizations and big enterprises treat their clients right. However, when they are under pressure to sell more or retain paying customers, chances are that they might deviate from their standards, resulting in poor client satisfaction.

With this context, here are some of the actions that can kill a perfectly built client relationship:

Saying yes to a client when you should not 

There is no shame in accepting the fact that your organization can meet only a certain level of expectations and not beyond it. Taking on clients who are not a good match is foolishness. If a particular organization knows it is going to hate dealing with a client, or that it might fail to meet the client's quality standards, the money is not worth the inevitable breakdown.

Overpromising 

When there is a wide gap between expectations and reality, the result is disappointment. If you are selling software, or any product for that matter, don’t promise an integration that will take a week or so and won’t work perfectly. Make only those commitments that you know are humanly and technically feasible. Overpromising only deepens the disappointment.

Not addressing the key details 

When you are serving a client, it is necessary to cover each and every detail of the project. Leaving details out by accident is one thing; leaving them out intentionally will wreck the relationship. So address each and every key detail.

Being inauthentic

If you focus only on yourself and what works for you, rather than considering what's best for your team, company, or business partnership, the relationship suffers. Adopting an all-or-nothing attitude, acting however is needed to win favor, seal a deal, or make a sale, even if it means lying or misrepresenting your position, is a recipe for a sour relationship.

Image showing a handset in red color with text as “The Customer”. There are arrows that are connected to it and many doodles depicting important factors

Taking These Few Important Steps for an Enduring Relationship 

We all know that a huge amount of time and effort goes into acquiring clients, yet very few businesses spend the same energy nurturing the relationship. Here are some tips to help you make your client relationships endure.

Communication is the key 

Clients depend on you to keep them informed. Constant communication with them should be a top priority. This includes updating them on various projects, as well as flagging any bumps you may encounter along the product delivery journey.

Information distribution

Don’t delay sharing knowledge that might be useful to your clients, whether or not it benefits your organization. The more value you provide, the more a client comes to depend on you. There should be no hesitation in sharing important and crucial data.

Integrity 

If you are not honest with your client, and vice versa, no long-term relationship survives. In addition to producing a product or service, your client requires you to show responsibility in all your dealings. Clients these days are really intelligent; they understand when they are being deceived or misled. Telling a “white lie” about why you failed can ruin your reputation. And without a reputation for integrity, you will fail to cultivate the kind of long-term relationships your business stands on.

Encourage multi-player team involvement 

The success of any project depends on the contribution of every member of the team. Encouraging multi-player involvement, including the dev team, can bring laurels to your project. Team members then have a sense of ownership in the group project and believe that their contributions are valued, so they feel motivated to share their best work.

Goals 

There might be times when you feel that you and your client are not on the same page. You have your own objectives, and your client has theirs. The solution to this common issue is to set mutual goals.

As soon as you start a new project and commit to deadlines, you are helping the client with a vital product or service that might otherwise not be available in time to meet their needs. Set mutual goals from the very beginning to avoid any friction later on.

Work for a strong partnership 

If you are building the relationship in all the appropriate ways, and of course providing the products and services your client needs, you can work on developing a partnership with the client - something that goes beyond project delivery.

A client who determines that the organization serving them is in it for the long haul, and is motivated to help them succeed, soon starts to view that organization as more than just a vendor or supplier. You become a partner in their enterprise and someone they grow to value today, tomorrow, and in the years to come.

Looking into the performance 

Re-examine the cost 

If you have been working with a particular customer for a long time, re-examine what it really costs you to serve them. If it has become cheaper, it may be feasible to cut your price.

Perceiving the Product 

Instead of talking about what your product is or what it does, focus on how it makes customers feel. Even if you sell software, your software may relieve the stressful feeling of trying to get work done in a limited amount of time. It may make customers feel confident they are doing the job right.

Modify the budget strategy

Reposition what you sell from a capital cost to an expense if your customer’s CEO won’t approve your product. Often, capital spending is prohibited while monthly expenses continue to be budgeted.

Finding an efficient distributor 

Sell your wares through a distributor if customers start to need smaller quantities or more service. Perhaps your service has declined as you pursued larger customers. If so, get a third party to sell to and service your customer properly. You don’t need to make as much if you are doing less.

Selling your Service 

If they won’t buy your service by the unit, sell it by the hour or by the result. Many times buyers are told to cut costs by cutting inventory.

Offer a warranty if your product is at fault

If your product or service was deficient, offer some kind of assurance that it won’t be a problem next time.

Managing the departments 

The reasons customers buy from you can change over time. A purchasing department may make the decisions until its company has legal or customer problems, at which point the finance or marketing department may have the final say.

Managing projects with the help of various methods 

  • Waterfall: One of the more traditional project management methodologies, Waterfall is a linear, sequential design approach where progress flows downwards in one direction like a waterfall.
     
  • Agile development:  Agile is best suited for projects that are iterative and incremental. It’s a type of process where demands and solutions evolve through the collaborative effort of self-organizing and cross-functional teams and their customers
     
  • Scrum: Scrum is built on five values: commitment, courage, focus, openness, and respect. Its goal is to develop, deliver, and sustain complex products through collaboration, accountability, and iterative progress.
     
  • Kanban: Kanban is another popular Agile framework that, similar to Scrum, focuses on early releases with collaborative and self-managing teams.
     
  • Six Sigma: It aims to improve quality by reducing the number of errors in a process by identifying what is not working and then removing it from the process.

Case Studies 

Ivey Business Journal 

A three-year cross-industry study by the Ivey Business Journal explained how poor business strategy, inappropriate communication, or damaged working relationships between partners account for 94 percent of all broken and failed alliances. On their own, poor or damaged working relationships account for 52 percent of all broken alliances.

There are several reasons an alliance breaks down: interpersonal problems, failures of communication between team members, high attrition rates, and, most importantly, failure to reach a milestone.

When an alliance is recognized as broken, there are many critical tasks to perform and many separate decisions to be made. Partners need to diagnose why the alliance has broken down; examine and interpret the existing obstacles, disputes, or tensions; and create a specific procedure to overcome these problems. They must equip themselves to sustain a long-term relationship.

To relaunch your relationship with your client, a three-step process can be followed:

  • Audit the relationship to diagnose the root causes.
  • Confirm that both organizations are fully persuaded the alliance is the most effective means to meet their goals - the partnership can succeed only if they are.
  • Conduct relationship planning to build a joint contract and a shared understanding of the deal.
Pie chart showing the causes of partnership failure (52%, 37%, and 11% segments)


OEM Profitability and Supplier Relations 

OEM Profitability and Supplier Relations - which is based in part on data gathered over the past 13 years from the annual Working Relations Index Study published by consultancy Planning Perspectives - found the better the relationship an automotive manufacturer has with its suppliers, the greater its profits are.

It explained that the relationship “quantifies the economic value of suppliers”. This includes a supplier sharing new technology, providing the best team to support the manufacturer, and providing support that goes beyond the supplier's contractual obligation.
 
The report added the research “establishes the fact that the economic value of the suppliers’ non-price benefits can greatly exceed the economic benefit realized from suppliers’ price concessions”. On average, this can be up to four to five times greater, according to the research.

Conclusion 

To earn customer loyalty in today’s rapidly changing competitive world, companies need to rethink:

  • How do they engage customers?
  • Do they have the appetite required to build loyal relationships?
  • Is it even the right strategy for them in the first place?

Determine what your business and shareholders need first. If it’s short-term financial gains, then customer loyalty should not be a stated goal. Clients seek relationships with their vendors. They want a place to be heard, a place to be appreciated, and a place to connect.

At Opensense Labs, we use social technologies and services that allow us to take relationships with customers to higher levels. Connecting with customers’ personal values helps place you ahead of the competition in winning the hearts and minds of your customers.

Ping us at [email protected] now.

Jul 30 2019

“Oh snap”, said the project manager. “The client has this whole range of rich articles they probably are expecting to still work after the migration!”

The project was a relaunch of a Drupal 7 / Commerce 1 site, redone for Drupal 8 and Commerce 2. A couple of weeks before the relaunch, and literally days before the client was allowed in to see the staging site, we found out we had forgotten a whole range of rich articles where the client had carefully crafted landing pages, campaign pages and “inspiration” pages (this is an interior-design type of store). The pages were panel nodes, each with a handful of different panel panes (all custom).

In the new site we had made Layout builder available to make such pages.

We had 2 options:

  • Redo all of them manually with copy paste.
  • Migrate panel nodes into layout builder enabled nodes.

“Is that even possible?”, said the project manager.

Well, we just have to try, don’t we?

Creating the destination node type

First off, I went ahead and created a new node type called “inspiration page”. And then I enabled layout builder for individual entities for this node type.

Now I was able to create “inspiration page” landing pages. Great!

Creating the migration

Next, I went ahead and wrote a migration plugin for the panel nodes. It ended up looking like this:

id: mymodule_inspiration
label: mymodule inspiration
migration_group: mymodule_migrate
migration_tags:
  - mymodule
source:
  # This is the source plugin, that we will create.
  plugin: mymodule_inspiration
  track_changes: TRUE
  # This is the key in the database array.
  key: d7
  # This means something to the d7_node plugin, that we inherit from.
  node_type: panel
  # This is used to create a path (not covered in this article).
  constants:
    slash: '/'
process:
  type:
    plugin: default_value
    # This is the destination node type
    default_value: inspiration_page
  # Copy over some values
  title: title
  changed: changed
  created: created
  # This is the important part!
  layout_builder__layout: layout
  path:
    plugin: concat
    source:
      - constants/slash
      - path
destination:
  plugin: entity:node
  # This is the destination node type
  default_bundle: inspiration_page
dependencies:
  enforced:
    module:
      - mymodule_migrate

As mentioned in the annotated configuration, we need a custom source plugin for this. So, let’s take a look at how we make that:

Creating the migration plugin

If you have a module called “mymodule”, you create a folder structure like so, inside it (just like other plugins):

src/Plugin/migrate/source

And let’s go ahead and create the “Inspiration” plugin, a file called Inspiration.php:

<?php

namespace Drupal\mymodule_migrate\Plugin\migrate\source;

use Drupal\Component\Uuid\UuidInterface;
use Drupal\Core\Entity\EntityManagerInterface;
use Drupal\Core\Extension\ModuleHandlerInterface;
use Drupal\Core\State\StateInterface;
use Drupal\layout_builder\Section;
use Drupal\layout_builder\SectionComponent;
use Drupal\migrate\Plugin\MigrationInterface;
use Drupal\migrate\Row;
use Drupal\node\Plugin\migrate\source\d7\Node;
use Symfony\Component\DependencyInjection\ContainerInterface;

/**
 * Panel node source, based on panes inside a panel page.
 *
 * @MigrateSource(
 *   id = "mymodule_inspiration"
 * )
 */
class Inspiration extends Node {

  /**
   * Uuid generator.
   *
   * @var \Drupal\Component\Uuid\UuidInterface
   */
  protected $uuid;

  /**
   * Inspiration constructor.
   */
  public function __construct(
    array $configuration,
    $plugin_id,
    $plugin_definition,
    MigrationInterface $migration,
    StateInterface $state,
    EntityManagerInterface $entity_manager,
    ModuleHandlerInterface $module_handler,
    UuidInterface $uuid
  ) {
    parent::__construct($configuration, $plugin_id, $plugin_definition,
      $migration, $state, $entity_manager, $module_handler);
    $this->uuid = $uuid;
  }

  /**
   * {@inheritdoc}
   */
  public static function create(ContainerInterface $container, array $configuration, $plugin_id, $plugin_definition, MigrationInterface $migration = NULL) {
    return new static(
      $configuration,
      $plugin_id,
      $plugin_definition,
      $migration,
      $container->get('state'),
      $container->get('entity.manager'),
      $container->get('module_handler'),
      $container->get('uuid')
    );
  }

}

OK, so this is the setup for the plugin. For this specific migration, there were some weird conditions determining which of the panel nodes were actually inspiration pages. If I copy-pasted them here, you would think I was insane, so for now I will just mention that we were overriding the query() method. You may or may not need to do the same.
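A minimal sketch of such an override (the extra condition below is hypothetical - our real filters were far messier):

  /**
   * {@inheritdoc}
   */
  public function query() {
    $query = parent::query();
    // Hypothetical extra filter: only consider published panel nodes.
    $query->condition('n.status', 1);
    return $query;
  }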

So, after getting the query right, we are going to do some work inside the prepareRow() method:

  /**
   * {@inheritdoc}
   */
  public function prepareRow(Row $row) {
    $result = parent::prepareRow($row);
    if (!$result) {
      return $result;
    }
    // Get all the panes for this nid.
    $did = $this->select('panels_node', 'pn')
      ->fields('pn', ['did'])
      ->condition('pn.nid', $row->getSourceProperty('nid'))
      ->execute()
      ->fetchField();
    // Find all the panel panes.
    $panes = $this->getPanelPanes($did);
    $sections = [];
    $section = new Section('layout_onecol');
    $sections[] = $section;
    foreach ($panes as $delta => $pane) {
      if (!$components = $this->getComponents($pane)) {
        // You must decide what you want to do when a panel pane can not be
        // converted.
        continue;
      }
      // Here we used to have some code dealing with changing section if this
      // and that. You may or may not need this.
      foreach ($components as $component) {
        $section->appendComponent($component);
      }
    }
    $row->setSourceProperty('layout', $sections);
    // Don't forget to migrate the "path" part. This is left out for this
    // article.
    return $result;
  }

Now you may notice there are some helper methods there. They look something like this:

  /**
   * Helper.
   */
  protected function getPanelPanes($did) {
    $q = $this->select('panels_pane', 'pp');
    $q->fields('pp');
    $q->condition('pp.did', $did);
    $q->orderBy('pp.position');
    return $q->execute();
  }

  /**
   * Helper to get components back, based on pane configuration.
   */
  protected function getComponents($pane) {
    $configuration = @unserialize($pane["configuration"]);
    if (empty($configuration)) {
      return FALSE;
    }
    $region = 'content';
    // Here would be the different conversions between panel panes and blocks.
    // This would be very varying based on the panes, but here is one simple
    // example:
    switch ($pane['type']) {
      case 'custom':
        // This is the block plugin id.
        $plugin_id = 'my_custom_content_block';
        $component = new SectionComponent($this->uuid->generate(), $region, [
          'id' => $plugin_id,
          // This is the title of the block.
          'title' => $configuration['title'],
          // The following are configuration options for this block.
          'image' => '',
          'text' => [
            // These values come from the configuration of the panel pane.
            'value' => $configuration["body"],
            'format' => 'full_html',
          ],
          'url' => $configuration["url"],
        ]);
        return [$component];

      default:
        return FALSE;
    }
  }
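
Once the source plugin is in place, the migration can be run like any other - for example with the Migrate Tools drush command (assuming Migrate Tools is installed; older Drush versions use migrate-import):

drush migrate:import mymodule_inspiration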

So there you have it! Since we now have amazing tools in Drupal 8 (namely Layout builder and Migrate), no task deserves the question “Is that even possible?”.

To finish off, let's have an animated gif called "inspiration". And I hope this will give some inspiration to other people migrating landing pages into layout builder.

Jul 11 2019

On September 12–14, at Hilton Garden Inn Atlanta-Buckhead

by Kaleem Clarkson

Kyle Mathews, 2019 DrupalCamp Atlanta keynote speaker

This year, DrupalCamp Atlanta is honored to welcome Kyle Mathews, creator of the open source project Gatsby, as our keynote speaker. Gatsby was a hot topic at DrupalCon this year, and we’re ready to dive into the software at DrupalCamp this September.

Follow Kyle on Twitter and GitHub.

Session submissions are now open for DrupalCamp Atlanta 2019! With Kyle as our keynote, we’re interested to see how others are combining Drupal and Gatsby. In addition, we’re also accepting sessions in the following tracks:

Each session is 40 minutes with 10 minutes for Q&A. Each room will be set classroom style and will have a projection screen and in-house audio.

Trainings

In addition to 50-minute sessions, we’re also looking for volunteer trainers for our full day of trainings on Thursday (9/12) and a half day on Friday (9/13). Training sessions can range across all experience levels. You can submit your call for training here.

One of our goals for this year’s camp was to increase the number of case studies. We encourage web development companies and units to connect with their clients to co-present a session at this year’s DCATL.

We see this as an opportunity to re-engage with a client by highlighting the great work you have done together, all while introducing them to the awesome Drupal community we have. So, reach out to some of your clients and propose a presentation today!

SUBMIT YOUR PROPOSAL HERE

Jun 28 2019
Deirdre Habershaw

Today, more than 80% of people’s interactions with government take place online. Whether it’s starting a business or filing for unemployment, too many of these experiences are slow, confusing, or frustrating. That’s why, one year ago, the Commonwealth of Massachusetts created Digital Services in the Executive Office of Technology and Security Services. Digital Services is at the forefront of the state’s digital transformation. Its mission is to leverage the best technology and information available to make people’s interactions with state government fast, easy, and wicked awesome. There’s a lot of work to do, but we’re making quick progress.

In 2017, Digital Services launched the new Mass.gov. In 2018, the team rolled out the first-ever statewide web analytics platform to use data and verbatim user feedback to guide ongoing product development. Now our researchers and designers are hard at work creating a modern design system that can be reused across the state’s websites and conducting end-to-end research projects to create user journey maps that improve service design.

If you want to work in a fast-paced agile environment, with a good work life balance, solving hard problems, working with cutting-edge technology, and making a difference in people’s lives, you should join Massachusetts Digital Services.

We are currently recruiting for a Technical Architect. If you are interested, submit your resume here.

Check out more about hiring at the Executive Office of Technology and Security Services and submit your resume in order to be informed on roles as they become available.

Jun 20 2019

It's that time of year again! Leading up to DrupalCon Seattle, Chris Urban and I are working on a presentation on Local Development environments for Drupal, and we have just opened up the 2019 Drupal Local Development Survey.

Local development environments - 2018 usage stats
Local development environment usage results from 2018's survey.

If you do any Drupal development work, no matter how much or how little, we would love to hear from you. This survey is not attached to any Drupal organization; it is simply a community survey to help highlight some of the most widely-used tools that Drupalists use for their projects.

Take the 2019 Drupal Local Development Survey

Chris and I will present the results of the survey at our DrupalCon Seattle session What Should I Use? 2019 Developer Tool Survey Results.

We will also be comparing this year's results to those from last year—see our presentation from MidCamp 2018, Local Dev Environments for Dummies.

Jun 11 2019

Over the years, as Drupal has evolved, the upgrade process has become a bit more involved; as with most web applications, Drupal's increasing complexity extends to deployment, and whether you end up running Drupal on a VPS, a bare metal server, in Docker containers, or in a Kubernetes cluster, you should formalize an update process to make sure upgrades are as close to non-events as possible.

Gone are the days (at least for most sites) when you could just download a 'tarball' (.tar.gz) from Drupal.org, expand it, then upload it via SFTP to a server and run Drupal's update.php. That workflow (and even a workflow like the drush up of old) might still work for some sites, but it is fragile and prone to causing issues whether you notice them or not. Plus, if you're using Drush to do this, it's no longer supported in modern versions of Drush!

So without further ado, here is the process I've settled on for all the Drupal 8 sites I currently manage (note that I've converted all my non-Composer Drupal codebases to Composer at this point):

  1. Make sure your local codebase is up to date with what's currently in production (e.g. git pull master).
  2. Reinstall your local site in your local environment so it is completely reset (e.g. blt setup or drush site-install --existing-config). I usually use a local environment like Drupal VM or a Docker Compose environment, so I can usually just log in and run one command to reinstall Drupal from scratch.
  3. Make sure the local site is running well. Consider running behat and/or phpunit tests to confirm they're working (if you have any).
  4. Run composer update (or composer update [specific packages]); see the example after this list.
  5. On your local site, run database updates (e.g. drush updb -y or go to /update.php). This is important because the next step—exporting config—can cause problems if you're dealing with an outdated schema.
  6. Make sure the local site is still running well after updates complete. Run behat and/or phpunit tests again (if you have any).
  7. If everything passed muster, export your configuration (e.g. drush cex -y if using core configuration management, drush csex -y if using Config Split).
  8. (Optional but recommended for completeness) Reinstall the local site again, and run any tests again, to confirm the fresh install with the new config works perfectly.
  9. If everything looks good, it's time to commit all the changes to composer.lock and any other changed config files, and push it up to master!
  10. Run your normal deployment process to deploy the code to production.
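
For step 4, a typical invocation when you only want to update Drupal core might look like this (a sketch; the exact package names depend on your composer.json):

composer update drupal/core --with-dependencies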

All done!

That last step ("Run your normal deployment process") might be a little painful too, and I conveniently don't discuss it in this post. Don't worry, I'm working on a few future blog posts on that very topic!

For now, I'd encourage you to look into how Acquia BLT builds shippable 'build artifacts', as that's by far the most reliable way to ship your code to production if you care about stability! Note that for a few of my sites, I use a more simplistic "pull from master, run composer install, and run drush updb -y" workflow for deployments. But that's for my smaller sites where I don't need any extra process and a few minutes' outage won't hurt!

Apr 30 2019

The opening talk at DrupalCamp Paris 2019 was a presentation given by Thomas Jolliet (FranceTV) and yours truly about how we rebuilt FranceTV Sport as a Symfony 4 / headless Drupal 8 combo.

The most salient points of the talk are probably the "defense in depth" mechanisms we built for scalability and fault tolerance, and the business results, like -85% full page load time or +50 iOS users.

Apr 19 2019
Apr 19

What we learned from our fellow Drupalists

Lisa Mirabile

On April 7th, our team packed up our bags and headed off to Seattle for one of the bigger can’t-miss learning events of the year, DrupalCon.

“Whether you’re C-level, a developer, a content strategist, or a marketer — there’s something for you at DrupalCon.” -https://events.drupal.org/

As you may have read in one of our more recent posts, we had a lot of sessions that we couldn’t wait to attend! We were very excited to find new ideas that we could bring back to improve our services for constituents or the agencies we work with to make digital interactions with government fast, easy, and wicked awesome. DrupalCon surpassed our already high expectations.

At the Government Summit, we were excited to speak with other state employees who are interested in sharing knowledge, including collaborating on open-source projects. We wanted to see how other states are working on problems we’ve tried to solve and to learn from their solutions to improve constituents’ digital interactions with government.

One of the best outcomes of the Government Summit was an amazing “birds of a feather” (BOF) talk later in the week. North Carolina’s Digital Services Director Billy Hylton led the charge for digital teams across state governments to choose a concrete next step toward collaboration. At the BOF, more than a dozen Massachusetts, North Carolina, Georgia, Texas, and Arizona digital team members discussed, debated, and chose a content type (“event”) to explore. Even better, we left with a meeting date to discuss specific next steps on what collaborating together could do for our constituents.

The learning experience did not stop at the GovSummit. Together, our team members attended dozens of sessions. For example, I attended a session called “Stanford and FFW — Defaulting to Open” since we are starting to explore what open-sourcing will look like for Mass.gov. The Stanford team’s main takeaway was the tremendous value they’ve found in building with and contributing to Drupal. Quirky fact: their team discovered during user testing among high-school students that “FAQ” is completely mysterious to younger people: they expect the much more straightforward “Questions” or “Help.”

Another session I really enjoyed was called “Pattern Lab: The Definitive How-to.” It was exciting to hear that Pattern Lab, a tool for creating design systems, has officially merged its two separate cores into a single one that supports all existing rendering engines. This means simplifying the technical foundation to allow more focus on extending Pattern Lab in new and useful ways (and less just keeping it up and running). We used Pattern Lab to build Mayflower, the design system created for the Commonwealth of Massachusetts and implemented first on Mass.gov. We are now looking at the best ways to offer the benefits of Mayflower — user-centeredness, accessibility, and consistent look and feel — to more Commonwealth digital properties. Some team members had a chance to talk later to Evan Lovely, the speaker and one of the maintainers of Pattern Lab, and were excited by the possibility of further collaboration to implement Mayflower in more places.

There were a variety of other informative topics as well, and my peers and I each found plenty of sessions to enjoy.

Our exhibit hall booth at DrupalCon 2019
Talking to fellow Drupalists at our booth

On Thursday we started bright and early to unfurl our Massachusetts Digital Service banner and prepare to greet fellow Drupalists at our booth! We couldn’t have done it without our designer, who put all of our signs together for our first time exhibiting at DrupalCon. (Thanks, Eva!)

It was remarkable to be able to talk with so many bright minds in one day. Our one-on-one conversations took us on several deep dives into the work other organizations are doing to improve their digital assets. Meeting so many brilliant Drupalists made us all the more excited to share some opportunities we currently have to work with them, such as the ITS74 contract to work with us as a vendor, or our job opening for a technical architect.

We left our table briefly to attend “Mass.gov: A Guide to Data-Informed Content Optimization,” where team members Julia Gutierrez and Nathan James shared how government agencies in Massachusetts are now making data-driven content decisions. Watch their presentation to learn:

  1. How we define wicked awesome content
  2. How we translate indicators into actionable metrics
  3. The technology stack we use to empower content authors

To cap it off, Mass.gov, with partners Last Call Media and Mediacurrent, won Best Theme for our custom admin theme at the first-ever Global Splash awards (established to “recognize the best Drupal projects on the web”)! An admin theme is the look and feel that users see when they log in. The success of Mass.gov rests in the hands of all of its 600+ authors and editors. We’ve known from the start of the project that making it easy and efficient to add or edit content in Mass.gov was key to the ultimate goal: a site that serves constituents as well as possible. To accomplish this, we decided to create a custom admin theme, launched in May 2018.

A before-and-after view of our admin theme

Our goal was not just a nicer look and feel (though it is that!), but a more usable experience. For example, we wanted authors to see help text before filling out a field, so we brought it up above the input box. And we wanted to help them keep their place when navigating complicated page types with multiple levels of nested information, so we added vertical lines to tie together items at each level.

Last Call Media founder Kelly Albrecht crosses the stage to accept the Splash award for Best Theme on behalf of the Mass.gov team.
All the Splash award winners!

It was a truly enriching experience to attend DrupalCon and learn from the work of other great minds. Our team has already started brainstorming how we can improve our products and services for our partner agencies and constituents. Come back to our blog weekly to check out updates on how we are putting our DrupalCon lessons to use for the Commonwealth of Massachusetts!

Interested in a career in civic tech? Find job openings at Digital Service.
Follow us on Twitter | Collaborate with us on GitHub | Visit our site

Apr 08 2019
Apr 08
Acro Media teams up with BigCommerce


Acro Media has teamed up with BigCommerce, a leading SaaS ecommerce platform, to create the BigCommerce for Drupal module, a headless commerce module integrating both platforms together.

What does this mean? It means that companies can now utilize the quick-to-market and feature-rich backend benefits of BigCommerce SaaS while enjoying the content-rich and extensible frontend experience of Drupal, the open-source CMS. It’s a melding of systems that results in a best-of-both-worlds solution for today's digital-experience-driven ecommerce needs.

Discover BigCommerce for Drupal

Read the full press release below.

April 8, 2019 11:00 am EDT

BigCommerce for Drupal Brings Customized Shopping Experiences to Drupal Community

SEATTLE – April 8, 2019 – BigCommerce, the leading SaaS ecommerce platform for fast-growing and established brands, today announced BigCommerce for Drupal, a headless commerce module built specifically for the open-source content management system (CMS), at DrupalCon Seattle. Developed in partnership with Acro Media, a world-renowned digital commerce agency, BigCommerce for Drupal gives brands the ability to embed flexible, enterprise-level ecommerce functionality into revolutionary customer experiences created within Drupal’s highly-extensible and secure CMS.

Available now in the Drupal module library, BigCommerce for Drupal facilitates an agile headless commerce architecture for merchants by decoupling Drupal’s powerful front-end CMS and BigCommerce’s scalable commerce engine. Knitted together by fast, open-source APIs, the module allows the two platforms to operate simultaneously and more efficiently within a single interface. Additionally, BigCommerce for Drupal is built directly into Drupal Commerce, making it compatible with the many existing themes and modules available within Drupal Commerce.

“Shopping experiences should not be limited by any single content management or ecommerce platform’s native capabilities, and BigCommerce for Drupal embodies that philosophy. We want pioneering brands to continue driving retail innovation forward and help redefine how customers buy products, whether it be through augmented reality, social selling or any disruptive technology that lies ahead,” said Russell Klein, chief development officer at BigCommerce. “Furthermore, announcing BigCommerce’s headless implementation at DrupalCon, an event that brings together one of the strongest and most engaged online communities, signals the value we place on open-source technology that can be made better through collaboration.”

Key features of BigCommerce for Drupal include:

  • Drupal Commerce Core: BigCommerce for Drupal is built atop the Drupal Commerce module, developed in part by Acro Media, tapping into years of iterative improvements and enhancements.
  • Data Sync: With BigCommerce for Drupal, retailers can synchronize product and metadata directly from BigCommerce into Drupal, and then augment and manage data directly within the Drupal CMS.
  • Cached Commerce: The connected BigCommerce store will sync at merchant-determined intervals, saving a cached version of the catalog inside Drupal rather than pinging BigCommerce APIs for information.

“As two open, API-driven platforms, there is a natural alignment between BigCommerce and Drupal, and this module bridges the gap to unify their respective functionalities into one intuitive interface,” said Shae Inglis, chief executive officer at Acro Media. “The future of ecommerce is open architecture, and headless integrations let even enterprise-level brands be nimble and capitalize on the explosion of new, innovative consumer touchpoints.”

To learn more about BigCommerce for Drupal visit www.bigcommerce.com/drupal. To download the BigCommerce for Drupal Module visit www.drupal.org/project/bigcommerce. DrupalCon attendees can also get more information by visiting the Acro Media booth (#802).

Is BigCommerce and Drupal right for you?

Quickly find out using Acro Media's Ideal Commerce Architecture Analysis.

Complete Your Ideal Architecture Analysis

Apr 05 2019
Apr 05

From time to time, I have the need to take a Twig template and a set of variables, render the template, replacing all the variables within, and then get the output as a string. For example, if I want to have a really simple email template in a custom module which has a variable for first_name, so I can customize the email before sending it via Drupal or PHP, I could do the following in Drupal 7:

<?php
$body = theme_render_template(drupal_get_path('module', 'my_module') . '/templates/email-body.tpl.php', array(
  'first_name' => 'Jane',
));
send_email($from, $to, $subject, $body);
?>

In Drupal 8, there is no theme_render_template() function, since the template engine was switched to Twig in this issue. And until today, there was no change record indicating the fact that the handy theme_render_template() had been replaced by a new, equally-handy twig_render_template() function! Thanks to some help from Tim Plunkett, I was able to find this new function, and after he pushed me to do it, I created a new change record to help future-me next time I go looking for theme_render_template() in Drupal 8: theme_render_template changed to twig_render_template.

In Drupal 8, it's extremely similar to Drupal 7, although there are two additions I made to make it functionally equivalent:

<?php
$markup = twig_render_template(drupal_get_path('module', 'my_module') . '/templates/email-body.html.twig', array(
  'first_name' => 'Jane',
  // Needed to prevent notices when Twig debugging is enabled.
  'theme_hook_original' => 'not-applicable',
));
// Cast to string since twig_render_template returns a Markup object.
$body = (string) $markup;
send_email($from, $to, $subject, $body);
?>

If you are rendering a template outside of a normal page request (e.g. in a cron job, queue worker, Drush command, etc.) the Twig theme engine might not be loaded. If that's the case, you'll need to manually load the Twig engine using:

<?php
// Load the Twig theme engine so we can use twig_render_template().
include_once \Drupal::root() . '/core/themes/engines/twig/twig.engine';
?>

I shall go forth templating ALL THE THINGS now!

Apr 04 2019
Apr 04
Julia Gutierrez

DrupalCon2019 is heading to Seattle this year and there’s no shortage of exciting sessions and great networking events on this year’s schedule. We can’t wait to hear from some of the experts out in the Drupalverse next week, and we wanted to share with you a few of the sessions we’re most excited about.

Adam is looking forward to:

Government Summit on Monday, April 8th

“I’m looking forward to hearing what other digital offices are doing to improve constituents’ interactions with government so that we can bring some of their insights to the work our agencies are doing. I’m also excited to present on some of the civic tech projects we have been doing at MassGovDigital so that we can get feedback and new ideas from our peers.”

Bryan is looking forward to:

1. Introduction to Decoupled Drupal with Gatsby and React

Time: Wednesday, April 10th from 1:45 pm to 2:15 pm

Room: 6B | Level 6

“We’re using Gatsby and React today to power Search.mass.gov and the state’s budget website, and Drupal for Mass.gov. Can’t wait to learn about Decoupled Drupal with Gatsby. I wonder if this could be the right recipe to help us make the leap!”

2. Why Will JSON API go into Core?

Time: Wednesday, April 10th from 2:30 pm to 3:00 pm

Room: 612 | Level 6

“Making data available in machine-readable formats via web services is critical to open data and to publish-once / single-source-of-truth editorial workflows. I’m grateful to Wim Leers and Mateu Aguilo Bosch for their important thought leadership and contributions in this space, and eager to learn how Mass.gov can best maximize our use of JSON API moving forward.”

I (Julia) am looking forward to:

1. Personalizing the Teach for America applicant journey

Time: Wednesday, April 10th from 1:00 pm to 1:30 pm

Room: 607 | Level 6

“I am really interested in learning from Teach for America about how they implemented personalization and integrated across applications to bring applicants a consistent look, feel, and experience when applying for a Teach for America position. We have created Mayflower, Massachusetts government’s design system, and we want to learn what a single sign-on for different government services might look like and how we might use personalization to improve the experience constituents have when interacting with Massachusetts government digitally.”

2. Devsigners and Unicorns

Time: Wednesday, April 10th from 4:00 pm to 4:30 pm

Room: 612 | Level 6

“I’m hoping to hear if Chris Strahl has any ‘best-practices’ and ways for project managers to leverage the unique multi-skill abilities that Devsigners and unicorns possess while continuing to encourage a balanced workload for their team. This balancing act could lead towards better development and design products for Massachusetts constituents and I’d love to make that happen with his advice!”

Melissa is looking forward to:

1. DevOps: Why, How, and What

Time: Wednesday, April 10th from 1:45 pm to 2:15 pm

Room: 602–604 | Level 6

“Rob Bayliss and Kelly Albrecht will use a survey they released, as well as some other important approaches, to elaborate on why DevOps is so crucial to technological strategy. I took the survey back in November of 2018, and I want to see the results. This presentation will help me identify whether we should make any changes to our process to better serve constituents.”

2. Advanced Automated Visual Testing

Time: Thursday, April 11th from 2:30 pm to 3:00 pm

Room: 608 | Level 6

“In this session Shweta Sharma will speak to what visual testing tools are currently out there and offer a comparison of the tools. I am excited to gain more insight into automated visual testing and how it enables faster releases, so we can identify any gotchas and improve our releases for Mass.gov users.

P.S. Watch a presentation I gave at this year’s NerdSummit in Boston, and stay tuned for a blog post on some automation tools we used at MassGovDigital coming out soon!”

We hope to see old friends and make new ones at DrupalCon2019, so be sure to say hi to Bryan, Adam, Melissa, Lisa, Moshe, or me when you see us. We will be at booth 321 (across from the VIP lounge) on Thursday giving interviews and chatting about technology in Massachusetts, we hope you’ll stop by!

Interested in a career in civic tech? Find job openings at Digital Services.
Follow us on Twitter | Collaborate with us on GitHub | Visit our site

Apr 01 2019
Apr 01

Helping content creators make data-driven decisions with custom data dashboards

Greg Desrosiers

Our analytics dashboards help Mass.gov content authors make data-driven decisions to improve their content. All content has a purpose, and these tools help make sure each page on Mass.gov fulfills its purpose.

Before the dashboards were developed, performance data was scattered among multiple tools and databases, including Google Analytics, Siteimprove, and Superset. These required additional logins, permissions, and advanced understanding of how to interpret what you were seeing. Our dashboards take all of this data and compile it into something that’s focused and easy to understand.

We made the decision to embed dashboards directly into our content management system (CMS), so authors can simply click a tab when they’re editing content.

GIF showing how a content author navigates to the analytics dashboard in the Mass.gov CMS.

The content performance team spent more than 8 months diving into web data and analytics to develop and test data-driven indicators. Over the testing period, we looked at a dozen different indicators, from pageviews and exit rates to scroll-depth and reading grade levels. We tested as many potential indicators as we could to see what was most useful. Fortunately, our data team helped us content folks through the process and provided valuable insight.

Love data? Check out our 2017 data and machine learning recap.

We chose a sample set of more than 100 of the most visited pages on Mass.gov. We made predictions about what certain indicators said about performance, and then made content changes to see how they impacted the data related to each indicator.

We reached out to 5 partner agencies to help us validate the indicators we thought would be effective. These partners worked to implement our suggestions and we monitored how these changes affected the indicators. This led us to discover the nuances of creating a custom, yet scalable, scoring system.

Line chart showing test results validating user feedback data as a performance indicator.

For example, we learned that a number of indicators we were testing behaved differently depending on the type of page we were analyzing. It’s easy to tell if somebody completed the desired action on a transactional page by tracking their click to an off-site application. It’s much more difficult to know if a user got the information they were looking for when there’s no action to take. This is why we’re planning to continually explore, iterate on, and test indicators until we find the right recipe.

Using the strategies developed with our partners, we watched, and over time, saw the metrics move. At that point, we knew we had a formula that would work.

We rolled indicators up into 4 simple categories:

Screenshot of dashboard results as they appear in the Mass.gov CMS.

Each category receives a score on a scale of 0–4. These scores are then averaged to produce an overall score; for example, category scores of 4, 3, 2, and 3 average to an overall score of 3. Scoring a 4 means a page is checking all the boxes and performing as expected, while a 0 means there are some improvements to be made to increase the page’s overall performance.

All dashboards include general recommendations on how authors can improve pages by category. If these suggestions aren’t enough to produce the boost they were looking for, authors can meet with a content strategist from Digital Services to dive deeper into their content and create a more nuanced strategy.

GIF showing how a user navigates to the “Improve Your Content” tab in a Mass.gov analytics dashboard.

We realize we can’t totally measure everything through quantitative data, so these scores aren’t the be-all, end-all when it comes to measuring content performance. We’re a long way off from automating the work a good editor or content strategist can do.

Also, it’s important to note these dashboards are still in the beta phase. We’re fortunate to work with partner organizations who understand the bumps in the proverbial development road. There are bugs to work out and usability enhancements to make. As we learn more, we’ll continue to refine them. We plan to add dashboards to more content types each quarter, eventually offering a dashboard and specific recommendations for the 20+ content types in our CMS.

Apr 01 2019
Apr 01

This year marks the sixth annual Midwest Drupal Camp (aka MidCamp). Palantir is excited to sponsor this year’s event and to have multiple Palantiri presenting sessions!

Palantir Sessions and Events

Community Working Group Update and Q&A by George DeMet

The mission of the Drupal Community Working Group (CWG) is to uphold the Drupal Code of Conduct and maintain a friendly and welcoming community for the Drupal project. In this session, CWG members George DeMet (gdemet) and Michael Anello (ultimike) will provide an update on some of the CWG's recent activities and what the group is working on in 2019, as well as answer audience questions.

  • Thursday @ 2:50pm
  • Room 314A


Federated Search with Drupal, SOLR, and React (AKA the Decoupled Ouroboros) by Matt Carmichael and Dan Montgomery

Our session will begin with a tour through a recent project developed by Palantir.net for the University of Michigan — bringing content from disparate sites (D7, D8, WordPress) into a single index and then serving results out in a consistent manner, allowing users to search across all included properties. We’ll discuss how we got started with React, our process for hooking up to SOLR, and how we used Drupal to tie the whole thing together.

  • Friday @ 9am
  • Room 324


Overcoming Imposter Syndrome: How Weightlifting Helped Me Accept My Place in Tech by Kristen Mayer

Weightlifting and tech. On the surface, these two things may not seem to have much in common, but as a woman trying to navigate both of these male-dominated spheres, I’ve often been intimidated and doubted whether I really belonged. In this session, I’ll look at the strategies that helped me overcome imposter syndrome in the gym, and my journey of applying them to my professional life. I hope that anyone attending this session will walk away feeling empowered about their position and skills within the tech community!

  • Thursday @ 3:40pm
  • Room 312


Understanding Migration Development in Drupal 8: Strategies and Tools to See What's Happening by Dan Montgomery

Migrations in Drupal can be challenging for developers because the tools and strategies to get started and peer behind the curtain are different than those used in most backend development. This is an intermediate topic intended for developers who have a basic understanding of Drupal 8 concepts including plugins and the way entities and fields are used in Drupal to manage content.

  • Thursday @ 11:40am
  • Room 314B


Game Night!

Head to the second floor for a fun night of board games, camaraderie and conversation. Camp registration is required to attend this event.

  • Thursday from 6-9pm
  • 2nd Level


We'll see you there!

Mar 11 2019
Mar 11

I've been going kind of crazy covering a particular Drupal site I'm building in Behat tests—testing every bit of core functionality on the site. In this particular case, a feature I'm testing allows users to upload arbitrary files to an SFTP server, then Drupal shows those filenames in a streamlined UI.

I needed to be able to test the user action of "I'm a user, I upload a file to this directory, then I see the file listed in a certain place on the site."

These files are not managed by Drupal (e.g. they're not file field uploads), but if they were, I'd invest some time in resolving this issue in the drupalextension project: "When I attach the file" and Drupal temporary files.

Since they are just random files dropped on the filesystem, I needed to:

  1. Create a new step definition
  2. Track files that are created using that step definition
  3. Add code to make sure files that were created are cleaned up

If I just added a new step definition in my FeatureContext which creates the new files, then subsequent test runs on the same machine would likely fail, because the test files created by earlier runs would still be present.

Luckily, Behat has a mechanism that allows me to track created resources and clean up after the scenario runs (even if it fails), and those in Drupal-land may be familiar with the naming convention—they're called hooks.

In this case, I want to add an @AfterScenario hook which runs after any scenario that creates a file, but I'm getting a little ahead of myself here.

Create a new step definition

Whenever I want to create a new step definition, I start by writing out the step as I want it, in my feature file:

When I add file "test.txt" to the "example" folder

Now I run the scenario using Behat, and Behat is nice enough to generate the stub function I need to add to my FeatureContext in its output:

--- Drupal\FeatureContext has missing steps. Define them with these snippets:

    /**
     * @When I add file :arg1 to the :arg2 folder
     */
    public function iAddFileToTheFolder($arg1, $arg2)
    {
        throw new PendingException();
    }

I copy that code out, drop it into my FeatureContext, then change things to do what I want:

  /**
   * @When I add file :file_name to the :folder_name folder
   */
  public function iAddFileToTheFolder($file_name, $folder_name) {
    // Write an empty file into the target folder on the filesystem.
    $file_path = '/some/system/directory/' . $folder_name . '/' . $file_name;
    $file = fopen($file_path, 'w');
    fwrite($file, '');
    fclose($file);
  }

Yay, a working Behat test step! If I run it, it passes, and the file is dropped into that folder.

But if I run it again, the file is already there, and the rest of my tests may also be affected by this rogue testing file.

So the next step is to track the files I create, and make sure they are cleaned up in an @AfterScenario.

Track files created during test steps

At the top of my FeatureContext, I added:

  /**
   * Keep track of files added by tests so they can be cleaned up.
   *
   * @var array
   */
  public $files = [];

This array tracks a list of file paths, quite simply.

And then inside my test step, at the end of the function, I can add any file that is created to that array:

  /**
   * @When I add file :file_name to the :folder_name folder
   */
  public function iAddFileToTheFolder($file_name, $folder_name) {
    // Write an empty file into the target folder on the filesystem.
    $file_path = '/some/system/directory/' . $folder_name . '/' . $file_name;
    $file = fopen($file_path, 'w');
    fwrite($file, '');
    fclose($file);
    // Track the file so the @AfterScenario hook can clean it up.
    $this->files[] = $file_path;
  }

That's great, but next we need to add an @AfterScenario hook to clean up the files.

Make sure the created files are cleaned up

At the end of my feature context, I'll add a cleanUpFiles() function:

  /**
   * Cleans up files after every scenario.
   *
   * @AfterScenario @file
   */
  public function cleanUpFiles($event) {
    // Delete each file in the array.
    foreach ($this->files as $file_path) {
      unlink($file_path);
    }

    // Reset the files array.
    $this->files = [];
  }

This @AfterScenario is tagged with @file, so any scenario where I want the files to be tracked and cleaned up, I just need to add the @file tag, like so:

@myfeature
Feature: MyFeature

  @api @authenticated @javascript @file
  Scenario: Show changed files in selection form using Git on Site page.
    Given I am logged in as a user with the "file_manager" role
    When I am on "/directory/example"
    Then I should see the text "There are no files present in the example folder."
    And I should not see the text "test.txt"
    When I add file "test.txt" to the "example" folder
    And I am on "/directory/example"
    Then I should see the text "text.txt"

And that is how you do it. Now no matter whether I create one file or a thousand, any scenario tagged with @file will get all its generated test files cleaned up afterwards!
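
One more convenience: because the cleanup hangs off the @file tag, you can run only these scenarios while debugging (assuming Behat was installed via Composer):

vendor/bin/behat --tags=@file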

Mar 06 2019
Mar 06

tl;dr: Run composer require zaporylie/composer-drupal-optimizations:^1.0 in your Drupal codebase to halve Composer's RAM usage and make operations like require and update 3-4x faster.

A few weeks ago, I noticed Drupal VM's PHP 5.6 automated test suite started failing on the step that runs composer require drush/drush. (PSA: PHP 5.6 is officially dead. Don't use it anymore. If you're still using it, upgrade to a supported version ASAP!). This was the error message I was getting from Travis CI:

PHP Fatal error:  Allowed memory size of 2147483648 bytes exhausted (tried to allocate 32 bytes) in phar:///usr/bin/composer/src/Composer/DependencyResolver/RuleWatchNode.php on line 40

I ran the test suite locally, and didn't have the same issue (locally I have PHP's CLI memory limit set to -1, so it never runs out of RAM unless I do insane-crazy things).

So then I ran the same test but with PHP's memory_limit set to 2G—yeah, that's two gigabytes of RAM—and it failed! So I ran the command again using Composer's --profile option and -vv to see exactly what was happening:

# Run PHP 5.6 in a container.
$ docker run --rm -it drupaldocker/php-dev:5.6-cli /bin/bash
# php -v     
PHP 5.6.36 (cli) (built: Jun 20 2018 23:33:51)

# composer create-project drupal-composer/drupal-project:8.x-dev composer-test --prefer-dist --no-interaction

# Install Devel module.
# cd composer-test
# composer require drupal/devel:^1.2 -vv --profile
Do not run Composer as root/super user! See https://getcomposer.org/root for details
[126.7MB/5.04s] ./composer.json has been updated
[129.6MB/6.08s] > pre-update-cmd: DrupalProject\composer\ScriptHandler::checkComposerVersion
[131.5MB/6.10s] Loading composer repositories with package information
[131.9MB/6.52s] Updating dependencies (including require-dev)
[2054.4MB/58.32s] Dependency resolution completed in 3.713 seconds
[2054.9MB/61.89s] Analyzed 18867 packages to resolve dependencies
[2054.9MB/61.89s] Analyzed 1577311 rules to resolve dependencies
[2056.9MB/62.68s] Dependency resolution completed in 0.002 seconds
[2055.5MB/62.69s] Package operations: 1 install, 0 updates, 0 removals
[2055.5MB/62.69s] Installs: drupal/devel:1.2.0
[2055.5MB/62.70s] Patching is disabled. Skipping.
[2055.5MB/62.80s]   - Installing drupal/devel (1.2.0): Downloading (100%)
[2055.5MB/63.19s]  Extracting archive
[2055.6MB/63.57s]     REASON: Required by the root package: Install command rule (install drupal/devel 1.x-dev|install drupal/devel 1.2.0)
[2055.6MB/63.57s] No patches found for drupal/devel.
[731.5MB/71.30s] Writing lock file
[731.5MB/71.30s] Generating autoload files
[731.8MB/73.01s] > post-update-cmd: DrupalProject\composer\ScriptHandler::createRequiredFiles
[731.6MB/78.82s] Memory usage: 731.61MB (peak: 2057.24MB), time: 78.82s

So... when it started looking through Drupal's full stack of dependencies—some 18,867 packages and 1,577,311 rules—it gobbled up over 2 GB of RAM. No wonder it failed when memory_limit was 2G!

That seems pretty insane, so I started digging a bit more, and found that the PHP 7.1 and 7.2 builds were not failing; they peaked around 1.2 GB of RAM usage (yet another reason you should be running PHP 7.x—it uses way less RAM for so many different operations!).

Then I found a really neat package, composer-drupal-optimizations, which makes some outlandish promises in its README:

Before: 876 MB RAM, 17s; After: 250 MB RAM, 5s

I went ahead and added the package to a fresh new Drupal project with:

composer require zaporylie/composer-drupal-optimizations:^1.0

(Note that this operation still uses the same huge amount of memory and time, because the package that optimizes things is only being installed at this point!)

And then I ran all the tests on PHP 5.6, 7.1, and 7.2 again. Instead of spelling out the gory details (they're all documented in this issue in the Drupal VM issue queue), here is a table of the amazing results:

PHP Version    Before        After        Difference
5.6            2057.24 MB    540.02 MB    -1.5 GB
7.1            1124.52 MB    426.64 MB    -800 MB
7.2            1190.94 MB    423.93 MB    -767 MB

You don't have to be on ancient-and-unsupported PHP 5.6 to benefit from the speedup afforded by ignoring unused/really old Symfony packages!

Next Steps

You should immediately add this package to your Drupal site (if you're using Composer to manage it) if you run Drupal 8.5 or later. And if you use a newer version of Acquia BLT, you're already covered! I'm hoping this package will be added upstream to drupal-project as well (there's sort-of an issue for that), and maybe even something could be done on the Drupal level.

Requiring over 1 GB of RAM to do even a simple composer require for a barebones Drupal site is kind-of insane, IMO.

Mar 04 2019
Mar 04

BLT to Kubernetes

Wait... what? If you're reading the title of this post, and are familiar with Acquia BLT, you might be wondering:

  • Why are you using Acquia BLT with a project that's not running in Acquia Cloud?
  • You can deploy a project built with Acquia BLT to Kubernetes?
  • Don't you, like, have to use Docker instead of Drupal VM? And aren't you [Jeff Geerling] the maintainer of Drupal VM?

Well, the answers are pretty simple:

  • Acquia BLT is not just for Acquia Cloud sites; it is a great way to kickstart and supercharge any large-scale Drupal 8 site. It has built-in testing integration with PHPUnit and Behat. It has default configurations and packages to help make complex Drupal sites easier to maintain and deploy. And it's not directly tied to Acquia Cloud, so you can use it with any kind of hosting!
  • Yes, Acquia BLT is a great tool to use for building and deploying Drupal to Kubernetes!
  • Why, yes! And while I use, maintain, and love Drupal VM—even for this particular project, for the main local dev environment—it is just not economical to deploy and maintain a highly-scalable production environment using something like Drupal VM... not to mention it would make Kubernetes barf!

Anyways, I figured I'd jump right in and document how I'm doing this, and not get too deep into the whys.

Generating a build artifact using only Docker

The first problem: BLT has built-in tooling to deploy new code to traditional cloud environments, using a separate Git repository. This works great for hosting in Acquia Cloud, Pantheon, or other similar environments, but if you need to deploy your BLT site into a container-only environment like Kubernetes, you need to generate a BLT deployment artifact, then get that into a deployable Docker container.

Along that theme, my first goal was to make it so I could build the deployment artifact in a perfectly clean environment—no PHP, no Composer, no Node.js, no nothing except for Docker. So I built a shell script that does the following:

  1. Starts a build container (uses the PHP 7.2 CLI Docker image from Docker Hub).
  2. Installs the required dependencies for PHP.
  3. Installs Node.js.
  4. Installs Composer.
  5. Runs blt artifact:build to build a deployment artifact in the deploy/ directory.
  6. Deletes the build container.

Here's the script:
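
In outline, a minimal version of the script looks like this (a sketch reconstructed from the steps above; the image tag, package list, and Node.js version are illustrative):

#!/bin/bash
# Build a BLT deployment artifact inside a clean PHP CLI container.
set -e

# 1. Start a build container with the project mounted at /app.
docker run -d --name blt-build -v "$PWD":/app -w /app php:7.2-cli sleep infinity

# 2. Install system dependencies for PHP (project-specific extensions would go here too).
docker exec blt-build bash -c 'apt-get update && apt-get install -y git unzip'

# 3. Install Node.js for frontend build steps.
docker exec blt-build bash -c 'curl -fsSL https://deb.nodesource.com/setup_10.x | bash - && apt-get install -y nodejs'

# 4. Install Composer.
docker exec blt-build bash -c 'curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer'

# 5. Build the deployment artifact into deploy/.
docker exec blt-build bash -c 'composer install && vendor/bin/blt artifact:build'

# 6. Delete the build container.
docker rm -f blt-build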

All you need to do is be in a BLT project directory and run ./blt-artifactory. Now, I could optimize this build process further by building my own image which already has everything set up, so I could just run blt artifact:build; but for now I'm a little lazy and don't want to maintain that. Maybe I will at some point. That would cut 3-5 minutes from the build process.

Building a Docker container with the build artifact

So after we have a deployment artifact in the deploy/ directory, we need to stick that into a Docker container and then push that container into a Docker registry so we can use it as the Image in a Kubernetes Deployment for Drupal.

Here's the Dockerfile I am using to do that:

I put that Dockerfile into my BLT project root, and run docker build -t my/site-name . to build the Docker image. Note that this Dockerfile is meant to build from a PHP container image which already has all the required PHP extensions for your project. In my case, I have a preconfigured PHP base image (based off php:7.2-apache-stretch) which installs extensions like gd, pdo_mysql, simplexml, zip, opcache, etc.
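
Based on that description, a minimal sketch of creating and building such a Dockerfile might look like this (the base image name is a placeholder for the preconfigured PHP image):

# Write a minimal Dockerfile: start from a preconfigured PHP 7.2 Apache base
# image and copy the built artifact into the web root (paths are illustrative).
cat > Dockerfile <<'EOF'
FROM example/php-base:7.2-apache-stretch
COPY deploy/ /var/www/html/
EOF

# Build and tag the site image.
docker build -t my/site-name .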

Once I have the my/site-name image, I tag it appropriately and—in this case—push it up to a repository I have set up in AWS ECR. It's best to have a private Docker registry running somewhere for your projects, because you wouldn't want to push your site's production Docker image to a public registry like Docker Hub!
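
Tagging and pushing can be sketched as follows (the account ID, region, and repository name are placeholders; this uses the aws ecr get-login flow current as of early 2019):

# Authenticate Docker with the private ECR registry.
$(aws ecr get-login --no-include-email --region us-east-1)

# Tag the local image with the ECR repository URI and push it.
docker tag my/site-name:latest 123456789012.dkr.ecr.us-east-1.amazonaws.com/site-name:latest
docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/site-name:latest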

Running Drupal in Kubernetes

Now that I have a Docker image available in AWS ECR, and assuming I have a Kubernetes cluster running inside AWS EKS (though you could be using any kind of Kubernetes cluster—you'd just have to configure it to be able to pull images from whatever private Docker registry you're using), I can create a set of namespaced manifests in Kubernetes for my Drupal 8 site.

For this site, it doesn't need anything fancy—no Solr, no Redis or Memcached, no Varnish—it just needs a horizontally-scalable Drupal deployment, a MySQL deployment, and ingress so people can reach it from the Internet.
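
Assuming the manifests live together in a manifests/ directory (a placeholder), applying them to a dedicated namespace is the straightforward part:

# Create a namespace for the site and apply all of its manifests.
kubectl create namespace drupal-site
kubectl apply -n drupal-site -f manifests/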

Unfortunately for this blog post, I will not dive into the details of the process of getting this particular site running inside Kubernetes, but the process and the Kubernetes manifests used for doing so are extremely similar to the ones I am building and maintaining for my Raspberry Pi Dramble project. What's that, you ask? Well, it's a cluster of Raspberry Pis running Drupal on top of Kubernetes!

If you want to find out more about that, please attend my session at DrupalCon Seattle in April 2019, Everything I know about Kubernetes I learned from a cluster of Raspberry Pis (or view the video/slides after the fact).


About Drupal Sun

Drupal Sun is an Evolving Web project. It allows you to:

  • Do full-text search on all the articles in Drupal Planet (thanks to Apache Solr)
  • Facet based on tags, author, or feed
  • Flip through articles quickly (with j/k or arrow keys) to find what you're interested in
  • View the entire article text inline, or in the context of the site where it was created

See the blog post at Evolving Web
