Mar 19 2024

The most impactful customer experiences are data-driven. CDPs help organizations collect, unify, manage, and analyze customer data. This allows businesses to understand their customers better and deliver personalized digital experiences.

But selecting and implementing a CDP is not a straightforward journey. It’s a digital transformation that begins with organizations understanding what a CDP can do for them and how they can utilize the platform to their advantage.

Evaluating If You Need A CDP

Every organization likes to think they know their customers, but most are missing out. According to a study conducted by Mapp, an international provider of insight-led customer engagement, lack of customer insight is the biggest challenge in providing personalized experiences.

Successfully implementing a CDP can help address this challenge. Submit this form to find out if your organization needs a CDP.


The CDP Implementation Framework: Everything From Discovery To Implementation & Enablement

Implementing a Customer Data Platform (CDP) involves a structured approach encompassing discovery, strategy, implementation, and enablement.

Stage 1: Discovery

During the discovery phase, it's essential to gain a clear understanding of your organization's data landscape, business goals, and available resources. Key aspects include:

1. Data Maturity And Business Goals

To assess data maturity, consider the following:

  • Volume: How much data does your organization generate and collect?
  • Variety: What types of data are available (e.g., customer demographics, transaction history)?
  • Velocity: How quickly is data generated and updated?
  • Veracity: How reliable and accurate is the data?
  • Value: What insights can be derived from the data to drive business decisions?

For example, a retail company may have vast amounts of transaction data but lack comprehensive customer profiles.

2. Data Inventory And Quality Assessment

Cataloging existing data sources involves identifying where data resides, such as:

  • Customer Relationship Management (CRM) systems
  • Enterprise Resource Planning (ERP) systems
  • Web analytics platforms

Assessing data quality entails evaluating data accuracy, completeness, and consistency. For instance, organizations may discover inconsistencies in customer information across different databases, impacting marketing campaigns' effectiveness.
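As a minimal sketch of such a quality assessment, the snippet below flags customers whose email disagrees between two hypothetical sources (the field names `customer_id` and `email` are invented for illustration, not taken from any particular CRM):

```python
def find_inconsistencies(crm_records, analytics_records):
    """Return customer IDs whose email disagrees across the two sources."""
    crm_by_id = {r["customer_id"]: r for r in crm_records}
    mismatched = []
    for rec in analytics_records:
        crm_rec = crm_by_id.get(rec["customer_id"])
        # Compare case-insensitively; emails are not case-sensitive in practice.
        if crm_rec and crm_rec["email"].lower() != rec["email"].lower():
            mismatched.append(rec["customer_id"])
    return mismatched

crm = [{"customer_id": 1, "email": "a@example.com"},
       {"customer_id": 2, "email": "b@example.com"}]
web = [{"customer_id": 1, "email": "a@example.com"},
       {"customer_id": 2, "email": "b.old@example.com"}]

print(find_inconsistencies(crm, web))  # [2]
```

A real assessment would also check completeness (missing fields) and freshness, but even a simple cross-source diff like this surfaces the inconsistencies that degrade campaign effectiveness.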

3. Stakeholder Engagement And Workflow

Engage stakeholders from various departments (e.g., marketing, sales, IT) to understand their data needs and ensure alignment with business objectives. Establish clear roles and responsibilities, such as:

  • Data stewards responsible for data governance
  • Analysts tasked with generating insights from customer data

Define workflows to streamline data collection, processing, and analysis. For example, establish protocols for updating customer profiles and sharing insights across teams.

Key Questions To Ask

Ask relevant questions to guide the discovery process.

Project & Business Objectives

  • What are the specific goals for this CDP implementation?
  • What problem are you trying to solve with the CDP?
  • What are your business goals and how does the CDP align with them?
  • What would be considered a success for the CDP implementation?
  • What are the current solutions and how can a CDP improve them?

Budget & Timelines

  • What is the budget for the CDP implementation?
  • Is there a fixed deadline or time constraints for the project?
  • Are there any financial limitations that need to be considered?

Identify Stakeholders

  • Who are the primary stakeholders in this CDP project?
  • What are the needs and expectations of each stakeholder?
  • How will different stakeholders be involved in the project?

Data Sources & Quality

  • What are the sources of your customer data?
  • What is the quality and maturity of the data you currently have?
  • Who owns the data and where does it come from?

Segmentation & Identification Strategy

  • How will you segment your customer data?
  • What is your strategy for identifying and resolving customer identities?

The Action Plan & Expectations

  • What are the most impactful outcomes you expect from the CDP?
  • How do you plan to monitor, measure, and capture the value of the CDP?

Answering all of these questions will help you develop an effective CDP implementation strategy.

Stage 2: Strategy

The strategy phase focuses on developing a comprehensive plan to leverage the CDP effectively. Key considerations include:

1. Data Strategy Development

Define a data strategy aligned with business objectives and use cases. For example:

  • Personalization: Use customer data to deliver personalized experiences and targeted marketing campaigns.
  • Customer Retention: Identify at-risk customers and implement strategies to improve retention rates.

Plan implementation phases in a "crawl, walk, run" approach to prioritize high-impact use cases and ensure gradual adoption across the organization.

2. Customer Data Landscape Assessment

Assess the customer data landscape by identifying all data sources and their integration capabilities. Consider:

  • Online & Offline Data: Integrate data from online sources (e.g., website interactions, social media) and offline sources (e.g., in-store purchases, call center interactions).
  • Identity Resolution: Develop a strategy for resolving customer identities across different data sources to create unified customer profiles.

Define the universal data layer, including data points and naming conventions, to ensure consistency in data collection and storage.
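One way to picture a universal data layer is as a normalization step that maps each source's field names onto one canonical convention before data reaches the CDP. The mapping and field names below are invented for illustration:

```python
# Hypothetical canonical naming convention: every source field is mapped
# to a shared snake_case name before ingestion into the CDP.
CANONICAL_FIELDS = {
    "emailAddress": "email",
    "Email": "email",
    "FirstName": "first_name",
    "first_name": "first_name",
}

def normalize_event(raw_event):
    """Rename known source fields to their canonical names; pass others through."""
    return {CANONICAL_FIELDS.get(k, k): v for k, v in raw_event.items()}

print(normalize_event({"Email": "x@example.com", "FirstName": "Ada"}))
# {'email': 'x@example.com', 'first_name': 'Ada'}
```

With every source funneled through a mapping like this, downstream segmentation and reporting can rely on one consistent schema regardless of where a data point originated.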

3. Stakeholder Engagement & Collaboration

Ensure alignment across departments and stakeholders by:

  • Hosting workshops and collaborative sessions to gather input from various teams.
  • Establishing a governance structure to manage data access, usage, and privacy.
  • Defining workflows and data ownership responsibilities to streamline collaboration and decision-making processes.

Stage 3: Implementation

During the implementation phase, the focus shifts to executing the plan and mapping data to the CDP effectively. Key steps include:

1. Data Mapping & Integration

Map data from various sources to the CDP to create a unified view of customer data. For example:

  • Integrate CRM data to track customer interactions and purchase history.
  • Combine website analytics data to understand customer behavior and preferences.
  • Implement identity resolution to merge duplicate customer records and create accurate customer profiles.

2. Identity Resolution

An important part of data implementation is identity resolution (IR). When customer data streams in from various channels, it can be challenging to work out which records belong to the same person. A CDP solves this for you.

Imagine records like this:

Name           Address         City                 Email
Draco Malfoy   [blank]         [blank]              [email protected]
Draco          Malfoy Manor    [blank]              [blank]
D. Malfoy      [blank]         Wiltshire, England   [blank]

These three records belong to the same person. But more basic systems might mix the data up and create three separate Dracos. A CDP uses its IR powers to consolidate an example like this into a single customer profile. The resolved record completes Mr. Malfoy’s profile:

Draco Malfoy

Malfoy Manor

Wiltshire, England

[email protected]
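A deterministic version of this merge can be sketched in a few lines: records already matched to the same identity are folded together, with the first non-empty value winning for each field. The records and the email address below are hypothetical stand-ins for the example above (how a real CDP matches records in the first place, via email, device IDs, and so on, is a separate problem):

```python
def merge_records(records):
    """Fold matched records into one profile; first non-empty value wins."""
    profile = {}
    for rec in records:
        for field, value in rec.items():
            if value and not profile.get(field):
                profile[field] = value
    return profile

# Three records for the same person, each holding a different fragment.
records = [
    {"name": "Draco Malfoy", "address": "", "city": "", "email": "draco@malfoy.example"},
    {"name": "Draco", "address": "Malfoy Manor", "city": "", "email": ""},
    {"name": "D. Malfoy", "address": "", "city": "Wiltshire, England", "email": ""},
]

print(merge_records(records))
```

Running this yields one complete profile combining the name, address, city, and email, mirroring the resolved record shown above.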

Stage 4: Enablement

The enablement phase involves preparing for operational deployment and deriving value from the CDP. Key activities include:

1. Provisioning & Training

Set up the production environment and train users on CDP functionalities, including:

  • Data ingestion and integration processes
  • Segmentation and targeting capabilities
  • Reporting and analytics features

Build segments, reports, and personalized campaigns to leverage customer data effectively in marketing initiatives.

2. Iteration & Scaling

Start with small, manageable goals for quick wins and iterate based on feedback and insights from the CDP. For example, test different segmentation strategies to identify high-value customer segments.

Users can also analyze campaign performance metrics to optimize targeting and messaging. Scale up over time by expanding the use of the CDP across departments and incorporating additional data sources and use cases.
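A first segmentation pass of the kind described above can be as simple as a rule per segment. The thresholds and field names here are invented for illustration; in practice they would be tuned against your own campaign metrics:

```python
def segment(customer):
    """Assign a customer to a segment using simple recency/spend rules."""
    if customer["days_since_last_purchase"] <= 30 and customer["total_spend"] >= 500:
        return "high-value-active"
    if customer["days_since_last_purchase"] > 180:
        return "at-risk"
    return "standard"

customers = [
    {"id": 1, "days_since_last_purchase": 10, "total_spend": 900},
    {"id": 2, "days_since_last_purchase": 200, "total_spend": 50},
    {"id": 3, "days_since_last_purchase": 45, "total_spend": 120},
]

print([segment(c) for c in customers])  # ['high-value-active', 'at-risk', 'standard']
```

Starting with transparent rules like these makes it easy to compare segment performance and iterate before moving to more sophisticated, model-driven segmentation.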

Conclusion

Do you need more help building a strategic roadmap for a CDP implementation? Or are you not even sure if opting for a CDP is the right solution?

Submit the form to get personalized recommendations from our CDP experts.


Mar 12 2024

Network, learn, and collaborate: the three key motivations for individuals and organizations to participate in conferences. Every conference has a theme or niche that serves as a focal point for discussions and advancement. These events serve as stages for personal branding and business promotion, with attendees aiming to gain insights and contacts that directly benefit their individual goals and organizational interests.

Although open-source events rely on these key motivations too, they have a unique flavor of community spirit and collaboration that’s not found in traditional conferences. Open source events like DrupalCons thrive on shared knowledge, transparent innovation, and a sense of collective growth.

What is DrupalCon? 

DrupalCon is an annual open-source conference that brings together open-source enthusiasts, developers, designers, and end users for networking, learning, and collaboration, all under one roof. This is where you can meet the people who made the software, get inspired, and actively contribute to the project. The next DrupalCon North America will be held in Portland, Oregon, from May 6 to May 9, 2024. We’ll give you some reasons why you should attend open-source events like DrupalCon 2024.

Benefits of Attending Open Source Conferences 

An open-source enthusiast knows that events like DrupalCons are celebrations of community-driven innovation. The energy is contagious, the ideas are limitless, and the camaraderie extends beyond the conference halls. 

Spirit of Open-Source

Open source is almost synonymous with collaboration: collaboration by the contributors who are the heartbeat of any open-source project. These events provide a platform for individuals and organizations to come together, contribute to the community, and drive the future of open source. This aligns with the open-source commitment to empowering innovation through the collective efforts of a vibrant and engaged community. At an event like DrupalCon, you get the chance to meet people who are passionate about Drupal and driving it forward.

Career Boost

If you're launching your career or contemplating a switch to something more fulfilling, few experiences rival the rewards of joining an open-source community. And there’s no better place to kick off this journey than an open-source conference. You’re not just exploring job opportunities but also gaining the knowledge you need from training sessions and meaningful interactions with seasoned experts. You can also upgrade your skills through hands-on workshops and interactive sessions at the event. At DrupalCon, you can always find support if you’re new to the world of Drupal or Open source. A mentor will help guide you through your entire experience by suggesting what sessions you should attend for your professional development. You can even learn to make your first contribution to the project through your mentor.

Spot the Trend

Want to know what’s new in your area of interest? Open-source conferences are the best places to identify emerging trends, innovations, and shifts in the industry, well before they become mainstream! You come away well-equipped with insights into upcoming technologies and initiatives. This will not only help your professional development but also enable you to contribute meaningfully to innovative projects. All of this ultimately leads to improved user experiences and future-ready applications. At DrupalCon, immerse yourself in firsthand insights as Dries Buytaert, the founder himself, shares the current state of Drupal in his keynote (the DriesNote). Discover upcoming initiatives and innovation on the horizon, and get a sneak peek into the exciting developments set to launch.

The Power of Open Source Networking

We all know how powerful networking can be for your career or business development. But for an open-source community, networking is an indispensable aspect. It's impossible to have a successfully operating community without networking. Open-source events let you connect with like-minded individuals, developers, agencies, and contributors, fostering potential collaboration. Get mentorship, guidance, and exposure to new opportunities to aid your professional growth. Attend DrupalCon to connect with thousands of open-source enthusiasts and build meaningful connections with professionals just like you. Programs like BoFs (Birds of a Feather) at DrupalCon let you exchange information and share best practices around a common topic of interest. Make DrupalCon your opportunity to grow.

Real-World Learning

Learning from real-world scenarios truly refines your understanding of technology and innovation. Attending industry summits at open-source conferences is a great way to gain practical insights from industry leaders. It’s a chance to understand the real-world challenges they face and the practical solutions they have implemented. Through live demos, case studies, and applications, you can see these solutions in action. Industry summits often highlight the methodologies proving successful in the current landscape, providing actionable takeaways. DrupalCon has a full day dedicated to industry summits, including the higher education summit, nonprofit summit, government summit, and community summit.

Final Thoughts

Whether it's networking opportunities, hands-on learning, or trend forecasting, open-source conferences offer a holistic approach to staying on top of ever-changing technologies. They contribute to the collective growth of the entire open-source community. It's an investment in continuous learning, professional enrichment, and the boundless possibilities of open collaboration. Did we mention that DrupalCons aren't just about coding and tech talk? There's a ton of fun to be had too! Take a look at the social events from last year.

Mar 12 2024

Data plays a pivotal role in shaping customer interactions, so implementing a CDP can empower businesses to unify and analyze customer data from various sources. This can lead to more personalized and effective marketing strategies, improved customer segmentation, and enhanced digital experiences.

However, diving into CDP implementation without proper assessment can lead to challenges such as underutilization of the platform, data management issues, and inadequate return on investment. To avoid these challenges, organizations must comprehensively evaluate their readiness across various dimensions before embarking on a CDP implementation journey.

Evaluating Readiness For CDP Implementation

Before implementing a Customer Data Platform (CDP), organizations must establish several prerequisites to ensure a successful integration and utilization of the platform.

  • Ensure a clear data strategy, outlining objectives, sources, and governance frameworks aligned with broader business goals.
  • Ensure good data quality through cleansing and validation for accurate insights and decision-making.
  • Assess and enhance the technology infrastructure required to support a CDP integration.
  • Foster cross-functional collaboration, secure stakeholder buy-in, allocate resources, and develop a comprehensive change management plan.

Once these prerequisites are in place, the next step is to conduct a thorough evaluation across different dimensions.

Do You Need A CDP?

Assess your organization across the following dimensions to evaluate the need for a CDP.


1. Data Maturity & Infrastructure

Assess your organization's data management capabilities and infrastructure readiness for CDP integration.

  • Current Data Sources

Identify existing data sources and evaluate their formats, accessibility, and captured data.

  • Integration Capability

Assess the ability to integrate data sources with a CDP, considering API availability and compatibility.

  • Data Quality & Consistency

Ensure data accuracy, reliability, and adherence to governance and compliance standards.

Organizations with scattered data across various platforms and systems that wish to enable better decision-making should consider implementing a CDP. On the other hand, organizations with limited data sources do not need a CDP and should consider a different solution depending on their goal.

2. Organizational Readiness

Evaluate your organization's readiness beyond technology.

  • Stakeholder Alignment

Ensure key stakeholders understand and support the CDP initiative.

  • Skillset Availability

Assess if your team has the necessary skills for CDP implementation or if training/new hires are needed.

  • Technology Stack Compatibility

Evaluate compatibility with existing technology and potential upgrades.

Organizations where multiple stakeholders across departments need access to unified customer data for decision-making should consider opting for a CDP. On the other hand, organizations with a unified vision for utilizing existing data sources and access to the right technology infrastructure for supporting data integration can do without a CDP.

3. Future Scalability

Consider the scalability of your CDP implementation.

  • Scalability Assessment

Evaluate scalability to accommodate future business growth.

  • Flexible Architecture

Ensure the CDP architecture can adapt to evolving data sources and business needs.

Implementing a CDP is a great idea for organizations anticipating significant customer data volume and complexity growth. It is also ideal for organizations anticipating the adoption of new technology and data sources in the future.

Businesses anticipating minimal growth in customer data volume and with a stable industry can look into other avenues for building better customer experiences.

4. Change Management

Address organizational changes associated with CDP implementation.

  • Culture Of Data-Driven Decision Making

Foster an environment where decisions are based on data insights.

  • Process Integration

Ensure seamless integration with existing processes to make data insights actionable.

  • Change Management Strategies

Implement strategies to manage organizational transitions smoothly.

Organizations transitioning to a data-driven culture with the necessary stakeholder approvals, training, and support are usually in a great position to implement a CDP. Organizations without this culture, or those not looking to change their current processes and systems, should refrain from implementing a CDP.

5. Resource Availability

Ensure the availability of resources for successful CDP implementation.

  • Technical Expertise

Assess the need for skilled data management, integration, and analysis personnel.

  • Infrastructure

Ensure adequate technological infrastructure to support CDP requirements.

  • Budget

Allocate the budget for initial implementation, maintenance, and updates.

Implementing and managing a CDP requires specialized skills and expertise. Organizations with this expertise, or those willing to invest in it, should consider implementing a CDP.

6. Data Democratization

Promote accessibility and understanding of data across the organization.

  • User-Friendly Tools

Implement tools for easy data access and interpretation.

  • Data Governance

Establish clear policies on data access, usage, and security.

  • Training & Literacy

Provide training to improve data literacy across the organization.

Organizations looking to unify customer data for decision-making should consider implementing a CDP.

7. Technology Compatibility & Integration

Ensure seamless integration with existing technology platforms.

  • Existing Tech Stack Assessment

Evaluate compatibility with current systems.

  • API & Data Exchange Capabilities

Ensure seamless data exchange with other systems.

Organizations whose existing technology platforms operate in silos that hinder data exchange and integration should consider opting for a CDP. Businesses whose platforms already operate cohesively and allow seamless data exchange and integration may not need a specialized platform like a CDP.

8. Use Case Definition & Business Goals

Align CDP implementation with business objectives.

  • Clear Use Cases

Identify specific use cases for the CDP.

  • Alignment with Business Objectives

Ensure CDP directly contributes to achieving key business goals.

Implementing a CDP is a good idea when businesses require advanced data analysis capabilities, such as personalized marketing or real-time analytics, that cannot be achieved using existing tools. It can also help segment customers based on specific criteria and personalize marketing strategies.

9. Compliance & Data Governance

Ensure compliance with data privacy regulations and robust governance.

  • Data Privacy & Security

Confirm compliance with data protection regulations.

  • Audit & Reporting Requirements

Support necessary audit trails and reporting for compliance.

Organizations operating in highly regulated industries or globally must comply with necessary data governance and compliance regulations such as GDPR and CCPA.

10. Budget & ROI Consideration

Evaluate the financial aspects of CDP implementation.

  • Cost-Benefit Analysis

Understand the total cost of ownership and expected ROI.

  • Long-Term Financial Commitment

Consider long-term maintenance and scaling costs.

Implementing a CDP is beneficial when the potential benefits, such as improved customer engagement, increased sales, and enhanced marketing effectiveness, outweigh the initial investment and ongoing costs.

11. Vendor Evaluation

Select a suitable CDP vendor.

  • Market Research

Conduct thorough research on potential CDP vendors.

  • Proof Of Concept

Consider running a pilot program to test effectiveness.

Customer support and assistance are crucial for successful CDP implementation and ongoing maintenance. Ensure a thorough vendor evaluation when choosing a reputable and reliable CDP vendor.

12. Scalability & Future-Proofing

Ensure CDP scalability and adaptability.

  • Scalability Assessment

Ensure CDP can scale with business growth.

  • Adaptability To Future Trends

Ensure CDP can adapt to future data trends and technological advancements.

A scalable CDP solution is necessary when businesses anticipate significant data volume and complexity growth. A CDP can help adapt to future data trends and technological advancements.

Submit The Form To Get Personalized CDP Recommendations

Assessing readiness across these areas ensures successful CDP implementation and effective utilization in driving business growth and enhancing customer experiences. You can also submit this form to determine if you need a CDP or speak to our experts about your business requirements and goals.

Mar 10 2024


We're delighted to introduce Imre Gmelig Meijling, one of the newest members of the Drupal Association Board, elected in October. Imre, CEO at React Online Digital Agency in The Netherlands, brings a wealth of digital experience from roles at organizations like the United Nations World Food Programme, Disney, and Port of Rotterdam.

Imre is not only a member of the Drupal Association Board of Directors but also serves as an executive member on the DrupalCon Europe Advisory Committee. Previously, he chaired the Dutch Drupal Association, expanding marketing efforts and establishing a successful Drupal Partner Program. Imre played a key role in launching drupal.nl, a community website used by several countries. He co-created the Splash Awards and led Drupaljam, a Dutch Drupal event with almost 500 attendees. In 2023, Imre joined the Drupal Business Survey.

As a recent board member, Imre shares insights on this exciting journey:

What are you most excited about when it comes to joining the Drupal Association board?
I am very excited about joining the Drupal Association Board and contributing with insights and perspectives from the digital business market in Europe. Drupal has a strong market position with many opportunities for the coming years. I look forward to supporting the marketing team in their expanding efforts. I am particularly proud and excited to be part of an inclusive global community. Being part of an inclusive global community and supporting the Open Web Manifesto aligns closely with my personal values.

What do you hope to accomplish during your time on the board?
I aim to help expand Drupal's marketing outreach aiming for more wonderful brands and organizations adopting Drupal and attracting new talent to get involved with Drupal. I am also looking forward to establishing and sustaining relationships between Europe and other regions with the Drupal Association and finding ways to work even more closely together.

What specific skill or perspective do you contribute to the board?
Being part of an inclusive global community and supporting the Open Web Manifesto aligns closely with my personal values. Working with Drupal at various digital agencies in Europe, I support the growth of Drupal from a business perspective, but having a technical background, I also know the strength the Drupal community has, and can have, for brands. Having been in both worlds for a long time, I will help make sure we bring them together.

I was Chair of the Board for the Dutch Drupal Association, during which time a successful Dutch Partner Program was launched and marketing and advertising in mainstream media took off. I was also involved in the design and setup of the Dutch Drupal website, which is now open source. I co-founded the Splash Awards and I am an Executive Member of the DrupalCon Europe Community Advisory Committee. I will share all of my experiences where I can.

How has Drupal impacted your life or career?
It's been part of my life, both professional and personal, for over 16 years.

Tell us something that the Drupal community might not know about you.
I own my own digital agency in The Netherlands, React Online. I began my career as a UX designer and front end developer for Lotus Notes applications, called 'groupware' at the time, a long gone predecessor to the social collaboration platforms that we now know well. Interestingly, my birthday is on January 15, just like Drupal!

Share a favorite quote or piece of advice that has inspired you.
A true leader is not one with the most followers, but one who makes the most leaders out of others. A true master is not the one with the most students, but one who makes masters out of others.

We can't wait to experience the incredible contributions Imre will make during his time on the Drupal Association Board. Thank you, Imre, for dedicating yourself to serving the Drupal community through your board work! Connect with Imre on LinkedIn.

The Drupal Association Board of Directors comprises 12 members, with nine nominated for staggered 3-year terms, two elected by the Drupal Association members, and one reserved for the Drupal Project Founder, Dries Buytaert. The Board meets twice in person and four times virtually each year, overseeing policy establishment, executive director management, budget approval, financial reports, and participation in fundraising efforts.

Mar 04 2024

We have made a recent update on drupal.org that you probably haven’t noticed. In fact, although it's a meaningful and important section, I bet you have not seen it in months or even years. It's something you would have only seen when you registered on Drupal.org for the first time. I’m talking about the welcome email for newly registered users.

One of our recent goals has been improving the way users register and interact with drupal.org for the first time. Improvements to onboarding should strengthen what I call the long tail of the (open source) contribution pipeline, a concept I will explain further in the next few days.

For now, let’s have a look at one of the first things new users in our community saw:

This is what I called the huge wall of text. Do you remember seeing it for the first time? Did you read any of it at all? Do you remember anything important in that email? Or did you just mark it read and move on?

Fortunately, we've taken a first, incremental step to make improvements. As I said before, this isn't something our existing userbase will see, but the new welcome email to Drupal.org has changed and been simplified quite a bit. Here is the new welcome email:

We have replaced much of the wall of text with a simpler set of sections, links, and landing pages on drupal.org. This simplifies the welcome email, but it will also allow us to track which links are most useful to new users, how many pages they visit on drupal.org, where they get stuck, what interests them most, and so on, and use that to make further refinements over time.

The other section I wanted to include is something that is very important for the Drupal Association, but also for the whole community. I wanted to highlight the contribution area, something that was not even mentioned in the old email. Our hope is this is an opportunity to foster and promote contribution from new users.

A few weeks ago I also launched a poll around contribution. This poll combined with the updates on these few changes in the user registration are aimed towards the same goal: improving contribution onboarding. You can still participate if you’d like to, just visit https://www.surveymonkey.com/r/XRTNNM3

Now, if you are curious about what I am calling the long tail of the contribution pipeline, watch this space.

Feb 24 2024

We support Ukraine.

On the two-year anniversary of the Russian government’s attack on Ukraine, the Drupal Association wishes to reiterate its support for Ukraine. The invasion was an act of aggression, and our hearts are still with our Drupal Ukraine community.

We want to bring attention once more to the ways that the Drupal community can continue to support Ukraine. Here is a list of organizations* accepting donations to help people directly affected by the events in Ukraine:

  • Nova Ukraine, a Ukraine-based nonprofit, provides citizens with basic needs and resources. Donate here.

  • United Help Ukraine receives and distributes donations, food, and medical supplies to internally displaced Ukrainians and anyone affected by the war. Donate here

  • People in Need provides humanitarian aid to over 200,000 people on the ground. Donate here

  • The Ukrainian Red Cross undertakes humanitarian work, from aiding refugees to training doctors. Donate here.

  • UN Refugees Agency supports refugees. Donate here

  • UNICEF Ukraine is repairing schools damaged by the bombings and providing emergency responses to children affected by the war. Donate here.

As always, our global Drupal community is better together. We stand in solidarity and hope for peace. 

*List of resources originally compiled by Global Citizen

Dec 24 2023

Routes in Drupal can be altered as they are created, or even changed on the fly as the page request is being processed.

In addition to a routing system, Drupal has a path alias system where internal routes like "/node/123" can be given SEO-friendly paths like "/about-us". When a user visits the site at "/about-us", the path is internally re-written to allow Drupal to serve the correct page. Modules like Pathauto will automatically generate the SEO-friendly paths using information from the item of content, without the user having to enter them manually.

This mechanism is made possible thanks to an internal Drupal service called "path processing". When Drupal receives a request it will pass the path through one or more path processors to allow them to change it to another path (which might be an internal route). The process is reversed when generating a link to the page, with the path processors rewriting the internal path back into its public-facing form.

It is possible to alter a route in Drupal using a route subscriber, but using path processors allows us to change or mask the route or path of a page in a Drupal site without actually changing the internal route itself.

In this article we will look at what types of path processors are available, how to create your own, what sort of uses they have in a Drupal site, and anything else you should look out for when creating path processors.

Types Of Path Processor

Path processors are managed by the Drupal class \Drupal\Core\PathProcessor\PathProcessorManager. When you add a path processor to a site, this is the class that manages the processor order and calls the processors.

There are two types of path processor available in Drupal:

  • Inbound - Processes an inbound path and allows it to be altered in some way before being processed by Drupal. This usually occurs when a user sends a request to the Drupal site to visit a page. Inbound path processors can also be triggered by certain internal processes, for example, when using a path validator. The path validator passes the path through the inbound path processors to ensure that it is processed consistently.
  • Outbound - An outbound path is any path for which Drupal generates a URL. The outbound path processors are called to change the path so that the URL can be generated correctly.

Basically, the inbound processor is used when responding to a path, and the outbound processor is called when rendering a path.

Let's go through a couple of examples of each to show how they work.

Creating An Inbound Processor

To register an inbound path processor with Drupal you need to create a service with a tag of path_processor_inbound, optionally including a priority. This lets Drupal know that this service must be used when processing inbound paths.

It is normal for path processor classes to be kept in the "PathProcessor" directory in your custom module's "src" directory.

services:
  mymodule.path_processor_inbound:
    class: Drupal\mymodule\PathProcessor\InboundPathProcessor
    tags:
      - { name: path_processor_inbound, priority: 20 }

The priority you assign to the path_processor_inbound tag will depend on your setup. Higher priorities run first, and the internal inbound processor that resolves path aliases in Drupal has a priority of 100, so any setting less than 100 will cause our processing to be performed after Drupal's internal handler is called.

The InboundPathProcessor class we create must implement the \Drupal\Core\PathProcessor\InboundPathProcessorInterface interface, which requires a single method called processInbound() to be added to the class. Here are the arguments for that method.

  • $path - This is a string for the path that is being processed, with a leading slash.
  • $request - In addition to the path, the request object is also passed to the method. This allows us to perform any additional checks on query strings on the URL or other parameters that may have been added to the request.

The processInbound() method must return the processed path as a string (with the leading slash). If we don't want to alter the path then we need to return the path that was passed to the method.

To create a simple example, let's make sure that when a user visits the path "/some-random-path" we translate this internally to "/node/1", even though "/some-random-path" is not a registered alias for this page. In this example, if the path passed into the method isn't our required path then we just return it, effectively ignoring every path but the one we are looking for.
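A minimal sketch of such a processor, using the InboundPathProcessor class name from the service definition above, could look like this:

```php
<?php

namespace Drupal\mymodule\PathProcessor;

use Drupal\Core\PathProcessor\InboundPathProcessorInterface;
use Symfony\Component\HttpFoundation\Request;

/**
 * Translates the incoming path /some-random-path to the route /node/1.
 */
class InboundPathProcessor implements InboundPathProcessorInterface {

  /**
   * {@inheritdoc}
   */
  public function processInbound($path, Request $request) {
    if ($path === '/some-random-path') {
      // Hand the request to the router as the internal node route.
      return '/node/1';
    }
    // Return any other path unchanged.
    return $path;
  }

}
```

With this class and the service definition in place, Drupal routes requests for "/some-random-path" as if they were for "/node/1".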

Now, when the user visits the path "/some-random-path" they will see the output of the page at "/node/1". It is still possible to view the page at "/node/1" directly, so we have just created a duplicate path for the same page.

This is a simple example to show how the processInbound() method works; we'll look at a more concrete example later.

Creating An Outbound Processor

The outbound processor is defined in a similar way to the inbound processor, but in this case we tag the service with the tag path_processor_outbound.

services:
  mymodule.path_processor_outbound:
    class: Drupal\mymodule\PathProcessor\OutboundPathProcessor
    tags:
      - { name: path_processor_outbound, priority: 250 }

The priority of the path_processor_outbound tag is more or less the opposite of the inbound processor, in that you'll generally want your outbound processing to happen later in the call stack. Drupal's internal outbound alias processor has a priority of 300, so setting our priority to 250 means that we process our outbound links after Drupal has created any aliases.

The OutboundPathProcessor class we create must implement the \Drupal\Core\PathProcessor\OutboundPathProcessorInterface interface, which requires a single method called processOutbound() to be added to the class. Here are the arguments for that method.

  • $path - This is a string for the path that is being processed, with a leading slash.
  • $options - An associative array of additional options, which includes things like "query", "fragment", "absolute", and "language". These are the same options that get sent to the URL class when generating URLs and allow us to update the outbound path based on the passed options.
  • $request - The current request object is also sent to the method, allowing us to make decisions based on the parameters of the current request.
  • $bubbleable_metadata - An optional object to collect path processors' bubbleable metadata so that we can potentially pass cache information upstream.

The processOutbound() method must return the new path, with a starting slash. If we don't want to change the path then we just return the path that was sent to us, otherwise we can make any change we require and return this string.

Taking the simple example from the inbound processor further, let's change the path "/node/1" to be "/some-random-path". In this example we are looking for the internal path "/node/1", and if we see this path then we return our new path.
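A sketch of the matching outbound class, again assuming the OutboundPathProcessor class name from the service definition above:

```php
<?php

namespace Drupal\mymodule\PathProcessor;

use Drupal\Core\PathProcessor\OutboundPathProcessorInterface;
use Drupal\Core\Render\BubbleableMetadata;
use Symfony\Component\HttpFoundation\Request;

/**
 * Renders links to /node/1 using the path /some-random-path.
 */
class OutboundPathProcessor implements OutboundPathProcessorInterface {

  /**
   * {@inheritdoc}
   */
  public function processOutbound($path, &$options = [], Request $request = NULL, BubbleableMetadata $bubbleable_metadata = NULL) {
    if ($path === '/node/1') {
      // Rewrite the internal route to the public-facing path.
      return '/some-random-path';
    }
    return $path;
  }

}
```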

With this in place, when Drupal prints out a link to "/node/1" it will render the path as "/some-random-path".

On its own this example doesn't do much; we are just rewriting a path for a single page. The real power is when we combine inbound processing and outbound processing together. Let's do just that.

Creating A Single Class For Path Processing

It is possible to combine the inbound and outbound processors into a single class by applying both tags to a single service in the module's services file.

services:
  mymodule.path_processor:
    class: Drupal\mymodule\PathProcessor\MyModulePathProcessor
    tags:
      - { name: path_processor_inbound, priority: 20 }
      - { name: path_processor_outbound, priority: 250 }

The class we create from this definition implements both the InboundPathProcessorInterface and the OutboundPathProcessorInterface, and as such it includes both of the processInbound() and processOutbound() methods.

Now all you need to do is add in your path processing.

It's a good idea to create a construct like this so that you translate the path both going into and coming out of Drupal. This creates a consistent path model and prevents duplicate content issues where the same page is available at different paths.
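As a sketch, a combined class for the simple examples above (using the MyModulePathProcessor name from the service definition) pairs the two methods so that each reverses the other:

```php
<?php

namespace Drupal\mymodule\PathProcessor;

use Drupal\Core\PathProcessor\InboundPathProcessorInterface;
use Drupal\Core\PathProcessor\OutboundPathProcessorInterface;
use Drupal\Core\Render\BubbleableMetadata;
use Symfony\Component\HttpFoundation\Request;

/**
 * Translates a path symmetrically on the way into and out of Drupal.
 */
class MyModulePathProcessor implements InboundPathProcessorInterface, OutboundPathProcessorInterface {

  /**
   * {@inheritdoc}
   */
  public function processInbound($path, Request $request) {
    if ($path === '/some-random-path') {
      // Incoming requests for the public path resolve to the node route.
      return '/node/1';
    }
    return $path;
  }

  /**
   * {@inheritdoc}
   */
  public function processOutbound($path, &$options = [], Request $request = NULL, BubbleableMetadata $bubbleable_metadata = NULL) {
    if ($path === '/node/1') {
      // Generated links to the node route render as the public path.
      return '/some-random-path';
    }
    return $path;
  }

}
```

Because the two methods are exact opposites, every generated link points at a path that the inbound processor will translate straight back, so there is only ever one public URL for the page.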

The Redirect Module

If you are planning to use the inbound path processor system then you should be aware that the Redirect module will attempt to redirect your inbound path processor changes to the rewritten paths. The Redirect module is a great module, and I install it on every Drupal site I run, but in order to prevent this redirect you'll need to do something extra, which we'll go through in this section.

To prevent the Redirect module from redirecting a path you need to add the attribute _disable_route_normalizer to the route before the kernel.request event triggers in the Redirect module's RouteNormalizerRequestSubscriber class. We do this by creating our own event subscriber and giving it a higher priority.

The first thing to do is add our event subscriber to our custom module's services.yml file.

services:
  mymodule.prevent_redirect_subscriber:
    class: Drupal\mymodule\EventSubscriber\PreventRedirectSubscriber
    tags:
      - { name: event_subscriber }

The event subscriber itself just needs to listen to the kernel.request event, which is stored in the KernelEvents::REQUEST constant. We need to trigger our custom module before the redirect module event, and so we set the priority of the event to be 40. This is higher than the Redirect module event, which is set at 30.

All the event subscriber needs to do is listen for our path and then set the _disable_route_normalizer attribute to the route if it is detected.

<?php

namespace Drupal\mymodule\EventSubscriber;

use Symfony\Component\EventDispatcher\EventSubscriberInterface;
use Symfony\Component\HttpKernel\Event\RequestEvent;
use Symfony\Component\HttpKernel\KernelEvents;

class PreventRedirectSubscriber implements EventSubscriberInterface {

  /**
   * {@inheritdoc}
   */
  public static function getSubscribedEvents(): array {
    // Priority 40 runs before the Redirect module's subscriber at 30.
    return [
      KernelEvents::REQUEST => [['preventRedirect', 40]],
    ];
  }

  public function preventRedirect(RequestEvent $event): void {
    if ($event->getRequest()->getPathInfo() === '/en/some-random-path') {
      $event->getRequest()->attributes->set('_disable_route_normalizer', TRUE);
    }
  }

}

When the Redirect module event triggers it will see this attribute and ignore the redirect.

This will only happen if you are changing the path of an entity of some kind using only the inbound path processor. Creating only the inbound processor creates an imbalance between the outer path and the translated inner path, which we then need to let the Redirect module know about to prevent the redirect. If we also translated the outbound path in the same (and opposite) way then the redirect wouldn't occur.

Doing Something Useful

We've looked at swapping paths and preventing redirects, but let's do something useful with this system.

I was recently tasked with creating a module that would allow any page to be rendered as RSS. It wasn't that we needed an RSS feed, but that each individual page should have an RSS version available.

This was required as there was an integration with an external system that was used to pull information out of a Drupal site for newsletters. Having RSS versions of pages made it much easier for the system to parse the content of the page and so produce the newsletter. This also meant that if the theme changed, the system wouldn't be affected, as it wasn't using the theme of the site.

Essentially, the requirement meant that we needed to add "/rss" after any page on the site and it would render the page accordingly.

The resulting module was dubbed "Node RSS" and made extensive use of path processors to produce the result.

The first step was to create a controller that would react to a path like "/node/123/rss" and render the page as an RSS feed. This required a simple route to be set up to allow Drupal to listen to that path and to inject the current node object into the controller. The route also contains a simple permission, which provided a convenient way of activating the system when it was ready.

node_rss.view:
  path: '/node/{node}/rss'
  defaults:
    _title: 'RSS'
    _controller: '\Drupal\node_rss\Controller\NodeRssController::rssView'
  requirements:
    _permission: 'node.view all rss feeds'
    node: \d+
  options:
    parameters:
      node:
        type: entity:node

The rssView action of the NodeRssController just needs to render the node and return it as part of an RSS document. Using this we can now go to a node page at "/node/123/rss" and see an RSS version of the page.

I won't go into detail about producing the RSS version of the page here as it contains a lot of boilerplate code that goes beyond the scope of this article.

So far we only have half the functionality required. Seeing an RSS version of the page via the node ID is fine, but what we really want is to visit the full path of the page with "/rss" appended to the end.

The next step is to set up our path processor so that we can change the paths on the fly. In addition to the tags, we also pass in two other services for the class to use: the path_alias.manager service for translating paths and the language_manager service to ensure that we get the path in the correct language.

services:
  node_rss.path_processor:
    class: Drupal\node_rss\PathProcessor\NodeRssPathProcessor
    arguments:
      - '@path_alias.manager'
      - '@language_manager'
    tags:
      - { name: path_processor_inbound, priority: 20 }
      - { name: path_processor_outbound, priority: 220 }

The processInbound() method looks for the "/rss" string at the end of the passed path. If it is found then we remove it from the path and try to find the internal path of the page. If we find the path then it will be returned as "/node/123" instead of the full path alias, which means we can just append "/rss" to the end to point the path at our NodeRssController::rssView action.

  public function processInbound($path, Request $request): string {
    if (preg_match('/\/rss$/', $path) === 0) {
      // String is not an RSS feed string.
      return $path;
    }

    $nonRssPath = preg_replace('/\/rss$/', '', $path);
    $internalPath = $this->pathAliasManager->getPathByAlias($nonRssPath, $this->languageManager->getCurrentLanguage()->getId());

    if ($internalPath === $nonRssPath && preg_match('/^\/node\//', $internalPath) === 0) {
      // No matching alias was found and this is not already a node path.
      return $path;
    }

    return $internalPath . '/rss';
  }

The opposite process needs to happen for the processOutbound() method. In this case we look for a path that looks like "/node/123/rss" and convert this back into the full path alias of the page. If we find an alias for that path then we append "/rss" to the path and return it.

  public function processOutbound($path, &$options = [], Request $request = NULL, BubbleableMetadata $bubbleable_metadata = NULL): string {
    if (preg_match('/^\/node\/\d+\/rss$/', $path) === 0) {
      // String is not an RSS feed string.
      return $path;
    }

    $nonRssPath = preg_replace('/\/rss$/', '', $path);
    $alias = $this->pathAliasManager->getAliasByPath($nonRssPath, $this->languageManager->getCurrentLanguage()->getId());

    if ($nonRssPath === $alias) {
      // An internal alias was not found.
      return $path;
    }

    return $alias . '/rss';
  }

We now have an RSS feed for any content path on the website (as long as it is a node page of some kind).

If we attempted to visit the RSS output of any other kind of page (like a taxonomy term) then we would receive a 404 error. This is possible thanks to the route we have in place as the parameter will only accept node paths.

As we have translated the path completely we do not need the Redirect module overrides here since there is a coherent input/output mechanism for these paths. It's only when there is an imbalance in the paths that we need to override the Redirect module to prevent redirects.

Don't worry if you are looking for the full source code for the above module as I have recently released the Node RSS module on Drupal.org. It only has a dev release for the time being as I would like to add the ability to pick what content types are available for the feeds. I'm also testing it with different setups to make sure that the feed works in different situations. Let me know if it is useful for you and please create a ticket if you have any issues.

If you want to see another module that makes use of this technique then there is the Dynamic Path Rewrites module. This allows the rewriting of any content path on the fly without creating path aliases, making it an alternative to modules like Pathauto that avoids storing aliases in your system, and it uses a nice caching system to speed up the responses.

Conclusion

The path processing system in Drupal is really quite powerful and can be used to build some interesting features that rewrite paths on the fly, taking any incoming request and routing it to any path we like.

Without this system in place we would need to generate additional aliases for every path we wanted and add them to the database before we would be able to use the system. That is fine on smaller sites, but I manage sites with millions of nodes and that amount of data would bloat the database and probably not be used all that much.

Path processing does have some interactions with other modules (like Redirect), but these problems are easily overcome. Perhaps the most complex part is ensuring that you have the right priorities in place, as getting them wrong will likely lead to unwanted interactions between processors.

Dec 14 2023

Annual awards for the best Drupal projects

Last Friday, the prestigious #SplashAwards2023 took place in Mannheim. This annual event, often referred to as the Oscars of the Drupal community, provides agencies with the opportunity to submit their outstanding #Drupal projects.

This year, the awards featured a total of 28 projects. There was great excitement when the winners in 8 different categories were announced. Each category recognised both a winner and a deserving runner-up. In addition, every year the jury honours special projects close to their hearts with the "Honorable Mentions".

1xINTERNET continues its winning streak

Since their introduction in German-speaking countries in 2017, the Splash Awards have become a benchmark for outstanding Drupal projects. 

This year, 1xINTERNET has once again impressed the jury with its innovative solutions and creative projects and continued its success with both a first and a second place.

Oct 23 2023

The Splash Awards are not only a recognition of hard work and outstanding dedication, but also a chance to network with industry peers, talk about the latest trends in web development and design, and celebrate the best solutions.

And the winner is…

The 2023 Splash Awards will be presented on November 10 in Mannheim, Germany. As every year, we look forward to presenting our work and hope to continue our success story this year. We congratulate all nominees and look forward to exchanging ideas with other experts in the Drupal community.

See all nominated projects

Oct 12 2023

DrupalCon is the biggest event in the European Drupal community calendar, and this year, it will take place in Lille, France from October 17 to 20. We are excited to share that we are Platinum sponsors of the event, arriving with a big team from 1xINTERNET.

We're always excited about DrupalCon; it's one of the biggest events of the year. This year is extra special for us: we're celebrating our 10-year anniversary in December, which means 10 years of Drupal projects! We are bringing a big part of our team with us, showcasing our CMS with a "TRY DRUPAL" demo, giving you the chance to test out the foundation of our digital projects. Come and meet us at our booth in the exhibition hall!

Oct 11 2023

We contributed the module Search API Decoupled to allow the creation of engaging search experiences with client-side Javascript applications. 

In comparison to traditional server-side search, client-side solutions are much faster, because fetching and rendering search results is much more efficient than requesting full pages.

We released the backend functionality of Search API Decoupled at Drupal Dev Days 2023 in Vienna; the slides and video are available online.

However, Drupal needs a fully working search client so that developers and marketers can easily evaluate the functionality for their projects. 

Such a client should provide great search functionality out-of-the-box and should be easy to extend and style.

We applied for funding at the Drupal Innovation Contest Pitchburgh, but we did not get it.

We decided to build and contribute the functionality anyway.

Jul 20 2023

I get offers almost every day from recruiters looking for Drupal talent. I appreciate sincere offers, but most fail in ways that are obvious to me as a candidate. I used to write a response to every offer, but that took too much time. Now I delete most offers as soon as I receive them. If you are a recruiter, this may baffle you. Why would I ignore your offer if I’m open to work? For recruiters who have been waiting to hear back from me, consider this my response. For recruiters who want to do better, here are some suggestions.

Know your target candidate

An experienced engineer who has been in a position for a long time has no reason to respond to a short term offer at an entry level pay rate. An entry level developer cannot meet requirements for years of experience with multiple versions of Drupal. You have to choose which kind of talent to recruit, or you will not reach anyone.

The most experienced candidates are usually well established in their jobs. If you want to recruit them, you need to understand what it takes for your job to be better than the one they already have. If you want to pay entry level rates, you need to offer candidates a path to obtaining the experience you require.

Blasting out a message with “Urgent” in the title and a long list of requirements shows high expectations, but I can’t imagine how it is supposed to attract either experienced or entry level candidates. Even worse is offering entry level pay but requiring qualifications that only an experienced candidate can meet. I will not be flattered that you think my experience is an exact match for your needs.

Be clear about what you are offering

Most of the offers I see completely fail to say what they are offering. Any decent offer starts with reasonable pay. I no longer respond to offers that do not include a target pay range. Disclosing a pay range does not weaken your negotiating position. You do not automatically have to pay the top of the range just because the candidate knows what it is. You and the candidate negotiate what the candidate will bring to the job to justify the top rate. If what the candidate offers does not justify being paid in the upper part of the pay range, you would pay in the lower part of the range.

If you contract with an agency you will have to pay about US$150 per hour for Drupal work. If you want to contract directly with an experienced developer, expect to pay at least US$80 per hour. If you are recruiting an established developer with a salary of US$150,000 per year with benefits, you need to offer more.

Money is not the only form of compensation that matters. Highlighting other forms of compensation helps your offer to stand out. These could include

  • Time off
  • Working remotely
  • A supportive office culture
  • Work-life balance
  • Time to work on personal projects
  • A clear plan for career advancement

Provide a path for entry level candidates

If the top of your pay range is US$60,000 a year, you need to open the position to entry level candidates. Instead of expecting candidates to show up with qualifications and experience, think about offering a path for them to obtain qualifications and experience. The need may be immediate, but wishing for experienced candidates who will work for entry level pay won’t make them appear. How can you provide a path to experience for entry level candidates?

The best way for a new candidate to obtain experience is to partner with a mentor.

  • The Drupal community has an active mentorship program for new contributors. Some mentors may be willing to work with your candidates.
  • If your company has experienced developers, assign them to mentor new candidates.
  • Work with candidates from a company that provides training, such as DrupalEasy, Debug Academy, or EvolvingWeb.

Other ways you can give new candidates experience:

  • Sponsor them to work on projects for non-profit organizations.
  • Sponsor their contributions to open source projects related to their work.

Artificial intelligence is at a point where it can help developers boost their skills quickly. Providing access to artificial intelligence will help entry level candidates improve their productivity, but they need to know enough to check their work, find documentation, and collaborate with other developers. When access to code has to be protected, developers may not be able to use artificial intelligence.

Understand the role

Most job offers I see are getting better at this. Drupal work encompasses information architecture, site building, and front and back end coding. To get really good, most people have to focus on one of these. The mythical full stack expert who does a complex, highly customized project all on their own in a short period of time is usually not available. If you cannot be specific about the type of work, you may not be ready to recruit candidates.

The project matters

It’s not just the work that matters, but what the project is and who it is for. Would I be proud to be working on this project? Do I respect the company? This is where you can really make your offer stand out from the rest.

  • Is the company a leader in its field?
  • Will the project have a positive impact on my community?
  • Will I be working with colleagues whose past work is widely respected?
  • Is it an opportunity for me to apply my skills in a new field?

The personal touch

If you have done everything I have discussed so far, I am really going to be impressed with your offer. I may even pass it along to my talent network. If that was your goal, you can stop now. But if you want to guarantee that I will not only be impressed but also personally respond to your offer, there is one more thing you can do. Give it the personal touch.

What could make the job personally meaningful to me?

  • Are there people I have worked with before who want to work with me again?
  • Is the work closer to my area of interest than my current job?
  • Does it support a cause I believe in?
  • Is there any existing connection to me or my family?
  • Does the company know about and want to support work I’ve been doing on my own?

If you add the personal touch to a good offer, you will definitely hear back from me, and I will give serious consideration to your offer.

If you need help implementing any of my suggestions for a position you are trying to fill, please contact me and I will do my best to help.

Jun 22 2023

Jill Farley, Ken Rickard, and Byron Duvall discuss their experiences with the Cypress front-end testing framework.

We want to make your project a success.

Let's Chat.

Podcast Links

Cypress.io

Transcript

George DeMet:
Hello and welcome to Plus Plus, the podcast from Palantir.net where we discuss what’s new and interesting in the world of open source technologies and agile methodologies. I’m your host, George DeMet.

Today, we’d like to bring you a conversation between Jill Farley, Ken Rickard, and Byron Duvall about the Cypress front end testing framework. Cypress is a tool that web developers use to catch potential bugs during the development process. It’s one of the ways we can ensure that we’re building quality products that meet our client’s needs and requirements. 

So, even if you aren’t immersed in the world of automated testing, this conversation is well worth a listen. Without further ado, take it away, Jill, Ken, and Byron.

Jill Farley:

Hi, I'm Jill Farley from Palantir.net. I'm a senior web strategist and UX architect. Today, I'll be discussing Cypress testing with two of my colleagues. I'll let them introduce themselves, and then we can have a relaxed conversation about it.

Ken Rickard:

I'm Ken Rickard. I'm senior director of consulting here at Palantir.net.

Byron Duvall:

And I am Byron Duval. I'm a technical architect and senior engineer at Palantir.net.

Jill Farley:

Well, thanks for sitting down and talking with me today, you guys. We are going to maybe just start this off for anybody who doesn't know what Cypress automated testing is with a quick, maybe less technical overview of it. So, Ken, what are Cypress tests in 60 seconds?

Ken Rickard:

In 60 seconds, Cypress is a testing framework that is used to monitor the behavior of a website or app in real time within the browser. It will let you set up test scenarios, record them, and replay them so that you can guarantee that your application is doing what you expect it to do when a user clicks on the big red button.

Jill Farley:

I did not time you, but that was brief enough. To give a little context for why we're talking about this today, our technical team, led by Ken, just recently developed a virtual event platform for one of our clients. Actually, it was developed a couple of years ago.

We've been iterating on it for a few years and it's unique in that it debuts for a few intense weeks each year. It's only live for a couple of weeks and specifically hosts this virtual event for four days, and then goes offline for the rest of the year. So, we have to get it right and we specifically have to get it right for the tens of thousands of visitors over the course of that four-day event that are coming.

So, this year we really went all in on Cypress testing to really ensure the success of the event.

So, Ken, I've heard you say we have 90% test coverage on this particular platform right now after the work that we've done. What does that mean? What does 90% test coverage mean?

Ken Rickard:

"It means we can sleep at night," I think, is what I mean when I say that. It’s simply that roughly 90% of the things that an individual user might try to do on the website are covered by tests. So, I joke a little bit about what happens when you press the big red button.

I mean, we have big orange buttons on the website, and the question becomes, "What happens when you press that button? Does it do the thing you expect it to do?" Also, the content and behavior of that button might change depending on whether or not we're pre-conference, we're during the conference, or we're during a specific session in the conference, and that changes again post-conference.

So, we have all of these conditions that change the way we expect the application to behave for our audience. I'll give you a simple example. During a session, the link of the session title, when you find it in a list, doesn't take you to the session page. It takes you to the video channel that's showing that session at that time. That is true for most sessions for a 30-minute window during the entire conference.

Our testing coverage is able to simulate that so we know, "Yep, during that 30-minute window, this link is going to go to the right place." So when we talk about that sort of 90% coverage, it means, from an engineering standpoint, well, even from a product management standpoint, you can look at the feature list and say, "Well, we have 300 features on this website and we can point to explicit tests for 270 of them."

Those numbers I just made up, but that gets you the point.

Jill Farley:

That sounds like an incredible amount of work to try to understand what to test and what types of tests to write. I'm actually going to go over to Byron for a second. As a member of the development team, what was it like actually writing these tests, creating them, and using them?

Up front, prior to them actually doing their job and covering our bases, if it was hard, let's talk about that.

Byron Duvall:

It was interesting and it was different because we use a different language for the testing. There's a lot of special keywords and things that you have to use in the testing framework, so just learning that was a bit of a curve.

Then, the biggest issue, I think we ran into with all of the testing was the timing of the test. The Cypress browser runs tests as fast as it can, and it runs faster than a human can click on all of the things. So, you start to see issues when the app doesn't have time to finish loading before the test is clicking on things. You have to really work to make sure that you have all of the right conditions for that to happen, and everything loaded, and you have to specifically wait on things.

I think that was the most challenging part. We usually had an idea of what we were looking for when we were writing a piece of functionality, what we were looking for it to do. That was kind of an easier part because we could write the click commands and write the test for what we're actually looking for in the return on the page.

So, that was the easiest part of it. The trickiest part was just the whole timing issue.

Jill Farley:

So Ken, how do we decide what to test if we're doing hundreds of tests? Is there ever really an end to what we can test or how do you do that prioritization?

Ken Rickard:

There is a theoretical end, because we could cover every single combination of possibilities. You go for what's most important and for what the showstopper bugs are. So, for instance, here are three simple examples of the first tests we wrote. Test #1: Do the pages that we expect to load actually load? And do they have the titles that we expect them to have when we visit them?

Test #2: Those pages all exist in the navigation menu. Does the navigation menu contain the things we expect it to? And when you click on them, do they go to the pages we want them to? Also fairly simple.

Then we start to layer in the complexity because some of those menu items and pages are only accessible to certain types of users, certain types of conference attendees on the website. So you have to have a special permission or a pass to be able to see it.

So test #3 would say, "Well, we know that this page is only visible to people who are attending a conference in person. What happens when I try to hit that page when I'm not an in-person attendee? And does it behave the way we expect it to?"

So we start from the showstoppers, right? Because if someone has paid extra money for an in-person ticket, but I let everyone view that page or don't treat that in-person user as special in some way, we're going to have angry clients and angry attendees. So we test that piece first.

Then it's a question of, I would argue, testing the most complicated behaviors first. Like, what is the hardest thing? What is the thing most likely to go wrong that will embarrass us? And in that case, it's we have a whole bunch of functionality around adding and removing things to a personal schedule.

And we counted it up: because we have pre-conference, during-conference, and post-conference behavior, there turned out to be 15 different states for every single session, and we have tests that cover all fifteen of those states. So we know what happens when you, like I say, click the big something at the specific time.
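A common way to keep that many variants manageable is to enumerate the states as data and generate one test per entry. The specific phase and state names below are hypothetical, but the enumeration pattern looks like this:

```javascript
// Hypothetical enumeration: 3 conference phases x 5 per-session states
// gives 15 cases, each of which can drive its own test.
const phases = ['pre-conference', 'during-conference', 'post-conference'];
const sessionStates = [
  'not-on-schedule',
  'on-schedule',
  'in-progress',
  'ended',
  'recording-available',
];

// Cross the two lists to get every combination.
const cases = phases.flatMap((phase) =>
  sessionStates.map((state) => ({ phase, state }))
);

console.log(cases.length); // 15

// In a Cypress spec, each entry would generate an it() block:
// cases.forEach(({ phase, state }) =>
//   it(`session behaves correctly when ${state} ${phase}`, () => { /* ... */ }));
```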

So that's really how we break it down.

Jill Farley:

Makes tons of sense. Sounds like those are both, well, it brings new meaning to coverage. We're not just covering the functionality; we're kind of covering our butts too, making sure that, you know, we're not missing any of the big things that could really affect the attendee experience and the business, I guess, the business focus of the conference.

Ken Rickard:

Right. And two other things as well. It does let us focus on what the actual requirements are, because there are times when you go to write a test and you're like, "Well, wait a minute. I'm not sure what this thing should do if I click on it. Let's go back to the project team and find out." And we did that a number of times.

And then when you have something that's complex and time-sensitive, the biggest risk you run from a development standpoint, I think, is "Oh, we fixed issue A but caused issue B." So, you get a bug report. You fix that bug and it breaks something else.

Complete test coverage helps you avoid that problem. Because we broke tests a lot and you'd see a failing test and be like, "Oh wait, that thing I just touched actually has effects on other parts of the system." And so having those pieces again gives us a better product overall.

Jill Farley:

So test failures could be a good thing in some cases.

Ken Rickard:

Very much so. I actually was reviewing someone's work this morning and they had to change the test cases. I don't think they should have, based on the work they were doing. So, I was reviewing the pull request and I said, "Hey, why did you change this? This doesn't seem right to me because it indicates a behavior change that I don't think should exist."

Jill Farley:

Byron, I want to ask you. I know that you were involved in some of the performance work on this particular platform. What do you think? Did our Cypress tests in any way prevent some performance disasters? Or do you think that it's mostly about functionality? Like, do the two relate in any way?

Byron Duvall:

I don't think we had any tests that uncovered performance issues. I can't think of any specific example. It was mostly about the functionality and it was about avoiding regressions, like Ken was talking about. You change one thing to fix something, and then you break something else over here. I don't think that we had any instances where a performance bug would have been caught.

Ken Rickard:

I would say every once in a while. One of the things that Cypress does is it monitors everything your app is doing, including API requests. I think it came in handy in a couple of cases where we were making duplicate requests. So, we had to refactor a little bit. These were pretty small performance enhancements, so yeah, nothing big around infrastructure scaling or things like that.

But Cypress could catch a few things, particularly, I mean, if you're talking about a test loading slowly. It's like, “Oh, we have to wait for this page to load.” That can be indicative of a performance issue.

Byron Duvall:

Yeah, that's a good point. And then, the Cypress browser itself will show you every request that it's making, so you can tell if it's making lots of requests that you don't believe it should be making, or if it's making them at the wrong times. That could indeed be a way to uncover something, but it's really completely separate from the tool that we use to test performance outside of those other clues that you might get from Cypress testing.
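The duplicate-request clue Ken and Byron describe reduces to counting requests per URL in the log Cypress shows you. A small sketch of that check (the URLs are hypothetical):

```javascript
// Flag any URL requested more than once during a single page load:
// often a hint that a component is fetching data it already has.
function findDuplicateRequests(requests) {
  const counts = new Map();
  for (const { url } of requests) {
    counts.set(url, (counts.get(url) ?? 0) + 1);
  }
  return [...counts]
    .filter(([, count]) => count > 1)
    .map(([url, count]) => ({ url, count }));
}

const requestLog = [
  { url: '/api/sessions' },
  { url: '/api/speakers' },
  { url: '/api/sessions' }, // accidental duplicate fetch
];

console.log(findDuplicateRequests(requestLog)); // flags /api/sessions, count 2
```

In Cypress itself, `cy.intercept()` can record matching requests, so a test can assert on how many times an endpoint was actually hit.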

Jill Farley:

So we've talked about Cypress testing and functionality. Can it test how things look? How things display?

Ken Rickard:

It can. But it's not a visual testing tool. It's not going to compare screenshot A to screenshot B, but we could write specific tests for markup structure in the HTML. For example, does this class exist inside this other class? It does have some tools for testing CSS properties, which we use in a few cases. Jill, you'll remember this: are we using the right color yellow in one instance? So we have an explicit test for, "Hey, is this text that color?"

Jill Farley:

That would have been the big disaster of the event, if it wasn't the right color yellow.

Ken Rickard:

So, we do have a few of those, which are visual tests, but they are not visual difference tests. That's a whole different matter. However, you can write a test to validate, going back to my previous example, that the big orange button is, in fact, orange.

Jill Farley:

Just a couple more questions. Let's start with: What are a few ways that going all in on Cypress testing this year got in our way? Perhaps there's something we might do a little differently next time to streamline this, or maybe it's always going to get in our way, but it's worth it.

Ken Rickard:

I think the answer is: it can be painful, but it's worth it. The fundamental issue we had when we pushed things up to GitHub and then into CircleCI for continuous integration testing is that we don't have as much control over the performance and timing of the app running on CircleCI as we do when running it locally. So tests that pass routinely locally might fail on CircleCI. That took us a long time to figure out. There are ways to get around that problem, which we are using.

The other significant issue is that to do tests properly, you have to have what are called data fixtures. These are simply snapshots of what the website content looks like at certain points in time. They're called fixtures because they are fixed at a point in time, so they should not change. But because we were transitioning from last year's version of this application to this year's, there was a point where we had to change the content in our fixtures. We're actually about to experience that again next week when we transition from the test fixtures we were using to a new set of fixtures, which contain the actual data from the conference: all of the sessions, all of the speakers, all of it.

Updating that is a massive amount of work. So being able to rely on things like, "Hey, I want to test that Jill Farley's speaker name comes across as Jill Farley," means we have to make sure that the content we have maps to that.
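Concretely, a fixture is just fixed data checked into the repository; Cypress loads files from `cypress/fixtures/` via `cy.fixture()`. A hypothetical sketch of a speakers fixture (field names invented for illustration):

```json
[
  { "name": "Jill Farley", "role": "Host" },
  { "name": "Ken Rickard", "role": "Panelist" },
  { "name": "Byron Duvall", "role": "Panelist" }
]
```

A test can then assert that the page renders exactly what the fixture contains, which is also why swapping test fixtures for real conference data means revisiting every assertion tied to that content.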

Jill Farley:

Is it fair to say that incorporating this into our development process, again, it's worth it, but it slows it down?

Ken Rickard:

I don't think it overall slowed us down. I believe overall, it might have increased our efficiency.

Byron Duvall:

Indeed, I believe it did increase our efficiency. It allows us to avoid a lot of manual point-and-click testing on things when we're done. We can do some development, write a test or have someone else write the test, and use that tool to do our manual testing while we're coding instead of just sitting there pointing and clicking. Even creating test data, if we have the fixtures in there like Ken was talking about, is a big help as well.

One of the things that did slow us down, however, was being able to differentiate between a timing or an expected failure versus an actual problem or regression with the code. There were instances where we didn't recognize which type of failure it was. We changed the test, but then we inadvertently broke something else in the app, so we should have paid more attention to that specific test failure.

When you're trying to troubleshoot various types of failures, timing failures, and things that may differ on CircleCI or just fail intermittently, figuring out whether it's a real problem or not can slow you down. It can also kind of defeat the purpose of testing as well.

Ken Rickard:

Right. In normal operation, which I would say we're in about 90% of the time, it simply means that on a given piece of work we're doing, we're changing just one thing, and that's the only thing we need to focus on. We can trust the tests to cover everything else. So, we can be assured that nothing else broke. The question then becomes, do we have a good new test for this thing that just got added?

Sometimes, I conduct pull request reviews without actually checking out the code and running it locally. I can just look at it and think, "OK, I see what you did here. You're testing here. You didn't break anything. OK." That's acceptable and, actually, it's a great feeling.

Jill Farley:

This is probably the last question. We've talked a lot about the benefits to the development process and gotten into the specifics of how to do it. For anyone who's considering incorporating this into their process, are there maybe two or three key benefits from a business perspective or a client perspective? Why take the time to do this?

I can actually think of one from the business side. I was the delivery manager on this project, and in layman's terms, the manual QA process on the client side, once we demoed this work to them, was so much shorter. Last year, when we weren't doing this, there was a lot more pressure on the human point-and-click testing, like Byron was saying, not just on our development team's side but on the client side as well. As they reviewed the functionality and really tested our work to see if it was ready for prime time, the safer testing process really decreased the number of issues we found during the final QA phases. It was really nice to sit back at that point and say, "Yeah, we've covered all of our bases."

So Ken, what would you say are the biggest business benefits to incorporating Cypress testing into a product?

Ken Rickard:

The biggest business benefits, I would say, are getting a better definition of what a feature actually does because the developers have to implement a test that covers that feature. This creates a good feedback loop with the product team regarding definition. Another significant factor that derails projects is either new feature requests that come in at inappropriate times or regressions caused by making one change that accidentally breaks multiple things we were unaware of. You witnessed such incidents in past projects when we didn't have test coverage, but now we don't have to deal with that anymore. Those are the big ones.

It also occurred to me, as you were talking about the client, that one of the nice things about Cypress is that it records all the tests it runs and generates video files. Although we didn't share those with the client, we could have sent them the videos and said, "Okay, we just finished this feature. Here's how it plays. Can you make sure this covers all your scenarios?" They could watch the video and provide feedback. This potential is significant from a business standpoint because it allows for various asynchronous testing. It's funny because when you play back the Cypress videos, you actually have to set them to play at around 1/4 speed, otherwise, it's hard to follow along.

Jill Farley:

So that question I had about does this slow us down? It sounds like we make up the time.

Ken Rickard:

Oh, we definitely make up the time. Yeah, most definitely, again, just in catching regressions.

Jill Farley:

So, business benefits we've got: saving the client time and heartache on the final QA, we've got getting to a better definition of what a feature does, safeguard against regression. Byron, do you have any other thoughts on the sort of business benefits of Cypress?

Byron Duvall:

I think that pretty well sums it up. I don't think I would add anything to that list, that's pretty comprehensive.

Ken Rickard:

I mean, it's a little selfish to say, but I think it made us better developers, because you have to think through all of the implications of things.

Jill Farley:

Well, I think that's a great statement to end this conversation on. You both feel as though you're now better developers because of this experience. Thank you both!

Hopefully, this gave everyone an idea of how we went about it, some things to plan for, and a bit of guidance on how to approach introducing Cypress into your process. Thank you both so much for your time, and happy Cypress testing!

Development Drupal Open Source Site Building
Apr 17 2023

Nowadays, NGOs and nonprofits need as much exposure as they can get. The days of local advertisements are beginning to wane; we are now in the digital age, the era of the Internet.

It’s almost impossible to overstate the impact the Internet has had on the modern world. Of the many things the Internet has allowed people to do, one of them is the ability for a message to reach more people than ever thought possible, from places all around the world. Any organization that wants to put its name out there and be heard now has to create a website.

 

Maximum exposure may not be the only reason nonprofits or NGOs want a website, however. Recently, we have seen many big organizations shift toward a larger online presence. Many companies offer their services through their main website and communicate with their patrons on social media. While getting your message out there may be important, the way users engage with the website and the services it provides are equally crucial factors for any organization.

Organizations looking to make the transition to the digital age, or those that already have a site but want to use it to its full potential, may find the abundance of CMSs to choose from dizzying. Organizations should, however, consider using Drupal for their websites, and here are just a couple of reasons why.

 

Cost

The cost of building and maintaining a website can vary wildly. It depends on what platform you use, server costs, whether or not you hire professional help to design and maintain your web page. Due to their nature, it may be in a nonprofit or NGO’s best interest to try and find the most cost-effective way to build a website. While a simple drag-and-drop pre-built theme may look pretty and be the cheapest option available, these often lack the features under the hood that nonprofits and NGOs require for a really polished, professional user experience.

If you’re looking to keep costs down, you will be pleased to know that Drupal is license-free and open-source: the usually exorbitant licensing fees that other CMS applications ask for simply aren’t present in Drupal. In addition, all modules and themes found on Drupal’s website, and several of those that aren’t, are free to use for web development.

While other CMS vendors may attach additional costs for each new server, ongoing maintenance, each module a prospective buyer would like to purchase, and, on top of all this, charge a monthly or annual licensing fee to use their system, Drupal’s suite of features and modules built with the idea of community-supported, open-source software in mind makes it an attractive option for organizations that want a CMS that won’t put a hole in their wallet.

Easy and Ready to go

Designing websites can take a while to do, and that includes both the visual design and the underlying code that makes the site run. The longer it takes to get a website up and running, the more it can eat up an organization’s precious budget, especially when working with a professional web development agency.

Sometimes it may not even be a budget issue, but rather an issue of time. A site may need to be put up as quickly as possible on the heels of a natural disaster or some event that needs attention. Drupal can help you get that website online fast.

Drupal comes with a number of features that allow users to get a website out there doing what they want right out of the box. Themes and website builder kits such as YG Charity and OpenAid are built with cause-driven organizations in mind, allowing users to quickly set up a site with features expected of a professional website, such as blog integration, image galleries, team profile pages, and testimonials.

Some organizations have even developed their own website starter kits for any new chapters they have springing up. The YMCA, which has member organizations scattered around the world, created Open Y, a digital distribution platform shared by the founding YMCAs in order to help fledgling YMCAs develop their own websites.

True to Drupal’s commitment to modularity and the nature of open-source software in general, these kits can simply be used on their own or used as templates to build upon for more customized looks and features that better benefit the particular message you want. Bits and pieces can also be borrowed from elements of this kit to help a web developer build their own features, allowing for much quicker development and deployment of said features. Such kits allow smaller NGOs, who may not have as many resources as other organizations, to develop websites and let their causes be heard.

Get a Free Consultation on Boosting Your Donations.

Scalability

Whether your nonprofit is small or large, Drupal has the tools to benefit websites of all shapes and sizes.

As previously mentioned, smaller organizations can benefit from Drupal’s low barrier to entry thanks to the ease of setting up a website quickly, but what about larger organizations that have more complex needs? Drupal is equipped to handle them as well.

Perhaps your organization is thinking of launching a website for a particular campaign while maintaining its own separate main website. Drupal has the ability to connect your main site to any further sites you may want to launch in the future.

Organizations such as the Great Ormond Street Hospital Charity have had great success using modules such as Organic Groups to launch and maintain several different websites for their multiple campaigns.

Drupal not only allowed them to extend their reach but also helped them handle what came with that growth. Under Drupal, their site is able to handle surges in user traffic, avoiding the lag and crashes that would’ve soured the user experience at the time it was most crucial: during a spike in traffic.

Stories of organizations such as UNRWA showcase that Drupal’s flexibility means that it is able to handle any size website, no matter what their needs may be. Shameless plug: the UNRWA site is a Vardot project, so if you liked it, feel free to shoot us a message—we’ll be more than happy to accommodate you!

 

While launching other websites may seem daunting, especially once different content editors and site administrators get in the mix, this CMS supports editor-friendly features, such as role permissions, editing authorizations and the ability to tag and categorize site content. Drupal’s interface streamlines the editing of content, while editing authorizations keep content editors from changing something they aren’t supposed to and causing confusion.

With a little UI configuration, Drupal offers a nice, comfortable experience for your editors so they can provide a pleasant experience to those browsing your website.

 

Robust Features

Drupal, being an open-source CMS, has a wide array of features supported by a dedicated community. As we’ve already shown, many of these features are of use to a nonprofit or an NGO, but with a vast suite of modules, there are many more that you may find interesting. It depends on what you want to do with your website.

Perhaps you want your message to reach more people. Drupal’s website contains multiple responsive mobile website themes to make mobile websites that look and feel great for those browsing your site on their phones.

Perhaps you’d like to not just reach mobile users, but people from all over the globe? Drupal features support for translating content shown to site visitors in their own local language to make sure the message of your cause reaches far, without the added complexity of maintaining multiple alternate-language websites. Furthermore, this CMS features the ability to change the language of the system interface, allowing ease of access to the people working on the website, wherever they may come from.

Perhaps you want your organization to receive donations online. Drupal modules like Payment are simple to install and allow you to securely accept donations through your website through multiple payment gateways like PayPal and credit cards.

If you want to show the progress towards a donation goal, there are multiple modules that integrate a donation thermometer onto your webpage to show your site visitors how close you are to hitting that goal.

Drupal is also great for SEO, for when you want your organization to be boosted in the search rankings to attract more traffic and get your message spread wider. There are a number of modules that help you do this. Pathauto automatically generates SEO-friendly URLs for your web pages. SEO Checklist is a to-do list of best optimization practices, checking your site for what you have already done and telling you what to do to have a fully optimized website.

These are only just a few of the Drupal modules available that fit the basic needs of a nonprofit or an NGO. With proper development using this CMS, you can make a website tailored to your cause and message with the features you need.

Conclusion

Web presence is an important thing in today’s world. An enormous portion of the population is on the Internet now. Thus, it becomes important for nonprofits and NGOs to put themselves into cyberspace to get their message out there and be noticed.

Just having a page on the Internet is not enough, however. SEO, site features, the overall user experience, the look and feel of the website—these are all important factors to maintaining a successful website, and these things need to be great whether you’re a big organization or a small one.

Right now, Drupal is used to power multiple global and local nonprofits and NGO websites (such as the UNRWA website developed by Vardot). With its broad community support and a flexible base system built to fit custom needs, Drupal offers all kinds of tools to build and benefit an organization’s website.

Are you an NGO looking to increase its online presence? Feel free to reach out to us, and we’ll be more than happy to help!

11.5+ Million USD Processed Through the UNHCR, UN Refugee Agency's Drupal Fundraising Platform Developed by Vardot.

Message us through our Contact Us page, or via email at [email protected].

Apr 06 2023

Although a stable version of Recipes is yet to be released in 2023, the initiative is a prominent part of Drupal 10.


Drupal 10 provides a powerful platform for building websites and applications. It offers various ways of site-building, including profiles, distributions, and now recipes. 

As part of a Drupal strategic initiative, site builders and developers stand to benefit greatly from the improvements Recipes will provide.

Drupal 10 recipes are expected to provide more flexibility and ease of use to site builders and developers, allowing them to create custom solutions that meet their specific needs. Although a stable version of recipes is yet to be released in 2023, this initiative is a prominent part of Drupal 10 features. 

This blog will help you understand how recipes are different from profiles and distributions and how they are a way forward in Drupal site-building. 

Understanding Profiles & Distributions


Profiles and distributions are often confused, but they are not the same thing. Drupal vanilla, or the basic Drupal installation, is relatively bare and lacks many of the essential features required to create a full-fledged website.

Profiles and distributions are pre-configured packages that contain a set of modules, themes, and configurations that can be used to create a specific type of website.

Distributions are built on top of Drupal and provide a use-case-specific package. They include a pre-selected set of modules, themes, and configurations that are designed to fulfill a particular use case.

For instance, a media and publishing distribution will include modules like feed, carousel banner, facet search, or similar features specific to media websites. 

Distributions are a great way to get started quickly and provide a solid foundation to build on top of.

Profiles, on the other hand, are subsets of distributions and are included in Drupal core. Drupal core comes with three installation profiles: Standard, Minimal, and Demo (Umami). 

Installation profiles determine the set of modules, themes, and configurations that are included in a distribution. 

For an e-commerce site, one could use the standard installation profile as a base and add additional modules and themes to customize the site.

Drupal vanilla is bare and lacks any pre-configured settings or features. Profile and Distribution are similar concepts in Drupal, but serve different purposes.

Enter recipes!

What are Drupal Recipes?


Recipes are a modular approach to site-building in Drupal. They are small use cases that can be easily combined or customized to create a unique solution. Recipes are like microservices that can be plugged in and played as needed.

Recipes are modular building blocks that allow developers to create custom site features quickly and efficiently.

A distribution is a use case that customizes Drupal to fulfill a specific need. Unlike distributions, recipes do not use installation profiles and can be tweaked at any point in the site-building process.

Installation profiles are part of Drupal core. These profiles determine which set of modules, themes, and configurations are installed during the initial setup of your Drupal site. 

The actual backend work of any distribution happens here, as installation profiles are responsible for setting up the initial site structure.

A Profile is a type of distribution that provides a more focused set of features for a specific use case. Profiles can be thought of as smaller, more specific distributions that cater to particular needs.

To illustrate the differences between these site-building methods, let's consider an example. Suppose you want to build a news website that includes features such as a feed, carousel banner, and facet search. You could use a pre-built news distribution that includes these features out of the box. 

However, if you need to make further customizations, you would need to modify the installation profile or distribution, which could be time-consuming and complicated.

Alternatively, you could use an installation profile such as Standard and then install the necessary modules manually. This approach provides more flexibility, but it requires more effort and expertise to set up. 

Finally, you could use a recipe approach and install each required module and configure them individually. This approach provides the most flexibility but requires the most effort to set up.

Steps to install Drupal recipe

[Image: steps to install a Drupal recipe]
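At the time of writing, the Recipes initiative was still unstable, but its draft format describes a recipe as a small package with a `recipe.yml` listing the modules to install and the configuration to import. A hypothetical sketch based on that draft format (keys and values are illustrative and may change before a stable release):

```yaml
# recipe.yml: hypothetical sketch of a "news site" recipe
name: 'News site'
description: 'Adds a feed, carousel banner, and facet search.'
type: 'Site'
recipes:
  # Other recipes this one builds on
  - core/recipes/standard
install:
  # Modules to enable
  - facets
  - views
config:
  # Configuration to import from the recipe's config directory
  import:
    facets: '*'
```

The draft tooling applies a recipe with a core script (roughly `php core/scripts/drupal recipe <path-to-recipe>`), though the exact command was still evolving when this was written.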


 

Why Recipes?

One of the primary objectives of the Recipes initiative is to overcome the challenges site maintainers and developers face with distributions, and to:

  • Allow users to install multiple Drupal recipes on the same project. Currently, selecting a distribution like OpenSocial prohibits installing another distribution like Commerce Kickstart or Thunder; with recipes, that limitation is eliminated.
     
  • Install a recipe at any point in a project's life cycle, which is currently not feasible. For instance, if a user wants to incorporate community collaboration tools in their site after a few years of using standard Drupal, they can do so without any impediment.
     
  • Simplify the process of maintaining the multisite architecture. This initiative aims to ensure that any changes made do not create additional challenges in this regard.
     
  • Make updating easier. Updating is currently challenging because every existing site is in a different state; to help, the Update Helper module developed by a few distributions will be integrated into core.
     
  • Make it easy for Drupal recipes to ship demo content. This is currently done in different ways, such as importing from CSV or using custom modules; functionality will be provided in core so that recipes can ship demo content.

What Drupal recipes are not

Drupal recipes do have certain limitations as well.

Wrapping Up

In conclusion, Drupal provides several site-building methods that allow users to create custom solutions to their specific needs. Profiles, distributions, and recipes are all powerful ways that can help you build your Drupal site efficiently and effectively. 

Drupal 10 recipes are an exciting addition to the Drupal ecosystem and will help make building websites and applications faster and more efficient than ever before. 

As a leading open-source community leader, OpenSense Labs has helped numerous enterprises transform their digital presence with our expert Drupal services and solutions. From custom Drupal development to UX design, we have the experience and expertise to help your organization succeed in the digital landscape.

Don't miss out on the opportunity to partner with a trusted and experienced team. Contact us today at [email protected] to learn more about how we can help you achieve your digital goals. 

Mar 23 2023

Creating a flawless product requires rigorous quality assurance and testing.

One of the preliminary testing techniques for understanding the fundamental behavioral output of a system or application is black box testing. It aims to assess how the application functions, including usability, response time, and reliability, and to categorize the results as expected or unexpected outcomes.

A powerful testing technique, it exercises a system end-to-end. 
This blog will help you understand black box testing in detail, including its various techniques and tools used. 

What is Black Box Testing?

Black box testing is a software testing technique where testers do not have access to the internal code or structure of the system being tested. Instead, testers focus on the software from the perspective of an end-user, testing for input/output behavior, usability, and software functionality. It helps to ensure that the software meets the user's requirements, and also helps to identify potential bugs and errors that could harm the functionality of the software. This type of testing is crucial in ensuring that software is reliable and good quality for end-users.

Let’s understand this with an example: suppose you are testing a website's search functionality. You know that users should be able to enter a search term and receive a list of results related to that term. You do not know how the search algorithm works, but you can test its functionality by entering different search terms and observing the results. 
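The search scenario above can be sketched as a black box test in Python. Here `search` is only a hypothetical stand-in for the real system under test; the tester asserts on observable input/output behavior and never inspects the internals:

```python
def search(catalog, term):
    """Stand-in system under test: returns catalog items matching the term."""
    return [item for item in catalog if term.lower() in item.lower()]

def test_search_black_box():
    catalog = ["Drupal Guide", "Testing Handbook", "Drupal Recipes"]
    # Black box view: only inputs and outputs are checked, never internals.
    assert search(catalog, "drupal") == ["Drupal Guide", "Drupal Recipes"]
    assert search(catalog, "python") == []    # no match yields an empty list
    assert search(catalog, "") == catalog     # empty term matches everything

test_search_black_box()
```

The same assertions would hold regardless of how the search algorithm is implemented internally, which is the essence of the black box approach.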

Black Box Functional Testing Technique

Black box testing has various techniques that are used to test the functionality of an application. It is important to understand the concepts of each of these techniques to determine which one is right for your project. Let’s take a look at some of the most commonly used black box testing techniques.

  • Equivalence Class Partitioning (ECP) Technique

Equivalence class partitioning (ECP) is a software testing technique that helps to identify test cases by dividing input data into equivalent classes. The goal of ECP is to reduce the number of test cases needed to achieve maximum test coverage while still providing effective testing.

The basic premise of ECP is that input data can be divided into different categories or classes based on their equivalence. For example, if a system accepts input values in the range of 50 to 90, input values can be divided into the following equivalence classes:

  • Valid input values - Input values within the range of 50 to 90 are considered valid input values and belong to this equivalence class.
  • Invalid input values - Input values outside the range of 50 to 90 are considered invalid input values and belong to this equivalence class.
  • Null input values - Input values that are empty or null are considered null input values and belong to this equivalence class.

    ECP Technique

By dividing input data into these equivalence classes, testers can identify a set of representative test cases that can effectively test the system. For example, a test case can be created for each equivalence class to ensure that the system handles each type of input correctly.

The Equivalence class represents or defines a set of valid or invalid states for each input after it has been validated.
The requirements and specifications of the software serve as the basis for this technique. Its benefit is that it shortens the testing period by reducing an infinite set of possible inputs to a finite number of test cases. It is appropriate for use at every stage of the testing process.

Let's look at one instance:

Let’s consider a feature of a software application that accepts a cellphone number with 10 digits.
Example of ECP Technique

Test case   | Condition    | Example input
Invalid 1   | DIGITS >= 11 | 98472622191
Invalid 2   | DIGITS <= 9  | 984543985
Invalid 3   | DIGITS = 10  | 9991456234
Valid       | DIGITS = 10  | 9893451483

With a 10-digit mobile number as an example, we can observe that there are both valid and invalid partitions. The valid partitions all behave the same way: they redirect to the next page.
In the example above, the two invalid partitions contain erroneous values of 9 or fewer digits and 11 or more digits. When these invalid values are applied, the invalid partitions behave similarly: they are forwarded to the error page.

The above example shows that there are only three test cases, which is consistent with the equivalence partitioning principle, which states that this technique aims to reduce the number of test cases. 
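The phone-number example above can be sketched in Python. `classify_phone` is a hypothetical validator standing in for the system under test; under ECP, one representative input per partition is enough to exercise each class:

```python
def classify_phone(number: str) -> str:
    """Partition a phone-number input per the article's example:
    valid = exactly 10 digits; invalid otherwise; null = empty input."""
    if not number:
        return "null"
    if not number.isdigit():
        return "invalid"
    if len(number) == 10:
        return "valid"
    return "invalid"  # covers the <= 9 and >= 11 digit partitions

# One representative test case per partition, mirroring the table above.
assert classify_phone("98472622191") == "invalid"  # 11 digits
assert classify_phone("984543985") == "invalid"    # 9 digits
assert classify_phone("9893451483") == "valid"     # 10 digits
assert classify_phone("") == "null"
```

Adding more inputs from the same partition (say, another 9-digit number) would not increase coverage, which is why ECP keeps the test suite small.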

Benefits of the ECP Testing Technique

There are several benefits to using the ECP testing technique:

  • Increased accuracy: ECP can detect errors that might be missed by other testing techniques, increasing the overall accuracy of the testing process.
  • Easy to implement: The ECP testing technique is not difficult to implement, and it can be used with a variety of platforms and software.
  • Improved efficiency: ECP can save time and effort by quickly identifying invalid input and reducing the need for manual testing.
  • Cost-effective: As compared to other testing methods, ECP is a cost-effective solution for software testing.
  • Reduction of production issues: ECP testing helps to identify issues early on in the software development process, reducing production issues and making it easier to fix problems before they become costly mistakes.

Overall, the ECP testing technique is a powerful tool for detecting errors and improving software quality.

  • Boundary Value Analysis Technique 

Boundary value analysis is a software testing technique that focuses on testing the input values at the boundary or edge of the acceptable input range for a system or application. It is a type of black box testing that helps to identify errors or defects in the software that might be caused by boundary conditions.
The basic premise of boundary value analysis is that errors often occur at the extreme boundaries of the input values, rather than in the middle of the input range. By testing these boundary values, testers can identify potential errors and improve the quality of the software.

For example, let's consider a system that accepts input values in the range of 1 to 100. To perform boundary value analysis on this system, the tester would focus on testing the following input values:

  • Minimum value- Testing the input value of 1, which is the minimum value in the acceptable range, helps to ensure that the system handles the smallest input value correctly.
  • Maximum value- Testing the input value of 100, which is the maximum value in the acceptable range, helps to ensure that the system handles the largest input value correctly.
  • Values just below the minimum- Testing input values just below the minimum value, such as 0 or -1, helps to ensure that the system handles values outside the acceptable range correctly and provides appropriate error messages.
  • Values just above the maximum- Testing input values just above the maximum value, such as 101 or 1000, helps to ensure that the system handles values outside the acceptable range correctly and provides appropriate error messages.
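The boundary values listed above can be generated mechanically. In this Python sketch, `accepts` is a hypothetical stand-in for a system that allows the inclusive range 1 to 100:

```python
def boundary_values(low: int, high: int):
    """Return the classic boundary-value test inputs for an inclusive range."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]

def accepts(value: int) -> bool:
    """Hypothetical system under test: accepts values from 1 to 100 inclusive."""
    return 1 <= value <= 100

cases = boundary_values(1, 100)            # [0, 1, 2, 99, 100, 101]
results = {v: accepts(v) for v in cases}

# The edges of the range and the values just outside it are the
# inputs most likely to expose off-by-one defects.
assert results == {0: False, 1: True, 2: True, 99: True, 100: True, 101: False}
```

A common defect this catches is an implementation written with `<` instead of `<=`, which would fail exactly at the boundary value 100.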
     
  • Decision Table Technique

Decision table testing is a software testing approach used to examine how the system responds to various combinations of inputs. This methodical technique tabulates the different input combinations and the resulting system behavior.

Because the decision table records causes and their effects for thorough test coverage, it is also known as a cause-effect table. Decision table testing is frequently used to test two or more inputs that have a logical relationship.

There are multiple rules in the table for a single decision. A decision table's rules can be created by simply inserting AND between conditions.

In the example below, you will see how different input combinations produce different results. Here “AND” is denoted by the circumflex sign (^), Y stands for “Yes”, and N stands for “No”. R1 to R4 stand for different rules with certain inputs and outputs. 

The following are the major rules that can be extracted from the table:

  • R1 = If (working-day = Y) ^ (holiday = N) ^ (Rainy-day = Y) Then, Go to the office. 
  • R2 = If (working-day = Y) ^ (holiday = N) ^ (Rainy-day = N) Then, Go to the office.
  • R3 = If (working-day = N) ^ (holiday = Y) ^ (Rainy-day = Y) Then, Watch TV. 
  • R4 = If (working-day = N) ^ (holiday = Y) ^ (Rainy-day = N) Then, Go to picnic.

As per the graph below, there is no need to check the rainy-day condition in R1 and R2. If the day is a working day, whether it is sunny or rainy, the decision is to go to the office.
Decision table Technique Example

Example of Decision Table Technique
Whether Outlook = Rainy or Outlook = Sunny, the decision is the same. The following rules are the optimized versions of the previous rules R1 and R2. 

  • R1 optimized: If (Day = Working) Then Go To Office 
  • R2 optimized: If (Day = Working) Then Go To Office 

The refinement/optimization step produces rules that are effective, efficient, and accurate.
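The decision table above translates naturally into a lookup structure. This Python sketch encodes rules R1 to R4 exactly as tabulated, so each rule becomes one test case:

```python
# Each rule maps (working_day, holiday, rainy_day) to a decision,
# mirroring rules R1-R4 from the table above.
DECISION_TABLE = {
    (True,  False, True):  "Go to the office",   # R1
    (True,  False, False): "Go to the office",   # R2
    (False, True,  True):  "Watch TV",           # R3
    (False, True,  False): "Go to picnic",       # R4
}

def decide(working_day: bool, holiday: bool, rainy_day: bool) -> str:
    return DECISION_TABLE[(working_day, holiday, rainy_day)]

# The optimization noted above: on a working day the weather is irrelevant,
# so R1 and R2 collapse into a single rule.
assert decide(True, False, True) == decide(True, False, False) == "Go to the office"
assert decide(False, True, True) == "Watch TV"
assert decide(False, True, False) == "Go to picnic"
```

Keeping the table as data rather than nested if-statements makes it easy to verify that every input combination from the specification is covered.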

  • State Transition Table Technique 

State transition testing is used when some aspect of the system can be described using a 'finite state machine'.

This simply means that the system can exist in a finite number of states, and the transitions between them are determined by the machine's rules. This is the model on which the system and tests are based. A finite state system is any system in which the output varies depending on what has happened previously. A state diagram is a common representation of a finite state system.

If you ask for ₹100 from a bank ATM, you will be given cash. You may later make the same request but be denied the funds (because your balance is insufficient).

This refusal is due to the fact that the balance in your bank account has dropped from sufficient to cover the withdrawal to insufficient. The earlier withdrawal is most likely what caused your account to change state.

A state diagram can depict a model from the perspective of the system, an account, or a customer.

A state transition model is made up of four basic components:

  • the states that the software may occupy (open/closed or funded/insufficient funds)
  • the transitions from one state to another (not all transitions are allowed)
  • the events that cause a transition (closing a file or withdrawing money)
  • the actions that result from a transition (an error message or being given your cash)

It is important to note that in any given state, one event can only cause one action, but the same event from a different state can cause a different action and a different end state.

An example of entering a Personal Identification Number (PIN) into a bank account is shown above.

The states are represented by circles, transitions by lines with arrows, and events by text near the transitions.

The state diagram depicts seven states but only four events (card inserted, enter a PIN, valid PIN, and invalid PIN).

There would also be a return from the 'Eat card' state to the initial state. 

There would be a 'cancel' option from 'wait for PIN' and the three tries, which would also reset the card to its initial state and eject it. The 'access account' state would mark the start of a new state diagram displaying the valid transactions that could now be performed on the account.
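The PIN scenario can be sketched as a small finite state machine in Python. The state and event names here are assumptions modeled on the description above (three PIN attempts, then the card is eaten):

```python
# Transitions are (current_state, event) -> next_state, following the
# bank-card PIN example: three tries, then the machine eats the card.
TRANSITIONS = {
    ("start", "card inserted"):   "wait for PIN",
    ("wait for PIN", "valid PIN"):   "access account",
    ("wait for PIN", "invalid PIN"): "2nd try",
    ("2nd try", "valid PIN"):        "access account",
    ("2nd try", "invalid PIN"):      "3rd try",
    ("3rd try", "valid PIN"):        "access account",
    ("3rd try", "invalid PIN"):      "eat card",
}

def run(events):
    """Feed a sequence of events through the machine and return the end state."""
    state = "start"
    for event in events:
        state = TRANSITIONS[(state, event)]
    return state

# The same event ("invalid PIN") from different states leads to different
# end states, which is exactly what state transition testing exercises.
assert run(["card inserted", "invalid PIN", "valid PIN"]) == "access account"
assert run(["card inserted", "invalid PIN", "invalid PIN", "invalid PIN"]) == "eat card"
```

Test cases are then chosen to cover every transition at least once, including the failure paths that only appear after a particular history of events.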

  • Use Case Testing Technique

Use case testing is a functional black box technique used to identify test cases that cover the system from beginning to end, based on how the system is used. Using this technique, the team creates a test scenario that can exercise the entire software based on the performance of each function from start to finish.

It is a graphical representation of business requirements that describes how the end user will interact with the software or application. The use cases provide us with all of the possible techniques for how the end-user will use the application, as shown in the image below:

The image above shows a sample of a use case with a condition related to the customer requirement specification (CRS).

We have six different features for the software's module P.

And in this case, the Admin has access to all six features, the paid user has access to three features, and the Free user has no access to any of the features.

As with Admin, the various conditions are as follows:

  • Pre-condition→ Admin must be generated
  • Action→ Login as a paid user
  • Post-condition→ 3 features must be present

And for free users, the conditions would be as follows:

  • Pre-condition→ A free user must be generated
  • Action→ Login as a free user
  • Post-condition→ No features
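The role conditions above can be sketched as a simple access map in Python. The feature names F1 to F6 are placeholders, since the six features of module P are not named in the text:

```python
# Hypothetical sketch of the use case's pre/post-conditions: module P has
# six features; Admin sees all six, a paid user three, a free user none.
FEATURES = ["F1", "F2", "F3", "F4", "F5", "F6"]   # placeholder feature names

ACCESS = {
    "admin": FEATURES,
    "paid":  FEATURES[:3],
    "free":  [],
}

def visible_features(role: str):
    """Return the features a logged-in user of the given role can access."""
    return ACCESS[role]

# Post-conditions from the use case:
assert len(visible_features("admin")) == 6   # Admin sees all six features
assert len(visible_features("paid")) == 3    # paid user sees three
assert visible_features("free") == []        # free user sees none
```

Each assertion corresponds to one end-to-end scenario the testing team would derive from the use case: log in as the role, then verify the post-condition.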

Who writes the use case?

The client supplies the customer requirement specification (CRS) for the application, and the development team drafts the use case in accordance with the CRS and then sends the use case to the client for review.
Explains Software Development Lifecycle

After the client’s approval, developers design and code the software, and the testing team writes test plans, and test cases for various software features. 

Benefits of test design techniques

There are several benefits of test design techniques. Let’s discuss them briefly. 

  • Efficient use of time and resources: Test design techniques help testers to identify the most important and relevant test cases that need to be executed. This makes the testing process more efficient and saves time and resources.
  • Improved test coverage: By using various test design techniques, testers can ensure that all the important features and functionality of the software are thoroughly tested. This improves test coverage and reduces the likelihood of defects being missed.
  • Better defect detection: Test design techniques help testers to identify potential defects early in the testing process. This allows developers to fix the issues before they become more difficult and costly to resolve.
  • Increased test effectiveness: Test design techniques allow testers to design tests that are more effective in identifying defects. This leads to higher-quality software and improved customer satisfaction.
  • Consistent testing: Test design techniques provide a structured approach to designing tests that ensure that each test is executed consistently.

Black Box Testing Vs White Box Testing

While both black box and white box testing help ensure a flawless end product, it's important to understand the underlying differences between the two. 

Black Box Testing | White Box Testing
Performed by software testers | Performed by software developers
Software implementation knowledge is not required | Software implementation knowledge is required
Treats the software as a black box: the tester focuses only on the software's functionality, not its internal structure, code, or design | Tests the software's internal code, design, and architecture
Coding knowledge is not necessary | Coding knowledge is a must
Tests the software from the end user's perspective | Focuses on testing the entire system, not just the user-facing side

To sum up 

Black box testing is used to find errors in the system without peering into the actual code. As mentioned above, it’s an efficient way to test larger code segments. This type of testing is often used to verify the quality and reliability of a system or application, by focusing on the user’s view of the system. 

With emerging technological trends you need a partner that makes sure your website is innovative and user-friendly. At OpenSenseLabs, we help enterprises provide a better digital experience. Contact us at [email protected] and let our experts help you out.

Feb 24 2023

At the beginning of the project, we need to understand the requirements of the client, then we adapt the design, update the components and after that, integrate it into Drupal.  

Here is an example to understand how we apply this solution and integrate it:

For example, if we have a card (web component molecule) that we want to integrate into Drupal, we need to figure out what information we want to show. Let's imagine that we want to display some short information about a blog content type. 

We can have different “types” of cards, for example, with an image, link, date, and/or all the data you want to show. In this case, we want to use a card that has a title, an author, and a link to read the whole content. In the blog content type, we will use these “fields” inside the card: for the card's title, for example, we will use the “label” and call it inside the template. And so on with all the information we want to show.

This is how it is finally displayed:

Feb 10 2023

Use the same language for clear communication between teams 

Clear communication in design is important. We need to talk and speak the same language, the same structure and nomenclature. Here at 1xINTERNET, we have set up a consistent Design System (1xDXP) that forms the foundation of all our projects. Developers always use the same structure and the same naming conventions for their components.

A year ago, we started to use a design template in Figma, our main collaborative design tool in 1xINTERNET. We use the same structure and naming conventions for all the components that we are going to use in the new project. We have defined  FOUNDATIONS - COMPONENTS - PATTERNS - REGIONS and PAGE EXAMPLES.

The structure and the way we use this template are the same across all of our projects, and the basic elements like cards, forms, and headings have been defined by the design team for our developers. So when a new project starts, we can define the foundations based on the brand identity of the client, meaning that we can very quickly provide a first draft of the project aligned with the company brand.

For the design team, it is better to invest more effort at the beginning in defining the functionalities and solutions that solve the client's bigger goals than to spend time redefining simple components or patterns. This type of task can be left for the second round of development, when the client has already approved the major part of the design and is happy with the progress. 

We understand that we need to work quickly and that clients need to see solutions. At this stage we are designing ideas, not the pixel-perfect final result. Spending time on the overall design and main features is more important than working on the basic issues.

Jan 27 2023

Drupal 10 was released in December 2022, and ever since, the community has been encouraging users to migrate to the latest version. 

As many as 54% of all Drupal sites are running on Drupal 7. 

Using an outdated version has downsides. Businesses miss out on technological advancements and new features that can speed up and safeguard their digital properties.

With the release of Drupal 10 and the fact that Drupal 7 will reach end of life in November 2023, it is crucial to migrate to Drupal 10 soon. Here’s why enterprises should plan their Drupal 10 migration now, and not wait until the last moment. 

Why should you migrate from Drupal 7 to Drupal 10? 

Drupal 10 brings automated updates, improved user experience, along with several other feature additions. These components will be more secure, user-friendly, and powerful. Let’s dive deep into why enterprises must plan their Drupal 7 to 10 migration. 

  1. Community support: As an open-source CMS, community support is what keeps Drupal's continuous innovation going. With the Drupal community prioritizing and actively focusing on the security of newer versions, once Drupal 7 reaches end of life, its community support will cease. This primarily jeopardizes the security of your Drupal 7 website. It also means that the contributed modules and themes currently used on your Drupal 7 website will lose maintenance support, bringing challenges in website maintenance.
     
  2. New features and upgrades: Another consequence of not upgrading to the latest version is that certain functionalities may cease to perform as intended, or better alternatives may become available. Not only can this cause extra annoyance among website maintainers, but resolving these issues may incur additional expenditure for your company owing to the time and resources required. While Drupal 7 developers had to upgrade manually and search for modules on drupal.org, Drupal 10 has simplified this with Automatic Updates and the Project Browser, respectively. Many Drupal 7 features are either incorporated out of the box in Drupal 10 or simply removed to maintain ease of use. 
    • Automatic updates: Drupal sites require manual upgrading, which may be challenging, time-consuming, and expensive. Delays in applying security upgrades can lead to hacked sites. The Automated Updates Initiative seeks to give Drupal sites safe, secure automatic updates. In order to minimize the total cost of ownership of maintaining a Drupal site, increase the security of Drupal sites in the wild, and lower the entry barrier for utilizing Drupal, a safe mechanism for automatically deploying updates in Drupal is to be implemented.
       
    • Project browser: Module installation and locating involve too many steps. Some steps call for you to navigate away from your Drupal site and visit Drupal.org. Other procedures, like utilizing Composer from the command line, need technical knowledge. For users who are new to Drupal and site builders, project browser aims to make it simpler to identify and install modules. This eliminates the need to visit Drupal.org or other sites. It is one of the primary Drupal strategic projects that determines the platform's development goals.
       
    • New themes: Olivero & Claro - The Drupal 7 "Seven" theme from 2009 gave off an out-of-date system impression. Seven was replaced by the new "Claro" theme, which was created in accordance with the most recent requirements. The front-end theme, "Olivero," was created to fit with features that are well-liked by users, such as the Layout Builder. The Olivero theme will be WCAG AA compliant.
      The simple finding and installation of modules should empower Drupal newcomers as well as "ambitious site builders" – Dries Buytaert
       
  3. Technical Dependencies: Drupal works on currently supported PHP versions, and recommended PHP versions are the best choice for building a Drupal site because they will remain supported longer. Drupal 10 requires PHP 8.1 or later, while Drupal 7 is built on PHP 7, which is also reaching end of life. This creates technical dependencies in supporting the platform.
     
    • jQuery, jQuery UI, jQuery Forms - Drupal 7 includes old and unsupported versions of these libraries: jQuery's current version is 3.6.x, while Drupal 7 ships 1.4.2, and the other libraries face comparable challenges. You can mitigate this somewhat with the jQuery Update module, although the most recent version it provides is 2.1.x. Drupal 8 and later (like many other content management systems) make it simple to provide API access to your content. In the age of "publish everywhere," this is a critical feature. Drupal 7 has some basic API support, but if you want a full-fledged API with write support, you'll have to create it yourself, which adds technical debt and possible vulnerabilities.
       
    • CKEditor 5 update from CKEditor 4 - With a thorough rebuild and an exciting new feature set, CKEditor 5 gives Drupal 10 a modern, collaborative editor experience. Users of programs like Microsoft Word or Google Docs will be familiar with the new CKEditor's interface. It also offers common collaboration tools like comments, change suggestions, version histories, and other accepted editing practices. Additionally, it outputs to .docx and .pdf files for straightforward conversion to print formats. Although the integration is still being worked on, Drupal core 9.3 already includes an experimental version if you want to try it out.
       
    • Composer 2 and PHP 8 support - Although Composer 2 was successfully backported to Drupal 8, PHP 8 compatibility was not. Drupal 10 requires PHP 8 because PHP 7 reached end of life in November 2022.
       
  4. Modules that have gone out of support: The Drupal 10 core will be updated to eliminate a few modules that are redundant or not frequently used. For uniformity, these modules will be moved to the contributed module area:
    • Aggregator - Gathers and presents syndicated content from outside sources (RSS, RDF, and Atom feeds).
    • QuickEdit - In-place content editing.
    • HAL - Serializes entities using the Hypertext Application Language.
    • Activity Tracker - Lets users keep track of recent content.
    • RDF - Enhances websites with metadata so that other systems may comprehend their characteristics.

You will have to leave Drupal 7 behind eventually. The opportunity cost of continuing to use software that is more than 10 years old is substantial, and once it reaches end of life (EOL), the risk and expense of an unpatched vulnerability increase rapidly.

There are several possibilities available to you, and you have an additional year to choose and plan for one of them. The ideal option will depend on the expertise level of your team, the amount of business logic you have built into Drupal 7, and your projected budget.

Conclusion 

Drupal 7 will reach end of life in November 2023. Drupal experts recommend that organizations begin migrating to Drupal 10 soon rather than waiting until November 2023.

If you want to migrate your website to Drupal 9 and prepare for Drupal 10, you may rely on our Drupal migration skills and expertise.

OpenSense Labs, as a Drupal partner, is committed to providing active support. Contact us at [email protected] for a long-term and fruitful collaboration.

Jan 20 2023

It goes without saying that (software) upgrades improve the overall performance of websites. Drupal is no exception. 

Regular updates to the Drupal core benefit not just the site owners in terms of security but also help deliver better user experience. 
 
Businesses should regularly update their websites to make them faster, secure, and easier to use.

Here’s why upgrading your Drupal website is crucial.

Why upgrading your website is important

Below are a few reasons you should prioritize upgrading your website.

  1. Security: Between November 2020 and October 2021, 5,212 organizations worldwide experienced data breaches (source: Statista). Delays before security updates are applied to a site can result in compromised sites, as seen in Drupalgeddon.

    Acquia is known to have observed more than 100,000 attacks a day.

    The scale and severity of Drupalgeddon bring to the fore the importance of keeping websites updated on time. When enterprises fail to upgrade their sites on time, the chances of being compromised are very high.

    Websites that do not receive security upgrades are vulnerable to hacker attacks. The Drupal Security Team issues security announcements for highly critical vulnerabilities in core and contributed modules, requiring that available upgrades be applied as soon as feasible. 
     

  2. Support & Maintenance: Community support is what enables Drupal's continual evolution as an open source CMS. The community support for Drupal 7 will wane as it approaches end of life since the Drupal community is actively prioritizing and concentrating on the security of subsequent versions. 
     
  3. Improved design and cost-effectiveness: The design of your website accounts for 94% of consumers' first impressions. So, if your website design is unappealing or unpleasant to users, your visitors will bounce off. Further, 38% of website visitors do not engage with an unattractive website, and design alone accounts for 73% of your website's trustworthiness. An outdated website can deter potential customers from utilizing your services or purchasing your goods; it might no longer be a symbol of excellence. So an upgrade to the design is absolutely needed, and a revamp every year or two is advisable.
     
  4. New Functionalities: Another consequence of failing to apply timely upgrades is that certain functionalities may cease to perform as intended. Not only can this cause extra annoyance among website maintainers, but resolving these issues may incur additional expenditure for your company owing to the time and resources required to do so. 
     
  5. Technology Benefit: Technology has advanced, enabling us to forgo conventional, cumbersome JavaScript in favor of more user-friendly and feature-rich libraries (jQuery, Prototype, etc.) that significantly enhance the user experience, along with effects like rounded corners and shadows. HTML5 and CSS3 can significantly improve websites.
     
  6. Improved Speed: Your website has to load as quickly as possible, since Google now considers page load time a ranking factor. Your older website might gain a speed boost, and some additional Google points, from new technologies and techniques for speeding up websites: images should be optimized, compression should be enabled on the server, web pages should be cached, and CSS and JavaScript should be minified.

Drupal for a better digital experience

Drupal is popular among enterprises because of its flexibility, modularity, and authoring experience. Drupal also provides several more perks and advantages that make it one of the top CMS.

Here are some advantages that businesses might gain by developing a website using a Drupal-based platform:

  • Being an open-source platform, Drupal has strong community support that makes website upgrade/migration for consistent branding simple.
  • Drupal, as an enterprise CMS, provides a full range of functionality, including multisite management, themes, SEO, content control, and connectors.
  • It lets companies deliver digital experiences consistently and uniformly across all channels of the customer journey.

Even though we are aware of the numerous advantages that Drupal provides, and of the features that draw more companies, marketers, and even digital agencies to this CMS, there are a number of factors that set Drupal CMS apart.

  • Content Presentation With A Headless Architecture
  • Personalization With Machine Learning And Predictive UX
  • Chatbots To Drive The Business Value
  • Exploring Markets With Augmented Reality (AR) And Virtual Reality (VR)

New Drupal Upgrades since Drupal 7

As of right now, Drupal 10 has just launched, and Drupal 7 is approaching its end of life. It is advised that companies using Drupal 7 start preparing for their migration to Drupal 10 immediately. Some of the features that set Drupal 10 apart as a unique version, with brand-new feature additions and feature updates, are listed below.

  1. Automatic updates - The goal of Drupal's Automatic Updates is to address some of the most challenging usability issues that arise when managing Drupal websites. Updates to the production, development, and staging environments are included, and certain integrations with the current CI/CD procedures are also necessary.
  2. Project browser - The Project Browser simplifies module discovery for site builders. When you pick a module, you will be given instructions on how to install it on your site. This browser is embedded into the Drupal site, so you don't have to leave it to search for modules.
  3. jQuery, jQuery UI, jQuery Forms - Drupal 10 runs on PHP 8 and drops its dependency on jQuery UI, which is no longer actively maintained. Furthermore, Internet Explorer 11 is not supported by Drupal 10. Modern JavaScript components may eventually take over the remaining jQuery usage.
  4. New themes: Olivero & Claro - The "Seven" admin theme from 2009 made Drupal 7 feel like an out-of-date system. Seven has been replaced by the new "Claro" theme, created in accordance with the most recent requirements. The new front-end theme, "Olivero," was created to work well with features that are popular with users, such as the Layout Builder. The Olivero theme is WCAG AA compliant.
  5. CKEditor 5 update from CKEditor 4 - Another outstanding upgrade in Drupal 10 is the new WYSIWYG editor. It is challenging to characterize it as just an upgrade of CKEditor from version 4 to version 5 because all the code was written from scratch.

This is not all; here are some more insights on Drupal 10 features and modules.

Conclusion 

Drupal's modular design and ready-to-use configurations offer quick market entry and the capacity to keep up with technological advancement. One of the top technologies reshaping the IT sector is Drupal, which gives organizations the adaptability and scalability to develop while keeping in mind the needs and preferences of their users.

As Drupal 10 is here and most businesses are planning their migration, we at OpenSenseLabs assist businesses in offering a superior digital experience. Email us at [email protected] so that our Drupal experts can assist you.

Jan 09 2023
Jan 09

The significance of SEO in digital business is no secret. Organizations invest thousands of dollars just to be at the top of the search engine result pages. Despite several tectonic shifts in consumer behavior, organic search still delivers the most traffic. More on that later.

SEO is more than just adding keywords.

An important part of doing SEO right is ensuring your technical SEO is set right. 

Organizations whose websites are built on a mature CMS like Drupal have to worry less about technical SEO: many of Google's identified site- and page-level best practices are ensured out of the box.

In this article, we’ll dive deep to understand different Google ranking factors and how Drupal ensures you stay on top of those trends.
 

SEO Objectives

Organic search remains the dominant traffic source, since budget constraints keep many marketers from opting for Google Ads for more visibility on SERPs.

Best SEO practices not only ensure higher ranking but are also budget-friendly. 

According to research, 61% of Google traffic comes from organic searches. Before we move to understand SEO objectives, here's a look at organic SEO trends. 

Organic SEO trends

Some of your SEO goals can be: 

  • Brand Visibility: It simply means the rate at which your brand is visible to the target audience. SEO immensely helps enterprises to improve their brand visibility.

    The more people are aware of your brand online, the better your chances of a higher conversion rate. In this digital age, where people consume most of their information online, not having a solid online presence is simply not an option.

    Content marketing notably has a great role to play in brand awareness and when it gets infused with SEO it has the potential to make a significant transformation. 
     

  • Better ROI: Good brand visibility paves the way to high ROI. If you use SEO strategically then your conversions will increase. On-page SEO is an effective strategy to rank higher on search engines and drive more organic traffic to your website.

    The more people visit your website, the more likely you are to get better conversions. 
     

  • Organic Traffic: Despite so many shifts in consumer behavior, people still mostly use organic search to find any product or service.

    According to content marketing statistics 2022, quality content with optimized images can increase organic traffic by over 111.3%. On-page SEO addresses all of the tactics necessary for improved organic traffic.

    Improved technical aspects of your website contribute to a better user experience and easier crawling, and thus to increased organic traffic.

Let's understand some Drupal SEO modules that you can use to make your website SEO friendly and rank higher on search engines.
 

Site Level Factors & Modules

  1. Advanced CSS/JS Aggregation: Slow website load times directly affect search engine rankings. Advanced CSS/JS Aggregation is a must-have module for websites with a lot of CSS/JS files: it compresses your front-end files and speeds up your website. 
  2. CDN: A content delivery network is a distributed system of servers deployed in multiple data centers around the world. Website performance directly affects your search engine ranking and user experience. A website that takes an eternity to load will eventually lose most of its visitors. A CDN will distribute a load of your website across servers globally to reduce load time. This module will help the website in loading faster and thus improves user experience and website performance.   
  3. AMP: AMP, also known as Accelerated Mobile Pages, is an open-source framework launched as a joint initiative of Google and several other tech companies. Accelerated mobile pages are lightweight and designed to give mobile users a lightning-fast and more engaging experience. This module converts web pages to AMP standards and helps them load faster on mobiles and tablets. 
     
  4. Search 404: A 404 error occurs when the content at a URL has been moved or deleted. It is considered bad for SEO, as it hampers the user experience, and search engines may penalize such websites. When a web page is deleted or moved, the Search 404 module performs a search on the moved or removed URL and shows the search results instead of a 404 error. This improves the site's SEO by making sure inactive URLs don't hamper the user experience, delivering content relevant to the user's query. 
  5. Blazy: Lazy loading is a technique of deferring the loading of particular parts of a website, especially images, until they are needed. The Blazy module provides lazy image loading to save bandwidth and help reduce the bounce rate, letting important information load before the rest of the page. 
  6. Cloudflare: Cloudflare claims this module can help your website load 30% faster, use 60% less bandwidth, and process 65% fewer requests. Cloudflare is a global network designed to make everything you connect secure, fast, private, and reliable. Fast content delivery and improved SEO are some more of its features. 
  7. XML Sitemap: In the simplest terms, a sitemap is an XML file that contains all your website URLs along with additional metadata about each URL. XML Sitemap creates an efficient sitemap of your website that is easy to crawl and automatically gets submitted to search engines like Google, Bing, and Yahoo. Sitemap links pertaining to content, menu items, taxonomy terms, and user profiles can also be added. 
     
  8. Simple XML Sitemap: This module helps your website create sitemaps according to Google's guidelines and policies, which helps your website get indexed faster on any search engine. It creates multilingual sitemaps for entities, views, and custom links, and it supports all of Drupal's content entities, i.e. nodes, taxonomy terms, users, and menu links. 
     
  9. Robots txt: A robots.txt file tells search engine crawlers which URLs the crawler can access on your site. Use this module if you are operating numerous Drupal sites from a single code base and each one requires a distinct robots.txt file. This module produces the robots.txt file automatically and allows you to change it from the web UI on a per-site basis.
     
  10. Schema.org Usage: Pages with structured data tend to rank higher on SERPs. Schema markup helps search engines understand your content easily. This module specifies the mapping between content types and fields and schema.org types and properties. 
  11. Menu breadcrumbs: A website with a large amount of content benefits immensely from the Menu Breadcrumb module, as it builds a navigational path that shows visitors exactly where they are on the website, enhancing the user experience.

    Snapshot Breadcrumb
     

  12. Easy breadcrumb: Navigational design of a website describes how different pages of a website are organized and connected to each other. When a site is easy to navigate it increases visit duration. The easy breadcrumbs module helps in creating an accessible navigational website structure that is easy to navigate. 

  13. Link checker: This module is useful for detecting broken links in your content. Both internal and external links can be checked via the Link checker module. It periodically checks remote sites and evaluates HTTP response codes to find broken links in your stored content. Broken links can easily be reviewed in the log/report section, and author-specific broken links are listed in the "my account" section. 

  14. Menu attributes:  As the name suggests, this module helps you add extra attributes to your menu section like class, ID, name, style, relationship, and target. A well-organized menu helps users’ website accessibility and experience. 

  15. Pathauto: User-friendly URLs are extremely vital for boosting your SEO efforts. The Pathauto module automatically generates URL aliases based on your content. It is a must-have module for making any website SEO-friendly. 

  16. Metatag: Search engines usually prefer websites with optimized content and meta descriptions. The Metatag module creates structured meta tags for your website so that it ranks higher on SERPs. In addition, this module provides support for meta tags used by social media sites, letting you control how your content appears on different platforms. 

  17. Hreflang: If you are not sure what hreflang is: it is an HTML attribute used to specify the language and geographical targeting of a webpage. When you target different geographies, your website should be multilingual, and this module is best for websites with multilingual content. The Hreflang module automatically adds hreflang tags for your website's enabled languages. 

    A multilingual website attracts a larger audience, improves the brand image, and has an edge over competitors. With only 25% of users being native English speakers, it's a great opportunity to reach out to the remaining 75%. 

  18. External Hreflang: This module allows you to add hreflang tags for pages hosted on external domains. Keep in mind that external links pointing to spammy websites degrade the user experience and can damage your brand presence and online ranking. 

  19. Redirect: Directing visitors and bots from one URL to another is called redirection. The key benefits of redirection are to improve user experience and help search engines better understand website content. The redirect module helps in merging websites, moving domains, deleting existing pages, and switching from HTTP to HTTPS. 

    301, 302, and JavaScript redirects inform the web browser that a page has moved from one URL to another, letting visitors and bots land on the right page.

  20. Domain 301 Redirect: A 301 redirect sends all requests for an old URL to a new one. Domain 301 Redirect helps redirect users to relevant pages, and page relevance congruent with the user's query helps ranking. This module allows sites to 301-redirect to a domain that has been selected as the main domain. 

  21. W3C Validator: HTML errors or poorly coded websites can hamper site quality, and bad site quality consequently damages your website's SEO. The W3C validator reduces the code size on your web pages and adds value to your content; smaller code improves your website's loading speed on diverse platforms. It validates your HTML and CSS code against international coding standards.
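
For reference, the sitemaps generated by the XML Sitemap modules above follow the standard sitemaps.org protocol; a minimal sitemap (the URL and dates here are placeholders) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page on the site -->
  <url>
    <loc>https://www.example.com/blog/drupal-seo</loc>
    <lastmod>2023-01-09</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only `<loc>` is required; `<lastmod>`, `<changefreq>`, and `<priority>` are optional hints for crawlers.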
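
The hreflang tags added by the Hreflang modules above are `<link>` elements in the page head. For a page available in English and German they might look like this (domain and paths are placeholders):

```html
<!-- Each language variant of the page declares all of its alternates -->
<link rel="alternate" hreflang="en" href="https://www.example.com/en/page" />
<link rel="alternate" hreflang="de" href="https://www.example.com/de/page" />
<!-- Fallback for users whose language is not listed -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/page" />
```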

Page level factors 

  1. CKEditor: A rich text editor that helps users write or edit content inside the webpage. It has a nice writing interface where you can easily format and style content, adding headings, subheadings, numbering, and bullets for better readability. 
  2. Disqus: Pages with lots of comments may be a signal of user interaction and quality. In fact, one Googler said comments can help "a lot" with rankings. Enabling comments on a blog has multiple benefits, including the possibility to engage with your readers and grow a strong community. The Disqus module helps you integrate a comment feature on your website so readers can better engage with your content.

    Snapshot-Benefits of comments in website
     

  3. Scheduler: This module helps content writers plan and publish content as per their requirements. The Scheduler module makes your overall publishing experience hassle-free and supports both Drupal 7 and 9. 

    Scheduling lets you reach your target audience at the times they are online, whatever those times may be, which is a huge benefit in content marketing. 
     

  4. Automatic Alternative Text: Alt text is basically an alternative text for images. The whole purpose of alt text is to improve accessibility by describing what an image looks like to the user who doesn’t have the ability to see it. 

    SEO benefits of Alt-text

    This module automatically generates alt text for your images if you missed adding one. Interestingly, Google prioritizes images with meaningful alt text, which can help you rank higher on Google.  
     

  5. ImageAPI Optimize: A blog with optimized images has always been the prime factor for higher ranking. Search engines prioritize images that have a proper title, description, alt text, file name, and caption. This module allows you to compress images without compromising image quality with flawless image optimization.
     
  6. ImageMagick: This is an open-source software suite that helps edit, display, and convert picture elements and vector files. It’s also helpful in creating image thumbnails, color correction, and liquid rescaling which basically means rescaling the image without distortion. 
     
  7. CKEditor Nofollow: A nofollow link doesn't pass authority to the website it links to. You can make a link nofollow by adding rel="nofollow" to it. This module requires no modules outside of Drupal core and is used to add rel="nofollow" to links using the CKEditor widget.
     
  8. Linkit: Internal links connect web pages within your site to each other and can increase the ranking of other pages on your website. They help both readers and crawlers easily find your content. Linkit gives you a nice interface for searching for and linking to internal content on your website, instead of you having to look up URLs manually. Within the editor, you can select which URL you want to interlink. 
     
  9. Editor Advanced Link: This module helps you convert normal hyperlinks into styled buttons, giving content creators and editors more control over link appearance and functionality. 
     
  10. CKEditor Entity Link: Drupal entities (content, tag, files) can be easily linked with the help of CKEditor Entity Link. This module is compatible with Drupal 9 and with all content entity types. It allows you to choose which entity types and bundles to search for and also provides an autocomplete box to make entity selection easier.
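
To illustrate the rel="nofollow" attribute mentioned above, here is what a nofollow link looks like in the rendered HTML (the URL is a placeholder):

```html
<!-- Search engines are told not to pass ranking authority through this link -->
<a href="https://external-site.example" rel="nofollow">External site</a>
```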

Drupal SEO Checklist Modules 

  1. SEO Checklist: This module is compatible with Drupal 8/9. As the name suggests Drupal SEO checklist module creates an automated SEO optimization checklist so that you can optimize your website hassle-free. If you are someone who likes to follow organized checklists for website optimization then this module is a must-have. 
     
  2. Real-time SEO for Drupal: If you like writing your content according to SEO guidelines, this module is quite helpful. SEO is a great source of organic reach, and a higher SERP ranking is the prominent driver of it. While you write, Real-time SEO for Drupal provides built-in SEO-friendly suggestions as per the best SEO standards.
     
  3. Requires on Publish: This module makes certain fields required only when content is published. It can be used when you have fields on your content, such as tags or SEO information, that editors don't need to fill in until the material is published. 


Conclusion

Drupal comes with countless SEO benefits and it delivers the best SEO results. All the modules that we discussed encompass all the features that can help you achieve the best quality results in 2023 and beyond. 

With changing Google updates and SEO trends, you need a technology partner that makes sure your digital game is top-notch. 

Our Drupal expertise ensures that your site measures up to Drupal’s capabilities in SEO and you leverage them for the top-most SERP rankings.

Looking to enhance your website’s visibility on Search Engines? Get the SEO results you want.

Contact us at [email protected] and let our experts help you out with SEO.

Nov 16 2022
Nov 16

Victorious in 2022 as well

And the winner is: 1xINTERNET! Right at the start of the event, our project for Dr. Willmar Schwabe GmbH & Co. KG was announced as the winner in the "Healthcare" category. We were delighted to receive the Splash Award from our partner company Acquia. Congratulations and many thanks to the Schwabe team and to Acquia Inc.!

For our client Schwabe, a manufacturer of plant-based pharmaceutical and health products such as the well-known brands Lasea, Umckaloabo and Tebonin, 1xINTERNET implemented a scalable multisite solution based on Drupal in only 3 months. The system is designed to build all new web properties of the Schwabe group and to manage and host them centrally. It gives a single face to the customer without hindering different markets from presenting themselves individually. The website is based on our 1xDXP Multisite Management System and can be rolled out successively in all 16 markets of the Schwabe Group. The solution meets the client's high security requirements, and existing third-party solutions can be integrated. Storybook was used for the central design; for the central dashboard and hosting, Schwabe uses Acquia Site Factory. 

We invite you to learn more about this project in our video or by reading the project description.

Oct 18 2022
Oct 18

The Splash Awards are awarded each year to celebrate the best Drupal projects. The awards have taken place in the Netherlands since 2014. In 2017, projects in German-speaking countries were honored with Splash Awards for the first time.

The Splash Awards are intended to honor not only Drupal service providers, but also the end users who do exceptional work. 1xINTERNET are happy to have been nominated in three categories for the German - Austrian Splash Awards.

Oct 07 2022
Oct 07

1xTalks at DrupalCon Prague 2022

Other highlights for us at 1xINTERNET were obviously our speakers. 

Lara Garrido Moreno, designer, and Jose Nieves and Mónica Rodríguez Cárdenas, both front-end developers at 1x, attended their first DrupalCon this year. They hosted a session giving guests an insight into how they work on projects at 1xINTERNET. Their session was called "The collaborative flow between design and frontend in the development of an atomic design system". This talk was very well received; there were a lot of questions and a lot of interest in the topic. Being able to share with others how we work at 1xINTERNET was a great experience, and hopefully this group will have more opportunities to do so. 

For more information you can access the slides from the presentation here.

Dr. Christoph Breidert, co-founder and Managing Director of 1xINTERNET and Stefan Weber CTO & Managing Director of 1xINTERNET hosted a session about how we created a micro frontend solution using Drupal for a large European B2B/B2C food retailer. They showed the audience real examples of fully functional websites we have been working on at 1xINTERNET in the past months.

For more information you can access the slides from the presentation here.

Jun 29 2022
Jun 29

At the DrupalCon Portland community summit, we discussed how local meetup organizers could support each other by maintaining a shared slide show for meetups. I have started a slide show and am looking for volunteers to join me in building it. Leave a comment on the issue in the Event Organizers Working Group if you can help.

Apr 28 2022
Apr 28

When creating websites on Drupal, as developers, we should try to make our job easier. Managing modules, users, generating code – all these processes can be automated and performed with single commands. In this article, we'll take a look at the tools available and discuss them, giving specific examples of use.

1. Drupal Console

Drupal Console is a powerful Command Line Interface. It's used to generate boilerplate code and maintain and debug Drupal. The latest version of this tool is v1.9.8, released on 28 November 2021.

To add Drupal Console to our project, all we need to do is use one command:

composer require drupal/console:~1.0 \
--prefer-dist \
--optimize-autoloader

After that, we can use various commands provided by Drupal Console. We provide some examples below.

Module generation:

drupal generate:module  \
  --module="modulename"  \
  --machine-name="modulename"  \
  --module-path="/modules/custom"  \
  --description="My Awesome Module"  \
  --core="8.x"  \
  --package="Custom"  \
  --module-file  \
  --composer  \
  --test  \
  --twigtemplate

Entity generation:

drupal generate:entity:content  \
  --module="modulename"  \
  --entity-class="DefaultEntity"  \
  --entity-name="default_entity"  \
  --base-path="/admin/structure"  \
  --label="Default entity"  \
  --is-translatable  \
  --revisionable  \
  --has-forms

Service generation:

drupal generate:service  \
  --module="modulename"  \
  --name="modulename.default"  \
  --class="DefaultService"  \
  --interface  \
  --interface-name="InterfaceName"  \
  --path-service="/modules/custom/modulename/src/"

User creation:

drupal user:create  username password  \
  --roles='authenticated'  \
  --email="[email protected]"  \
  --status="1"

As we can see, Drupal Console gives us a lot of possibilities. Another interesting option that this Command Line Interface (CLI) provides us with is running a local PHP server to test our website.

$ drupal server

This command will launch a local server on port 8088 for us.

2. Examples for Developers

The Examples for Developers project is a great collection of examples showing how to write our own modules in Drupal. We have 33 different modules at our disposal, from simple blocks, through various types of forms, to controllers with REST API support. We'll probably find everything we need. This project will allow us to learn new things and work faster.

3. Devel

The Devel module includes additional functions and help pages for developers and administrators. It provides us with blocks and toolbars for quick access and developer information. We can use it to “simulate” another user. It's a very helpful functionality, especially when we need to test roles and permissions in Drupal. Devel provides us with features that help us with debugging. And the icing on the cake – we can use it to generate test content.

To install this module, we use Composer.

composer require --dev drupal/devel

4. Weight

Sometimes it happens in our project that we use modules that implement the same hooks. By default, Drupal doesn't let us choose the order of module execution. However, we can work around this limitation in three ways.

Method 1 – setting the Drupal module weight during its installation

In the install file of our module, we can add HOOK_install and use it to set the module weight.

Drupal 9 provides a built-in feature to deal with this issue:

function your_module_name_install() {
  module_set_weight('[your_module_name]', [your_preferred_weight]);
}

It's a little more complicated in Drupal 7, because we have to change this field in the database by ourselves:

function your_module_name_install() {
  db_update('system')
    ->fields(array('weight' => your_preferred_weight))
    ->condition('name', '[your_module_name]', '=')
    ->execute();
}

Method 2 – Changing the weight in core.extension.yml

If we use configuration management in our project, we can change the weight of our module in the core.extension.yml file after exporting the configuration. The weight is shown as a number after the module name. The larger the weight, the later the module's hooks will be executed.
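
As a sketch (the module names and weights here are just examples), the relevant part of core.extension.yml looks like this, where the number after each module name is its weight:

```yaml
module:
  # Lower weight: hooks in these modules run earlier
  node: 0
  views: 10
  # Higher weight: hooks in this module run after the ones above
  my_custom_module: 20
theme:
  claro: 0
```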

Method 3 – using Modules weight

The Modules weight Drupal module will add an easy-to-use configuration of module weights. When we go to the configuration page (/admin/config/system/modules-weight/configuration), we'll see this:

Setting module weights in a Drupal development tool – Modules weight

Here we can easily set the weights of our modules.

5. Settings.php and services.yml

We can find these two files in the sites folder of our Drupal installation. With just a few lines, we can make developing software a lot easier. It's a good idea to create a settings.dev.php file and put all the changes in it. During development, we can include it in the main settings.php file and remove it when we upload our website to the server.

An even more convenient option is to use an environment variable such as ENV and set it to "dev". Next, we check in settings.php if there's a settings.$env.php file. If so, we include it.

$env = getenv("D_ENV");

if (file_exists($app_root . '/' . $site_path . '/settings.' . $env . '.php')) {
  include $app_root . '/' . $site_path . '/settings.' . $env . '.php';
}

We can keep all our support changes for development in the settings.dev.php file.

What exactly can we do there?

1. Enable error display in addition to the message that our website has encountered a problem.

$config['system.logging']['error_level'] = 'verbose';

2. Disable CSS and JS aggregation – something we often forget when editing JS styles or scripts.

$config['system.performance']['css']['preprocess'] = FALSE;
$config['system.performance']['js']['preprocess'] = FALSE;

3. Disable render cache:

$settings['cache']['bins']['render'] = 'cache.backend.null';

4. Include the development.services.yml file.

$settings['container_yamls'][] = $app_root . '/' . $site_path . '/development.services.yml';

The content of the latter file may look like the one we show below.

parameters:
  twig.config:
    debug: true
    auto_reload: null
    cache: false
services:
  cache.backend.null:
    class: Drupal\Core\Cache\NullBackendFactory

We enable Twig debugging, which adds helpful comments to the HTML structure that make it easier to find a template file or create hooks. In addition, we disable the Twig cache. Also, we add the cache.backend.null service, which we used earlier to disable the render cache.
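
With Twig debugging enabled, the page source contains comments like the following around every rendered template (the theme hook, suggestions, and path will vary per site):

```html
<!-- THEME DEBUG -->
<!-- THEME HOOK: 'node' -->
<!-- FILE NAME SUGGESTIONS:
   * node--article--full.html.twig
   x node.html.twig
-->
<!-- BEGIN OUTPUT from 'core/themes/olivero/templates/content/node.html.twig' -->
```

The `x` marks the template file actually used; the `*` entries are more specific suggestions you can create to override it.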

6. Lando

It's a tool that facilitates the local development of our website. It allows us to run literally anything and is based on Docker. We have CLI at our disposal, which allows us to easily manage our installations. To set up a Drupal installation locally, all we need is a few commands.

First, we create a Drupal project using Composer.

composer create-project drupal/recommended-project lando-demo

Then, we go to the created directory and run the command:

lando init
  • Choose drupal9 as the recipe,
  • name your webroot web,
  • and then name your project.

After this process, a .lando.yml file will be created. We'll find there a lot of different information, such as the name of our project, what services we use (initially, these will include appserver and database), addresses of our application, PHP version, or access data for our database.
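
A minimal .lando.yml for the setup described above might look like this (the project name is whatever you entered during lando init):

```yaml
name: lando-demo
recipe: drupal9
config:
  webroot: web
```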

Next, we run the command

lando start

Now we can go to the address given in .lando.yml and finish the Drupal installation.

7. Drush

A tool that every Drupal developer should know. Drush has been with us since Drupal 4.7 and is still actively developed. As of this writing, the latest version is 11.0.8, released on 9 April 2022. Drush allows us to manage our website by importing and exporting configuration, updating the database, or clearing the cache. Newer versions of this tool also allow us to generate code. Let's go over some useful commands:

Cache cleaning

drush cr

Configuration importing

drush cim

Configuration exporting

drush cex

Creating a one-time login link

drush uli

Module enabling

drush en module_name

Module disabling and removing

drush pmu module_name

Database updating

drush updb

Watchdog checking (dblog module)

drush ws

When it comes to code generation, we have a lot of options to choose from. We recommend using the command

drush generate

and finding the option that interests us the most.

Drupal development tools - summary

We went through some interesting tools that we can use when working with Drupal. As we can see, Drupal development can be quite quick and pleasant when we know the tools that make the job easier. It's good to take some time to explore these solutions. This investment will pay for itself quickly!

Jul 16 2021
Jul 16

The Tour module has been in core since Drupal 8. It is very useful when you need to guide users within your application. We use it in EK management tools targeted at users in small and medium-sized companies. They usually have limited resources and time to spend on back-office training, and this is where the Tour module is very convenient for introducing functionality to users, who can quickly grasp the functions available to manage their back office.

You can try the latest version of Tour with the demo application where many of the forms and pages have a tour guide for users.

May 25 2021
May 25

Last week we released a new version of OpenLucius: a lean and fast Drupal social collaboration distribution with features like groups, social posts, messages, group chats, stories, file and folder management, notebooks, categories, activity streams, notifications, @-mentions, comments, and likes.

OpenLucius 2.0 has been in production for the last ~5 months and is stable enough to go into beta! We also keep on improving; broadly, here is what we did since the last release:

  • Added new features;
  • Enhanced existing features;
  • Tweaked UI / Design.

All the work we did was based on feedback we got internally, from our customers, and from trial users. And we plan to keep it this way, so if you have ideas for new or better features: let me know!

The fastest way to explore OpenLucius is by trying it via the product site. And since a lot has changed, I thought I'd make it easy on myself by just showing off the current main features, with the newest on top. Here you go:

Task / Kanban Board (*Sneak peek*)

You can already try this task board, but it needs work to get it to an open source release. We plan on releasing this as an add-on contrib module:

Screenshot kanban board

@-mentions (*new*)

What we really missed in the previous version were @-mentions, so they're now included in texts and chats, with autocomplete:

screenshot @mentions

@group mention: As you can see, you can also mention everyone in current group.

Technical background: this was hard to accomplish via Drupal core's CKEditor, so we tested other editors and settled on the open-source editor Summernote, a lightweight editor based on Bootstrap. It's extendable, facilitates inline editing, and very nicely inherits the theme styling automatically (since it's not loaded via an iframe). 

Also, we are building a Kanban/Scrum board with highly interactive modals and for example: inline editing of card descriptions and comments. For that we also needed a lean editor.

And last but not least: we could tweak the editor UI, making it blend with the theme smoothly.

Summernote also facilitates drag-and-drop images & texts:

Drag/drop images (*new*)

So drag-and-drop for images is now available in all text editors:

screenshot drag and drop images

It also has some great, user-friendly inline image options:

Screenshot image options summernote editor

General settings (*new*)

Set global colors, homepage tabs and homepage image:

screenshot General settings

Order book pages (*new*)

You can now order book pages easily, with unlimited depth:

Screenshot order pages

Groups

Screenshot groups home

  1. Group name, with drop down for group settings.
  2. Group sections, with activity badges that you can turn on/off per group.
  3. Group activity stream, bundled per day.

Activity streams (global and per group)

Homepage with activity stream example:

Screenshot homepage

  1. Home banner, configurable;
  2. Stories;
  3. Activity stream, personalised, bundled per day, per group;
  4. Your Groups, with link to group archive;
  5. Social posts, global, can also be turned off.

Social posts

Screenshot  social posts

Messages

Screenshot messages

Group chats

Screenshot group chats

Stories

Screenshot stories

File and folder management

Screenshot docs and files

Notebooks

Overview:

Screenshot notebooks

  1. Hierarchical book pages;
  2. Order pages modal;
  3. Like and comment;
  4. Add file attachments to comments and notebooks.

Order pages easily, with unlimited depth:

Screenshot order pages

Use notebooks, for example, for:

  • Project documentation
  • Manuals
  • Notes
  • Web links
  • Agreements
  • Minutes
  • Ideas
  • Brainstorm sessions
  • Onboarding information
  • House rules
  • Customer information
  • ...whatever needs text.

Notifications (non-disturbing)

Screenshot notifications

Comments

Screenshot comments

Likes

Screenshot Likes

Get it, got it, get that!

That's it for now. If you want to test OpenLucius this instant, you can of course do so via the product website, or download and install OpenLucius yourself via the project page on Drupal.org.

Planet Drupal | Written by Joris Snoek | May 25, 2021
Apr 12 2021
Apr 12

Last week, I wrote about how free software has to break out of the customer-vendor mindset. The customer-vendor mindset doesn’t work with free software because users don’t pay and developers don’t provide customer service. The free software community works on a build-what-you-use model. I ended by saying that the build-what-you-use model is not enough to sustain hero developers—people who contribute at a level that cannot be sustained by their own use of free software. I suggested that hero developers are vendors and should sell their work to paying customers rather than releasing it to the community for free.

Another way hero developers can sustain contribution is by working for a large company. Large companies like Facebook, Apple, and Google routinely release free software to promote adoption of their products. When a company stops maintaining its free software, it is either abandoned or taken over by the community. Examples of projects being taken over by the free software community are XQuartz, LibreOffice, NetBeans, OpenBSD, and Firefox.

The problem with depending on a single large company is that the hero developer is vulnerable to a change in company priorities. Once a free software project stops being part of a company’s business plan, the company can no longer sustain the hero developer’s contribution. An example was Corel abandoning Linux as a condition of receiving investment from Microsoft.

Is there a way for the free software community to sustain hero developers? While the community works best with a wide base of contributors who are sustained by their own use of free software, there is massive value in the contributions of people who are motivated more by their love of a project than by what the project can do for them.

Sponsorship alliances

I believe contribution from hero developers can be sustained if users of their work form an alliance to sponsor their contribution. A free software project is an alliance of developers who have an incentive to recruit more developers so they can all benefit from each other’s work. Members of a sponsorship alliance would have a similar incentive to recruit more sponsors to the alliance because new members would increase the value of the alliance to existing members. If the alliance is large enough, the loss of one member will not affect the alliance’s ability to sponsor contribution from a hero developer.

There are many examples of alliances being formed to support free software projects. After the Heartbleed vulnerability showed that the OpenSSL library had become unmaintainable, Google and OpenBSD both began working on replacement libraries. When the XFree86 project was unable to coordinate contributions from the community, the previously dormant X.org foundation took over stewardship of the X Window System. When OpenVPN was too difficult for many users, Linux vendors came together to release WireGuard in the Linux kernel. The Linux kernel itself is supported by the members of the Linux Foundation. All of these alliances were formed after deficiencies were found in the maintenance of projects that their members relied on.

If a hero developer is providing professional support for free, I would expect it to be difficult to build momentum to form an alliance. How then can hero developers encourage the free software community to form sponsorship alliances before their ability to contribute is exhausted?

My suggestion is that hero developers who want to do their work within the free software community withhold updates and support from their projects until they are sponsored. This is difficult for developers to do because they are always afraid of their work becoming outdated. To avoid that happening, hero developers should showcase their own work and incorporate the work of other contributors, but should not release their own work or support other users unless they are paid. This will highlight the extent of the hero developer’s contribution and the need for sponsorship to make it sustainable.

As a member of the Drupal community, I would suggest that our community could help with the formation of sponsorship alliances by providing a platform to set them up, manage their memberships, and link them to projects. I may examine the requirements for such a platform in a future article.

Apr 08 2021
Apr 08

I was able to spend some time in the DrupalCon Community Summit yesterday. One of our topics was how a paid ecosystem could align with Drupal core values. We have developers who are not getting paid for the work they do to support their projects, and users who complain about the support they are given as if they had paid for it. Developers feel like users are bad customers because they don't pay, and users feel like developers are bad vendors because they don’t provide support.

In my experience as a free software developer, I have never worked for free. I either developed for myself or I asked others to pay me to develop for them. The reason I participate in free software development is that my work is more valuable to me when others are free to improve it. Free software development means that when a project stops helping me reach my goals I can walk away and wait for someone else to step up and carry it forward. Of course, I do my best to make it easy for someone else to take over because the progress they make may benefit me in the future. Stepping back is difficult when you have become closely identified with your project. But you are not leaving a part of yourself behind. The work that you did will always be part of you, and if you decide to later you can continue it.

Free software development doesn't make sense if it just means users don’t pay. Free software only makes sense when its developers are also users and having more users makes it more valuable to its developers. If users do not participate in developing software, the developers are vendors and the users are customers. If a software project’s developers are going to be vendors rather than users, and its users are going to be customers rather than developers, it should not be provided for free.

The WordPress community has embraced the customer-vendor model. Developers create plugins and themes, and users buy them. The result has been that developers do not have an incentive to encourage users to participate in development, and they do have an incentive to create copycat versions of other developers’ plugins and themes to capture sales. The Drupal community in contrast has embraced a build-what-you-use model, encouraging all users to contribute to the projects that they use in any way they can and share the benefits with everyone. To keep our community healthy, we must replace the customer-vendor mindset with the build-what-you-use mindset.

The build-what-you-use model is not enough to sustain hero developers, who just want to spend more time contributing to Drupal. But in the long run, the community will be better off with a wide contributor base than it would be if it relied on a few heroic contributors. And when the customer-vendor model is a better fit for a project, developers can distribute it independent of the community and users can pay them directly.

Fund raising ideas

Community-led projects still have staff and bills to pay, and new users need support before they can become contributors. So here are some suggestions for how the Drupal community can raise money that are consistent with the build-what-you-use model. The idea is to charge to do for others the same things that we already do for ourselves.

  • Audits of Drupal hosting providers. Providers who pay can display a badge certifying their level of support for running Drupal sites.
  • Private collaborative development environments. We have already built a public version for the community. We could offer a private version for companies.
  • Automated testing. This is available for contributed projects. It could be made available as a paid service for private projects.
  • Hosted Drupal. Like simplytest.me, but you can pay to keep your site up and use your own domain name.
  • Site management dashboard. Something like a hosted version of Aegir that lets you administer all your Drupal sites, no matter where they are hosted.
  • Project status server. This is already provided for contributed projects, but could be a paid service for developers who want to distribute modules and themes only to paying customers.
  • Security coverage. Provided free for community projects that opt in to it, but could be made a paid option for private projects.
Jul 16 2020
Jul 16

Yesterday I had my fears confirmed about the Drupal Automatic Updates initiative. It requires sites to be able to modify core Drupal files. While this makes it easier to fix vulnerabilities, it is not something you want when your site is actually being attacked. The best way to protect against someone exploiting a vulnerability to modify your Drupal core files is to change the file permissions so that your site cannot modify them.
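As a minimal sketch of that protection (using scratch paths for demonstration; on a real site you would apply this to the web root with appropriate owners and groups), removing the write bit is enough to stop the site's own process from altering core files:

```shell
# Demonstration on a scratch directory (paths are hypothetical):
mkdir -p /tmp/drupal-demo/core
echo "<?php // a Drupal core file" > /tmp/drupal-demo/core/index.php

# Remove write permission so the site process cannot modify the file.
chmod 444 /tmp/drupal-demo/core/index.php

# Verify: the mode is now read-only for everyone.
stat -c %a /tmp/drupal-demo/core/index.php
```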

The Automatic Updates team has done amazing work to make the update process itself trustworthy. The problem is that if you allow your site to modify core files, you cannot guarantee that only trustworthy processes will use it. I’ve seen a site that had malicious code added to almost all its files and had malicious files created in all its folders. I’ve been on an IRC chat with a Drupal site administrator who was trying to clean up a site that was being attacked so persistently that as soon as it came back up it would be compromised again.

There is a workaround. Instead of giving the site permission to modify core files, you can give that permission to a different user and run a command line script as that user. But you can already create scripts to update your site automatically, and if you want to use the admin UI, this workaround will not help.

What makes this frustrating is that a solution already exists. Starting in Drupal 7, Drupal has supported SSH file transfer. The Update module uses it to update contributed modules. If Drupal does not have permission to change a contributed module, the Update module checks for the SSH2 PHP extension. If it exists, it prompts for the user name and password of the user who has permission to change contributed modules. No credentials are stored in Drupal, so only an authorized user can modify files. It seems that this feature is so obscure and rarely used that the Automatic Updates team overlooked it.
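The check described above can be sketched like this (the `ssh2_*` calls are the standard PECL ssh2 functions; the host and file names are placeholders, and the commented calls mark where authentication and transfer would happen):

```php
<?php
// Only offer SSH-based file transfer when the SSH2 PHP extension exists.
if (extension_loaded('ssh2')) {
  // Connect to the server that hosts the code base (placeholder host).
  $connection = ssh2_connect('example.com', 22);
  // Prompt for the credentials of the user who is allowed to change files;
  // nothing is stored in Drupal:
  // ssh2_auth_password($connection, $username, $password);
  // ssh2_scp_send($connection, $local_file, $remote_file, 0644);
}
else {
  // Fall back to asking the administrator to update from the command line.
}
```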

I hope to get support for SSH file transfer added to the Automatic Updates initiative before it is finalized.

Jan 03 2019
Jan 03

Context

The EK application has a module that stores personal documents for users. When a user account is deleted, those documents may be transferred to another account.

To achieve that, we need to alter the user account cancel form in three places: when building the form, when validating it, and when submitting it.

Let's review the 3 steps.

BUILD

The form before altering it looks like this:

cancel user account before hook

We need to add a field to select another user account, to which the documents of the canceled account will be moved.

To achieve that, we implement hook_form_alter() in MyModule.module:


function MyModule_form_alter(&$form, \Drupal\Core\Form\FormStateInterface $form_state, $form_id) {
  if ($form_id == 'user_multiple_cancel_confirm') {
    $form['move_uid_documents'] = [
      '#type' => 'textfield',
      '#title' => t('Move user documents'),
      '#autocomplete_route_name' => 'MyModule.user_autocomplete',
      '#description' => t('Select to whom to transfer personal documents'),
    ];
    $form['#validate'][] = 'MyModule_form_user_delete_validate';
    $form['#submit'][] = 'MyModule_form_user_delete_submit';
  }
}

What we can notice here is:

  • We alter only the targeted form, identified by its form ID; in this case: "user_multiple_cancel_confirm";
  • We create the required field by adding $form['move_uid_documents'];
  • We register two new callbacks, $form['#validate'][] for validation and $form['#submit'][] for submission, which are used in the next steps.

After altering it, the form will look like this:

cancel user account after hook

We have a new field to select a user. In our case, we also have an autocomplete function that helps select an existing user. However, we need to ensure that the value entered in the field really is an existing user. This is the part handled by the validation.

 

VALIDATE

The validation is defined in MyModule_form_alter by adding a validate callback named MyModule_form_user_delete_validate. Therefore, we need to create a function with that particular name in MyModule.module.


function MyModule_form_user_delete_validate(&$form, \Drupal\Core\Form\FormStateInterface $form_state) {
  if ($form['#form_id'] == 'user_multiple_cancel_confirm') {
    if ($form_state->getValue('move_uid_documents') <> '') {
      $query = "SELECT uid FROM {users_field_data} WHERE name = :n";
      $data = \Drupal::database()->query($query, [':n' => $form_state->getValue('move_uid_documents')])
        ->fetchField();
      if ($data) {
        // Replace the entered user name with its uid for use in the submit step.
        $form_state->setValue('move_uid_documents', $data);
      }
      else {
        $form_state->setErrorByName('move_uid_documents', t('Unknown user to move documents'));
      }
    }
  }
}
Here the function checks against the users_field_data table that the name matches an existing user.

If not, an error message will be displayed:

cancel user account validation error

However, if it is valid, we store the value to be used in the next step, which is the submission.

SUBMISSION

As with the validation, the submission is defined in MyModule_form_alter by adding a submit callback named MyModule_form_user_delete_submit.


function MyModule_form_user_delete_submit(&$form, \Drupal\Core\Form\FormStateInterface $form_state) {
  if ($form['#form_id'] == 'user_multiple_cancel_confirm') {
    if ($form_state->getValue('move_uid_documents')) {
      foreach ($form_state->getValue('accounts') as $key => $id) {
        // Reassign the documents of each canceled account to the selected user.
        \Drupal::database()->update('MyModule_table')
          ->fields([
            'uid' => $form_state->getValue('move_uid_documents'),
            'folder' => t('Moved from user @u', ['@u' => $id]),
          ])
          ->condition('uid', $id)
          ->execute();
      }
      \Drupal::messenger()->addStatus(t('Documents moved to user @u', ['@u' => $form_state->getValue('move_uid_documents')]));
    }
  }
}

In the function above, we take the id of each canceled user account and replace it with the new user id in the document table.

The function also displays a message confirming that both the cancellation and the submit hook have been executed.

cancel user submit alert

Please feel free to comment or suggest improvements.

Thank you.

Apr 14 2016
Apr 14

Drupal 8 performance: the Supercache module

Post date: Thu, 04/14/2016 - 00:00

Difficulty: Piece of Cake

The Supercache module is the result of an attempt to improve Drupal 8 efficiency when dealing with cache tag management, and to fix design issues in several caching components that make Drupal 8 based applications that change a lot a pain to deal with.

An out of the box Drupal 8 install will issue about 2,100 database statements for a simple task such as performing a log in and creating two articles.

With a little setup and the Supercache module I was able to bring that down to 240 statements.

Here is video proof that these numbers are real. The statement count is being measured in real time thanks to the awesome SQL Server Profiler tool.

[embedded content]

The impact of the Supercache module (which was for a while a core patch) was benchmarked and proved to reduce wall times by about 25% and database queries by as much as 50% after things change (doing a cache write).

How does the Supercache module do this?

  • Drupal's cache system got heavier in Drupal 8 at the expense of making it more precise thanks to cache tags. But there are situations where you simply do not need all that bloat. The Supercache module introduces a new and sleeker cache layer (of course without cache tags). A simple cache tag stored in Drupal's cache system takes up 196 bytes; the new caching system only uses 12 bytes. This does not seem like a big deal, it's just a few bytes of difference, but it translates to being able to store 65,000 cache tags in 1MB of APCu/Wincache instead of just 5,000. And that is not the only advantage of this cache layer:
    • Reduced storage size, up to 12x less for small cache items.
    • Leverage of native functionality provided by most storage backends, such as touch, counter, increment, etc.
    • Faster processing due to the lack of cache tags and other extras.
    • Scalar types are stored natively, so you can batch operate on the cache items themselves if the storage backend allows you to do so (database, Couchbase, or MongoDB).
  • Drupal 8 introduced the very useful ChainedFastBackend (which you can easily use in Drupal 7). But the current implementation of that backend has some design flaws, such as invalidating the whole fast backend when doing a cache write and not using cache tags in the fast backend. Supercache replaces the ChainedFastBackend implementation with one that solves those two issues, improving hit rates in the fast backend on systems with lots of writes.
  • Replaces the default cache tag invalidator service (which works directly on the database) with one that leverages a concept similar to ChainedFastBackend.
  • Introduces the ability for the key value storage to work similarly to the ChainedFastBackend.

To properly leverage what the Supercache module has to offer, you should set up support for a centralized caching backend such as Couchbase.

By: root Thursday, April 14, 2016 - 00:00

Jan 09 2014
Jan 09

Twig is one of the good template engines; it is provided by SensioLabs. Its syntax originates from the Jinja and Django templates, and it's secure, flexible, and fast:

Twig is a modern template engine for PHP

• Fast: Twig compiles templates down to plain optimized PHP code. The overhead compared to regular PHP code was reduced to the very minimum.

• Secure: Twig has a sandbox mode to evaluate untrusted template code. This allows Twig to be used as a template language for applications where users may modify the template design.

• Flexible: Twig is powered by a flexible lexer and parser. This allows the developer to define its own custom tags and filters, and create its own DSL.

I'm not going to write about Twig itself in this article (I might in the future). Instead, I'm going to write a custom Twig extension.

It was good news that Drupal 8 decided to use some of Symfony's useful components in its core. One of them is the Twig template engine, which replaces the PHP template engine. PHPTemplate put a lot of Drupal front-end developers in a hard situation: it let them do everything in the template layer, even DROP the database completely! By using the Twig template engine, we don't let PHP code run in template files. One of the big disadvantages of this, however, is that it restricts front-end developers to a limited number of filters and functions in templates.

The solution is to extend Twig, which is easy. I'm going to write a custom filter to show how simple it is. Let's go.

1. Create a custom module (or you can do this in one of your existing custom modules):

In `/modules/`, create a directory and call it ctwigfilters, then create ctwigfilters.info.yml with the following content:

name: Custom Twig Filters
type: module
description: 'Provide some custom twig filter and functions(later)'
package: Core
core: '8.x'

2. Create ctwigfilters.services.yml with the following lines:

services:
  ctwigfilters.twig_extension:
    arguments: ['@renderer']
    class: Drupal\ctwigfilters\TwigExtension\MyHumanize
    tags:
      - { name: twig.extension }

3. Create MyHumanize.php at ctwigfilters/src/TwigExtension/. It would contain the following:
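The original code listing for this step is missing from this copy of the article. As a rough sketch (assuming the early Drupal 8 Twig API; the class and service names come from the article above, while the filter's behavior here is invented for illustration), such a class could look like:

```php
<?php

namespace Drupal\ctwigfilters\TwigExtension;

use Drupal\Core\Render\RendererInterface;

/**
 * Twig extension that registers the "myhumanize" filter.
 */
class MyHumanize extends \Twig_Extension {

  /**
   * The renderer service, injected via ctwigfilters.services.yml.
   */
  protected $renderer;

  public function __construct(RendererInterface $renderer) {
    $this->renderer = $renderer;
  }

  /**
   * {@inheritdoc}
   */
  public function getName() {
    return 'ctwigfilters.twig_extension';
  }

  /**
   * {@inheritdoc}
   */
  public function getFilters() {
    return [
      new \Twig_SimpleFilter('myhumanize', [$this, 'humanize']),
    ];
  }

  /**
   * Turns a machine name like "article_teaser" into "Article teaser".
   */
  public function humanize($string) {
    return ucfirst(str_replace(['_', '-'], ' ', (string) $string));
  }

}
```

With a humanize() like this one, `{{ 'article_teaser' | myhumanize }}` would render "Article teaser".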

Enable the Custom Twig Filters module and clear the cache; then you can use the "myhumanize" filter in your Twig files:

{{ your-variable | myhumanize }}

Done. Yup, very simple, isn't it?

The source files for this article are available on my GitHub at this URL.

Ref:  https://symfony.com/doc/current/templating/twig_extension.html

Additional resource:  List of Twig Filter and Functions

   

Mar 30 2012
Mar 30

The command line: most programmers love its power; most web users fear its (alleged) complexities. But for those willing to dive in, the reward is great. Using Drush on Drupal can save you several hours a week just on website maintenance tasks alone. Here is a short list to get you started:

1. Download and Install Multiple Modules Simultaneously

Installing modules on Drupal is easy. You just have to:

  1. Go to Drupal.org
  2. Search for the module's page
  3. Download the module's zip file
  4. Unzip
  5. Move to sites/all/modules
  6. Open up your web browser
  7. Go to the modules page
  8. Click off the checkbox for the module
  9. Save your changes

That's 9 steps per module. And if you have 30+ modules to install, that can take up to several hours if you're not nimble with your keyboard and mouse. With drush, you can do it in 2 lines:

Download command:

drush dl addressfield admin_menu advanced_help amazon_s3 awssdk backup_migrate boxes calendar ckeditor ckeditor_swf coder commerce commerce_features commerce_feeds commerce_file commerce_product_key contemplate context ctools custom date devel devel_themer drupalforfirebug echo email emogrifier entity facebook_pull fb_social features feeds feeds_querypath_parser feeds_tamper feeds_xpathparser field_group filter_transliteration flowplayer getid3 globalredirect gmap google_analytics grammar_parser hacked html5_tools htmlmail i18n imce include job_scheduler jplayer jquery_update kfs libraries link location mailchimp mailmime mailsystem md5check media media_amazon media_browser_plus media_flickr media_youtube mediaelement menu_block migrate mimemail nice_menus omega_tools panels pathauto pathologic plupload quicktabs references relation rolereference rules search_api service_links services sexybookmarks skinr styles token transliteration viewfield views views_accordion views_bulk_operations views_pdf workbench workbench_access workbench_files workbench_media workbench_moderation wysiwyg xmlsitemap

Enable command:

drush en addressfield admin_menu advanced_help amazon_s3 awssdk backup_migrate boxes calendar ckeditor ckeditor_swf coder commerce commerce_features commerce_feeds commerce_file commerce_product_key contemplate context ctools custom date devel devel_themer drupalforfirebug echo email emogrifier entity facebook_pull fb_social features feeds feeds_querypath_parser feeds_tamper feeds_xpathparser field_group filter_transliteration flowplayer getid3 globalredirect gmap google_analytics grammar_parser hacked html5_tools htmlmail i18n imce include job_scheduler jplayer jquery_update kfs libraries link location mailchimp mailmime mailsystem md5check media media_amazon media_browser_plus media_flickr media_youtube mediaelement menu_block migrate mimemail nice_menus omega_tools panels pathauto pathologic plupload quicktabs references relation rolereference rules search_api service_links services sexybookmarks skinr styles token transliteration viewfield views views_accordion views_bulk_operations views_pdf workbench workbench_access workbench_files workbench_media workbench_moderation wysiwyg xmlsitemap

The above commands may look ominous, but each is just drush, a command, and then a list of modules. Drush takes care of the rest.

2. Automatic Module Updates

Updating modules can be a pain. You have to check for available updates, then repeat the process from #1 above. Or, you can run "drush up". This will tell drush to:

  1. check which modules are installed on the current site
  2. check to see if there are updates available
  3. notify you what modules are out of date
  4. ask you if you'd like to proceed
  5. download all the modules and place them into the proper location
  6. run update.php for you

This function alone saves me 2 hours a week.

3. Quickly Clear All Caching

You made a change to your site, but it's just not showing up! It might be a cache thing. Views, blocks, css, javascript: many components of Drupal are cached for performance. But this can make development difficult because you need to keep navigating to the admin areas to clear the system cache and flush the changes.

OR, you can run "drush cc all" to clear all the caching systems at the same time. This is super convenient.

4. Easy Backups

If you're developing on the bleeding edge (Drupal 7 with only dev versions of all of your modules, possibly with patches), you've probably experienced a corrupt database that simply could not be recovered. No fun. The easiest way to protect yourself is quick backups. But just like clearing your cache, you don't want to have to leave the page you're on and come back. Simply run "drush bam-backup" and a database copy will be generated and downloaded into the manual backups directory. For bleeding edge projects, I use this command compulsively because it's saved me so many times.

5. Control multiple sites

If you run more than one site (or a dozen sites), it can be tedious to manually update the modules at the same time. But drush allows you to create scripts and installation profiles, so you can quickly run the same commands on all of them. "drush @site1 up; drush @site2 up; drush @site3 up" would run the module update commands on 3 different sites, one right after another, without having to navigate your terminal to each site in between. This gives you a command center feel and allows you to connect to each site from one location, saving lots of time and focus for the bigger tasks at hand.
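The @site1-style names used above are Drush site aliases. A minimal sketch of an aliases file (a hypothetical ~/.drush/mysites.aliases.drushrc.php; the roots and URIs are placeholders):

```php
<?php
// Hypothetical alias definitions; adjust roots and URIs to your own sites.
$aliases['site1'] = array(
  'root' => '/var/www/site1',            // Drupal docroot of the first site
  'uri'  => 'http://site1.example.com',
);
$aliases['site2'] = array(
  'root' => '/var/www/site2',
  'uri'  => 'http://site2.example.com',
);
```

With a file like this in place, `drush @site1 up` runs the update against site1 from any directory, which is what makes the one-liner above possible.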

 

I hope you enjoyed this. If you have any questions or things to add, please leave a comment below. Are you using drush? If not, what's holding you back?

PS. If you're already a fan, you can buy the I heart drush here.


About Drupal Sun

Drupal Sun is an Evolving Web project. It allows you to:

  • Do full-text search on all the articles in Drupal Planet (thanks to Apache Solr)
  • Facet based on tags, author, or feed
  • Flip through articles quickly (with j/k or arrow keys) to find what you're interested in
  • View the entire article text inline, or in the context of the site where it was created

See the blog post at Evolving Web
