Mar 20 2020
xjm

Huge thanks for responding to our call for contributors posted at the beginning of the month. All must-have beta requirements have been completed, so we've released Drupal 9.0.0-beta1! Drupal 8.9.0-beta1 will be released next week (the week of March 23).

The release of the first betas is a firm deadline for all feature and API additions. Even if an issue is pending in the Reviewed & Tested by the Community (RTBC) queue when the commit freeze for the beta begins, it will be committed to the next minor release only.

  • Developers and site owners can begin testing the betas after their release.

  • Sites must be on 8.8 or later to update to 9.0.0-beta1. Therefore, we will also provide bugfix releases of 8.8 and 8.7 that resolve known upgrade path criticals, so sites that have not been able to update to 8.8 can test the upgrade.

  • Once 8.9.0-beta1 is released, the 9.1.x branch of core will open for development. This branch is where feature and API additions should be made. All outstanding issues filed against 8.9.x will be automatically migrated to 9.1.x.

  • Alpha experimental modules have been removed from the 8.9.x and 9.0.x codebases (so their development will continue in 9.1.x only).

  • Additional fixes will be committed to 9.0.x under the beta allowed changes policy through the end of the beta phase on April 28.

  • The release candidate phase will begin the week of May 4th.

See the summarized key dates in the release cycle, allowed changes during the Drupal 8 and 9 release cycle, and Drupal 8 and 9 backwards compatibility and internal API policy for more information.

Drupal 9.0.0 and Drupal 8.9.0 are both scheduled to be released on June 3, 2020.

Bugfixes and security support of Drupal 8.8 and 8.7

Drupal 8.8 will receive additional bugfix releases through May 4, 2020, and it has security coverage through December 2, 2020. Drupal 8.7 has security coverage until the release of 8.9.0 on June 3.

Mar 20 2020

Drupal 9.0.0-beta1 is available now. This release includes all the dependency updates, updated platform requirements (web server, PHP, and database versions), stable APIs, and features that will ship with Drupal 9. The stable Drupal 9.0.0 release is scheduled for June 3, 2020!

The beta release marks Drupal 9 as API-complete, so now is a great time to start getting your projects ready for Drupal 9. Most projects need a few small changes. A single project release can now be compatible with Drupal 8 and 9 at the same time, so you don't even need to release a new branch of your project to support Drupal 9.
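That dual compatibility is declared in the project's .info.yml file. As a minimal sketch (the module name and labels here are hypothetical), a single release that installs on both Drupal 8.8+ and Drupal 9 might declare:

```yaml
# example_module.info.yml — hypothetical module compatible with Drupal 8 and 9
name: Example Module
type: module
description: 'Illustrates a dual-compatible core version constraint.'
# One semantic-version constraint covers both major versions:
core_version_requirement: ^8.8 || ^9
```

The `core_version_requirement` key accepts Composer-style constraints, so the same branch can support both majors without a separate release.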

Drupal 8.9.0 will also be released on June 3, 2020. It will contain the same features as Drupal 9.0.0 while keeping backwards compatibility with Drupal 8. This final minor version of Drupal 8 will receive long-term support, with bug fixes and security coverage until November 2021. Meanwhile, Drupal 9 will continue to receive new features in Drupal 9.1 (out in December 2020) and beyond.

Mar 20 2020

Fostering Inclusion in Tech

In the previous article of this series, we talked about how fostering diversity, equity, and inclusion (DEI) in an organization is no easy feat. However, there are steps you can take to help get you on your way. When it comes to the hiring process specifically, it's important to hire in a spirit of openness, transparency, and accountability, and to have a shared vision of what constitutes success for the new position. By offering a welcoming application process, organizations attract excellent candidates who participate and collaborate well with the existing mission, vision, and values. As we continue learning how to do this successfully, we've rounded up some tips that might be useful to your organization.

Evaluate the Job Title

Look at current job-seeking tools like Indeed, Glassdoor, and Idealist to make sure the current job title, description, and range of responsibilities are appropriate and reasonable for that role. Modifying the title or level of the position to match generally accepted standards (i.e., the difference between "Senior Product Manager" vs. "Product Manager" vs. "Project Manager" vs. "Product Associate") may make the difference in who applies.

Be Thoughtful with Language

Terms like "rock stars," "ninjas," and "unicorns" do not suffice as descriptive language. Identify the bulleted list of actual skills required, as well as the desired background or experience, and consider the implications of the language. A posting for a "unicorn" and one for a "collaborative team player" will receive different responses: the first from unicorns, the second from team players. Matthew Tift, James Sansbury, and Matt Westgate discuss "The Imaginary Band of Rock Stars at Lullabot" on the Hacking Culture podcast.

Cut out jargon to focus on the required skills and listed responsibilities of the job. If these are not yet clear, re-evaluate the role and its job description, and list out how a person will succeed in the role. 

Ruby Sinreich (http://lotusmedia.org), a web developer, technologist, and strategist who has worked in progressive advocacy organizations and online communities for over two decades, suggests the following tools for minimizing bias within the text (from the Drupal Diversity and Inclusion group):

Identify and Make Any Assumptions Explicit

List all relevant aspects of the position to attract the correct type of applicants and make the implicit assumptions of who can work in this role transparent.

Sample questions to address in the description:

  • Is travel included or required in this job?
  • Is there a need to lift heavy objects or crawl under desk spaces?
  • Is this a remote job or an on-site job?
  • Is the position salaried, contract, temp-to-hire?

Include all non-negotiable aspects of the work up front, and be explicit about what constitutes success. For example, a recent job description for a Tugboat Enterprise Account Executive position provided a coherent, attainable measure of success:

Like anything, we understand it takes a bit of time to ramp up to a new gig. At the end of 6 months, Lullabot will have spent roughly $70,000 in wages for the position, and we'd be looking to come in a little above break-even with this investment. Our minimum expectation is to hit a Monthly Recurring Revenue goal of $20,000 of new business by the end of six months.

Other questions to ask when measuring success include: Am I enjoying the work? Is the market opportunity substantial? Am I having fun? 

Publish the Pay Range

Include a pay range and whether or not the role is salaried, temp-to-hire, short-term contract, or a long-term contract position. When you provide a salary range, studies show that this level of transparency increases job applications by 30%. After all, no one wants to go through a lengthy hiring process only to find out the role isn't a financial fit.

Be clear about the salary range, requirements, and perhaps the bands inside the role, and you'll come to a quicker agreement with a final candidate who has understood salary expectations from the beginning.

Clearly List Benefits

For many, health, vision, dental, retirement matching, flex-time, parental leave, paid time off, holidays and add-ons like fitness or technology budgets make a job significantly more attractive. At times, it might even be a determining factor. Display listed benefits in the "Work" or "Careers" section, e.g., our annual Events and Education budget is listed publicly on our website, among other benefits we offer.

Encourage People from Marginalized and Underrepresented Groups to Apply

Consider adding language that encourages applicants who identify as being from an underrepresented community to apply. Going beyond the standard "equal opportunity" language will make your job description appeal far more to diverse groups of people.

Comply with Federal, State, and Local Guidelines

Make sure the organization complies with any guidelines regarding harassment and discrimination. Here is some sample language about how hiring committees might consider candidates (from Green America's hiring statement):

All qualified applicants will receive consideration for employment without discrimination regarding actual or perceived:

  • race, 
  • color, 
  • religion, 
  • national origin, 
  • sex (including pregnancy, childbirth, related medical conditions, breastfeeding, or reproductive health disorders),
  • age (18 years of age or older),
  • marital status (including domestic partnership and parenthood),
  • personal appearance,
  • sexual orientation,
  • gender identity or expression,
  • family responsibilities,
  • genetic information,
  • disability,
  • matriculation, 
  • political affiliation,
  • citizenship status,
  • credit information, or
  • any other characteristic protected by federal, state, or local laws.

Harassment on the basis of a protected characteristic is included as a form of discrimination and is strictly prohibited.

Focus on the Organization's Culture: Mission, Vision, and Values

Culture is one of the most significant determinants of whether or not the candidate will continue through with the process of applying. How do you attract high-quality teammates? The current organizational mission, vision, and publicly-stated values make a difference. What does the company, team, or project stand for? Say it loud and proud, and make sure the applicant understands organizational values. A blog post, "About" page, or video linked inside the job application will make values clear.

Circulate the Listing to Diverse Audiences

Change up and expand the networks where job listings get circulated. For example, Historically Black Colleges and Universities (HBCUs), remote job boards, community groups that focus on a specific area, industry, or desired applicant pool, and many Slack channels all have job postings. Consider sharing the job post with networks like these first, and then expanding to general job boards.

Identify Scoring in Advance

Have a sheet that lists the evaluation system used when assessing applicants. If possible, include this in the job description to surface candidates who can speak to the desired points and to provide transparency in how they will be scored. In parallel, this procedure works when evaluating RFP respondents; for example, here's a sample questionnaire (the scoring mechanism is in the footer) to evaluate a website redesign.

Same Interviewers, Same Questions

To make a fair assessment, have all interviewers ask the same questions of all the finalists. Use the predetermined points system when interviewers compare notes. Evaluate against the organization's stated responsibilities, and cross-check against mission, vision, and values.

Consider Implementing the Rooney Rule

This National Football League policy requires teams to interview ethnic-minority candidates for head coaching and senior football operations jobs. Consider making an effort to interview at least one woman or other underrepresented candidate for the role. The results speak for themselves: after the NFL instituted the Rooney Rule in 2002, the overall percentage of African-American head coaches rose from 6% to 22% by the start of the 2006 season.

Offer Alternate Ways of Interviewing

If being successful in a particular role requires a whiteboard walkthrough, 20-minute brainstorming exercise, video or written component, teleconference demonstration, or another method, it is appropriate and understandable to ask for this during the interview process. For example, if the role requires teleconferencing, allow for one of the interviews for the finalists to be held on the teleconferencing software needed. However, don't make these the only mechanisms for evaluation. 

Consider offering multiple ways to answer questions to help the team make the best decision. It's also appropriate to ask for an existing portfolio or a demonstration of existing products or tools relevant to the job. For example, if you're hiring a designer, asking for a walkthrough of the three design projects the candidate is most proud of is appropriate.

For further reading, there's another in-depth review of the hiring process on MoveOn CTO Ann Lewis's blog, "How We Hire Tech Folks." Thanks to James Sansbury, Marc Drummond, and Andrew Berry, for reviewing and providing thoughtful comments and feedback.

Mar 20 2020

Mike and Matt talk with organizers of DrupalCon Europe about the organization of the conference, COVID-19, and differences between it and DrupalCon North America.

Drupal Landing Page - part 2

Mar 20 2020

In the previous article, we saw how to create a very simple theme with one main content region as the page layout.


Let's now look at the custom module that handles theme switching and content display.

Part 2: the custom module

The structure of the module, called "land_page", is as follows:


Let's look at the most important parts specific to the landing page.

land_page.info.yml: the standard module info settings.

land_page.routing.yml: in this file we define our landing page routes. These routes will be the reference used to switch themes. For example:

default_land_page:
  path: '/land-page'
  defaults:
    _controller: '\Drupal\land_page\Controller\Controller::defaultLandPage'
  requirements:
    _access: 'TRUE'

The next important part is the class that manages the theme switching, in ThemeNegotiator.php:

<?php
/**
 * @file
 * Contains \Drupal\land_page\Theme\ThemeNegotiator
 */
namespace Drupal\land_page\Theme;
use Drupal\Core\Routing\RouteMatchInterface;
use Drupal\Core\Theme\ThemeNegotiatorInterface;
class ThemeNegotiator implements ThemeNegotiatorInterface {
    /**
     * @param RouteMatchInterface $route_match
     * @return bool
     */
    public function applies(RouteMatchInterface $route_match)
    {
        return $this->negotiateRoute($route_match) ? true : false;
    }
    /**
     * @param RouteMatchInterface $route_match
     * @return null|string
     */
    public function determineActiveTheme(RouteMatchInterface $route_match)
    {
        return $this->negotiateRoute($route_match) ?: null;
    }
    /**
     * Function that does all of the work in selecting a theme
     * @param RouteMatchInterface $route_match
     * @return bool|string
     */
    private function negotiateRoute(RouteMatchInterface $route_match)
    {
        if ($route_match->getRouteName() == 'default_land_page')
        {
            return 'ek';
        }
        return false;
    }
}

When a visitor navigates the website, the theme negotiator checks whether the current route is the landing page route and, if so, switches to the appropriate theme.

To achieve this, you need to declare a service.

land_page.services.yml:

services:
    land_page.theme.negotiator:
        class: Drupal\land_page\Theme\ThemeNegotiator
        tags:
          - { name: theme_negotiator, priority: 1000 }

In Controller.php we create the function referenced in land_page.routing.yml, defaultLandPage():

/**
 * Default landing page.
 *
 * @return array
 */
public function defaultLandPage() {
  $items = [];
  $items['asset'] = drupal_get_path('module', 'land_page') . "/assets/";
  return array(
    '#theme' => 'land_page',
    '#items' => $items,
    '#title' => '',
    '#attached' => array(
      'library' => array('land_page/land_page'),
    ),
  );
}

In this function we define which theme template to use, pass some $items for content, and attach our library.

The theme template is defined in land_page.module:

/**
 * Implements hook_theme().
 */
function land_page_theme() {
  return array(
    // Default landing page template.
    'land_page' => array(
      'template' => 'land_page',
      'variables' => array('items' => array(), 'data' => array()),
    ),
  );
}

The library is defined in land_page.libraries.yml. The library is very important, as it defines all the custom CSS, JS, or external resources needed to render the page. A simple example of a library that includes custom CSS, JS, and fonts:

land_page:
  version: 1
  css:
    theme:
      //fonts.googleapis.com/css?family=Barlow:400,500,600&display=swap: { type: external }
      css/land_page.css: {}
  js:
    js/js.min.js: {}
    js/land_page.js: {}

In hook_theme(), the template called is "land_page", which is a Twig template under the templates folder: land_page.html.twig. In this template you will build your HTML content to render the actual landing page. This is where your creativity will start.
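As a minimal sketch (the markup and file contents here are illustrative, not from the actual module), land_page.html.twig might look like:

```twig
{# land_page.html.twig — hypothetical minimal landing page markup #}
<div class="land-page">
  <h1>{{ 'Welcome'|t }}</h1>
  {# 'items.asset' is the assets path passed in from defaultLandPage(). #}
  <img src="/{{ items.asset }}banner.png" alt="" />
</div>
```

Every variable declared in hook_theme() ('items' and 'data' in this case) is available in the template by name.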

One big advantage of Twig templates is that you can insert content from other Drupal sources, such as webforms or existing blocks, directly into the landing page.

Now you can install your module, navigate to the /land-page URL, and access your landing page content.

Mar 19 2020

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

The world is experiencing a scary time right now. People feel uncertain about the state of the world: we're experiencing a global pandemic, OPEC is imploding, the trade war between the US and China could escalate, and stock markets around the world are crashing. Watching the impact on people's lives, including my own family's, has been difficult.

People have asked me how this could impact Open Source. What is happening in the world is so unusual, it is hard to anticipate what exactly will happen. While the road ahead is unknown, the fact is that Open Source has successfully weathered multiple recessions.

While recessions present a difficult time for many, I believe Open Source communities have the power to sustain themselves during an economic downturn, and even to grow.

Firstly, large Open Source communities consist of people all around the world who believe in collective progress, building something great together, and helping one another. Open Source communities are not driven by top-line growth -- they're driven by a collective purpose, a big heart, and a desire to build good software. These values make Open Source communities both resilient and recharging.

Secondly, during an economic downturn, organizations will look to lower costs, take control of their own destiny, and strive to do more with less. Adopting Open Source helps these organizations survive and thrive.

Open Source continues to grow despite recessions

I looked back at news stories and data from the last two big recessions — the dot-com crash (2000-2004) and the Great Recession (2007-2009) — to see how Open Source fared.

According to an InfoWorld article from 2009, 2000-2001 (the dot-com crash) was one of the largest periods of growth for Open Source software communities.

Twenty years ago, Open Source was just beginning to challenge proprietary software in the areas of operating systems, databases, and middleware. According to Gartner, the dot-com bust catapulted Linux adoption into the enterprise market. Enterprise adoption accelerated because organizations looked into Open Source as a way to cut IT spending, without compromising on their own pace of innovation.

Eight years later, during the Great Recession, we saw the same trend. As Forrester observed in 2009, more companies started considering, implementing, and expanding their use of Open Source software.

Red Hat, the most prominent public Open Source company in 2009, was outperforming proprietary software giants. As Oracle's and Microsoft's profits dropped in early 2009, Red Hat's year-over-year revenue grew by 11 percent.

Anecdotally, I can say that starting Acquia during the Great Recession was scary, but ended up working out well. Despite the economic slump, Acquia continued to grow year-over-year revenues in the years following the Great Recession from 2009 to 2011.

I also checked in with some long-standing Drupal agencies and consultancies (Lullabot, Phase2, and Palantir.net), who all reported growing during the Great Recession. They attribute that growth directly to Drupal and the bump that Open Source received as a result of the recession. Again, businesses were looking to Open Source to be efficient without sacrificing innovation or quality.

Why Open Source will continue to grow and win

Fast forward another 10 years, and Open Source is still less expensive than proprietary software. In addition, Open Source has grown to be more secure, more flexible, and more stable than ever before. Today, the benefits of Open Source are even more compelling than during past recessions.

Open Source contribution can act as an important springboard for individuals in their careers as well. Developers who are unemployed often invest their time and talent back into Open Source communities to expand their skill sets or build out their resumes. People can both give and get from participating in Open Source projects.

That is true for organizations as well. Organizations around the world are starting to understand that contributing to Open Source can give them a competitive edge. By contributing to Open Source, and by sharing innovation with others, organizations can engage in a virtuous and compounding innovation cycle.

No one wants to experience another recession. But if we do, despite all of the uncertainty surrounding us today, I am optimistic that Open Source will continue to grow and expand, and that it can help many individuals and organizations along the way.

Mar 19 2020

It's a very sad week for us at Hook 42. COVID-19 has hit our clients and our families hard, and we regret to announce that we're closing up shop at the end of the month.

Hook 42 has been driven by our core values since we started in 2012: honesty, quality, community, ongoing improvement, work/life balance, and humor. We've loved being part of the open source community, in particular, the Drupal community all these years. We will always have fond memories of working with an amazing team, great clients, and wonderful community members.

Community


The biggest reason we focused on Drupal all these years was because of the awesome people in the Drupal community. We've had so many fun and interesting times at Drupal events around the world, both big and small. From so many DrupalCons to the Santa Cruz and SF Drupal User Groups to BADCamp to Stanford Web Camp, it's always been a joy to hang out with the community at BoFs, sessions, after-hour events, and our favorite "hallway track".

We sincerely hope that we've made a positive impact on the Drupal community. We have engaged in many ways, through speaking, organizing, volunteering, mentoring, training, and code contributions, because we believe in open source and we believe in actively participating in the community to make it a better place for everyone.

As individuals, we hope to continue to be involved in open source communities in the future and look forward to seeing your wonderful faces again. Please feel free to reach out to us individually to connect.

Clients & Partners

We've had some amazing clients and partners along the way and want to give a big thanks to those both past and current. We wish you all the best. Please keep in touch!

Team

We'll miss the Hook 42 team so much! We want to thank everyone on the team from the bottom of our hearts for being part of the Hook 42 family. We also sincerely thank all past "alum" team members as you are an essential part of our history.

The team has been an amazing bunch to work with, and you'd be lucky to work with them too. They are brilliant and supportive and always determined to do a great job. They go above and beyond to help each other, even now, during these difficult and trying times.

Here they are in alphabetical order with a short blurb that doesn't do them justice. Please reach out to them if you are looking for amazing team members. Contact us directly if you want any additional information. They all have our sincere endorsements.

Note: We've worked with many other great team members not listed here but we've focused on people who are either current employees or contractors who've recently worked 15+ hours/week for several months.

Chris Darke | Senior Developer

Chris is a veteran on the team and is open to working on almost anything. He's a versatile full stack developer who's got strong expertise in Drupal, React, Elasticsearch, and AWS microservices. When not geeking out, he's also keen on the outdoors and enjoys scuba diving, biking, surfing, and photography. We'll miss Chris' British accent and warm humor.

For more details, check out Chris' profiles: Hook 42 | LinkedIn | Drupal.org

Darryl Richman | Senior Developer

Darryl's been creating awesome Drupal sites since 2007 and has been with Hook 42 since 2013! With an eye for detail, he's a solid backend developer who's well versed in module development, site building, databases, and migrations. When not coding, Darryl is often traveling around the US and Europe on his motorcycle. We'll miss hearing about Darryl's grand adventures and getting his advice on beer and BBQ.

For more details, check out Darryl's profiles: Hook 42 | LinkedIn | Drupal.org

Ellen Doornbos | Developer

While one of our newer team members, we've been super impressed with Ellen's attention to detail, communication, and positivity. She's a certified accessibility expert and a great backend developer. She has helped our team with automation tools, migrations, audits, custom modules, and even some front-end theming. When not doing tech, Ellen can be found crafting or in the garden cultivating her wildflowers. We'll miss Ellen's earnestness and helpfulness.

For more details, check out Ellen's profiles: Hook 42 | LinkedIn | Drupal.org

Jason Flatt | Senior Developer

Jason has been with the team a long time and his attention to detail and dedication is amazing. He's a strong backend developer who can architect great solutions with Drupal or Backdrop, and has been working with Drupal since 2004. When Jason's not working, he's spending time with his family, or working on personal software projects because he enjoys software development so much. We'll miss Jason's dry humor and straightforwardness.

For more details, check out Jason's profiles: Hook 42 | LinkedIn | Drupal.org

Jonathan Daggerhart | Architect

Jonathan is a very experienced architect and full stack developer who can tackle any Drupal or WordPress project with ease. He can create solid modules, plugins, and themes of any complexity with smart architectures that follow best practices. Jonathan has been a great mentor to others on the team, taking personal interest in the growth and success of every team member who reaches out for help. When not doing tech, he enjoys role-playing games and watching movies. We'll miss Jonathan's Southern accent and charm.

For more details, check out Jonathan's profiles: Hook 42 | LinkedIn | Drupal.org

Joseph Flatt | Developer

Joseph is one of the youngest Drupal developers we've known, but don't let that fool you. He's a very experienced backend developer who started programming in his teens. Joseph is skilled with both Drupal and Backdrop. Joseph has a passion for improving website performance and solving Rubik's cubes. We'll miss hearing about Joseph's latest personal-record-breaking Rubik's cube solves in meetings.

For more details, check out Joseph's profiles: LinkedIn | Drupal.org

Kristen Littlefield | Project Manager

As one of the newest team members, Kristen dove in and picked up a bunch of client projects without missing a beat. She's a veteran PM, mostly in the web development space, whether on Drupal or WordPress websites, with a focus on digital accessibility. When not managing web projects, she gracefully manages a busy household and enjoys hosting football parties. We'll miss her compassion and fun-loving spirit.

For more details, check out Kristen's profiles: Hook 42 | LinkedIn

Melissa Kraft | Project Manager

Melissa joined the team when there was a huge need for project management, and she didn't even bat an eye. We were amazed at how little time it took to transition the projects to her, and at the feedback from clients on how impressed they were with her skills. Melissa has worked on web projects for many years, and it shows. She's also great at whipping up new recipes for her family and friends. We'll miss Melissa's candor and upbeat attitude.

For more details, check out Melissa's profiles: Hook 42 | LinkedIn | Drupal.org

Lindsey Gemmill | Senior UX Designer

Lindsey's title is Senior UX Designer, and she has amazing design, UX, branding, and accessibility skills (she's a Certified Web Accessibility Professional). If that isn't enough, she has also been our entire marketing department. She ran our social media, created landing pages, wrangled the team to write blog posts, created marketing roadmaps, rebranded the company, and much more. She's incredibly fast, creative, and proactive. When unplugging, Lindsey likes going to the beach and camping with her dogs. We'll miss Lindsey's can-do attitude and passion.

For more details, check out Lindsey's profiles: Hook 42 | LinkedIn | Drupal.org

Michelle Darling | Front End Developer

Michelle is our newest team member and in a very short time has shown us that her capabilities extend well beyond our expectations. Michelle showed off her serious front-end mastery by jumping right into projects that needed her expertise. Whether it's Drupal, WordPress, or Shopify, she's got the front-end covered while ensuring accessibility and usability are front and center. When not at work, Michelle can be found outdoors on a hike, reading, and enjoying music. We'll miss Michelle's determination and good attitude (and her great hair!).

For more details, check out Michelle's profiles: Hook 42 | LinkedIn | Drupal.org

Ryan Bateman | Architect

Ryan Bateman started with Hook 42 as a developer and quickly proved his chops, earning promotions to senior developer and then architect. He's a very skilled full-stack developer in Drupal, React, and Gatsby, among other web systems and frameworks. When not producing awesome websites or mentoring team members, Ryan can be found doing all-things-outdoors in Alaska, or even hosting his own radio show. We'll miss Ryan's conscientiousness and thoughtfulness.

For more details, check out Ryan's profiles: Hook 42 | LinkedIn | Drupal.org

Ryan Nelson | Director of Operations & Projects

Ryan Nelson has been an integral part of the team for a year. He is a master of calm and can talk with anyone about anything. With 20 years' industry experience under his belt, he helped with so many things that they're hard to list, but they include strategy, hiring, operations, HR, project management, process improvement, accounting, and people management. He's been the "go to" person on the management team for anything and everything. When not at work, Ryan enjoys traveling the globe (he has been to 5 continents!) and spending quality time with his wife. We'll miss Ryan's pleasantness and good humor.

For more details, check out Ryan's profiles: Hook 42 | LinkedIn | Drupal.org

Will Long | Senior Developer

Will's a very seasoned senior Drupal and PHP developer who you can throw any project at. Whether it's complex business logic, a difficult migration, or streamlining a CI environment, he thrives on solving difficult problems. When not in the tech world, Will enjoys hanging out with his family and friends. We'll miss Will's positivity and grit.

For more details, check out Will's profiles: LinkedIn | Drupal.org

What's Next?

As one door closes, others open. What's next for the amazing people that were part of the Hook 42 adventure?

Community

We were planning on going to DrupalCon Minnesota as a team. Sadly, that won't happen. But Hook 42 has four confirmed speakers: Jonathan, Ryan, Aimee & Kristen. Aimee & Kristen still plan to speak at DrupalCon, and we're hoping to meet up with Hook 42 alums at the event when it does happen. We have other events penciled in as well such as Stanford Webcamp, GovCon, and BADCamp, so we're sure we'll see you in the Drupalverse in the future.

If you are looking for speakers, mentors, volunteers, trainers, etc., ping us and we'll see what we can do!

Clients

We're placing some clients with the team members to continue projects. For other clients, we've been reaching out to web agencies who might be a good fit for future support.

If you are potentially interested in client introductions, please let us know and provide your current rate sheet or typical blended rate, so we can see if it aligns with current contracts. We will only introduce clients if it is a good fit for the client and for the agency or contractor.

Team

The team members above are open to new opportunities. We can vouch for them and they'll vouch for each other. Maybe grab a few and you'll have a ready-made team! :)

If you have contract or employee opportunities, you can also send them to us directly and we'll pass along the information, as we'll be in communication with the team in the coming weeks to help find them new work homes.

Aimee and Kristen

illustration of kristen and aimee

With Hook 42 going away, Aimee & Kristen will be spending time with their families and figuring out what the future holds. We could not have created such a wonderful company without the love and support of our friends and family. Thank you for the many hours of support you have given the both of us over the years. We are open to discussing opportunities!

Aimee is passionate about accessibility, multilingual sites, and web architecture. After a much-deserved break to care for herself and her family, and to dance, Aimee is looking forward to opportunities to follow her passions and support learning, growth, and community. She'll finally be back in the Drupal issue queues instead of mentoring and supporting others' contributions there; she's missed the hands-on work and loves the technology.

To connect, check out Aimee's profiles: Hook 42 | LinkedIn | Drupal.org

Kristen is planning for some much-needed rest and reconnecting with her kids, husband, and mom. Then, she's hoping to get back into the Drupal issue queues for a bit because she finds that fun! :) After recharging and contributing, she'll be looking for something different in the next chapter, and is open to ideas, however crazy they may seem!

To connect, check out Kristen's profiles: Hook 42 | LinkedIn | Drupal.org

Both of us hope to say "hi" in person to former Hook 42 and client team members as well as community members as soon as the quarantines are over!

so long and thanks for all the fish

FIN!

the word bye in 8 languages around the drupal drop
Mar 18 2020

One of the founders of Lullabot and former CEO, Jeff Robbins, used to joke that Lullabot has "built-in disaster recovery" because the employees are accustomed to working from just about anywhere. Lullabot, one of the first Drupal consulting companies, started in 2006 after Matt Westgate and Jeff Robbins met on Drupal.org. Drupal has been at the heart of Lullabot's work for more than 14 years, and the core of what Jeff suggested could apply similarly to the Drupal community.

As each of us negotiates a world where COVID-19 dominates the headlines and our everyday interactions, this article considers how some of the lessons that the Drupal community—perhaps an idealized Drupal community—has learned might shape our understanding of these times that feel so extraordinary. Drupal does not have a monopoly on any of these concepts, but in stressful times, similes and metaphors can help us interrogate our underlying assumptions and the communities that we have each constructed.

You Don't Have to Do Anything

Free software communities thrive when people contribute in the ways that feel comfortable to them rather than out of guilt. People support the Drupal community in a wide variety of ways, and we encourage people who choose to contribute to the project to have fun and enjoy the process of contributing. Sometimes this means the best choice is to take a step back and not contribute at all. The Drupal community is huge, with nearly 5,800 contributors to Drupal core alone, and it's okay for people to pause once in a while—or altogether—and let others step forward.

As COVID-19 spreads through the world, and the world works together to slow the progress, sometimes the best option is for us to stay home. This recommendation goes against the natural human urge to fix things, but we can bring to mind the fact that, with Drupal and viruses alike, we simply can't fix everything. No one of us can "fix" the more than 95,000 open issues in Drupal core any more than we can "fix" the very real devastation caused by COVID-19. You can contribute, or you can do nothing, and the world will continue without you. There is no reason to feel guilty about taking a break and pausing to examine what is important to your life.

Honor Your Family

Historically, the Drupal community has supported people who have needed to take a break and focus on themselves or their families. From daily interactions to the highly-visible gestures of support, such as when Aaron Winborn needed it, members of the Drupal community have offered countless acts of kindness.

In a recent example from just weeks ago, before most of us had ever heard of COVID-19, our friend, colleague, and long-time Drupal contributor, Jerad Bitner, needed help after his wife received a diagnosis of Stage 4 Brain Cancer. Jerad and his family have received assistance from people in all areas of their lives, and it was especially heartening to see so many people from the Drupal community among the impressive list of supporters.

While the Drupal community may seem like it exists and organizes itself primarily on the web, in a "socially distant" manner, it can present itself in very human and sincere ways when our members need assistance. Take the time to focus on yourself and your family during this period of uncertainty and take comfort in the fact that the Drupal community has a remarkable capacity to support its members in times of need.

Get Off the Drupal Island

Especially since Drupal 8, the Drupal community has learned about the benefits of drawing from other communities. When we partner with others "off the island," we can save ourselves a lot of work.

Likewise, we can take what we have learned to help others. For those of us with the good fortune to have a job working for one of the many Drupal agencies with "built-in disaster recovery," we have a unique chance to help others. We can use our technical knowledge of online collaboration tools, microphones, cameras, and more to act as resources to those with less technical experience.

We don't have to go far to get off the "island." All around the world, meditation centers, yoga studios, churches, synagogues, and other places people seek during stressful times are scrambling to transform their services models and move them online. We in the Drupal community have an opportunity to volunteer our skills and knowledge to support organizations like these, both non-profits and for-profits. These are not business opportunities, but rather opportunities to help our neighbors. We can share our recommendations about open-source options for collaborating online, such as OBS Studio and Jitsi Meet. Perhaps our station in life allows us to donate money to organizations that need help, such as the food shelves that provide food and groceries to kids in areas where schools have closed.

Or we can put our Drupal skills directly to use. For instance, you might feel especially appreciative of your local public media organization for bringing you impartial news at this time. Many public radio and television stations use Drupal. Without needing access to their entire codebase or infrastructure, you could ask them if any contributed modules need features they will use, bugs they need fixing, or other ways to help that match your skillset.

Because of the prevalence of Drupal among non-profit, non-governmental, and community organizations, there are many opportunities to contribute directly to local organizations doing good. Now that seemingly every in-person conference, vacation, user group, and other regular meeting on the schedule has been canceled, we might be looking for activities to fill those hours. The chances are high that organizations and businesses in our communities, the ones important to our daily lives, are struggling to find a way forward, and they might welcome unsolicited offers to help.

We don't need rock stars

In both the Drupal community and our local communities, our capacity to bring about change can feel limited. Anyone who has contributed to Drupal likely knows that the process of getting things done in the community can sometimes take a lot of time, effort, and discussion. We progress one patch at a time. Often it would be much simpler just to make a change to fit one specific use case, but in the Drupal community, we have learned that we need to work together and create consensus around ideas. We realize that we are stronger together and that sharing code feels so much better than hoarding code. Getting code into Drupal is rarely about maximizing revenue, but rather contributing to something bigger.

During a pandemic, the same mindset applies. Social distancing might feel challenging, but it's an act of compassion that benefits others. We help in the ways that feel genuine, not forced. We don't need rock stars and hoarders. We need just enough people to work toward more manageable, short-term goals. Thus, joining a group rather than going it alone can help make your otherwise small contributions feel more significant.

Do Your Homework

Through experience working on Drupal sites, we realize that many of the problems we face are already solved. We don't assume every problem is a bug in Drupal core. We don't assume that the problem with our Drupal site is unique. We encourage the person with a question to "do your homework." We look for others who have encountered similar problems and learn from those who are kind enough to share their solutions.

The argument that we live in exceptional times, while accurate in the short term, does not reflect a broader view of history. Everywhere in human history, people have been affected by violence, war, injustice, widespread fear, and, yes, disease. Our seemingly exceptional problems, which cause real suffering, are variations on similar historical problems. The 1918 flu pandemic, for instance, killed 50 million people. Understanding and connecting to past events can help reduce the sense of exceptionalism that we all feel. In history, we find people who overcame fear and redirected their focus from helping themselves to helping others. As we become more socially distant in this current reality, we can connect to people online and in the past who have encountered problems like ours. We can also see how problems in the past always come to an end, even if they later reappear.

Your Code Won't Last Forever

The Drupal codebase and community, like everything else in the world, changes constantly. Our prized contributions get replaced. With software, it can be easier to accept the fact of change. We have learned that the point in time when we know everything about Drupal will never arrive. For as long as it can take to get a patch into Drupal core, it can simultaneously feel like Drupal moves at a breakneck speed. The list of completed and in-process strategic initiatives just for Drupal 8 is long, and Drupal 9 will arrive before we know it. We have learned to accept the fact that we need to learn continually, all of our contributions to Drupal will eventually be replaced, and change in the Drupal community is inevitable.

Similarly, the cozy worlds that some of us had grown to inhabit now feel threatened. We live in a society that rarely admits the inevitability of sickness and death, and yet both are guaranteed in life. The world, like Drupal, is always changing, and after our initial reactions begin to subside, we can choose how we respond to these ever-changing circumstances. We will each find our way to negotiate these always-changing realities. Some days will be sunny, and others will not.

Ask For Help

The current state of reality might feel overwhelming, but in the Drupal community, our response is to encourage people to ask for help. The Drupal software can feel like a complex, unknowable beast. We have learned to find others with more knowledge in a particular area than we do. We practice acts of kindness when we first look for answers by ourselves before asking others for help. Sometimes we work really hard on a problem and do everything we can before we "bother" another community member. In the Drupal community, we regularly practice a version of "social distancing" out of respect for the other people in our community. But at some point, we must ask for help, and significant relief can follow when the recipient of our question seems happy to offer assistance.

As we find our way through this new (and temporary) reality, we have many options: do nothing, offer help, connect with friends and family, connect our experiences with historical events, dig into the Drupal codebase, ask for help when necessary. None of these responses is incorrect. We can imagine the ways that Drupal can help. Even better, we can stop merely imagining better worlds and embrace this reality by finding activities, words, and thoughts that reduce our struggles and the struggles of the people around us. When you notice that something you are doing is not helpful, consider shifting your efforts. The Drupal software will continue to evolve, and we can, too.

Mar 18 2020

Automatic website updates are incredibly convenient, whatever CMS your site is built on.

Unfortunately, the automation of Drupal website updates used to be impossible. But now we have great news for all Drupal 7 and Drupal 8 website owners — automatic website updates are already here! Read on to discover more details.

The most desired Drupal feature: benefits of auto updates

When asked what they would like to see in Drupal, website owners and users have always mentioned automatic updates. In the long-term competition between Drupal and WordPress, the latter had this trump card up its sleeve, even considering certain risks with unattended processes.

Every Drupal website owner or admin regularly sees a frustrating warning on the dashboard that a new release is available, yet leaves it alone because the update procedure can be cumbersome. What if all of this could be done automatically? That would give website owners unquestionable benefits such as:

  • being able to easily keep up with the Drupal release cycles on their own
  • never having to worry about security updates
  • never having to deal with Composer, which is a bit cumbersome for users

Automation was badly needed to make Drupal more user-friendly, safer thanks to timely security updates, and more competitive in the market. Automatic updates became one of the strategic initiatives highlighted by Drupal’s creator, Dries Buytaert, in the “State of Drupal” presentation. The Automatic Updates initiative now celebrates great progress that we can describe!

Drupal automatic updates as one of strategic priorities

How Drupal automatic updates work

There is a new contributed module headed for inclusion in Drupal core — Automatic Updates. It is meant to auto-update Drupal as simply and as cleanly as possible. When you see the update steps, you will notice they include plenty of measures for cleanliness and safety. Let’s take a closer look at how it performs both Drupal 7 and Drupal 8 automatic updates.

The key automatic Drupal update steps:

  • Displaying the security release announcements

The new module will notify you several days in advance about public service announcements (PSAs) for core and contributed modules. The respective notices will be posted on the admin dashboard. This is implemented as a PSA.json feed from drupal.org.

  • Making sure your website is ready for the update

Next, the module checks whether your website is ready for a smooth update and that nothing could interfere with it. These checks run during Drupal Cron. After the check, it displays errors explaining what must be fixed, or shows warnings. Here are some examples of issues that can be listed:

  1. the site uses a read-only filesystem
  2. ongoing database updates
  3. insufficient disk space

  • Performing the in-place Drupal update

Here comes the key step in the automatic update process — the actual update. The module downloads a ZIP archive from drupal.org. For security reasons, the archive is hashed and signed. With the help of the Libsodium library, its signature is verified to make sure it is the official drupal.org archive. Next, all files affected by the update are backed up. Finally, they are overwritten with the new ones using PHP's copy function.

  • Do automatic website updates need your interference?

The release announcements and website readiness checks are all automatic. The actual update part depends on your choice:

  1. you can manually start the update on the module configuration page
  2. or you can check the box on the same page that allows automatic updates to run via Cron

  • The current state of automatic Drupal updates

Right now, automatic updates are in active development, and the team keeps adding features. The module currently focuses on automatic Drupal core updates for security releases. Other types of updates, support for contributed modules, and better Composer integration are on its roadmap.

In order to be added to Drupal core, the module needs a feature called the A/B front-end controller. It will be able to swap between the two codebases and fall back to the backed-up one if something goes wrong during an automatic update, which will add even more safety and reliability to the process.

Ask us for help with automatic website updates

If you are impressed with this and are ready to try the automatic website update feature, we encourage you to contact our Drupal support and maintenance team.

  • We will help you install and configure the Automatic Updates module and resolve all the update-hampering issues it may discover on your website.
  • As an option, we can let you know when the module reaches Drupal core and update your website so it is ready.

Enjoy smooth, easy, and safe automatic Drupal updates!

Mar 18 2020
Project: Drupal core
Version: 8.8.x-dev, 8.7.x-dev
Date: 2020-March-18
Security risk: Moderately critical 13∕25 AC:Complex/A:User/CI:Some/II:Some/E:Proof/TD:Default
Vulnerability: Third-party library
Description:

The Drupal project uses the third-party library CKEditor, which has released a security improvement that is needed to protect some Drupal configurations.

Vulnerabilities are possible if Drupal is configured to use the WYSIWYG CKEditor for your site's users. An attacker that can create or edit content may be able to exploit this Cross Site Scripting (XSS) vulnerability to target users with access to the WYSIWYG CKEditor, and this may include site admins with privileged access.

The latest versions of Drupal update CKEditor to 4.14 to mitigate the vulnerabilities.

Solution: 

Install the latest version:

Versions of Drupal 8 prior to 8.7.x have reached end-of-life and do not receive security coverage.

The CKEditor module can also be disabled to mitigate the vulnerability until the site is updated.

Note for Drupal 7 users

Drupal 7 core is not affected by this release; however, users who have installed the third-party CKEditor library (for example, with a contributed module) should ensure that the downloaded library is updated to CKEditor 4.14 or higher, or that CDN URLs point to a version of CKEditor 4.14 or higher. Disabling all WYSIWYG modules can mitigate the vulnerability until the site is updated.

Mar 17 2020

Current realities are rapidly shifting for all of us. What to do now? What can we expect? 

During a time of crisis, the quality of communications can have a huge impact, and not just in the moment. The effects of what is said and what is not said will linger, and reveal much about the organization, its leadership, and individuals involved. 

A crisis and challenge on the magnitude of a global pandemic stands to bring out the best in us or the worst in us. There are the immediate concerns surrounding staying safe and helping to ensure the health and well-being of those around us. Then, of course, there are a myriad of business and financial concerns. Life goes on. Promises still need to be kept.

At Promet Source, we are committed to reaching out and being our best selves during this time. It’s a commitment that starts at the top with clear and honest communications.  

This commitment calls for a higher degree of empathy and outreach. What are people going through? What do they need to know? How can we help?

Empathy PLUS Honesty

During times of uncertainty, people look for answers and reassurance, and it’s tempting to want to provide that. Keep in mind that making promises and commitments that may not be kept is likely to be interpreted as dishonesty and deception, both of which are particularly difficult to forgive. Now is the time for integrity.

As businesses around the world scramble to meet obligations and keep teams productive and connected while working from remote locations, the right messaging matters more than ever before. 

Great communication also calls for a focus on connectedness: focusing on what it is that we have to offer and how that translates into our role as corporate citizens.

Resources and Expertise

At Promet Source, we’ve reached out to our sphere recently, pointing out that our team is distributed and that due to our well-established business processes for working remotely, we’re positioned to consult on the systems and technical tools that we’ve found to be the most effective for staying connected and optimizing productivity. 

Our communications team can also serve as a critical resource during this trying time, helping with strategic messaging and communications plans for optimizing culture and strengthening client relationships.

New Concerns, New Paradigms

It's important that we actively acknowledge that team members are facing new sets of challenges that extend beyond working at home. In many cases, school-aged children are at home. There are likely to be constant interruptions and new expectations for homeschooling. 

There are also big concerns about coronavirus symptoms and the impact of recent exposures. 

Even among those who remain healthy and symptom-free, let there be no doubt that this pandemic is having a major impact on everyone’s lives. Team members with high-stakes deliverables who are now facing a whole new set of realities need to know that they are not alone. 

These new realities call for new ways of connecting, being present, and adding value. It’s a challenge that few could have anticipated, but that many are working through and revealing considerable character in the process.

Did you know that Promet’s communication team can help with strategic communications support? Contact us today to schedule a workshop, or simply to start a conversation. 


Mar 17 2020

As I said in the previous post, over these months I will be playing with migrations, preparing some cases for a future (I hope) book. During these days of confinement, I intend to keep publishing small articles here to share experiences related to migrations.

In the previous post, I wrote about Drupal migrations from the perspective of assembling a toolbox: a set of basic resources for approaching a migration.

There’s a lot of information to process, and plenty of concepts, techniques, and tactics involved in resolving a migration, you can be sure. So this month I want to write something that lets me play with migrations in a way that is more practical than theoretical.

This article was originally published at https://davidjguru.github.io
Picture from Unsplash, user Émile Séguin, @emileseguin

Table of Contents

1- Introduction
2- Arrangements
3- Approach
4- Migrations
5- Key Concepts
6- Resources
7- :wq!

This article is part of a series of posts about Drupal Migrations

1- Thinking about Drupal Migrations (I): Resources
2- Thinking about Drupal Migrations (II): Examples

1- Introduction

The Drupal Migrate API can be one of the most interesting APIs, but also one of the most complex, since its activities often involve classes and methods from other Drupal APIs (which makes it especially tricky to debug). In any case, since the number of concepts can be overwhelming, I think we can practice the mechanics of migration through a couple of exercises.

Well, for this article I set out to model two different migration processes, from a point of view that could be summarized as “primum vivere, deinde philosophari” (first you experiment, then you theorize). This is why I have decided to organize it in a particular way:

  • The first thing to say is that the two processes are divided into sections common to both, and instead of finishing one and then starting the next, both run in parallel (you choose your own adventure).

  • Then, only at the end of this post will you find some key concepts used in this article. First we'll play with the structures, then we'll understand them.

So, in the next steps, we’ll work through two specific exercises:

  1. Migrating data from an embedded format (perhaps the simplest example of a Drupal migration).

  2. Migrating data from a classic CSV file (just a little more complex than the previous example).

Both cases are perhaps the most basic scenarios for a migration, so I recommend this article to anyone who wants to get started with the mechanics, as a practical complement for getting into Drupal migrations.

2- Arrangements

First case: Migrating embedded data

For our first case we will need, on the one hand, to enable the Migrate module of the Drupal core, and on the other hand, to download and install a contributed module to be able to manage migrations.

From the different options we have, we are going to choose migrate_run, which we already mentioned in the previous post and which could be interpreted as a light version of migrate_tools (although it’s actually a fork of that project): both of which provide Drush commands to run migrations, so if you have migrate_tools installed you must uninstall it to avoid a collision with migrate_run.

As a curious note, the first lesson here is that neither migrate_plus nor migrate_tools is a “hard” dependency for running Drupal migrations; that is, we can implement migrations without having these modules enabled in our Drupal installation.

By the way, it’s important to know that migrate_run is optimized for Drush 9 and later. If you use Drush 8, you will need an adapted version, such as alpha4, which still targeted Drush 8.

Using Composer and Drush:

composer require drupal/migrate_run
drush pmu migrate_tools # If needed
drush en migrate migrate_run -y
drush cr

Using Drupal Console:

composer require drupal/migrate_run
drupal mou migrate_tools # If needed
drupal moi migrate migrate_run

And you will see in the path /admin/modules:

Enabling Migrate and Migrate Run modules

Building the resources

Now, we’re going to create a new custom module for our first Migration:

cd project/web/modules/custom
mkdir migration_basic_module

Then, the migration_basic_module.info.yml file with content:

name: 'Migration Basic Module'
type: module
description: 'Just a basic example of basic migration process.'
package: 'Migrations Examples 2000'
core: 8.x
dependencies:
  - drupal:migrate

Create the new migration definition file with path: /migration_basic_module/migrations/basic_migration_one.yml.

In our new declarative file basic_migration_one.yml, which describes the migration as a list of parameters and values in a static YAML file, we will include the embedded data of two nodes of the “Basic page” content type to be migrated, loading only two values:

  1. A title (a text string).
  2. A body (text based on the ChiquitoIpsum generator*, http://www.chiquitoipsum.com).

*Chiquito de la Calzada was a legendary comedian and a national figure in Spain.

basic_migration_one.yml

id: basic_migration_one
label: 'Custom Basic Migration 2000'
source:
  plugin: embedded_data
  data_rows:
    -
      unique_id: 1
      page_title: 'Title for migrated node - One'
      page_content: 'Lorem fistrum mamaar se calle ustée tiene musho pelo.'
    -
      unique_id: 2
      page_title: 'Title for migrated node - Two'
      page_content: 'Se calle ustée caballo blanco caballo negroorl.'
  ids:
    unique_id:
      type: integer
process:
  title: page_title
  body: page_content
destination:
  plugin: 'entity:node'
  default_bundle: page

And this will be the structure of the new custom module for basic migration example:

/project/web/modules/custom/  
                     \__migration_basic_module/  
                         \__migration_basic_module.info.yml  
                             \__migrations/  
                                 \__basic_migration_one.yml  

Enabling all the required modules using Drush:

drush pm:enable -y migrate migrate_run migration_basic_module
drush cr

Or using Drupal Console:

drupal moi migrate migrate_run migration_basic_module

Second Case: Migrating from csv files

For this second case we are going to deactivate migrate_run (if applicable) and activate the superset of modules: migrate, migrate_plus and migrate_tools. Besides, for the treatment of CSV files we are going to use a Source Plugin stored in a contrib module called Migrate Source CSV migrate_source_csv. This contrib module in its version 3.x is using league/csv for processing CSV files. Ok, let’s go. So using Composer + Drush:

composer require drupal/migrate_plus drupal/migrate_tools drupal/migrate_source_csv
drush pmu migrate_run # If you need 
drush en migrate migrate_plus migrate_tools migrate_source_csv -y
drush cr

So, now in the path /admin/modules/:

Enabling Migrate and Migrate Plus Migrate Tools

Building the resources

We’re going to create another new custom module for our second Migration:

cd project/web/modules/custom
mkdir migration_csv_module

With a new migration_csv_module.info.yml file:

name: 'Migration CSV Module'
type: module
description: 'Just a basic example of basic migration process with a CSV source.'
package: 'Migrations Examples 2000'
core: 8.x
dependencies:
  - drupal:migrate
  - drupal:migrate_tools
  - drupal:migrate_plus

In this example we’re going to require a declarative file of the migration too (as in the previous case) but with the exception that we’re going to locate it in a different place. This will be placed in the /migration_csv_module/config/install/ path.

The structure will look like this just now:

/project/web/modules/custom/  
                     \__migration_csv_module/  
                         \__migration_csv_module.info.yml
                          \__csv/
                               \_migration_csv_articles.csv
                           \__config/
                                \__install/
                                     \__migrate_plus.migration.article_csv_import.yml

So we need a csv with original data to migrate. It’s easy to solve this using web tools like Mockaroo, a pretty good random data generator. I’ve created a CSV file with some fields like: id, title, body, tags, image. Download it from here. This file will be our datasource for the Migration process. Ok, by now create the directories for the module and put the new custom CSV in the /csv path:

CSV Migrate module structure

And now, our migrate_plus.migration.article_csv_import.yml file (In later sections we will explain its construction and sections):

uuid: 1bcec3e7-0a49-4473-87a2-6dca09b91aba
langcode: en
status: true
dependencies: {  }
id: article_csv_import
label: 'Migrating articles'
source:
  plugin: csv
  path: modules/custom/migration_csv_module/csv/migration_csv_articles.csv
  delimiter: ','
  enclosure: '"'
  header_offset: 0
  ids:
    - id
  fields:
    -
      name: id
      label: 'Unique Id'
    -
      name: title
      label: Title
    -
      name: body
      label: 'Post Body'
    -
      name: tags
      label: 'Taxonomy Tag'
    -
      name: image
      label: 'Image Field'
process:
  title: title
  body: body
  tags: field_tags
  image: field_image
  type:
    plugin: default_value
    default_value: article
destination:
  plugin: 'entity:node'

Okay, we now have all the resources we need to create our new migration. Now let’s see how we approach the process.

3- Approaches

We’re going to describe the different approaches that we will apply to our example cases, in order to understand them better.

First case: Migrating embedded data

In this first case, we considered making the lightest possible case of migration in Drupal: Only two nodes with two basic fields each under an embedded format: the lightest possible.

Also, in this example we are going to use for the three ETL phases of the migration (Extract, Transformation and Loading) processing plugins already provided by Drupal (we will not develop any custom plugin). If you don’t know anything about the concept of Migration Plugins, please stop by for a moment and back here to read a little introduction to the topic.

To make things lighter, we will keep the “lite” version of Migration Tools, Migrate Run. Besides, we will only use the basic commands without any other options or complementary parameters, only with the basic argument of the migration file identifier.

Second Case: Migrating from csv files

For this execution, I would like to play with something pretty interesting&mldr;due to we’ll running this second migration example as configuration, I was thinking that will be funny do the inverse road&mldr;Yes, I propose not to install (activate, drush enable) the new custom module for CSV and leave it&mldr;only as storage for the CSV file.

Let’s move and run the migration from somewhere else. Surprise. Visit the path /admin/config/development/configuration/single/import into your Drupal installation and we’ll see there!.

4- Migrations

First case: Migrating embedded data

Getting info about the available migrations

drush migrate:status
drush ms

Output from console:
----------------- -------- ------- ---------- ------------- --------------------- 
  Migration ID      Status   Total   Imported   Unprocessed   Last Imported        
----------------- -------- ------- ---------- ------------- --------------------- 
basic_migration_one   Idle     2       0          2               
----------------- -------- ------- ---------- ------------- --------------------- 

Running migrations

drush migrate:import basic_migration_one
drush mi basic_migration_one  

Output from console:
----------------- -------- ------- ---------- ------------- ------------------- 
  Migration ID      Status   Total   Imported   Unprocessed  Last Imported        
----------------- -------- ------- ---------- ------------- ------------------- 
basic_migration_one   Idle     2       2 (100%)   0            2020-03-17 23:19:36  
----------------- -------- ------- ---------- ------------- ------------------- 

And so, going to the path /admin/content you’ll see the two new nodes:

Drupal Basic Migration Embedded Data

Rollbacking migrations (undoing)

drush migrate:rollback basic_migration_one
drush mr basic_migration_one  

Output from console: 

[notice] Rolled back 2 items - done with 'basic_migration_one'

Drupal Basic Migration Commands

Second Case: Migrating from csv files

Well, now in the path /admin/config/development/configuration/single/import we have to import our new custom migration definition file, Ok?

Loading the migration config data

Just go to Import -> Single Item, select the configuration type as “Migration” and paste the content of the original migration file:

Drupal Migration load File by Config

Click The “Import” button and the new Config object will be created in the Config System.

And now?

Running the migration

With the Migration file under the Config management, you can run the process with the same tools as in the former case. Now, we have available a new migration that we can run from console: drush migrate:status

Drush Migrate Status

Now you can execute the migration with: drush migrate-import article_csv_import And all the new nodes will be created. The limit? well, tags and image not will be migrated, cause tag is an entity reference and image is not a link, is a file, and both types must use some differents Plugins&mldr;but we’ll talk about this in future posts.

Drush cex / Drush cim

With the migration under the config system, now you can edit, import and export the migration using the basic resources from Drush. For example, testing drush cex:

Drush Cex

As you can see, the Config System has directly put the new migration file under the management of Migrate Plus and It has performed some actions, such as: renamed the file by placing migrate_plus.migration as a prefix in the file name or added a new file for group (only a way to group migration processes).

Remember the name of the file? It’s just the same that we were using in the /config/install directory, the so-called migrate_plus.migration.article_csv_import.yml. We’ve done exactly the same process, but from a different direction. Are you impressed? No? Do you find it interesting?

Remember also that with this config file, you can use drush cim and load the migration in any other Drupal (with access to the CSV file as datasource, indeed).

Thus we have migrated some 102 new nodes using two different approaches and different methodologies. Not bad.

5- Key Concepts

Migration Plugins

Ok, It’s very important so we have to repeat one more time the same song&mldr;You must to know the Plugin Format and the diverse world of the existing Migration Plugins.

Every Plugin points to a specific data type, a specific format or a different source. You should know the main ones very well and also investigate those you may need, since in migrations they are used extensively. Because of this, for example, we have not been able to migrate taxonomy terms or images in the second case from the CSV file as datasource.

Let’s see the Plugins involved in these two migrations, watching its descriptive files:

Basic Embedded Migration

source:
  plugin: embedded_data
  data_rows:
         ...
process:
  title: creative_title
  body: engaging_content
destination:
  plugin: 'entity:node'
  default_bundle: page

We’re using for extract data from the source the Embedded Data Plugin, a PHP class available in /web/core/modules/migrate/src/Plugin/migrate/source/EmbeddedDataSource.php where in its annotations block you can see some configuration keys that you can use in your migrate file:

 *
 * Available configuration keys
 * - data_rows: The source data array.
 * - ids: The unique ID field of the data.
 *

And data_rows and ids are the keys that we’re using in our migration description file. Read more about the EmbeddedDataSource class in Drupal.org API.

Now, watching the process block and looking for&mldr;where’s the Processing Plugin? Well I think this might be interesting&mldr;usually, all the field mappings in a processing block requires a process plugin for each. Then, with some of “sintactic sugar”, the Migrate API offers a way to reduce and simplify this: if no specific treatment is required for each field, then a single Plugin can take care of all the processing. This “default” Plugin may also be implicit, so that in the absence of a declaration, the Drupal Migrate API will always apply the same Processing Plugin by default.

This “implicit” and by-default Plugin is the Get class and is provided as the basic solution in processing fields. You can find the Get class in the path /web/core/modules/migrate/src/Plugin/migrate/process/Get.php. Read more info about the Get.php class in Drupal.org API. So actually, what we are saying in a complementary way is that is the same thing write:

process:
  title: page_title

as this other:

process:
  title:
    plugin: get
    source: page_title

And so life is a little simpler, isn’t it? Remember: in the absence of a processing plugin declaration for a field, Drupal will apply the “Get” plugin by default.

Ok and finally, for destination we’re using the Entity General Plugin with param “node”, in order to create diverse elements with node type and for bundles “page”. This calls to the Destinatio Plugin Entity.php, abstract class in path: web/core/modules/migrate/src/Plugin/migrate/destination/Entity.php and get its own derivative Plugin. Read more about derivative Plugins in Drupal and read about the Entity.php destination Plugin or the derivative migration class.

CSV datasource Migration

I think that the review of the plugins in this case could be easier and more intuitive.

source:
  plugin: csv
 ...
 process:
 ...
   type:
     plugin: default_value
     default_value: article
 destination:
   plugin: 'entity:node'

For the source, the CSV Plugin, from the migrate_source_csv contrib module. For processing, by default is using Get and for type the Default Value Plugin. For destination, the same plugin as the previous migration: new entities.

Migration as code or as configuration

As you could see, we have treated each migration process differently. The first process (Embedded Data) has been treated as part of the “code”, without any further particularities.
But the second process has been treated as a configuration element of the system itself, making it part of the config/install path, which will create a new configuration object from the installation.

In both cases you write the migration definition in a YAML format and then you put the migration file in a place or another. But there are more differences&mldr;Let’s make a little summary of these keys:

  • Migration “as code” is provided out of the box, but the module “Migrate Plus” allows you treating the file as a configuration object.

  • Depending on which approach you use, the location of the files and the workflow will differ:

    • As code, in order to make changes to the migration definition you’ll need access to the file system and manage the migration file as a code file, something developers-oriented.

    • As configuration, you’ll can do changes to the migration definition file using the config sync interface in Drupal, path: /admin/config/development/configuration, in addition to being able to use configuration export/import tools: drush cex, drush cim, cause now you sync the migration (the migration file will be saved in database). This means that you can write, modify, and execute migrations using the user interface. Big surprise.

    • As a configuration object, now your migration file will be create a new configuration registry in your Drupal Config System, and keep it alive also when your migrate module will be disabled. To avoid this and delete the config, put your own custom module as a new dependency of the migration in your migration description file.yml, so the migration will be deleted from Drupal’s Active Config just in this moment:

  dependencies:
    enforced:
      module:
        - my_own_migration_custom_module

  • Another change is that now, in a config-way, your migration file needs a UUID, just a global identifier for the Drupal Config Management System. Add at first line an unique and custom UUID for your file, to facilitate the processing of the configuration. Remember: UUID is a string of 32 hexadecimal digits in blocks of 5 groups using the pattern: 8-4-4-4-12. Make your own!
    uuid: cacafuti-1a23-2b45-3c67-4d567890a1b2.

6- Resources

Download, play and test the different resources using along this post. I uploaded to Github ready to use.

  1. Basic Migration File, basic_migration_one.yml, available in Github as Gist.

  2. CSV Migration File, article_csv import.yml, available in Github as Gist.

  3. CSV Source File with random data, Gist in Github.

  4. Codebase of the two migration modules (basic and csv), Available in Github. This will be a central repository for all the modules of this series of posts about Migrations, so get the direct link to these two examples:

  5. In parallel to this series of articles I’m also publishing a series of snippets in Gitlab under the topic “Migrations”, with a more simplified format, less verbose. Here you can access to the first snippet and get links to the rest of the series. Drupal Migrations Tips (I): Creating a new basic migration structure.

7- :wq!

[embedded content]

Mar 17 2020

As I said in the previous post, during these months I will be playing with migrations, preparing some cases for a future (I hope) book. Well, during these days of confinement, I intend to keep publishing small articles here to share experiences related to migrations.

In the former post, I wrote about migrations in Drupal from a tooling point of view: the search for a toolbox, a set of basic resources with which to approach a migration.

There's a lot of information to process about the topic, and plenty more concepts, techniques and tactics involved in resolving a migration, you can be sure. So this month I want to write something that lets me play with migrations, more practical than theoretical.

This article was originally published in https://davidjguru.github.io
Picture from Unsplash, user Émile Séguin, @emileseguin

Table of Contents

1- Introduction
2- Arrangements
3- Approaches
4- Migrations
5- Key Concepts
6- Resources
7- :wq!

This article is part of a series of posts about Drupal Migrations:

1- Drupal Migrations (I): Basic Resources

2- Drupal Migrations (II): Examples

1- Introduction

The Drupal Migration API can be one of the most interesting APIs in Drupal, but also one of the most complex, since its activities are often related to classes and methods of other Drupal APIs (which makes it particularly tricky to debug). In any case, as the amount of concepts can be overwhelming, I think we can practice the mechanics of migration through a couple of exercises.

For this article I proposed to model two different migration processes, under a point of view that could be summarized as “primum vivere, deinde philosophari” (first you experiment, then you theorize). This is why I have decided to organize it in a particular way:

  • The first thing to say is that the two processes are divided into sections that are common to both, and instead of finishing one and then starting the next, both advance in parallel (you choose your own adventure).

  • Then, only at the end of this post will you find some key concepts used in this article. First we'll play with the structures, then we'll understand them.

So, in the next steps, we'll work through two experiences:

  1. Migrating data from an embedded format (perhaps the simplest example of a Drupal migration).

  2. Migrating data from a classic CSV file format (just a little more complex than the previous example).

Both cases are perhaps the most basic scenarios for a migration, so I recommend this article to those who want to get started with its mechanics, as a practical complement for getting into Drupal migrations.

2- Arrangements

First case: Migrating embedded data

For our first case we will need, on the one hand, to enable the Migrate module of Drupal core, and on the other, to download and install a contributed module to be able to manage migrations.

From the different options we have, we are going to choose migrate_run, which we already mentioned in the previous post and which could be interpreted as a light version of migrate_tools (although it's actually a fork of that project): both provide Drush commands to run migrations, so if you have migrate_tools installed you must uninstall it in order to avoid collisions with migrate_run.

As a curious note, the first lesson here is that for running Drupal migrations, neither migrate_plus nor migrate_tools is a “hard” dependency; that is, we can implement migrations without having these modules enabled in our Drupal installation.

By the way, it's important to know that migrate_run is optimized for Drush 9 and later. If you use Drush 8 you will have to use an adapted version, like the alpha4 release, which was still prepared for Drush 8.

Using Composer and Drush:

composer require drupal/migrate_run
drush pmu migrate_tools # If you have it installed
drush en migrate migrate_run -y
drush cr

Using Drupal Console:

composer require drupal/migrate_run
drupal mou migrate_tools # If you have it installed
drupal moi migrate migrate_run

And you will see in the path /admin/modules:

Enabling Migrate and Migrate Run modules

Building the resources

Now, we’re going to create a new custom module for our first Migration:

cd project/web/modules/custom
mkdir migration_basic_module

Then, create the migration_basic_module.info.yml file with this content:

name: 'Migration Basic Module'
type: module
description: 'Just a basic example of basic migration process.'
package: 'Migrations Examples 2000'
core: 8.x
dependencies:
  - drupal:migrate

Create the new migration definition file at the path /migration_basic_module/migrations/basic_migration_one.yml.

In our new declarative file basic_migration_one.yml, which describes the migration as a list of parameters and values in a static YAML file, we will include the embedded data of two nodes of the “basic page” content type to be migrated, loading only two values:

  1. A title (a text string).
  2. A body (A text based on the ChiquitoIpsum generator*, http://www.chiquitoipsum.com).

*Chiquito de La Calzada was a national figure in the Spanish state, a legendary comedian.

basic_migration_one.yml

id: basic_migration_one
label: 'Custom Basic Migration 2000'
source:
  plugin: embedded_data
  data_rows:
    -
      unique_id: 1
      page_title: 'Title for migrated node - One'
      page_content: 'Lorem fistrum mamaar se calle ustée tiene musho pelo.'
    -
      unique_id: 2
      page_title: 'Title for migrated node - Two'
      page_content: 'Se calle ustée caballo blanco caballo negroorl.'
  ids:
    unique_id:
      type: integer
process:
  title: page_title
  body: page_content
destination:
  plugin: 'entity:node'
  default_bundle: page

And this will be the structure of the new custom module for basic migration example:

/project/web/modules/custom/  
    \__migration_basic_module/  
        \__migration_basic_module.info.yml  
        \__migrations/  
            \__basic_migration_one.yml  

Enabling all the required modules using Drush:

drush pm:enable -y migrate migrate_run migration_basic_module
drush cr

Or using Drupal Console:

drupal moi migrate migrate_run migration_basic_module

Second Case: Migrating from csv files

For this second case we are going to deactivate migrate_run (if applicable) and activate the superset of modules: migrate, migrate_plus and migrate_tools. Besides, for the treatment of CSV files we are going to use a source plugin provided by a contrib module called Migrate Source CSV (migrate_source_csv). In its 3.x version, this contrib module uses league/csv for processing CSV files. Ok, let's go. Using Composer + Drush:

composer require drupal/migrate_plus drupal/migrate_tools drupal/migrate_source_csv
drush pmu migrate_run # If you have it installed
drush en migrate migrate_plus migrate_tools migrate_source_csv -y
drush cr

So, now in the path /admin/modules/:

Enabling Migrate, Migrate Plus and Migrate Tools

Building the resources

We’re going to create another new custom module for our second Migration:

cd project/web/modules/custom
mkdir migration_csv_module

With a new migration_csv_module.info.yml file:

name: 'Migration CSV Module'
type: module
description: 'Just a basic example of basic migration process with a CSV source.'
package: 'Migrations Examples 2000'
core: 8.x
dependencies:
  - drupal:migrate
  - drupal:migrate_tools
  - drupal:migrate_plus

In this example we're also going to need a declarative file for the migration (as in the previous case), except that we're going to locate it in a different place: it will be placed in the /migration_csv_module/config/install/ path.

The structure will now look like this:

/project/web/modules/custom/  
    \__migration_csv_module/  
        \__migration_csv_module.info.yml
        \__csv/
            \__migration_csv_articles.csv
        \__config/
            \__install/
                \__migrate_plus.migration.article_csv_import.yml

So we need a CSV with original data to migrate. It's easy to solve this using web tools like Mockaroo, a pretty good random data generator. I've created a CSV file with fields: id, title, body, tags, image. Download it from here. This file will be our datasource for the migration process. Ok, for now create the directories for the module and put the new custom CSV in the /csv path:
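As an illustrative sketch of what the first rows of such a generated file could look like — the values below are made up, only the column names come from the description above:

```csv
id,title,body,tags,image
1,"A first sample article","Some random body text for the first article.","drupal","https://example.com/images/one.jpg"
2,"A second sample article","Some random body text for the second article.","migrations","https://example.com/images/two.jpg"
```

The first row is the header, which is why the migration definition below uses header_offset: 0.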

CSV Migrate module structure

And now, our migrate_plus.migration.article_csv_import.yml file (in later sections we will explain its construction, section by section):

uuid: 1bcec3e7-0a49-4473-87a2-6dca09b91aba
langcode: en
status: true
dependencies: {  }
id: article_csv_import
label: 'Migrating articles'
source:
  plugin: csv
  path: modules/custom/migration_csv_module/csv/migration_csv_articles.csv
  delimiter: ','
  enclosure: '"'
  header_offset: 0
  ids:
    - id
  fields:
    -
      name: id
      label: 'Unique Id'
    -
      name: title
      label: Title
    -
      name: body
      label: 'Post Body'
    -
      name: tags
      label: 'Taxonomy Tag'
    -
      name: image
      label: 'Image Field'
process:
  title: title
  body: body
  field_tags: tags
  field_image: image
  type:
    plugin: default_value
    default_value: article
destination:
  plugin: 'entity:node'

Okay, we now have all the resources we need to create our new migration. Now let’s see how we approach the process.

3- Approaches

We’re going to describe the different approaches that we will apply to our example cases, in order to understand them better.

First case: Migrating embedded data

In this first case, we set out to build the lightest possible Drupal migration: only two nodes with two basic fields each, in an embedded format.

Also, in this example, for the three ETL phases of the migration (Extract, Transform and Load), we are going to use plugins already provided by Drupal (we will not develop any custom plugin). If you don't know anything about the concept of migration plugins, please stop for a moment and read a little introduction to the topic, then come back here.
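As a quick orientation, the three ETL phases map directly onto the three main blocks of any migration definition file; this is just the skeleton of the basic migration from the previous section, annotated:

```yaml
source:            # Extract: where the data comes from (a source plugin).
  plugin: embedded_data
process:           # Transform: mapping of destination fields to source fields.
  title: page_title
destination:       # Load: where the data ends up (a destination plugin).
  plugin: 'entity:node'
```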

To keep things light, we will stick with the “lite” version of Migrate Tools, Migrate Run. Besides, we will only use the basic commands, without options or complementary parameters, just the basic argument: the migration identifier.

Second Case: Migrating from csv files

For this execution, I would like to play with something pretty interesting… since we'll be running this second migration example as configuration, I thought it would be fun to take the inverse road… Yes, I propose not to install (activate, drush enable) the new custom module for CSV, and to leave it only as storage for the CSV file.

Let's run the migration from somewhere else. Surprise: visit the path /admin/config/development/configuration/single/import in your Drupal installation and we'll meet there!

4- Migrations

First case: Migrating embedded data

Getting info about the available migrations

drush migrate:status
drush ms

Output from console:

--------------------- -------- ------- ---------- ------------- ---------------
 Migration ID          Status   Total   Imported   Unprocessed   Last Imported
--------------------- -------- ------- ---------- ------------- ---------------
 basic_migration_one   Idle     2       0          2
--------------------- -------- ------- ---------- ------------- ---------------

Running migrations

drush migrate:import basic_migration_one
drush mi basic_migration_one  

Output from console:

--------------------- -------- ------- ---------- ------------- ---------------------
 Migration ID          Status   Total   Imported   Unprocessed   Last Imported
--------------------- -------- ------- ---------- ------------- ---------------------
 basic_migration_one   Idle     2       2 (100%)   0             2020-03-17 23:19:36
--------------------- -------- ------- ---------- ------------- ---------------------

And so, going to the path /admin/content you’ll see the two new nodes:

Drupal Basic Migration Embedded Data

Rollbacking migrations (undoing)

drush migrate:rollback basic_migration_one
drush mr basic_migration_one  

Output from console: 

[notice] Rolled back 2 items - done with 'basic_migration_one'

Drupal Basic Migration Commands

Second Case: Migrating from csv files

Well, now at the path /admin/config/development/configuration/single/import we have to import our new custom migration definition file, ok?

Loading the migration config data

Just go to Import -> Single Item, select the configuration type as “Migration” and paste the content of the original migration file:

Drupal Migration load File by Config

Click the “Import” button and the new config object will be created in the Config System.

And now?

Running the migration

With the migration file under config management, you can run the process with the same tools as in the former case. We now have a new migration available that we can run from the console: drush migrate:status

Drush Migrate Status

Now you can execute the migration with drush migrate:import article_csv_import, and all the new nodes will be created. The limit? Well, tags and image will not be migrated, because tags is an entity reference and image is a file rather than a simple link, and both types require different plugins… but we'll talk about this in future posts.
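As a rough preview of what those future posts cover (a sketch, not something used in this example): migrate_plus ships a process plugin called entity_generate that can look up the referenced taxonomy term and create it if missing, so the tags mapping could look roughly like this:

```yaml
process:
  field_tags:
    plugin: entity_generate       # From migrate_plus: looks up a term, creates it if missing.
    source: tags
    entity_type: taxonomy_term
    bundle_key: vid               # The bundle field of taxonomy terms.
    bundle: tags                  # Vocabulary machine name (assumed for this sketch).
    value_key: name               # Match existing terms by name.
```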

Drush cex / Drush cim

With the migration under the config system, you can now edit, import and export the migration using the basic resources from Drush. For example, testing drush cex:

Drush Cex

As you can see, the Config System has put the new migration file directly under the management of Migrate Plus, and it has performed some actions, such as renaming the file with the migrate_plus.migration prefix and adding a new file for the group (just a way of grouping migration processes).

Remember the name of the file? It's the same one we were using in the /config/install directory, the so-called migrate_plus.migration.article_csv_import.yml. We've done exactly the same process, but from a different direction. Are you impressed? No? Do you find it interesting?

Remember also that with this config file, you can use drush cim and load the migration into any other Drupal installation (with access to the CSV datasource, of course).

Thus we have migrated some 102 new nodes using two different approaches and methodologies. Not bad.

5- Key Concepts

Migration Plugins

Ok, it's very important, so we have to sing the same song one more time… you must know the plugin format and the diverse world of existing migration plugins.

Every plugin targets a specific data type, format or source. You should know the main ones very well and also investigate those you may need, since migrations use them extensively. This is why, for example, we have not been able to migrate taxonomy terms or images in the second case, with the CSV file as datasource.

Let's look at the plugins involved in these two migrations, through their definition files:

Basic Embedded Migration

source:
  plugin: embedded_data
  data_rows:
         ...
process:
  title: page_title
  body: page_content
destination:
  plugin: 'entity:node'
  default_bundle: page

To extract data from the source we're using the Embedded Data plugin, a PHP class available at /web/core/modules/migrate/src/Plugin/migrate/source/EmbeddedDataSource.php, where in its annotation block you can see some configuration keys that you can use in your migration file:

 *
 * Available configuration keys
 * - data_rows: The source data array.
 * - ids: The unique ID field of the data.
 *

And data_rows and ids are the keys that we’re using in our migration description file. Read more about the EmbeddedDataSource class in Drupal.org API.
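So, as a sketch, extending the embedded source is just a matter of adding more entries under data_rows, each with its own unique_id (the extra row below is hypothetical, not part of the example migration):

```yaml
source:
  plugin: embedded_data
  data_rows:
    -
      unique_id: 3
      page_title: 'Title for migrated node - Three'
      page_content: 'Another embedded row, added just to illustrate the pattern.'
  ids:
    unique_id:
      type: integer
```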

Now, looking at the process block… where's the process plugin? Well, I think this is interesting… usually, every field mapping in a process block requires its own process plugin. But with some “syntactic sugar”, the Migrate API offers a way to reduce and simplify this: if no specific treatment is required for a field, a single plugin can take care of the processing. This “default” plugin may also be implicit, so that in the absence of a declaration, the Drupal Migrate API will always apply the same process plugin by default.

This "implicit" and by-default plugin is the Get class, provided as the basic solution for processing fields. You can find the Get class at /web/core/modules/migrate/src/Plugin/migrate/process/Get.php, and read more about it in the Drupal.org API. So actually, what we are saying is that it is exactly the same thing to write this:

process:
  title: page_title

as this other:

process:
  title:
    plugin: get
    source: page_title

And so life is a little simpler, isn't it? Remember: in the absence of a process plugin declaration for a field, Drupal will apply the Get plugin by default.

OK, and finally, for the destination we're using the general Entity plugin with the parameter "node", in order to create elements of type node with the bundle "page". This calls the destination plugin Entity.php, an abstract class at web/core/modules/migrate/src/Plugin/migrate/destination/Entity.php, which gets its own derivative plugin. Read more about derivative plugins in Drupal, the Entity.php destination plugin, or the derivative migration class.

CSV datasource Migration

I think the review of the plugins in this case is easier and more intuitive.

source:
  plugin: csv
  ...
process:
  ...
  type:
    plugin: default_value
    default_value: article
destination:
  plugin: 'entity:node'

For the source, the CSV plugin from the migrate_source_csv contrib module. For processing, Get is used by default, and the type field uses the Default Value plugin. For the destination, the same plugin as in the previous migration: new entities.

Migration as code or as configuration

As you can see, we have treated each migration process differently. The first (Embedded Data) was treated as part of the "code", without any further particularities.
But the second was treated as a configuration element of the system itself, placed in the config/install path, which creates a new configuration object on installation.

In both cases you write the migration definition in YAML format; the difference is where you put the migration file. But there are more differences… let's summarize the key points:

  • Migration "as code" is provided out of the box, while the Migrate Plus module allows you to treat the file as a configuration object.

  • Depending on which approach you use, the location of the files and the workflow will differ:

    • As code: in order to make changes to the migration definition you'll need access to the file system and must manage the migration file as a code file, a developer-oriented workflow.

    • As configuration: you can make changes to the migration definition using Drupal's configuration synchronization interface at /admin/config/development/configuration, in addition to the configuration export/import tools (drush cex, drush cim), because the migration is now synchronized (the migration file is saved in the database). This means that you can write, modify, and execute migrations using the user interface. Big surprise.

    • As a configuration object, your migration file creates a new configuration entry in Drupal's config system, and that entry stays alive even when your migration module is uninstalled. To avoid this and have the config deleted, declare your own custom module as a dependency of the migration in your migration description .yml file, so the migration is removed from Drupal's active config at that moment:

  dependencies:
    enforced:
      module:
        - my_own_migration_custom_module

  • Another change is that, in the config approach, your migration file needs a UUID, a global identifier for the Drupal configuration management system. Add a unique UUID as the first line of your file to facilitate the processing of the configuration. Remember: a UUID is a string of 32 hexadecimal digits in 5 groups following the pattern 8-4-4-4-12. Make your own!
    uuid: 1a23b45c-1a23-2b45-3c67-4d567890a1b2
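If you have Python at hand, here is a quick way to generate a valid random UUID for this purpose (any RFC 4122 generator, such as Drush or an online tool, works just as well):

```python
import uuid

# Generate a random (version 4) UUID in the canonical 8-4-4-4-12 form.
new_uuid = str(uuid.uuid4())
print('uuid: ' + new_uuid)
```

Paste the resulting line at the top of the migration .yml file.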

6- Resources

Download, play with, and test the different resources used along this post. I've uploaded them to Github ready to use.

  1. Basic Migration File, basic_migration_one.yml, available in Github as Gist.

  2. CSV Migration File, article_csv_import.yml, available in Github as Gist.

  3. CSV Source File with random data, Gist in Github.

  4. Codebase of the two migration modules (basic and CSV), available in Github. This will be a central repository for all the modules of this series of posts about migrations, so here is the direct link to these two examples:

  5. In parallel with this series of articles I'm also publishing a series of snippets in Gitlab under the topic "Migrations", in a more simplified, less verbose format. Here you can access the first snippet and get links to the rest of the series: Drupal Migrations Tips (I): Creating a new basic migration structure.

7- :wq!


Mar 16 2020

Category 1: Web development

Government organizations want to modernize and build web applications that make it easier for constituents to access services and information. Vendors in this category might work on improving the functionality of search.mass.gov, creating benefits calculators using React, adding new React components to the Commonwealth’s design system, making changes to existing static sites, or building interactive data stories.

Category 2: Drupal

Mass.gov, the official website of the Commonwealth of Massachusetts, is a Drupal 8 site that links hundreds of thousands of weekly visitors to key information, services, and other transactional applications. You’ll develop modules to enhance and stabilize the site; build out major new features; and iterate on content types so that content authors can more easily create innovative, constituent-centered services.

Category 3: Data architecture and engineering

State organizations need access to large amounts of data that’s been prepared and cleaned for decision-makers and analysts. You’ll take in data from web APIs and government organizations, move and transform it to meet agency requirements using technology such as Airflow and SQL, and store and manage it in PostgreSQL databases. Your work will be integral in helping agencies access and use data in their decision making.

Category 4: Data analytics

Increasingly, Commonwealth agencies are using data to inform their decisions and processes. You’ll analyze data with languages such as Python and R, visualize it for stakeholders in business intelligence tools like Tableau, and present your findings in reports for both technical and non-technical audiences. You’ll also contribute to the state’s use of web analytics to improve online applications and develop new performance metrics.

Category 5: Design, research, and content strategy

Government services can be complex, but we have a vision for making access to those services as easy as possible. Bidders for this category may work with partner agencies to envision improvements to digital services using journey mapping, user research, and design prototyping; reshape complex information architecture; help transform technical language into clear, public-facing content; and translate constituent feedback into new and improved website and service designs.

Category 6: Operations

You’ll monitor the system health for our existing digital tools to maintain uptime and minimize time-to-recovery. Your DevOps work will also create automated tests and alerts so that technical interventions can happen before issues disrupt constituents and agencies. You’ll also provide expert site reliability engineering advice for keeping sites maintainable and building new infrastructure. Examples of applications you’ll work on include Mass.gov, search.mass.gov, our analytics dashboarding platform, and our logging tool.

Reflections on my migration journey from Disqus comments

Mar 15 2020
Mar 13 2020

Drupal 9 release date has been pinned for June 3, 2020, and it's coming up super fast. What does that mean for your site?

First of all, don't panic. Drupal 7 and 8 end of life is scheduled for November 2021, so there is plenty of time to upgrade. However, it is always good to plan ahead and take advantage of the new features and security releases in the new version.

If you are on D7

Moving to Drupal 9 will be very similar to moving to Drupal 8, and in fact there is no reason to wait: the recommendation is to move to D8 as soon as possible, incorporating the tools described in the next section to search for possible incompatibilities.

Mar 13 2020

Read our roadmap to understand how this work falls into priorities set by the Drupal Association with direction and collaboration from the Board and community. You can also review the Drupal project roadmap.

Project News

Drupal 9 beta is closer than ever!

At the time of writing this post, there are fewer than three beta-blockers for Drupal 9. This hopefully means that we'll be seeing a beta release of Drupal 9 very soon. 

What does this mean for you?

Now's the time to get familiar with what's coming in Drupal 9, and to check your contributed or custom modules to see if you're ready to go. The community has put together a number of tools that you can use: the upgrade status module, the Drupal Check command line tool, and Drupal Rector.

We also need your help! We're looking for more individuals and organizations to participate in the Drupal Beta Test program. It's a great way to contribute to Drupal.

Call for Sponsors & Contributors: Automatic Updates

We're really proud of the work we accomplished in the first phase of the automatic updates initiative; in Drupal 7 and Drupal 8, sites that don't depend on Composer workflows now have complete support for securely and automatically updating Drupal Core. In the second phase of this work we want to extend that support to contributed projects, and to support Composer-based site installations. 

We need your help to make the second phase happen. Will you contribute?

Learn more on our call for sponsors & contributors post.

Drupal.org Updates

DrupalCon Minneapolis Program Update

In preparation for releasing the full DrupalCon Minneapolis speaker schedule, we've made some updates to the accepted sessions page. 

The newly redesigned page now highlights our excellent keynote speakers (including Mitchell Baker from Mozilla!) as well as other featured speakers for this year's event. On top of that, you can filter the list of sessions by track to get a jumpstart on finding your favorite sessions before the full schedule is released.

Ready to enable Semantic Versioning for Contributed Projects

We've rearchitected the version management for contributed projects, so that they can begin using Semantic Versioning as we enter the Drupal 9 era. You can see an example of this in practice on this sample project: semver_example. 

We're coordinating with the Drupal core maintainers to select a window for enabling the new semver functionality across all projects. We want to ensure that Drupal end-users will still be able to find and easily understand which projects they can use once projects are able to be compatible with both D8 and D9, and are using semver version numbering. 

Not familiar with semantic versioning?

The three-digit numbering scheme (MAJOR.MINOR.PATCH) is designed to provide guide rails around API-breaking changes. In Drupal core, for example, the patch number is incremented whenever there are bug fixes or security releases. Minor releases indicate that new features have been introduced. And the major version only changes when deprecated APIs are removed and fundamental architectural changes have been introduced. Contributed project maintainers are encouraged to adopt the same pattern.
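Applied to a hypothetical contributed project, the pattern plays out like this:

```
1.2.3 -> 1.2.4   PATCH: bug fix or security release
1.2.4 -> 1.3.0   MINOR: new, backwards-compatible features
1.3.0 -> 2.0.0   MAJOR: breaking changes, deprecated APIs removed
```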

Updated display of releases

Speaking of releases - we've recently updated the display of releases to provide a cleaner view of release metadata. This should make it much easier to understand the history of recent releases, and to see at a glance which ones were bug fixes vs. feature releases vs. security releases. 

New Release Meta Data

You can see a detailed example by looking at the release history for Drupal core.

Drupal usage stats by branch

Because of the six-month minor release cycle, it's become much more important to have more granular insight into what minor versions of Drupal are in use in the wild. 

Usage stats by branch

As you can see above, we've updated the usage stats for Drupal to display usage by branch. This is mostly useful for Drupal Core, but may be valuable for contrib maintainers as well as they look to understand which versions of their projects are in highest demand. 

Coming soon: An updated UX for project browsing

With the release of Drupal 9, it will be possible for contributed projects to be compatible with both major versions of Drupal. Perhaps more interestingly, because of the release of new features with minor versions, there are some projects that may only be compatible with a certain range of minor versions (e.g: 8.6.x - 9.2.x). 

This is a powerful improvement in ensuring that key modules are ready to use with Drupal 9 on day one, but it also has the potential to be confusing for Drupal site owners and evaluators who are trying to discover what projects they can use. We're looking to update the project browsing on Drupal.org to make sure discoverability doesn't suffer with this change. If you have good ideas about this user experience, please feel free to share them on the issue!

Drupal 9 Readiness

Packaging enhancements

Beginning with Drupal 8.8.0, Drupal needed to be packaged from the output of Composer's create-project command, rather than as the simple contents of a git clone. These changes to packaging have additional ramifications for how we manage tagged releases for Drupal core, and in particular for how we manage security releases. We've been making a variety of updates to the packaging pipeline since Drupal 8.8 to make the process more transparent, resilient, and performant, and that work continues.

DrupalCI

DrupalCI: Support for new Postgres environments

Because minimum requirements are changing with Drupal 9, we've added new test environments for both Postgres 10 and Postgres 12.

DrupalCI: Updated SQLite version

SQLite has also been updated to version 3.26 within the DrupalCI test environment, so that testing runs on a properly supported version.

DrupalCI: Support for MariaDB environments

MariaDB forked from MySQL after the acquisition by Oracle, but at first it remained fairly consistent with MySQL. With recent versions, however, MariaDB has had to diverge, so we are now providing explicit testing support for MariaDB, with test environments for versions 10.2.7 and 10.3.22.

———

As always, we’d like to say thanks to all the volunteers who work with us, and to the Drupal Association Supporters, who make it possible for us to work on these projects. In particular, we want to thank: 

If you would like to support our work as an individual or an organization, consider becoming a member of the Drupal Association.

Follow us on Twitter for regular updates: @drupal_org, @drupal_infra

9 Things To Take Into Consideration When Starting A Drupal-based Project

Mar 13 2020

Does your Drupal hosting company lack native Composer support?

Mar 12 2020

Mar 12 2020

by David Snopek on March 6, 2019 - 1:56pm

As you may know, Drupal 6 has reached End-of-Life (EOL), which means the Drupal Security Team is no longer publishing Security Advisories or working on security patches for Drupal 6 core or contrib modules - but the Drupal 6 LTS vendors are, and we're one of them!

Today, there is a Critical security release for the EU Cookie Compliance module to fix a Cross-Site Scripting (XSS) vulnerability.

The module provides a banner where you can gather consent from the user when the website stores cookies.

The module doesn't sufficiently sanitize data for some interface labels and strings shown in the cookie policy banner.

This vulnerability is mitigated by the fact that an attacker must have a role with the permission "Administer EU Cookie Compliance banner".

See the security advisory for Drupal 7 for more information.

Here you can download the Drupal 6 patch or the full release.

If you have a Drupal 6 site using the EU Cookie Compliance module, we recommend you update immediately! We have already deployed the patch for all of our Drupal 6 Long-Term Support clients. :-)

If you'd like all your Drupal 6 modules to receive security updates and have the fixes deployed the same day they're released, please check out our D6LTS plans.

Note: if you use the myDropWizard module (totally free!), you'll be alerted to these and any future security updates, and will be able to use drush to install them (even though they won't necessarily have a release on Drupal.org).

Mar 12 2020

This post was created jointly by Michael Hess of the Security Working Group, and Tim Lehnen, Executive Director of the Drupal Association.

Last year, with the security release of SA-CORE-2018-002, the most significant security vulnerability since 2014, we heard the pain of site owners and development teams around the world staying up at all hours waiting for the complex security release process to complete and the patch to drop. We heard the pain of agencies and end-user organizations required to put teams on late shifts and overtime. We heard from some users who simply couldn't respond to patch their sites on the day of release, because of lack of resources or entrenched change management policies.

We've heard calls from the community for rotating the timezones for security advisories from release to release, or for having more on-call support from security team members across the globe, or simply for a longer horizon between the release of PSA and SA.

Yet at the same time, we're cognizant that these solutions would put increased burden on a security team composed of dedicated volunteers and contributors. There are a number of generous organizations who sponsor many of the members of the security team, but relying on their altruism alone is not a sustainable long-term solution—especially if we consider expanding the role of the security team to address the larger pain points above.

Last week, with the release of SA-CORE-2019-003, we heard these concerns for site owners and the sustainability of the security team echoed again.

The Security Team and the Drupal Association have been developing solutions for this issue for well over a year.

The goals are simple:

  • Provide a new service to the Drupal community, from small site owners to enterprise-scale end users, to protect their sites in the gap from security release to the time it takes them to patch.
  • Create a new model for sustainability for the Security Team, generating funding that 1) covers the operating costs of the program, 2) can support security team operations, and 3) can support additional Drupal Association programs.

Although the execution will require care and careful partnership, we are happy to announce that we've found a solution.

We're tentatively calling this: Drupal Steward. It is a service to be provided by the Drupal Association, the Security team, and carefully vetted hosting partners.

Drupal Steward will offer sites a form of mitigation through the implementation of web application firewall rules to prevent mass exploitation of some highly critical vulnerabilities (not all highly critical vulnerabilities can be protected in this fashion, but a good many can be - this method would have worked for SA-CORE-2018-002 for example).

It will come in three versions:

  • Community version - for small sites, low-budget organizations, and non-profits, we will offer a community tier, sold directly by the DA. This will be effectively at cost.
  • Self-hosted version - for sites that are too large for the community tier but are not hosted by our vendor partners.
  • Partner version - For sites that are hosted on vetted Drupal platform providers, who have demonstrated a commitment of contribution to the project in general and the security team in particular, protection will be available directly through these partners.

Next Steps

The Drupal Association and Security Team are excited to bring this opportunity to the Drupal Community.

We believe that the program outlined above will make this additional peace of mind accessible to the broadest base of our community possible, given the inherent costs, and are hopeful that success will only continue to strengthen Drupal's reputation both for one of the most robust security teams in open source, and for innovating to find new ways to fund the efforts of open source contributors.

We will announce more details of the program over the coming weeks and months as we get it up and running.

If you are a hosting company and are interested in providing this service to your customers, please reach out to us at [email protected].

Please also join us at DrupalCon for any questions about this program.

If you are a site owner and have questions you can join us in slack #drupalsteward.

For press inquiries, please contact us at: [email protected]

Mar 11 2020

There’s nothing like the threat of a global pandemic to bring the topic of working remotely to the forefront. 

This week, in response to the rapid spread of the coronavirus disease (COVID-19), companies from all over the world are scrambling to get systems and policies in place to ensure that work can continue in the event that quarantines are imposed or decisions are made to exercise caution and curtail the threat of workplace transmission of the disease. 

Remote work options are inherent to the Promet Source culture. We’ve benefitted for years from a culturally diverse staff and the opportunity to source the best talent without bias to location. As other organizations are rapidly moving in this direction, here are five strategies that we've learned for optimizing the remote work opportunity.

1. Communicate Often and Communicate Well

Compensating for the fact that you are not engaging with co-workers in the hallways, over lunch, or during daily stand-ups requires excellent and intentional communication. In fact, don't hesitate to over-communicate with both your team and your supervisor. Assume nothing. Set clear expectations. Stay in touch, and be sure not to overlook the importance of casual conversations and humor. A productive work environment is not all work and no play, and you should get to know your remote colleagues just as you would those who work in the office or workspace next to yours.

2. Maintain Face-to-Face Connections

Promet's president, Andrew Kucharski, is insistent on the use of Zoom video conferencing for all meetings -- even ad hoc check-ins. This serves multiple purposes. Of course, it keeps us on our A game and guards against any work-at-home temptation to stay in PJs and slippers all day. More importantly, we are inherently more connected when we see each other's faces and facial expressions. We're also more prone to converse, know what's going on in each other's lives, and remain accountable to each other.

Demonstrating another big advantage of working remotely from home, Pamela Ross, Promet Source PMO Director, takes a break with her dog Lumen at her side.

3. Leverage Technology

This one is huge. The concept of telecommuting has been around for a decade or two, but more so than ever before, we have access to tools that make us wonder why we’d ever waste a potentially productive hour or so every day sitting in traffic or taking public transportation to an office. Teleconferencing, shared calendars, collaborative document authoring tools and Slack are among the multitude of tech resources that enable a global staff to thrive.

4. Acknowledge and Affirm

Remember that all of the same principles of leadership and human dynamics apply when working remotely. Who among us isn't energized by public acknowledgement, appreciation, and various forms of affirmation? At Promet Source, we encourage shout-outs on the companywide Slack channel and through software that is specifically designed to engage employees and emphasize the company culture. New team members are welcomed and introduced during a "two-truths-and-a-lie" activity on companywide video conference calls, and managers are coached to consistently acknowledge team members for their contributions.

5. Recognize Relevant Strengths

Working remotely is a privilege that requires maturity and an excellent work ethic. Not everyone fares well in an environment that lacks the structure of a traditional office. Leadership needs to hire accordingly and cultivate an environment in which the responsibilities and advantages of working remotely are emphasized and built into expectations for every role.

At Promet Source, we do a lot more than develop and design accessible websites that win awards. Our Human-Centered Design workshops help to define the dynamics and the direction for success in any organizational endeavor. Contact us today to learn more. 

I'm supporting open source: Drupal Association membership drive

Mar 11 2020

Mar 11 2020

The business world is competitive by nature. An organization’s intellectual property and the custom software that costs valuable time and money to develop is an incredibly prized possession - one that’s important to protect. That’s why the idea of procuring an open source solution (free software that can be used by anyone) can be such a foreign and challenging concept for many in the business world.

In this article, we’ll walk through some of the most common questions that clients have about procuring open source software, so that you’ll understand how this software is licensed, what you can and can't do with it, and hopefully help you make an informed decision about procuring and extending open source software services.

How does open source licensing work, exactly?

Open source software turns the traditional software licensing model on its head by allowing users to modify and freely redistribute software. Open source is defined by criteria intended to promote and protect software freedom, and support the communities which contribute to the success of open source projects.

Are there any laws that prevent someone from making changes to the software and repackaging the entire thing for sale?

Many open source projects are able to survive and thrive because there are protections in place that prevent someone from turning the project into something proprietary. Drupal and many other open source projects are covered under the GNU General Public License, or GPL, one of the most common open source licenses. The GPL prevents anyone from distributing the software that it covers without also sharing its source code. This ensures that the project covered by the GPL remains open source.

Does this mean that if I pay a contractor for custom code to be developed that adds to an existing GPL-covered open source project, I’ll be required to release that work to the public for free?

The short answer is simply “no”. If you work on or pay for someone to work on custom code that modifies a GPL-covered open source project, you won’t be required to give that work away to the community at large.

The GPL only requires that you release your source code if you plan to sell or release (“distribute”) your custom code. If you’re just planning to use the code internally, or as part of a hosted solution that you control, there’s no need to share it with the world.

But shouldn’t I share the code? Isn’t that how open source works?

Many users of open source software do decide to share their source code with the world through contributions to the open source project in question, such as a contributed module in Drupal. There is no requirement to do this, but there are some advantages.

Source code is the actual text document that a software developer creates. It's uncompiled, meaning that it's written in a programming language that can be read and edited by a human. That distinction is important here because source code is raw and can be inspected and modified. This practice helps improve both the security and the usability of the software.

By sharing your source code, other people may decide to improve on it and fix it for free, because it’s mutually beneficial. It’s much harder to staff a team of internal developers to keep your code tested, maintained, and bug-free while planning an improvement roadmap to add new features and create new software integrations. Having others work on your code means it is made better both for you and for them. Sometimes the advantage of having software that works really well and has updates and features added more quickly outweighs the advantage of keeping your innovation secret from your competitors.

Curious if open source software is right for your business? Read our post on how you can save money using open source technologies.

Mar 11 2020

We've worked on countless websites that have social media sharing functionality. You know, those little links that let you easily post to Facebook, Twitter, or some other social network?

These widgets work by requiring a developer to embed a script tag on their site. Like this:

By embedding JavaScript from a third-party source, you've allowed that provider to modify the content of your HTML page. Even setting aside concerns about what could go wrong (XSS attacks, unexpected manipulation of the page, JavaScript execution errors), just focusing on what is actually happening is reason enough to avoid most approaches to share links.

When you embed a share widget on your site, you've added tracking by that social network. Now social networks can associate each visitor’s profile with the content that is on your page. Social networks, and Facebook, in particular, use that to build an advertising profile based on your content.

A hypothetical example: if your site provided medical or self-help advice, the share widget on the page loads JavaScript from Facebook. Like many people, visitors to your site are always logged into Facebook even if they don't have it open. When the JavaScript is loaded, it knows the profile of the user (this is how it shows when you've liked or retweeted something). The JavaScript can then check the URL of the page, which Facebook can then index. Facebook can associate the content of the page with the user's profile. And finally, Facebook can now show advertisements targeting the medical condition of visitors to your page. All of that happens just from the site visitor looking at your content. It requires no interaction with the Facebook share widget; the mere act of loading the widget is enough to associate the visitor with the content of your site.

Combined Share Widgets Can Be Even Worse

An alternative to using the direct widgets provided by social networks are those created by other providers that wrap around social media links. Examples include AddThis, ShareThis, AddToAny, Shareaholic, and many others. However, this further compounds the problem. Not only are Facebook and Twitter tracking your visit, but so is the provider of the sharing widget.

For example, the privacy policy of AddThis (which is owned by Oracle) states:

Publishers provide us with AddThis Data so that we can build Segments and Profiles to facilitate personalized interest-based advertising for you by Oracle and our Oracle Marketing & Data Cloud customers and partners. By installing the AddThis Toolbar, Toolbar Users provide consent for us to use their AddThis Data for interest-based advertising.

Using a centralized share provider only introduces another aggregator and broker of people's interests. Not all services are equally bad, but be sure to read the terms of service carefully when using any of these providers. Note that in most cases, using one of these widgets will also load the SDKs for each enabled social network to count engagement such as likes, retweets, etc.

Alternatives and Suggestions

The absolute best thing an organization concerned with privacy can do is not include any share links at all. That would avoid any direct connection between your visitors and data aggregators. However, for many clients, designers, and visitors, having some share capabilities is expected. What can developers do to meet the requirements while remaining responsible stewards of user data?

The answer is pretty simple: use links. Each social network has a simple URL that you can use to prepopulate a sharing form with the URL of your content. At its simplest, these links look something like this:
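The endpoints below are the networks' public share URLs; the example.com URL is a placeholder your CMS or templates would fill in (URL-encoded) with the page's actual address:

```html
<!-- Plain share links: no SDK, no JavaScript, no tracking until the user clicks. -->
<a href="https://twitter.com/intent/tweet?url=https%3A%2F%2Fexample.com%2Fpage"
   target="_blank" rel="noopener noreferrer">Share on Twitter</a>

<a href="https://www.facebook.com/sharer/sharer.php?u=https%3A%2F%2Fexample.com%2Fpage"
   target="_blank" rel="noopener noreferrer">Share on Facebook</a>

<a href="https://www.linkedin.com/sharing/share-offsite/?url=https%3A%2F%2Fexample.com%2Fpage"
   target="_blank" rel="noopener noreferrer">Share on LinkedIn</a>
```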

No JavaScript. Just HTML.

Be sure to include a rel="noopener noreferrer" attribute so the third-party page can't manipulate your page through window.opener, and use target="_blank" to open a new window so the user doesn't immediately leave your page.

This provides a happy middle ground where sharing is still available for users, but it makes it impossible for social networks to track users who are simply visiting the page. Once a user clicks or taps on a share link, they are consenting to use that social network (and thus to be tracked and profiled).

Copy/paste services that don't include any JavaScript can help with generating these links; see the following sites for examples:

As of this writing, you can also check out the share links here on Lullabot.com, which uses a combination of these direct-share URLs with lightweight JavaScript to open in a sized new window.
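That combination can be sketched in a few lines of plain JavaScript. This is a hypothetical helper, not Lullabot's actual code; the endpoint map and function names are assumptions:

```javascript
// Hypothetical map of public share endpoints (extend as needed).
const SHARE_ENDPOINTS = {
  twitter: 'https://twitter.com/intent/tweet?url=',
  facebook: 'https://www.facebook.com/sharer/sharer.php?u=',
};

// Build a prepopulated share URL for a given network and page.
function buildShareUrl(network, pageUrl) {
  const base = SHARE_ENDPOINTS[network];
  if (!base) {
    throw new Error('Unknown network: ' + network);
  }
  return base + encodeURIComponent(pageUrl);
}

// Open the share form in a small sized window, only after an explicit click.
function openSharePopup(network, pageUrl) {
  window.open(buildShareUrl(network, pageUrl), 'share', 'width=550,height=420');
}
```

Because the share URL is built only when the user clicks, no third-party script ever loads for visitors who just read the page.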

Although privacy is starting to become a focus for the general public, many users still may not realize that their browser is logged into social networks all the time. Websites big and small then facilitate the tracking of those users by loading JavaScript from the social networks, resulting in extensive profiling based on viewed content, which is used to create targeted advertising.

Share links are often privacy trojan horses. As the builders of the web, we should take care to account for the privacy of our site visitors. So on your next project, advocate for a non-tracking solution.

From Historian to Drupal Developer - How Debug Academy Launched My New Career

Mar 10 2020

Federated Search - Release Plan Q1 2020

Mar 10 2020

Drupal Landing Page - part 1

Mar 10 2020

Landing pages are great for product presentation and customer engagement.

They are a must for today's marketing campaigns, mobile advertising, and sales development.


There is no easy way to build a simple landing page in Drupal.

You can use custom themes, or modules that manage layout such as Parade, but it is not that simple, and layout options are limited. The Parade module does the job; you can build a simple landing page without custom development, but it requires a lot of dependencies for a simple page, and you may still have to do some CSS editing.

In this article, we will explain how our landing page was built into a Drupal 8 website using a separate dedicated theme and a custom module with Twig templates.

The original page, which was a standalone one-page theme, is now fully integrated into the website.

This may not be the best method, but it can be easily replicated, and it gives more room for creativity and extra flexibility compared to a layout module or a full landing-page theme.

Part 1: the custom theme

To achieve this, we created a custom theme with only one region for content. When building a one-page theme, you usually do not want side columns or extra headers and footers.

To create your theme, you only need one file, saved under your custom theme folder in the Drupal 8 "themes" folder: myTheme.info.yml.

type: theme
base theme: false
name: 'EK'
description: 'Built to use landing page'
version: VERSION
core: '8.x'
regions:
  content: Content
  footer: Footer

This is all that is needed to create the basic theme used by our landing page. We keep a "footer" region to insert hidden blocks or content.

This theme will be activated on specific route names and will replace the default theme of the site.

You can also add a screenshot image in the theme folder if you want to enhance the admin view.


In the next part, we will explain how our custom module switches the theme for dedicated URLs and builds the landing page with a Twig template. For that step, you will need some knowledge of creating a simple module, inserting libraries, and making a Twig template.

Mar 09 2020

Since tickets are no longer required, sign up for our mailing list to get all the latest updates. Links to join sessions will be on the session pages on the day of the event.

Due to the worldwide Coronavirus/COVID-19 response and in consideration of the health of our community, we have reconsidered the implications of moving forward with business as usual for MidCamp. While there are currently no travel or gathering warnings for Chicago, we need to consider that historically, >40% of our attendees travel to the event. Additionally, current advice and historical evidence suggest that proactive steps are essential in containing the network effects of the virus.

Fortunately, we have the tools and ability to share our information remotely, so we're moving MidCamp to a 100% remote format this year. There are many details to work out, but our aim is to deliver the same quality content on the same schedule as we would have done in person.

What does this mean for attendees?

  • In-person social events will be cancelled and replaced with virtual events where possible

  • All Wednesday trainings will be cancelled.

  • Thursday & Friday sessions will be presented and attended via videoconference (details to come). We will have no in-person events at DePaul.

  • All tickets will be refunded.

    • We are still accepting individual sponsorship donations. Donations will go toward defraying cancellation costs, the costs of the virtual event, and our 2021 event.

  • If you’ve booked accommodations with the MidCamp 2020 group at Hotel Versey, call (773) 525-7010 to cancel.

Our goal is to maintain the high-quality content you've come to expect from MidCamp without adding any risk to the community we care so deeply about. We appreciate your patience and flexibility with this change of format. If you have any questions at all, feel free to contact us at [email protected], or hop into our Slack https://mid.camp/slack.

Thanks again for your support!

Seeking volunteers for the evolving structure and roles of the Drupal Community Working Group

Mar 06 2020

The mission of the Community Working Group (CWG) is to foster a friendly and welcoming community for the Drupal project and to uphold the Drupal Code of Conduct. 

- https://www.drupal.org/community/cwg 

As the Drupal Community Working Group (CWG) moves into its seventh year, we have been thinking a lot about how we can evolve it to better serve the changing needs of the Drupal Community. 

At the moment, the four members of the CWG split our time between reactive issues (conflict resolution and Code of Conduct enforcement) and proactive issues (community health resources, workshops). While the work is often emotionally taxing, it is also often extremely rewarding. We believe the proactive work has a large impact on the community, but our time is often filled with reacting to issues in the community. 

To this end, we have been working with Tara King (sparklingrobots) on identifying new CWG roles, mainly focused on proactive tasks. These new roles will not play a part in conflict resolution matters and will not receive access to any incident reports or other confidential information that has been shared with the group, with the exception of subject matter experts, who may see some limited information when brought in to consult on specific cases. These new roles are designed for individuals to help provide insight and expertise into how we can better support and grow our community.

It is important to note that all CWG members must abide by the CWG Code of Ethics, regardless of whether they consult on conflict resolution or Code of Conduct enforcement cases.

The list of new CWG roles follows below. In some cases, we have already reached out to individuals to fill some of the roles. Full details about the roles can be found on the Community Working Group's Community Health Team page.

  • Community health - develops and produces community health initiatives like workshops and tweaks to drupal.org processes.
  • Community event support - assists Drupal community events with Code of Conduct template, playbooks, and other resources. It is our hope that this role be filled by members of the newly established Drupal Event Organizers Group. 
  • Subject matter experts - includes individuals with knowledge of specific geographic, industry, and mental health areas. In some cases, subject matter experts will be provided with limited information about specific conflict resolution issues they have been brought in to assist with.
  • Ambassadors - coordinators between the CWG and other groups, including the Drupal Association, other open source projects, Drupal.org maintainers, and Diversity and Inclusion.

We are strong believers that the more proactive work we do, the stronger and healthier our community will be and the less reactive work we will have. 

If you, or someone you know, are interested in any of the new roles, please drop us a line at drupal-cwg [at] drupal.org. Include your name, drupal.org username and which role (or roles) you are interested in. 
 

Mar 06 2020

Fostering Inclusion in Tech

Working with multifaceted and diverse teams to solve complex issues is a part of everyday life at Lullabot. Therefore, becoming stronger, more empathetic communicators who foster diversity, equity, and inclusion (DEI) across the organization is something we’re striving for continuously. That said, DEI is a tough nut to crack, and we’re a work in progress. Like many organizations, we’re constantly asking ourselves, “How do we better foster a sense of inclusion and allow for different types of people, with varied abilities and skills, to work together to solve problems for the future?”

A group that has the benefit of an inclusive environment will:

  • be more agile and culturally competent, and
  • be able to work with a variety of viewpoints, carefully considered, toward building a more thoughtful, and hopefully better, end product or service.

We're learning how to improve internal communication and hold space for each other as we dive into these types of conversations. This series is a compilation of some tips we’ve collected through our continuing work, and we encourage you to share your own. 

Make a Plan to Foster Inclusion

While we consider ways to build up teammates and their sense of belonging in the job, we also want each individual's highest and best level of participation in a shared mission. Our team, like other knowledge workers, is becoming increasingly aware of the potential of technology to expand ideas, increase access to opportunities, and level the playing field.

Be Explicit

If an organization has not proactively stated a public policy, others in the marketplace may have already crafted a narrative about its mission and values without management's participation. Be explicit about where the company stands. Make clear what's important to the company, both internally and externally. We share our Mission and Core Values on our website, encourage clients and job applicants to review them, and are continuously finding ways of infusing them into our workflows and culture to ensure our team is living up to these ideals.

Start Now and Build into the Schedule

Who represents, guides, leads, and makes decisions for the company? Take a picture of the board or staff—who’s in the photo? Evaluate who’s who and identify whose voices the organization has chosen to elevate, increase, and honor.

  • Are voices missing? If so, why?
  • What opportunities exist for all staff to increase their skills and advance?
  • What opportunities exist for people traditionally excluded from work?
  • In which ways may new people participate as staff, interns, apprentices, consultants, or vendors? 
  • For any of the above, what’s the tracking mechanism? Free and low-cost tools exist to help you collect, analyze, report on, and share data to help with decision-making.
  • Is there a transparent way to demonstrate how staff may advance, either within their career track or if they'd like to switch to a different path such as management or sales?
  • Whose voice is missing? Who is not in the room? Representation matters.

Incorporate Inclusion as a Guiding Value, and Put It into Practice

While it’s great to have ideas, implementing them is some of the most difficult and demanding work. It’s best to start where you are and incrementally improve.

Consider determining three tasks to implement in the next quarter to increase the types of voices you currently have represented, and evaluate every 8-12 weeks how you’re doing, what’s working, what’s not working, and where you want to invest strategically. The Drupal Diversity and Inclusion group offers weekly accountability, support, and connections on initiatives across the Drupal community. Internally, our newly-formed Inclusion and Equity working group is discussing governance, goals, and how to move forward with our efforts toward greater inclusion.

Respect Autonomy

The best work does not always happen in an assigned cubicle, hot desk, or office. An individual might be better suited to doing focused work in the morning, taking a siesta break, then picking up again after dinner and working until midnight. Some of our teammates need to make day-by-day, unscheduled arrangements for childcare, eldercare, medical appointments, and other issues, and much unpaid labor falls on stay-at-home workers. When a new team member requests to be on call for a specific block of time, how does a company make it work? A flexible schedule (learn about my colleague Sean Lange's routine while working from home) allows our teammates to participate more equitably in the workforce and bring their best ideas to us when they're ready to do so.

Respect staff’s autonomy and ability to choose and set their hours, and encourage a culture of high expectations as well as high performance. Clearly define deliverables and the practicalities of team needs (such as deadlines for when certain projects need to launch), but allow the team to determine the best way for them to deliver, rather than forcing people to adhere to an inflexible work arrangement where they clock in and immediately tune out.

Practically speaking, workforce trends for remote work are increasing: 26 million employed persons worked at home on an average day in 2018 (Bureau of Labor Statistics), and there are increasing numbers of positions that offer telework, telecommuting, and work-from-home options. Generation Z, which is predicted to become 36% of the global workforce in 2020, is comfortable with technology to make conferences, meetings, and training sessions seamless regardless of location.  

We’re a 100% distributed company, where all workers are remote, and there is no centralized office. As everyone works across multiple time zones, we’ve continually experimented with practices that foster community and connection. Read more about being a distributed company. Based on client desires and existing systems, we use Zoom, Uberconference, Google Meet, Webex, and Slack, among others, for conferences.

Support Mental Health and Wellness

The bulk of staff time is spent in the workplace or doing work-related activities. Burnout, stress, depression, anxiety, and mental health disorders directly impact teammates, colleagues, and clients. The American Institute of Stress survey shows 40% of workers reporting that their job is “very or extremely stressful” and 80% of workers report feeling stress on the job (with almost half saying they “need help in learning how to manage stress”). Lost productivity resulting from depression and anxiety is estimated to cost the global economy US$ 1 trillion each year, according to the World Health Organization.

Over the last year, we have made mental health a priority: learn more about Supporting Mental Health at Lullabot. Three best practices to support reducing stress and increasing mental health at the workplace include:

1) Avoid overscheduling your team: offer flexible work arrangements. For example, we work a 30-hour billable week and have 10 additional hours for contributing back to the company and community.

2) Create an open and relaxed work environment with access to management and ongoing feedback loops. For example, we use the Know Your Team tool (podcast) to organize constructive one-on-ones. Other activities include scheduled coffee talks for drop-in support and advice, a weekly team call, small group calls on Fridays, a monthly town hall with leadership, multiple Slack channels for conversations ranging from #being-human to #cats to #parenting, and a back-end "Daily Report" for internal news and reports.

3) Increase education. For example, offer access to mental health and general health and wellness topics, and provide training and development opportunities. With continuing education and professional skill-building, teams have documentable ways to increase productivity and overall experience.

Mental health awareness means implementing actions small and large across the organization that include:

  • Scheduling breaks in long meetings.
  • Obtaining psychotherapy, counseling, grief support, and similar add-ons to the health package.
  • Scheduling monthly or quarterly gatherings to discuss or practice mental health wellness.
  • Offering paid time off.

Invest in Personal and Professional Development, Training, and Education

“What happens if we train them, and they leave? What happens if we don’t, and they stay?”

While salary and benefits remain the base of any job, investment in an employee’s unique talents also pays off. Consider investing in professional development, training, certification, and other educational hours through an annual allocation or a pooled budget for staff-directed or individually-planned training. As part of our benefits package, we each receive an education and event budget annually. This may take the form of an education budget used for conferences, seminars, training, and continuing education. Determine a process for staff to propose options and receive feedback or vetting, perhaps as part of their employee review process, as they build up a multi-year plan to improve their abilities. 

Required To-Do List, Start Here

Make software, website, and digital products accessible

In web development, making digital properties as accessible as possible is both required and best practice (check the a11y project). Presentations by Helena McCabe, Technical Account Manager at Lullabot, give excellent tips on how to enable accessibility. By starting with an emphasis on accessibility for the digital property, additional issues around inclusion, culture, the role of technology, and overall trends in society may begin to surface. Our white paper on How Accessibility Protects your Business and your Bottom Line offers examples of how to make your products accessible and why it matters.

Practice transparency

Think of transparency as a way to build the team’s muscles, and to start working with fortitude, grace, and strength when grappling with heavier and more complex issues. For example, we practice open-book management (OBM), a financial practice that allows all employees to understand the current revenue, expenditures, and KPIs of the company (learn more at CEO Matt Westgate’s 2019 Lullabot Team Retreat post). By creating the flexibility and capacity to have tough discussions, everyone may use a shared language and understanding of the company’s direction.

Promote a sense of psychological safety

The open-source movement continues to build on information freely shared, vetted, and evaluated across multiple use cases. The belief in sharing knowledge is in the DNA of companies like ours. In our case, Drupal is the community and platform on which many of the staff have built their careers.

Psychological safety is the bedrock for knowledge-sharing: it’s "a condition in which you feel (1) included, (2) safe to learn, (3) safe to contribute, and (4) safe to challenge the status quo—all without fear of being embarrassed, marginalized or punished in some way." (Timothy R Clark, 2019). And, it’s something to which Lullabot aspires: here’s a link to Matt Westgate’s lightning talk on psychological safety and DevOps.

With accessibility, transparency, and safety in mind, we'll share more tips to begin or advance discussions around: hiring inclusively, engaging with staff, focusing on culture, and easier fixes. As we continue to work on this internally, we offer these ideas in the spirit of sharing and continuous improvement. Do you have ideas? Please drop a comment. We'd love to hear your thoughts and suggestions.

Mar 05 2020

Matt and Mike talk to two organizers of Drupal4Gov, as well as the project manager for Lullabot's Georgia.gov replatform about all things Drupal in the government.

Top Drupal blog posts from February 2020

Mar 05 2020
Mar 04 2020

The process of auditing a website for ADA accessibility compliance, as described by the W3C® Website Accessibility Conformance Evaluation Methodology (WCAG-EM) 1.0, falls into two essential categories: automated and manual. The automated part of the process is relatively straightforward: it's simply a matter of leveraging the right tools for diagnosing noncompliance with WCAG 2.1. The manual part, though indispensable, tends to be open-ended and undefined.

The WCAG-EM is the definitive guide to the evaluation process.

[WCAG-EM documentation]... is intended for people who are experienced in evaluating accessibility using WCAG 2.0 and its supporting resources. It provides guidance on good practice in defining the evaluation scope, exploring the target website, selecting representative samples from websites where it is not feasible to evaluate all content, auditing the selected samples, and reporting the evaluation findings. It is primarily designed for evaluating existing websites, for example, to learn about them and to monitor their level of accessibility. It can also be useful during earlier design and development stages of websites. 

The process is very well defined, and the W3C® provides a sample reporting tool for audits.
 
We at Promet Source like to tinker and apply small changes, without deviating from the process, to improve the report and make it easier to use and read. Our version of the Website Accessibility Evaluation Report Generator is Google Doc based, provides an executive summary and a simple dashboard of results, and is FREE! This template from Promet reflects WCAG 2.1 guidelines and is designed for use by accessibility analysts and developers to detect errors missed by automated testing.

As a part of our commitment to advancing inclusivity and web experiences that are accessible to a full range of differently abled web users, we are making this tool available to all.
 

Small preview of the tool

 

Why Manual Accessibility Testing Matters

Manual accessibility testing goes deeper and wider than automated scans. It includes:

  • Keyboard testing
  • Color contrast testing
  • Screen reader testing
  • Link testing
  • Tables and forms testing
  • Cognitive testing
  • Mobile testing

The types of noncompliance issues detected by manual testing tend to have the greatest likelihood of exposing site owners to ADA accessibility lawsuits.
 
Keep in mind that diagnosing a website for accessibility and fixing any noncompliance issues that are uncovered is not a one-size-fits-all endeavor. 

Every site has a distinct set of strengths and challenges, and in the current environment, the stakes are high for getting it right. That’s why we at Promet Source believe that tapping the expertise of a Certified Professional in Accessibility Core Competencies (CPACC) is the most efficient and effective path for bringing a site into compliance. We follow a distinct WCAG 2.1 auditing and remediation process that consists of a series of value-added steps in a specific order.  

Circle process graphic depicting web accessibility testing

 

Value-Added Elements

There is a high degree of added value that occurs during and following an accessibility audit. The education, consultation, and opportunity to dig deep and deconstruct aspects of a site that no longer serve the organizational mission fuels a better and wiser team of developers and content editors. For a number of reasons, web accessibility also enhances SEO.
 
In the current climate, websites are highly dynamic and serve as the primary point of engagement for customers and constituents. Constantly evolving sites call for an ongoing focus on accessibility, and an acknowledgement that staff turnover can erode the education, expertise, and commitment to accessibility that is in place at the conclusion of an audit. 
 
For this reason, a bi-annual or annual audit is a highly recommended best practice. 

Interested in kicking off a conversation about auditing your site for accessibility? Contact us today.
 

Mar 04 2020

Sending a Drupal Site into Retirement

The previous article in this series explained how to send a Drupal site into retirement using HTTrack, one of the solutions for maintaining a Drupal site that isn't updated very often. While that solution works pretty well for any version of Drupal, another option is using the Static Generator module to generate a static site instead. However, this module only works for Drupal 7, as it requires the installation of some modules on the site and uses Drupal itself to generate the results.

The Static Generator module relies on the XML sitemap module to create a manifest. The links in the XML sitemap serve as the list of pages that should be transformed into static pages. After generating the initial static pages, the Cache Expiration module keeps track of changed pages to be regenerated to keep the static site current. This combination of Static Generator, XML sitemap, and Cache Expiration is a good solution when the desire is to regenerate the static site again in the future, after making periodic updates.

There are many module dependencies, so quite a list of modules must be downloaded and installed. Once everything is installed, the high-level process is:

  • Create and configure the XML sitemap and confirm it contains the right list of pages.
  • Configure Cache Expiration to use Static Generator and expire the right caches when content changes.
  • Go to /admin/config/system/static and queue all static items for regeneration.
  • Click a Publish button to generate the static site.

Install Static Generator

The modules are downloaded and enabled using Drush. Enabling additional modules, like xmlsitemap_taxonomy, may be needed depending on the makeup of the site.

drush dl static expire xmlsitemap

drush en static_file static_views static_xmlsitemap static_node static

drush en expire

drush en xmlsitemap_menu xmlsitemap_node xmlsitemap

Configure XML sitemap

On /admin/config/search/xmlsitemap, make sure the site map is accurately generated and represents all pages that should appear in the static site. Click on the link to the sitemap to see what it contains.

  • Add all content types whose paths should be public.
  • Add menus and navigation needed to allow users to get to the appropriate parts of the site.
  • Make sure Views pages are available in the map.

A lot of custom XML sitemap paths may be required for dynamic views pages. If so, generate XML sitemap links in code by querying the database for all values that might exist as a path argument, then creating a custom link for each path variation.

Code to add custom XML sitemap links looks like this (this is Drupal 7 code):



/**
 * Add a views path to xmlsitemap.
 *
 * @param string $path
 *   The path to add.
 * @param float $priority
 *   The decimal priority of this link, defaults to 0.5.
 */
function MYMODULE_add_xmlsitemap_link($path, $priority = '0.5') {
  drupal_load('module', 'xmlsitemap');

  // Create a unique namespace for these links.
  $namespace = 'MYMODULE';
  $path = drupal_get_normal_path($path, LANGUAGE_NONE);

  // See if link already exists.
  $current = db_query("SELECT id FROM {xmlsitemap} WHERE type = :namespace AND loc = :loc", array(
    ':namespace' => $namespace,
    ':loc' => $path,
  ))->fetchField();
  if ($current) {
    return;
  }

  // Find the highest existing id for this namespace.
  $id = db_query("SELECT max(id) FROM {xmlsitemap} WHERE type = :namespace", array(
    ':namespace' => $namespace,
  ))->fetchField();

  // Create a new xmlsitemap link.
  $link = array(
    'type' => $namespace,
    'id' => (int) $id + 1,
    'loc' => $path,
    'priority' => $priority,
    'changefreq' => '86400', // 1 day = 24 h * 60 m * 60 s
    'language' => LANGUAGE_NONE
  );

  xmlsitemap_link_save($link);
}
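Building on that helper, the per-path registration might be wired up like this. This is a hypothetical sketch; the vocabulary ID and the 'topics/' path pattern are placeholders you would adapt to your site:

```php
/**
 * Register a custom XML sitemap link for each term used as a views argument.
 *
 * Hypothetical Drupal 7 sketch; adjust the vocabulary ID and path pattern
 * to match the dynamic views pages on your site.
 */
function MYMODULE_register_views_sitemap_links() {
  // Query every term that can appear as a path argument (vid 1 is assumed).
  $tids = db_query("SELECT tid FROM {taxonomy_term_data} WHERE vid = :vid", array(
    ':vid' => 1,
  ))->fetchCol();

  foreach ($tids as $tid) {
    // One custom link per path variation of the dynamic views page.
    MYMODULE_add_xmlsitemap_link('topics/' . $tid, '0.5');
  }
}
```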

Configure Cache Expiration

On admin/config/system/expire, set up cache expiration options. Make sure all the right caches expire when content is added, edited, or deleted. For instance, the home page should expire any time a node is added, changed, or deleted, since those changes alter the view of the latest content that appears there.

Generate the Static Site

Once configured, a Publish Site button appears on every page as a shortcut. But the first time through, it’s better to visit /admin/config/system/static to configure static site options and generate the static site. During the initial setup, some pages are created automatically and others are not. Once all the other modules are configured and the XML sitemap looks right, clear all the links and regenerate the static site.

The location where the static site is created can be controlled; the default is the path /static/normal in the same repository as the original site. That location and other settings are configured on the Settings tab.

Generate the static site and ensure all the pages are accounted for and work correctly. This is an iterative process, due to the discovery of links missing from the XML sitemap and elsewhere. Cycle through updating the sitemap and regenerating the static site as many times as necessary.

The process of generating the static site runs in batches. It might also run only on cron, depending on which options are chosen in settings. Uncheck the cron option when generating the initial static site, and later use cron just to pick up changes; otherwise, you must run cron multiple times to generate the initial collection of static pages.

For a 3,500 page site, it takes about seven minutes to generate the static pages. Later updates should be faster since only changed pages would have to be regenerated.

Later changes need to be reflected in the XML sitemap before Static Generator will pick them up. If the XML sitemap updates on cron, run cron first to update the sitemap, then update the static pages.

After generating the static site and sending it to GitHub, it was clear that the Static Generator module transforms a page like /about into the static file /about.html, then depends on an included .htaccess file that uses mod_rewrite to redirect requests to the right place. But GitHub Pages won’t process mod_rewrite rules. That makes Static Generator a poor solution for a site hosted on GitHub Pages, although it should work fine on hosts where mod_rewrite is available.
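The kind of rule involved looks roughly like this. This is a hypothetical sketch of the common extension-rewriting pattern, not the exact .htaccess shipped with Static Generator:

```apache
# Serve /about from /about.html when the bare path doesn't exist on disk.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.*)$ $1.html [L]
```

A host that ignores .htaccess (like GitHub Pages) never applies these rules, so internal links to extensionless paths 404.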

Comparing HTTrack and Static Generator Options

Here’s a comparison of a couple of methods explored when creating a static site: 

  • HTTrack will work on any version of Drupal; Static Generator, only on Drupal 7.
  • HTTrack doesn’t require setup beyond the standard preparation needed for any static solution. Static Generator took some time to configure, especially since XML sitemap and Cache Expiration weren’t already installed and configured.
  • HTTrack can take quite a while to run, a half-hour to an hour, possibly longer. Static Generator is much faster—seven minutes for the initial pass over the whole site.
  • The Static Generator solution makes the most sense if there is a need to keep updating the site and regenerating the static pages. That situation justifies the up-front work required to configure it. HTTrack is easier to set up for a one-and-done situation.
  • The file pattern of /about/about.html created by our custom HTTrack arguments works fine for managing internal links on GitHub Pages. The file pattern of /about.html created by Static Generator will not correctly manage internal links on GitHub Pages; it will only work on a host that has mod_rewrite enabled and the appropriate rules configured in .htaccess.

Either HTTrack with GitHub Pages or the Static Generator module can be an excellent solution. To view an example of a site generated with HTTrack, go to https://everbloom.us.
