Jan 23 2020

Drupal’s massive and passionate community has long been the envy of the open-source world. The many thousands of active contributors to Drupal are its strength, and one of the key reasons Drupal continues to be a force while others have stumbled.

With the release of Drupal 9 rapidly approaching, the support of the community is more important than ever. Here’s your chance to make the first release of Drupal 9 even better!

The annual Drupal Global Contribution Weekend, when Drupal user groups all over the world volunteer development time and make open-source contributions, takes place this weekend.

Anyone can take part in this initiative either from the comfort of their own home or at a contribution event near them. It is a great way to meet new people, learn about Drupal, and be part of something bigger. 

Check this Drupal map to see if there is a group meeting near you.

If you are planning to attend Southwestern Ontario’s contribution Meetup event in London, Canada on January 25th (there is still time to RSVP), read on for some helpful pre-event recommendations.

Drupal 8 license plate

Pre-Attendance Checklist

Here are five things you can do ahead of time so that you are prepared for a quick start - especially if you have not contributed to Drupal core before: 

  1. Create a user account on Drupal.org if you do not have one
  2. Install Git on your laptop* (Yes, you will need a laptop)
  3. Make sure your laptop has a local development environment* (more on this later)
  4. Set up Drupal*
  5. Have some knowledge of how to create and apply patches using git*

Do as many of these steps beforehand as you can. If you get stuck on any of them, someone will be able to help you on the day.

*Not mandatory if you plan to contribute to documentation or marketing issues on Drupal.org, instead of code.

Resources For Beginners

Something else we recommend beginners do ahead of time is get familiar with the following resources:

  • Search the Drupal 8 Issue Queue. A list of issues tagged as ‘novice’ can be found here.  
  • Read guidelines on how to make a good issue.
  • Chat with the Drupal community using Slack and speak to core maintainers directly in the #contribute channel.
  • Join the #global-contribution-weekend Slack channel
  • Read Echidna’s How to Contribute to Drupal blog series 

Local Development Environment

To have a successful contribution weekend you will need a solid local development environment and the latest development branch of Drupal installed. You can use any local environment you are comfortable with; in my opinion, Pantheon, XAMPP, or Acquia Dev Desktop will have what you need.

  • A Pantheon sandbox is easy to set up. You can work with it in either SFTP or Git mode if you don't want to set up a server on your computer.
  • XAMPP is a simple, lightweight Apache distribution that works well to create a local web server for testing and deployment purposes.
  • Acquia Dev Desktop comes with almost all the tools you would ever need - Apache, PHP, MySQL, and Drush - and it installs Drupal in one click so you can get going faster.

At the Southwestern Ontario event, we will have a limited number of older machines set up and ready, specifically for guests to use for writing and testing patches.

Social

Finally, let people know you participated by posting on social media with the hashtags #ContributionWeekend and #Drupal, and tag the Drupal user group that organized the event you attended. For Southwestern Ontario's event, use @LondonDUG and the hashtag #WRDUG.

Digital Echidna is a proud sponsor of the Southwestern Ontario Global Drupal Contribution Event 2020. 

--

 

Sep 10 2019

Just like the poem says, “little drops of water make the mighty ocean,” all contributions matter to the growth of the global Drupal community.

A diverse community results in great things. To ensure the longevity of Drupal digital experiences and the adoption of this open-source technology, Drupal itself must be ready for a global audience. 

Contributors are Drupal's most valuable asset, and are the sole force behind improvements to the platform and the community itself. There are so many ways to contribute to Drupal. One way non-native English speakers like me can contribute is simply by volunteering time to translate Drupal’s user interface text.

Why does translation matter?

Drupal, by default, assumes modules and themes are written in English. This assumption of English as a default language creates a common ground and standard for sharing with the Drupal community. 

Modules and themes must be translated from English into other languages. To translate Drupal is to translate, from English, the pieces of text (or “strings,” in programming terminology) that are visible in buttons, menus, field captions, messages, and so on.

At present, Drupal has 100 languages with about 115 translation groups. According to the Drupal 8 translation status, only Ukrainian, French, and German are considered 100 per cent translated (all 9,353 strings).

Malayalam is one of the languages in which I am fluent; it is spoken by 36 million people in Kerala, a southern state in India. Malayalam has incomplete translations of the text in core: parts of the interface still show up in English, while other parts need corrections and improvements.

When I started contributing to this particular translation project, it was immediately noticed and embraced by others in the online community. First I was a Translation Self Moderator and Translation Content Moderator, and then I was made Translation Community Manager.

Translating Drupal means opening doors for talented developers everywhere to embrace the Drupal open-source platform. 

Whenever I do translation I feel like I’m solving a puzzle. When I get a chance to contribute to my mother tongue and home community, it is always a happy and prideful moment. I feel connected to a place very far away from where I live now, in Canada. 

I often think of Julia Carney's immortal lines (from her poem, Little Things), "Little drops of water,/Little grains of sand,/Make the mighty ocean/And the pleasant land./So the little minutes,/Humble though they be,/Make the mighty ages/Of eternity". Meaning, if things are done well and effectively on a regular basis, even if it’s only for a short while each day, it adds up to something substantial.

Interested? You too can join the Language Team from the Drupal Translation page and help out in the language of your choice by suggesting translations, which are later approved by team members with the Content Moderator role. Strings can have multiple suggestions at a time, and even translated strings can receive further suggestions to help fine-tune translations. There is also something in it beyond the greater good or the feeling of a job well done -- contributors get issue credits on drupal.org.

I want to thank both Steve Bayer (SteveBayerIN) and colleague M Parker (mparker17), for helping me get started.  

ഈ ലേഖനം വായിച്ചതിന് നന്ദി! (Thank you for reading this article!) :-)

Read other blogs in this series, How To Contribute to Drupal

--

Did you enjoy this article? Get more just like it by signing up to receive Digital Echidna’s free e-newsletter, delivered to your inbox every month. Subscribe today.

May 24 2019

Time is always of the essence. From a consumer perspective, you want to know when events take place, when something’s open or closed, how long a meeting or activity will last. And, from a development perspective, you want to be able to create a date field that’s intuitive for the users, but doesn’t involve a lot of custom work.

Drupal 8’s default date functionality, while it works, is cumbersome. So I recently developed a module called Smart Date that will make things a lot easier for site developers -- and provide the functionality that editors want.

I initially identified the need back when I was working on a client site that required me to enter some sample content. We’ve all used calendar software -- whether it’s a Google or Outlook calendar, or even what you have on your phone.

In this instance, with Drupal, I needed to enter the year, month, and date. I also had to enter a start hour, start minute, and define whether it was AM or PM. And then do it all over again for the end time, meaning 12 fields to fill out for each instance. As users, we have an expectation and assumption that entries will autopopulate end times -- but you know what they say about assumptions.

I wanted Drupal to have that same ease of use. To achieve that, I made a date widget that adds a concept of duration. Like your calendar application, it can assume a default duration, so as soon as you enter the start, it can populate the end for you.

As a site builder you can choose this default duration, so if you want it to be one hour (like Google and Apple’s calendars) it can do that out of the box. If you’re building a recruiting site that should default to 15-minute intervals, that’s up to you.

You can even restrict what intervals are available. In the default setup, the editor has the convenience of choosing from a default duration or making a custom range if they need something that isn’t listed. But suppose you’re organizing a conference where the sessions will all be 45 or 90 minutes in length. As a site builder, Smart Date allows you to enforce those constraints.
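
A minimal sketch of that idea (this is just an illustration, not Smart Date’s actual code; the function name and values are made up):

<?php

/**
 * Illustration only: fill in an end time from a start time, a default
 * duration, and a site builder's list of allowed durations (in minutes).
 */
function example_default_end($start, $default_minutes, array $allowed_minutes) {
  // Fall back to the first allowed value if the default is not permitted.
  if (!in_array($default_minutes, $allowed_minutes, TRUE)) {
    $default_minutes = reset($allowed_minutes);
  }
  return $start + $default_minutes * 60;
}

// A conference where sessions can only be 45 or 90 minutes long.
$end = example_default_end(strtotime('2019-05-24 10:00'), 45, array(45, 90));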

Another request we’ve had from clients is the ability to designate events as “all day”. Again, it’s something we’ve all become used to in our calendar applications, and a perfectly valid use case for how we need to register events on the sites we build. But up until now, we’ve had to build a custom solution, again and again. Smart Date gives your editors this ease of use with the click of a button, unless the needs of your solution dictate that, as a site builder, you take that option away (which you can).

Another request we get again and again – and have had to build custom – is to make the display of time and date ranges more intelligent. For example, if you were formatting a date range by hand and the start and the end were on the same date, you wouldn’t write the date on both; you’d only write it once. Smart Date has this kind of intelligence built in. It’s a little more complex to set up, but hopefully the presets will work for a wide range of use cases, and in time we’d like to see translations available on localize.drupal.org so a wide variety of cultures can get their preferred date and time formats available on install.
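
As a rough illustration of that logic (again, not the module’s real formatter, just the idea), a same-day range can drop the repeated date:

<?php

/**
 * Illustration only: print the date once when the start and end timestamps
 * fall on the same day.
 */
function example_format_range($start, $end) {
  if (date('Y-m-d', $start) === date('Y-m-d', $end)) {
    // Same day: "May 24, 2019 10:00am - 11:30am".
    return date('M j, Y g:ia', $start) . ' - ' . date('g:ia', $end);
  }
  // Different days: repeat the full date on both sides.
  return date('M j, Y g:ia', $start) . ' - ' . date('M j, Y g:ia', $end);
}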

One last major aspect of Smart Date is performance. At Digital Echidna, we know that the speed of a site is a critical component of the overall experience, not to mention SEO. We test our sites at various points during development to ensure they’ll meet web visitors’ appetite for a site that not only looks great and is easy to use, but loads quickly so they can get done what they need and go back to surfing for funny cat videos.

In a recent mid-development site performance audit, I realized that the slowest page identified was an events archive, even though it held almost no content. When I looked at the query Drupal had constructed based on the view configuration, I realized the core date fields were storing values as strings, and at query time every row had to convert multiple values into date formats in order to make the required comparisons. I’ve since spoken to a number of other developers within the Drupal community who have had to build workarounds so that dates and times stored in Drupal can be accessed in a way that meets web visitors’ ever-increasing expectations for fast page loads.

MySQL has its own DATETIME column type designed for storing and querying this kind of data quickly, but the Drupal core team chose not to use it: a solution that depends on it wouldn’t be portable to other database engines, which is understandable. For Smart Date, I chose to store the values as timestamps, which have some limitations in their ability to represent the far future or distant past, but more than meet the need for what we see as the typical use case: storing upcoming or recent events.

The beauty of this approach is that we could use the functionality built into Drupal 8 for dates and times (handling of date and time widgets, validation, and so on) and its built-in capabilities for storing and retrieving timestamps (which are still used for node creation and revision dates, for example), and only write the code necessary to translate between the two.
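
Here is a minimal sketch of that translation layer, assuming start and end values stored as plain integer timestamps (an illustration of the approach, not Smart Date’s actual code):

<?php

use Drupal\Core\Datetime\DrupalDateTime;

// From widget input (a date object, validated by core) to storage.
$start = new DrupalDateTime('2019-05-24 10:00:00');
$start_timestamp = $start->getTimestamp();

// From storage back to a date object for formatting and display.
$loaded = DrupalDateTime::createFromTimestamp($start_timestamp);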

It’s a testament to the object-oriented infrastructure of Drupal 8 that we could build this solution using different parts of what’s already in Drupal 8 core, and focus our efforts on adding code where it really adds value.

The module has already started to get feature requests, so I expect we’ll see its capabilities continue to grow, as other developers submit patches to “scratch their own itch”. That pooling of community effort is another key strength for Drupal, as we can all benefit from the work we do individually to make things better.

We hope that Smart Date will make Drupal sites better solutions: for site builders, for the editors that use them, and for the visitors who come to consume the content. All by making it easy for Drupal to work like the calendar application we’ve all become accustomed to using, from technology giants like Google, Apple, and Microsoft.

At Digital Echidna, we’re committed to improving that experience across the board with our products, so that when someone who is not as familiar with Drupal is entering content, it’s going to be a positive, intuitive, and enjoyable experience. When we discover a new, better, or more intuitive way to do things, we add that to our baseline development toolbox to ensure that all of our customers can benefit from that improved experience.

We often take for granted how much back-end work goes into creating what appears to be a simple bit of online functionality. But these things take time -- and with Smart Date, I’m hoping that this module will provide a better experience that lets people spend less time developing and frees up more time to focus on creating innovative, customer-focused solutions.

Mar 25 2019

You can make the most elegant, relevance-based site search appliance possible -- but, still, sometimes you’re going to want to ‘game’ the system.

Manipulating site search results sounds nefarious, but really it’s all about providing the most relevant results to the end users. A well-defined site search doesn’t require manual tweaks to provide relevant results -- especially when it’s searching against site content that’s properly tagged, uses appropriate semantic markup, and is SEO friendly. However, sometimes it is necessary to tweak the results to provide something of particular, timely relevance.

A few weeks ago I developed a new Drupal module, called Search Overrides, that can help.

Why and When to Use Search Overrides

Maybe there’s a new page or document with exactly the information users want, but it hasn’t yet risen up the rankings. Maybe there are casual alternatives (think: slang) for terminology that an organization can’t use internally, but for which it still wants its content to rank. Maybe all the results that show up are good and relevant, but there’s one in particular we know should be on top because it’s the page people are calling a company’s helpline to find.

We do a tremendous amount of work on Apache Solr and this is one of the top requests we receive from clients -- the ability to tweak the search results from time to time. Based on our experience with a broad range of technologies, we know that there is similar functionality in search products like Cludo and Swiftype, so we felt that we needed to fill that gap in Solr.

The new Drupal module Search Overrides provides a method for site administrators with the necessary permissions to manually override the results returned by Search API Solr. In short, one can select which nodes will be placed at the top of results for specific search terms, and exclude nodes that they don’t want to have appear in the site search results.
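
To make that behaviour concrete, here is a hypothetical sketch of the idea -- it is not the module’s actual API, just an illustration of pinning and excluding node IDs for a given search term:

<?php

/**
 * Illustration only: apply per-term overrides to a list of result node IDs.
 */
function example_apply_overrides(array $result_nids, $term, array $overrides) {
  if (!isset($overrides[$term])) {
    return $result_nids;
  }
  $pinned = isset($overrides[$term]['pin']) ? $overrides[$term]['pin'] : array();
  $excluded = isset($overrides[$term]['exclude']) ? $overrides[$term]['exclude'] : array();
  // Drop excluded nodes, and pinned nodes that are already in the results.
  $remaining = array_diff($result_nids, $excluded, $pinned);
  // Pinned nodes go first, in the order the administrator configured.
  return array_merge($pinned, array_values($remaining));
}

// Pin node 42 to the top of searches for "policy" and hide node 7.
$overrides = array('policy' => array('pin' => array(42), 'exclude' => array(7)));
$results = example_apply_overrides(array(7, 13, 42, 99), 'policy', $overrides);
// $results is now array(42, 13, 99).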

This gives administrators tremendous flexibility to ensure search results align with user needs. Obviously, an algorithm-based search will provide relevant results, but sometimes it’s valuable to tailor specific items to your key audiences.

So what’s next? Refining and improving this module. After all, that’s one of the great things about the Drupal open-source community. We can create something that solves a need, but through constant iteration, testing, and getting the input of an exponentially larger group of Drupal contributors adding to this module, we can continue to make it better, more functional, and easier to use.

We’ve already got some updates in mind that we’re going to work on. We’re looking at how to integrate role-specific overrides. So, for example, if a client of yours wants its staff to see a policy manual when they search “policy” but have the general public return a privacy policy, then the results could be tailored to return results based on roles. And right now the module only works with nodes, but ideally, this could extend to all items that are in the index, including media entities.

DrupalCon 2019

If you have any questions about contributing, or if you want to contribute a patch to this module, it is greatly appreciated. And feel free to reach out to me online or meet me in person at DrupalCon, where I will be leading a few of the Builder session tracks (Story City: Case Study and Making Drupal Fast: A Surgical Approach) and spending some time at our company booth, #708.

Feb 01 2019

Since the beginning, our goal has been to balance technical expertise with creative flair when building and designing websites, applications, and digital platforms to deliver real solutions for clients. We continue to stand as leaders in our industry who understand the importance of open-source technology and the global community that supports it. Our success hasn’t gone unrecognized. Specifically, we’ve been identified as one of Clutch’s top web developers in Canada in their 2019 annual report.

As a B2B ratings and reviews firm, Clutch.co is committed to helping firms solve their business problems by identifying providers who will be able to deliver quality solutions. Clutch gathers direct feedback from our clients. Rankings on their site are based on factors such as the quality of a participating firm’s work and thought leadership, but most prominently, client reviews. Our team is continually encouraged by the positive feedback we’ve received from clients through Clutch, and we’re grateful to have an independent, third-party partner to reinforce our values of transparency and trust. Our profile has our reviews in full, but here are some of the comments that made this achievement possible:

"They were dependable, flexible, and met all of our expectations." – Executive Director, Communications & Marketing, SUNY Morrisville College

“They delivered on all of their promises.” – Marketing Manager, Sifton Properties

“They were fantastic leaders in guiding us through the project from beginning to end.” – Information Officer, The Agency for Cooperative Housing

On Clutch’s sister site, The Manifest, we’re featured as a leader among the top web development companies. The Manifest provides industry reports to help businesses narrow their search for a provider that will match their problems with solutions. Additionally, our work is featured on Clutch’s design-focused site, Visual Objects. We’re displayed prominently as a leading company in the web development category there as well, and appreciate the opportunity to showcase our visually stunning work.

Our top priority will always be to provide businesses with essential services to optimize their digital footprint. We’re very grateful for our clients’ support, as well as the distinctive honor Clutch has bestowed on our agency. We will continue to strive for this standard of excellence in the new year.

Jan 10 2019

Drupal. It’s been the foundation of our solutions for a few years now and it powers some of the top sites around the world in fields ranging from commerce to government. If you’ve ever been interested in getting your feet wet with the CMS, or expanding on your existing knowledge of Drupal, we have a great opportunity for you.

The weekend of Jan. 25-27 is Drupal Global Contribution Weekend, formerly known as “Global Sprint Weekend.” Echidna has long been a supporter of this event. This year is no different, as we’ll be hosting the contribution event for London and area, on Saturday, Jan. 26, 2019 at Digital Echidna HQ -- and we’d love to see you here.

Whether you’re an experienced developer or just starting out, there are going to be opportunities for you. In fact, you don’t even have to have development experience -- there are many ways you can contribute. 

There will also be snacks and beverages on hand! Register on meetup.

The event will be held as follows:

Saturday, Jan. 26, 10 a.m. to 5 p.m. at 148 York St.

You can come for the whole day or drop by for an hour or two. Come by and say hi, if you want. And for students who may be interested, this weekend is a great opportunity for you to come in and meet some people in the community.

One of the greatest strengths of Drupal is its community. As an open-source technology, Drupal relies on the creativity, dedication, and talent of its contributors. You can make a difference and no matter what your passion, you can find a home for it within Drupal.

The Global Contribution Weekend is a great, low-barrier-of-entry way for you to get involved. And we’re happy to help support a community that has meant so much.

Register today.

I hope to see you there.

Oct 02 2018

Take a look at the bottom right corner of this blog post. See it? That Echidna-red “speech bubble”? Go ahead… click on it. I’ll wait!

That’s right. A direct link to me. And legitimately me, not just a team of “me”s monitoring the account.

Of course, if you come back to this blog in six months, or six years, that may change. The chat feature may be there, or it may not be (which will make this intro extremely confusing -- so let me extend potential apologies to future readers).

A couple of weeks ago we decided to test out a chatbot feature on our Echidna.ca properties. It’s been an interesting process and we’re evaluating its effectiveness. But I wanted to take a moment to talk about the process and the importance of not letting perfect get in the way of better.

Sometimes we overthink things that can add value to our experience for our customers. It’s natural to want the perfect solution that meets everyone’s needs to magically pop out of the box. But that’s not the way life works. We try, we iterate, we refine, and we try again.

Even with the digital solutions we build for our clients, we ensure that they have tremendous control and flexibility. We don’t want them to be locked into a specific solution for years because they can’t afford to change -- we embrace open-source technologies and ethos to ensure they can take ownership of their content. After all, markets change, customer needs change, and the environment around us changes -- so we want to ensure we can change with it.

That’s why we’re testing the chatbot. Is it perfect? Honestly, we’ve had mixed reactions to it internally. Some like it, some don’t, and others think it needs work. 

But there’s a difference between just taking every idea, throwing it against the wall, and seeing what sticks, and actually taking a measured approach to innovation and giving yourself the greatest opportunity to succeed.

Goals

Before you start any sort of project, obviously you want to establish goals. Clearly, you’ve identified a need, so it’s important to ensure that the goals align to fulfilling that need -- just the same as you would with any other project.

In our case, we were getting great readership of our blogs and excellent traffic on our site. We list our email, phone number, and all our social media channels on the site. But there was still an opportunity for people to interact with us directly in a more private forum. That’s where the chatbot started.

Expectation Management and Learning

When you use a chatbot, what do you expect? An immediate response, right? It became vitally important for us to quickly manage those expectations.

Realistically, I can’t be at my desk all the time, watching for chatbot interactions. And any unexpected delay can create a negative experience for the user.

Even if I’m at my desk, I can’t always dedicate time to it. After all, if I’m on a client call, I want to give that person my full attention -- it would be disrespectful to that relationship if I’ve divided my attention to a chatbot request that popped up.

So eventually, it might not be me that answers you. In the future we might have a team that’s dedicated to the task, who shares the responsibility. Already we’ve made some changes to the messaging, so that when I’m not available, the message switches to encourage people to share their thoughts. 

But that comes with a commitment to return those messages ASAP. And that takes resourcing.

It’s not enough to have a good idea; you have to make sure you can effectively maintain that resource and provide the superior service you’re promising. Otherwise, it’s not going to work.

Presentation

How should it look? Should the chat be on every page? Should it only appear on key pages? Should it be a giant takeover or should it be more subtle? Should it immediately appear on the site as soon as someone accesses a page or should it wait for a few seconds? Five, 10, 30? What’s enough time? How is it affecting our mobile experience? Is it creating accessibility challenges and how do we overcome those so that everyone has an equitable experience?

Are you surprised to learn that we have already changed the interface a few times? Don’t be. These are all decisions that can be iteratively tested and evaluated -- and then we repeat the process all over again.

How Do We Know When We’ve Got it Right? We Ask!

And, like any good user experience exercise, we’ll know we’ve got it right when our customers are excited about the experience. So how do we know that? Well, we ask them.

Formally or informally, reaching out to our users to solicit feedback is key and that’s how we’re going to make the best solution for the people that matter most -- our end users.

We know perfect is never going to happen. So when you let perfect get in the way of better, you’re effectively paralyzing yourself. But a well-thought-out plan, with clear goals and anticipated resourcing needs and customer expectations, can get you 80 per cent of the way to where you want to go.

While you need to take that first step and adjust as you march along the path, it’s important to do your research, planning, and long-term strategic thinking first to ensure you’re not starting in the wrong direction.

Sep 18 2018

It’s happening. Drupal 8 will be end-of-life by November 2021. So what does that mean? Well, whether you’re talking about a vehicle or a website, ensuring that routine maintenance is regularly performed can contribute to a smooth path. And when it comes to making sure the job is done the best way, it pays to call in the experts.

When my car needs work, I don’t do it myself, I call a mechanic. The same should be the case for your website -- which is why Digital Echidna’s service level agreements are so valuable.

And with the recent announcement that Drupal 7 and 8 will reach end-of-life in 2021, that type of support can go a long way toward ensuring a smooth transition to Drupal 9.

Last week, Drupal’s founder and lead developer, Dries Buytaert, posted a blog discussing the arrival of Drupal 9 and what the future holds for Drupal 7 and Drupal 8. I encourage you to review the blog post when you get time.

“But wait!” you may be saying. “We just built a Drupal 8 site -- what’s with this new Drupal 9???” Don’t worry! The transition from Drupal 8 to Drupal 9 is going to be a natural progression and doesn’t mean a wholesale change. In fact, I think it’s important to highlight this one line from Dries’ post:

“By keeping your Drupal 8 sites up to date, you should be well prepared for Drupal 9,” he said.

And that’s where our EchidnaCare packages can be such a great value to you. Prioritized security updates and core patching are included at all levels of EchidnaCare. Our Onyx level offers even greater value, as the work to install those updates is included in the base price.

Three years offers ample time to ensure that all of those ducks are in a row and ready for the transition. Many of our clients are already in the midst of this transitional timeline for their web solutions. And while the whole process of transitioning to Drupal 9 should be fairly smooth, having that regular maintenance as part of the process now can help ensure that you’ve got a head start along the smoothest path possible.

If you’re interested in learning more about our EchidnaCare packages, please contact Andrew. And if you have any questions about our commitment to Customer Support, feel free to post them in the comments section below!

Sep 13 2018

We’re number (two-hundred-and-forty) one! 

Earlier today, Canadian Business and Maclean’s released their 30th annual Growth 500 list, and we’re proud to announce that we’ve made the list for the second consecutive year. Digital Echidna ranks as No. 241, with five-year revenue growth of 285 per cent.

First, the details. Then, the more important part -- the acknowledgements.

The Growth 500 list measures five-year revenue growth. For this edition, our growth from 2012 to 2017 was measured, and it represents an incredible time in Echidna’s history. In 2012 we were just settling into the second floor of the Burridge Block, we had just signed our first major hospital account, and we were ramping up operations from a team of 15.

By the next year, we had doubled in size, jumped on the space vacated by Cello (remember that restaurant Londoners?), and were continuing to build our team both in terms of depth and breadth -- living our commitment to provide our clients with a comprehensive team from account management to creative and content to project management and development.

Over that five-year period, revenue grew -- 285 per cent -- as did our size (now close to 70 and growing) but, equally as impressive, our scope and influence grew. We received accolades for our commitment to accessibility (including receiving the prestigious provincial David C. Onley award for leadership), we were recognized as industry leaders in Drupal development, and our belief in supporting our community through sharing our time, talents, and resources was acknowledged with corporate social responsibility awards both from TechAlliance and the London Chamber of Commerce.

We’ve grown -- and continued to grow. We expanded onto yet another floor -- the fourth -- of the Burridge Block (a parade of Echidnas surrounding Laurie Lashbrook and her team!) and we increased our presence across North America, becoming a go-to development firm for enterprise-level solutions for clients across the board, but especially in healthcare, education, and not-for-profit/government industries.

In 2018 we vacated all floors of the Burridge Block building, moving on to a new spot at 148 York Street.

But -- and here come the acknowledgements -- all this growth is thanks to the dedication, support, and commitment of the people who believe in us and have supported our efforts.

Internally, our growth has been built upon a foundation of excellence that’s come about from alignment to our core values: humble confidence; dedication and reliability; being a team player; being engaged and passionate in our efforts; and always learning and improving. As we charted a path to success, we knew we had to have many hands all rowing in the same direction -- and we’re proud of the team we’ve developed (and the support of those who have moved on to other opportunities).

Externally, we have to thank our clients. Our most recent launches, Michael Garron Hospital and the Michael Garron Hospital Foundation, may be a long way from the sole-proprietorship early days working with amazingly supportive people like Larry Kinlin and Steve Glickman, but, at the core, we approach every project trying to do our best for our clients and ensure that their needs and goals are met.

Our tools, tactics, and processes may have changed, but one thing that hasn’t changed -- from the early days to where we’re now working on enterprise-level, multi-user, multidimensional projects -- is our commitment to the end user. And that’s helped us get where we are.

And, of course, I’d be remiss to not thank the amazing community of supporters we’ve been lucky enough to work with -- and continue to work with -- in London itself. We’ve developed wonderful relationships with colleagues, mentors -- and even competitors. Our partnerships with institutions like Fanshawe College and Western University aren’t just about short-term needs, but rather are shared efforts to grow and improve our city as a destination for tech talent. Our CSR efforts focused on children, culture, and the environment are all designed with an eye towards making our community more livable and sustainable for generations to come. And, of course, our commitment to the downtown core is part of our efforts to help revitalize this incredible city -- a city that’s our home.

Thank you again to everyone who has believed in us and supported us over the years. We’re proud of this recognition and of again being one of several London-area companies on this list. We look forward to continuing to reward the faith that has been placed in us -- by our families, by our community, and by our clients.

Dec 19 2017

Bullying is a major problem in our schools, workplaces, homes, and over the Internet.

Over the next 14 months, Digital Echidna is supporting a series of mental health awareness campaigns, in particular those with an emphasis on inclusion and diversity online and those that aim to combat cyberbullying.

Today, we start by bringing attention to Clarke Road Secondary School in London. Clarke Road is hosting its annual health and wellness day, focusing on mental health. For Me Day is a day dedicated to self care and reflection. Inspirational speaker Andy Thibodeau will kick off the morning, speaking to what it takes to support a mentally healthy community. Next, the students will participate in a full day of activities designed for just this purpose, from sports to art to drumming. It was inspired by, and is a natural extension of, Clarke Road’s “sea of pink” day, where students wear pink to bring awareness to combat bullying.

Of course there are many other system and school-based proactive initiatives that Thames Valley District School Board takes to address bullying and cyberbullying. As an example, just two weeks ago the hashtags #TVDSBMediaMoment and #ThinkingThursday were launched on Twitter, providing teachers with prompts to work with their classes around this very same conversation.

There are also several schools that recognize Sea of Pink Day and International Pink Day (some schools celebrate in February, while others recognize it in April).

This abundance of special awareness days led me to think about this issue within our own Drupal community. There is a dedication within Drupal to preserve the things that got us here: namely, keeping Drupal a fun, welcoming, challenging, and fair place. The Drupal Code of Conduct (DCOC) states our shared ideals with respect to conduct. It is an expression of our ideals and is a way to communicate our existing values to the entire community.

This code of conduct is based on the one developed by Ubuntu, with the addition of the Conflict Resolution Policy developed by the Drupal community. How great it is to have a code of conduct both at work and within the technology we use.

So today, as one local high school acts, we will wear our own pink socks in tribute and will reflect upon the following questions posed by another great local resource, the London Anti-bullying Coalition:

  • Have you supported the young people in your life to learn how to resolve a conflict in a respectful and mutually beneficial way?
  • How can you support the young people in your life to safely use technology?
  • Are you modelling good social skills for the young people in your life?

We’d love to read your thoughts. Please share any answers you may have to the questions above in the comments section. Or if you have any other thoughts on the topic of bullying, we’d love to hear them too.

An image of a bunch of Echidnas looking at a collection of pink socks.
An image of someone wearing pink Digital Echidna socks.

Oct 27 2017

With the arrival of Drupal 8, several contributed modules -- which were previously add-ons for Drupal 7 -- received renewed attention and were integrated into Drupal 8 core. Those useful technologies no longer simply existed in contrib space, but were seen as essential components of the latest iteration of Drupal. Views and the Date module were integrated, as was Migrate.

And while Migrate is a well-developed framework, there are still situations that arise during development that require some customization -- which is my focus for this post.

The Migrate framework is now in core, and it makes it possible for hundreds of sites built in Drupal 6 or 7 to be migrated to Drupal 8. It’s an amazing ecosystem of modules that is growing to become one of the major core components. Migrate is robust and, despite some critical issues, is ready to be used for most migration needs. If you haven’t checked it out yet, please do so, as it gives you a solid foundation for migration tasks in Drupal.

In today’s blog post, I’ll dive into the architecture of custom large-scale data imports, explain why one would use them instead of Migrate, and show how to build a scalable and robust import solution from scratch.

Factors in choosing a custom import over the Migrate framework

  • Your import needs to be lightweight and recurring, and able to be processed outside of cron or through CI;
  • Your import may combine multiple external sources while only needing to save the data once;
  • You want to have flexibility and control over what actions are performed with the content: unpublishing, deleting, updating just one field;
  • You want it to be scalable and editable with minimal code rewrites or new plugins; and
  • You need precise control over which parts of the data need to be updated, and when.

Where to start

Gather your requirements by answering these questions:

  • Is it a periodic or one-time import?
  • Will it be manual or completely automated?
  • Will it be a cron job or continuous integration?

And don’t forget to think about the architecture: the content type architecture and the PHP class hierarchy.

You’ll also want to determine your import sources: from what environment, and in what format, will you be importing data into Drupal?

What are the options?

You have a variety of options for import triggers, depending on your needs. The simplest one is a manual import, triggered by the user through the Drupal UI.

If you need your import to run in the background, you may use a cron job to trigger and run it. However, keep in mind that some hosting providers limit how long a cron job can run. This may result in only partial completion of the import during each cron run, and multiple cron runs may be needed to complete the import in full. If you need timely updates of information, this may not be the desired option.
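
As a rough sketch of that approach (the module name and the 30-second budget are assumptions, and the queue names match the queue workers defined later in this post), a hook_cron() implementation can work through the queues a little at a time:

<?php

/**
 * Implements hook_cron().
 *
 * Processes the import queues within a small time budget, so a short cron
 * window results in partial progress rather than a timeout.
 */
function example_import_cron() {
  $queue_factory = \Drupal::service('queue');
  $queue_manager = \Drupal::service('plugin.manager.queue_worker');
  $end = time() + 30;

  foreach (array('import_get_manual', 'import_save_manual') as $queue_name) {
    $queue = $queue_factory->get($queue_name, TRUE);
    $worker = $queue_manager->createInstance($queue_name);
    while (time() < $end && ($item = $queue->claimItem())) {
      $worker->processItem($item->data);
      $queue->deleteItem($item);
    }
  }
}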

To run a completely automated import without the need to use cron, you may consider using Continuous Integration. There are many good options available. I recommend Jenkins as it’s easy to use and install, and it has a very user-friendly UI.

Analyze the sources

Before you start any import you need to run a discovery of which sources the data will be coming from, where the source files are stored, and how you will access them.

The simplest solution is when the source files live inside a Drupal installation. They are either manually uploaded to the public files folder through FTP or automatically dropped into the Drupal install by a third-party script.

The more complicated scenario happens when source files are outside of Drupal: an external XML or JSON feed, for example. Sometimes a connection to an external database is also required.

While analyzing the sources, consider the format of the files or feeds, the credentials, and any drivers required to connect to the external sources.

Make sure you are planning for the parsing resources. Do you need any external libraries to connect over SFTP or to parse complicated XML? Check that required PHP extensions like cURL are installed on your server and that you have all of the needed access rights and permissions. Establishing a good foundation and checking these factors before you even start coding will lead to fewer problems and less debugging during development.

Architecture

After you have completed the foundation, you are ready to build on top of it. Everything starts with structure. In terms of an import, that means the structure of your content types and fields, and the structure of your source files.

Site building should be done and all content types should be well defined before you start coding. Think not only about separate content types but also about the content types’ “ecosystem.” Relationships between content types and taxonomies are very important, as imported data may need to be migrated into multiple content types or have taxonomy terms attached. The relationships should be dictated by the desired functionality and taken into consideration when building the import.

When it comes to the source files’ architecture, it’s important to keep in mind that structure and encoding of the source files should be set in stone. The structure of the source files defines mapping to the Drupal fields and cannot vary without breaking the import functionality.

Although there are many libraries that let you detect and change the encoding of the data, they are not always 100 per cent perfect. And it’s important to remember that PHP works best with UTF-8 encoding.

Import in a nutshell

An import consists of three simple operations: get data, parse data, save data. These three operations are handled by three queues: the storage queue, which stores data dumps; the getter queue, which parses data dumps and saves them as separate records; and the setter queue, which saves records into Drupal entities. This approach allows the process to run in parallel for multiple records, while each individual record is always processed sequentially.

To begin an import we need to create a trigger and a set of batch operations. We will discuss how to use a manual trigger, so we start with a configuration form. Create a custom module and, in its src/Form directory, create a form class.

<?php
/**
 * @file
 * Contains \Drupal\example_import\Form\ConfigImportForm.
 */

namespace Drupal\example_import\Form;

use Drupal\Core\Form\ConfigFormBase;
use Drupal\Core\Form\FormStateInterface;
use Drupal\Core\Queue\QueueFactory;
use Drupal\Core\Queue\QueueWorkerManagerInterface;
use Symfony\Component\DependencyInjection\ContainerInterface;


class ConfigImportForm extends ConfigFormBase {

  /**
   * @var QueueFactory
   */
  protected $queueFactory;

  /**
   * @var QueueWorkerManagerInterface
   */
  protected $queueManager;

  /**
   * {@inheritdoc}
   */
  public function __construct(QueueFactory $queue, QueueWorkerManagerInterface $queue_manager) {
    $this->queueFactory = $queue;
    $this->queueManager = $queue_manager;
  }

  /**
   * {@inheritdoc}
   */
  public static function create(ContainerInterface $container) {
    return new static(
      $container->get('queue'),
      $container->get('plugin.manager.queue_worker')
    );
  }

  /**
   * {@inheritdoc}
   */
  protected function getEditableConfigNames() {
    return ['example_import.settings'];
  }

  /**
   * {@inheritdoc}.
   */
  public function getFormId() {
    return 'example_import_form';
  }

  /**
   * {@inheritdoc}.
   */
  public function buildForm(array $form, FormStateInterface $form_state) {
    $config = $this->config('example_import.settings');
    $form['help'] = array(
      '#type' => 'markup',
      '#markup' => $this->t('If you want to trigger import manually, please press "Trigger Import" button'),
    );
    $run_overnight = $config->get('run_overnight');
    if (!isset($run_overnight)) {
      $run_overnight = 1;
    }
    $form['run_overnight'] = array(
      '#type' => 'checkbox',
      '#title' => $this->t('Run import over night?'),
      '#default_value' => $run_overnight,
    );
    $form['actions']['#type'] = 'actions';
    $form['actions']['run_import'] = array(
      '#type' => 'submit',
      '#value' => $this->t('Trigger Import'),
      '#button_type' => 'primary',
    );

    return parent::buildForm($form, $form_state);
  }

  /**
   * {@inheritdoc}
   */
  public function submitForm(array &$form, FormStateInterface $form_state) {
    // Save the configuration.
    $this->config('example_import.settings')
      ->set('run_overnight', $form_state->getValue('run_overnight'))
      ->save();
    $operation = (string) $form_state->getValue('op');
    if ($operation === 'Trigger Import') {
      // trigger manual import
      $this->ImportDataQueuePopulate();
    }
    parent::submitForm($form, $form_state);
  }
}

We can populate the needed queues with the following helper function:

<?php

  /**
   * Helper function to populate the queues with data from the CSV source files.
   */
  protected function ImportDataQueuePopulate() {
    // Get the getter queue instance and empty it.
    $queue_manual = $this->queueFactory->get('import_get_manual', TRUE);
    $queue_manual->deleteQueue();
    // Get the setter queue instance and empty it.
    $queue_save_manual = $this->queueFactory->get('import_save_manual', TRUE);
    $queue_save_manual->deleteQueue();
    $operations = array();
    // Queue a data dump for each source file.
    $operations[] = array('\Drupal\example_import\Form\ConfigImportForm::getBatchOperation', array('queueCreateItem', array($queue_manual, 'File1.csv', 'data1')));
    // ... repeat for each of your source files.
    // Process the getter queue.
    $operations[] = array('\Drupal\example_import\Form\ConfigImportForm::queueProcessItem', array($queue_manual, 'import_get_manual'));
    // Process the setter queue.
    $operations[] = array('\Drupal\example_import\Form\ConfigImportForm::queueProcessItem', array($queue_save_manual, 'import_save_manual'));
    $batch = array(
      'title' => $this->t('Import'),
      'operations' => $operations,
      'label' => $this->t('Import'),
      'finished' => NULL,
    );
    batch_set($batch);
  }

And the helper methods that are used to get the batch operation and to create queue items are as follows:

<?php
  /**
   * Helper function to run a batch operation callback.
   *
   * @param string $callback_name
   *   Name of the callback function that needs to be called.
   * @param array $arguments
   *   Array of arguments that need to be passed to the callback function.
   * @param array $context
   *   The batch context array.
   */
  public static function getBatchOperation($callback_name, $arguments, &$context) {
    switch ($callback_name) {
      case 'queueCreateItem':
        self::$callback_name($arguments[0], $arguments[1], $arguments[2]);
      break;
      case 'queueProcessItem':
        self::$callback_name($arguments[0], $arguments[1], $context);
      break;
    }

  }


  /**
   * Helper function to create a queue item.
   *
   * @param \Drupal\Core\Queue\QueueInterface $queue
   *   The queue the item should be added to.
   * @param string $file_name
   *   Name of the source file.
   * @param string $key
   *   Key to identify this part of the import.
   */
  protected static function queueCreateItem($queue, $file_name, $key) {
    $item = new \stdClass();
    $data = self::getFileContents($file_name);
    $item->content = array('key' => $key, 'info' => $data);
    $queue->createItem($item);
  }


  /**
   * Helper function to fetch the contents of a file over cURL.
   *
   * @param string $file
   *   The file name.
   *
   * @return string|false
   *   The response body, or FALSE on failure.
   */
  protected static function getFileContents($file) {
    $username = "username";
    $password = "password";
    $uri = 'public://import/' . $file;
    $url = file_create_url($uri);
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_USERPWD, "$username:$password");
    curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_BASIC);
    // Return the data instead of printing it to the browser.
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_URL, $url);
    $data = curl_exec($ch);
    if (!curl_errno($ch)) {
      $info = curl_getinfo($ch, CURLINFO_HTTP_CODE);
      if ($info != 200) {
        $data = FALSE;
      }
    }
    else {
      $data = FALSE;
    }
    curl_close($ch);
    return $data;
  }

}

Define the queue workers inside the src/Plugin/QueueWorker folder:

<?php

/**
 * @file
 * Contains \Drupal\example_import\Plugin\QueueWorker\ImportGetManual.
 */

namespace Drupal\example_import\Plugin\QueueWorker;

/**
 * Provides base cURL functionality for the CSV source files.
 *
 * @QueueWorker(
 *   id = "import_get_manual",
 *   title = @Translation("Import: get CSV data"),
 * )
 */
class ImportGetManual extends ImportGetBase {}

Then process the items. The end result of each processing operation should be to populate the next queue in order: the dump queue populates the getter queue, and the getter queue in turn populates the setter queue. The processing function will vary based on your source and your field structure.
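
One possible shape for the getter worker's base class -- a sketch that assumes the queue items hold the key and raw CSV blob created by queueCreateItem() above, and that a simple str_getcsv() parse is enough:

<?php

namespace Drupal\example_import\Plugin\QueueWorker;

use Drupal\Core\Queue\QueueWorkerBase;

/**
 * Parses a CSV data dump and pushes one item per record to the setter queue.
 */
abstract class ImportGetBase extends QueueWorkerBase {

  /**
   * {@inheritdoc}
   */
  public function processItem($item) {
    // $item->content was built in queueCreateItem(): a key plus a CSV blob.
    $rows = array_map('str_getcsv', explode("\n", trim($item->content['info'])));
    $header = array_shift($rows);

    // Populate the setter queue with one record per CSV row.
    $queue_save = \Drupal::queue('import_save_manual', TRUE);
    foreach ($rows as $row) {
      $record = new \stdClass();
      $record->content = array(
        'key' => $item->content['key'],
        'record' => array_combine($header, $row),
      );
      $queue_save->createItem($record);
    }
  }

}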

Finally, the setter queue saves the information into Drupal.
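
As a hypothetical illustration (the content type and field names are assumptions, and the record structure matches the getter sketch above), the setter worker could save each record as a node:

<?php

namespace Drupal\example_import\Plugin\QueueWorker;

use Drupal\Core\Queue\QueueWorkerBase;
use Drupal\node\Entity\Node;

/**
 * Saves one parsed record as a Drupal node.
 *
 * @QueueWorker(
 *   id = "import_save_manual",
 *   title = @Translation("Import: save records"),
 * )
 */
class ImportSaveManual extends QueueWorkerBase {

  /**
   * {@inheritdoc}
   */
  public function processItem($item) {
    $record = $item->content['record'];
    Node::create(array(
      // Hypothetical content type and field mapping.
      'type' => 'article',
      'title' => $record['title'],
      'body' => array('value' => $record['body'], 'format' => 'basic_html'),
    ))->save();
  }

}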

Following these three simple steps and the discovery process, you can build your own large-scale imports that are customized to your needs. Happy importing!

Sep 12 2017
Digital Echidna is a certified Thunder Integrator -- one of only a few in the world -- offering tailored CMS solutions to the publishing community.
