Mar 15 2019

With the announcement that the Google Search Appliance had reached end-of-life, many universities started looking for replacement options. At Palantir, we wanted to provide an open source option that could meet the following needs:

  • A simple way to store, retrieve, and parse content.
  • A cross-platform search application.
  • A speedy, usable, responsive front-end.
  • A flexible, extensible, reusable model.
  • A drop-in replacement for deprecated Google products.

Working with the University of Michigan, we architected and developed a solution. Join Ken Rickard to learn more about Federated Search and to see a live demo.

  • Date: Saturday, February 16
  • Time: 11:00am to 11:45am
  • Location: Room 179 

Update: Video of this session is now available on Drupal.tv

Mar 15 2019

Migrations are fraught with unexpected discoveries and issues. Fighting memory exhaustion during particularly long or processing-heavy migrations should not be yet another obstacle to overcome; unfortunately, it often is.

For instance, I recently ran into the errors below no matter how high I raised the PHP memory_limit or how low I set the --limit flag value.

$ drush migrate:import example_migration --limit=1000

Fatal error: Allowed memory size of 4244635648 bytes exhausted (tried to allocate 45056 bytes) in web/core/lib/Drupal/Core/Database/Statement.php on line 59

Fatal error: Allowed memory size of 4244635648 bytes exhausted (tried to allocate 65536 bytes) in vendor/symfony/http-kernel/Controller/ArgumentResolver/DefaultValueResolver.php on line 23

It should be noted that the --limit flag, while extremely useful, does not reduce the number of rows loaded into memory. It simply limits the number of destination records created. The source data query has no LIMIT statement, and the processRow(Row $row) method in the source plugin class is still called for every row.

Batch Size

This is where query batch size functionality comes in. This functionality is located in \Drupal\migrate\Plugin\migrate\source\SqlBase and allows for the source data query to be performed in batches, effectively using SQL LIMIT statements.

This can be controlled in the source plugin class via the batchSize property.

/**
 * {@inheritdoc}
 */
protected $batchSize = 1000;

Alternatively, it can be set in the migration yml file with the batch_size property under the source definition.

source:
  batch_size: 10
  plugin: example_migration_source
  key: example_source_db

I could find very few references to this setting in the existing online documentation; I eventually discovered it via a passing reference in a Drupal.org issue queue discussion.

Once I knew what I was looking for, I went searching for how this worked and discovered several other valuable options in the migration SqlBase class.

\Drupal\migrate\Plugin\migrate\source\SqlBase

/**
 * Sources whose data may be fetched via a database connection.
 *
 * Available configuration keys:
 * - database_state_key: (optional) Name of the state key which contains an
 *   array with database connection information.
 * - key: (optional) The database key name. Defaults to 'migrate'.
 * - target: (optional) The database target name. Defaults to 'default'.
 * - batch_size: (optional) Number of records to fetch from the database during
 *   each batch. If omitted, all records are fetched in a single query.
 * - ignore_map: (optional) Source data is joined to the map table by default to
 *   improve migration performance. If set to TRUE, the map table will not be
 *   joined. Using expressions in the query may result in column aliases in the
 *   JOIN clause which would be invalid SQL. If you run into this, set
 *   ignore_map to TRUE.
 *
 * For other optional configuration keys inherited from the parent class, refer
 * to \Drupal\migrate\Plugin\migrate\source\SourcePluginBase.
 * …
 */
abstract class SqlBase extends SourcePluginBase implements ContainerFactoryPluginInterface, RequirementsInterface {

Migration Limit

Despite the “flaws” of the --limit flag, it still offers a valuable tool in our effort to mitigate migration memory issues and increase migration speed. My anecdotal evidence from timing the output of the --feedback flag shows a much higher migration throughput for the initial items, with speed gradually tapering off as the migration progresses.

I also encountered an issue where the migration memory reclamation process eventually failed and the migration ground to a halt. I was not alone: MediaCurrent found and documented the same issue in their post Memory Management with Migrations in Drupal 8.

Memory usage is 2.57 GB (85% of limit 3.02 GB), reclaiming memory. [warning]
Memory usage is now 2.57 GB (85% of limit 3.02 GB), not enough reclaimed, starting new batch [warning]
Processed 1007 items (1007 created, 0 updated, 0 failed, 0 ignored) - done with 'nodes_articles'

The migration would then cease importing items as if it had finished, while there were still several hundred thousand nodes left to import. Running the import again would produce the same result.

I adapted the approach MediaCurrent showed in their post to work with Drush 9. It solved the memory issue, improved migration throughput, and provided a standardized way to trigger migrations upon deployment or during testing.

The crux of the solution is to repeatedly call drush migrate:import in a loop with a low --limit value to keep the average item processing time lower.
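A rough sketch of that loop follows. This is not the exact script from the gist: the migration name, batch size, and the "0 created" completion check are illustrative assumptions.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the batching loop: re-run the import with a low
# --limit so each drush process stays small, until a pass creates no items.
run_in_batches() {
  local import_cmd="$1"
  local out
  while true; do
    out=$($import_cmd 2>&1)
    echo "$out"
    # Each completed run reports its totals; stop once nothing new is created.
    if echo "$out" | grep -q "(0 created"; then
      break
    fi
  done
}

# Example invocation (migration name and limit are placeholders):
# run_in_batches "drush migrate:import example_migration --limit=500 --feedback=100"
```

In practice you would also want a safeguard such as a maximum iteration count, since a migration that repeatedly fails on the same rows would otherwise loop forever.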

Our updated version of the script is available in a gist.

So the next time you are tasked with an overwhelming migration challenge, you no longer need to worry about memory issues. Now you can stick to focusing on tracking down the source data, processing and mapping it, and all of the other challenges migrations tend to surface.

Mar 15 2019
Mar 15

A lot of effort goes into engaging your visitors to ‘Sign-up’ or ‘Contact’ you. You send them a warm and fuzzy invitation to complete the form, tell them all the great reasons why they should complete the form… but who likes to complete a form?  Guarantee a smooth sign-up process and increase the completion rate of your webforms with these six tips. 

#1 Make it Flow

Before you begin designing that web form, it is always good to create a User Flowchart. By mapping the form completion process from start to finish, a flowchart will help you:

  • Determine what information is needed (and when)
  • Decide what actions and interactions are appropriate
  • Determine the order of actions
  • Make considerations for new patterns to aid the completion process

A User Flowchart can begin with a simple Flow Outline, which can then be placed in a flowchart diagram and later illustrated using low fidelity paper prototypes to find the most natural set of actions. When creating the outline consider the following:

The Business Objective

  • What is the main objective of the website for achieving successful completion of the form? (e.g., we want to gather as many email addresses as possible.)
  • What information is required from the person completing the form? (e.g., we need their name and email, and since our site is only for adults we also need their birth date.)

The User Persona

  • Take advantage of the information gained from the User Personas to focus on the user’s various needs and considerations. What problem do they want to solve and how can this form help them?

  • What devices do they access most frequently to complete webforms? It’s good to know in advance if most of the users complete forms on their mobile phones and/or if they have inferior internet connectivity.

The Entry Point

When designing a User Flowchart, keep in mind the point of entry. Perhaps they arrive after clicking the Call to Action on the homepage. Often webforms are part of an email or social media campaign, or the user arrives at the form after an organic search. Users should be treated differently based on where they come from, and may need extra context reiterating the goal of the form to help them get oriented when they arrive. Consider all possibilities.

#2 Keep it Short and Sweet

Don’t ask for information that’s not needed. Your subscription or contact form — or any form that gathers basic information — should only ask for the bare necessities needed to accomplish that goal. People will usually stick around long enough to enter their name, email address and password. Anything more than that should be the absolute minimum amount of information needed, with further data obtained in follow-up communications or by implementing a multi-step form (see tip #3). No one enjoys completing a form, so keep it as simple as possible! Neil Patel found that three form fields is the optimal number. Pegasystems, a Mediacurrent client, leveraged third-party integrations on their Drupal 8 site to pre-fill form fields with data, improving the user experience for returning visitors.

Reducing the number of form fields can increase conversion rates by 26 percent. 

Email Address

  • Forward-thinking email form fields suggest fixes when the email address has been entered incorrectly. … Did you mean @gmail.com?
  • If you include an auto-fix for mistyped email addresses, you won’t need to require the user to enter it twice. Otherwise, it’s a good approach to provide the extra guarantee that they’ve got it right.
  • When the form is for creating an account or signing up for a newsletter, a current practice is to use the email address as the account name instead of providing a separate username. This causes less frustration with creating one that is not already in use, as well as remembering it every time they log in.

Name

The person’s name is really only needed in instances where account personalization or custom communication is used. A frequent approach is to provide one field for their full name. This is a bonus since some users may have more than two words that make up their full name, and one field takes less time to complete (especially on mobile devices). Check first to see if the system requires the first name to be isolated for marketing purposes.

Password

  • Enough with the 'confirm password' already! Confirmation fields lower your conversion rates. Give the user the option to actually SEE the password they’re entering with a ‘show password’ toggle, and they won’t have to enter it twice.
  • Include a Password Strength Indicator. You can get creative with messaging to encourage users to try harder when creating a more secure password like Geeklist does: “Crack Time: 5 minutes”
  • Depending on the level of site security, another time-saving feature is the ability to never have to enter their password again with the ‘Remember Me’ feature.

#3 Multi-step Forms

Single step forms are the most common type of form found on a website. Sometimes, however, using the minimum amount of fields will not accomplish the goal of capturing essential user data. Instead of simply adding more fields to your one-page form you should be aware of an important point: 

Multi-step forms have been found to increase conversions by as much as 300% (without increasing website traffic). 

Multi-step forms convert well for several reasons:

Simplicity

Through the use of progressive disclosure design techniques, we can improve usability by presenting only the minimum data required for a task to the user. Multi-step forms provide the ability to break up a longer form into manageable steps so the user is not visually overwhelmed with requirements at the beginning or during the process. Including only one or two questions per step, with a manageable number of steps overall, will improve the user experience and significantly increase the chance they will complete it.

Reduced Psychological Friction

Multi-step forms with a simplified interface provide an opportunity to use low-friction language in order to reduce psychological friction. To encourage the user to become immersed, with an energized focus on the activity, we must always seek to minimize disruptions and use language that puts them in a positive state of mind.

Orientation

Progress bars encourage form completion. The most common use of visual progress trackers is during an online purchase, since those are often broken into a multi-step process. A progress tracker answers the questions the user may have during completion:

  • How long will the form take?
  • What comes next?
  • Is anything happening?

Displaying the steps required to complete the form, along with where the user currently is in the process, will help manage their expectations and keep them oriented throughout.

Investment

By requesting general information at the beginning of the form and moving toward more sensitive requests near the end, the user feels more invested and is therefore more likely to complete it.

Conditional Logic

Longer forms will sometimes benefit from conditional logic that personalizes the experience. The user is presented with specific questions based on certain responses, eliminating irrelevant questions while simultaneously obtaining more targeted data. Save them valuable time and customize their experience, and they will likely reward you by clicking the submit button.

#4 Make it Easy to Read

Consider the context of all text on the page, including labels and inputs, and work to ensure your font sizes are large enough to be legible on all devices. The amount of content on the page should be considered while also following best practices for accessibility.

  • Recent trends favor a font size of at least 14px.
  • When you specify a 16px font size on form fields, iOS will not zoom in when the user taps the field, because it isn’t needed. This approach can be less distracting, especially when there are multiple form fields on the page.
  • Consider the maximum number of characters that will be needed in all cases to ensure enough room is provided to complete each field. For example, postal codes in other countries use a varying number of characters.

#5 Inform Everything

Label All Things

The label of the form field you want the user to complete should ALWAYS remain visible. The labels can be placed outside of the field near the top, right, or left — or even better — use the Infield Top Aligned Label. This popular approach has been found to be the quickest to scan, to have the best flow, and to take up the least real estate. The labels are placed inside the field, jumping to the top left corner as the user begins typing. Either way, at no point should the user lose sight of the information that’s needed inside the field.

Inline Form Validation

  • Inform the user as they progress if anything has been entered incorrectly or if a field is missing information. Don’t make them click the ‘Submit’ button at the end of the form only to receive a bunch of red text telling them what they have to re-do.
  • Micro interactions such as a simple green check or a red ‘X’ along with a brief message as the user completes the form will improve the workflow.
  • Tell them if their CAPS LOCK IS ON.

Required or Optional?

Inform the user which fields are required and which are optional for the form to be accepted. An asterisk is often used to designate required information, but asterisks are ignored by screen readers, so make sure required fields include the HTML5 ‘required’ attribute or have aria-required set to true.
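To make that concrete, here is a minimal, hypothetical sketch of a required field marked up both visually and programmatically:

```html
<!-- Hypothetical markup: the asterisk is hidden from screen readers,
     while required/aria-required convey the requirement programmatically. -->
<label for="email">Email <span aria-hidden="true">*</span></label>
<input type="email" id="email" name="email" required aria-required="true">
```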

Field Details

Explaining the information needed for each field is another great approach. If your Registration Sign-up Form will require a password with at least 6 unique characters with 2 of them numbers, tell them! Does the phone number field require a +, or a country code, or an area code? Tell them or show them.

Progress Bar

  • A form that’s broken into logical steps is easier to complete. If there are multiple steps that require multiple screens to complete, add a progress bar so the user knows where they are in the process.
  • If possible, add a link to the completed steps in the progress bar so the user can go back if needed.

Safety

  • Make your users feel safe during sign-up by informing them about your terms, policies, or rules.
  • Assure them you will not share their information or spam their email.
  • Provide an easy way to cancel or opt out at any time, without much effort.

#6 Must be Mobile

While optimizing your site for mobile devices, any forms on your site should also be carefully considered. Not only are the screens smaller, but the connections are often slower and entering text can be tricky, so reducing the number of required fields is especially important. Luckily, recent innovations in mobile forms have provided modern solutions and compression techniques that can actually encourage sign-up on a mobile device:

Predefined DropDowns

  • Whenever possible, avoid open input fields and provide a dropdown list instead for easier completion.
  • Dropdown selections should be written as they would normally display (e.g., credit card expiry date: 01/20).

Collapsible Menus

This really helps when multiple offerings are available with details for each.

Predictive Search Fields

As the user begins typing the keyword, a list of possible results is provided. 

Calendar Tools

Choose a calendar that is easy to use, with big targets that help avoid user input errors.

Combine Inputs When Possible

Providing only one field for a ‘Full Name’ instead of one for ‘First Name’ and one for ‘Last Name’ will speed up the form completion process and reduce user frustration. 

Automatic Advance

The system should recognize when a date or email has been entered and take the user to the next field automatically, whenever possible.

Buttons That Engage

  • The ‘submit’ button should use a strong command verb that provokes emotion or enthusiasm, such as ‘Sign-Up Now!’
  • Use bright, engaging (and accessible) color combinations. Color changes on tap add even more visual information about the progress.
  • Ensure the tap target is large enough for users with big fingers or who have difficulty being accurate. Apple's iPhone Human Interface Guidelines recommend a minimum target size of 44 pixels wide by 44 pixels tall.

Final Word

Achieving a smart form design isn't always easy, but it's well worth the effort. 

What are some great examples of forms you've seen? 

Editor’s note: This post was originally published on November 14, 2016, and has been updated for accuracy and comprehensiveness.

Mar 15 2019

In this post we’ll see how to temporarily save values from a form and how to retrieve and process them later in a controller. To do that, we’ll use the Form API and the private tempstore (the temporary storage system of Drupal 8).

The use case is the following: we need to build a simple RSS reader (a form) where the user can enter the URL of an RSS file and the number of items to retrieve from that file. Next, on a new page (a controller), the application should display the list of items with a link to each syndicated page.

The easiest way to achieve this would be to retrieve the values in our buildForm() method, process them, and display the result through a specific field of the form. But that’s not our use case.

To process the form’s values and to display the results in another page, we’ll first need to store the form's values and retrieve them later in a controller. But how and where to store and retrieve those values?

Long story short: Store and retrieve data with the Private Tempstore System of Drupal 8

Drupal 8 has a powerful key/value system to temporarily store user-specific data and to keep them available between multiple requests even if the user is not logged in. This is the Private Tempstore system.

// 1. Get the private tempstore factory, inject this in your form, controller or service.
$tempstore = \Drupal::service('tempstore.private');
// Get the store collection. 
$store = $tempstore->get('my_module_collection');
// Set the key/value pair.
$store->set('key_name', $value);

// 2. Get the value somewhere else in the app.
$tempstore = \Drupal::service('tempstore.private');
// Get the store collection. 
$store = $tempstore->get('my_module_collection');
// Get the key/value pair.
$value = $store->get('key_name');

// Delete the entry. Not mandatory since the data will be removed after a week.
$store->delete('key_name');

Fairly simple, isn't it? As you can see, the Private Tempstore is a simple key/value storage organized in collections (by convention, we give the collection the name of our module) that keeps data available temporarily across multiple page requests for a specific user.

Now that you have the recipe, we can go back to our use case, where we are going to figure out how and where to store and retrieve the values of our form.

Long story: Illustrate the Private Tempstore system in Drupal 8

For the long story part, we are going to create a module with a form and a controller: first we get the data from the user, store the form’s data in a private tempstore, and redirect the form to a controller where we’ll retrieve the data from the tempstore and process it.

This is the form.

[Image: drupal8-privatetempstore-form]

And this is the controller.

[Image: drupal8-privatetempstore-controller]

You can find and download the code of this companion module here.

Types of data storage in Drupal 8

In Drupal 8 we have various APIs for data storage:

  • Database API - To interact directly with the database.
  • State API - A key/value storage for data related to the state of an individual environment (dev, staging, production) of a Drupal installation, like external API keys or the last time cron was run.
  • UserData API - A storage for data related to an individual environment but specific to a given user, like a flag or some user preferences.
  • TempStore API - A key/value storage to keep temporary data (private or shared) across several page requests.
  • Entity API - Used to store content (node, user, comment, …) or configuration (views, roles, …) data.
  • TypedData API - A low-level API to create and describe data in a consistent way.

Our use case refers to user-specific data we need for a short period of time, data that is also not specific to one environment, so the best fit is certainly the TempStore API but in its private flavour because the data and the results are different for each user.

The only difference between private and shared tempstore is that the private tempstore entries strictly belong to a single user whereas with shared tempstore, the entries can be shared between several users.

Storing the form's values with the Private TempStore storage

Our goal is to build a form where the user can enter an RSS file URL and a number of items to retrieve from that file; next, we need to temporarily store those values so we can retrieve them later in a controller.

Let’s now take a closer look at how we could store (or save) our form’s values. As we get the data (url and items to retrieve) from a form, it’s clear that we are going to store the data in the submitForm() method since this method is called when the form is validated and submitted.

Here is the code of our form.

<?php
namespace Drupal\ex_form_values\Form;
use Drupal\Core\Form\FormBase;
use Drupal\Core\Form\FormStateInterface;
// DI.
use Symfony\Component\DependencyInjection\ContainerInterface;
use Drupal\Core\Messenger\MessengerInterface;
use Drupal\Core\Logger\LoggerChannelFactoryInterface;
use Drupal\Core\TempStore\PrivateTempStoreFactory;
/**
 * Class WithStoreForm.
 *
 * Get the url of a RSS file and the number of items to retrieve
 * from this file.
 * Store those two fields (url and items) to a PrivateTempStore object
 * to use them in a controller for processing and displaying
 * the information of the RSS file.
 */
class WithStoreForm extends FormBase {
  /**
   * Drupal\Core\Messenger\MessengerInterface definition.
   *
   * @var \Drupal\Core\Messenger\MessengerInterface
   */
  protected $messenger;
  /**
   * Drupal\Core\Logger\LoggerChannelFactoryInterface definition.
   *
   * @var \Drupal\Core\Logger\LoggerChannelFactoryInterface
   */
  protected $loggerFactory;
  /**
   * Drupal\Core\TempStore\PrivateTempStoreFactory definition.
   *
   * @var \Drupal\Core\TempStore\PrivateTempStoreFactory
   */
  private $tempStoreFactory;
  /**
   * Constructs a new WithStoreForm object.
   */
  public function __construct(
    MessengerInterface $messenger,
    LoggerChannelFactoryInterface $logger_factory,
    PrivateTempStoreFactory $tempStoreFactory
  ) {
    $this->messenger = $messenger;
    $this->loggerFactory = $logger_factory;
    $this->tempStoreFactory = $tempStoreFactory;
  }
  /**
   * {@inheritdoc}
   */
  public static function create(ContainerInterface $container) {
    return new static(
      $container->get('messenger'),
      $container->get('logger.factory'),
      $container->get('tempstore.private')
    );
  }
  /**
   * {@inheritdoc}
   */
  public function getFormId() {
    return 'with_state_form';
  }
  /**
   * {@inheritdoc}
   */
  public function buildForm(array $form, FormStateInterface $form_state) {
    $form['url'] = [
      '#type' => 'url',
      '#title' => $this->t('Url'),
      '#description' => $this->t('Enter the url of the RSS file'),
      '#default_value' => 'https://www.drupal.org/planet/rss.xml',
      '#weight' => '0',
    ];
    $form['items'] = [
      '#type' => 'select',
      '#title' => $this->t('# of items'),
      '#description' => $this->t('Enter the number of items to retrieve'),
      '#options' => [
        '5' => $this->t('5'),
        '10' => $this->t('10'),
        '15' => $this->t('15'),
      ],
      '#default_value' => 5,
      '#weight' => '0',
    ];
    $form['actions'] = [
      '#type' => 'actions',
      '#weight' => '0',
    ];
    // Add a submit button that handles the submission of the form.
    $form['actions']['submit'] = [
      '#type' => 'submit',
      '#value' => $this->t('Submit'),
    ];
    return $form;
  }
  /**
   * Submit the form and redirect to a controller.
   *
   * 1. Save the values of the form into the $params array
   * 2. Create a PrivateTempStore object
   * 3. Store the $params array in the PrivateTempStore object
   * 4. Redirect to the controller for processing.
   *
   * {@inheritdoc}
   */
  public function submitForm(array &$form, FormStateInterface $form_state) {
    // 1. Set the $params array with the values of the form
    // to save those values in the store.
    $params['url'] = $form_state->getValue('url');
    $params['items'] = $form_state->getValue('items');
    // 2. Create a PrivateTempStore object with the collection 'ex_form_values'.
    $tempstore = $this->tempStoreFactory->get('ex_form_values');
    // 3. Store the $params array with the key 'params'.
    try {
      $tempstore->set('params', $params);
      // 4. Redirect to the simple controller.
      $form_state->setRedirect('ex_form_values.simple_controller_show_item');
    }
    catch (\Exception $error) {
      // Store this error in the log.
      $this->loggerFactory->get('ex_form_values')->alert(t('@err', ['@err' => $error]));
      // Show the user a message.
      $this->messenger->addWarning(t('Unable to proceed, please try again.'));
    }
  }
}

In the submitForm() method we can see two lines where we deal with the private tempstore, in steps 2 and 3.

$tempstore = $this->tempStoreFactory->get('ex_form_values');
//...
$tempstore->set('params', $params);

In the first line we call the PrivateTempStoreFactory to instantiate a new PrivateTempStore object through its get() method. As you can see, we get the factory by DI. We also name our collection (ex_form_values) after our module (by convention) and pass it to the get() method. At this point, we have created a new PrivateTempStore object for a collection named "ex_form_values".

In the second line we use the set() method of the PrivateTempStore object to store the key/value pair we need to save: in our case, the key is 'params' and the value is the $params array that contains our form’s values.

The PrivateTempStoreFactory uses a storage based on KeyValueStoreExpirableInterface: a key/value storage with an expiration date that allows automatic removal of old entries. By default it uses the DatabaseStorageExpirable storage, which saves the entries in the key_value_expire table.

Here is the structure of this table:

[Image: key_value_expire structure]

The PrivateTempStore object has the following protected properties, all set when the object is created:

  • $storage - The key/value storage object used for this data.
  • $lockBackend - The lock object used for this data.
  • $currentUser - The current user who owns the data. If they are anonymous, the session ID is used instead.
  • $requestStack - Service used to start a session for anonymous users.
  • $expire - The time to live for the entry in seconds. By default, entries are stored for one week (604800 seconds) before expiring.

We can’t set any of those values ourselves; they are all set by the PrivateTempStoreFactory when we instantiate the new PrivateTempStore object.

The PrivateTempStore method we are interested in at this point is the set() method, which stores the key/value pair we need to save. You can find the full code of this method here; below is the beginning of the method:

public function set($key, $value) {
  // Ensure that an anonymous user has a session created for them, as
  // otherwise subsequent page loads will not be able to retrieve their
  // tempstore data.
  if ($this->currentUser->isAnonymous()) {
    // @todo when https://www.drupal.org/node/2865991 is resolved, use force
    //   start session API rather than setting an arbitrary value directly.
    $this->startSession();
    $this->requestStack
      ->getCurrentRequest()
      ->getSession()
      ->set('core.tempstore.private', TRUE);
  }
  // ...
}

As you can see at the beginning of this method, we ensure that an anonymous user has a session created for them; thanks to this, we can retrieve a session ID that we will use to identify them. For an authenticated user, we use their UID instead. You can confirm this in the getOwner() method of the PrivateTempStore object.
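That owner lookup can be sketched roughly as follows. This is a simplification for illustration, not the exact core implementation; check getOwner() in \Drupal\Core\TempStore\PrivateTempStore for the real code.

```php
/**
 * Rough sketch of the owner resolution: authenticated users are keyed
 * by their UID, anonymous users fall back to their session ID.
 */
protected function getOwner() {
  return $this->currentUser->id()
    ?: $this->requestStack->getCurrentRequest()->getSession()->getId();
}
```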

When we submit the form, thanks to these two lines, we save the form's values in the database. If you look at the key_value_expire table, you can see that there are other records, most of them holding state information for our application, but also new records with the collection value tempstore.private.ex_form_values. Here is a query on this collection after both an anonymous user and the admin user have used the form.

[Image: key_value_expire query results]

We didn’t output the 'value' column because it’s too large. But you can now see how our form's values are stored in the database, and in particular how anonymous users are identified by their session ID.
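A query along these lines lists the tempstore entries for our collection (the column names follow the key_value_expire table structure; the large serialized value column is deliberately left out):

```sql
-- List the private tempstore entries created by our form.
SELECT collection, name, expire
FROM key_value_expire
WHERE collection = 'tempstore.private.ex_form_values';
```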

At this point, we know how to temporarily store the values of a form in the private tempstore of Drupal. That was not so hard, was it? Yes, Drupal is a great framework and it saves us a lot of time when developing a web application.

Now let’s see how to retrieve and process the values in a controller.

Retrieve our form's values from the private tempstore

To retrieve the data, process it, and display the results, we’ll need to redirect our form to a controller. Here is the code of this controller.

<?php
namespace Drupal\ex_form_values\Controller;
use Drupal\Core\Controller\ControllerBase;
// DI.
use Drupal\Core\Messenger\MessengerInterface;
use Drupal\ex_form_values\MyServices;
use Symfony\Component\DependencyInjection\ContainerInterface;
use Drupal\Core\TempStore\PrivateTempStoreFactory;
use GuzzleHttp\ClientInterface;
// Other.
use Drupal\Core\Url;
/**
 * Target controller of the WithStoreForm.php .
 */
class SimpleController extends ControllerBase {
  /**
   * Tempstore service.
   *
   * @var \Drupal\Core\TempStore\PrivateTempStoreFactory
   */
  protected $tempStoreFactory;
  /**
   * GuzzleHttp\ClientInterface definition.
   *
   * @var \GuzzleHttp\ClientInterface
   */
  protected $clientRequest;
  /**
   * Messenger service.
   *
   * @var \Drupal\Core\Messenger\MessengerInterface
   */
  protected $messenger;
  /**
   * Custom service.
   *
   * @var \Drupal\ex_form_values\MyServices
   */
  private $myServices;
  /**
   * Inject services.
   */
  public function __construct(PrivateTempStoreFactory $tempStoreFactory,
                              ClientInterface $clientRequest,
                              MessengerInterface $messenger,
                              MyServices $myServices) {
    $this->tempStoreFactory = $tempStoreFactory;
    $this->clientRequest = $clientRequest;
    $this->messenger = $messenger;
    $this->myServices = $myServices;
  }
  /**
   * {@inheritdoc}
   */
  public static function create(ContainerInterface $container) {
    return new static(
      $container->get('tempstore.private'),
      $container->get('http_client'),
      $container->get('messenger'),
      $container->get('ex_form_values.myservices')
    );
  }
  /**
   * Target method of the WithStoreForm.php.
   *
   * 1. Get the parameters from the tempstore for this user
   * 2. Delete the PrivateTempStore data from the database (not mandatory)
   * 3. Display a simple message with the data retrieved from the tempstore
   * 4. Get the items from the rss file in a renderable array
   * 5. Create a link back to the form
   * 6. Render the array.
   *
   * @return array
   *   A render array.
   */
  public function showRssItems() {
    // 1. Get the parameters from the tempstore.
    $tempstore = $this->tempStoreFactory->get('ex_form_values');
    $params = $tempstore->get('params');
    $url = $params['url'];
    $items = $params['items'];
    // 2. We can now delete the data in the temp storage.
    // It's not mandatory, since the record in the key_value_expire table
    // will normally expire in a week.
    // We comment this task for the moment, so we can see the values
    // stored in the key_value_expire table.
    /*
    try {
      $tempstore->delete('params');
    }
    catch (\Exception $error) {
      $this->loggerFactory->get('ex_form_values')->alert(t('@err',['@err' => $error]));
    }
    */
    // 3. Display a simple message with the data retrieved from the tempstore.
    // Set a cache tag, so when an anonymous user enters the form
    // we can invalidate this cache.
    $build[]['message'] = [
      '#type' => 'markup',
      '#markup' => t("Url: @url - Items: @items", ['@url' => $url, '@items' => $items]),
    ];
    // 4. Get the items from the rss file in a renderable array.
    if ($articles = $this->myServices->getItemFromRss($url, $items)) {
      // Create a render array with the results.
      $build[]['data_table'] = $this->myServices->buildTheRender($articles);
    }
    // 5. Create a link back to the form.
    $build[]['back'] = [
      '#type' => 'link',
      '#title' => 'Back to the form',
      '#url' => Url::fromRoute('ex_form_values.with_store_form'),
    ];
    // Prevent the rendered array from being cached.
    $build['#cache']['max-age'] = 0;
    // 6. Render the array.
    return $build;
  }
}

First we inject all the services we need, including a custom service, MyServices.php, used to retrieve the RSS file and the required number of items.

This controller has a simple method called showRssItems(). This method performs six tasks:

1. Get the data we need from the PrivateTempstore
2. Delete the PrivateTempStore data from the database (not mandatory)
3. Display a simple message with the data retrieved from the tempstore
4. Get the items from the rss file in a render array
5. Create a link back to the form
6. Render the array.

The tasks that interest us are the first and second ones, where we deal with the tempstore.

$tempstore = $this->tempStoreFactory->get('ex_form_values');

This line is now straightforward for us, since we know that the get() method of PrivateTempStoreFactory instantiates a new PrivateTempStore object for the collection 'ex_form_values'. Nothing new here.

$params = $tempstore->get('params');

In the line above, we retrieve the value of the key 'params' from the storage collection into the variable $params. Remember that because we are using the private temp storage, the retrieval also checks that it's the same user, via the getOwner() method of the PrivateTempStore object.
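One practical note, assumed rather than shown in the original: get() returns NULL when nothing has been stored for the current user, for instance when someone opens the controller's URL directly. A defensive version of this step might therefore look like:

```php
// Retrieve the stored values for the current user (NULL if absent).
$params = $tempstore->get('params');
// Guard against direct visits to the controller, where no form values
// exist yet for this user, by sending them back to the form.
if (empty($params['url'])) {
  return $this->redirect('ex_form_values.with_store_form');
}
$url = $params['url'];
$items = $params['items'];
```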

The second task is deleting the data from the tempstore. This is not mandatory, since we are working with a temporary store where the data is deleted automatically after a week by default. But if we expect heavy use of our form, it can be a good idea, since we won't normally need the data anymore.

try {
  $tempstore->delete('params');
}
catch (\Exception $error) {
  $this->loggerFactory->get('ex_form_values')->alert(t('@err',['@err' => $error]));
}

As the delete() method of our private tempstore object can throw an exception, we catch it in a try/catch block. For learning purposes, I suggest commenting out those lines to see how the data is stored in the key_value_expire table.

The rest of the controller is pretty straightforward: we pass the data to small methods of our custom service to retrieve the RSS file and the number of items we need, and build a render array with the results, as you can see in task number four. Next, we create a simple link back to our form and render the entire array. Nothing strange here.

Recap

We wanted, through a form, to get an RSS file URL and the number of items to retrieve from it, and to display this information in a controller. To do so, we needed to store the form's information, then retrieve and process it later in a controller.

We decided to use the Private TempStore system of Drupal because:
- The information doesn't belong to a specific environment or a particular state of our application (see the State API for that case)
- The information is not specific to a user and doesn't need to be stored with their profile (see the UserData API for that case); on the other hand, anonymous users should also be able to access and use the form
- We need the information only temporarily, for a short period of time
- It has to be private, because the information belongs to each user (anonymous or authenticated)

To store the data, we used the submit() method of our form and the following lines:

$tempstore = $this->tempStoreFactory->get('ex_form_values') - to instantiate a new PrivateTempStore object with a collection name 'ex_form_values' that refers to our module in this case.

$tempstore->set('params', $params) - to save the value of the variable $params in the database with the key 'params'. Remember that since we use a private storage, Drupal retrieves the user ID (or the session ID for anonymous users) and concatenates it with the key in the 'name' column of the 'key_value_expire' table.

To retrieve the information in our controller’s showRssItems() method, we used the following code:

$tempstore = $this->tempStoreFactory->get('ex_form_values') - to instantiate a new PrivateTempStore object with a collection name 'ex_form_values' that refers to our module in this case.

$params = $tempstore->get('params') - to retrieve the value stored in the keyed 'params' record of this collection and for this particular user.

Yes! Once again, through this simple example, we can see how powerful Drupal is, and how easy its framework is to use.

And you? In which situations do you think we could use the private tempstore? Please share your ideas with the community.

Mar 15 2019
Mar 15

Adam stood in the middle of the garden, enveloped in exquisite beauty. The world was there to delight him: succulent fruit, dignified trees, green meadows, sparkling pools and species of all kinds. Yet as he stood contemplating nature, he felt a certain loneliness, and thus the Lord said:

It is not good that man is alone. I shall make him a compatible helper.

With the creation of the other species, both male and female sprang up at the same time. If the beginning of the entire universe was chosen to be this way, how can business be any good without clients and a strong relationship with them? Right?

 Image of two hands where the upper one is offering an apple to the lower one


A productive and enduring relationship not only provides consistent value to clients, but also builds a healthy connection in every business venture.

There are times, though, when you get stuck in a rut with clients and the relationship starts to rot.

So, how do you change it? 

Maybe with some strategies, or maybe with the help of a plan. Whatever it may be, here are some approaches you can adopt to sweeten up that sour relationship and add more productivity to a particular project.

But how does a perfect client relationship get ruined?

Under ideal circumstances, organizations and big enterprises treat their clients right. However, when they are under pressure to sell more or retain paying customers, chances are that they might deviate from their standards, resulting in poor client satisfaction.

With this context, here are some of the actions that can kill a perfectly built client relationship:

Saying yes to a client when you should not 

There is no shame in accepting the fact that your organization can meet only a certain level of expectations and not beyond it. Taking on clients who are not a good match is foolishness. If an organization knows that it is going to hate dealing with a client, or that it might fail to meet their quality standards, the money is not worth the inevitable breakdown.

Overpromising 

When there is a wide gap between expectations and reality, the result is disappointment. If you are selling software, or any product for that matter, don't promise an integration that will take a week or so and won't work perfectly. Make only those commitments that you know are humanly and technically feasible. Overpromising breeds fear.

Not addressing the key details 

When you are serving a client, it is necessary to cover every detail of the project. Leaving details out by accident is one thing; if you leave out details intentionally, you will ruin the relationship. So address each and every key detail.

Being unauthentic 

If you focus only on yourself and what works for you, you will treat time spent considering what's best for your team, company, or business partnership as a waste. Adopting an all-or-nothing attitude and acting however is needed to win favor, seal a deal, or make a sale, even if it means lying or misrepresenting your position, is a recipe for a sour relationship.

Image showing a handset in red color with text as “The Customer”. There are arrows that are connected to it and many doodles depicting important factors

Taking These Few Important Steps for an Enduring Relationship 

We all know that a huge amount of time and effort is spent acquiring clients, yet very few businesses spend the same energy nurturing the relationship. Here are some tips that will help you sustain your client relationships.

Communication is the key 

Clients depend on you to keep them informed, so constant communication with them should be a top priority. This includes updating them on various projects, as well as helping them understand any bumps you may encounter on the product delivery journey.

Information distribution

Don't delay sharing knowledge that might be useful to clients, whether or not it benefits your organization. The more value you provide, the more a client comes to depend on you. Don't hesitate to share important and crucial data.

Integrity 

If you are not honest with your client, and vice versa, no long-term relationship survives. In addition to producing a product or service, your client requires you to show responsibility in all your dealings. Clients nowadays are really intelligent; they understand when they are being deceived or misled. Telling a "white lie" about why you failed can ruin your reputation, and without a reputation for integrity, you will fail to cultivate the kind of long-term relationships that your business stands on.

Encourage multi-player team involvement 

The success of any project depends on the contribution of every member of the team. Encouraging a multi-player team, with the involvement of the dev team, can bring laurels to your project. This way, team members have a sense of ownership in a group project and believe that their contributions are valued, so they feel motivated to share their best work.

Goals 

There might be times when you feel that you and your client are not on the same page. You have your own objectives and your client has theirs. The solution to this common issue is to set mutual goals.

As soon as you start a new project and commit to deadlines, be upfront about any vital products or services that might not be available in time to meet the client's needs. Set mutual goals from the very beginning to avoid any friction later on.

Work for a strong partnership 

If you are building the relationship in all the appropriate ways, and of course providing the products and services your client needs, you can work on developing a partnership with the client, something that goes beyond the project itself.

A client who sees that the organization serving them is in it for the long haul, and is motivated to help them succeed, soon starts to view that organization as more than just a vendor or supplier. You become a partner in their enterprise and someone they grow to value today, tomorrow and in the years to come.

Looking into the performance 

Re-examine the cost 

If you have been working with a particular customer for a long time, re-examine what it really costs you to serve them. It may be feasible to cut your price if serving them has become cheaper.

Perceiving the Product 

Instead of thinking about what your product is or what it does, focus on how it makes customers feel. Even if you sell software, your software may relieve the stress of trying to get work done in a limited amount of time, or make customers feel confident that they are doing the job right.

Modify the strategy of budget 

If your customer's CEO won't approve your product, reposition what you sell from a capital cost into an expense. Often, capital spending is frozen while monthly expenses continue to be budgeted.

Finding an efficient distributor 

Sell your wares through a distributor if customers start to need smaller quantities or more service. Perhaps your service has declined as you pursued larger customers. If so, get a third party to sell to and service your customer properly. You don't need to make as much if you are doing less.

Selling your Service 

If they won't buy your service by the unit, sell it by the hour or by the result. Buyers are so often told to cut costs by cutting inventory.

Offer a warranty if your product is at fault

If your product or service was deficient, offer some kind of insurance to assure your customer it won’t be a problem next time.

Managing the departments 

The reasons customers buy from you can change over time. A purchasing department may make the decisions until its company has legal or customer problems, at which point the finance or marketing departments may have the final say.

Managing projects with the help of various methods 

  • Waterfall: One of the more traditional project management methodologies, Waterfall is a linear, sequential design approach where progress flows downwards in one direction like a waterfall.
     
  • Agile development: Agile is best suited for projects that are iterative and incremental. It's a type of process where requirements and solutions evolve through the collaborative effort of self-organizing, cross-functional teams and their customers.
     
  • Scrum: Scrum comprises five values: commitment, courage, focus, openness, and respect. Its goal is to develop, deliver, and sustain complex products through collaboration, accountability, and iterative progress. 
     
  • Kanban: Kanban is another popular Agile framework that, similar to Scrum, focuses on early releases with collaborative and self-managing teams.
     
  • Six Sigma: It aims to improve quality by reducing the number of errors in a process by identifying what is not working and then removing it from the process.

Case Studies 

Ivey Business Journal 

A three-year cross-industry study covered in the Ivey Business Journal explained how poor business strategy, inappropriate communication or damaged working relationships between partners account for 94 percent of all broken and failed alliances. On their own, poor or damaged working relationships account for 52 percent of all broken alliances.

An alliance can break down for several reasons: interpersonal problems, team members failing to communicate, high attrition rates and, most importantly, failure to reach a milestone.

When an alliance is recognized as broken, there are many critical tasks to perform and many separate decisions to be made. Partners need to diagnose why the alliance has broken down, examine and interpret the existing obstacles, disputes or tensions, and create a specific procedure to overcome these problems. They must equip themselves to sustain a long-term relationship.

To relaunch your relationship with your client a three-step process can be followed:

  • Audit the relationship, diagnosing the root causes
  • Confirm commitment: the partnership can succeed only if both organizations are fully persuaded that the alliance is the most effective means to meet their goals
  • Conduct relationship planning to build a joint contract and a shared understanding of the deal
Image of a pie chart where 52% is red in color, 37% is blue in color and 11% is green in color. The pie chart shows the causes of partnership failure


OEM Profitability and Supplier Relations 

OEM Profitability and Supplier Relations - which is based in part on data gathered over the past 13 years through the annual Working Relations Index Study published by consultancy Planning Perspectives - found that the better the relationship an automotive manufacturer has with its suppliers, the greater its profits are.

It explained that the study "quantifies the economic value of suppliers'" non-price benefits. These include a supplier sharing new technology, providing its best team to support the manufacturer, and providing support that goes beyond the supplier's contractual obligations.
 
The report added the research “establishes the fact that the economic value of the suppliers’ non-price benefits can greatly exceed the economic benefit realized from suppliers’ price concessions”. On average, this can be up to four to five times greater, according to the research.

Conclusion 

To get customer loyalty in today’s rapidly changing competitive world, companies need to rethink.

  • How do they engage customers?
  • Do they have the appetite required to build loyal relationships?
  • Is it even the right strategy for them in the first place?

Determine what your business and shareholders need first. If it's short-term financial gains, then customer loyalty should not be a stated goal. Clients seek relationships with their vendors. They want a place to be heard, a place to be appreciated and a place to connect.

At Opensense Labs, we use social technologies and services that allow us to take relationships with customers to higher levels. Connecting with customers' personal values helps place you ahead of the competition in winning the hearts and minds of your customers.

Ping us at [email protected] now. 

Mar 15 2019
Mar 15

As Community Liaison, I find it important to liaise face-to-face whenever I can, and an opportunity presented itself to visit a community I have not been able to spend time with until now: the community in India.

This was going to not only be the first time I’ve worked with the community in India but also my first time in India. I couldn’t help but wonder, “Will I see any elephants?”

Think Indian!

I found myself sat on a motorbike at the side of a road in Goa, India and about to plunge into the traffic for the first time. At home, I’m an experienced motorcyclist but here, everything is different. I have to learn fast…

Waiting for a gap in the stream of vehicles that will never come, I shout at myself, “Come on Rachel, think Indian!” I just need to adjust how I think and accept that the traffic conditions here are not better or worse, just different, and to “go with the flow”. I take the plunge and catch up with Surabhi Gokte and Manjit Singh (the extremely generous community member who loaned me his beautiful Royal Enfield, pictured here) on their scooter and we disappear into the night.

Manjit's beautiful Royal Enfield Classic 350
Manjit's beautiful Royal Enfield Classic 350 - quite a change from my own BMW...

I learned a lot about India riding Manjit’s motorbike over the next couple of days (yes, sorry Manjit - I may have added another 250Km to the clock!) and at the marvellous Drupal Camp Goa that I had flown out to join.

Drupal Camp Goa was a great place to meet the Drupal community in India and I was determined to find out as much as I could about what was going on there. What I learned was that the community is hugely vibrant and doing amazing things —at a scale I simply never grasped before.

I was invited to do an opening talk at the beginning of the weekend and I chose to speak about community, as is my wont, and how we can ensure that the whole World is aware of the community here. It is well documented in Dries’ blog posts that the contributions to the project from India are significant; they are, after all, the second most prolific country in code contributions after the USA, and then only by a small margin.

Graph showing the most active countries contributing code to Drupal
Code contributions to Drupal, by country. Taken from https://dri.es/who-sponsors-drupal-development-2018 

What struck me, though, was that I didn’t know the amazing individuals here and I don’t see them featured enough in the global Drupal conversation. I talked about how we recognise the contributions made but we all have a responsibility to ensure that we facilitate people outside of places like the USA and Europe moving into “positions of influence” in our global community; places like the track chairs of global Drupal conferences, our many working groups and so on. I would very much like every lead of every Drupal working group to be asking themselves, “If 13% of our code contributions come from India, do we have at least 13% of our group’s leadership represented by Indians and, if not, why not?”

I was particularly struck at the quality of the sessions that I attended, and the scope of the things they were discussing.

Two sessions stuck out in my mind: The first was one on machine learning and its applications in Drupal, by Akanksha Singh. She not only described much of the history behind machine learning and explained how it may be used for many applications, she described a Drupal module she is finishing developing (on drupal.org/projects very soon!) that will allow sentiment analysis of content via tools like IBM Watson. I can think of a thousand uses for this!!

I very much enjoyed a session on “college leaver” education in Drupal by Sharmila Kumaran. It seems that she has developed a system by which they spend time exposing a lot of students to Drupal and then identify those who they think have potential to move into Drupal careers. Pretty standard stuff but then they mentioned the scale of the operation and I sat up: In 2019, they are expecting to have exposed over 6,000 (yes, six thousand) college-aged people to Drupal and how it works. Is there anyone else, anywhere in the World, educating people in Drupal at this scale??

The whole camp was full of people doing amazing things. The organisers were doing a fantastic job, the food was awesome, and I left with an overwhelming feeling that the region, including India, will power the growth of Drupal for a long time.

Shout out to all the Volunteers and Speakers of @DrupalcampGoa 2019 - We are happy to give you credits on https://t.co/Ip1TSC0VnA as you have contributed to #DCG19. Here is the link - https://t.co/hhXnXEB5Xx, please add comments to justify your contribution :) pic.twitter.com/tlCApvXhQJ

— DrupalCamp Goa (@DrupalcampGoa) March 12, 2019

In the Mountains

We all highly appreciate the contributions Drupal community members make by becoming individual members of the Drupal Association, especially as they directly finance my ability to see people face to face at camps and meetups. I make every possible effort to spend that contribution wisely. So, on the return from India, I stopped off at Drupal Mountain Camp in Switzerland, where I had been invited to moderate a keynote panel on “The Future of Communities”, also involving Nick Veenhof, Imre Gmelig Meijling, Yauhen Zenko and Vincent Maucorps.

A recurring theme of the panel discussion was one of collaboration between the local associations, their members and each other. At the Drupal Association, we are working to aid that collaboration as much as possible and I was hugely impressed by the panel conversations. I learned more about how the local associations in Europe work with their members and each other to promote Drupal and facilitate growth. Truly a model to emulate across our global community.

One other thing - I very much appreciate that the event, thinking about its location, supplied a wonderful woolly hat rather than the usual t-shirt as a freebie. I know I'm not the only one who is aggrieved every time she is told "Oh, we only have straight fit t-shirts". Thanks Drupal Mountain Camp!

Drupal Mountain Camp woolly hat

But what about the elephants?

I never did manage to find any elephants whilst exploring India, or even in Switzerland. I did learn about two “elephants in the room”, though:

  1. The Drupal community in India is extraordinary, doing great things, and I wasn’t aware enough of this. That is my fault and I intend to change that. I think we should all be looking to how we can learn more, too. It is very obvious that Drupal is in a state of growth in India and we should be cultivating that - I wouldn’t be surprised at all if the next webchick or xjm is already in India. We all gain by helping her grow.

  2. The Drupal communities in Europe are organising themselves in some really professional ways; using their established local associations to promote the project and members through tools like press releases, hosting local Splash Awards etc. These local associations in Europe are shining examples to other countries around the World.

Mar 15 2019
Mar 15

As always at this time of the year, we set our course for DrupalCamp London 2019, the event we await most eagerly. This year our team was somewhat bigger: Grzesiek and Maciek were joined by Jaro. As always, we had great expectations and a thirst for knowledge – and as always, we were fully satisfied.

Wykład Macka na DrupalCamp London 2019

High level of DrupalCamp London

The conference began with the inaugural presentation by a valued Google Chrome developer, Rowan Merewood. Thanks to this, participants gained a solid amount of knowledge about user experience and website optimisation right at the start of the conference. The level of the successive lectures rose higher and higher – a trend maintained to the very end.

Session Jaro DCL 2019

What's more, we are incredibly proud of and satisfied with our own presentations. There were as many as six lectures delivered by Droptica this year! But more about that below :)

6 Droptica lectures

After DrupalCamp London 2018, we became convinced that one should never give up. Where did this come from? Last year we failed to deliver even one lecture during the conference. We drew conclusions and moved forward. The objective? Getting a chance to present our – quite considerable – knowledge at DrupalCamp London 2019.

Grzesiek session DrupalCamp London 2019

We did it! In the end, DrupalCamp London accepted as many as six of the presentations proposed by Droptica. We are bursting with pride! The conclusion? Persistence, company development, constant enhancement of knowledge and better preparation yielded better results than we had ever expected. As always, every success, small or big, gives us even more energy and motivation.

Jaro DrupalCamp London 2019

Below are the topics that we presented during the conference.

1. Maciej Łukiański:

  • Droopler distribution - How can you save even 100 man-days during development of a new website with Drupal.
  • Continuous integration in a Drupal project (docker, ansible, jenkins, …).

2. Grzegorz Bartman:

  • Workflow in distributed Drupal agency.
  • Scrum everywhere - how we implemented Scrum not only in software development projects.

3. Jarosław Bartman:

  • Drupal 8 SEO.
  • How to make Drupal editor-friendly.

Maciek session DrupalCamp London 2019

We are particularly proud of the distinction given to Jaro's presentation about Drupal 8 SEO during the conference's summary carried out by Richard Dewick from Drupal Centric. You can read more about it here.

DrupalCamp London 2019 with Droptica until the next year

The conference is over and the emotions are slowly subsiding. Droptica would like to thank DrupalCamp London for a great event: the sponsors, for bringing the event to life, and the organisers, for the effort put into its preparation. We are glad that during the conference we had the chance to meet many Drupal buddies and form many new relationships.

Gadgets DrupalCamp London 2019

DrupalCamp London is getting better year by year. That makes us happy because, as the proverb goes, the appetite comes with eating. We start the countdown to the next conference, keeping the practical and impressive conference gadgets with us for the time being. See you at the next DrupalCamp London!
 

Mar 14 2019
Mar 14

Day one

The weather was snowy and cold and caused some transportation delays. Though I arrived later than planned, I was able to attend afternoon workshops and work on some Drupal Contribution projects. The day finished with an Apéro, where everyone gathered for good drinks and great conversations. To finish the night, the Amazee team left to have dinner together.

Apéro

Day two

The 2nd day of Drupal Mountain Camp began with a discussion by a panel made up of Nick Veenhof, Imre Gmelig Meijling, Yauhen Zenko and Vincent Maucorps, to discuss the future of Drupal communities. The panel was moderated by Rachel Lawson. They took questions from the audience members, which included:

  • How can we attract young talent?
    By targeting students, as they have the most time. Having a DrupalCamp at a university shows them what the community has to offer. This can also be achieved by getting universities to add Drupal to their course curricula. Another way is offering a training initiative or talking to agencies.
  • What can we do about International collaborations?
    Related to the previous question, maybe offer a base camp or training day. This allows those who wouldn’t be able to attend a larger event to learn. Live streaming is a good option for those not able to attend in person.
  • What are the benefits of sponsoring events, such as Drupal Mountain Camp?
    Sponsoring is a great way to find talent and increase brand recognition, particularly for newer companies.

GraphQL 101: What, Why, How

This session was presented by fellow colleague Maria Comas, as a beginner’s guide to GraphQL. Throughout the presentation, it became clear why GraphQL is so powerful. I really liked the abbreviation WYGISWAF (What You Get Is What You Asked For), because that is essentially what GraphQL does. It will return only the data that you need. Maria showed us how this is achieved through demo code, before letting people know that this is already available in Drupal through the GraphQL module. As it was International Women’s Day, it was fitting that Maria ended the session with the following quote by computer scientist, Grace Hopper.

The most damaging phrase in the language is "We’ve always done it this way!"
- Grace Hopper
 

Maria Comas

Mob Programming: An interactive session

Next was Daniel Lemon, whose session was all about mob programming. Having already introduced a scrum team to mob programming, Daniel wanted to share the experience. This presentation gave a broad overview of mob programming. What impressed me most about this session was that Daniel didn't just want to explain to the audience what mob programming is, but got members of the audience to participate in a live mob session. This meant that those involved and those watching could see how mob programming works.

Participants were tasked with creating a gallery page and menu for the Drupal Mountain Camp site within 15 minutes, taking turns of 2 minutes each as driver or navigator. After introducing the task, the 5 participants were able to create a basic implementation of the gallery page. The session ended with a quick retrospective, in which participants were truly motivated to try this within their own company. Many felt it was a nice switch from the ordinary single-developer experience, but some observed it could be difficult to keep up, especially in the role of the driver.

On stage

Splash awards, fondue, sledding, and drinks!

The Splash Awards is about awarding the best Drupal projects of the year. Amazee Labs won an award for Zuerich.com in the category of Design/UX.

During the awards, Jeffrey McGuire treated us to sounds from the Alphorn, which I, personally, had never heard before. The sound produced was truly beautiful. After the awards, everyone made their way to the funicular station to collect their sleds and made their way up to the Belle Epoque restaurant. I was unable to go sledding as I didn’t have the right footwear, so I went to eat fondue with fellow colleagues Victor, Bastian, and Michael. There really is nothing better than ending the day with fondue.

Splash Awards

Day three

Day three started with a keynote, presented by Matthew Grill, about the Drupal Admin UI & JavaScript Modernisation initiative, in which he informed us about the current progress of the administration UI. After the initial showing at Drupal Europe, it was clear that existing modules wouldn’t be compatible. This led to the team creating extension points, which allow current modules to bundle and transpile their JavaScript for use with the Admin UI, without an extra build step.

It was clear that this was still a work in progress but nonetheless, it was nice to hear the latest update about the initiative. After the session, everyone was invited to the group photo. Say “Drupal”!

Group Photo

Current state of the Drupal Admin UI Redesign

The next session was again about the Drupal Admin UI, however, this time about the design. It was given by Sascha Eggenberger and Cristina Chumillas, who explained and showcased the new design system, wireframes, and the current state of the designs the initiative is proposing. It was clear that the design process was long and opinionated: they explained that designing a button wasn’t as straightforward as expected, due to the many states and types. The team is hoping for a release in Drupal 8.7, but it became clear, after someone asked, that progress is slow and this might not happen in time. It was noted that they also need help from contributors.

If you want to help or just know more about the above, head to the Admin UI & JavaScript Modernisation initiative.

Optimise your JavaScript

Saša Nikolić gave his session on optimising JavaScript. He opened with a short history of the internet, in which I learned that Drupal came before Facebook. Saša also covered data loading: loading lots of data, with lots of data manipulation, is not good for the user, as it will slow down page loads.

The session also explained how to address various scenarios and the general rules that every JavaScript developer should be familiar with in order to boost a site’s performance. This includes using tools like Google Chrome dev tools and Lighthouse. Tree shaking was another suggestion: include only the functions that are needed. I also came to learn about Prepack, a JavaScript bundle optimiser. Another useful piece of advice was to utilise CSS. Why use JavaScript for animations when CSS can take care of this? If unsupported browsers are the reason, leave it out and degrade as gracefully as possible. I also enjoyed the joke about “eval() = bad”.

Network was the bottleneck, now it’s JavaScript.
- Saša Nikolić

Open source contribution

This was my favourite session of the day in which I learned about the opinions of Christina Chumillas, Miro Dietiker, Kevin Wenger, Michael Schmid, and Lukas Smith about everything to do with open source. This was an open forum, moderated by Josef Dabernig, in which an audience member was encouraged to ask a question they had about open source.

  • What motivates you to contribute to open source?
    It is concrete: you can see what you have done. People will review your code, which not only makes the code better, but makes you better. On a side note, people should just work together and join forces; this is the mindset of Drupal.
  • What is the advantage of open source software over proprietary software?
    Not only does it help with the maintenance of the code, but having different backgrounds, helps with the innovation of the code. Proprietary software means being on your own, which sometimes is not productive.
  • What is a good way to avoid maintainer burnout?
    Having a coach is a good way to let them, and other people, know of any problems and get help from them. Avoid those that don't have your best interest at heart. Share the knowledge, don't let one person do everything, and don't take on everything yourself just for the credit.

It was really nice to hear those answers and I couldn’t agree more. As someone who loves to contribute to open source, I think the biggest benefit is that your code will only become stronger if you share your code with others. After all, two heads are better than one.

Closing

Lukas Smith gave a very thought-provoking and inspiring closing session titled "Diversity & Inclusion: Why and How?". Lukas shared personal insights into becoming active in improving diversity and inclusiveness. He challenged the audience with some shocking statistics on the low amount of female to male programmers across Switzerland and the United States and then revealed that in open-source this percentage is even lower.

What can we do to better ourselves and improve diversity? Lukas finished the session with several tips, some of which I find important to highlight:

  • Challenge your cognitive biases.
  • Consider following specifically people from marginalized communities in your chosen field.
  • Believe when members of marginalized communities point out issues with bias even if you have never encountered them.
  • Work on using inclusive language.

While talking about inclusion, I, along with everyone who attended, was happy to see that there were three sign language interpreters at the event. This meant that those who are deaf or with hearing difficulties were not excluded from the camp. This was another reason why this camp was exceptional.

If someone points out an offensive statement, make an effort to not become defensive. Listen, learn, move on.
- Lukas Kahwe Smith

After the closing everyone was invited for the ice hockey match between HC Davos and Rapperswil. This was my first time watching an ice hockey game, so it was wonderful to attend. It was a great match, with both a great atmosphere and great people. With that ended the great weekend that was Drupal Mountain Camp. I can honestly say that I had such a great time, especially spending time with my team and the Drupal community.

Finally, you hear it all the time, “thank you to all the sponsors”, but honestly, it cannot be expressed enough. Without them, great camps like Drupal Mountain Camp wouldn’t be possible.

The Game

Mar 14 2019
Mar 14

Using Drupal as your default CMS undoubtedly has advantages, however it also comes with its negative sides. The price you have to pay for its customizability, is the complexity and steep learning curve. Here at Sooperthemes, we have thought of you and developed an easy-to-use solution for you: Glazed Builder. With this visual Drupal page builder, you and your team of content creators and marketeers will be able to create rich content and beautiful web pages for your business, without having to touch a line of code.

In this article, I present to you 8 ways through which a visual page builder like Glazed Builder can further create value for your business.

 

1. Cut in half your landing page costs and time-to-market

Having a good landing page is paramount to the success of your business. However, it takes plenty of time and money to find the right people and tools to do it. With Glazed Builder as your Drupal 8 page builder, however, creating a landing page has never been easier, cheaper or faster. Content creators and marketeers will be able to create a visually stunning landing page in a matter of minutes, without having to rely on the IT department.

2. Stress less: Reduce employee turnover in your content team with true WYSIWYG

Are your employees stressed that the webpage they are building is going to look completely different than they imagined? Well, with Glazed Builder, your content creators will experience true WYSIWYG (what you see is what you get). That means that whatever they have imagined for your webpage is going to be their final result. No more senseless stress for your content creating team.

3. Get twice as much Drupal site-building work done by your most expensive staff: Developers

Developers are the most expensive members of your staff, yet they may not get work done as fast as you would like. The way to increase productivity is to have developers use Glazed Builder as your default Drupal page builder to build dynamic pages and dashboards that leverage Drupal's block and views systems. This way, you will make their job easier while also increasing their productivity.

4. Same-day web design and publishing by using the pre-built templates

You need to launch a webpage in a matter of hours and you don’t have the inspiration necessary to design a layout? Fret not, Glazed Builder, the Drupal page builder, has you covered. With a plethora of templates available, you just have to select the right template for your business, insert your content, and post it. It has never been easier.

5. Content creators will produce better, more effective content than your competitors.

Do you want to stand out from your competition in terms of content creation? Glazed Builder can help you and your content creators unleash their creativity. With an endless amount of customizability, Glazed Builder is sure to provide the right tools and power for your content creators to achieve their wildest dreams. When it comes to customizability, with Glazed Builder, the sky's the limit.

6. Reduce onboarding time and training costs: Reduce Drupal’s steep learning curve for content creators and marketeers

Every time a new tool is introduced to your business, you have to pay a large amount for training your employees. The same applies to Drupal: since it is a highly complex CMS, it has a steep learning curve and requires highly skilled developers to make it truly shine. However, Glazed Builder was engineered to be usable by even the most non-tech-savvy users. This way, your staff will quickly understand how to operate the visual builder, and you will reduce the time and money spent on training your personnel.

7. Save thousands on cloud hosting costs with a frontend tool that runs in your browser, not in your cloud

If you're thinking that a Drupal 8 website with the additional features of Glazed Builder requires a beefy server, you're wrong! 90% of Glazed Builder's magic is happening in the browser. Even our Drupal 8 demo sites with hundreds of demo content items run perfectly fine on affordable shared hosting solutions like our favorite Drupal hosting A2Hosting.

8. Better performance attracts a bigger number of visitors on your webpage

Even if you have top-notch content on your website, it’s irrelevant if it takes a long time to load. Most site visitors have little patience when it comes to loading a webpage; they will simply leave and visit the next one if it takes too much time. However, Drupal is the fastest of the bunch. It takes the least amount of time to load a page, which means that the likelihood of visitors leaving drops significantly.

 

Conclusion on Drupal Page Builder

Now that you know all of this, what are you waiting for?

Start improving your business today by using a visual page builder like Glazed Builder.

Mar 14 2019
Mar 14

TEN7 is a full-service digital firm, and our tagline is “We create and care for Drupal-powered websites.” Creating and building a website is the sexy visible part, but caring for a website over time is the just-as-important maintenance work. Keeping your site code updated, backed up, secure and performing well is the job of a support team.

Most of the clients we design and build websites for also ask us to support their site after it’s built, which we happily do. But clients with sites built by other companies often approach us to support their site as well. However, we don’t take on support clients lightly.

Why You Can’t Just Pay Us to Be a Support Client

If you want to pay us to support your site, why wouldn’t we just do it? Well, when we were just starting out, there was a time when we’d take on whoever came along and offered us cash.

We took on support clients without knowing anything about what was “under the hood” of their site. We had to scramble to figure out how a site worked as support issues came in, which led to increased problem-solving times. This resulted in subpar support of the client, and subpar support led to real unhappiness for our team members.

As you might have guessed, the impetus for a change in the way we did business was a particularly difficult client and their site. It took two years to effectively end the relationship. The experience was a turning point in our company’s existence: it led me to think about what we’d done wrong, and reverse-engineer a way to prevent it from happening again. You can hear more about this experience in my Drupal Camp talk, “Know Yourself First.”

As we’ve grown and matured as a company, we’ve learned the value of recognizing the right clients and building the right process to support their sites. This ensures that we do work that meets our high standards, that our team is respected and happy, and that there’s a good vibe between all parties. Our multi-step onboarding process for new support clients lets us accomplish all these goals.

The Audit → Improve → Care Process

Each step in this process is a separate engagement, and has its own pricing and contract. When a step is completed, either party can quit with no hard feelings. The three steps are:

  1. TEN7Audit: We perform a comprehensive site audit and present a report of findings and recommendations.
  2. TEN7Improve: We implement selected recommendations from the audit report (at minimum the most critical ones) based on budget and need.
  3. TEN7Care: This is the final part of the process, an annual support agreement.

Let’s learn more about each step.

Step 1: TEN7Audit

Before we’ll agree to support your site, we need to know what we’re dealing with.

To figure this out, we perform a complete audit for a flat fee of $2,500. We settled on this cost because while it’s not exorbitant, it’s high enough that it will disqualify anyone who isn’t serious. The entire process can take up to six weeks, from when a client first expresses interest until the day we present our findings report.

Site Audit and Analysis

The site audit starts with us making an exact replica of the site so that it can be examined without affecting its live operation. We run Healthcheck, a Drupal module sponsored and maintained by TEN7, to generate a baseline report for review.

Healthcheck is like your Drupal site's own personal physician and can run continuously after installation to keep tabs on the health of the site over time (hence its name!). It has a user-friendly, action-oriented dashboard that shows you issues, ranked in terms of urgency. We install Healthcheck on sites we support.

In the TEN7Audit process, Healthcheck provides the initial lay of the land and identifies things like whether module updates are required, or whether the site has been hacked or modified. Next, we get humans involved to take a deeper look at where the site is hosted, whether there is a version control workflow, whether continuous integration and automation exist, how the infrastructure is configured and more. For example, a Healthcheck-identified performance issue (slow page load speed) could have numerous causes, from giant images to caching being disabled. Humans have the best chance of ascertaining and reporting the causes.

Audit Results

When the TEN7Audit process is complete, we present a report of findings with a prioritized list of issues and recommendations. We define critical issues that need the most attention, as well as recommendations for repair and optimization:

  • Tier 1 - Critical: Issues that need to be fixed as soon as possible for site security
  • Tier 2 - Best Practices: Improvements that will reduce technical debt, optimize the site, and won’t negatively affect the site if not immediately done
  • Tier 3 - Nice to haves: Long-term improvement goals you should consider for the sustainability and success of your site

Here’s an example of a TEN7Audit Findings Report.

What’s Next?

After a TEN7Audit has been completed, the next step involves determining which recommendations from the report should be implemented, and then doing so to improve your site. Of course, this also gives us an opportunity to evaluate our relationship with you and decide whether it makes sense for us to continue (and if not, no hard feelings!)

Step 2: TEN7Improve

You’ve decided you’d like to have TEN7 work on the improvements identified in the TEN7Audit, and we’ve also decided that we’d like to work with you to help improve your site.

The recommendations in the TEN7Audit report will be accompanied by an estimated number of hours to fix each issue. The total cost for TEN7Improve varies for every site, and is calculated by multiplying the number of hours estimated to fix chosen issues and our hourly rate at the time of the site audit. A budget for TEN7Improve starts at around $3,000 for a site with minor issues and can go up to $10,000 or more for a site in bad shape.

If we’re going to put ourselves in a position to care for your site in the long run, you'll expect consistent, high quality work. We require Tier 1 Critical Issues to be fixed to proceed with the TEN7Improve step, but we also highly recommend fixing Tier 2 Best Practice issues as well (and most clients do).

If you have budget constraints, items in the recommendations list can be cherry-picked, and you can determine the order in which they should be addressed. Occasionally there will be issues found in the audit that can’t reasonably be fixed: they may be so big, or so entrenched in the site infrastructure, that it would take an enormous effort to fix them. In such rare cases, a complete site rebuild is warranted and recommended.

What’s Next?

By this point, the hard work has been done—your site has had performance and infrastructure issues fixed, Drupal core and modules are up to date, and the site is as secure as we can make it. We’re now more comfortable with your code; we have a better idea about how the site is built and how it works. Hopefully we’ve been able to make a real difference in the infrastructure, security, performance and usability of your site.

After this longer engagement, we also have more information about what it’s like to work with you: how you communicate, how pleasant or demanding you are, and how quickly you pay your bills. You also know what it’s like working with us: how responsive we are, how important you feel, how much value you’re receiving.

We should now be in a position to offer a TEN7Care Support Agreement, in which we support a site we’ve audited, improved and are familiar with. It gives us the opportunity to take care of a site that we didn’t build, but that we feel comfortable being wholly responsible for. In some cases, we may not offer a support agreement, and of course, you may not wish to pursue one either.

Step 3: TEN7Care

Since we are now comfortable with the inner workings of your site, we can estimate how many hours per year will be required for site monitoring and maintenance. This includes time for periodic site updates, security patches, uptime monitoring and regular backups and archiving.

The TEN7Care Support Agreement starts at five hours per month, billed on a monthly basis. The agreement typically covers:

  • Drupal site maintenance: maintain and update core and contributed Drupal modules
  • 24x7 site uptime monitoring and response
  • Regular backups and archiving: automated nightly backups, weekly and monthly snapshots backed up to two offsite servers
  • Monthly traffic analytics insights
  • Guaranteed availability of your business’ critical paths after updates
  • Installation and use of a versioning system to manage multiple site environments
  • Regular releases of code with automated release notes
  • Support availability during business hours via email, video conference or Slack

Here’s an example of a TEN7Care Support Agreement.

What’s Next?

Yes there is a next, even here! Six weeks before the end of a TEN7Care Support Agreement, we meet internally to review the last year of work: how many hours we billed and how much work your site required. We’ll discuss whether to adjust the hours for the coming year, and whether a rate change is due, amongst other things.

That’s How You Become a Support Client!

As a company, we strive to do our best for our clients, and we set a high bar for quality work. I believe our success is directly proportional to our clients’ satisfaction with our work. Moreover, as a team, we have to be happy doing that work. If either of these two things don’t happen, we aren’t going to do our best, and there won’t be satisfaction in what we’ve created.

I think our TEN7Audit → TEN7Improve → TEN7Care process helps us accomplish all these goals.

Would You Like to Work With Us?

Do you have a Drupal site that needs support? We can help! Fill out this form and we'll get back to you quickly!

Mar 14 2019
Mar 14

When loading or interacting with entities in Drupal 8, we often use the EntityTypeManagerInterface interface, which is the brains behind the entity_type.manager service that is provided in many of the Drupal core base classes.

This often appears in one of the following ways:

\Drupal::service('entity_type.manager')->getStorage('node');

$this->entityTypeManager->getStorage('node');

Either approach returns an instance of EntityStorageInterface. Each entity type can define a class that extends EntityStorageBase and adds additional custom methods that are applicable to a given entity type.
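For context, here is a minimal sketch of how $this->entityTypeManager typically gets populated, using standard constructor injection of the entity_type.manager service. The RecipeLister class name is illustrative, not part of Drupal core:

```php
<?php

use Drupal\Core\Entity\EntityTypeManagerInterface;

/**
 * Illustrative service class showing constructor injection.
 */
class RecipeLister {

  /**
   * The entity type manager.
   *
   * @var \Drupal\Core\Entity\EntityTypeManagerInterface
   */
  protected $entityTypeManager;

  /**
   * Constructs the class with the injected entity_type.manager service.
   */
  public function __construct(EntityTypeManagerInterface $entity_type_manager) {
    $this->entityTypeManager = $entity_type_manager;
  }

  /**
   * Loads a node via its storage handler.
   */
  public function loadNode($nid) {
    return $this->entityTypeManager->getStorage('node')->load($nid);
  }

}
```

If registered as a service, the class would also need an entry in a *.services.yml file with '@entity_type.manager' listed as an argument.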

The node entity type uses this pattern in \Drupal\node\NodeStorage to provide many of its commonly used methods such as revisionIds() and userRevisionIds().

The benefits of adding custom storage methods become more apparent when you begin to work with custom entities. For example, if you have a recipe entity type, you could have a loadAllChocolateRecipes() method that abstracts the query and conditions needed to load a subset of Recipe entities.

The resulting call would look like this:

/** @var \Drupal\recipe_module\Entity\Recipe[] $recipes */
$recipes = $this->entityTypeManager
  ->getStorage('recipe')
  ->loadAllChocolateRecipes();

A custom storage handler class is integrated with an entity via the annotated comments in the entity class.

\Drupal\recipe_module\Entity\RecipeEntity

/**
 * Define the Recipe entity.
 *
 * @ContentEntityType(
 *   id = "recipe",
 *   label = @Translation("Recipe"),
 *   handlers = {
 *     "storage" = "Drupal\recipe_module\RecipeStorage",
…

Then in the storage handler class, custom methods can be added and existing methods can be overridden as needed.

/**
 * Defines the storage handler class for Recipe entities.
 */
class RecipeStorage extends SqlContentEntityStorage {

  /**
   * Load all recipes that include chocolate.
   *
   * @return \Drupal\recipe_module\Entity\Recipe[]
   *   An array of recipe entities.
   */
  public function loadAllChocolateRecipes() {
    return $this->loadByProperties([
      'field_main_ingredient' => 'chocolate',
    ]);
  }

}

Manual SQL queries can also be performed using the already provided database connection in $this->database. Explore the Drupal\Core\Entity\Sql\SqlContentEntityStorage class to see the many properties and methods that you can override or leverage in your own methods.

Again, the NodeStorage and TermStorage offer many great examples and will demystify how many of the “magic” methods on these entities work behind the scenes.

For example, if you ever wondered how the Term::nodeCount() method works, this is where the magic happens.

\Drupal\taxonomy\TermStorage

/**
 * {@inheritdoc}
 */
public function nodeCount($vid) {
  $query = $this->database->select('taxonomy_index', 'ti');
  $query->addExpression('COUNT(DISTINCT ti.nid)');
  $query->leftJoin($this->getBaseTable(), 'td', 'ti.tid = td.tid');
  $query->condition('td.vid', $vid);
  $query->addTag('vocabulary_node_count');
  return $query->execute()->fetchField();
}

The next time you need to write a method that returns data specific to an entity type, explore the use of a storage handler. It beats stuffing query logic into a custom Symfony service where you are likely violating single responsibility principles with an overly broad class.

This potentially removes your dependency on a custom service, removing the need for extra dependency injection and circular service dependencies. It also adheres to a Drupal core design pattern, so it is a win, win, win, or something like that.

Mar 14 2019
Mar 14

Drupal comes with its own built in cron. This means that you can add your own job to the list of jobs that are executed when the Drupal cron runs.

Drupal runs all of these jobs at the same time. What happens if you want to run one particular cron job really frequently? You have to change your cron settings to run ALL of them that frequently. It’s an all or nothing affair and you end up running all the cron jobs as often as your most frequent job. That can put a lot of pressure on your web server.

A second problem is if you have a job that requires a lot of heavy lifting. With these jobs, you might want to run them overnight when traffic to your site is lighter and there is less pressure on the web server. But with Drupal’s built in cron, you can’t do that either because of the all or nothing nature of it. You run it once or not at all.

To solve this problem, you need a way to separate cron jobs so that you can run each job, or groups of jobs, at different times. There are two main options to achieve that: Elysia Cron and Ultimate Cron. In this article, we are going to look at Ultimate Cron because it is available for Drupal 7 and 8, whereas Elysia Cron is Drupal 7 only.

Download and install the module

In Drupal 8, the recommended way to add a new module to your project is to use composer. The following composer command will download Ultimate Cron and add it to the list of dependencies.

composer require drupal/ultimate_cron

You can then enable the module manually by going to the Extend admin menu item, or by using Drush.
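If you go the Drush route, enabling the module from the command line looks something like this (run against an installed Drupal site; the cache rebuild ensures the new cron jobs are registered):

```shell
# Enable the Ultimate Cron module.
drush en ultimate_cron -y

# Rebuild caches so the module's cron jobs are picked up.
drush cr
```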

Create a new job

You can create a new job by implementing hook_cron() in a custom module. The following example implements hook_cron() in a module called custom_utility. When this job is run, it will log a message to Drupal’s logs.

function custom_utility_cron() {
  \Drupal::logger('custom_utility')->notice('Cron ran');
}

You can change the code in the function to execute what you need. The call to log a message is purely to demonstrate that this cron job is actually running.

Discover the job

Head over to Configuration -> Cron in the admin menu (/admin/config/system/cron/jobs). You should see a list of jobs from various core modules.

Hit the Discover jobs button and the job you created above will appear on the list.

Ultimate cron jobs

To test that this works, click Run. Then head over to recent log messages (admin/reports/dblog) and you should see the message.

Message in log from ultimate cron

To edit the frequency of the job, click on the arrow to the right of the Run button and click Edit. In the Scheduler tab, you can change the frequency using the Run cron every X minutes drop-down.

Export the configuration

At this point, you can export the configuration using Drupal’s standard configuration export. You can then import the configuration if you use a staging and production site. If you are unsure how to export and import configuration, check out the configuration management documentation on Drupal.org.

Adding more than one job for a module

If your custom module only needs to support one cron job, then you have enough to do that. You can add additional code to your implementation of hook_cron() (see custom_utility_cron() above) and that will run when your cron job runs at the scheduled time.

But what if you want to have more than one cron job running at different times all from this one custom module?

You can do this by giving each cron job its own callback function. The callback function is used instead of hook_cron().

The settings for each Ultimate Cron job are stored in its own configuration file. Drupal uses YAML for configuration files, and you’ll need to create these files. The following steps outline how to do this.

Steps to add a cron job

  1. In the admin menu, go to Configuration -> Configuration synchronisation (/admin/config/development/configuration)
  2. Click on the Export tab and then the Single item sub-tab (/admin/config/development/configuration/single/export)
  3. Change Configuration type to Cron job
  4. Change Configuration name to the Default cron handler entry for your custom module. In my case, this is Default cron handler (custom_utility_cron).
  5. Copy the configuration code to your clipboard

In your custom module:

  1. Create a config directory in the root of your custom module’s directory
  2. Create a new file: ultimate_cron.job.jobname.yml (change jobname to the name of your job)
  3. Paste the configuration code you copied earlier

Here is an example of what this code will look like:

uuid: fa40bf2b-f544-4e22-b4b9-dda9cc0efbfe
langcode: en
status: true
dependencies:
  module:
    - custom_utility
title: 'Default cron handler'
id: custom_utility_cron
weight: 0
module: custom_utility
callback: custom_utility_cron
scheduler:
  id: simple
  configuration:
    rules:
      - '*/[email protected] * * * *'
launcher:
  id: serial
  configuration:
    timeouts:
      lock_timeout: 3600
    launcher:
      thread: 0
logger:
  id: database
  configuration:
    method: '3'
    expire: 1209600
    retain: 1000

There are a couple of adjustments to make to this file:

  • Remove the uuid line (the first line)
  • Change the id to something unique e.g. id: custom_utility_cron_job1
  • Change the title to something meaningful so that you can distinguish this job in the cron admin page

This file is creating a new job for Drupal to run. Let’s break it down:

  • callback: custom_utility_cron - this is the callback function that will be run when this job runs.
  • module: - this is your custom module. This is also the module where you should add your callback function
  • scheduler: - this is the schedule on which this job will run, which for this one is every 15 minutes

The next thing to do is to add the callback to the .module file in your module.

Here is an example:

/**
 * The callback for the cron job.
 */
function custom_utility_callback() {
  \Drupal::logger('custom_utility')->notice('Cron ran');
}

This callback function will be run every time this job is executed.

And then go back to the configuration file from above and change the callback function to point to this function. In my example, this means changing this line:

callback: custom_utility_cron

To:

callback: custom_utility_callback

Repeat

You can then repeat these steps for any additional cron jobs that you need, with a new configuration YAML file and callback function for each.
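As a sketch, a second job's configuration file only needs its own unique values alongside the same structure. The file name, id, title, schedule and callback below are all invented examples, not values from the original job:

```yaml
# ultimate_cron.job.custom_utility_cron_job2.yml (hypothetical second job)
langcode: en
status: true
dependencies:
  module:
    - custom_utility
title: 'Nightly cleanup'
id: custom_utility_cron_job2
weight: 0
module: custom_utility
callback: custom_utility_nightly_cleanup
scheduler:
  id: simple
  configuration:
    rules:
      - '0 2 * * *'
```

The matching callback function, custom_utility_nightly_cleanup(), would then live in the same .module file alongside the first one.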

Import the new config

In order for this to work, you need to import the configuration that you set up above. This can be done with the following command (change custom_utility to the name of your module):

drush config-import --source=modules/custom/custom_utility/config --partial -y

Wrapping up

Separating cron jobs out is often necessary to give you the control you need to run jobs at different times. Ultimate Cron is a wonderful module that allows you to do just that.

Mar 14 2019
Mar 14

A lot of people are very interested in the “DevOps” concept, and when I sat down with some of them, the conversation went down many interesting paths.

They started talking about deployment best practices, rollbacks, hot deployment etc. 



But, when there were some mentions about “Blue-Green Deployment” - complete silence. 

That gave me the idea of telling the rest of the world that, with all the microservices, cloud-native and whatnot technology, blue-green deployment is not a silver bullet, but it is a genuinely useful element.

How?

Well, you got to read ahead. 

What do we understand by blue-green deployment?

A blue-green deployment is a management approach for releasing software code. 

In a blue-green deployment, also known as an A/B deployment, two identical hardware environments are configured in exactly the same way.

Only one of the environments is live at any given time, and the live environment serves all the production traffic. For example, if blue is currently live, then green is idle, and vice versa.

Blue-green deployments are usually used for consumer-facing applications and applications with critical uptime requirements. The new code is delivered to the inactive environment, where it is thoroughly tested.

How does it reduce risk?

Achieving automation and continuous delivery at any level of production is a holy grail, and avoiding downtime and risk is high up on the list of priorities. Blue-green deployment provides a simple way of achieving these goals by eliminating the risks commonly witnessed during deployment.

  • You will never encounter surprise errors

When you fill in a form online, what details do you enter? Your name, phone number, street address and probably your bank details if you are making an online purchase. Right?

You press the “pay now” button, but unfortunately your order doesn't get processed. If you are lucky, you get an error message along the lines of “application is offline for maintenance”, and all your effort and time go in vain. With blue-green deployment, you never have to worry about this maintenance screen.

One click shows the existing list of items, and on the very next click the user sees the new menu you added. This keeps furious emails about error screens from flooding your inbox.

  • Testing the production environment 

Ensuring that your pre-production environment is as close to your production environment as possible is not just important but essential. Blue-green deployment makes this easy to achieve: the user can test the application while it is disconnected from the main traffic, and the team can even load test it if they wish.

  • Makes sure that the traffic is seamless 

Customer needs are more global than ever, and there is no longer a universally good time to do a deployment, especially if you work in an enterprise where the business needs to be running around the clock. If you have a customer-facing application, there is a chance users will switch to some other website if they don't find what they want. This means a decrease in sales and business.

Blue-green deployment ensures that your traffic never stops: customers can place their orders just fine, and employees overseas continue to do their jobs without interruption, saving the company money.

  • Easy Recovery 

There will be times when bugs slip into production. You can spend a lot of money trying to prevent them, but inevitably some will get through and need to be recovered from. With blue-green deployment, the older, more stable version of the application can come back online at a moment's notice, avoiding the pain of rolling back a deployment.

[Image: a router directing traffic between environments. Source: Martin Fowler]

How does this process work?

As we know, the blue-green deployment technique involves running two identically configured production environments. Let us assume that the current release, 2.3, is deployed in the green environment. The next deployment, release 2.4, would take place in the blue environment.

The blue environment would then be tested and evaluated until it is confirmed to be stable and responsive. Once it is ready, traffic is redirected to it, and it becomes the new production environment that users are routed to.

The entire design exists to provide fast rollbacks in case a deployment fails or does not pass QA. When a deployment fails or critical bugs are identified, a rollback to the green environment is initiated. Once the bugs are fixed, the version is re-deployed to the blue environment and traffic is rerouted back the moment it is stable.

When deploying the next version, i.e. version 2.5, the deployment would switch to the green environment, where it would be extensively tested and evaluated. Traffic would be rerouted to the green environment once it passes the quality assessment.

This way, the green and blue environments regularly alternate between serving the live version and staging the next one.
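The cut-over itself can be as simple as repointing a router, load balancer or symlink at the idle environment. Here is a minimal shell sketch of that idea, using a "live" symlink as the router; the paths and version numbers are illustrative assumptions, not taken from the post:

```shell
#!/bin/sh
# Simulate a blue-green cut-over with a "live" symlink acting as the router.
set -e
cd "$(mktemp -d)"
mkdir -p blue green
ln -sfn blue live              # blue currently serves production (release 2.3)

echo "2.4" > green/VERSION     # deploy and verify the new release in the idle env

ln -sfn green live             # switch all traffic in one atomic step
echo "live environment: $(readlink live)"

ln -sfn blue live              # rollback is the same operation in reverse
echo "after rollback: $(readlink live)"
```

A real setup would point the web server's docroot or upstream at the "live" link and reload it after the switch, but the atomic repoint-and-rollback pattern is the same.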

[Image: successive releases alternating between blue and green environments. Source: Medium]

Blue-Green Deployment helping your Drupal websites

Let us imagine that you built a website with Drupal and it is now getting high traffic. Normally, for developing, updating and testing a website (without risking the integrity of the live site), you follow these steps:

Development: The development process starts with developers working on new features, bug fixes, theming and configuration in the local environment. It makes it possible to easily roll back to the previous stage of development.
 
Testing: Typically this environment is not available for client viewing; it is intended for testing development work against a lateral host. 

Staging: This stage is used for presenting the changes to the client for approval. QA (quality assurance) and UAT (user acceptance testing) are most often carried out on the staging stage. 

Production: This is the live site on the web, available to visitors. It contains new features that have been proven safe to go live. 

As you can see, this process can be long and time-consuming, and maintaining and building the site can be frustrating; blue-green deployment rescues you at times like these. 

It provides near-zero downtime and easy rollback capabilities. The fundamental idea behind blue/green deployment is to shift traffic between two identical environments that are running different versions of the application. 

[Image: requests being routed to the blue or the green environment. Source: NewGenapps]

Some of the implementations for Your Drupal Website 

Blue-Green Deployment for Drupal websites with Docker 

Drupal deployments are hard. The user has to make sure that the code is deployed, Composer dependencies are pulled, schema updates are performed and all the caches are cleared. 

All while keeping the website up and responsive to users. But what if anything goes wrong and you wish to roll back? Do you stop the deployment? Well, no: blue-green deployment is the answer. 

Docker makes it easy to build, ship and run applications. On the EC2 instance, there are always two running Docker containers, “blue” and “green”, and nginx works as a reverse proxy on the same instance. The user can build a Drupal site that runs in parallel in the “blue” and “green” environments and serve both from the same MySQL database, with Apache, PHP, and Drupal installed in baseimage-docker.
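As an illustration of that reverse-proxy role (the ports and upstream name below are assumptions, not taken from the setup above), the nginx configuration might simply point its upstream at whichever container is live:

```nginx
# Proxy all traffic to the live container; to cut over, change the
# upstream to the other container's port and reload nginx.
upstream live_drupal {
    server 127.0.0.1:8081;   # blue container (green would be 127.0.0.1:8082)
}

server {
    listen 80;
    location / {
        proxy_pass http://live_drupal;
        proxy_set_header Host $host;
    }
}
```

Switching the upstream and running `nginx -s reload` performs the cut-over without dropping in-flight connections.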

[Image: nginx proxying to blue and green containers backed by a MySQL database. Source: Nulab]

Drupal with Blue-Green Deployment in AWS Beanstalk 

With the help of ECS, the user can create task definitions, which are very similar to a docker-compose.yml file. 

A task definition is a collection of container definitions, each of which has a name and the Docker image to run, with the option to override the image's entry point and command. The container definition is also where the user can define environment variables, port mappings, volumes to mount, memory and CPU allocation, and whether or not the specific container should be considered essential, which is how ECS knows whether the task is healthy or needs to be restarted.
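A minimal ECS task definition for one of the environments might look like the following sketch; the family name, image, port mapping and environment variable are invented for illustration:

```json
{
  "family": "drupal-blue",
  "containerDefinitions": [
    {
      "name": "drupal",
      "image": "drupal:8",
      "essential": true,
      "memory": 512,
      "portMappings": [{ "containerPort": 80, "hostPort": 8081 }],
      "environment": [{ "name": "MYSQL_HOST", "value": "db.example.internal" }]
    }
  ]
}
```

A parallel "drupal-green" task definition on a different host port would give the two environments to switch between.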

The Amazon Web Services solution allows the user to quickly and easily manage the deployment and scalability of web platforms. It helps in configuring a high-availability environment that seamlessly runs a Drupal website. Running a DB instance that is external to Elastic Beanstalk decouples the database from the lifecycle of the environment, and lets the user connect to the same database from multiple environments, swap out one database for another, and perform a blue-green deployment without affecting the database.

The image below shows how blue-green deployment works in an AWS environment. 

[Image: two AWS environments, each with a security group connected to an EC2 security group. Source: CloudNative]

Some of the best practices for smooth release 

Now that we understand how blue-green deployment works, let’s cover some of the best practices that are related to it:

Load Balancing

Load balancing lets you switch to a new server automatically, without depending on the DNS mechanism. The DNS record always points to the load balancer, and the user only modifies the servers behind it. This way you can be absolutely sure that all traffic reaches the new production environment instead of the old one.

Rolling Update

To avoid downtime, the user can execute a rolling update: rather than switching from all blue servers to all green servers in a single cut-off, the user works with a mixed environment, replacing servers gradually.

Monitoring the environment 

Monitoring the production as well as the non-production environment is important. Since the same environment can act as both production and non-production, all you need is to toggle the alerting between the two states. 

Automate

The user can script as many actions as possible in the switch process instead of performing a manual set of actions. This brings huge benefits: the process becomes quicker, easier and safer, and it enables self-service.

Deployment in cloud

If your servers run in the cloud, there is an interesting variation of the Blue-Green method in which instead of going back and forth between two static environments, you can just create the next environment from scratch.

This process is also valuable for avoiding the danger of servers becoming snowflakes, which are servers that have a unique configuration set that isn’t documented anywhere. Once these snowflakes get erased for some reason, you have no easy way to properly recreate them. Whatever the choice, it is important to keep up with the newest test and release technology to ensure that releases are smooth.

Conclusion 

Deployments are one of the most important parts of the software development lifecycle, so all the activities involved should be thoroughly researched and tested to ensure that they are a perfect fit for your system architecture and business. 

At OpenSense Labs, we have a pool of Drupal developers and experts that work on technologies that use these tools and services. Contact us now at [email protected], our experts would guide you with the queries and questions that are related to this topic. 

Mar 14 2019
Mar 14

About a year ago, I only just learned about the principles of IndieWeb, which in a way is a bit of a shame. Fast forward to now, and I'm proud to announce the first stable release for Drupal 8. Together with this milestone, I also pushed a new version of Indigenous so that both are aligned feature wise.

It's been a great journey so far, and while there's still a lot to do for both projects, the stability and feature set warrants a stable tag. It has changed the way I interact with (social) media day to day now since the last half year, both in reading and posting, being in full control of every aspect. It's great, everyone should try it!

What's next?

I've been thinking over the last few weeks about raising funding, but after much consideration, I'm not going forward on that path. Even though my public GitHub profile lists over 1300 contributions the last year (about 3.5 per day), which somehow is simply crazy, I still have more than enough spirit and motivation to keep on going. Just a little slower from now on, since many features for both projects are not mission critical - even though they are awesome. Of course, I won't mind if someone would suddenly feel the urge to sponsor me.

Slowing down now, you think, that can't be true? Right. As already announced a few weeks ago, the next focus will be writing an ActivityPub module for Drupal so you can communicate with your site on the Fediverse. I'm currently using Bridgy Fed for this, but, in the IndieWeb spirit, it's time to bring this home!

But first, time to make sure I don't mess up my tryouts of the Moonlight sonata. No commits until after March 31st - I promise :)

Say hello to the first stable release of the #IndieWeb module for #Drupal 8! https://realize.be/blog/first-stable-release-indieweb-module-drupal-8

Mar 14 2019
Mar 14

We’re back with an overview of the top Drupal blog posts from last month. Have a read and get yourself up to speed on the most recent goings-on within the Drupal community!

The 15 Things Your AEM Team Says Drupal Can't Do, But Can

The first post that caught our attention was Third & Grove’s list of 15 misconceptions about Drupal when compared with Adobe Experience Manager (AEM). With this blog post, the team at Third & Grove want to shed some light on the real differences between the two content management solutions and help people make a more informed decision.

A lot of the assumptions about Drupal’s shortcomings with regards to AEM are outdated and hence more up-to-date information was needed for an honest comparison. With the recent developments in Drupal, such as the Layout API and the new admin UI that’s on the horizon, Drupal now offers a much better experience for developers and content editors alike.

Read more

A Security Checklist for Drupal 8 Sites with Private Data

Even though Drupal has the reputation of an extremely secure CMS out-of-the-box (hence also its widespread adoption in government sites), some websites built in Drupal need some additional security precautions. This is especially true for sites that contain sensitive private information. 

With this security checklist provided by Lullabot’s Matthew Tift, you’ll always have a point of reference to check whether each security measure has been adequately followed in every step of the project. 

Read more

Announcing the New Lullabot.com

Next on our list, we have another blog post by Lullabot, this one being an announcement of the new look of their website, Lullabot.com, written by Mike Herchel. The previous version of the site was one of the first decoupled Drupal sites built with ReactJS, but a decoupled architecture was deemed too complex a solution for Lullabot's current needs, and so they decided to replatform the site. 

The new and improved Lullabot.com is thus a return to a more traditional Drupal architecture. This makes it easier for developers to join the project while offering a better experience for content editors through the Layout Builder module.

Read more

Find out more in a podcast by Lullabot

Testing your Drupal code base for deprecated code usage with PHPStan

The starting point for the next post on our list was this blog post by the same author, Matt Glaman, about writing cleaner code with phpstan-drupal, a Drupal extension for PHPStan. The discussion in the comments section was what spurred this second blog post by Matt, which details how to test your Drupal code base for deprecations with the help of this extension. 

The goal of discovering deprecations is not just optimizing code, but also ensuring the compatibility with Drupal’s dependencies, namely Symfony and PHPUnit. This is one of the key responsibilities of the Drupal 9 group led by Gábor Hojtsy and Lee Rowlands; a tool that automates the tracking of deprecated code is thus exactly what they’ve needed.

Read more 

Drupal Pitch Deck at 60+ case studies

The following post is a sort of continuation, or rather, an update to Paul Johnson’s call for case studies. In this first post, he provided more information on the Promote Drupal initiative, the Pitch Deck project in particular, complete with examples of case study slides, and called on the community to contribute to the project with our own case studies.

This follow-up post details the progress of the Pitch Deck project: how many case study slides were submitted up until that point and what the next steps are. Even though this is an ongoing project, it’s never too late to get involved - anyone wishing to do so can and should contact Paul Johnson.

Read more

Optimizing site performance by "lazy loading" images

Next up, we have a post by Dries on how to greatly optimize site performance with the use of “lazy loading” images. Since all the images on a page are usually loaded simultaneously, this can be very detrimental to the site’s performance. A small tweak such as opting for lazy loading images can greatly reduce the time needed for the page to render.

How this works is by generating lightweight placeholder images which are as small as possible and devoid of any unnecessary headers and/or comments. These placeholder images are then embedded directly into the HTML and replaced with real images when they become visible to the user scrolling on the page.

Read more

Related blog post by Dries

Headless CMS: REST vs JSON:API vs GraphQL

Another post that we wanted to highlight was again written by Dries; this one is a comparison of different headless architectures. It is a very comprehensive post in which he compares three web services implementations - REST, JSON:API and GraphQL. The first part is a more general, CMS-agnostic comparison, while the second focuses on Drupal-specific implementation details.

The three different headless options are compared by the qualities that are most relevant for developers. These are: request efficiency, API exploration and schema documentation, operational simplicity, and writing data. According to the analysis in this blog post, the most viable headless solution for Drupal 8 core is JSON:API. As such, JSON:API is planned to be included in Drupal 8.7.

Read more

My 2019 Aaron Winborn Award Nomination

Finally, we have a post taken from Adam Bergstein’s aka n3rdstein’s blog. In this post, Adam reveals his two nominations for the Aaron Winborn Award - Nikhil Deshpande and Kendra Skeene, two instrumental members of Digital Services in the State of Georgia.

Nikhil and Kendra are both avid advocates for open source and Drupal in particular. They were the driving force behind Ask GeorgiaGov, which has a major two-fold benefit: the needs of Georgia’s citizens are better served, while Drupal benefits from innovation in the form of an integration with the Alexa conversational interface. As such, they are truly outstanding members of the community and more than deserve the nomination for such an award.

Read more

These were the Drupal-related blog posts from last month that intrigued our team the most. If you’ve read any you found particularly interesting that we’ve missed, let us know and we’ll be happy to check it out. We’ll be back next month with another overview of the most interesting Drupal content - stay tuned!
 

Mar 14 2019
Mar 14

by Elliot Christenson on March 13, 2019 - 11:56pm

We're continuing our popular "Support Wizard Help Desk" at Drupalcon 2019! Book some time with myDropWizard for some FREE help with your Drupal site!

You're a first time Drupalcon attendee? You're a veteran Drupaler? Either way, you made part of your Drupalcon mission to fix a lingering issue - or at least to be pointed in the right direction!

We're here to help!

We spend our days helping Drupalers just like you with their support needs, so we thought "Let's bring that myDropWizard Support Face-to-Face with Drupalers: FOR FREE!"

So, drop by Booth #811 or (better yet!) schedule with us below!

Where we'll be and when

  • Our booth: We again sponsored DrupalCon this year and will have a booth in the exhibit hall! We're in booth #811!
  • Everywhere! Just like you we want to get around the convention to see everyone and everything. Stop us and say "hello!"

Schedule a one-on-one meeting

Again, we'll be happy to discuss your current challenges (or successes!) anywhere at Drupalcon, but if you want to be double extra sure that you'll be able to chat with us, schedule a one-on-one meeting with us!

We like to keep things simple, so just drop us an email to schedule a meeting with either Elliot or David:

[email protected].

Can't wait to see you there!

We're super friendly, non-imposing people who love Drupal and the Drupal community. :-) We all look forward to hearing how your organization is using Drupal and how we can help! Have a great week in Seattle!

Mar 14 2019
Mar 14

Are you concerned about web accessibility issues that might be hidden within your pages?

We recently gathered input from the Promet accessibility team concerning digital accessibility issues that are most often in need of remediation, and we came up with a Top 12 List of web accessibility mistakes and oversights. They pertain to:

1.  Alt text
2.  Color contrast
3.  Forms
4.  Headings
5.  iFrames
6.  Keyboard accessibility
7.  Landmark roles
8.  Links
9.  Lists
10. Semantic markup
11. Tables
12. File attachments

1. Alt Text

An image might be worth a thousand words, but if someone can’t see the image, then what? So, what’s up with the images on your site? Make sure that they are not missing: 

  • The alt attribute and descriptive text,
  • Enough description in the alt attribute, or 
  • A null alt attribute (alt="") indicating that the image is decorative and thus has no meaning.

If the image is a chart, alternative text that briefly describes it might not be enough. Complicated charts and graphs will require extra effort. The longdesc attribute can be used if the narrative of your content doesn’t already include the information communicated by the image. 
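For example (the image paths and wording below are invented for illustration), the difference between the three cases above can be as small as one attribute:

```html
<!-- Inaccessible: no alt attribute at all -->
<img src="/images/q4-sales-chart.png">

<!-- Accessible: descriptive alt text -->
<img src="/images/q4-sales-chart.png" alt="Bar chart of quarterly sales, peaking in Q4">

<!-- Accessible: null alt attribute for a purely decorative image -->
<img src="/images/divider-flourish.png" alt="">
```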

2. Color Contrast

This one can be problematic if you’re heavily invested in your branding. If a color contrast checker reveals that your branding colors don’t create sufficient contrast, there are minor fixes you can employ to achieve accessibility. The two issues we see most often are insufficient color contrast:

  • Between text and background colors and 
  • Between text and UI components (i.e. button) or background image(s).

This type of problem can be headed off at the pass with a well-planned design and teaching your content authors how to incorporate accessibility into the creation of their colored charts and graphs.
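As a concrete illustration (the colors are invented for this example), light gray text on white fails the WCAG AA minimum of 4.5:1 for normal-size text, while a darker gray passes comfortably:

```html
<!-- Inaccessible: #999999 on white is roughly 2.8:1 contrast -->
<p style="color: #999999; background-color: #ffffff;">Quarterly results</p>

<!-- Accessible: #595959 on white is roughly 7:1 contrast -->
<p style="color: #595959; background-color: #ffffff;">Quarterly results</p>
```

A color contrast checker will give you the exact ratio for your own brand palette.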
 

3. Forms

Forms perform different duties. You have the search box on every page. That’s a form. That “Sign up for our Newsletter” on your homepage is a form. And, let’s not forget “Contact Us.” That’s three forms and we haven’t gotten to the comment forms, e-commerce forms, or event sign up forms. So, you can see, accessible forms require considerable attention.  

Now, add in the long list of issues we see to understand why your forms might be your most vulnerable objects on the page.

  • Form fields found with missing labels.
  • Form labels found with "for" attribute not matching another element’s ID.
  • Select element’s option doesn't have value available.
  • Select element doesn't have initially-selected option.
  • If this selection list contains groups of related options, they should be grouped with optgroup.
  • Checkboxes/Radio Buttons should be contained within a fieldset.
  • Fieldsets must contain a legend element.
  • Form is missing a required submit button.
  • Button elements missing value and/or content.

These are fixable offenses. For example, if your button elements are missing value and/or content, this is what that means.

Inaccessible
<input type="submit" />

Accessible
<input type="submit" value="Submit Form" />

As you can see, the fix is simple. However, the tricky part can be gaining access to the site in such a way that the solution can be applied. If you’re using a third-party plug-in or a content management system, you might not have access to the code that generates your forms. 

4. Headings

Headings in HTML are defined via the <H1>, <H2>, <H3>, etc. elements. 
So often content is chunked in an article by using sub-headers styled with <strong> versus <H2>, for instance. This is not considered accessible. 

When Promet Source audits website pages, we often find:

  • Heading tags without content, and 
  • Heading structure that is not logically nested.

What does logically nested mean? Let’s take a look.

Inaccessible 
<h1>About Us</h1>
<h4>Our History</h4>
<h6>Our Future</h6>

Accessible
<h1>About Us</h1>
<h2>Our History</h2>
<h2>Our Future</h2>

This example makes it look like the content is the problem and it might be. However, this kind of issue can creep up in the presentation of page objects that reside outside the main content. 

5. iFrames

iFrames are used to display content on your web page from an external source. In this scenario, you need to think about two things: the iFrame on your page and the external content source, or third-party.

Start with ensuring your iFrame has a title, as shown in the sample code below. Then, ensure that the external content is accessible, which might not be easy if you don’t have an agreement with the source to provide accessible content. Remember, just because it came from someone else, that doesn't mean you aren't responsible.

Inaccessible
<iframe src="/images/maine-beach-home.png" width="..." height="..." />

Accessible
<iframe title="Maine beach home" src="/images/maine-beach-home.png" width="..." height="..." />

6. Keyboard Accessibility

You scroll the browser window to see what’s hidden below. You place your cursor on a link or in a form field using your touchpad or mouse. Swiping and screen taps have become actions we take for granted on mobile devices. What would you do if you couldn't click or tap?

The most common keyboard accessibility issues we see are:

  • Elements that are not keyboard accessible;
  • Elements that are not visible when they get focus (e.g., show a dotted border); and 
  • Forms that cannot be navigated using the keyboard and other accessibility tools (i.e. accordions).

Let’s consider the first issue. Helping a user navigate to non-link and non-form elements is often overlooked. If your page is divided into sections, allowing the user to tab through the sections can be helpful.

Inaccessible
<div>Complementary content region</div>

Accessible
<div tabindex="0">Complementary content region</div>

7. Landmark Roles

So far we’ve talked about aspects of web pages that you are likely aware of. This next topic has to do with the W3C’s Accessible Rich Internet Applications Suite (ARIA), and the roles and attributes that you can assign to your HTML elements. 

The W3C says, “With WAI-ARIA, developers can make advanced Web applications accessible and usable to people with disabilities.” However, it can also create issues if you apply ARIA’s roles and attributes incorrectly. For instance:

  • Elements missing required landmark roles, and 
  • Landmark role "presentation" is applied improperly on an element because its child elements contain semantic meaning.

Let’s consider the application of roles and semantic meaning.

Inaccessible
<div class="promet-logo" role="presentation">
    <img src="/images/promet-logo.png" />
</div>

Accessible
<div class="promet-logo">
    <img src="/images/promet-logo.png" />
</div>

Sometimes a role assignment makes things worse, not better.

8. Links

When it comes to accessible links, you have two perspectives to consider: code and content. Before we look at the coding issues for accessible hyperlinks, let’s consider content. 

Links such as “Click here” and “Read more” appear often on the internet, but they’re not accessible because they lack purpose. 

Imagine listening to assistive technology reading out the links on a page: “Read more. Read more. Read more.” Read more what? In order for the link to be purposeful, it needs to indicate what you would be reading more about. For example: “Read More about Weather Patterns.”

Regarding coding issues, we see five coding issues on a regular basis.

  • Anchor elements found with valid href attribute but no link content.
  • Anchor elements found with missing href attributes.
  • Broken links to 404 (page not found) pages (e.g., a link to google.co versus google.com).
  • Back to top anchor link doesn't exist.
  • “Skip to main content” link is missing.

The “skip to main content” link satisfies Success Criterion 2.4.1 (Bypass Blocks), enabling a user to skip past the menu and any other blocks of content that stand between them and the main content. Below is some code that illustrates the problem.

Inaccessible
<body>
     <header></header>
     <main id="main-content"></main>
     <a id="skip-link">Main Content</a>
</body>

Accessible
<body>
    <a href="#main-content" id="skip-link">Skip to Main Content</a>
    <header></header>
    <main id="main-content"></main>
</body>
 

9. Lists

If you are familiar with HTML, you might find it hard to believe that bulleted lists are, at times, not created using the <ul> and <li> elements. Just because a list visually appears as such doesn’t mean assistive technology will read it that way.

So, remember that:

  • List elements should be marked up as a navigation list, and 
  • Ordered/unordered/definition lists should include list items.

What do we mean by “should include list items?” See the example below.

Inaccessible
<ol>
     <div>Link 1</div>
     <div>Link 2</div>
     <div>Link 3</div>
</ol>

Accessible
<ol>
     <li>Link 1</li>
     <li>Link 2</li>
     <li>Link 3</li>
</ol>
 

10. Semantic Markup

Semantic markup has to do with meaning. If something is a paragraph, use <p>, not <span> or <div>, for instance. Other examples include <form>, <table>, and <article>. These HTML elements are descriptive and carry meaning. Semantic HTML matters.

We often see:

  • Duplicate ID attribute values found on pages, and
  • Non-semantic markup used for emphasis or special text (e.g., don’t use <font>).

Let’s take a look at the ID attribute that can be assigned to an HTML element. In this example, a web page has three distinct forms that all share the same ID.

Inaccessible
<form id="search">
<form id="search">
<form id="search">

Accessible
<form id="search-header">
<form id="search-content">
<form id="search-footer">

How easily such fixes can be applied depends on how your website was created and whether you have access to the code.
 

11. Tables

Tables are not as responsive as you will need them to be. So, if you don’t need to use a table, don’t. If you need to use a table, note that the following accessibility issues are often found.

  • Table is missing caption elements.
  • Table headers are missing the <th> element.
  • The relationship between <td> elements and their associated <th> elements is not defined.

The last example is easy for content authors to overlook as HTML editor buttons do not insert ID and header attributes. Notice how the data cell references the table header cell that is applicable.

Inaccessible
<table>
    <thead>
        <tr>
            <th>One</th>
            <th>Two</th>
            <th>Three</th>
        </tr>
    </thead>
    <tbody>
        <tr>
            <td>Column 1 content</td>
            <td>Column 2 content</td>
            <td>Column 3 content</td>
        </tr>
    </tbody>
</table>

Accessible
<table>
    <thead>
        <tr>
            <th id="col1" scope="col">One</th>
            <th id="col2" scope="col">Two</th>
            <th id="col3" scope="col">Three</th>
        </tr>
    </thead>
    <tbody>
        <tr>
            <td headers="col1">Column 1 content</td>
            <td headers="col2">Column 2 content</td>
            <td headers="col3">Column 3 content</td>
        </tr>
    </tbody>
</table>

12. File Attachments

Accessibility is not just about the HTML page. PDFs are commonly uploaded to a page for download. They, and other files such as Microsoft Word and PowerPoint documents, are also required to be accessible.

PDF accessibility requirements that we see most often pertain to the following issues:

  • Untagged content,
  • Missing alternative text for images, and
  • Incorrect reading order.

The process to fix non-web files varies, depending on the source document. 

Conclusion

As the above examples demonstrate, accessibility compliance calls for close attention to a wide range of details.

One of the most challenging aspects of making a site accessible tends to be the technology used to create it. If your site was created from individual HTML pages, you can edit the markup and fix many, if not all, of your issues. If not, a site rebuild might be in order.

Before you build a new site, make sure that the content management system is designed to produce accessible pages and forms. It’s also essential that you audit your current pages to identify accessibility issues before they are migrated into the new site. 

Promet Source offers a path to achieving accessibility compliance for your websites, web applications and technical products. With flexible testing and remediation options, we can partner with you to ensure that you are adhering to WCAG 2.1 guidelines.  

Contact us today for a conversation on the level of training or support that best fits your needs.

Mar 14 2019
Mar 14

March 14, 2019

Often, during local Drupal development (or if we’re really unlucky, in production), we get the dreaded message, “Unable to send e-mail. Contact the site administrator if the problem persists.”

This can make it hard to debug anything email-related during local development.

Enter Mailhog

Mailhog is a dummy SMTP server with a browser GUI, which means you can view all outgoing messages in a Gmail-type interface.

It is a major pain to install, but we can automate the entire process with the magic of Docker.

Let’s see how it works, and discuss after. Follow along by installing Docker Desktop – no other dependencies are required – and installing a Drupal 8 starterkit:

git clone https://github.com/dcycle/starterkit-drupal8site.git
cd starterkit-drupal8site
./scripts/deploy.sh

This will install the following Docker containers: a MySQL server with a starter database, a configured Drupal site, and Mailhog. You will see something like this at the end of the output:

If all went well you can now access your site at:

=> Drupal: http://0.0.0.0:32791/user/reset/...
=> Dummy email client: http://0.0.0.0:32790

You might be seeing different port numbers instead of 32791 and 32790, so use your own instead of the example ports.

Now, the magic

(In my example, DRUPAL_PORT is 32791 and MAILHOG_PORT is 32790. In your case it will probably be different.)

As you can see, all emails produced by Drupal are now visible on a cool GUI!

So how does it work?

A dedicated “Mailhog” Docker container, based on the Mailhog Docker image, is defined in our docker-compose.yml file. It exposes port 8025 for public GUI access, which is mapped to a random unused port on the host computer (in the above example, 32790). Port 1025 is Mailhog’s SMTP port, as you can see in the Mailhog Dockerfile. We are not mapping port 1025 to a random port on the host computer because it’s only needed in the Drupal container, not on the host machine.

In the same docker-compose.yml, the “drupal” container (service) defines a link to the “mail” service; this means that when you are inside the Drupal container, you can reach the Mailhog SMTP server at host “mail” on port 1025.
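A minimal sketch of what those two services might look like in docker-compose.yml (service names and details here are assumptions; check the Starterkit’s actual file for the real definitions):

```yaml
# Hedged sketch, not the Starterkit's actual file.
version: '3'
services:
  drupal:
    build: .
    ports:
      - "80"          # mapped to a random unused host port
    links:
      - mail          # lets Drupal reach the SMTP server at host "mail"
  mail:
    image: mailhog/mailhog
    ports:
      - "8025"        # web GUI, mapped to a random unused host port
    # port 1025 (SMTP) stays internal to the Docker network
```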

In the Starterkit’s Dockerfile, we download the SMTP module, and in our configuration, we install SMTP (0, in this case, is the module’s weight; it doesn’t mean “disabled”!).

Next, configuration: because this is for local development, we are leaving SMTP off in the exported configuration; in production we don’t want SMTP to link to Mailhog. Then, in our overridden settings, we enable SMTP and set the server to “mail” and the port to 1025.
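The override described above might look something like this in the local settings file. The config keys belong to the Drupal SMTP module; treat this as a sketch and verify the key names against your installed version:

```php
// settings.local.php -- local development only, never exported.
$config['smtp.settings']['smtp_on']   = TRUE;
$config['smtp.settings']['smtp_host'] = 'mail'; // the linked Mailhog container
$config['smtp.settings']['smtp_port'] = '1025'; // Mailhog's SMTP port
```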

Now, you can debug sent emails in a very realistic way!

You can remove the starterkit environment by running:

docker-compose down -v


Mar 13 2019
Mar 13

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

Three stars will align and the Open Web will win.

Today, the world wide web celebrates its 30th birthday. In 1989, Sir Tim Berners-Lee invented the world wide web and changed the lives of millions of people around the globe, including mine.


Tim Berners-Lee, inventor of the World Wide Web, in front of the early web.

Milestones like this get me thinking about the positive impact a free and Open Web has had on society. Without the web, billions of people would not have been able to connect with one another, be entertained, start businesses, exchange ideas, or even save lives. Open source communities like Drupal would not exist.

As optimistic as I am about the web's impact on society, there have been many recent events that have caused me to question the Open Web's future. Too much power has fallen into the hands of relatively few platform companies, resulting in widespread misinformation, privacy breaches, bullying, and more.

However, I'm optimistic that the Open Web has a chance to win in the future. I believe we'll see three important events happen in the next five years.

First, the day will come when regulators will implement a set of laws that govern the ownership and exchange of data online. It's already starting to happen with GDPR in the EU and various state data privacy laws taking shape in the US. These regulations will require platforms like Facebook to give users more control over their data, and when that finally happens, it will be a lot easier for users to move their data between services and for the Open Web to innovate on top of these data platforms.

Second, at some point, governments globally will disempower large platform companies. We can't leave it up to a handful of companies to judge what is false and true, or have them act as our censors. While I'm not recommending governments split up these companies, my hope is that they will institute some level of algorithmic oversight. This will offer an advantage to the Open Web and Open Source.

Third, I think we're on the verge of having a new set of building blocks that enable us to build a better, next-generation web. Thirty years into the web, our data architectures still use a client-server model; data is stored centrally on one computer, so to speak. The blockchain is turning that into a more decentralized web that operates on top of a distributed data layer and offers users control of their own data. Similar to building a traditional website, distributed applications (dApps) require file storage, payment systems, user data stores, etc. All of these components are being rebuilt on top of the blockchain. While we have a long way to go, it is only a matter of time before a tipping point is reached.

In the past, I've publicly asked the question: Can we save the Open Web? I believe we can. We can't win today, but we can keep innovating and get ready for these three events to unfold. The day will come!

With that motivation in mind, I want to wish a special happy birthday to the world wide web!

Mar 13 2019
Mar 13

Our team is always excited to catch up with fellow Drupal community members (and each other) in person during DrupalCon. Here’s what we have on deck for this year’s event:

Visit us at booth #709

Drop by and say hi in the exhibit hall! We’ll be at booth number 709, giving away some new swag that is very special to us. Have a lot to talk about? Schedule a meeting with us.

Palantiri Sessions

Keeping That New Car Smell: Tips for Publishing Accessible Content by Alex Brandt and Nelson Harris

Content editors play a huge role in maintaining web accessibility standards as they publish new content over time. Alex and Nelson will go over a handful of tips to make sure your content is accessible for your audience.


Fostering Community Health and Demystifying the CWG by George DeMet and friends

The Drupal Community Working Group is tasked with fostering community health. This Q&A format session hopes to bring to light our charter, our processes, our impact and how we can improve.


The Challenge of Emotional Labor in Open Source Communities by Ken Rickard

Emotional labor is, in one sense, the invisible thread that ties all our work together. Emotional labor supports and enables the creation and maintenance of our products. It is a critical community resource, yet undervalued and often dismissed. In this session, we'll take a look at a few reasons why that may be the case and discuss some ways in which open source communities are starting to recognize the value of emotional labor.

  • Date: Thursday, April 11
  • Time: 2:30pm
  • Location: Exhibit Stage | Level 4


The Remote Work Toolkit: Tricks for Keeping Healthy and Happy by Kristen Mayer and Luke Wertz

Moving from working in a physical office to a remote office can be a big change, yet have a lot of benefits. Kristen and Luke will talk about transitioning from working in an office environment to working remotely - how to embrace the good things about remote work, but also ways in which you might need to change your behavior to mitigate the challenges and stay mentally healthy.

Join us for Trivia Night 

Thursday night we will be sponsoring one of our favorite parts of DrupalCon, Trivia Night. Brush up on your Drupal facts, grab some friends, and don't forget to bring your badge! Flying solo to DrupalCon? We would love to have you on our team!

  • Date: Thursday, April 11
  • Time: 8pm - 11:45pm
  • Location: Armory at Seattle Center | 305 Harrison Street

We'll see you all next week!

Mar 13 2019
Mar 13

by David Snopek on March 13, 2019 - 1:36pm

As you may know, Drupal 6 has reached End-of-Life (EOL) which means the Drupal Security Team is no longer doing Security Advisories or working on security patches for Drupal 6 core or contrib modules - but the Drupal 6 LTS vendors are and we're one of them!

Today, there is a Less Critical security release for the Views 6.x-3.x module to fix a Cross Site Scripting (XSS) vulnerability.

This module enables you to create customized lists of data.

The module doesn't sufficiently sanitize certain field types, leading to a Cross Site Scripting (XSS) vulnerability.

This vulnerability is mitigated by the fact that a view must display a field with the format "Full data (serialized)" and an attacker must have the ability to store malicious markup in that field.

See the security advisory for Drupal 7 for more information.

Note: There are two other security advisories that were published today for Views on Drupal 7, but they don't affect Drupal 6.

Here you can download the Drupal 6 patch or the full release.

Note: This only affects Views 6.x-3.x -- not 6.x-2.x.

If you have a Drupal 6 site using the Views 6.x-3.x module, we recommend you update immediately! We have already deployed the patch for all of our Drupal 6 Long-Term Support clients. :-)

If you'd like all your Drupal 6 modules to receive security updates and have the fixes deployed the same day they're released, please check out our D6LTS plans.

Note: if you use the myDropWizard module (totally free!), you'll be alerted to these and any future security updates, and will be able to use drush to install them (even though they won't necessarily have a release on Drupal.org).

Mar 13 2019
Mar 13

By Natasha Chanto, Marketing | March 13, 2019


DrupalCon 2019

We’re Going to Seattle!

We are a month away from flying out to Seattle for the one thing we have all been waiting for here at weKnow… DrupalCon 2019! Our team is beyond excited to be a part of this event once more as attendees and special conference guests.

Why are we going to DrupalCon?

Most of our WeGive efforts go toward coding for several projects that weKnow maintains, with several developers contributing a significant portion of their time to improving coding tools. Our top contributors, Jesus Manuel Olivas and Omar Aguirre, have dedicated much of their time to projects like the Drupal Console, which has now been downloaded more than 3 million times, helping hundreds of thousands of developers code more efficiently. We also take the time to contribute to numerous modules and to Drupal core. This is why we are flying out to Seattle: to share our work within the Drupal community with other people and companies.

What are some of the benefits of attending DrupalCon?

In this five-day event, people from all around the world come for the training, sessions, social events, and the wealth of networking opportunities. It opens your mind to new ideas and business perspectives, and it allows you to learn about and from other people’s experiences with Drupal. This is the event where we go to soak up inspiration. As developers, we learn from strong and inspiring leaders who encourage us to think big and envision our future. It is a boundless source of motivation that ignites your spark to continue improving your work and technology for the greater good. From the moment you step in, the positive vibe will enthrall you, and if you are a new attendee, you will surely find support to take your first steps in Drupal. It’s a space to bond, strengthen relationships with colleagues and business partners, and dive into the experience.

Our team at DrupalCon

This year, our Head of Products, Jesus Manuel Olivas, will be speaking in two sessions. The first, presented with Mario Hernandez and Mark Casias from Mediacurrent, is Introduction to Decoupled Drupal with Gatsby and React, where attendees will learn how to create a React-based front end with Gatsby. The second session is Building a Slack ChatBot, where attendees will learn how to make interactions between computers and humans feel just like interactions between humans. Andres and Omar are looking forward to attending DevOps and front-end sessions such as Drupal Blue/Green deployments with AWS ECS, Serverless, Well Actually…, and Gatsby and Drupal, among others.

Throughout our time in Seattle, we will be posting content and updates on our social media channels. If you want to follow along on our journey, follow Jesus Manuel Olivas on Twitter as @jmolivas and Andres Avila as @andresavila97. And if you are going to DrupalCon, see you in Seattle!

 

Drupalcon Seattle 2019

If you are looking to augment your team, execute your vision, automate your process or learn more about Drupal and our contributions, we invite you to get to know us better.

Mar 13 2019
Mar 13

By default, Drupal 8 has two methods for building the breadcrumb trail: for content, the trail is based on the URL of the page; for taxonomy terms, it is based on the vocabulary hierarchy.

The default construction of a breadcrumb

Let's explore in more detail the construction of the breadcrumb for content.

Let's take an example of a content page with the following URL:

/services/freelance/drupal/webfactory-drupal

The last part of the URL (webfactory-drupal) corresponds to the title of the page. Drupal will then inspect the rest of the URL and, for each part, check whether a piece of content matches that path.

So, Drupal will inspect this URL, to see if it matches existing content.

/services/freelance/drupal

If so (let's imagine that a content whose title is Drupal Specialist has this URL), the title of the page is added to the breadcrumb trail.

Then, Drupal inspects this URL, to see if it matches existing content.

/services/freelance

If so (the content title is Freelance Drupal for example), the page title is added to the breadcrumb trail.

And finally Drupal inspects the last part of the URL, to see if it still matches existing content.

/services

The page title (Services) is then added to the breadcrumb trail.

So for this example, if each part of the path corresponds to an existing content page, the breadcrumb trail generated for this URL will be the following.

Home > Services > Freelance Drupal > Drupal specialist > Web factory Drupal
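The lookup described above can be sketched in plain PHP. The $titleForPath map below stands in for Drupal’s real “is there content at this path?” check, so treat this as an illustration of the algorithm, not core’s actual code:

```php
<?php
// Map of known aliases to page titles; stands in for Drupal's content lookup.
$titleForPath = [
  '/services' => 'Services',
  '/services/freelance' => 'Freelance Drupal',
  '/services/freelance/drupal' => 'Drupal specialist',
];

function buildTrail(string $path, array $titleForPath): array {
  $crumbs = ['Home'];
  $parts = array_values(array_filter(explode('/', $path)));
  array_pop($parts); // The current page's title is handled by the theme.
  $current = '';
  foreach ($parts as $part) {
    $current .= '/' . $part;
    // Only parts that match existing content end up in the trail.
    if (isset($titleForPath[$current])) {
      $crumbs[] = $titleForPath[$current];
    }
  }
  return $crumbs;
}

echo implode(' > ', buildTrail('/services/freelance/drupal/webfactory-drupal', $titleForPath));
// Home > Services > Freelance Drupal > Drupal specialist
```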

It is thus possible to build a custom-made, relevant breadcrumb trail using this detection by parent path, either by using a manual alias for listing pages, pivot pages or landing pages, or by using the Pathauto module to automatically build a relevant alias for content to be automatically placed in a section of a site (typical example, news, events, services, etc.). 

Note that the generation of the last part of the breadcrumb trail, namely the title of the current page (Web factory Drupal in our example), is the responsibility of the theme. As a general rule, any well-built theme will offer an option to show or hide the current page title in the breadcrumb trail. Or it can be done with a simple hook.

/**
 * Implements hook_preprocess_HOOK().
 */
function MY_THEME_preprocess_breadcrumb(&$variables) {
  $request = \Drupal::request();
  $route_match = \Drupal::routeMatch();
  $page_title = \Drupal::service('title_resolver')->getTitle($request, $route_match->getRouteObject());

  $variables['#cache']['contexts'][] = 'url';
  if (count($variables['breadcrumb']) <= 1) {
    $variables['breadcrumb'] = [];
  }
  else {
    $breadcrumb_title = theme_get_setting('breadcrumb_title');
    if ($breadcrumb_title) {
      $variables['breadcrumb'][] = array(
        'text' => $page_title
      );
    }
  }
}

The breadcrumb trail for taxonomy term pages is built according to a different logic: the hierarchy of terms, regardless of the alias used for the term pages.

These two methods of breadcrumb generation are the defaults included in Drupal Core. It is of course possible to modify this default behavior by means of contributed modules, such as Breadcrumb Menu, which generates the breadcrumb based on the position of the page in the main menu and, if the page is absent from the menu, falls back to Drupal Core's default generation, or by means of a custom module.

Customize the breadcrumb trail with a module

Altering the construction of the breadcrumb is done by means of a service tagged with the breadcrumb_builder tag. For example:

my_module.term_breadcrumb:
  class: Drupal\my_module\MyModuleTermBreadcrumbBuilder
  arguments: ['@entity_type.manager', '@entity.repository', '@config.factory', '@path.validator', '@path.alias_manager']
  tags:
  - { name: breadcrumb_builder, priority: 1010 }

The priority given to a service of this type makes it possible to order which rules to apply first, the highest priorities being those applied first.

The MyModuleTermBreadcrumbBuilder class must implement two methods:

  • The applies() method that will allow us to indicate when to apply this rule of construction of the breadcrumb
  • The build() method that will build the breadcrumb itself.

So if we want to add a parent to the breadcrumb trail of taxonomy terms pages, for example, our class will look like this.

/**
 * Provides a custom taxonomy breadcrumb builder that uses the term hierarchy.
 */
class MyModuleTermBreadcrumbBuilder implements BreadcrumbBuilderInterface {
  use StringTranslationTrait;

  /**
   * The entity type manager.
   *
   * @var \Drupal\Core\Entity\EntityTypeManager
   */
  protected $entityTypeManager;

  /**
   * The entity repository.
   *
   * @var \Drupal\Core\Entity\EntityRepositoryInterface
   */
  protected $entityRepository;

  /**
   * Drupal\Core\Config\ConfigFactoryInterface definition.
   *
   * @var \Drupal\Core\Config\ConfigFactoryInterface
   */
  protected $configFactory;

  /**
   * The taxonomy storage.
   *
   * @var \Drupal\Taxonomy\TermStorageInterface
   */
  protected $termStorage;

  /**
   * The settings of my module taxonomy configuration.
   *
   * @var \Drupal\Core\Config\Config
   */
  protected $taxonomySettings;

  /**
   * The path validator service.
   *
   * @var \Drupal\Core\Path\PathValidatorInterface
   */
  protected $pathValidator;

  /**
   * The alias manager.
   *
   * @var \Drupal\Core\Path\AliasManagerInterface
   */
  protected $aliasManager;

  /**
   * MyModuleTermBreadcrumbBuilder constructor.
   *
   * @param \Drupal\Core\Entity\EntityTypeManagerInterface $entity_type_manager
   * @param \Drupal\Core\Entity\EntityRepositoryInterface $entity_repository
   * @param \Drupal\Core\Config\ConfigFactoryInterface $config_factory
   * @param \Drupal\Core\Path\PathValidatorInterface $path_validator
   * @param \Drupal\Core\Path\AliasManagerInterface $alias_manager
   * @throws \Drupal\Component\Plugin\Exception\InvalidPluginDefinitionException
   * @throws \Drupal\Component\Plugin\Exception\PluginNotFoundException
   */
  public function __construct(EntityTypeManagerInterface $entity_type_manager, EntityRepositoryInterface $entity_repository, ConfigFactoryInterface $config_factory, PathValidatorInterface $path_validator, AliasManagerInterface $alias_manager) {
    $this->entityTypeManager = $entity_type_manager;
    $this->entityRepository = $entity_repository;
    $this->configFactory = $config_factory;
    $this->pathValidator = $path_validator;
    $this->aliasManager = $alias_manager;
    $this->termStorage = $this->entityTypeManager->getStorage('taxonomy_term');
    $this->taxonomySettings = $this->configFactory->get('my_module.taxonomy_settings');
  }

  /**
   * {@inheritdoc}
   */
  public function applies(RouteMatchInterface $route_match) {
    return $route_match->getRouteName() == 'entity.taxonomy_term.canonical'
      && $route_match->getParameter('taxonomy_term') instanceof TermInterface;
  }

  /**
   * {@inheritdoc}
   */
  public function build(RouteMatchInterface $route_match) {
    $breadcrumb = new Breadcrumb();
    $breadcrumb->addLink(Link::createFromRoute($this->t('Home'), '<front>'));
    /** @var \Drupal\taxonomy\TermInterface $term */
    $term = $route_match->getParameter('taxonomy_term');
    $breadcrumb_parent = $this->taxonomySettings->get('vocabularies.' . $term->bundle() . '.breadcrumb_parent');
    if ($breadcrumb_parent) {
      if ($this->pathValidator->isValid($breadcrumb_parent)) {
        $path = $this->aliasManager->getPathByAlias($breadcrumb_parent);
        if(preg_match('/node\/(\d+)/', $path, $matches)) {
          $node = Node::load($matches[1]);
          if ($node instanceof NodeInterface) {
            $node = $this->entityRepository->getTranslationFromContext($node);
            $breadcrumb->addCacheableDependency($node);
            $breadcrumb->addLink(Link::createFromRoute($node->label(), 'entity.node.canonical', ['node' => $node->id()]));
          }
        }
      }
    }

    // Breadcrumb needs to have terms cacheable metadata as a cacheable
    // dependency even though it is not shown in the breadcrumb because e.g. its
    // parent might have changed.
    $breadcrumb->addCacheableDependency($term);
    // @todo This overrides any other possible breadcrumb and is a pure
    //   hard-coded presumption. Make this behavior configurable per
    //   vocabulary or term.
    $parents = $this->termStorage->loadAllParents($term->id());
    // Remove current term being accessed.
    array_shift($parents);
    foreach (array_reverse($parents) as $term) {
      $term = $this->entityRepository->getTranslationFromContext($term);
      $breadcrumb->addCacheableDependency($term);
      $breadcrumb->addLink(Link::createFromRoute($term->getName(), 'entity.taxonomy_term.canonical', ['taxonomy_term' => $term->id()]));
    }

    // This breadcrumb builder is based on a route parameter, and hence it
    // depends on the 'route' cache context.
    $breadcrumb->addCacheContexts(['route']);

    return $breadcrumb;
  }

}

This class largely follows the breadcrumb construction logic provided by Drupal Core, and only adds a parent to the breadcrumb according to a configuration parameter. The same logic can also be applied to the breadcrumb trail of content pages, in case you want a view, for example, in the breadcrumb trail, or any other page that is not a content page.

In the end, customizing a breadcrumb trail can be done in many ways, as is often the case with Drupal, but I must admit that the default pattern covers many use cases and is very often sufficient, with a dash of configuration when generating the aliases of the pages of a Drupal 8 project. Finally, we can also note the Custom Menu Breadcrumbs module, which allows us to configure a main parent from a menu item for content of a certain type.

Mar 13 2019
Mar 13

The TWG coding standards committee is announcing an issue for final discussion. Feedback will be reviewed on March 20, 2019.

To help the initiative to update all deprecated code for Drupal 9 we need a standardized format for deprecation messages.

New issue for discussion:

  • Issue #3024461: Adopt consistent format for deprecation messages.
    Having a machine readable format for deprecation messages will allow us to develop tools on api.drupal.org to keep track of the current status of deprecated code in Drupal core and contributed modules. This will help drive the initiative to update all deprecated code before the release of Drupal 9.

Interested in helping out?

You can get started quickly by helping us to update an issue summary or two or dive in and check out the full list of open proposals and see if there's anything you'd like to champion!


Mar 12 2019
Mar 12

Every year community members from across the globe meet in Orlando for Florida Drupal Camp. This year Adam, Ryan, and Jonathan from Hook 42 attended. It was a fantastic time to connect with people, to learn, and enjoy some warmer weather. Plus, alligators!

Ryan and Adam led a training on connecting Drupal 8 and Gatsby.JS. The training utilized a set of Docker images that help people build everything from end-to-end on their own system. Attendees used the Umami demo, installed JSON API, and configured Gatsby.JS to pull recipes from Drupal. It was well attended and there was a lot of collaboration in the room. Our team appreciated all of those that attended - especially as we worked through technical and wifi issues. 

Ryan and Adam also gave a talk on emerging technology related to Drupal. Some of the topics included Cypress.io, Hubspot, ElasticSearch, GraphQL, Gatsby.JS, and Pattern Lab. It’s important for community members to find and use the right tool for the right job. As Drupal community members, we must be mindful of how these complementary technologies can serve us. You can watch the recording of the session on Drupal TV.

Team Member Reflections

Adam

I look forward to this camp each year, and once again this year did not disappoint. Thank you to “the Mikes”, Kyle, and Jordana for volunteering again. The time spent with the community was uplifting and encouraging - even if it was a bit tiring giving both a training and a session. The time spent with my colleagues Jonathan and Ryan, having barbeque, and drinking craft beer with the community all brought a lot of energy. I was able to reconnect with many friends and I always treasure those opportunities. Also, Jonathan and I unintentionally packed the same hoodies and Drupal Camp Asheville shirts. Twinsies!

The session quality was outstanding! Most notably, there were some incredibly great non-technical, people-focused sessions provided by Jordana Fung and Qymana Botts. I strongly recommend seeing both. Thanks, as always, to the amazing Kevin Thull; all sessions are recorded and posted to Drupal TV.
 
At the end of camp, on the contribution day, I was able to work on a proof-of-concept integrating SimplyTest.me and Tugboat QA, the result of which is a new Tugboat QA contributed module based on a Backdrop CMS module port.

Ryan

This wasn’t just my first time to Florida Camp, it was my first time in the state of Florida in general! What a great experience. I had a great time at the camp and really enjoyed speaking with a wide array of folks about decoupled Drupal and Gatsby.js, both through the training Adam and I conducted as well as hallway conversations. Brian Perry’s talk comparing various static site generators and their compatibility with Drupal proved really insightful. I was really excited about his demo of the Tome static site generator, which runs within Drupal itself.

Thanks a ton to the organizers - the camp was a lot of fun and I look forward to attending again next year!

Jonathan

My second time at Florida Drupal Camp was as good as, if not better than, my first. As a Drupal Camp organizer myself, I find Florida Drupal Camp to be inspiring. They manage to pull off being extremely well organized while maintaining a very casual air to the whole weekend.

As an attendee, I had a unique (for me) experience this year. I went to Drupal Camp Florida without a computer and it was great! Without a laptop always in front of me I found myself more attentive to the session and more engaged in the camp in general. I enjoyed it so much that I will likely avoid taking my computer to future camps.

The sessions I attended were all wonderful and there are a few I’m still thinking about today. Here is a quick list, and my takeaways:

  • How to keep Drupal relevant in the Git-based and API-driven CMS era
    The Drupal island appears to be shrinking as the overall landscape for approaches to web development grows. It doesn’t hurt to keep an eye on new ways to solve old problems.
  • How to Hire and Fire Your Employer
    Always a fan of these “being human” talks. In this one, April encouraged me to continuously evaluate my values as a person and how they match any given employer. “Life is too short to be miserable” is a powerful perspective to maintain.
  • An Introduction to Gulp
    I’ve been using Gulp and similar technologies for a number of years, but have almost always used someone else’s configuration. Tim makes it look easy to get started writing my own tasks, and I’m really looking forward to doing so on my next project.
  • The problem with Paragraphs, a return to rich fields!
    Maybe the only actual Drupal session I attended, it’s always great to see Hawkeye and get his perspective. In this session he opened my eyes to some serious problems with the Drupal community’s go-to approach to complicated fields, the Paragraphs module. By the end of the session, I was ready to start making custom rich/composite/compound fields for future projects.

Another top-notch Drupal Camp Florida in the bag. I’ll definitely be back next year, and if you’ve never been I highly recommend it.

Mar 12 2019
Mar 12

Simple XML sitemap 3.1 has been released

The third major version of simple_sitemap was long in the making, bringing a more reliable generation process, a significantly more versatile API, and many new features. The first minor upgrade of the 3.x branch adds views support and human-readable sitemaps.

Major new features in 3.1

Views and views arguments support

Including view display URLs in the sitemap has been possible by adding these URLs as custom links in the UI (or via the API).

View variations created by view arguments, however, are tedious to include, as one would have to add every version of the URL by hand.

The integration of simple_sitemap_views into simple_sitemap 3.x makes this easy to do via the UI.

Thanks to @WalkingDexter for his tremendous work on this submodule!

Human-readable sitemaps with XSL stylesheets

Before:

Sitemap without XSL

Now:

XML sitemap with XSL stylesheet

This will not change how bots interact with the sitemaps, it will however make the sitemaps readable and sortable for humans. This can be helpful when debugging the sitemap content or using the sitemap to visually present content to users.
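Under the hood, a sitemap opts into a stylesheet via an xml-stylesheet processing instruction at the top of the generated XML. A minimal sketch (the stylesheet path here is illustrative; the module wires up its own):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="/sitemap.xsl"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2019-03-12</lastmod>
  </url>
</urlset>
```

Browsers apply the referenced stylesheet when a human opens the sitemap URL, while crawlers simply parse the XML and ignore the presentation.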

Other improvements

You can see the list of bug fixes and improvements on the module's release page.

Upgrade path

The module upgrades fine from any of the 2.x and 3.x versions.

To upgrade the module via composer, $ composer require 'drupal/simple_sitemap:^3.1' can be used. Afterwards, just visit /update.php, or run $ drush updb to update the module's storage.

For more information about the 3.x branch of the module, see this post. I invite you to watch this space for a more in-depth technical tutorial on how to programmatically create sitemap types. Also feel free to leave a comment below!

Mar 11 2019
Mar 11

It’s hard to imagine life without mobile devices. While a developer controls the display of the site and its structure, editors are the ones adding content to a site on a regular basis. One tool that developers can use with editors in mind is Responsive Preview.

Difficulties during content creation can range from complex interfaces to performance problems. Each of these problems can be multiplied when you add in responsive design.  

Responsive Preview is a module that provides you with a quick way to preview how the website's pages will appear with various screen dimensions.

Benefits

Responsive Preview helps editors preview the page when they try out new layouts or add new content. This means they can make sure everything works on the most commonly used devices before publishing to the live site. The module provides a quick approximation of how the page and its layout will look on any device.

How it works

From a development perspective, the module creates an iframe with the configured dimensions. This means it doesn't have to compile and render all the frontend code and styles on the backend to show a preview. Instead, it just loads an iframe that displays the frontend view.

The module provides a “Preset” config entity with fields that describe a device, including “width”, “height”, “rotation”, and more. On a node view page, there is a select element listing these presets. Selecting an option opens an overlay with a preview of the current page.

iframe.css({
    width: width,
    height: height
});

This way we get a preview of everything we need: simply create a controller that renders the page and set its URL as the src attribute of the iframe.
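As a rough sketch of that dimension logic (the preset shape and function name here are illustrative, not the module's actual API), applying a preset boils down to choosing a width and height, swapping them when the preset is rotated:

```javascript
// Compute CSS dimensions for the preview iframe from a preset.
// A rotated (landscape) preset simply swaps width and height.
function presetDimensions(preset) {
  const rotated = preset.rotation === 'landscape';
  return {
    width: rotated ? preset.height : preset.width,
    height: rotated ? preset.width : preset.height,
  };
}

// e.g. a phone-sized preset viewed in landscape:
// presetDimensions({ width: 375, height: 667, rotation: 'landscape' })
// returns { width: 667, height: 375 }
```

The resulting object is exactly the kind of value that gets passed to the .css() call on the iframe.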

Responsive Preview Module Screenshot

Conclusion

When creating a site, developers should think not only about end users but about editors who will need to work on the site daily. Drupal is a great solution for content management, but sometimes it lacks ease of use.

It’s not enough to only have a mobile, tablet, and desktop design anymore, as many devices fall in between those dimensions. Phones can be the same size as small tablets and the new iPad Pros have a larger screen than some laptops. Responsive Preview provides the flexibility to configure the module as an admin. This makes it easy to add new presets when new devices are released.

Got a project that needs editor-friendly responsive design? Get in touch with us today!

Mar 11 2019
Mar 11
Photo by Bureau of Reclamation https://www.flickr.com/photos/usbr/12442269434

You’ve decided to use Acquia DAM for managing your digital assets, and now you need to get those assets into Drupal where they can be put to use. Acquia has you covered for most use cases with the Media: Acquia DAM module. This module provides a suite of tools to allow you to browse the DAM for assets and associate them with Media entities. It goes a step further by ensuring that those assets and their metadata stay in sync when updates are made in the DAM.

This handles the key use case of referencing assets from an existing entity in Drupal, but what if your digital assets are meant to live stand-alone in the Drupal instance? This was the outlying use case we ran into on a recent project.

The Challenge

The customer site had the requirement of building several filterable views of PDF resources. It didn’t make sense to associate each PDF to a node or other entity, as all of the metadata required to build the experience could be contained within the Media entity itself. The challenge now was to get all of those assets out of the DAM and into media entities on the Drupal site without manually referencing them from some other Drupal entity.

The Solution

By leveraging the API underlying the Media: Acquia DAM module we were able to create our own module to manage mass importing entire folders of assets from Acquia DAM into a specified Media bundle in Drupal. This takes advantage of the same configuration and access credentials used by Media: Acquia DAM and also leverages that module for maintaining updates to metadata for the assets post-import.

The Acquia DAM Asset Importer module allows the site administrator to specify one or more folders from Acquia DAM to import assets from. Once configured, the module runs as a scheduled task through Drupal’s cron. On each cron run, the module will first check to see if there are any remaining import tasks to complete. If not, it will use the Acquia DAM API to retrieve a list of asset IDs for the specified folders. It compares that to the list of already imported assets. If new assets exist in the folders in Acquia DAM, they’re then added to the module’s Queue implementation to be imported in the background.

The QueueWorker implementation that is part of the Acquia DAM Asset Importer then processes its queue on subsequent cron runs, generating a new Media entity of the specified bundle, adding the asset_id from Acquia DAM and executing save() on the entity. At this point the code in Media: Acquia DAM takes over, pulling in metadata about the asset and syncing it and the associated file to Drupal. Once the asset has been imported into Drupal as a Media entity, the Media: Acquia DAM module keeps the metadata for that Media entity in sync with Acquia DAM, using its own QueueWorker and cron implementations to periodically pull data from the DAM and update the Media entity.
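The folder-scan step described above amounts to a set difference between the asset IDs in the DAM folders and the IDs already imported. A simplified sketch (function and variable names are illustrative, not the module's actual code):

```javascript
// Given the asset IDs currently in the configured DAM folders and the
// IDs already imported as Media entities, return the IDs to enqueue.
function assetsToQueue(folderAssetIds, importedAssetIds) {
  const imported = new Set(importedAssetIds);
  return folderAssetIds.filter((id) => !imported.has(id));
}

// e.g. assetsToQueue(['a1', 'a2', 'a3'], ['a2']) returns ['a1', 'a3']
```

Each returned ID would then become one queue item for the QueueWorker to process on a later cron run.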

Try it out

Are you housing assets in Acquia DAM and need to import them into your Drupal site? We’ve contributed the Acquia DAM Asset Importer module on Drupal.org. Download it here and try it out.

Mar 11 2019
Mar 11

Thank you for backing the Webform module's Open Collective

First off, I want to thank the backers of the Webform module's Open Collective. After my last blog post, Open email asking organizations to back the Webform module and Drupal-related Open Collectives, we have 14 backers and a current balance of $908.44 that needs to be spent.

I also received a comment and an email about the need for some process for paid support. It’s worth noting that the Drupal Association is exploring a paid support model for assisting with security releases. We should recognize that Drupal 8 was a major software change and it is one that is still changing the Drupal community. And while I am thinking about how the Drupal community is changing and how we can develop better support around the Webform module, one of my more immediate concerns is improving the Webform module's Open Collective, and brand is the first thing I want to address.

Improving the Webform module's Open Collective

There are some useful tips and guides for building and maintaining an Open Collective. I appreciate Pia Mancini’s statement that "A collective is not a sprint," which makes me feel comfortable taking time to build out the Webform module's Open Collective.

Defining and strengthening the Webform module's mission will help clarify to backers what they are supporting and getting from the Webform module. The product summary for the Webform module is…

The Webform module provides all the features expected from an enterprise proprietary form builder combined with the flexibility and openness of Drupal

Drupal directly competes with proprietary Content Management Systems, and the Webform module is competing with hundreds of proprietary form and survey builders. The Webform module is not included on any top form builder lists, even though in a feature comparison we are on par with most of these form builders, with the added benefit that Drupal and Webform are completely free for everyone to use. This is why, with the launch of the Webform module's Open Collective, I proposed using collected funds to improve the Webform module's marketing and brand.

Improving the Webform module's brand

The Drupal Association is actively working on promoting Drupal. At the same time, we as a community can also promote Drupal's contributed module ecosystem. There are some amazing contributed projects with equally amazing developers contributing and maintaining these projects. Besides writing quality open source software, it’s also essential to consider the big picture of an open source project. For example, the Webform module's project page's information architecture works to address every aspect of the Webform module, from encouraging people to watch a video, to trying the Webform module, to exploring documentation. A piece of software or project is ultimately represented visually by a logo.

A dedicated logo is missing from the Webform module's project and Open Collective landing pages.

I did not start this blog post with this direct ask for a Webform logo because I feel there is a much greater need and opportunity for the Drupal community. That said...

Better branding is missing from most Drupal contributed projects.

Improving all Drupal-related projects brand

Some Drupal projects have dedicated logos which help establish the project's brand.

  • Drupal.tv website has the word TV inside the "Drupal" drop.
  • Acquia's Lightning distribution has a lightning bolt inside the "Drupal" drop.
  • Token module has very cool looking [T].
  • Rules module has an R flow diagram.
  • Paragraphs module has a right-pointing pilcrow.

All of the above project logos demonstrate how having a visual representation helps establish a project's brand.

In the spirit of the giving back to Drupal community…

Is it possible for us to collaborate and create a logo for the Webform module that also provides a universal logo kit which makes it possible for any Drupal contributed project to have a dedicated logo?

Providing a logo kit for Drupal contributed projects

When exploring the existing project logos in the Drupal community, I feel there are two key requirements for a reusable logo kit.

First, the logo should include a visual reference to Drupal. I like the concept of using the Drupal "drop" with some icon or text placed inside the drop.

Second, the logo kit is going to need to leverage a Creative Commons licensed, freely available icon library, such as the Seven theme's Libricons or Material Design's icons.

We can use Drupal media kit's logo page as a starting point for the final deliverables.

The Webform module's logo requirements and process

Right now, the simplest icon which could be used to represent a form builder is a checkbox, but I am open to other suggestions. The logo should also optionally include the 'Webform' project name. The Webform logo may also be used with the Webform UI as a callout.

Our entire process will be transparent with all materials freely available. As this project proceeds, we will publish blog posts encouraging community feedback.

Who should respond to this request for proposal

First and foremost, you need to want to give something back to Drupal and Open Source with the understanding that your contribution will be recognized, although you will most likely not be fully financially compensated for the work.

We have only $900 USD available for this project. It is not a lot and may be too little. It might make sense for us to do a two-phase approach with the Webform logo designed in phase 1 and the Drupal contributed project logo kit part of phase 2.

I know some people or organizations might be willing to do this work for free, but it is important that you are compensated in some way because…

It is not reasonable to assume that someone is going to do something for free just because they are contributing to open source.

Proposals could be very simple with a general statement of work and timeline. Our entire process is going to be transparently done using the Drupal.org issue queue. Proposals can be posted as comments to Issue #3026111: Create a logo and header for the Webform module. Also, feel free to ping on the Drupal Slack #webform channel if you have questions.

Submit a Webform logo proposal

Thanks again

I opened this blog post thanking the existing backers of the Webform module because they are making it possible for the Webform module to have a logo. Building a reusable logo kit for all contributed projects will most likely require more funds. Having a Drupal contributed project logo kit will help us strengthen our community's visual brand and presentation.

Strengthening and supporting the Drupal community is a shared goal of every Drupal-related Open Collective, whether that is providing testing environments, video recordings, cloud hosting, camps, or a powerful, flexible, and open source form builder.

Please consider backing the Webform module or any Drupal-related Open Collective.

Support the Webform module


Mar 11 2019
Mar 11

DrupalCon Seattle is about a month away, and we're putting the finishing touches on this year's plans. Drupal's biggest annual conference affords us the opportunity to support the project, share our expertise, and connect with our colleagues from far and wide. We love DrupalCon. Here's what we've got in store this year.

Our Booth & Swag

Come by the Chromatic booth, #516 in the exhibit hall, to pick up some free Chromatic swag. No kidding, our t-shirts are the softest/coolest and we've got an awesome new vintage design this year:

And we made some kick-ass stickers for this year's conference too:

Our Sessions

Introduction to Drupal 8 Migrations - Clare Ming

Migrations can be intimidating but with Migrate modules now in core, it’s easier than ever to upgrade or migrate legacy applications to Drupal 8. Let's demystify the process by taking a closer look at how to get started from the ground level.

In this session we’ll cover:

  • A brief overview of the Migrate APIs for importing content to Drupal 8, including Migrate Drupal's capabilities to move content from a Drupal source.
  • Understanding your migration pathway.
  • Getting your site ready for migrated content.
  • Sample migration scripts and configuration files for migrating nodes and field mapping.
  • Consideration of media entities, file attachments, and other dependencies.
  • Using Migrate Drupal UI and Migrate Tools for managing migrations.

For those new to Drupal, this session will introduce the basic concepts and framework for setting up your Drupal 8 migration path successfully.

Time: 04/10/2019 / 11:45 - 12:15
Room: 609 | Level 6

Configuration Management: A True Life Story - Nathan Dentzau

Long gone are the days of copying databases, creating a custom module, or creating features to push new functionality to your Drupal Website. Those days and arcane methods are a thing of the past with Drupal 8. Why or how you ask? Read on my friend, read on!

Managing configuration in Drupal 8 has become much easier with the introduction of configuration management. In this talk, we will review “good practices” Oomph and Chromatic have established for configuration management, what tools we use on all of our Drupal 8 projects and how to use them. We will also discuss how configuration management ties into a Continuous Integration pipeline using Github and Travis-CI.

We will discuss the following list of modules:

  • Configuration Manager
  • Configuration Split
  • Configuration Read-only mode
  • Configuration Installer

What you can expect to learn from this talk:

  • Automating configuration management in a Continuous Integration pipeline
  • How to export and import configuration in Drupal 8
  • How to manage configuration in version control
  • How to manage configuration for multiple environments
  • How to install a new instance of Drupal with a set of existing configuration

This talk is for all skill levels and aims to make everyone’s life easier through the magic of Drupal Configuration Management. We look forward to sharing our experience with you and answering any questions you may have.

Time: 04/10/2019 / 12:30 - 13:00
Room: 612 | Level 6

Saving the world from bad websites. - Dave Look

"Saving the world from bad websites." This is our compelling saga at Chromatic. We've spent years growing and evolving as a team and it has taken time for us to land on this phrase as our compelling saga. In this talk, we'll explore the idea of a compelling saga for an agency, what it is, and why it's important. This concept comes from a book titled "High Altitude Leadership" where the author explores leadership principles learned from mountaineering expeditions. We'll discover how these same leadership principles can be applied to our industry.

We will cover:

  • What a compelling saga is and why you should have one for your Agency.
  • How these change and evolve as your agency grows.
  • The cultural impact of all team members being able to articulate the compelling saga.
  • How buy-in or lack of can mean life or death of your agency.

Time: 04/10/2019 / 13:00 - 13:30
Room: 6C | Level 6

Preprocessing Paragraphs: A Beginner's Guide - Larry Walangitan

Paragraphs is a powerful and popular contributed module for creating dynamic pages in Drupal 8. Preprocessing allows us to easily create and alter render arrays to generate tailored markup for any paragraph type.

In this session you will learn about:

  • Getting started with preprocessing paragraphs and structuring your preprocessing methods.
  • Creating custom render arrays and overriding twig templates.
  • Referencing nested entities and pulling data into paragraph twig templates.
  • How to debug your preprocessing and twig files using contributed modules or composer packages without running out of memory.

You'll leave this session ready to preprocess paragraphs and have a plan of action to reference when debugging any issues. This session is perfect for site-builders or back/front-end devs that are new to preprocessing in Drupal 8 and Twig.

Time: 04/10/2019 / 16:00 - 16:30
Room: 608 | Level 6

If you're coming to the conference, we'd love to meet you. Come say "hello", grab some swag, and tell us how you use Drupal....oh, and if you're looking for a job, our booth is a great place to make an impression on us!

Mar 11 2019
Mar 11

Last month, our team was busy acquiring new members, preparing for DrupalCamp London which took place the first week of March, and getting everything set for our Ljubljana team’s move into new offices. Still, this didn’t prevent us from writing some really fun blog posts. Here’s a quick recap of our posts from February in case you missed any.

Druplicon.org: In Search of the Lost Druplicon

The first post we wrote in February presented druplicon.org, a site for exploring the different variations of the famous Druplicon, and the story behind the site’s creation. The idea originated with one of our developers, and the site itself was also built by our developers as part of their onboarding project.

Visitors to the site get to explore the various Druplicons in a fun and educational way, and they also get the chance to submit any icons that they can’t find in the inventory. But the true highlight of this blog post is the origin story behind druplicon.org - by now, you’re probably eager to know about it, so, give it a read!

Read more

Interview with Taco Potze: Why Drupal was the CMS of choice and what potential open source has

We continue with one of our Community Interviews. We managed to get some very interesting insights on Drupal and the Dutch Drupal community from Taco Potze and his team. 

With Taco being a co-founder of several notable projects in the Drupalverse (GoalGorilla, Open Social and the blockchain-based THX), our talk with him was a really great and thought-provoking one. We really enjoyed getting to know more about his projects and his views on the potential of open source. Thanks for sharing your thoughts on Drupal with us, Taco!

Read more

Interview with Amber Matz: How will Drupal's greatest challenge shape its future?

Next up, we had another post from the Community Interviews series. We talked with Amber Matz, who is Production Manager and Trainer at Drupalize.me; but her involvement with Drupal does by no means end there. Among other notable roles, she is also involved with organizing the Builder Track for the upcoming ‘Con.

According to Amber, the greatest challenge that Drupal will face is intrinsically connected to one of its greatest advantages - its scalability. The main obstacle going forward will thus be gaining more insight into our user base and consequently having more articulated and differentiated tools for the easy acquisition of Drupal.

Read more

Top 6 SEO Modules for Drupal 8

We finished February’s blog posts with a list of useful SEO modules for Drupal 8. Drupal is a CMS that is very SEO-friendly, and its prolific community has provided a range of modules that can vastly improve the SEO of any Drupal site.

By using the modules from this list, you'll no longer have to worry about things such as manually creating proper URLs or taking care of dead links. If you’re dissatisfied with the SEO ranking of your Drupal 8 site, then these are the perfect modules to get you started on stepping up your SEO game.

Read more

We hope you enjoyed revisiting our blog posts from February. Stay tuned for more!
 

Mar 11 2019
Mar 11

Few things are as good for a business as a website that looks great and runs well. When you’ve established a strong digital presence and rank well on Google, then you’re set to chase and convert leads to your heart’s delight.

Before any of those gains materialize, however, there’s the tricky task of building that website.

If you’ve chosen to use the Drupal CMS, then congratulations: you’ve made the right choice. Drupal is agile, powerful, and home to a wide community of developers and entrepreneurs.

Moreover, Drupal makes it possible to choose themes: templates that do most of the legwork of designing a site so you need only worry about the parts that matter (ex. your lead magnets, SEO, and copy).

Since not all themes are created equal... we’ve compiled a rundown of the 7 best Drupal themes to use in 2019:

1. Progressive by NikaDevs

Best Drupal Theme: Progressive By NikaDevs

Progressive is a theme that offers great value for money.

Its creators have packaged it with over 200 interactive elements, meaning you’ll be sure to find a function that lets your site move and behave just the way you envisioned it. It comes with video hosting, unique slider effects, and visual features that are guaranteed to capture an audience’s attention.

Our favorite thing about the theme is the set of four homepages offered at entry. While other themes force you into a single aesthetic mode, Progressive offers its solutions without placing implicit restrictions on your design options.

Price: $59

Compatibility: Drupal 7 & 8

Best used for: Businesses of all shapes and sizes.

2. TheMAG by PinkDexo

Best Drupal Theme: TheMAG by PinkDexo

Today’s marketing is all about content that informs, entertains, or intrigues. It’s for this reason that TheMAG by PinkDexo ranks on our list of the best Drupal themes for 2019: it’s built to house content and house it well.

TheMAG is ideal for sites that want to rake in ad revenue, profit from content, or showcase products. It comes with a wide range of layout options and interactive elements, but it shines the most when used to present content in the style of a --you guessed it-- magazine.

We recommend taking a look at the theme before getting too excited. It’s a perfect fit for certain niches (blogging, journalism, entertainment) but other types of businesses would do best to continue reading through our list.

Price: $54

Compatibility: Drupal 7, Drupal 8, Drupal Thunder, & Drupal Commerce

Best used for: Media and Entertainment.  Businesses that lean on blogging activity.

 

 

3. Winnex by gavias

Best Drupal Theme: Winnex by gavias

Winnex perfects soft and professional design through its clean look and implicit geometry. It took the best lessons out of the adage, “less is more,” and used them to create an intuitive theme for professionals looking to market their services.

The theme gives users an easy time of site development thanks to its block-based, drag-and-drop interface, its support staff, and the numerous video tutorials they’ve produced.

Winnex gets all the basics right, but our favorite feature is how effortless it is to upload and display videos. After all, when it comes to pitching a service, it pays to set your site up with content that builds user trust.

Price: $48

Compatibility: Drupal 8

Best used for: Consultancy and other B2B services.

 

 

4. Porto by refaktor

Best Drupal Theme: Porto by refaktor

Part of a CMS’ job is to simplify the job of web design. In this regard, Porto is a theme that surpasses expectations. We’re hard-pressed to think of any other theme that can get so much done in so simple a way.

Porto features an unparalleled level of customization, with nearly endless options for page layout, header design, color mixing, and media hosting. This means that any business -- from consultants to home repair services, to e-commerce businesses -- can build their dream site using the theme.

Price: $59

Compatibility: Drupal 7, Drupal 8, Drupal Commerce, & Bootstrap 3.x

Best used for: E-Commerce. Businesses of all shapes and sizes.

 

 

5. Jango by NikaDevs

Best Drupal Theme: Jango by NikaDevs

NikaDevs is back on the list with another highly versatile theme. Like Progressive, Jango offers endless potential for web design. Unlike Progressive, however, Jango presents itself as a theme for more timeless web design.

You’ll see fewer colors and gimmicks in their marketing, which can work well to favor businesses that need a site that keeps things simple. As with all themes on this list, Jango is structured, responsive, and offers support for its users.

Jango is a straightforward Drupal theme in the sense that it performs as needed, with minimal flash and maximum impact.

Price: $59

Compatibility: Drupal 7, Drupal 8, Drupal Commerce, & Bootstrap 3.x

Best used for: Any business that wants a simple and effective website.

 

 

6. OWL by gavias

Best Drupal Theme: OWL by gavias

Cafe and restaurant owners can breathe easy knowing there’s a Drupal theme specially designed for them. OWL is a quick and easy to use theme that knows the needs and demands of the food industry.

Our favorite feature of OWL is its capacity to host high resolution, Retina-ready images that food photographers are sure to appreciate. Couple this with a simple interface and support for all kinds of modern programming tools (ex. Bootstrap, Custom CSS, SASS, etc.) and you have the perfect theme for restaurateurs.

Price: $48

Compatibility: Drupal 8, Bootstrap 3.x

Best used for: Hospitality. Restaurants and Cafes

 

 

7. Edmix by gavias

Best Drupal Theme: Edmix by gavias

Our final entry, and our third from developer gavias, is a theme that caters to educational sites and businesses that offer online courses or tutorials. Edmix appears to take the best design features from popular sites like Coursera and Udemy, giving its users the ability to shine a spotlight on their courses and videos.

As is the standard for gavias themes, the interface is easy to use and comes with assistance in the form of a support team and a library of tutorial videos.

Price: $56

Compatibility: Drupal 8, Drupal Commerce, & Bootstrap 3.x

Best used for: Online universities, masterclass services, etc.

This list covers what we consider the best Drupal themes for a business operating in 2019. As you’ll notice, the best themes are versatile, allowing a high degree of customization so businesses can present the best sides of themselves without sacrificing style.

If you’ve had luck with any of the themes above, or if you know of a theme that you feel deserves a place on the list, feel free to write to us or leave a comment below.

Likewise, if you need help using Drupal or deploying any of the themes we’ve listed, Varbase is a powerful website builder platform that empowers businesses to build fast, reliable, and search optimized websites in record time.

Mar 10 2019
Mar 10

Search is a key feature in web experience, and for a lot of people, it's the go-to method to find content. We use search countless times a day on our smartphones in various contexts. And yet, when we're building out websites, search is often an afterthought that we don't spend much time on. Search gets added to the laundry list of site features, like meta tags and social media links.

Drupal Core Search

Drupal is fantastic at managing content. It gives you loads of flexibility when it comes to building out your information architecture and categorizing content. But we often don't set aside a lot of time to build out a customized search UI to find that content. At the end of a project, you might just turn on the core search module and call it done. And then we find out that people use Google to search our website.

Drupal's core search functionality hasn't changed much in the last 10 years, and is lacking features that users expect. It can be slow, and it doesn't correct for misspellings or allow you to prioritize results. Search should make your content easy to find, and make your site more engaging for users. Over the years, we've worked on lots of websites that integrate with Solr, to provide an enterprise-level search engine on top of Drupal. But setting up Solr takes time, and can be tricky if you don't have the know-how to configure your server.

Why Cludo?

We recently decided to add search to evolvingweb.ca, and decided to try out Cludo. It's a "search as a service" tool that allows you to add a search interface to an existing site, kind of like you'd add Disqus or Google Analytics. It was pretty easy for our developers to set up Cludo. Aside from some challenges setting up search on the French-language side of our site, particularly searches with UTF-8 characters, the setup was straightforward and only took a few hours.

The immediate advantage is that you don't have a lot of setup time for a search that just works how users expect. But after it was all set up, I realized that there are a lot of extra features that you get that we wouldn't normally take the time to configure for a basic search:

  • Autocomplete - start typing the title of a node and it'll autocomplete
  • Customize the index - you can pick and choose what's searchable and what's not
  • Analytics - you can see who searches for what, giving non-technical users visibility on how users search for content
  • Boosting - you get nice defaults for results ordering, but you can also customize the criteria to prioritize certain types of pages or certain criteria
  • Machine learning - an add-on feature that does the boosting and changes the autocomplete ordering for you based on user behaviour
  • Easy-to-use interface - non-technical users can update all the settings through Cludo's UI

Cludo analytics interface

Before you ask, yes there's a module for that! The Cludo module was released a couple weeks ago. It's still in development, but you can try it out. You just have to add your Cludo account number and key, and it provides a search form block that you can place on the page.

Open Source vs. Paid Third-Party Service

So what's the catch? Cludo isn't a free service, it comes with a $200/month price tag for most websites. And it will cost more than that if your site has more than 20,000 pages or you want bells and whistles like document search, machine learning or searching private content. There are discounts for non-profits and educational organizations.

There's a trend towards using third-party services for everything from marketing automation tools to comments and now search. I know a lot of Drupal developers prefer to use open source tools as much as possible. I think the great thing about third-party tools is that it gives us another option. We can offer our clients a way to get a search interface up-and-running quickly, without a lot of up-front development time. It gives the end-user something that's easier to configure.

On the other hand, for a large website with a lot of content, we might want more control over the functionality and costs. And for an intranet, we might want more control over where the data is stored. If we have a lot of site installs, Cludo could start to become very pricey. In these cases, using Search API would be a better option. But for lots of use cases, when that "instant" quality is the priority, Cludo is a great option, to make sure your content is discoverable and that your users can find it.

Mar 10 2019
Mar 10

I often work on migrations that involve dates that need to be migrated into Drupal 8. While many times the dates are entity created and updated dates, and therefore in Unix timestamp format, sometimes (when migrating events, for example) I'll need to migrate a date in some other format into Drupal's date field.

In the past, I've ended up writing a custom process plugin or callback to convert the date into the proper format for Drupal core's date field, but I recently discovered the format_date process plugin (I'm pretty sure I'm late to the party on this) and realized I was doing more work than I had to. 

The short version is this - the format_date plugin takes a date in any format (as long as it can be described using the standard PHP Date patterns) and converts it to the format Drupal requires. Oh, and it also has timezone support!

Here's an example of taking a datetime in the format "11/04/18 08:30:00" (America/New_York) and converting it to the "Y-m-d\TH:i:s" format that Drupal's core date field requires - in this case, "2018-11-04T13:30:00" (UTC).

field_eventdate/value:
  plugin: format_date
  from_format: 'm/d/y H:i:s'
  to_format: 'Y-m-d\TH:i:s'
  from_timezone: 'America/New_York'
  to_timezone: 'UTC'

It's pretty simple! Less custom code usually means fewer opportunities for mistakes and less code to maintain in the future!

Mar 10 2019
Mar 10

One of the pillars of the consulting side of the work we do here at DrupalEasy is data migration into Drupal sites. Over the past few years, we've been focused on migrating data into Drupal 8 using the most excellent core migrate modules along with contrib modules like Migrate Tools, Migrate Plus, and Migrate Source CSV

If there's one thing I've learned over the years, it's that data to be migrated is never, ever, ever, ever, never as clean as it is claimed to be. There's always something that has to be massaged during the migration process in order to get things working.

One of the most common things I see when migrating data from a .csv is strings with trailing spaces. If you take a cursory look at the data in a spreadsheet, you might see something like "Bread", but if you look at the same data in a text .csv file, you'll see that the string is actually "Bread " (trailing space). If you're migrating this field to a vocabulary using the entity_lookup process plugin, that trailing space will cause the term to not be found, and therefore not migrated.

The solution? Well, you could clean up the data, but there's actually a much easier solution that I use by default on almost all string data being migrated: I use the "callback" plugin to call the PHP trim() function on incoming strings in the "process" section of the migration configuration. Here's an example:

field_topics:
  -
    plugin: callback
    callable: trim
    source: Topic
  -
    plugin: entity_lookup
    entity_type: taxonomy_term
    bundle: topics
    bundle_key: vid
    value_key: name
    ignore_case: true

Using this method allows for the incoming data to be a little dirty without affecting the migration.

Mar 10 2019
Mar 10

Code flows up, data flows down.

I repeat this phrase in just about every workshop I teach - it is one of the basic principles of being a professional web developer. The idea is that we should be working locally, then pushing our changes (using Git) up to the project's dev, then QA, then live environments. As for the project's data (database and files directory for Drupal sites), the direction is the opposite: we should only be moving data "down" - from live to QA, live to dev, or live to local.

There are, of course, exceptions to every rule, and certainly in this case as well.

One exception is when the project is just getting started. Consider the example where you've started a new project on your local, you've reached the first milestone of development and are ready to move everything to a shared development environment where the client can catch their first glimpse of the project. In this case, you'll likely be moving everything "up" - code, database, and files. 

I had this exact scenario recently: I was migrating a rather large site to Drupal, had the initial migration looking good, and was in the process of getting it up-and-running on Pantheon. I successfully pushed the code and SFTPd the 1.6GB files directory to the Pantheon dev environment. The database was a bit larger than the 100MB maximum Pantheon allows to be uploaded through the browser, so I was using their "URL" method.

Pantheon database import interface

My plan was to put the database dump in a public Dropbox folder, then copy/paste the URL of the file in the Pantheon Dashboard interface. Unfortunately, it didn't work. I tried both .sql and .sql.gz formats, I tried doing the database import using Terminus (Pantheon's command-line interface) - each time I was provided with either no error message, or one that wasn't very helpful.

The solution? It turns out to be a bit of a Dropbox issue, albeit one that is pretty easy to fix.

When copying/pasting the URL for a public file on Dropbox, the URL ends in dl=0 - it turns out this prevents Pantheon from being able to import the file. Simply change it to dl=1 and the problem is solved (this works in both the Dashboard and Terminus)!
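If you script this step (handy when juggling several database dumps), the tweak is a one-liner. Here's a small JavaScript sketch; the function name is my own:

```javascript
// Convert a Dropbox share link into a direct-download link that Pantheon can fetch.
// The URL class is available globally in modern Node and browsers.
function directDownloadUrl(shareUrl) {
  const url = new URL(shareUrl);
  // Dropbox uses dl=0 for the preview page and dl=1 for the raw file.
  url.searchParams.set('dl', '1');
  return url.toString();
}

console.log(directDownloadUrl('https://www.dropbox.com/s/abc123/site.sql.gz?dl=0'));
// → https://www.dropbox.com/s/abc123/site.sql.gz?dl=1
```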

Mar 09 2019
Mar 09

If I said that a decade ago the web wasn't the same as it is today, would you agree?

Yes, you might, or you might not.

But when we examine the statistics, one conclusion can be made: the web is changing all the time, and testing has changed along with it.

Testing is one of the critical processes in application development. The success or failure of an application depends heavily on it.

Cypress is one framework that helps you conquer any type of web testing, and it is a strong option for your website.

You might be wondering why, out of all the testing tools on the market, I have singled out Cypress.

Well, let’s find out why. 

Why Cypress?

Cypress is a JavaScript-based end-to-end testing framework that does not use Selenium at all.

Now, what is Selenium?

Well, Selenium automates browsers. What the user does with that power is entirely up to them. Primarily, it is used for automating web applications for testing purposes, and it is the core technology in countless other browser automation tools, APIs, and frameworks.

So, coming back to Cypress: the tool is modular and integrated, can be deployed swiftly, and requires little or no user training.

Cypress comes with many handy advantages that make it an easy choice:

  • Automatic waiting: Cypress automatically waits for the DOM (document object model) to load, for elements to become visible, for animations to complete, and much more.

  • Real-time reloads: Cypress knows that after saving a test file you are going to run it again, so it automatically triggers a run in the browser as soon as you save the file.

  • Debuggability: The framework lets you debug the app under test directly from Chrome DevTools. It presents straightforward error messages and recommends how to approach them.

  • Architecture: Many testing tools run outside the browser and execute remote commands across the network; Cypress is the exact opposite. It executes in the same run loop as the application.

  • Works on the network layer: Cypress operates at the network layer by reading and changing web traffic. This lets it change everything coming in and out of the browser, and even change code that might interfere with its ability to automate the browser.

  • A new kind of testing: Cypress has ultimate control over the application and the network traffic, plus native access to every host object, which unlocks a way of testing that has never been possible before.

How is Cypress different from Selenium?

  • Installation: Cypress needs no configuration; all dependencies and drivers are installed automatically. Selenium requires installing language bindings and configuring drivers.

  • Browser support: Cypress only supports Chrome, while Selenium lets you run tests against any browser.

  • Architecture: Cypress runs inside the browser and executes in the same run loop as the application. Selenium runs outside the browser and executes remote commands across the network.

  • Speed: Cypress test code runs alongside application code, which makes tests extremely fast. Selenium automation scripts are slow by comparison.

  • Waiting for elements: Cypress runs in the browser and knows what is happening, so you don't have to add explicit waits. In Selenium, waiting effectively for elements is an important task in itself.

  • Documentation: The Cypress team has invested a lot of time in documentation, so it is seamless and complete. Selenium's documentation is less complete and harder to understand.

Limitations and challenges faced in Cypress 

While Cypress does a really great job of giving developers and QA engineers the thing they want in an automation tool, it does have some limitations.

  • Since its structure is very different from Selenium-style end-to-end tools, you first need to understand that structure and then find the best way to create your scripts.

  • As the framework is comparatively new, the community is small, and it can be really challenging to find answers to problems.

  • Cypress does not support file uploads, and it does not support cross-browser testing either. Nobody knows when these gaps will be covered, and for big projects these features are really important.

  • Cypress departs from the Page Object Model approach, which has been proven over time.

  • The entire framework is available in only one language, JavaScript, so you need to know JavaScript to work with it.

Can end to end testing deliver quality benefits?

Yes, end-to-end testing is really important: it helps ensure the accurate functioning of the application by testing it at every layer, right from the front end. Several other benefits of choosing and performing end-to-end testing:

  • It ensures the complete correctness and health of an application: In end-to-end testing, the application is tested and validated at all layers (data, business, integration, and presentation), which guarantees the well-being of the application.

  • It increases the reliability of an application: End-to-end testing increases the reliability and performance of an application before its release, as the application is tested across different endpoints and from different devices.

  • It decreases future risks: Because the application goes through rigorous testing in every iteration and sprint, there are fewer chances of risks and failures down the road.

  • It decreases repetitive effort: The application is tested thoroughly, so there is no looking back; testing reduces the chances of frequent breakdowns and of repeated testing effort.

End to end testing with Drupal

Cypress makes it easy to add new tests to the website as you iterate on the code. Here are a few concepts that can help you with your Drupal website. Let's start with:

Setting up 

Start from a standard Drupal 8 installation (the Standard installation profile works fine) and enable the JSON API module. Drupal 8 ships with RESTful Web Services, which serve many purposes and facilitate things such as querying nodes by field.

There are a few options for installing Cypress; a preferred one is through an npm package.json file. The first step is to create the file in the root of the project. Once the file is in place, install Cypress by running npm i from the project root.
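As a sketch, a minimal package.json for this setup might look like the following (the package name, version constraint, and script names are illustrative; cypress open and cypress run are Cypress's own CLI commands):

```json
{
  "name": "drupal-cypress-tests",
  "version": "1.0.0",
  "devDependencies": {
    "cypress": "^3.1.0"
  },
  "scripts": {
    "cy:open": "cypress open",
    "cy:run": "cypress run"
  }
}
```

With this in place, npm i pulls Cypress in, and npm run cy:open launches the interactive runner.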

The first Test 

After installing Cypress via npm, it is time to check that it is working properly.

The test does two things:

  • It visits the website's root address (as configured by an npm script)
     
  • It verifies that the page has an element with “Cypress Testing” in it.
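A first spec along those lines might look like this sketch (the file path follows Cypress's default cypress/integration/ layout; the base URL is assumed to be configured in cypress.json or via the npm script):

```javascript
// cypress/integration/smoke.spec.js
describe('Smoke test', () => {
  it('loads the front page', () => {
    // Visit the site's root address (resolved against the configured baseUrl)
    cy.visit('/');
    // Fail unless some element on the page contains this text
    cy.contains('Cypress Testing');
  });
});
```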

Creating the account 

The next step is to create user accounts. Depending on the environment, some options are more feasible than others. To create Drupal entities in the tests, you need access to an administrator account. You can create the account manually in the database and pass its credentials to Cypress through environment variables, or you can let Cypress create the account every time it runs the tests, which reduces the chances of issues during the procedure.

Cypress's cy.exec() command gives tests access to system commands, and in particular to Drush. Define the test user's credentials in an object whose key-value pairs are passed to the tests as environment variables: add a username and password, and those credentials can then be used to create the admin user account.
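A sketch of that setup follows. The TEST_USERNAME / TEST_PASSWORD variable names are my own, and I'm assuming Drush 9's user:create and user:role:add commands; cy.exec() runs shell commands on the machine running Cypress:

```javascript
// cypress/support/index.js (sketch)
// Create the admin test account before the suite runs.
before(() => {
  const name = Cypress.env('TEST_USERNAME');
  const pass = Cypress.env('TEST_PASSWORD');
  // Drush must be available on the same machine as the Cypress process.
  cy.exec(`drush user:create ${name} --password="${pass}"`, {
    failOnNonZeroExit: false, // tolerate "account already exists"
  });
  cy.exec(`drush user:role:add administrator ${name}`, {
    failOnNonZeroExit: false,
  });
});
```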

Logging in 

To test anything restricted to authenticated users, it is important to log in first. The most obvious way to do this is the same way a user would: through the UI. In fact, you should ensure that logging in through the UI is possible.

After each test, Cypress leaves the browser in the state it was in when the test finished running. This is useful because it leaves you in a great position to discover the next steps. In this particular case, Cypress will leave the browser with the admin user logged in.

To keep tests independent from each other, Cypress removes the browser cookies before each test runs. This prevents side effects between tests, but it also means you need to log in each time a test that requires authentication runs.

Now we need to write the login code. We could reuse the log-in-via-UI test code, but if the same code has to run before every test, there wouldn't be much point in having that test to begin with. More importantly, logging in through the UI is slow: if you have to log in before every test, a lot of time is wasted. Drupal logs a user in simply by posting form data to the login URL, and our tests can do the same.
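Since Drupal's login form posts name, pass, and form_id to /user/login, a custom command can log in without touching the UI. A sketch (the cy.login command name is my own):

```javascript
// cypress/support/commands.js (sketch)
Cypress.Commands.add('login', (name, pass) => {
  cy.request({
    method: 'POST',
    url: '/user/login',
    form: true, // send as application/x-www-form-urlencoded
    body: {
      name,
      pass,
      form_id: 'user_login_form', // the id Drupal's login form posts
      op: 'Log in',
    },
  });
});
```

A test that needs authentication can then call cy.login(Cypress.env('TEST_USERNAME'), Cypress.env('TEST_PASSWORD')) in a beforeEach hook instead of driving the login page.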

Seed the data 

It is important to look at how JSON API is used to seed the data under test, and to understand how the API authenticates requests. By default (for unsafe, non-read requests), JSON API and the standard REST module require a token to be presented in a request header. The token can then be used to create and delete data by posting to the endpoints exposed by the JSON API module.

Note that Cypress provides an after hook. It is tempting to delete the test nodes in the after hook since, at that point, you have access to the test node's id and could delete the test content without having to query by title.

However, this approach can be troublesome if the test runner quits or refreshes before the after block runs. In that case, the test content would never get cleaned up, since you wouldn't have access to the node's id in future test runs. Once the test articles are seeded, the "displays published articles" test can visit the node's page and confirm that the fields are displayed.
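Putting that together, seeding a test article might look like this sketch (the article field values and text format are illustrative; fetching the token from /session/token and sending it as X-CSRF-Token is Drupal 8's standard CSRF flow):

```javascript
// Seed a test article over JSON API before each test.
beforeEach(() => {
  cy.request('/session/token').then(({ body: token }) => {
    cy.request({
      method: 'POST',
      url: '/jsonapi/node/article',
      headers: {
        'X-CSRF-Token': token,
        'Content-Type': 'application/vnd.api+json',
        Accept: 'application/vnd.api+json',
      },
      body: {
        data: {
          type: 'node--article',
          attributes: {
            title: 'Cypress test article',
            body: { value: 'Seeded by Cypress.', format: 'basic_html' },
          },
        },
      },
    }).then(({ body }) => {
      // Alias the new node's id so the test (and any cleanup) can use it
      cy.wrap(body.data.id).as('articleId');
    });
  });
});
```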
Debugging using DevTools

Cypress has grown into an excellent test runner that helps you understand what is happening in your application and in your tests, but there is simply no substitute for all the amazing work browsers have done on their built-in developer tools.

Your Cypress test code runs in the same run loop as your application. This means you have access to the code running on the page, as well as the things the browser makes available to you, like document, window, and, of course, debugger.
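For example, a plain debugger statement inside a .then() callback drops you straight into DevTools while the test is paused (a sketch; the page and selector are illustrative):

```javascript
it('pauses for inspection', () => {
  cy.visit('/node/1');
  cy.get('h1').then(($title) => {
    // With DevTools open, execution stops here and $title is inspectable
    debugger;
    expect($title.text()).to.not.be.empty;
  });
});
```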

Running Cypress in continuous integration

If you want automated testing and continuous integration to work together, you need some sort of CI/CD server. These are hosted servers, and implementing them with Drupal 8 requires the tools to work together.

It is important to note that developers must ensure all tests pass on their local workstation. The Drupal configuration is exported so that the CI system can spin up a fresh installation from it.

Conclusion

End-to-end testing shouldn't be hard. Cypress makes integration testing pleasant and enjoyable. You can write end-to-end tests without worrying about browsers, Selenium, and other scary stuff.

The framework is nice enough that planning to use nothing but Cypress for integration testing would be fruitful. Plus, the documentation is pure gold: the Cypress docs are filled with best practices and examples.

At OpenSense Labs, we have quality Drupal experts who enable digital transformation for enterprises with our services and assistance. Contact us now at [email protected]

Mar 08 2019
Mar 08

Look out for Hook 42 at DrupalCon 2019 in Seattle!

It’s that time again, another DrupalCon is fast approaching and our team couldn’t be more excited for this year’s Seattle event. We’ve got a lot in store for you this year, from presentations, BOFs, sponsorships, partnership collaborations, and using our listening ears. You’ll find our team distributed all about.

We’re bringing a stacked line-up of knowledge and experiences to drop on 'ya this year. Not only that, we’re looking forward to hearing all the ups and downs you’ve had this past year, and how we’re all growing together within the Drupal community. 

Let’s get to sharing!

Summits

Join us at the Performance and Scalability Summit on Monday. Hook 42 and Tandem have worked with the Drupal Association to line up leading speakers to cover scaling performant Drupal websites to scaling efficient Drupal development teams. The Performance and Scalability Summit is the place for developers, DevOps professionals, decision makers, and other technical stakeholders to plan for growth.

Talks

To hear from our experts, find us at one of our talks where we’ll review insights our team has experienced first-hand and how we’re adapting to the new and old needs of evolving technology.

Database Query Optimization in Drupal

Kristen talks all things database, and walks through how to put the sluggish operations behind you to optimize your Drupal environment. Covering the basic optimization steps isn’t always enough, exploring options past the basics will help streamline your Drupal environment for both front-end and back-end users.

Accessibility Deep Dive Workshop

Aimee is teaming up with Caroline Boyden, from UC Berkeley, to take a closer look at accessibility. Together, they’ll use real-world examples of how accessibility is best implemented, and how every member of your team can be part of accessibility, from designers to developers to content authors and everything in between.

Which Accessibility Tools are Right for You?

Aimee breaks down the best tools for implementing and testing accessibility, and the benefits and areas of improvement for these tools. Taking a look at what is available today, and how your team can take advantage of these tools to increase your website’s accessibility.

Drupal 8 Case Study – Stanford Cantor Arts Center Redesign

Kristen and Ryan join forces to walk through one of Hook 42’s latest client projects. Focusing on the implementation of good design and development, and how Drupal was the perfect place to house the dynamic technical needs of the team at Stanford.

A Survey of Emerging Technologies That Complement Drupal

Adam and Ryan are teaming together to talk about the latest tech to take advantage of Drupal’s flexibility and ability to interact well with modern advancements on the web.

Considerations of Federated Search and Drupal

Adam shines a light on federated search, its importance, and how you can implement these applications within Drupal, taking a close look at how to pair the application with Drupal to minimize risk and increase cross-platform communication.

Sponsors

Women in Drupal Luncheon

We’re proud to sponsor the Women in Drupal Luncheon, fostering inclusivity and empowerment within the Drupal community. We’ll gather to enjoy delicious food, talk shop, and relax amongst professionals.

BOF

Simplytest.me

VP of Engineering Adam Bergstein maintains the SimplyTest.me service leveraged by the Drupal community for testing and prototyping community contributions. Join him for a BOF on Wednesday! We want to chat with you about all things SimplyTest.me, our go-to browser testing tool for all our Drupal projects.

Adam is also sorting out the Drupal Coffee Exchange meetup. 

Partner

Lingotek Session: Avoiding Trouble Spots When Creating a Multilingual Site

Our partner, Lingotek, has organized a session revolving around the challenges with multilingual websites. Through the session you’ll learn about common challenges with multilingual websites, and how you can get ahead of those issues. We’ll be there to support Lingotek and help answer questions during the session.

We Hope To See You There

If you haven’t already registered for DrupalCon Seattle, get on it! We’re ready for yet another amazing DrupalCon adventure, and we’re looking forward to seeing old and new faces in the crowd.

Mar 07 2019
Mar 07

This year is the sixth annual Midwest Drupal Camp (aka MidCamp). Palantir is excited to sponsor this year’s event and also have multiple Palantiri presenting sessions!

Palantir Sessions and Events

Community Working Group Update and Q&A by George DeMet

The mission of the Drupal Community Working Group (CWG) is to uphold the Drupal Code of Conduct and maintain a friendly and welcoming community for the Drupal project. In this session, CWG members George DeMet (gdemet) and Michael Anello (ultimike) will provide an update on some of the CWG's recent activities and what the group is working on in 2019, as well as answer audience questions.

  • Thursday @ 2:50pm
  • Room 314A


Federated Search with Drupal, SOLR, and React (AKA the Decoupled Ouroboros) by Matt Carmichael and Dan Montgomery

Our session will begin with a tour through a recent project developed by Palantir.net for the University of Michigan — bringing content from disparate sites (D7, D8, Wordpress) into a single index and then serving results out in a consistent manner, allowing users to search across all included properties. We’ll discuss how we got started with React, our process for hooking up to SOLR, and how we used Drupal to tie the whole thing together.

  • Friday @ 9am
  • Room 324


Overcoming Imposter Syndrome: How Weightlifting Helped Me Accept My Place in Tech by Kristen Mayer

Weightlifting and tech. On the surface, these two things may not seem to have much in common, but as a woman trying to navigate both of these male-dominated spheres, I’ve often been intimidated and doubted whether I really belonged. In this session, I’ll look at the strategies that helped me overcome imposter syndrome in the gym, and my journey of applying them to my professional life. I hope that anyone attending this session will walk away feeling empowered about their position and skills within the tech community!

  • Thursday @ 3:40pm
  • Room 312


Understanding Migration Development in Drupal 8: Strategies and Tools to See What's Happening by Dan Montgomery

Migrations in Drupal can be challenging for developers because the tools and strategies to get started and peer behind the curtain are different than those used in most backend development. This is an intermediate topic intended for developers who have a basic understanding of Drupal 8 concepts including plugins and the way entities and fields are used in Drupal to manage content.

  • Thursday @ 11:40am
  • Room 314B


Game Night!

Head to the second floor for a fun night of board games, camaraderie and conversation. Camp registration is required to attend this event.

  • Thursday from 6-9pm
  • 2nd Level


We'll see you there!

About Drupal Sun

Drupal Sun is an Evolving Web project. It allows you to:

  • Do full-text search on all the articles in Drupal Planet (thanks to Apache Solr)
  • Facet based on tags, author, or feed
  • Flip through articles quickly (with j/k or arrow keys) to find what you're interested in
  • View the entire article text inline, or in the context of the site where it was created

See the blog post at Evolving Web
