Jan 15 2016

This is the first time that I am working with automated tests. I have written tests before, and of course I believe that tests improve projects dramatically, but I have never had (/made) the time in a project to actually set them up to run automatically. So I was quite excited to get started with Travis.

First things first, what do you test!?

I was terribly excited to start working with tests, but yesterday we didn't actually have any code to test against yet. So instead we decided to run the Drupal core tests: we want our module to be enabled, and to know if any of our code causes a Drupal core regression. I started with that.

Trial and Error (a lot)...

Travis build history logs showing eventually passing tests.

The documentation is pretty good and there are LOADS of example .travis.yml files floating around, so I started by just getting an active environment build working, focusing on getting the PHPUnit tests running first. I only wanted a couple of tests to run; the main thing I cared about was that Drupal installed properly and that the environment was up and running. Eventually I got a passing build with the following .travis.yml:
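That file is no longer embedded in this post, but it looked something along these lines. This is a sketch rather than the exact original: the PHP version, module name, and Drush-based install steps are illustrative.

```yaml
language: php
php:
  - 5.6

mysql:
  database: drupal
  username: root
  encoding: utf8

before_script:
  # Install Drush and fetch Drupal core.
  - composer global require drush/drush:~8
  - export PATH="$HOME/.composer/vendor/bin:$PATH"
  - drush dl drupal-8 --drupal-project-rename=drupal
  # Link the module being built into the codebase and install Drupal.
  - ln -s $TRAVIS_BUILD_DIR drupal/modules/decoupled_auth
  - cd drupal && drush site-install standard --db-url=mysql://root:@127.0.0.1/drupal -y

script:
  # Run only this module's PHPUnit tests.
  - cd core && ../vendor/bin/phpunit --group decoupled_auth
```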

While I was really quite happy with this, I wanted more: Simpletests at the very least, and maybe even Behat tests. I found a lot of inspiration looking at Bart Feenstra's (Xano on d.o) Currency module: https://github.com/bartfeenstra/drupal-currency/blob/8.x-3.x/.travis.yml
Most importantly, I found Drupal TI - https://github.com/LionsAd/drupal_ti

Drupal TI

This project makes it almost trivial to integrate Drupal projects with Travis. It handles a lot of the setup in terms of MySQL, downloading Drupal, and how to start running tests. It managed to reduce my .travis.yml to the following:
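The reduced file is not embedded here any more, but based on the drupal_ti README of the time it was along these lines (the module name and runner choices are this project's; double-check the variable and runner names against the current README):

```yaml
language: php
php:
  - 5.6

env:
  global:
    # Tell drupal_ti which module and Simpletest group it is testing.
    - DRUPAL_TI_MODULE_NAME="decoupled_auth"
    - DRUPAL_TI_SIMPLETEST_GROUP="decoupled_auth"
  matrix:
    # One build per test runner.
    - DRUPAL_TI_RUNNERS="phpunit-core"
    - DRUPAL_TI_RUNNERS="simpletest"

mysql:
  database: drupal_travis_db
  username: root
  encoding: utf8

before_install:
  - composer self-update
  - composer global require "lionsad/drupal_ti:1.*"
  - drupal-ti before_install

install:
  - drupal-ti install

before_script:
  - drupal-ti before_script

script:
  - drupal-ti script

after_script:
  - drupal-ti after_script
```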

This is quite a bit of code, but it is really clean and well commented, so we can clearly see what is going on.

When to test?

So now there are tests... great! But when do we run them? Running all of Drupal's tests can take quite some time, but we do want a check during development to catch things. In short, we need full testing on pull requests between individual development repos (e.g. https://github.com/yanniboi/decoupled_auth) and the main project repo (https://github.com/FreelyGive/decoupled_auth), but while we are doing our development we only really care about project-specific tests.

Seeing tests on GitHub

When I create a pull request now, the tests run automatically and the results appear on the pull request:

GitHub pull request with Travis test results.

GitHub pull request test results in Travis CI.

Eventually, what we really want is a way of checking inside our .travis.yml whether the test run was started by a pull request or by a code push/merge, and to run different parameters depending on that. But more about that next time; there will be a blog post on that soon...
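The rough shape of that check can be sketched as a small shell helper. TRAVIS_PULL_REQUEST is a real Travis variable (it holds the pull request number, or the string "false" for a plain push/merge); the returned runner arguments here are made-up placeholders.

```shell
# Decide a test scope from how Travis started the build.
# $1 is the value of TRAVIS_PULL_REQUEST: a PR number, or "false"
# for a plain push/merge. The returned strings are placeholders for
# whatever arguments your test runner actually takes.
scope_for() {
  if [ "$1" = "false" ]; then
    # Push during development: project-specific tests only.
    echo "--group decoupled_auth"
  else
    # Pull request against the main repo: run the full core suite too.
    echo "--all"
  fi
}

echo "Push build scope: $(scope_for false)"
echo "PR build scope:   $(scope_for 123)"
```

Inside a .travis.yml script section you would call it as `scope_for "$TRAVIS_PULL_REQUEST"`.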

In the meantime, use Travis and write tests :)

Jan 13 2016

Developer experience is our primary concern with this Drupal 8 version of our CRM.

We thought we could improve the experience of helping developers contribute to the project. We noticed that for Drupal 8 all the cool kids were moving their development to GitHub - projects such as Drupal Commerce, and even core bits of Drupal.

So we did some investigating and decided to join them. We thought it would be helpful to share a couple of our thoughts and reasons; we are by no means authorities on this!

Getting Started

Being able to work with GitHub is really nice. Someone can come along and easily fork the main repository - possible on Drupal.org, but much easier on GitHub.

No Special Users

We have a principle that “no individual is special”. On drupal.org the module maintainers get access to more tools than everyone else. On GitHub everyone is basically the same; in theory, someone’s fork may become a bigger deal than the original. This means everyone has the same tools, so the things we do to make our developers’ lives easier are shared by everyone else.

We found that when some developers were maintainers with access to drupal.org’s git, they had a much nicer experience than the people who had to just download the source code or set up their own git mirrors.

Pull Requests

Pull requests are really nice. We think pull requests are pretty much a nicer way of doing patches: you can just click a few buttons and copy and paste a link into the issue queue. With Dreditor patches are not a big deal, but GitHub keeps track of minor changes to a patch much more effectively, especially if multiple people are working on it.

  • Although it does require giving others access to my fork of a project, so we have found that sometimes patches are easier.
  • Although if multiple people are working on a pull request, they can fork the pull request owner’s repository and open a pull request against that first!

Drupal.org

We definitely still use Drupal.org as the issue queue and turn off all of GitHub’s issue-tracking features. We then reference issue numbers in as many commits as possible, and certainly in all pull requests (we post pull requests in their issue).

One of the committers can, every so often, push the “main repository” (or any repository) to the git repo on drupal.org.

TravisCI

We also use Travis CI to handle tests, and will follow up with a more detailed post about how we handle testing.

Jan 13 2016

Short history of Tests in Drupal

In Drupal 7 it was fairly straightforward what kind of test you should write, especially for core: it would basically be a Simpletest. If you wrote in contrib, you might write a few Behat or PHPUnit tests for your module, but you would run these tests either locally or on some remote testing server such as Jenkins or Travis.

While this was quite good for doing web tests along the lines of ‘create user’, ‘log in’, ‘go to page x’, ‘click on button y’, it was also pretty slow. Every patch in the issue queue could be delayed as much as 3 hours while all the tests ran, and during sprint weeks we ended up with huge queues of patches waiting to be tested.

Drupal 8 has really stepped up its game. In 2013 it separated testing into two categories: one for web tests that require a complete or partial Drupal environment - Simpletest - and another for everything else - PHPUnit :) PHPUnit Change Notice

It didn’t take long for there to be several different base classes for PHPUnit tests in Drupal:
Unit Tests
Kernel Tests
Functional Tests

What now?

So now the question is, “Which test should I be writing?” Here is a quick explanation of the different test base classes.

Drupal Test Structure

- TestCase (PHPUnit_Framework_TestCase)
  - \Drupal\KernelTests\KernelTestBase
  - \Drupal\simpletest\BrowserTestBase
  - \Drupal\Tests\UnitTestCase
- \Drupal\simpletest\TestBase
  - \Drupal\simpletest\KernelTestBase
  - \Drupal\simpletest\WebTestBase
    - \Drupal\simpletest\InstallerTestBase

TestCase and TestBase should never be extended directly. They just provide the structure that the remaining base classes extend.

UnitTestCase is what you want if you don’t care about bootstrapping Drupal. All you want to do is test class methods and make sure that they behave as expected. E.g. if your method requires the first argument to be a string and you pass it an array, you get an exception, etc.
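A minimal sketch of what that looks like in practice (the module, Calculator class, and test are made up for illustration; setExpectedException() is the PHPUnit 4 API that Drupal 8.0 shipped with):

```php
<?php

namespace Drupal\Tests\mymodule\Unit;

use Drupal\Tests\UnitTestCase;

/**
 * Tests the (hypothetical) Calculator class without bootstrapping Drupal.
 *
 * @group mymodule
 */
class CalculatorTest extends UnitTestCase {

  /**
   * Passing a non-numeric first argument should throw an exception.
   */
  public function testAddRejectsNonNumbers() {
    $calculator = new \Drupal\mymodule\Calculator();
    $this->setExpectedException('InvalidArgumentException');
    $calculator->add('one', 2);
  }

}
```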

Next up we have KernelTestBase, WebTestBase and BrowserTestBase. These all do something towards setting up a Drupal environment and allowing you to test your code in the wild. KernelTestBase is the simplest: it bootstraps Drupal and sets up a database, etc., but it doesn’t explicitly install any modules, so it is similar to the environment in the early installer. Any modules that you require are loaded in setUp(), and you can perform schema installation as required.

Notice: there are two KernelTestBase classes. \Drupal\simpletest\KernelTestBase has been deprecated in favour of \Drupal\KernelTests\KernelTestBase as part of Drupal’s continuing battle to modernise its testing, and has moved from Simpletest to PHPUnit. KernelTestBase Change Notice

Then we get to WebTestBase and BrowserTestBase. They do basically the same thing: they allow a full installation of Drupal and modules, and they are web tests, so you test the UI through a browser. The general rule of thumb is to use WebTestBase. BrowserTestBase is the newest move to modernise testing - an attempt to move browser testing from Simpletest to PHPUnit and Mink (BrowserTestBase Change Notice). So if you fancy it, you can give the new framework a go, but it is still a work in progress and not going to be used extensively in core until 8.1.x.

And finally there is InstallerTestBase, which is useful for testing the installation of Drupal based on changes in configuration.

Apr 28 2014

Sometimes you want to license files without people needing to purchase them. Even using coupon codes to make products free still requires them to be purchased through the Commerce Checkout system.

This is fine for physical products where you still want email and address details of potential future clients.

However, when it comes to files, users require an account to access their files, so chances are you already have all their details. And there is no shipping required, so why make them go through the checkout process just to get a license for a free file? (Seriously, if you have reasons, comment!)

Here is a snippet of how to generate a file license for a user:
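The snippet itself is no longer embedded in this post, so here is a rough sketch of the idea instead. entity_create() and entity_save() are standard Drupal 7 Entity API calls, but the entity type name, properties, and status constant are assumptions based on the commerce_license module - check its API before copying.

```php
<?php
/**
 * Grants a file license to a user without going through checkout.
 *
 * Sketch only: the 'commerce_license' properties and the
 * COMMERCE_LICENSE_ACTIVE constant are taken from the Drupal 7
 * commerce_license module and may not match your setup.
 */
function mymodule_grant_file_license($uid, $product_id) {
  $license = entity_create('commerce_license', array(
    'type' => 'file',
    'uid' => $uid,
    'product_id' => $product_id,
    // Activate immediately instead of waiting for a completed order.
    'status' => COMMERCE_LICENSE_ACTIVE,
  ));
  entity_save('commerce_license', $license);
  return $license;
}
```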

Unrelated

Grammar Lesson:

Today I learnt the difference between 'license' and 'licence'. Unless you are American (in which case just ignore the existence of 'licence'), read this.

Apr 15 2014

Super Site Deployment with ctools exportable revert snippets

Sometimes when you are deploying new code to a production site you want to update views, panels, etc. with new code exports, but for one reason or another the defaults are overridden by the database.

Well, with the following scripts you can stop worrying about that and just have an update hook take care of reverting (or deleting) the overriding database entries.
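The scripts themselves are no longer embedded here, but the idea can be sketched with the standard CTools Export API (the update hook name and the table list are illustrative; 'views_view' is Views' exportable table):

```php
<?php
/**
 * Reverts overridden exportables back to their code defaults.
 */
function mymodule_update_7001() {
  ctools_include('export');
  // Exportable tables to revert; add your own here.
  $tables = array('views_view');
  foreach ($tables as $table) {
    foreach (ctools_export_crud_load_all($table) as $object) {
      // An exportable that lives both in code and in the database is
      // "Overridden"; deleting the database copy reverts it to code.
      if (($object->export_type & EXPORT_IN_CODE) && ($object->export_type & EXPORT_IN_DATABASE)) {
        ctools_export_crud_delete($table, $object);
      }
    }
  }
}
```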

Improvements appreciated and feel free to comment!

Apr 14 2014

Alias Directory

You can place the aliases file either in the 'sites/all/drush' directory or in your global drush folder (e.g. /home/username/.drush). The naming convention is 'group.aliases.drushrc.php', so I normally use 'project.aliases.drushrc.php' or 'client.aliases.drushrc.php' to group related sites.

Dev (/local)

Create an alias array defining your local development site:

$aliases['dev'] = array(
  'uri' => 'sitename.dev',      // The URI as configured in your Apache hosts file
  'root' => '/path/to/web/root',
  'path-aliases' => array(
    '%files' => 'sites/default/files',
   ),
);

You can now (if you placed the alias file in your global drush directory) use drush from any directory, using:

drush @project.dev status

or

drush @project.dev cc all

Did you say any directory?!

Yep! Since you have defined your webroot in the global drush aliases file, you don't have to be in your webroot when running drush - and really, you don't even have to be on the same server...

Production (/remote)

To get the alias details for a remote machine, the easiest place to start would be to just ssh into it and run:

drush sa @self --with-db --show-passwords --with-optional

The result looks like this:

$aliases['self'] = array (
  'root' => '/path/to/drupal/root',
  'uri' => 'http://default',
  'path-aliases' => array(
    '%drush' => '/path/to/drush',
    '%site' => 'sites/default/', 
  ),
  'databases' => array(
    'default' => array(
      'default' => array(
        'database' => 'site_db',
        'username' => 'site_user',
        'password' => 'site_pass',
        'host' => 'localhost',
        'port' => '',
        'driver' => 'mysql',
        'prefix' => '',
      ),
    ),
  ),
);

You can just copy this directly into your local drush alias file and add the remote details like this:

$aliases['live'] = array (
...
  'uri' => 'http://mysite.com',
  'remote-host' => 'ip.or.domain',
  'remote-user' => 'ssh_user',
...
  'path-aliases' => array(
    '%files' => 'sites/default/files',
...

The result allows you to run drush commands locally and have them acting on a remote site.

Jiminy Cricket!

If you have a separate port for MySQL:

  'remote-port' => 3201,

If you can't use an SSH key:

  'ssh-options' => '-o PasswordAuthentication=yes',

Syncing files

You can sync the files directory between sites:

drush rsync -y @project.live:%files @project.dev:sites/default

or

drush -r /path/to/web/root rsync -y @project.live:%files @self:sites/default

This post is mainly snippets and tips to help me remember drush alias tools in my day-to-day work.

Nov 01 2013

Disclaimer: I have much more experience USING Solr with Drupal than setting up a Solr service so please use the comments to correct me.

Having Solr index your content has loads of benefits, but best of all, in my humble opinion, is the beauty of faceted, filtered search. Of course you could do it with database indexes, but the performance win of using Solr is very noticeable.

Prerequisites:

  • A working LAMP stack.
    • I am using Ubuntu 12.04.
  • Java version 1.6 or higher.
    • Use 'java -version' to check.

Download Solr

Download a copy of the Solr tarball to your computer and extract it to a directory of your choice. Make sure you have root privileges; otherwise you will need to prepend 'sudo' to the following:

cd /usr/share
wget http://mirrors.ukfast.co.uk/sites/ftp.apache.org/lucene/solr/4.5.1/solr-4.5.1.tgz
tar zxvf solr-4.5.1.tgz
rm solr-4.5.1.tgz
mv solr-4.5.1 solr

And now you have Solr downloaded. You can test it by going to the example directory and executing start.jar:

cd /usr/share/solr/example
sudo java -jar start.jar

Now if you navigate to example.com:8983/solr you should get this lovely screen:

Now, it is a bit of a hassle always having to run a command from the terminal when you want to start Solr, so here is a little trick to automate it.

cd /etc/init
vim start-solr.conf

Enter the following into the file and save it.

# start-solr

start on startup

script
    cd /usr/share/solr/example
    java -jar start.jar
end script

Now Solr will start whenever the machine it is running on starts up.

Configure Solr for Drupal

Most of what you need to know for this can be found on d.o here, but I will take you through it anyway.

Download the newest release of the Drupal search_api_solr module. It contains some config files that we need to copy into our Solr settings.

cd /path/to/search_api_solr/solr-conf
cp 4.x/*  /usr/share/solr/example/solr/collection1/conf/

And that is pretty much it. Simples!

Configure Drupal for Solr

Now install Drupal and Search API Solr (including dependencies) as you would normally. Go to the Search API configuration page (example.com/admin/config/search/search_api), add a new server (choosing Solr as the service class), and if you get this wonderful message:

you are done!

You can now index Drupal to your heart's delight.
