Sep 10 2019

Dropsolid is a Diamond sponsor at DrupalCon Amsterdam, 28-31 October. In this post, I’d like to share a bit about our vision for delivering the best customer experiences, our open integrated Digital Experience Platform, our partner program, and a special opportunity for DrupalCon attendees.

Are you working in a digital agency and coming to DrupalCon? We’d love to meet you at DrupalCon and talk about how our tools, infrastructure, and expertise could help you as a digital agency partner. We’ll be at Stand 13, by the catering area, so if you fancy a coffee, stop by for a chat. We’re running a very special giveaway, too. Complete a quick survey and we’ll donate 15 minutes of core contribution time as a thank you.



A vision for Drupal to improve customer experience

In my previous post, I wrote about why we’re sponsoring DrupalCon. Simply put, without it, we wouldn’t exist. I also wrote about what we’re working on for the future, inspired by the market changes around digital experience management. I think we have something unique to offer our partner digital agencies right now.

I’ve gone from being a developer to a CEO, and I know the attraction of solving problems by building your own solutions. Yet, like many agencies, we discovered a few years ago that doing everything in-house was hindering our growth. To solve this, we ended up pivoting our entire company, defining and offering solutions in a completely different way.

We found that many of our clients’ and partners’ teams were working in silos, with different focuses—one on marketing, another on hosting, and so on. We believe we have to take an integrated approach to solving today’s problems, and a big part of that is offering a stellar customer experience. We discovered that investing in customer experience means your customers stick around longer. This translates to increased customer lifetime value, lower customer acquisition costs, and lower running costs. But what does it take to get there?

We have to recognize how problems are connected, so we can build connected solutions. You can see this in problems like search engine optimization. SEO is as much about great user experience as it is about your content. Today, for example, the speed and performance of your website affect your search engine rankings. Incidentally, my colleagues Wouter De Bruycker (SEO Specialist) and Brent Gees (Drupal Architect) will be talking about avoiding Drupal SEO pitfalls at DrupalCon Amsterdam.

Similarly, it seemed that various solutions out there were narrowly focused on a single area. We saw the potential and power of integrating these as parts of a unified Digital Experience Platform. Stand-alone, any one of these tools offers benefits, but integrated together, the whole is greater than the sum of its parts.

We are taking this approach with our clients already. With each successful engagement, we add what we learn to our toolbox of integrated solutions. We are building these solutions out for customers with consultation and training to make the most out of their investments. These include our hosting platform; our local dev tool, Launchpad; our Drupal install profile, Dropsolid Rocketship; Dropsolid Personalization; and Dropsolid Search optimized with Machine Learning. 

But our vision is bigger. We are working towards an open, integrated Digital Experience Platform that our partner agencies can leverage for greater creative freedom and increased capacity, without getting in their own way.

Stop by at DrupalCon or get in touch and see what we’re building for you. 

Read more: Open Digital Experience Platform


A Partner for European Digital Agencies

Dropsolid is the only European company sponsoring DrupalCon Amsterdam at the top-tier, Diamond sponsor level. With all due respect for our American colleagues, we believe a robust European company should exist to support all of us here. We want to help other European companies build successful digital experiences with Drupal at the core for organizations, governments, and others.

Like many Drupal agencies, we’ve gotten to where we are now by providing services to our local market. Being based in Belgium, we design, strategize, build, maintain, and run websites and applications for clients, mainly in the Benelux region.


Now, we are looking for partners outside of Belgium to benefit from using our Drupal Open Digital Experience Platform for themselves and their customers. Dropsolid has the tools, infrastructure, and expertise to support creating sustainable digital experiences for anyone. Furthermore, we have the advantage of knowing and understanding the differing needs of our colleagues and clients across Europe.


Come join us!

We are looking for more partners to join us on this journey. By leaning on our tools and expertise, those who have already joined us now have more capacity for creative growth and opportunity.

What you might see as tedious problems and cost-centers holding your agency back, we see as our playground for invention and innovation. Our partners can extend and improve their core capabilities by off-loading some work onto us. And you gain shared revenue from selling services that your customers need.

You might be our ideal partner if you prefer

  • benefitting from recurring revenue, and 
  • not taking on additional complexity that distracts you from your core creative business.

Partners who sign up with us at DrupalCon will get significant benefits including preferred status and better terms and conditions compared to our standard offerings. Talk to us about it at our booth at Stand 13 or contact us to arrange a time to talk.


Facets of the Digital Experience - Dropsolid Sessions at DrupalCon

Check out my other post to see where to meet the Dropsolid Team at DrupalCon. You’re welcome to come say hello at our booth at Stand 13, and we can show you the facets of digital experience management as we see them, and also share our vision for the future.

Each one of our talks focuses on a different facet of improving the digital experience for customers; see the session overview below.

Sep 04 2019

Dominique De Cooman

Dropsolid was conceived at DrupalCon, and now we’re a Diamond Sponsor! We’ll be in Amsterdam to show off the Dropsolid platform and our vision for Drupal. We’d love your feedback. And we are donating 15 minutes of core contributor time for everyone who completes our survey at our booth.

We hope to see you there! Contact us, sign up for our newsletter, or stop by our booth!

Stop by Our Booth, Help Make Drupal Shine

We will donate 15 minutes of core contribution time for each person who fills out a short survey at our booth (Stand 13, by the catering area) at DrupalCon—one per person.

We didn’t want to be Diamond sponsors just for the sake of it. Drupal and DrupalCon got us here and made us what we are today. We want to make a difference. We asked ourselves: what kind of booth giveaway would make a lasting impact on Drupal? A t-shirt of just the right shade of blue? ;-) We decided to invest in Drupal by paying a core contributor for their work.

Sponsors are a DrupalCon’s Best Friend

DrupalCon sparked the formation of Dropsolid, and we are very proud to be able to be Diamond Sponsors in Amsterdam this year. I wanted to take a moment to reflect on what DrupalCon has meant for us.

In 2012, after five years as a developer, I attended my very first DrupalCon in Munich. I saw Dries speak, attended so many sessions, met so many community members. There was so much incredible, positive energy; I was overwhelmed.

At that DrupalCon, I met some extraordinary people who helped persuade me that founding a Drupal company was a great idea. The experience convinced me to invest everything I ever owned into a company with Drupal at its core. I felt it was now or never. And so Steven Pepermans and I founded Dropsolid.

Now, seven years later, we are one of the Diamond sponsors at DrupalCon Amsterdam. It’s hard to believe Dropsolid can do this. Sponsoring DrupalCon is a dream come true and already a huge achievement for us. The very experience of DrupalCon co-created our company, and now we get to contribute to it ourselves.

We are grateful to be here and want to make a difference.

The Dropsolid Vision for Better Customer Experience with Drupal

At the conference, we want to share a vision for possibilities with Drupal. We see Drupal pinning together an integrated digital experience platform that enables teams to deliver great digital customer experiences more effectively and at a lower cost. Our vision starts with the best practices of working with Drupal, hosting, deployment & development tools, and digital marketing capabilities. It’s what we offer customers today.

Out in the market, these “digital experience platforms” make connecting all the parts together easier. It means you can avoid getting nickeled-and-dimed on individual services and dealing with quirks in integrations. This is all possible right now with Drupal, when you have the skills and knowledge to put everything together. It’s what we do for our clients every day. We build flexible integrated platforms, and we provide training and consultation along the way.

In building these solutions with Drupal, we discovered some best practices, many things that can be recycled and reused, and real advantages and economies of scale. We’ll be talking about that in our sessions: Nick’s talk on improving on-site search with machine learning, Wouter and Brent’s talk on avoiding Drupal SEO pitfalls, and Mattias’s panel sharing the insights we’ve gained from working on Launchpad, our local development tool. These are very practical and direct ways to get more out of your investment in Drupal.

But we have a bigger vision. Next, we’re working on an integrated service so you can get these capabilities in one offering with Drupal. If you want to know more about this vision, and how to get there today, come along to my talk about Open Digital Experiences to Increase Customer Lifetime Value.
You can also stop by our booth to see demos of our Dropsolid hosting platform, see how to use Dropsolid Personalization, and see Rocketship in action.

Facets of the Digital Experience - Dropsolid Sessions at DrupalCon

Where to meet the Dropsolid Team: in addition to visiting our booth (and making us pay a core contributor!) at Stand 13, we’ll be showing many facets of what goes into digital experiences—investing in Digital Customer Experiences, Search Engine Optimization tips for Drupal—and we’ll be on a panel about local development tools, too.

Demo: A future vision of Drupal as a Digital Experience Platform

  • What: In our live demo, we’ll show you the power of our Platform, Launchpad, Rocketship, Search & Machine Learning, and Personalization tools working together to break down silos and create engaging customer experiences with Drupal. 
  • When: Wed, 30 Oct, 12:40 - 13:10
  • Where: Sponsor Stage

Stop Buying Drupal Websites, Buy Open Digital Experiences to Increase Customer Lifetime Value

  • Who: Dominique De Cooman, Founder and CEO, Dropsolid
  • What: My talk is a distillation of what we learned about the difference between Drupal being “just a CMS” for “just building a website,” and how Drupal can be a truly comprehensive Digital Experience Manager.
  • When: Tue, 29 Oct, 11:55 to 12:15
  • Where: Room G 107

The Battle of the Local Development Tools [Panel Discussion]

  • Who: Mattias Michaux, Drupal Developer and DevOps Engineer, joins a panel chaired by Michael Schmid from Amazee
  • What: DrupalCon website: “In this session, creators and users of different local development tools will provide their story of why they made the choices they made.”
  • When: Wed, 30 Oct, 11:30 - 12:10
  • Where: Room G 102

Machine Learning: Creating More Relevant Search Results With “Learn To Rank”

  • Who: Nick Veenhof, CTO Dropsolid, and Mattias Michaux
  • What: A summary of what machine learning is and, more importantly, how you can use it for a pervasive problem: the relevance of your internal site search.
  • When: Wed, 30 Oct, 16:15 to 16:55
  • Where: Auditorium

Drupal SEO Pitfalls and How To Avoid Them

  • Who: Wouter De Bruycker, SEO Specialist and Brent Gees, Drupal Architect
  • What: Drupal can be your perfect technical SEO platform, but to get the most out of it, you have to make sure it’s set up as it should be for the search engines. We will go into the details of how to detect SEO issues, common and rare (on real Drupal sites!), and explain their impact on SEO.
  • When: Wed, 30 Oct, 16:40 to 17:00
  • Where: Room G 102

See you there! Get in touch!

We hope to see you there! Sign up for our newsletter or stop by our booth at Stand 13 and help us contribute!

And remember, we will donate 15 minutes of core contribution time for each person who fills out a short survey at our booth (Stand 13, by the catering area) at DrupalCon—one per person.


May 11 2013

This article will give you the tools to set up a fully automated Apache Solr multicore installation locally for Drupal.

The advantages: when you start a new project, you only have to create a new Apache Solr core based on the previous configuration. This will save you valuable time.

How does it work? Whatever we have done manually, we can script using bash. This is exactly what we have done to install the Apache Solr multicore locally.

The script depends on drush, xmlstarlet and curl. It will detect these dependencies and try to install them if missing.

Here is the script to install apachesolr and setup the multicore. We will use a second script to setup our cores later.

Install apache solr as a multicore

#!/bin/bash
 
#Validate if the executing user is root
if [ "$(id -u)" != "0" ]; then
   echo "This script must be run as root" 1>&2
   exit 1
fi
 
# ******** #
# Functions #
# ******** #
 
#Test the connection to the solr installation
function solr_install_test_connection {
  response=$(curl --write-out %{http_code} --silent --output /dev/null http://127.0.0.1:8983/solr/)
  if [ $response != 200 ]; then
    echo -e "\e[00;31m ERROR: Solr installation failed, manual intervention needed. Solr responds at http://127.0.0.1:8983/solr/ with http code: $response \e[00m"
  else
    echo -e "\e[00;32m NOTICE: Solr responds at http://127.0.0.1:8983/solr/ with http code: $response \e[00m"
  fi
}
 
# ****** #
# Script #
# ****** #
 
#Install the needed sources if needed: drush, curl and java
DRUSH=`command -v drush`
if [ -z "$DRUSH" ]; then
  apt-get install drush
  echo -e "\e[00;33m NOTICE:Drush installed \e[00m"
else
  echo -e "\e[00;33m WARNING:Drush already installed at '$DRUSH' \e[00m"
fi
 
CURL=`command -v curl`
if [ -z "$CURL" ]; then
  apt-get install curl
  echo -e "\e[00;33m NOTICE:Curl installed \e[00m"
else
  echo -e "\e[00;33m WARNING:Curl already installed at '$CURL' \e[00m"
fi
 
JAVA=`command -v java`
if [ -z "$JAVA" ]; then
  add-apt-repository ppa:webupd8team/java
  apt-get update
  apt-get install oracle-java7-installer
  echo -e "\e[00;32m NOTICE:Java installed \e[00m"
else
  echo -e "\e[00;33m WARNING:Java already installed at '$JAVA' \e[00m"
fi
 
#Download solr sources
SOURCE=apache-solr-3.6.2.zip
FOLDER=apache-solr-3.6.2
FILE=/opt/$SOURCE
if [ ! -e $FILE ]; then
  mkdir -p /opt
  cd /opt
  wget http://apache-mirror.telesys.org.ua/lucene/solr/3.6.2/$SOURCE
 
  echo -e "\e[00;32m NOTICE:Sources downloaded  \e[00m"
else
  echo -e "\e[00;33m WARNING:Source already downloaded skipping \e[00m"
fi
 
#Extraction of the zip file containing the solr sources
DIR=/opt/apache-solr-3.6.2
if [ ! -d $DIR ]; then
  echo -e "\e[00;32m NOTICE:Extracting sources  \e[00m"
  cd /opt
  unzip $SOURCE > /dev/null
  echo -e "\e[00;32m NOTICE:Apachesolr dir created  \e[00m"
else
  echo -e "\e[00;33m WARNING:Apachesolr dir already exists skipping \e[00m"
fi
 
#Create the multicore folder - name it like you want
DIR=/opt/$FOLDER/dropsolid
if [ ! -d $DIR ]; then
  cp -rf /opt/$FOLDER/example /opt/$FOLDER/dropsolid
  echo -e "\e[00;32m NOTICE:Dropsolid dir created  \e[00m"
else
  echo -e "\e[00;33m WARNING:Dropsolid dir already exists skipping \e[00m"
fi
 
#Install and configure multicore solr.xml to the multicore folder
NEW=`grep core0 /opt/$FOLDER/dropsolid/solr/solr.xml`
if [ -z "$NEW" ]; then
  rm /opt/$FOLDER/dropsolid/solr/solr.xml
fi
FILE=/opt/$FOLDER/dropsolid/solr/solr.xml
if [ ! -e $FILE ]; then
  cp -f /opt/$FOLDER/dropsolid/multicore/solr.xml /opt/$FOLDER/dropsolid/solr/solr.xml
  sed -i 's@<core name="core1" instanceDir="core1" />@@g' /opt/$FOLDER/dropsolid/solr/solr.xml
 
  echo -e "\e[00;32m NOTICE:Solr.xml copied multicore setup complete  \e[00m"
else
  echo -e "\e[00;33m WARNING:Solr.xml already installed, skipping \e[00m"
fi
 
#Create the start|stop|restart script /etc/init.d/solr
FILE=/etc/init.d/solr
if [ ! -e $FILE ]; then
echo '#!/bin/sh -e
 
# Starts, stops, and restarts solr
 
SOLR_DIR="/opt/'$FOLDER'/dropsolid/"
JAVA_OPTIONS="-Xmx1024m -DSTOP.PORT=8079 -DSTOP.KEY=stopkey -jar start.jar"
LOG_FILE="/var/log/solr.log"
JAVA="/usr/bin/java"
 
case $1 in
    start)
        echo "Starting Solr"
        cd $SOLR_DIR
        $JAVA $JAVA_OPTIONS 2> $LOG_FILE &
        ;;
    stop)
        echo "Stopping Solr"
        cd $SOLR_DIR
        $JAVA $JAVA_OPTIONS --stop
        ;;
    restart)
        $0 stop
        sleep 1
        $0 start
        ;;
    *)
        echo "Usage: $0 {start|stop|restart}" >&2
        exit 1
        ;;
esac' > /etc/init.d/solr
 
  #Set permissions
  chmod a+rx /etc/init.d/solr
  /etc/init.d/solr start
  sleep 5
 
  #verify the connection
  solr_install_test_connection
 
  echo -e "\e[00;32m NOTICE:Solr installation completed  \e[00m"
else
  echo -e "\e[00;33m WARNING:Solr already installed, skipping \e[00m"
fi
 
#Configure solr for drupal with solr files from search api solr module
#Also download apachesolr modules
cd /opt/$FOLDER/dropsolid
DIR=/opt/$FOLDER/dropsolid/search_api_solr
if [ ! -d $DIR ]; then
  drush dl search_api_solr
  cp -rf multicore/core0/ solr/
  cp -rf solr/conf/* solr/core0/conf/
  cp -rf search_api_solr/solr-conf/3.x/* solr/core0/conf/
 
  #Reduce the time needed to start processing sent documents - we are local so we want to test fast (tip from Nick Veenhof via Nicolas Leroy)
  sed -i 's@<maxTime>120000</maxTime>@<maxTime>2000</maxTime>@g' solr/core0/conf/solrconfig.xml
 
  #Add this for future use of apachesolr instead of search api solr (we can prepare cores using the files from these modules too)
  sudo drush dl apachesolr-6.x --destination=apachesolr-6.x -y
  sudo drush dl apachesolr-7.x --destination=apachesolr-7.x -y
 
  echo -e "\e[00;32m NOTICE:Search api setting installed  \e[00m"
else
  echo -e "\e[00;33m WARNING:Search api settings already there, skipping \e[00m"
fi
 
#Exit output
echo -e '\e[00;32m Use the solr-instance script to install cores for your project. Execute "sudo ./create-solr-instance.sh". \e[00m'

How to use it?

Create a file called solr-install.sh and paste the above code in it. Then execute as root:

  sudo ./solr-install.sh

If everything goes as planned, the script output will confirm a working Solr installation.


Create solr multicore instances with this script

#!/bin/bash
 
#Validate if executing user is root
if [ "$(id -u)" != "0" ]; then
   echo "This script must be run as root" 1>&2
   exit 1
fi
 
PROJECT=$1
 
if [[ -z $PROJECT ]]; then
  echo -e "\e[00;31m please provide a project name as the 1st argument \e[00m"
  echo -e "\e[00;31m optionally indicate to use the apachesolr module (type: apachesolrd6 or apachesolrd7) as the 2nd argument \e[00m"
  echo ' '
  exit
else
  echo ' '
  echo '*************Solr env creation started**************'
  echo ' '
fi
 
# ******** #
# Functions #
# ******** #
 
#Test connection to apache solr
function solr_install_test_connection {
  response=$(curl --write-out %{http_code} --silent --output /dev/null http://127.0.0.1:8983/solr/)
  if [ $response != 200 ]; then
    rm /etc/init.d/solr
    echo -e "\e[00;31m ERROR: Solr installation failed, manual intervention needed. Solr responds at http://127.0.0.1:8983/solr/ with http code: $response \e[00m"
  else
    echo -e "\e[00;32m NOTICE: Solr responds at http://127.0.0.1:8983/solr/ with http code: $response \e[00m"
  fi
}
 
# ****** #
# Script #
# ****** #
 
#Install xml starlet if needed
XMLSTARLET=`command -v xmlstarlet`
if [ -z "$XMLSTARLET" ]; then
  apt-get install xmlstarlet
  echo -e "\e[00;32m NOTICE:xmlstarlet installed \e[00m"
else
  echo -e "\e[00;33m WARNING:Xmlstarlet already installed at '$XMLSTARLET' \e[00m"
fi
 
#Variables
SOLR=/opt/apache-solr-3.6.2/dropsolid/solr
 
#Copy a preconfigured core to use as a basis for our new core (see solr-install.sh)
ENV=local
DIR=$SOLR/${PROJECT}_$ENV
if [ -d $DIR ]; then
  echo -e "\e[00;33m $DIR exists, skipping SOLR env $ENV creation\e[00m"
else
  cp -r $SOLR/core0 $SOLR/${PROJECT}_$ENV
  echo -e "\e[00;32m Solr env $ENV created in $DIR\e[00m"
fi
 
#Add entry to solr.xml for our new core
ENTRIE=`grep -o "${PROJECT}_$ENV" $SOLR/solr.xml | head -n1`
if [ "$ENTRIE" == "${PROJECT}_$ENV" ]; then
  echo -e "\e[00;33m Entry ${PROJECT}_$ENV exists in $SOLR/solr.xml, skipping SOLR env $ENV creation in solr.xml\e[00m"
else
  #add entry
  cd $SOLR
  xmlstarlet ed --subnode "/solr/cores" --type elem -n core -v "" solr.xml > output-solr-${PROJECT}_$ENV.xml
  sed -i 's@<core></core>@<core name="'${PROJECT}_$ENV'" instanceDir="'${PROJECT}_$ENV'" />@g' output-solr-${PROJECT}_$ENV.xml
  cp -f output-solr-${PROJECT}_$ENV.xml solr.xml
 
  echo -e "\e[00;32m Solr env $ENV added to solr.xml $DIR\e[00m"
fi
 
#if apachesolr d6 - this will override solrconfig.xml and schema.xml with version from apachesolr module for D6
if [ "$2" == 'apachesolrd6'  ]; then
  cd $SOLR
  cp -rf ../apachesolr-6.x/apachesolr/solr-conf/solr-3.x/* $SOLR/${PROJECT}_$ENV/conf/
  echo -e "\e[00;32m Core modified to work with apachesolr d6 module \e[00m"
fi
 
#if apachesolr d7 - this will override solrconfig.xml and schema.xml with version from apachesolr module for d7
if [ "$2" == 'apachesolrd7'  ]; then
  cd $SOLR
  cp -rf ../apachesolr-7.x/apachesolr/solr-conf/solr-3.x/* $SOLR/${PROJECT}_$ENV/conf/
  echo -e "\e[00;32m Core modified to work with apachesolr d7 module \e[00m"
fi
 
 
#Restart solr
service solr restart
sleep 5
solr_install_test_connection
 
#Exit output
echo -e "\e[00;32m Now you can use http://127.0.0.1:8983/solr/${PROJECT}_local/ as a connection in your Drupal installations. Admin: http://127.0.0.1:8983/solr/${PROJECT}_local/admin/. Check out all your cores at http://127.0.0.1:8983/solr/ \e[00m"

How to use it?

  ./create-solr-instance.sh [projectname] [optional: apachesolrd6 or apachesolrd7]

Use the second optional parameter if you want to set up the core for the http://drupal.org/project/apachesolr module (suffixed with d6 or d7). By default the settings from the http://drupal.org/project/search_api_solr module will be installed, so don't use the second parameter if you are using that module.

  ./create-solr-instance.sh dropsolidsite  

If everything works, the script reports the new core and a successful connection test.

Review

Go through this script piece by piece so you can learn some bash and use it to your advantage. As Drupal developers we are more or less PHP developers, but learning some bash can be a great time saver. This script, for example, saved me countless hours: every time I have to install a multicore on a VM or on the machine of somebody I'm coaching, we win valuable time. You could even use it to control your production installation too.

Read the comments carefully and understand what every step does. We can be lazy by letting automated scripts do the work for us, but we can't allow ourselves to stop understanding what happens. You will need to know how it works, because automated stuff always breaks sooner or later.

Other advantages

As mentioned, this script saves time by automating a repetitive task. But there are some other advantages.

I also use it as an instrument to teach junior Drupal developers about Apache Solr without having them lose too much time on the details. They can set up a Solr instance quickly and stay productive. In their training time and/or spare time, they can pick the script apart, as we did above, and learn the details.

Another advantage: by using the script to install an Apache Solr multicore locally, you know everyone has the same configuration. There are so many tutorials on setting up Solr on the web that it becomes very likely everyone on the team ends up with a different version or setup.

Conclusion

Using scripts to automate tasks like installing Apache Solr is highly recommended. But you should always know what you are executing. Always review things carefully so you know what to do when the automation breaks down.

May 01 2013

Warming the cache gives your users extremely fast pages all the time. Using Varnish, cron and bash, you can give your clients very fast pages.

By extremely fast pages, I mean pages where the main request loads in under 0.5 s.

The main HTTP request is served in under 150 ms, which is barely noticeable. Loading all the other images, CSS and JS takes around 1.28 s. Further optimising could be done with sprites and a CDN, but for my blog I'm happy with the results for now.

Varnish

Varnish is a widespread technology that is easy to set up. You can find countless tutorials on the net. Here is a really good article: http://andrewdunkle.com/how-install-varnish-drupal-7

Cache warming

Warming your caches is pretty easy. You can do it with the XML sitemap module, a bash script and a well-configured cache.

Cache settings

Go to admin/config/development/performance

Configuring the cache on a blog site to last for a day is a good idea, since the content doesn't change that much. If you have a frequently commented blog, you could lower this.
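If you prefer to script this rather than click through the performance page, the same Drupal 7 variables can be set with drush (a sketch; the one-day lifetime of 86400 seconds is just an example):

# Enable page caching and set both cache lifetimes to one day.
drush vset cache 1
drush vset cache_lifetime 86400
drush vset page_cache_maximum_age 86400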

Xml sitemap

Install the XML sitemap module (http://drupal.org/project/xmlsitemap). By setting it up you get a map of your site's URLs. We will use this map to crawl the pages with our script.

Script warm-cache.sh

#!/bin/bash
ALIAS='@ddc.production'
URL='dominiquedecooman.com'
 
wget --quiet http://$URL/sitemap.xml --no-cache --output-document - | egrep -o "http://$URL[^<]+" | while read line; do
    time curl -A 'Cache Warmer' -s -L $line > /dev/null 2>&1
    echo $line
done
 
wget --quiet http://$URL/fr/sitemap.xml --no-cache --output-document - | egrep -o "http://$URL[^<]+" | while read line; do
    time curl -A 'Cache Warmer' -s -L $line > /dev/null 2>&1
    echo $line
done
 
wget --quiet http://$URL/nl/sitemap.xml --no-cache --output-document - | egrep -o "http://$URL[^<]+" | while read line; do
    time curl -A 'Cache Warmer' -s -L $line > /dev/null 2>&1
    echo $line
done

The script crawls all the sitemaps for each language.
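The three loops differ only in the sitemap path, so the same logic can also be written once and looped over the language prefixes (a sketch, assuming the same sitemap layout):

#!/bin/bash
URL='dominiquedecooman.com'

# Crawl the default, French and Dutch sitemaps with one loop.
for PREFIX in '' 'fr/' 'nl/'; do
  wget --quiet http://$URL/${PREFIX}sitemap.xml --no-cache --output-document - | egrep -o "http://$URL[^<]+" | while read line; do
    time curl -A 'Cache Warmer' -s -L $line > /dev/null 2>&1
    echo $line
  done
done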

Cron

Configure the cron with crontab -e

0 2 * * * /var/checkouts/hostingplatform/warm-cache.sh

This will crawl the site every day, ensuring the cache is warm. Since the cache lasts for a day, we know it is always warm. This guarantees users a fast page all the time.

Here is an image of our cache being warmed (from New Relic). Warming briefly loads the server during the period when the fewest users are on the site, only to make the site perform at lightning speed for the rest of the day.

Want this too?

Do you want your website to perform like this? Please contact me: http://dominiquedecooman.com/forms/contact-get-advise

Apr 28 2013

The reason to create a custom Views access plugin is to give yourself an entry point for custom logic that protects access to your view.

Let's say you need finer access control on a view than roles and permissions. For example, you are storing properties on the user object that determine whether a given user can see certain views, independent of their roles.

What we need to do is configure our views with a custom Views access plugin.

The hacky way:

You could load the user in the template file of your view and check for the property, but this is not what we want to do. While it would work, you would have to redo it in every template, you would not be able to configure it, and it is difficult to control. It's hacky, and hacky solutions tend to cause a lot of maintenance, making these quick fixes cost more in the long run. That is worth a different blog post entirely.

Here is the Drupal way:

Views has a beautiful architecture in place that allows you to extend its base classes. If you start writing a custom Views plugin, the best thing to do is examine another plugin of the same kind; for example, check out the permissions access plugin. (Tip: a good way to write plugins is to look at examples from Views or other modules implementing the same system.)

The file:

/views/plugins/views_plugin_access_perm.inc
<?php
 
/**
 * @file
 * Definition of views_plugin_access_perm.
 */
 
/**
 * Access plugin that provides permission-based access control.
 *
 * @ingroup views_access_plugins
 */
class views_plugin_access_perm extends views_plugin_access {
  function access($account) {
    return views_check_perm($this->options['perm'], $account);
  }
 
  function get_access_callback() {
    return array('views_check_perm', array($this->options['perm']));
  }
 
  function summary_title() {
    $permissions = module_invoke_all('permission');
    if (isset($permissions[$this->options['perm']])) {
      return $permissions[$this->options['perm']]['title'];
    }
 
    return t($this->options['perm']);
  }
 
 
  function option_definition() {
    $options = parent::option_definition();
    $options['perm'] = array('default' => 'access content');
 
    return $options;
  }
 
  function options_form(&$form, &$form_state) {
    parent::options_form($form, $form_state);
    $perms = array();
    $module_info = system_get_info('module');
 
    // Get list of permissions
    foreach (module_implements('permission') as $module) {
      $permissions = module_invoke($module, 'permission');
      foreach ($permissions as $name => $perm) {
        $perms[$module_info[$module]['name']][$name] = strip_tags($perm['title']);
      }
    }
 
    ksort($perms);
 
    $form['perm'] = array(
      '#type' => 'select',
      '#options' => $perms,
      '#title' => t('Permission'),
      '#default_value' => $this->options['perm'],
      '#description' => t('Only users with the selected permission flag will be able to access this display. Note that users with "access all views" can see any view, regardless of other permissions.'),
    );
  }
}

This is a clear example that we can use to start with. You can see there are some essential functions to create this plugin. Lets check our example to see which ones.

Create a file called custommodule_access_plugin.inc in the root of your custom module.

<?php
 
  /**
   * Access plugin that provides property based access control.
   */
  class custommodule_access_plugin extends views_plugin_access {
 
    function summary_title() {
      return t('Custom access plugin');
    } // summary_title()
 
  /**
   * Determine if the current user has access or not.
   */
    function access($account) {    
      return custommodules_access($account);
    }
 
    function get_access_callback() {
      return array('custommodules_access', array());
    }
 
  }

For the custom Views plugin you need three functions. summary_title() returns the title in the admin interface; this is what you will see in the Views UI when you select your plugin, next to role-based and permission-based access. Then we have our access() method, where you call the custom callback. Finally, you must declare your custom access callback to Views with get_access_callback(). You can pass arguments if needed using the array, as seen in the example above.

We have our file with the class; now we need to tell Views that we have created a new access plugin.

Two things need to be done. First, in the info file, you must declare the file containing the class.

name = custommodule
description = Custom code
core = 7.x
package = custom
dependencies[] = views
files[] = custommodule_access_plugin.inc

Then you need to implement hook_views_plugins() to tell Views that a new custom access plugin is available.

  /**
   * Implements hook_views_plugins()
   */
  function custommodule_views_plugins() {
    $plugins = array(
      'access' => array(
        'test' => array(
          'title' => t('Custom Access check'),
          'help' => t('this is a custom access plugin'),
          'handler' => 'custommodule_access_plugin',
          'path' => drupal_get_path('module', 'custommodule'),
        ),
      ),
    );
    return $plugins;
  }

Our access callback does the custom checks:

 function custommodules_access($account = NULL) {
    global $user;
    $access = FALSE;
    $account = user_load($user->uid);
    $optionfield = field_get_items('user', $account, 'field_option');
 
    //In the future more values are possible, so this is extensible.
    //For now only +eur exists.
    $allowed_values = array('eur');
    $options = explode('+', $optionfield[0]['value']);
    foreach ($allowed_values as $allowed_value) {
      if (in_array($allowed_value, $options)) {
        $access = TRUE;
      }
    }
    return $access;
  }

Here we check certain values stored in a field on the user. When a user has these properties, the callback returns TRUE and access is granted.

Now that you have this plugin, you can use it in all your views by going to the access section in the Views interface and selecting the plugin.

Nov 16 2012

The goal of automating deployment is to make introducing new features easier. In this post we will learn how to set up a workflow that moves the code and configuration of your Drupal site from your local development station to development, staging and production, all by using version control and the push of a button.

Why Automatic deployment

  • It's faster. Deploying automatically or semi-automatically is a lot faster. You don't need to transfer the database settings manually. You don't need to transfer the code manually. You don't have to set the settings correctly on production or staging by hand.
  • It's less error prone. A process that is automated and executed the same way every time is less likely to contain errors.
  • It's well documented. An automated process almost always involves some kind of script being executed. This script is in version control, so you can see how it evolves and why it became what it is.
  • Reproducible. The process can be reproduced. You know exactly what has happened to the site.
  • History. You can have a complete history of all the builds.
  • Continuous integration. Automating deployment also brings us closer to CI. Being able to introduce changes automatically and test them automatically reduces regression and makes you sleep better at night.

Our basic components

  • A virtual private server with a LAMP stack.
  • Version control: Git.
  • Drush
  • Drupal of course, with a correct repository layout.
  • Dev - staging - production setup.
  • Continuous integration server
  • Deployment scripts

How to set it all up?
TIP:
If you want to test this out, learn how to create a free cloud server to play with and try it all out. Super fun and great for proofs of concept.

Lamp stack
In this post you can read how to configure a VPS: http://www.dominiquedecooman.com/blog/automate-installing-drupal-ubuntu-...

Drupal project
We will use this repository layout to structure our project:
- bash
-- updates - update scripts (see further)
-- installs - install scripts
-- scripts - random scripts needed in the project
- docroot - contains the drupal site
- documentation - contains documentation
- etc
-- drupal - contains settings files, robots.txt, ...
-- ssh - contains the config settings (see next)
-- aliasses - contains the alias files (see further)
-- vhost - contains a vhost template (see next)

Vhost file

<VirtualHost *:80>
  ServerName ddcdemo.local
  ServerAlias *.ddcdemo.local
  DocumentRoot /home/quickstart/websites/ddcdemo.local/docroot
  <Directory /home/quickstart/websites/ddcdemo.local/docroot>
    Options Indexes FollowSymLinks MultiViews
    AllowOverride All
    Order allow,deny
    allow from all
  </Directory>
</VirtualHost>

ssh config file

Host hera
HostName 92.243.15.236
User admin

Dev - staging - production
On your server, create three folders that Jenkins will be able to build into. For example:

  • /var/www/ddcdemo.dev
  • /var/www/ddcdemo.staging
  • /var/www/ddcdemo.production

Create vhosts on the server pointing to these folders so your sites are accessible. We use dummy URLs here, but mostly they will be real ones. (We put an entry in our local /etc/hosts to be able to see the sites on ddcdemo.dev, ddcdemo.staging and ddcdemo.prod.)

Version control and Git flow
Read all about using git flow here: http://dominiquedecooman.com/blog/git-flow-minimizing-overhead
We will use the branching model as the basis of our workflow.

Drush
Drush is the Drupal shell. If you are not yet using it, definitely check it out: http://drupal.org/project/drush

We will use it to execute our commands to build our site. We will also use aliases. http://drupal.org/node/670460

Alias template: To be placed in the "~/.drush" folder

// environment dev
$aliases['ddcdemo.dev'] = array(
  'root' => '/var/www/ddcdemo.dev/docroot',
  'remote-host' => '92.243.15.236',
  'remote-user' => 'admin',
);
 
// environment test
$aliases['ddcdemo.test'] = array(
  'root' => '/var/www/ddcdemo.test/docroot',
  'remote-host' => '92.243.15.236',
  'remote-user' => 'admin',
);
 
// environment prod
$aliases['ddcdemo.prod'] = array(
  'root' => '/var/www/ddcdemo.prod/docroot',
  'remote-host' => '92.243.15.236',
  'remote-user' => 'admin',
);
 
// Local instance
$aliases['ddcdemo.local'] = array(
  'root' => '/home/quickstart/websites/ddcdemo.local/docroot',
  'uri' => 'ddc.local',
);

CI server
Install -> https://wiki.jenkins-ci.org/display/JENKINS/Installing+Jenkins+on+Ubuntu
Why?
We use Jenkins to automate our workflow. Jenkins allows you to configure jobs that execute commands to build the site and introduce our new code and configuration. It will then execute tests and return feedback. It also provides a history of all builds.

Automate the workflow
Exporting database changes
To use continuous integration with Drupal, and to work in a team in general, you cannot make database changes directly on the environments. That is not only very error prone (you might forget something when deploying your feature), it is uncontrollable.

The features module allows us to export Drupal configuration into code files. For example, it will make a representation of a view and export it into a file. This file can be committed to the repository and deployed in a controlled way. (http://drupal.org/project/features)

Features allows you to export most components. Here is an overview of how to structure them, and some ideas on using install profiles in the deployment process: http://www.slideshare.net/nuvoleweb/code-driven-development-using-featur...
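As a quick sketch of the day-to-day commands (the feature name and components are illustrative), the features module adds drush commands to export and revert configuration:

# Export a view and a content type into a feature module.
drush features-export ddcdemo_news views_view:news node:article -y
# Revert all features to the code in the repository, as done in the deploy scripts.
drush features-revert-all -y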

What you can't deploy with features should be done with hook_update_N(). You code the database change in this hook; the drush updb command in the deploy script executes all pending update hooks and your change is deployed.

Here is a minimal sketch of such an update hook (module name, update number and settings are illustrative):
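<?php
/**
 * Hypothetical hook_update_N(): enable the news feature and set the front page.
 */
function ddcdemo_update_7001() {
  // Runs exactly once when "drush updb" executes during the deploy.
  module_enable(array('ddcdemo_news'));
  variable_set('site_frontpage', 'news');
}
?>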

Configuring jenkins jobs
First we will create a job per environment. Then we will create one or more testbot jobs. This way we can deploy our new code to each environment. Jenkins will always connect to a repository and check out the code into a workspace. We will then sync these workspace directories with our folders.

  • Basic setup
    • General settings
    • Repository configuration
    • Build triggers
    • Post build actions
  • Build section
    • Restore the latest production database
    • This site is building
    • Copy files
    • Update the code
    • Copy build script
    • Overrides
    • Execute deploy script on test database
    • Executed automated tests and store results
    • Execute deploy script on the real database
    • Unset the build message
//Testbot job - development job
#!/bin/sh
 
cd ../../docroot
 
#D7
 
#Update site
drush updb -y
 
#Enable ui modules
drush en field_ui update schema devel feeds_ui views_ui switcheroo dblog glue -y
 
#Feature reverts
drush fra -y
 
# Clear caches
drush cc all
 
#Reindex content Optional
#drush search-api-clear apachesolr_index
#drush sapi-aq
#drush queue-run-concurrent search_api_indexing_queue 8
 
# Run cron
drush cron
 
#Set dev settings
drush vset --yes --always-set preprocess_css 0
drush vset --yes --always-set preprocess_js 0
drush vset --yes --always-set cache 0
drush vset --yes --always-set block_cache 0
drush vset --yes --always-set page_compression 0
drush vset --yes --always-set error_level 2
 
#Run tests
drush en simpletest -y
drush test-run DDCDEMO
//Staging job
#!/bin/sh
 
cd ../../docroot
 
#D7
 
#Update site
drush updb -y
 
#Disable ui modules
drush dis field_ui update schema devel feeds_ui views_ui switcheroo dblog -y
 
#Feature reverts
drush fra -y
 
# Clear caches
drush cc all
 
#Reindex content Optional
#drush search-api-clear apachesolr_index
#drush sapi-aq
#drush queue-run-concurrent search_api_indexing_queue 8
 
# Run cron
drush cron
 
#Set staging settings
drush vset --yes --always-set preprocess_css 1
drush vset --yes --always-set preprocess_js 1
drush vset --yes --always-set cache 1
drush vset --yes --always-set block_cache 1
drush vset --yes --always-set page_compression 1
drush vset --yes --always-set error_level 0
 
#Run tests
drush en simpletest -y
drush test-run DDCDEMO
//Production job
#!/bin/sh
 
cd ../../docroot
 
#D7
 
#Update site
drush updb -y
 
#Disable ui modules
drush dis field_ui update schema devel feeds_ui views_ui switcheroo dblog -y
 
#Feature reverts
drush fra -y
 
# Clear caches
drush cc all
 
#Reindex content Optional
#drush search-api-clear apachesolr_index
#drush sapi-aq
#drush queue-run-concurrent search_api_indexing_queue 8
 
# Run cron
drush cron
 
#Set production settings
drush vset --yes --always-set preprocess_css 1
drush vset --yes --always-set preprocess_js 1
drush vset --yes --always-set cache 1
drush vset --yes --always-set block_cache 1
drush vset --yes --always-set page_compression 1
drush vset --yes --always-set error_level 0

Writing deployment script

  • Enable/Disable ui modules
  • Set environment specific variables
    • Error_level
    • Set Cache
  • Revert features
  • Update the database
  • Optional: reindex
  • Run cron
  • Flush cache
  • Running tests

There is a dedicated blog post on what to use in the script: http://www.dominiquedecooman.com/blog/drupal-7-tip-how-automate-and-cont... That post overlaps a bit with this one, but it focuses more on what to automate than how to automate.

Differences per environment
As you can see in the scripts, each environment has its differences. This way you can control what happens on each environment.

Reroll
To reroll something, simply use git to roll back the changes, commit a hotfix and deploy. Thanks to the automation you can do this fast and easily.
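A minimal sketch of such a reroll (the commit hash is a placeholder):

# Roll back the offending commit, then deploy again via the Jenkins job.
git revert abc1234
git push origin develop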

You should be able to do this now
Develop on a feature branch locally

git flow feature start news
drush dl views
drush en views -y
drush dl features
drush dl ftools
drush en features ftools -y
# Create a feature called ddcdemo_news and add the article content type
git add *
git commit -m "initial setup"

Now we have to share the feature with our teammates and have it tested in our CI. We push the code so we can check it out in our testbot job and test our feature.

git flow feature publish news

You can now use the Jenkins job to deploy the code of the feature branch: type "origin/feature/news" in the branch field. Fix tests if needed.

To stop developing on the feature

git flow feature finish news
git push origin develop

Now git flow has merged all our work into the develop branch, and we have pushed it to the origin. Our CI system should detect the change and automatically execute our develop build. The code that was merged into develop will be deployed on our development environment and tested.

Note: you can also run tests locally, but simpletest is pretty slow, so in practice it is better to let the server handle it. There are some optimisations that can be done, but they are beyond the scope of this post.

If any tests are failing we should fix them.

Release
Now we need to create a release to deploy on our staging environment so our client is able to test it.

git flow release start v0.1.2

When we are done adding to our release branch, we can push it to the remote and use it as the branch to deploy on staging.

git flow release publish v0.1.2

Now in Jenkins we can fill in our branch name in the staging job and get it tested. If all goes well, we can finish our release branch and it will be merged into both the master and the develop branch.

git flow release finish v0.1.2
git push origin master

We can keep track of the version by storing it in a text file in the root of the project.
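For example (file name and version are arbitrary):

echo "v0.1.2" > VERSION.txt
git add VERSION.txt && git commit -m "Release v0.1.2"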

Now we should be ready to deploy the master branch on our production environment, so let's execute the production job. Note that no tests are executed; we did all the testing on the other environments. We use the contributed simpletest module and the remote test case, which allows us to execute tests directly against the database, so we don't need complex setups. A disadvantage is that it pollutes the database with test content.

Hotfix
Create hotfix branch - v0.1.2.1

git flow hotfix start v0.1.2.1

Finish the hotfix branch. This will merge the change into develop and master. Push master to the central repository, trigger the production job, and the fix is on production.

git flow hotfix finish v0.1.2.1
git push origin master

For more on gitflow read this: http://yakiloo.com/getting-started-git-flow/

Conclusion
What we have now is a powerful way of deploying code in a controlled way. By developing proper tests for each feature, we can minimize regression and release automatically with confidence.

Sep 26 2012

We were using Search API to build our search pages. While this module is great for creating search pages, it has some problems with indexing a lot of nodes. There is a patch available (http://drupal.org/node/1137734) to avoid the memory limit issue, but even with it, indexing with Search API is still very slow.

The solution?
We want to index our nodes concurrently, of course. This will make full use of our machine and make indexing fast.

Two years earlier, this very problem of concurrent indexing was solved with a trick described in this article: http://dominiquedecooman.com/blog/doing-big-imports-and-apache-solr-inde.... There, the concurrency issue was solved by a cron run that spawned processes depending on how high the load of the machine got. That worked great, but now there is a much easier way to do the same thing. Two years ago drush was not that advanced and Drupal 7 didn't even exist, so there were no queues in core.

How did we index our nodes concurrently?

Search API implements a cron queue (http://dominiquedecooman.com/blog/drupal-7-tip-cron-queues). We can exploit this to have our nodes indexed concurrently. The only thing we need to do is fill a queue with items to be indexed and then write a special command that uses the drush_backend_invoke_concurrent() function to process that queue concurrently. The function grabs a number of items from the queue, depending on how high you set the concurrency level, and processes them. Drush takes care of everything: it kills the processes when they are done and returns any output to the console.

In the code we have two commands. One to fill the queue and one to process it.

<?php
/**
 * Implements hook_drush_command().
 */
function dfc_drush_command() {
  $items = array();
  $items['search-api-add-to-queue'] = array(
    'description' => 'Fetch all items to be indexed, and add them to the queue.',
    'bootstrap' => 'DRUSH_BOOTSTRAP_DRUPAL_FULL',
    'callback' => 'search_api_cron',
    'aliases' => array('sapi-aq'),
  );
  $items['queue-run-concurrent'] = array(
    'description' => 'Run a specific queue by name',
    'arguments' => array(
      'queue_name' => 'The name of the queue to run, as defined in either hook_queue_info or hook_cron_queue_info.',
      'concurrency level' => 'The amount of background processes to spin-off',
    ),
    'required-arguments' => TRUE,
  );
  return $items;
}

/**
 * Command callback for drush queue-run-concurrent.
 *
 * Queue runner that is compatible with queues declared using both
 * hook_queue_info() and hook_cron_queue_info().
 *
 * @param $queue_name
 *   Arbitrary string. The name of the queue to work with.
 * @param $concurrency_level
 *   Amount of background processes to spin off.
 */
function drush_dfc_queue_run_concurrent($queue_name, $concurrency_level) {
  // Get all queues.
  $queues = drush_queue_get_queues();
  if (isset($queues[$queue_name])) {
    $queue = DrupalQueue::get($queue_name);
    // Queue one "drush queue-run" invocation per item; drush executes them
    // $concurrency_level at a time.
    $invocations = array();
    for ($i = 0; $queue->numberOfItems() > $i; $i++) {
      $invocations[] = array('command' => 'queue-run ' . $queue_name, 'site' => '@self');
    }
    $common_options = array(
      'concurrency' => $concurrency_level,
    );
    drush_backend_invoke_concurrent($invocations, array(), $common_options);
  }
}
?>

To make it easy, we created a command that simply calls the search_api_cron() function, which adds all items that need indexing to the queue.

#In terminal
drush sapi-aq

Then we call our other command, which runs the drush_dfc_queue_run_concurrent() function, like this:

#In terminal
drush queue-run-concurrent search_api_indexing_queue 8

Now watch the queue spawn 8 processes beside itself by monitoring the shell you are in. In another shell, type:

watch ps -s [sid]

To find out which shell you are on, type:

ps -ef|grep $$|grep -v grep 

(It's the third number on the second line.) For more, see http://stackoverflow.com/questions/3327013/how-to-determine-the-current-...

Indexing now goes 8 times faster. If you have a big machine, you can raise the concurrency level and it will go even faster. Do monitor your search backend (in our case Apache Solr) so it can cope with the load.

Conclusion
This trick is reusable for any queue in the system. The process is always the same: fill up a queue with items and let drush process the queue concurrently. You can do it for any task, from importing content and feeds to indexing.

Mention
This little function was developed together with Sander Van Dooren. Check out his blog; he regularly posts nice tips: http://www.sandervandooren.be/

May 29 2012

This post will give you a list of what to check when you go live with a Drupal 7 website, and how you can control and automate it. You need to go through this go-live checklist each time you integrate new features during deployment, to verify everything still works. To save time, this task can be automated.

Intro
When in development, a lot of settings are turned off or configured differently. Why? Because different environments might use different services, and for some settings it is more convenient to develop with certain config. When going live, you don't want your dev settings to be active. This post explains how to keep things under control and automate parts of the process so you don't have to check everything manually every time.

Environment specific settings
Let's see which settings should be checked when deploying to test and live. Normally, when using a local - dev - test - live workflow, most of the local and dev settings should be the same, and most of the test and live settings should be the same. Since the client checks the test version before going live, the two environments should match except for the environment-specific settings, of course.

The easiest way to control settings is to keep a settings file per environment. In this settings file you can set your environment-specific settings. For example, we can set the following:

In settings.php (local/dev environment)

<?php
  $conf['error_level'] = '2';
  $conf['preprocess_css'] = '0';
  $conf['preprocess_js'] = '0';
  $conf['cache'] = '0';
  $conf['page_compression'] = '0';
  $conf['block_cache'] = '0';
?>

In settings.php (test/live)

<?php
  $conf['error_level'] = '0';
  $conf['preprocess_css'] = '1';
  $conf['preprocess_js'] = '1';
  $conf['cache'] = '1';
  $conf['page_compression'] = '1';
  $conf['block_cache'] = '1';
?>

So what we did was turn off error reporting, turn on preprocessing of CSS and JS, and enable the page cache and block cache. These are some of the typical go-live settings.

A typical site has other environment-specific settings, depending on the functionality enabled. One way to be sure your live settings stay the same is to keep them in the settings file. Note: this makes it impossible to change them through the live site.

<?php
$conf['so_env'] = 'local';
$conf['so_host_solr'] = '46.137.--.---';
$conf['so_port_solr'] = '8986';
$conf['so_path_solr'] = '/solr';
$conf['googlemap_api_key'] = 'ABQIAAAAfYxov8LBTzY0GIIX9zA54hS8rtsh0fmHD----------';
$conf['windows_live_login_client_id'] = '000000---------';
$conf['windows_live_login_client_secret'] = 'JMuzw----------------------------';
?>

We have a site with Solr and a different instance for each of our environments, so we fill in those settings here. We can do the same for our other environment-dependent settings, like the maps API key, etc.

All this works just fine, but what if you want to enable error reporting on the test instance? Some error is manifesting itself and you are not able to inspect it because you cannot change the setting. This means such a setting should not be hard-coded in settings.php, yet you don't want to check it manually on every deploy either.

Shell script
Create a shell script and execute it each time you deploy. This script, like your settings file, lives in your version control system, so you actually keep track of your go-live list.

An example of such a script can be:

#!/bin/sh

#D7

#vars
drush vset --yes preprocess_css 1
drush vset --yes preprocess_js 1
drush vset --yes cache 1
drush vset --yes block_cache 1
drush vset --yes page_compression 1
drush vset --yes error_level 0

#dis contrib
drush dis devel views_ui -y

#dis core
drush dis update syslog dblog field_ui -y

#system
drush updb
drush cc all

#execute go live tests
drush test-run --uri=http://example.com/ --xml=~/var/drupal/tests GoLiveTest

To use it, create a file called update_live.sh, make it executable (chmod u+x update_live.sh) and execute it: ./update_live.sh

As you can see, you will need drush (http://drupal.org/project/drush) installed on your webserver. You can still change these settings, but on each deploy they will be reverted to the values for this environment.

You can use a CI server like Jenkins (http://jenkins-ci.org/) to automate this task. Each environment has its own script, so you control exactly how your site is deployed to each environment. Configure a job that executes the shell script each time you deploy. If you don't know how to set up Drupal and Jenkins, stay tuned: I will explain this in later blog posts.

Automate testing
The next thing you can do to automate checking that everything is set up is to run tests. Using the simpletest module during deployment can verify a lot of things. Generally it is not recommended to run tests against a live instance, but if you know what you are doing, it can help you verify all your settings during deployment. Knowing what you are doing means:

  • Make sure your tests don't cause any performance issues.
  • Make sure you clean up everything your tests create.
  • Only test environment-dependent functionality. All the other functionality you can test when deploying to dev and test.
  • Make sure you disable the simpletest module when done.

Note that some settings are hard to check using SimpleTest, so this requires additional tools to monitor other parts of your infrastructure. You can decide for yourself how far you want to go. Not everything should be automated; manual checking for some things is fine if you don't have to do it often. If automating saves you time, do it; otherwise don't.
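To make this concrete, here is a minimal sketch of what the GoLiveTest class invoked by the deploy script above could look like on Drupal 7. This is illustrative, not the exact test from this post: it uses DrupalUnitTestCase, which does not sandbox the database, so the assertions read the current site's settings. The specific variables checked are assumptions you should adapt to your own checklist.

<?php
/**
 * A go-live test sketch. Run during deployment, e.g. via:
 * drush test-run GoLiveTest
 */
class GoLiveTest extends DrupalUnitTestCase {

  public static function getInfo() {
    return array(
      'name' => 'Go-live settings',
      'description' => 'Verifies typical go-live settings on this environment.',
      'group' => 'GoLive',
    );
  }

  function testGoLiveSettings() {
    // Aggregation and caches should be on, error display off.
    $this->assertEqual(variable_get('preprocess_css', 0), 1, 'CSS preprocessing is enabled.');
    $this->assertEqual(variable_get('preprocess_js', 0), 1, 'JS preprocessing is enabled.');
    $this->assertEqual(variable_get('cache', 0), 1, 'Page cache is enabled.');
    $this->assertEqual(variable_get('error_level', 1), 0, 'Errors are not displayed on screen.');
    // Development modules should be disabled on live.
    $this->assertFalse(module_exists('devel'), 'Devel is disabled.');
    $this->assertFalse(module_exists('views_ui'), 'Views UI is disabled.');
  }
}
?>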

Checklist

Here is the complete checklist of stuff we check when going live.

Performance

  • Set Preprocessing js and css
  • Set Page cache
  • Set Block cache
  • Set Page compression
  • Set Error level
  • Install a PHP accelerator like APC.
  • Disable some core and contrib modules, for example update, syslog, dblog, field_ui, devel, views_ui, ... Logs and stats should be disabled; you can use other tools on the server to monitor errors in a more performant way.

Security

  • Upgrade Drupal core and the contrib modules to the latest versions.
  • Schedule backups of the database.
  • Protect the admin password.
  • Enable Google Analytics.
  • Use captcha or mollom for all forms.
  • Double-check user registration settings.
  • Double-check all permissions.
  • Check the watchdog for errors and warnings, and fix these.
  • Check input formats.

SEO
Use the Drupal SEO Checklist module to make sure everything is enabled as it should be (http://drupal.org/project/seo_checklist). You can also check all settings using a script, as sketched below.
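For instance, here is a minimal sketch of such a check script, run with drush php-script against a bootstrapped site. The variables and expected values are illustrative assumptions, not a complete SEO audit.

<?php
// check_settings.php - run with: drush php-script check_settings.php
// Illustrative checks only; extend with the variables from your own checklist.
$checks = array(
  // Clean URLs should be enabled for SEO-friendly paths.
  'clean_url' => 1,
  // CSS aggregation also matters for page speed.
  'preprocess_css' => 1,
);
foreach ($checks as $name => $expected) {
  $value = variable_get($name, NULL);
  if ($value != $expected) {
    drush_print("WARNING: $name is " . var_export($value, TRUE) . ", expected $expected.");
  }
  else {
    drush_print("OK: $name = " . var_export($value, TRUE));
  }
}
?>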

Email

  • Site SMTP settings
  • Contact module settings, webform and/or others that require email.

Test content
Keep a list of test nodes. Typically, when you build your instance, you know which content is test content, so you can delete it or build your site without it.

Misc
Schedule cron

Optional

Conclusion
Put everything environment-specific that doesn't require occasional changes in settings.php.
Set the other environment-specific settings via a shell script on each deploy.
Keeping both the script and the different settings.php files in version control ensures you can keep track of your go-live requirements.
Use a go-live SimpleTest to check that everything is working as it should.

If you think of something else to check, post it in the comments below and I will add it to the list.

Dec 30 2011
Dec 30

We want to use ctools' modal frames and field collection forms to create a better user experience.

As we know, ctools comes with a lot of useful APIs and tools to use in our own modules. One of them is modal frames.

The field collection module, a lesser-known but nonetheless very useful module, allows you to add multiple collections of fields to an entity. In our example we have a curriculum vitae and we want to add multiple experiences to it. An experience consists of several fields: company, from/to date, function and description. So the field collection groups them.
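To illustrate the data model, here is a minimal sketch of creating such an item programmatically. The field_cv_experience host field reappears later in this post, but the grouped field names (field_company, field_function) are hypothetical; entity_create() and setHostEntity() are the same calls used in the code further down.

<?php
// Minimal sketch: an experience item grouped under the field_cv_experience
// field on the host node.
$item = entity_create('field_collection_item', array('field_name' => 'field_cv_experience'));
$item->setHostEntity('node', $node);
// The grouped fields live on the item, not on the node (names hypothetical).
$item->field_company[LANGUAGE_NONE][0]['value'] = 'Acme';
$item->field_function[LANGUAGE_NONE][0]['value'] = 'Developer';
$item->save();
?>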

To add new experiences we can edit the node, or we can use the field collection module's links, which go to a separate page with a separate form.

What we want to do is make this form appear in a ctools modal frame, so we get easy in-place editing, which provides a superior user experience.

The example can be used to display any form in a ctools modal frame.

How to?
Install ctools and field collection.
Create a node type curriculum and add all the fields. Use the field collection module to add new collections at admin/structure/field-collections. You'll notice a collection is also an entity, so add fields to it like you add fields to other entities such as nodes. Now you can add the collections you made as a field on your cv node type. You should then see the links when creating a node.

Next we are going to take over these links. We'll implement hook_menu() in our custom module to create our own links. We just prefix and suffix the existing paths, because we still need the parameters.

<?php
/**
 * Implements hook_menu().
 */
function glue_menu() {
  $items = array();

  $items['modal/field-collection/%/%/%/%/%ctools_js/go'] = array(
    'page callback' => 'glue_modal_operator_callback',
    'page arguments' => array(2, 3, 4, 5, 6),
    'access arguments' => array('access content'),
  );

  return $items;
}
?>

Now let's create the main function, which takes the arguments and builds the proper add, edit or delete form to administer our field collection items in a ctools modal frame (read the comments in the code for more explanation).

<?php
/**
 * Modal callback.
 */
function glue_modal_operator_callback($field_name, $nid, $operator, $id = 0, $js = FALSE) {
  // We need this include to load functions used by the field collection
  // forms, which are not loaded in this context.
  module_load_include('pages.inc', 'field_collection');

  // Access checks to make sure the user has access to the field collections.
  switch ($operator) {
    case 'add':
      $result = field_collection_item_add($field_name, 'node', $nid);
      if ($result == MENU_NOT_FOUND || $result == MENU_ACCESS_DENIED) {
        exit();
      }
      break;

    case 'edit':
    case 'delete':
      if (!field_collection_item_access($operator, $id)) {
        exit();
      }
      break;
  }

  // Check if js is enabled; this parameter will be loaded by ctools.
  if ($js) {
    // Include ctools ajax and modal, and don't forget to set ajax TRUE.
    ctools_include('ajax');
    ctools_include('modal');
    $form_state = array(
      'ajax' => TRUE,
      'title' => t('Experiences'),
    );

    if ($operator == 'add') {
      // Arguments need to be loaded directly onto the build_info args array,
      // because ctools_modal_form_wrapper() will call drupal_build_form()
      // directly. See the Form API documentation for more.
      $arg = glue_field_collection_item_add(str_replace('-', '_', $field_name), 'node', $nid);
      if ($arg == MENU_NOT_FOUND || $arg == MENU_ACCESS_DENIED) {
        exit();
      }
      $form_state['build_info']['args'][] = $arg;
      // The modal form wrapper is needed to make sure the form will allow
      // validating. You cannot use drupal_get_form() directly; it won't work.
      $output = ctools_modal_form_wrapper('field_collection_item_form', $form_state);
    }
    else {
      // The id is the collection entity id.
      $form_state['build_info']['args'][] = field_collection_item_load($id);
      if ($operator == 'edit') {
        $output = ctools_modal_form_wrapper('field_collection_item_form', $form_state);
      }
      elseif ($operator == 'delete') {
        $output = ctools_modal_form_wrapper('field_collection_item_delete_confirm', $form_state);
      }
      else {
        exit();
      }
    }

    // If the form was executed, we need to dismiss the modal and reload the
    // field on the page.
    if ($form_state['executed']) {
      $commands = array();

      // Load the new output.
      $node = node_load($nid, NULL, FALSE);
      // Render the newly saved field collection. Here is how to render a
      // single field: http://dominiquedecooman.com/blog/drupal-7-tip-theming-render-only-single-field-your-entities
      $field_to_render = field_view_field('node', $node, str_replace('-', '_', $field_name), 'full');

      // Remove the prefix and suffix, which contain unneeded divs and action links.
      unset($field_to_render['#prefix']);
      unset($field_to_render['#suffix']);
      $output = render($field_to_render);

      // We will replace the field collection with the new output.
      $commands[] = ajax_command_html('.field-name-field-cv-experience', $output);
      // Close the frame.
      $commands[] = ctools_modal_command_dismiss();

      $output = $commands;
    }
    // Render the output.
    print ajax_render($output);
    exit();
  }
  else {
    // No js found, let's go to the default page.
    return drupal_get_form('field_collection_item_form', field_collection_item_load($id));
  }
}

/**
 * Add a new field-collection item.
 *
 * We copied this function from the field collection module, but instead of
 * returning a form we return the object.
 */
function glue_field_collection_item_add($field_name, $entity_type, $entity_id, $revision_id = NULL, $langcode = NULL) {
  $info = entity_get_info();
  if (!isset($info[$entity_type])) {
    return MENU_NOT_FOUND;
  }
  $result = entity_load($entity_type, array($entity_id));
  $entity = reset($result);

  if (!$entity) {
    return MENU_NOT_FOUND;
  }
  // Ensure the given entity is of a bundle that has an instance of the field.
  list($id, $rev_id, $bundle) = entity_extract_ids($entity_type, $entity);
  $instance = field_info_instance($entity_type, $field_name, $bundle);

  if (!$instance) {
    return MENU_NOT_FOUND;
  }

  // Check field cardinality.
  $field = field_info_field($field_name);
  $langcode = LANGUAGE_NONE;
  if (!($field['cardinality'] == FIELD_CARDINALITY_UNLIMITED || !isset($entity->{$field_name}[$langcode]) || count($entity->{$field_name}[$langcode]) < $field['cardinality'])) {
    drupal_set_message(t('Too many items.'), 'error');
    return '';
  }

  $title = ($field['cardinality'] == 1) ? $instance['label'] : t('Add new !instance_label', array('!instance_label' => drupal_strtolower($instance['label'])));
  drupal_set_title($title);

  $field_collection_item = entity_create('field_collection_item', array('field_name' => $field_name));
  // Do not link the field-collection item with the host entity at this point,
  // as during the form workflow we have multiple field-collection item entity
  // instances, which we do not want to all link with the host.
  // That way the link is going to be created when the item is saved.
  $field_collection_item->setHostEntity($entity_type, $entity, LANGUAGE_NONE, FALSE);

  // Make sure the current user has access to create a field collection item.
  if (!field_collection_item_access('create', $field_collection_item)) {
    return MENU_ACCESS_DENIED;
  }
  return $field_collection_item;
}
?>

The only thing we need to do now is take over the links on the page. For this we will alter the render array. The hook to do this is hook_field_attach_view_alter().

It is the ctools-use-modal class that triggers our modal functionality.

<?php
/**
 * Implements hook_field_attach_view_alter().
 */
function synergie_field_attach_view_alter(&$output, $context) {
  if ($output['field_cv_experience']) {
    ctools_include('modal');
    ctools_modal_add_js();

    $field = 'field_cv_experience';
    // Add.
    $output[$field]['#suffix'] =
      '<div class="description field-collection-description"></div>
       <ul class="action-links action-links-field-collection-add">
         <li>' .
           l(t('Add'), 'modal/field-collection/field-cv-experience/' . $context['entity']->nid . '/add/0/nojs/go',
             array('attributes' => array('class' => 'ctools-use-modal'))) .
         '</li>
       </ul>';

    // Edit & delete.
    foreach ($output[$field] as $key => $value) {
      if (is_numeric($key)) {
        $output[$field][$key]['links']['#links']['edit']['href'] = 'modal/field-collection/' . str_replace('_', '-', $field) . '/' . $context['entity']->nid . '/edit/' . $output[$field]['#items'][$key]['value'] . '/nojs/go';
        $output[$field][$key]['links']['#links']['edit']['attributes'] = array('class' => 'ctools-use-modal');
        $output[$field][$key]['links']['#links']['delete']['href'] = 'modal/field-collection/' . str_replace('_', '-', $field) . '/' . $context['entity']->nid . '/delete/' . $output[$field]['#items'][$key]['value'] . '/nojs/go';
        $output[$field][$key]['links']['#links']['delete']['attributes'] = array('class' => 'ctools-use-modal');
      }
    }
  }
}
?>

Here is how to do it, and this is what the result should look like:

Validation should work thanks to the ctools_modal_form_wrapper() function. After submission, the modal frame should close and the page should show the changes that were inserted by ajax_command_html().

For more on how to theme modal frames and do much more with them, see the ctools_ajax_sample.module which comes with ctools.

Dec 27 2011
Dec 27

In this blog we'll explain how to set up a geospatial search by radius in Drupal 7, using the Apache Solr geospatial extension.

You'll need the following modules to make the geospatial search by radius work:

location, search_api, search_api_solr, search_api_location, gmap, views
Optionally you can also use facetapi to allow for facets.

Now we need to set up a Solr instance. Thanks to http://ygerasimov.com, who already compiled a version of Apache Solr with the extension in it. He also copied the search_api_solr schema.xml and solrconfig.xml into it. We can download it here: http://ygerasimov.com/geo/sites/default/files/apachesolr.tar.gz

You can download it to /etc/apachesolr.tar.gz and extract it with:

cd /etc
wget http://ygerasimov.com/geo/sites/default/files/apachesolr.tar.gz
tar -xvf apachesolr.tar.gz
rm apachesolr.tar.gz
cd apachesolr

You can start it up to test it:

java -jar start.jar

Go to port 8983 (http://127.0.0.1:8983/solr/) and check out the interface.

Now a startup script would be nice, so you can start and stop it easily and have it run in the background.
Create a file /etc/init.d/solr and paste this:

#!/bin/sh -e
# Starts, stops, and restarts solr
SOLR_DIR="/etc/apachesolr" 
JAVA_OPTIONS="-Xmx1024m -DSTOP.PORT=8079 -DSTOP.KEY=stopkey -jar start.jar" 
LOG_FILE="/var/log/solr.log" 
JAVA="/usr/bin/java" 
 
      case $1 in
          start)
              echo "Starting Solr" 
              cd $SOLR_DIR
              $JAVA $JAVA_OPTIONS 2> $LOG_FILE &
              ;;
          stop)
              echo "Stopping Solr" 
              cd $SOLR_DIR
              $JAVA $JAVA_OPTIONS --stop
              ;;
          restart)
              $0 stop
              sleep 1
              $0 start
              ;;
          *)
              echo "Usage: $0 {start|stop|restart}" >&2
              exit 1
              ;;
      esac

The next thing you'll need is an alias, so you can use the commands from anywhere on the server.

nano .bashrc #It is normally located in the user folder
#solr alias
alias solr="/etc/init.d/solr" 

Exit and test with "solr start", "solr stop" or "solr restart" (kill and restart your shell to make the alias take effect).

If you want Solr to start when the server boots, add it to the boot sequence with:

update-rc.d solr defaults  

Now we are done on the server level. Let's get to our Drupal instance. Check out admin/config/search/search_api; you'll see that we need an index and a server.
First we create a server for our Solr instance at /admin/config/search/search_api/add_server. Fill in the location of our Solr, which is http://localhost:8983/solr. Feel free to set up authentication, but that will not be treated in this post.
Now create an index that uses the Solr server we created. In the fields tab, select all the fields you want in the index. These fields will be used to perform the search on. Make sure you have the longitude/latitude field enabled (it's provided by the location module), because these coordinates will be used for the radius search.

Create a content type, and on the admin/structure/types/manage/[content-type]/edit page enable the location functionality. Do not use this together with a location CCK field; it won't work.

Administer the location settings at /admin/config/content/location. Enable "Use a Google Map to set latitude and longitude". Set up a Google Maps API key to allow this to work (the link is on the page).

Administer the map settings at admin/config/services/gmap and at admin/config/services/gmap_location. A macro generator comes with the gmap module, so turn that module on to generate your own macro to use for your maps.

Now create a couple of nodes. Set their locations not too far from each other so we can test easily.

We are almost there. To create a search page we'll need views, so create a view with the search index as base table. Add a filter called "Search: Radius" and expose it. Fill in the exposed settings: the coordinates of the center of the circle and the default radius. Do NOT select gmap as the view display; it should be an HTML list. Change the type of the exposed form style to Search API Location. See image:

In the Search API Location exposed form style options, set the following string as GMap macro: [gmap align=Center |zoom=9 |center=52.0904,5.1004 |width=600px |height=500px |control=Small |type=Map]

Now you should have something like this:

To see it working on a live demo:

http://ygerasimov.com/geo/search_location

Thanks to http://ygerasimov.com for contributing the module and the precompiled package.

Aug 08 2011
Aug 08

In Drupal 7 we have something called contextual links. It is that little wheel you see when you hover over blocks, allowing you to edit them in place. It is a great usability improvement, but it is not always clear how to implement them.

The contextual links functionality used to be a contrib module in D6; now it is a core module.

Reusing already defined local tasks on a custom element

For example, on a recent project we have a listing of links by theme. The links are grouped by a theme node, and we use a link field to hold the links to be printed. Now we would like contextual links on our little block, so we can edit the node and change our links. Here is how to do it:

<?php
// We have a render array defined in a custom module.
$block['#theme_wrappers'] = array('thema_block');
$block['title'] = l($node->title, $url);
$block['class'] = "themablock themablock-" . $block_count . " num-" . $block_in_row;
$block['more'] = theme('more_link', array('title' => 'Read more', 'url' => $url));
// The contextual links we place on our element.
$block['#contextual_links']['thema_blocks'] = array('node', array($node->nid));

print render($block);
?>

Let me explain what happens. All the links with type MENU_LOCAL_TASK (as defined in their respective hook_menu() implementations) and context MENU_CONTEXT_INLINE underneath the path node are fetched. The simple entry in the #contextual_links array will fetch all the node local tasks.
Look at how the node module defines node/%node/edit and node/%node/delete and it will be clear.

<?php
$items['node/%node/edit'] = array(
  'title' => 'Edit',
  'page callback' => 'node_page_edit',
  'page arguments' => array(1),
  'access callback' => 'node_access',
  'access arguments' => array('update', 1),
  'weight' => 0,
  'type' => MENU_LOCAL_TASK,
  'context' => MENU_CONTEXT_PAGE | MENU_CONTEXT_INLINE,
  'file' => 'node.pages.inc',
);
$items['node/%node/delete'] = array(
  'title' => 'Delete',
  'page callback' => 'drupal_get_form',
  'page arguments' => array('node_delete_confirm', 1),
  'access callback' => 'node_access',
  'access arguments' => array('delete', 1),
  'weight' => 1,
  'type' => MENU_LOCAL_TASK,
  'context' => MENU_CONTEXT_INLINE,
  'file' => 'node.pages.inc',
);
?>

This means you cannot fetch just any link defined in hook_menu(); only local tasks with an inline context will work as contextual links.

Next we need a hook_theme() implementation for our element's theme wrapper.

<?php
/**
 * Implements hook_theme().
 */
function glue_theme() {
  $items = array(
    'thema_block' => array(
      'render element' => 'element',
      'template' => 'tpl/thema_block',
    ),
  );

  return $items;
}
?>

In a preprocess file we can assign our variables.

<?php
/**
 * Implements preprocess_thema_block().
 */
function glue_preprocess_thema_block(&$variables) {
  $variables['classes_array'][] = $variables['element']['class'];
  $variables['title'] = $variables['element']['title'];
  $variables['content'] = $variables['element']['#children'];
  $variables['more'] = $variables['element']['more'];
}
?>

Now the only thing we need is a little template file to print it all.

<div class="<?php print $classes; ?>" <?php print $attributes; ?>>
  <?php print render($title_prefix); ?>
  <h2 <?php print $title_attributes; ?>><?php print $title; ?></h2>
  <div class="content"<?php print $content_attributes; ?>>
    <?php print $content; ?>
  </div>
  <?php print $more; ?>
  <?php print render($title_suffix); ?>
</div>

Here you can see we printed $classes, which will contain the contextual link classes. The contextual links are filled in by the render function. The actual links you'll find in $title_prefix, also put in place by the render function. By rendering that array, the HTML for the links is printed. The jQuery added by the contextual_links module transforms all elements with the correct classes into the little wheel you can click.

Your own contextual links

Now if we want our own contextual links, we need to implement hook_menu() and define our own items as local tasks with an inline context. Here is an example found on the "adding new contextual links" page at http://drupal.org/node/1089922.

In the example we add our links in hook_menu(), and with hook_block_view_alter() we change the render array of the blocks and add our links to it.

<?php
// An example contextual link menu item.
$items['contextual/%/information'] = array(
  'title' => 'Block information',
  'type' => MENU_LOCAL_ACTION,
  'context' => MENU_CONTEXT_INLINE,
  'page callback' => 'contextual_example_page',
  'page arguments' => array(1),
  'access callback' => TRUE,
);
// To use local task menu items, there must be a parent page.
$items['contextual'] = array(
  'title' => 'The contextual example page',
  'page callback' => 'contextual_example_page',
  'page arguments' => array(1),
  'access callback' => TRUE,
);
?>

<?php
/**
 * Implements hook_block_view_alter().
 */
function contextual_example_block_view_alter(&$data, $block) {
  // Contextual links can be added as a renderable element to the content of
  // a render array. We check if the block has content, and if so add a
  // contextual link to it.
  if (isset($data['content']) && is_array($data['content'])) {
    $contextual_links = array(
      'contextual',
      array($block->module),
    );
    $data['content']['#contextual_links']['contextual_example'] = $contextual_links;
  }
}
?>

As you can see, it works perfectly. The path in

<?php
$contextual_links = array('contextual', array($block->module));
?>

points to what we defined in hook_menu(): it fetches everything underneath the path 'contextual', and $block->module is the argument passed.

If we wanted to add these links to our custom element from the previous example, the only thing needed would be to add them to the array.

<?php
// We have a render array defined in a custom module.
$block['#theme_wrappers'] = array('thema_block');
$block['title'] = l($node->title, $url);
$block['class'] = "themablock themablock-" . $block_count . " num-" . $block_in_row;
$block['more'] = theme('more_link', array('title' => 'Read more', 'url' => $url));
// The contextual links we place on our element.
$block['#contextual_links']['thema_blocks'] = array('node', array($node->nid));

$block['#contextual_links']['whatever'] = array('contextual', array($something_usefull));
// In $something_usefull you'll need to put something so your
// contextual/%/information callback knows what to do in the given context.

print render($block);
?>

Altering

Yet another method of adding contextual links is the alter method. This is taken from the API page:

<?php
function hook_menu_contextual_links_alter(&$links, $router_item, $root_path) {
  // Add a link to all contextual links for nodes.
  if ($root_path == 'node/%') {
    $links['foo'] = array(
      'title' => t('Do fu'),
      'href' => 'foo/do',
      'localized_options' => array(
        'query' => array(
          'foo' => 'bar',
        ),
      ),
    );
  }
}
?>

Of course this will only work for existing paths on existing elements.

On views rows

We can also add contextual links to views rows. In our first example we want to add a contextual link to the slides of our slideshow view. Here is what we did:

<?php
/**
 * Contextual links maker.
 */
function glue_make_contextual_links($output, $nid) {
  $render_array = array(
    'children' => $output,
    '#theme_wrappers' => array('contextual_container'),
    '#contextual_links' => array(
      'glue' => array('node', array($nid)),
    ),
  );
  return render($render_array);
}

/**
 * Adds contextual links to views templates.
 */
function glue_preprocess_views_view_field(&$vars) {
  if (isset($vars['field']->field_info['field_name']) && $vars['field']->field_info['field_name'] == 'field_slide_image') {
    $vars['output'] = glue_make_contextual_links($vars['output'], $vars['row']->nid);
  }
}
?>

We implemented the views_view_field preprocess hook and wrapped the contextual links around the field we are displaying in the interface. Since each slide in the slideshow is a node, we can use the nid as the argument to call up the correct contextual links, allowing us to edit or delete the slide shown.

To be complete, here are the hook_theme() implementation and the preprocess function:

<?php
/**
 * Implements hook_theme().
 */
function glue_theme() {
  $items = array(
    'contextual_container' => array(
      'render element' => 'element',
      'template' => 'tpl/contextual_container',
    ),
  );

  return $items;
}

/**
 * Implements hook_preprocess_contextual_container().
 */
function glue_preprocess_contextual_container(&$variables) {
  $variables['content'] = $variables['element']['children'];
}
?>

And the template file:

<div class="<?php print $classes; ?>" <?php print $attributes; ?>>
  <div <?php print $content_attributes; ?>>
    <?php print $content; ?>
  </div>
  <div class="custom-contextual-links">
    <?php print render($title_suffix); ?>
  </div>
</div>

In our next example we did it differently. We have a table and we want to add a contextual link field, so we added a node edit link field to our view's table display. In the template of that field we've put this:

<div class="contextual-links-region">
  <div class="contextual-links-wrapper">
    <ul class="contextual-links">
      <li>
        <?php print $output; ?>
      </li>
    </ul>
  </div>
</div>

This one is less clean, because you can only add one item like this. In theory it would be possible to add it to the table row, but that would mean adapting the jQuery from contextual links to also target tr elements; in core it only targets div elements, and putting divs around tr elements is not valid HTML.

On models

On our site we also use the model module, which provides "a container entity". It is an entity with just a title, and it's fieldable. Instead of implementing lots of hooks yourself, you can take this module and have a working entity right away (http://drupal.org/project/model). We use the model entity to store information that doesn't require the extras a node offers, like workflow, authoring, etc. We just want to store stuff in fields. Perfect for storing some header images and the paths they need to be displayed on. Using this header_image bundle we can show a different header on the specified paths. Now wouldn't it be great to have contextual links on it? This is how we did it:

<?php
function glue_menu() {
  $items['admin/content/models/model/%/add'] = array(
    'title' => 'Add',
    'type' => MENU_LOCAL_ACTION,
    'context' => MENU_CONTEXT_INLINE,
    'page callback' => 'glue_model_add_type',
    'page arguments' => array(4),
    'access callback' => TRUE,
  );

  return $items;
}

/**
 * Goto the adding page for the model.
 */
function glue_model_add_type($mid) {
  $destination = drupal_get_destination();
  $query = db_select('model', 'm');
  $result = $query
    ->condition('m.model_id', $mid, '=')
    ->fields('m', array('type'))
    ->execute()
    ->fetch();

  $params = array(
    'query' => array(
      'destination' => $destination['destination'],
    ),
  );
  // Where to go next.
  $_GET['destination'] = url('admin/content/models/add/' . $result->type, $params);
  drupal_goto('admin/content/models/add/' . $result->type, $params);
}

function create_logo() {
  $model = model_load($item->model_id);
  $logo = field_view_field('model', $model, 'field_logo_logo', 'full');
  $logo[0]['#image_style'] = 'logo_style';
  $logo_and_link = array(
    '#type' => 'link',
    '#title' => render($logo[0]),
    '#href' => '',
    '#options' => array('html' => TRUE, 'attributes' => array('class' => $css_class)),
    '#contextual_links' => array(
      'logo' => array('admin/content/models/model', array($model->model_id)),
    ),
  );
  if (isset($model->field_logo_link['und'][0]['url'])) {
    $logo_and_link['#href'] = $model->field_logo_link['und'][0]['url'];
  }
  return $logo_and_link;
}
?>

We added a menu callback to register a local action to add each model. In the callback we query the type and redirect to the add page. This way, when we create a render array, not only the edit and delete contextual links but also the add link will be present.

Finally, the trick of putting node/add/%type in a contextual link

The same trick we used with the models works with nodes:

<?php
function glue_menu() {
  $items['node/%/add'] = array(
    'title' => 'Add',
    'type' => MENU_LOCAL_ACTION,
    'context' => MENU_CONTEXT_INLINE,
    'page callback' => 'glue_node_add_type',
    'page arguments' => array(1),
    'access callback' => TRUE,
  );

  return $items;
}

/**
 * Goto the adding page for the node.
 */
function glue_node_add_type($nid) {
  $destination = drupal_get_destination();
  $query = db_select('node', 'n');
  $result = $query
    ->condition('n.nid', $nid, '=')
    ->fields('n', array('type'))
    ->execute()
    ->fetch();

  $params = array(
    'query' => array(
      'destination' => $destination['destination'],
    ),
  );
  // Where to go next.
  $_GET['destination'] = url('node/add/' . $result->type, $params);
  drupal_goto('node/add/' . $result->type, $params);
}
?>

The slides are nodes, so in the screenshot above you can see "Toevoegen", which means "Add" in Dutch. This was produced by the last piece of code.

I'm sure there are lots of other methods to add contextual links, so if you have any, post them in the comments.

Apr 06 2011
Apr 06

This post will explain how to automate installing Drupal on a development environment using a stack installer script.

We used the script to speed up a Drupal training. With the script and the install profile we could set up a development workstation very fast, leaving us more time to look at the important stuff.

It also gave us a chance to explain the power of Drupal distributions. The fact that Drupal can be installed as a different product using an install profile gives it an edge over competing open-source CMSes. This can be interesting for bigger (enterprise) clients who want to develop products on top of Drupal using install profiles. More about distributions and install profiles can be found at http://drupal.org/node/326175 and, to follow the ecosystem, http://www.drupaldistrowatch.com.

Installing

We assumed Ubuntu was already installed. If not, installing the operating system is trivial: go to the Ubuntu website (http://www.ubuntu.com/desktop/get-ubuntu/download), download and install it.

The install script configures the LAMP stack for Drupal, installs APC, drush and drush make, and sets up a database for our Drupal. After this we launch the make file, which downloads all the code our Drupal needs. The script then configures the Drupal file system, sets up the permissions and copies the Drupal install profile and resources.

Execute the script:

./mbs_drupal_installer.sh drupal_instance_name mysql_existing_power_user mysql_power_user_password

#!/bin/bash
 
color_echo()
{
        text_reverse_bold="$(tput rev) $(tput bold)"
        text_normal="$(tput sgr0)"
 
        echo "${text_reverse_bold}$*${text_normal}"
}
 
if [ $UID != 0 ]
then
        echo "This script must be run by root".
        exit 1
fi
 
if [ $# != 3 ]
then
        echo "Usage  : $0 drupal_instance_name mysql_existing_power_user mysql_power_user_password"
        echo "Example: $0 mywebsite root \"\""
        echo "Example: $0 mywebsite mysqlpoweruser secretpassword"
        exit 2
fi
 
home_dir=$(dirname $0)
bin_dir=$home_dir/bin
 
drupal_instance_name=$1
mysql_existing_power_user=$2
mysql_power_user_password=$3
 
 
color_echo "Starting Drupal install..."
 
color_echo "Installing Debian/Ubuntu packages..."
apt-get --yes install apache2 php5 php-pear php5-dev php5-gd mysql-server-5.0 php5-mysql mysql-client wget curl
 
color_echo "Setting up the Apache mod_rewrite for Drupal clean urls..."
a2enmod rewrite
 
color_echo "Setting up the Apache mod_expires for Apache Cache-Control directive..."
a2enmod expires
 
color_echo "Setting up the Apache mod_deflate to save bandwidth..."
a2enmod deflate
sed -i 's|DEFLATE text/html text/plain text/xml|DEFLATE text/html text/plain text/xml text/css text/javascript application/x-javascript|' /etc/apache2/mods-available/deflate.conf
 
 
color_echo "Adding PEAR package: progress bars on upload..."
pecl install uploadprogress
sed -i '/; extension_dir directive above/ a\
extension=uploadprogress.so' /etc/php5/apache2/php.ini
 
color_echo "Installing APC php opcode cache..."
pecl install apc
sed -i '/; extension_dir directive above/ a\
extension=apc.so' /etc/php5/apache2/php.ini
 
 
#sed -i 's/query_cache_limit       = 1M/query_cache_limit       = 1M\
#query_cache_type        = 1/' /etc/mysql/my.cnf
#echo "Reloading mysql..."
#/etc/init.d/mysql force-reload
 
color_echo "Reloading Apache..."
/etc/init.d/apache2 force-reload
 
drush_extract_dir=/opt
drush_install_dir=$drush_extract_dir/drush
color_echo "Installing drush in $drush_install_dir ..."
resources_dir=$home_dir/resources
tar xvf $resources_dir/drush-6.x-3.3.tar.gz -C $drush_extract_dir
cp $resources_dir/Console_Table-1.1.3/Table.php $drush_install_dir/includes/table.inc
 
#Installing drush make
drush_make_extract_dir=~/.drush
mkdir $drush_make_extract_dir
tar xvf $resources_dir/drush_make-6.x-2.2.tar.gz -C $drush_make_extract_dir
 
color_echo "Creating the MySQL database for drupal on localhost ..."
$bin_dir/create_database.sh $drupal_instance_name $mysql_existing_power_user $mysql_power_user_password
 
drupal_path="/var/www/$drupal_instance_name"
color_echo "Installing drupal in $drupal_path ..."
 
color_echo "Executing make file..."
/opt/drush/drush make ./multimediabs.make $drupal_path
 
color_echo "Creating additional Drupal directories and files..."
mkdir $drupal_path/profiles/multimediabs
mkdir $drupal_path/sites/all/themes
mkdir $drupal_path/sites/all/modules/custom
mkdir $drupal_path/sites/all/modules/contrib_patched
touch $drupal_path/sites/all/modules/contrib_patched/patches.txt
mkdir $drupal_path/sites/default/files
mkdir $drupal_path/sites/default/tmp
 
color_echo "Copying Drupal profile and installer translation files..."
cp ./multimediabs.profile $drupal_path/profiles/multimediabs/
cp -R $resources_dir/translations $drupal_path/profiles/multimediabs/
 
color_echo "Copying and completing the Drupal settings file..."
cp $drupal_path/sites/default/default.settings.php $drupal_path/sites/default/settings.php
cat $resources_dir/settings_snippet.php >> $drupal_path/sites/default/settings.php
 
color_echo "Copying jquery.ui to module folder..." 
cp -R $resources_dir/jquery.ui $drupal_path/sites/all/modules/contrib/jquery_ui/jquery.ui
 
color_echo "Setting the work files and directories as writable..." 
chmod 777 $drupal_path/sites/default/files
chmod 777 $drupal_path/sites/default/tmp
chmod 777 $drupal_path/sites/default/settings.php
 
color_echo "Copying the drush config file..."
cp  $resources_dir/drushrc.php $drupal_path/
 
color_echo "Restarting apache..."
apachectl restart
 
#color_echo "Installing xhprof"
 
#pecl download xhprof-0.9.2
#tar -xvf xhprof-0.9.2.tgz -C /var/tmp
#cd /var/tmp/xhprof-0.9.2/extension
#phpize
#./configure
#make
#make install
#make test
 
#cp -R /build/buildd/php5-5.3.3/pear-build-download/xhprof-0.9.2/xhprof_html /var/www/xhprof
#ln -s /build/buildd/php5-5.3.3/pear-build-download/xhprof-0.9.2/xhprof_html /var/www/xhprof
#mkdir /var/tmp/xhprof
#chmod 777 /var/tmp/xhprof
 
color_echo "*) creating an Apache virtual host for $drupal_instance_name with path $drupal_path"
cp $resources_dir/vhost /etc/apache2/sites-available/$drupal_instance_name
sed -i "s/multimediabs/$drupal_instance_name/g" /etc/apache2/sites-available/$drupal_instance_name
 
color_echo
color_echo "To complete the installation you must:"
color_echo
color_echo '*) add the drush command to the PATH:'
color_echo "  export PATH=$drush_install_dir:\$PATH"
color_echo
color_echo '*) Change your error settings in php.ini to : error_reporting = E_ALL & ~E_DEPRECATED & ~E_NOTICE'
color_echo
color_echo "*) Create an entry in /etc/hosts : 127.0.0.1      $drupal_instance_name"
color_echo
color_echo "*) Update the virtual host file /etc/apache2/sites-available/$drupal_instance_name"
color_echo "create a symlink ln -s /etc/apache2/sites-available/$drupal_instance_name /etc/apache2/sites-enabled/$drupal_instance_name"
color_echo "restart apache with : sudo apachectl restart"
color_echo
color_echo "Open your browser, go to http://$drupal_instance_name and start the 'multimediabs' pressflow install profile"
color_echo
color_echo "You can then add code and modules in the Drupal instance directory in $drupal_path ."
#color_echo
#color_echo "Add this to php.ini to make xhprof run extension=xhprof.so and xhprof.output_dir=\"/var/tmp/xhprof\" and restart server"

The make file.

; $Id$
;
; ----------------
; Multimediabs Make file
; ----------------
 
 
; Core version
; ------------
; Each makefile should begin by declaring the core version of Drupal that all
; projects should be compatible with.
 
core = 6.x
 
; API version
; ------------
; Every makefile needs to declare its Drush Make API version. This version of
; drush make uses API version `2`.
 
api = 2
 
; Core project
; ------------
; In order for your makefile to generate a full Drupal site, you must include
; a core project. This is usually Drupal core, but you can also specify
; alternative core projects like Pressflow. Note that makefiles included with
; install profiles *should not* include a core project.
 
; Use Pressflow instead of Drupal core:
projects[pressflow][type] = "core"
projects[pressflow][download][type] = "get"
projects[pressflow][download][url] = "http://launchpad.net/pressflow/6.x/6.20.97/+download/pressflow-6.20.97.tar.gz"
 
; Modules
; --------
projects[admin_menu][subdir] = contrib
projects[vertical_tabs][subdir] = contrib
 
projects[cck][subdir] = contrib
projects[filefield][subdir] = contrib
projects[imagefield][subdir] = contrib
projects[date][subdir] = contrib
projects[jquery_ui][subdir] = contrib
projects[imageapi][subdir] = contrib
projects[imagecache][subdir] = contrib
 
projects[views][subdir] = contrib
 
projects[features][subdir] = contrib
projects[diff][subdir] = contrib
 
projects[pathauto][subdir] = contrib
projects[token][subdir] = contrib
 
projects[i18n][subdir] = contrib
projects[l10n_update][subdir] = contrib
projects[l10n_client][subdir] = contrib
 
 
;Development modules
projects[devel][subdir] = contrib
projects[coder][subdir] = contrib
projects[devel_themer][subdir] = contrib
projects[schema][subdir] = contrib
projects[install_profile_api][subdir] = contrib
projects[update_api][subdir] = contrib
projects[module_builder][subdir] = contrib
 
; Themes
; --------
 
 
; Libraries
; ---------

Now you still need to set your drush PATH, the error reporting settings in php.ini, an entry in the hosts file and a vhost. This is not done automatically, because the dev station may be used for other projects and we do not want to interfere with those settings. Note also that the drush and drush make versions are hard-coded in the script. This is to make sure everyone has the same version of everything, so we would not waste training time on tracing version-specific problems.

After you have set up everything necessary, open the browser and go to your Drupal site. There you can select this simple install profile, which allows you to select the language.

<?php
function multimediabs_profile_modules() {
  return array(
    // Core modules.
    'menu', 'search', 'taxonomy', 'path', 'update', 'syslog', 'comment', 'locale', 'dblog',
    // CCK.
    'content', 'filefield', 'text', 'imagefield', 'date_api', 'date', 'date_popup', 'jquery_ui',
    // Imagecache.
    'imageapi', 'imageapi_gd', 'imagecache', 'imagecache_ui',
    // Pathauto.
    'pathauto', 'token',
    // Views.
    'views', 'views_ui',
    // Admin improvements.
    'admin_menu', 'vertical_tabs',

    // IPP.
    'features', 'diff',

    // Languages.
    'i18n', 'l10n_update', 'l10n_client',

    // Devel tools.
    'coder', 'schema', 'install_profile_api', 'update_api', 'module_builder',
  );
}

function multimediabs_profile_details() {
  return array(
    'name' => 'Multimediabs pressflow',
    'description' => 'Installation drupal multimediabs Tours.',
  );
}

function multimediabs_profile_task_list() {
  $tasks = array();

  if (_l10n_install_language_selected()) {
    $tasks['l10n-install-batch'] = st('Download and import translations');
  }

  return $tasks;
}

function multimediabs_profile_tasks(&$task, $url) {
  global $install_locale;

  install_include(multimediabs_profile_modules());

  if ($task == 'profile') {
    // Perform the default profile install tasks.
    include_once('profiles/default/default.profile');
    default_profile_tasks($task, $url);

    // Administration theme.
    variable_set('admin_theme', 'garland');
    variable_set('node_admin_theme', TRUE);
    // User registration.
    variable_set('user_register', FALSE);
    // Hide all Garland blocks.
    db_query("UPDATE {blocks} SET status = 0 WHERE theme = 'garland'");
    // Image quality.
    variable_set('image_jpeg_quality', 100);
    variable_set('imageapi_jpeg_quality', 100);
    // Files.
    variable_set('file_directory_temp', 'sites/default/tmp');
    variable_set('file_directory_path', 'sites/default/files');
    // Date & time.
    variable_set('configurable_timezones', 0);
    variable_set('date_format_short', 'd/m/Y - H:i');
    variable_set('date_format_short_custom', 'd/m/Y - H:i');
    variable_set('date_format_media', 'D, d/m/Y - H:i');
    variable_set('date_format_media_custom', 'D, d/m/Y - H:i');
    variable_set('date_format_long', 'l, j F, Y - H:i');
    variable_set('date_format_long_custom', 'l, j F, Y - H:i');
    // Error reporting.
    variable_set('error_level', 0);
    // Roles.
    db_query("INSERT INTO {role} (name) VALUES ('%s')", 'site administrator');
    db_query("INSERT INTO {role} (name) VALUES ('%s')", 'editor');
    // Pathauto.
    variable_set('pathauto_node_pattern', '');
    variable_set('pathauto_taxonomy_pattern', '');
    variable_set('pathauto_user_pattern', '');
    variable_set('pathauto_ignore_words', '');
    // Permissions.
    $admin_permissions = array('access administration menu', 'administer blocks', 'use PHP for block visibility', 'administer menu', 'access content', 'administer nodes', 'revert revisions', 'view revisions', 'administer url aliases', 'create url aliases', 'search content', 'use advanced search', 'access administration pages', 'access site reports', 'administer taxonomy', 'access user profiles', 'administer permissions', 'administer users');
    $editor_permissions = array('access administration menu', 'administer menu', 'access content', 'administer nodes', 'revert revisions', 'view revisions', 'search content', 'use advanced search', 'access administration pages');
    _multimediabs_add_permissions(3, $admin_permissions);
    _multimediabs_add_permissions(4, $editor_permissions);
    // Input format permissions.
    db_query("UPDATE {filter_formats} SET roles = ',4,3,' WHERE format IN (2, 3)");
    // Hide module descriptions for admin.
    db_query("UPDATE {users} SET data = '%s' WHERE uid = 1", serialize(array('admin_compact_mode' => TRUE)));
    // Update the menu router information.
    menu_rebuild();
    // Activate devel and set xhprof.
    drupal_install_modules(array('devel'));

    /*
    variable_set('devel_xhprof_enabled', 1);
    variable_set('devel_xhprof_directory', '/build/buildd/php5-5.3.3/pear-build-download/xhprof-0.9.2');
    variable_set('devel_xhprof_url', 'http://localhost/xhprof');
    */

    // Move forward to our install batch.
    $task = 'l10n-install';
  }

  // Download and import translations if needed.
  if ($task == 'l10n-install') {
    if (_l10n_install_language_selected()) {
      $history = l10n_update_get_history();
      module_load_include('check.inc', 'l10n_update');
      $available = l10n_update_available_releases();
      $updates = l10n_update_build_updates($history, $available);

      module_load_include('batch.inc', 'l10n_update');
      $updates = _l10n_update_prepare_updates($updates, NULL, array());
      $batch = l10n_update_batch_multiple($updates, LOCALE_IMPORT_KEEP);
      // Overwrite batch finish callback, so we can modify install task too.
      $batch['finished'] = '_l10n_install_batch_finished';
      // Start a batch, switch to 'l10n-install-batch' task. We need to
      // set the variable here, because batch_process() redirects.
      variable_set('install_task', 'l10n-install-batch');
      batch_set($batch);
      batch_process($url, $url);
    }
  }

  if ($task == 'l10n-install-batch') {
    include_once 'includes/batch.inc';
    return _batch_page();
  }
}

function multimediabs_form_alter(&$form, $form_state, $form_id) {
  if ($form_id == 'install_configure') {
    $form['site_information']['site_name']['#default_value'] = 'MBS';
    $form['site_information']['site_mail']['#default_value'] = ini_get('sendmail_from') ? ini_get('sendmail_from') : '[email protected]';
    $form['admin_account']['account']['name']['#default_value'] = 'admin';
    $form['admin_account']['account']['mail']['#default_value'] = '[email protected]';
  }
}

function _multimediabsformat_set_roles($roles, $format_id) {
  $roles = implode(',', $roles);
  // Drupal core depends on the list of roles beginning and ending with commas.
  if (!empty($roles)) {
    $roles = ',' . $roles . ',';
  }
  db_query("UPDATE {filter_formats} SET roles = '%s' WHERE format = %d", $roles, $format_id);
}

function _multimediabs_add_permissions($rid, $perms) {
  // Retrieve the currently set permissions.
  $result = db_query("SELECT p.perm FROM {role} r INNER JOIN {permission} p ON p.rid = r.rid WHERE r.rid = %d ", $rid);
  $existing_perms = array();
  while ($row = db_fetch_object($result)) {
    $existing_perms += explode(', ', $row->perm);
  }
  // If this role already has permissions, merge them with the new permissions being set.
  if (count($existing_perms) > 0) {
    $perms = array_unique(array_merge($perms, (array) $existing_perms));
  }
  // Update the permissions.
  db_query('DELETE FROM {permission} WHERE rid = %d', $rid);
  db_query("INSERT INTO {permission} (rid, perm) VALUES (%d, '%s')", $rid, implode(', ', $perms));
}

/**
 * Check whether we are installing in a language other than English.
 */
function _l10n_install_language_selected() {
  global $install_locale;
  return !empty($install_locale) && ($install_locale != 'en');
}

/**
 * Batch finish callback for l10n_install batches.
 */
function _l10n_install_batch_finished($success, $results) {
  if ($success) {
    variable_set('install_task', 'profile-finished');
  }
  // Invoke default batch finish function too.
  module_load_include('batch.inc', 'l10n_update');
  _l10n_update_batch_finished($success, $results);

  // Set up l10n_client and inform the admin about it.
  // @todo This message will not show up for some reason.
  global $user;
  variable_set('l10n_client_use_server', 1);
  variable_set('l10n_client_server', 'http://localize.drupal.org');
  drupal_set_message(t('Localization client is set up to share your translations with <a href="@localize">localize.drupal.org</a>. Each participating user should have a localize.drupal.org account and set up their API key on their user profile page. <a href="@edit-profile">Set up yours</a>.', array('@localize' => 'http://localize.drupal.org', '@edit-profile' => url('user/' . $user->uid . '/edit'))));

  // Set language defaults.
  variable_set('language_negotiation', 1);
  db_query("UPDATE {variable} SET value = '%s' WHERE name = '%s'", 'O:8:"stdClass":11:{s:8:"language";s:2:"en";s:4:"name";s:7:"English";s:6:"native";s:7:"English";s:9:"direction";s:1:"0";s:7:"enabled";i:1;s:7:"plurals";s:1:"0";s:7:"formula";s:0:"";s:6:"domain";s:0:"";s:6:"prefix";s:0:"";s:6:"weight";s:1:"0";s:10:"javascript";s:0:"";}', 'language_default');

  global $theme_key;
  $theme_key = 'garland';
  _block_rehash();
  install_set_block('user', '0', 'garland', 'left');
  install_set_block('locale', '0', 'garland', 'left');
}
?>

Using this script to automate the install is fast, but it is not the only solution.

Here are some other possibilities:

Drubuntu
You can get the script here (http://drupal.org/drubuntu). It works great. It installs lots of things (see the project page), but there isn't a lot of transparency: unless you read all the code to see what it actually does, where it puts everything, which versions, and so on, you don't really know what's going on. Note that it does quite a lot to your system; perhaps you don't want certain things, but there are no options available.

It has some powerful capabilities though. Besides what you find on the project page, it provides drush commands to set up a site and to create a sandbox. It looks like a great tool, but it is not well documented, so for beginners it might be a bit too complicated.

Quickstart
Yet another option is working in a VirtualBox. You can use http://drupal.org/project/quickstart. It has quite a bit installed (LAMP stack, drush, Eclipse, NetBeans, ...) and it is easy to use. When launching the browser you'll see everything you need. It works great, but it assumes you'll be working in a VirtualBox. I really like the philosophy behind it: you don't pollute your own machine, every project has its own environment, and it is easy to share a VirtualBox image with other developers. I use it daily for personal development.

Acquia
Of course there is Acquia Drupal, which also comes as a stack installer on all platforms. You can get it here (http://network.acquia.com/downloads).

Buildkit
Created by Development Seed, this is an install profile to create distributions and Drupal sites (http://drupal.org/project/buildkit).
It follows the same logic as the install profile we built: an easy, simple setup to get started fast. However, it is only an install profile; it has no server setup. (Note that it only supports D7.)

Conclusion

A lot has been done around automating Drupal installation; use it when possible to save yourself and others some time. Pick the one that suits your needs best.

For us, Drubuntu installed too much and was a bit too intrusive; Quickstart was no option because we had to use a machine already at our disposal; Acquia comes with Acquia Drupal, which we didn't want; and Buildkit was Drupal 7 while the training was given for D6. So we ended up creating this lightweight script that configures the LAMP stack and installs our localisable Pressflow Drupal.

RESOURCES: The script and its required resources are on GitHub (https://github.com/dominiquedc/drupal-stack-installer-mbs).

NEXT: In the next post I'll report how the two-day training using this installer went.

Mar 09 2011
Mar 09

This post will summarize the experience of introducing Drupal into an enterprise, and how we achieved the level of quality in Drupal development needed for an enterprise application.

It will cover:

  • Setup : The drupal 6 site and the integration with the other apps in the enterprise.
  • Level of quality
  • Testing : How we implemented tests using simpletest and selenium.
  • Staging : How we used features, hook_update (update_api) and deploy to stage our application in a dev/staging/prod release cycle and the problems we encountered.
  • Automation : How we used ant, shell scripts and drush to:
    • Automate post-install for staging.
    • Automate releasing versions and testing.
    • Automate initialize the project.
  • Development management : How we estimated and kept track of everything while managing this complex project.
  • Guarding performance : How we kept an eye on performance.
  • Security and maintenance.
  • Training developers : How we combined developing and training developers at the same time, resulting in Drupal being adopted in the organisation.
  • Conclusion

Setup

The extranet we are building is a window on all internal services. The goal is to aggregate the data from these internal services and let users/clients interact with it using Drupal.
The elements:

  • We have the web/php machine which harbours the drupal installation.
  • A set of webservices to query an internal server, which contains all sorts of info about the organisation, clients, products, ... The goal is that clients can review their hosted products.
  • We have a service that holds all ticketing data, queried by a webservice to retrieve tickets. The goal is that clients can view tickets concerning them.
  • There is a shell script that retrieves text and images from an internal wiki to display in the Drupal front end.
  • All authentication to Drupal and all the other services is done by an SSO (Shibboleth), which is in its turn connected to an LDAP containing all info about clients. Everything is over HTTPS, because pages contain sensitive information about the clients' hosted services.
  • Roles and permissions are assigned by custom code, based on external rules retrieved from the LDAP.
  • Some other internal services, which are really old (1999 old), generate HTML and were exposed using iframes. To construct the URLs we needed to retrieve elements using webservices.

This is version 1.0; integration with other services like the intranet, business logic modules, ... is still being planned.

As you can see, Drupal would receive a central place in the enterprise, integrating tightly with all existing apps. Some newer apps were designed with a webservice model in mind, which allows other apps to connect with them easily, which is great. On the other hand, some legacy apps weren't that friendly and required less sexy solutions.

This immediately showed an advantage Drupal could have in the enterprise. Imagine, for example, that Drupal was used to create the wiki: the services module could then be used to let the main Drupal communicate with the wiki. An easy and clean implementation. It would also mean your teams would only need to know Drupal and not ten million other apps. A lesson was learned, and the client's intranet that was scheduled for review will now be ported to Drupal Commons.

Level of quality

All components in the chain are staged from local development -> integration -> preproduction -> production.

We worked in a scrum system with a demo each week.

Before a story could move to done and be deployed, it had to meet a "fini fini", which means it has to pass a list of checks to assure it is 100% done.

  • Are permissions configured?
  • Is everything translated to French?
  • Is all work documented, especially patches? (People change positions all the time in large organisations, so good docs are essential.)
  • Is all work tested using SimpleTest and Selenium?
  • Are all automation scripts up to date? (post-install, initialisation, test and release scripts)
  • Are site conventions met? (for example, expose new admin links to a custom admin menu)
  • Is performance acceptable? (profile with XHProf)
  • Is security acceptable? (external team)
  • Are all configurations and translations automatically exportable to the other environments?
  • Can you demo the story to the client on preproduction?

Clearly, for most organisations this level of quality is simply unaffordable, but for an enterprise app this is the standard.

Testing

SimpleTest
SimpleTest is the standard way to test functionality on a Drupal site. Here you can find a very good article that explains what SimpleTest does [http://www.comm-press.de/en/blog/simple-test-drupal].

We used the shib_auth module to wire Drupal's authentication system to the session provided by Shibboleth. This also means that almost all of the standard SimpleTests for Drupal fail because of this authentication, so we decided to ignore the tests provided by the standard Drupal modules and focused on our own tests.

We decided to use SimpleTest to do all of the standard unit testing: atomic tests covering our APIs and webservices. Here is an excellent book that explains the principles of unit testing [http://www.amazon.com/Pragmatic-Unit-Testing-Java-JUnit/dp/0974514012]. The client already had broad experience with unit testing; the team was used to coding truly test-driven.
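As an illustration, here is a minimal sketch of such an atomic test, assuming the Drupal 6 SimpleTest backport (tests extend DrupalWebTestCase); the glue module and its glue_ticket_parse() helper are hypothetical:

<?php
// Minimal sketch, assuming the Drupal 6 SimpleTest backport; the glue
// module and its glue_ticket_parse() helper are hypothetical.
class GlueWebserviceTestCase extends DrupalWebTestCase {
  function getInfo() {
    return array(
      'name' => t('Glue webservice'),
      'description' => t('Atomic tests for our webservice wrapper.'),
      'group' => t('Glue'),
    );
  }

  function setUp() {
    // Installs a sandbox Drupal with our module enabled.
    parent::setUp('glue');
  }

  function testTicketMapping() {
    // No page requests here, just our own API.
    $ticket = glue_ticket_parse('<ticket><id>42</id></ticket>');
    $this->assertEqual($ticket['id'], 42, t('Ticket id is extracted.'));
  }
}
?>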

We immediately had some trouble with SimpleTest. Because SimpleTest installs a fresh Drupal for each and every test, it is very slow. Even with the optimisations proposed here [http://drupal.org/node/466972] it is still slow; too slow to continuously run tests. I found an article on drupal.org [http://drupal.org/node/758662]: apparently in Drupal 7 it is possible to do unit testing with SimpleTest without sandboxing the whole Drupal site on every test. You can attach to an instance and start running tests. This obviously goes a lot faster, but you have to take into account that you need to write teardown functions to clean up your database. Unfortunately this feature is not available in Drupal 6 and we did not have the time to backport it. So we decided to run tests only after completion of the functionality. Not very test-driven development, but we still had our code tested at the end.

We wrote tests for every functionality, trying to keep the tests as short as possible, and we created a test file for each module we built. We also worked with Ant, mainly because the client had a lot of experience with it and knew how to feed the output of Ant into a Hudson server. SimpleTest comes with a file runTests.sh which can be launched by Ant. We wrapped it so we could launch all tests, a test group, or a specific test class; the script (since removed from this post) combines detecting, executing and cleaning. Of course we used drush where possible.
Now we only had Hudson to set up. On every commit the code base on Hudson is updated and the tests are launched.

Continuous integration with Drupal is possible. Note that in Drupal 7 the core contains the test framework, so you don't have to patch core.

Selenium
We chose Selenium to do our functional tests. Selenium is a great tool which requires some expertise to use effectively; luckily the client had that expertise. We used the PEAR PHPUnit package to code tests for Selenium. Here is an example of a test:
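The original example file has since been removed from this post, but a minimal sketch, assuming PHPUnit's PEAR Selenium extension and a hypothetical page on the extranet, could look like this:

<?php
// Minimal sketch, assuming PHPUnit's PEAR Selenium extension
// (PHPUnit_Extensions_SeleniumTestCase); the URL and page content
// are hypothetical.
require_once 'PHPUnit/Extensions/SeleniumTestCase.php';

class ExtranetFrontPageTest extends PHPUnit_Extensions_SeleniumTestCase {
  protected function setUp() {
    $this->setBrowser('*firefox');
    $this->setBrowserUrl('https://extranet.example.com/');
  }

  public function testTicketOverviewIsVisible() {
    // Assumes the Shibboleth session has already been established.
    $this->open('/tickets');
    $this->assertTextPresent('My tickets');
  }
}
?>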

It is pretty easy to code tests; you can even click your tests together using the Firefox Selenium plugin and export them to PHP.

With Ant we could launch selenium-server.jar, which runs our tests.

But we had some obstacles to cross.

  • We needed to make Selenium work with https. In Firefox you'll need to export certificates; in IE you can set a variable.
  • We needed to make it work in IE. Installing PHPUnit on Windows caused some pain, but we managed to do it.
  • We had to pass Shibboleth each time.

All obstacles were managed.
Note that testing with Selenium is very expensive: it is a pretty slow process to execute on multiple platforms/browsers, and IE sometimes needed a different approach, which took even more time. But you are practically sure your app will do what you want. To give an idea, it took 20% of dev time per story to write and complete the Selenium tests.

Staging

We made all configuration portable using a method which basically consists of features, deploy and hook_update; see the staging problem post: http://dominiquedecooman.com/blog/drupal-staging-problem

New to this is that we also contributed a patch with our functions to the update_api module (http://drupal.org/project/update_api), so we had a toolkit to use in hook_update.
Here is the kind of code that lived in our install file, where our hook updates reside.
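The original snippet was removed along with the rest of the scripts; as a minimal sketch, an update like this could enable a module and push a setting that features cannot export (the module name and the exact settings here are hypothetical):

<?php
// Minimal sketch of a hook_update_N() living in our install file;
// the module name and the exact settings are hypothetical.
function glue_update_6003() {
  $ret = array();
  // Enable a newly added contrib module on every environment.
  module_enable(array('token'));
  // Push configuration that cannot be exported as a feature.
  variable_set('site_frontpage', 'dashboard');
  return $ret;
}
?>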

Here we also touched on a pain point Drupal has, namely some poor core APIs. I think everyone is aware of this problem; Dries has set it as his first priority for Drupal 8 [http://buytaert.net/8-steps-for-drupal-8, step 1: create better separation between framework and product]. So I guess the problem is known. Since we had to code every change we would otherwise make through the UI, we encountered a lot of missing APIs.

For example, try to change a theme and configure its variables, or use the submit function to delete a role, ... I'm not saying it is impossible, but you really have to dig into the core functions, understand all the actions and assemble the needed ones to do what needs to be done. If you have a lot of expertise and know your way around the core code you'll manage, but this means everyone who wants a stageable site (and almost any enterprise client needs this) has to invest a lot of time in training people until they reach this level of expertise.

Better APIs would make Drupal easier to understand; the investment in training would decline and Drupal would be adopted more widely.
Better core APIs will make Drupal more usable and integratable from a developer's point of view.

Automation

Automate the post-install for staging
The organisation has a deployment tool which basically takes a tarball and deploys it on the server. After it has been deployed, a post-install script is run. Together with drush we managed to set up everything necessary to deploy Drupal. (Note that we made some adaptations to make drush render its output correctly.)

Automate releasing versions and testing.
Before a release could be created, we made it mandatory that all tests passed. After that happened, an svn version got tagged and a tarball was created.

Automate initialising the project.
Since we have a lot of tests and integrations with other apps (LDAP, internal config, wiki, ...), we created an init file that fills our Drupal site and the external apps with test data (users, content, ...). The result is an initialisation script that automates this task.

All the scripts are run from the terminal using Ant.

The use of bash scripts and drush made our lives much easier. It takes an investment to create them, but they are a huge timesaver in the long run.

For example, when a new developer needs to set up his workstation, he checks out a working copy, runs the init script and is ready to go. He codes. He runs his tests. He creates his release and svn is tagged. He deploys his tarball on the server, and the post-install script is called, executing the hook updates and deploying the config stored with the features module. Imagine doing all of this by hand, all the time.

Development management

To estimate the completion time of a complex project like this you have to look beyond the features that will appear on the screen. This is not always easy to explain. As a client/end user you don't see the tests, the scripts, the staging, the docs, the performance tweaks, the security updates, the platform installation, ... (hopefully) you only see an app that works all the time, thanks to all the work done behind the scenes.

To estimate each story we used roughly this key for how the time is spent. Note that all work was done by pair programming. (Hours are based on a 40h story.)

  • 35% development of the feature. (14h)
  • 30% writing and running tests + refactoring code accordingly (12h)
  • 10% deployment (exporting features, writing hook updates, deploying) (4h)
  • 5% scripts (2h)
  • 5% doc (2h)
  • 5% demo (2h)
  • 10% translation, perf, security, permissions, conventions (depending on the nature of the functionality) (4h)

To manage this project we had to cooperate with a lot of people in the organisation, and that is time that does not always show up at the end of the week. For example, to build a webservice you need to call some people in Paris to get the right URLs. This quickly adds up to some hours.

How did we not lose track of anything? We chose to work scrum. The scrum process protected us from being swamped in requests to introduce this and that, while staying agile at the same time. It allowed us to estimate correctly and develop at maximum productivity, creating functionality everyone agreed upon.

Guarding Performance

We were only responsible for performance at the application level; all performance tweaks at the server level were performed by the operations teams. For example, integrating memcached into Drupal is our task, installing the daemon on the server is not.

To keep an eye on performance we used the profiler XHProf, which can be used together with devel. Installation is easy (http://techportal.ibuildings.com/2009/12/01/profiling-with-xhprof/).
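For reference, XHProf can also be driven by hand, independent of devel; a minimal sketch (the library include path and the XHProf UI URL are assumptions for your own setup):

<?php
// Minimal raw XHProf sketch; the library include path and the UI URL
// are assumptions for your own setup.
xhprof_enable(XHPROF_FLAGS_CPU | XHPROF_FLAGS_MEMORY);

// ... the code under profile, e.g. a full Drupal page request ...

$data = xhprof_disable();
include_once '/usr/share/xhprof/xhprof_lib/utils/xhprof_runs.php';
$runs = new XHProfRuns_Default();
$run_id = $runs->save_run($data, 'drupal');
print "http://xhprof.example.com/index.php?run=$run_id&source=drupal\n";
?>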

Using this tool we monitored each page to see if we had any performance problems. For example, we detected that a request to a webservice was very slow. With XHProf this shows immediately: you can compare the actual CPU time used versus the total load time, which showed us that our app was in fact waiting for the other server to respond. We alerted the other team; they found a problem with one of their SQL servers and fixed it. XHProf is definitely a time saver.

Working with a profiler shows you how the app is structured, how the framework works, what is executed, ... It's something you can learn a lot from. XHProf, together with common sense, will make you think about performance all the time.

Security and maintenance

Security was a big deal in this project since the site contains sensitive data about clients and the organisation.
An audit will be performed by an external entity.
We were only responsible for security at the Drupal application level. On the server side there were https, SSO, an authentication proxy, ... but this was not our task.

We always checked, as part of our "fini fini", that all form, URL, ... input was sanitized so no SQL injection or XSS attacks could occur.
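As a trivial illustration of the two checks, using only core APIs (the query itself is hypothetical): query placeholders guard against SQL injection, and check_plain() on output guards against XSS.

<?php
// Trivial illustration, hypothetical query: placeholders guard
// against SQL injection, check_plain() guards output against XSS.
$nid = db_result(db_query("SELECT nid FROM {node} WHERE title = '%s'", $_GET['title']));
print check_plain($_GET['title']);
?>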

We always followed new security releases for modules and core. Modules that were patched were moved to a separate folder, and patches were documented so that upon a core/module release one could check whether patches exist and whether it is still necessary to apply them again.

I'm not a security expert, but double-checking input and output and following updates is a good basic strategy to prevent evil from happening. Audits by security experts are definitely necessary if you work with critical data.

Training developers

This was probably the biggest challenge. How do you train a team of already experienced PHP developers to use Drupal and deliver an app that lives up to the quality standards? How do you transfer knowledge effectively? How do you make the team work on its own?

The method we used was probably not the fastest, but I think in the long run it is the most effective. Instead of me developing everything, we immediately let the team members develop.

Since we were trying to get a product done at the same time, we did not take the approach of "a node is blabla, a block is blabla, ..." And I'm glad we didn't, because it would bore the hell out of everyone and after a month we wouldn't even have coded a module.

We posed the question: what do we want to do? For example, we want to retrieve data x from source y with a webservice and display it to user z with a role and permissions under conditions a, b, c. How do you do this in Drupal? To answer this question you touch a lot of concepts, modules and APIs.

So I architected the solution and then explained every concept needed to complete the functionality, from blocks to modules, hooks, APIs, ... Every time we encountered a concept I would explain it in detail and show how it fits as a piece of the puzzle. Of course, the real power of learning lies in repetition and exercise, so the theming system, for example, was explained probably three times. The entry point being: we want to render some output using some HTML, how do we do that?

The first few weeks they were still a bit in the dark, but as time progressed they saw how Drupal is used: creating functionality from existing code and integrating it using configuration and custom code. Soon the awesome power of Drupal became obvious to them. They learned to search drupal.org when in trouble, we posted issues, and we even contributed a little module (http://drupal.org/project/jquery_jstree).

They could see that this vast community was creating and improving code. Almost anything we wanted to do was provided by a module, along with ways to interact with it. It was clear that with an understanding of six areas one can do almost anything with Drupal. We would regularly measure ourselves against these six points, which we identified as:

  • Framework
    • Core + APIs
    • Contrib Modules + APIs
    • Managing code, configuration and content (features, deploy, hook_update, drush & use of scripts)
    • Combining existing modules
    • Best practices
  • Data
    • Internal data
    • database: data types (node, taxonomy, ...)
    • session
    • External data : Webservices
    • Interaction with users
  • Users
    • Authentication
    • Access models
    • Interaction with data
  • Quality
    • Performance
    • Tests
    • Security
  • UI
    • Information architecture by combining modules and using their APIs to integrate (views, cck, panels, block, taxonomy, contexts, ...)
    • Jquery
    • Templating
  • Problem solving
    • How to trace bugs
    • Tools (devel, logs, debugger, xhprof, ...)
    • Use Drupal.org, ask help from community

The reason Drupal has such a steep learning curve is that you need to know a lot of little things that are individually not that difficult to understand; the problem is that there are a lot of those little things, especially when you want to create an enterprise-level app.
At this level, getting up and running requires a lot of knowledge about a lot of concepts. For instance: how do you stage config and content? How do you create a test? ...

Once you have solved those questions, you come to creating your functionality. You get introduced to hundreds of concepts: blocks, nodes, taxonomy, views, cck, fields, formatters, features, rules, ...
To create a functionality you need to know which modules to use and how to configure them.
When you have achieved your first functionality using the existing concepts and modules, you will want to extend it using hooks and APIs in your own modules.

To use Drupal effectively you need to combine things, and to be able to do that you need to know those things exist. This is what makes the learning curve steep.

However, once you get to this level you notice anything is possible with Drupal.
In fact, the developers were so enthusiastic that they started wondering: why are we still building things from zero? Drupal gives us the head start we need. We were doing the same thing over and over again; why are we maintaining ten different apps? Now we can trust the framework and focus on developing the fun stuff instead of recreating the framework over and over again. They even started proposing to their chiefs to start new projects in Drupal.

Although Drupal isn't a magic solution (it is not click click click done; you still need to get work done), what it offers is not only a framework but an entire universe.

EDIT: I've been working on this post for a while, and in his keynote Dries spoke about the fact that the best ecosystem will win. I would certainly think so too. This experience made clear to me that this is what a client wants: one ecosystem that everyone loves working with.

Conclusion

For me it was a quality experience. I learned a lot, and I can only say Drupal is ready to deliver enterprise quality.

EDIT: After watching Dries' keynote this morning: even though Drupal 6 already does a good job and Drupal 7 will do even better, if the community can achieve the points mentioned by Dries, like staging config and content, serving data in multiple formats and improving the APIs, then Drupal 8 will truly be THE killer enterprise content app.

EDIT: Scripts have been removed.

Oct 23 2010
Oct 23

I've run into this one quite a few times. We have a live production environment, a staging environment, an integration environment and a local development environment. This classic software development chain goes upstream from production to dev and downstream from dev to production.

Upstream

The upstream process usually requires just the transfer of newly created content, users and user-generated settings, which is mostly done by transferring database dumps. However, on some projects the database is too big to transfer in one piece: when you have 3.5 million nodes in the database you don't want to download the entire thing.

Solutions
When you only want the user-generated settings from a production environment (for example http://drupal.org/project/homebox settings), a possible solution is the http://drupal.org/project/backup_migrate module. It allows you to select which tables you want to export, so you export only the tables containing your settings. Although this requires more advanced knowledge of how Drupal works (because you have to know which tables to export), it is an easily repeatable way because you can save the table selection. You can automate this module using cron, and it has some drush commands at your disposal, so it is a fast tool to update the upstream sites.
If you only want newly created content, Drupal settings or views moved upstream to dev, use the http://drupal.org/project/deploy module. This video explains it all: http://www.youtube.com/watch?v=7PjwT0HWHxw

Downstream

The downstream process that pushes new code and settings down the chain is a bit of a problem in Drupal.
There is the brutal way of transferring database dumps, which won't be a great plan when your site is constantly updated. A variation on this, as explained above, is to transfer only the tables containing settings using backup and migrate. But you'll run into trouble as soon as you have to start installing modules which act on content. A simple example would be a new field on a content type: this alters your content type table. Also, everything that is database-transfer related has poor rollback capabilities and poor change logging. Doing database transfers together with manual updates is a caveman update strategy.

To solve this problem more elegantly, let's install a system that makes use of three components.

Features

This module (http://drupal.org/project/features) has the power to export data structures which live in the database into code. That is great because you can track changes in your version control tool and revert them. This is truly the saviour module. BUT it doesn't support all data structures. It supports menus, permissions, content types, views, displays, rules, ... (check the project page for more).
Use features for the data structures it supports. Hopefully all data structures in Drupal will get supported, so that this is the only thing needed to solve the staging problem.
How does it work? Install it, create one or more features and export them; when you have updates in your data structures it will show them. Use drush fu [name-feature] to update them and commit them to your repository. When the svn up happens on the downstream environments, your site will use the new data structures.

Deploy

You know this one from above; use it if you have some content staged. A rollback system is not in place, so to roll back you manually delete the content it has created. The reason I don't use it to transfer settings or views or anything other than content is precisely that there is no rollback.

Hook update

Use http://api.drupal.org/api/function/hook_update_N/6 to do all changes that cannot be done by the former two. An example would be a custom or contrib data structure that lives in the database and can't be exported by features or deploy. If you want a rollback you'll have to write it yourself using yet another hook_update, which hopefully you won't need to do too often :)

We also use hook_update to do updates which are environment specific; for example, you don't want devel enabled on production.
In settings.php, set an environment variable. Since settings.php is different on each environment, this lets your Drupal site know which environment it is on. For example, on production in settings.php:

<?php
$conf['environment'] = 'production';
?>

Now in your hook_update you do:

<?php
function module_update_6001() {
  if (variable_get('environment', 'dev') == 'production') {
    module_disable(array('devel'));
  }

  return array();
}
?>

Conclusion

The key is to think every time you change the database and grow the habit of getting those changes into code. Using the above strategy will require more time, especially if more hook updates are required, and it will consume even more time if you need rollbacks in place. But it gives you more control.

So it comes down to a tradeoff between control and development time. It should be made clear to everyone involved, from client to project manager, developer and maintainer, that installing this workflow in your project consumes time.

Future

The real solution to this problem lies in core, where support is currently lacking. A core system registering all data structures, plus a way of marking content for staging, should be put in place. The system should be able to connect all environments and show the status of the content and structures. Based on that, releases could be planned, executed, rolled back, logged, ... And then of course all contrib modules contributing structures should integrate with this core system :)

For now that system does not seem to be coming soon, but I think features comes close for settings and data structures, and deploy comes close for content. I think we'll row with hook_update to close the gaps for a couple of years.

Aug 05 2010
Aug 05

By big imports I mean creating millions of nodes and indexing them for Apache Solr search. We needed to import 1 million records from file, and we needed to create 3 nodes for each record, one in each of our 3 languages. That gave us the task of creating scripts for creating, updating and deleting 3 million nodes. These nodes also had 3 taxonomy vocabularies and around 30 CCK fields, and we needed to join another table containing spatial coordinates for our geo field.
So saving one node like that is an expensive operation.
We got the time down to 72 hours to import all of these records. The process was fully automated, and we used a manually configured setup to use the server at maximum capacity.

We received two main files (and another set of 5 files with key-value pairs as mappings for some of our CCK fields): one containing our records with the data for our CCK fields and taxonomy, and a second with the coordinates.
First we loaded all the files into the database. We gave the first table an incremented id so we could use it later to track progress (we used a little script to do that). The second table and the lookup tables we imported using phpMyAdmin. This process goes really fast: in under two hours we got the files onto the server and into the database.
Why don't we read from file? We had to look up the VAT id of the company in the second file to find the coordinates for the appropriate company. This is a task a database can do a lot better than a file-reading process in which you'd have to scan the file line by line.
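As a sketch, the coordinate lookup then becomes a single indexed join instead of a file scan (table and column names here are hypothetical):

<?php
// Minimal sketch: fetch records together with their coordinates in one
// indexed join; table and column names are hypothetical.
$result = db_query("SELECT r.*, c.lat, c.lon
                    FROM {import_records} r
                    INNER JOIN {import_coordinates} c ON c.vat_id = r.vat_id
                    WHERE r.id > %d AND r.id <= %d", $start, $stop);
while ($record = db_fetch_array($result)) {
  // Build and save a node for each record here.
}
?>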

We also created indexes on our tables: a primary key on the auto-increment field and a unique index on the VAT id in the first table, and a primary key on the VAT id in the second table. On the other lookup tables we gave the key column a primary key. We also created an index on the VAT id in our CCK table, since we'll use it to look up whether a node needs to be created or updated. Indexes are very important as they speed up lookups in the database (http://dev.mysql.com/doc/refman/5.0/en/mysql-indexes.html). So always analyse your database tables to see if an index is appropriate when doing imports.

Then we created our import script, which basically reads a line from the database, builds a node object and does a node_save() to get it into Drupal. Nothing special about that, except that if you just ran it from your browser, the script would time out when your PHP max_execution_time is reached or when PHP runs out of memory. Even with memory_limit set to the max on your server and max_execution_time set to unlimited, the script would still fail, since the machine's memory would be consumed after a while. It is also no way to follow up on the progress of your import, and you can't restart when something fails...
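For completeness, a stripped-down sketch of the node-building part in Drupal 6 (the content type and CCK field names are hypothetical):

<?php
// Minimal Drupal 6 sketch: build one node from an imported record.
// The content type and CCK field names are hypothetical.
function import_create_company_node($record, $language) {
  $node = new stdClass();
  $node->type = 'company';
  $node->language = $language;
  $node->title = $record['name'];
  $node->uid = 1;
  $node->status = 1;
  // One of the ~30 CCK fields.
  $node->field_vat[0]['value'] = $record['vat_id'];
  node_save($node);
  return $node->nid;
}
?>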

So naturally, in Drupal we think the batch API will save us. Here is the handbook page on how to create a batch (http://drupal.org/node/180528). Now you don't have to worry about your scripts timing out; the batch API makes sure it won't use that much memory. There are two ways to construct a batch. The first is explained in the handbook page. The second is a bit different, since it only uses one function in the operations array; we'll take the apachesolr reindex batch as an example. Why is this useful? The batch API saves its batch in the batch table, so when you construct a batch containing a million operations and that array is serialized and put into the database, nasty things will happen. Depending on your MySQL settings and server capacity this will fail; in my case it failed at around 30k records.
You can still use the first method if you split the batches up into pieces and import them one by one. Which we did, but not because of a too-big serialized array. More on that in a moment.

Check out this snippet to see how to write a batch that only uses one function in the operations array. In our case the only thing you need is a variable that keeps track of where you are.

<?php
/**
 * Batch reindex functions.
 */

/**
 * Submit a batch job to index the remaining, unindexed content.
 */
function apachesolr_batch_index_remaining() {
  $batch = array(
    'operations' => array(
      array('apachesolr_batch_index_nodes', array()),
    ),
    'finished' => 'apachesolr_batch_index_finished',
    'title' => t('Indexing'),
    'init_message' => t('Preparing to submit content to Solr for indexing...'),
    'progress_message' => t('Submitting content to Solr...'),
    'error_message' => t('Solr indexing has encountered an error.'),
    'file' => drupal_get_path('module', 'apachesolr') . '/apachesolr.admin.inc',
  );
  batch_set($batch);
}

/**
 * Batch Operation Callback
 */
function apachesolr_batch_index_nodes(&$context) {
  if (empty($context['sandbox'])) {
    try {
      // Get the $solr object.
      $solr = apachesolr_get_solr();
      // If there is no server available, don't continue.
      if (!$solr->ping()) {
        throw new Exception(t('No Solr instance available during indexing.'));
      }
    }
    catch (Exception $e) {
      watchdog('Apache Solr', $e->getMessage(), NULL, WATCHDOG_ERROR);
      return FALSE;
    }
    $status = module_invoke('apachesolr_search', 'search', 'status');
    $context['sandbox']['progress'] = 0;
    $context['sandbox']['max'] = $status['remaining'];
  }

  // We can safely process the apachesolr_cron_limit nodes at a time without a
  // timeout or out of memory error.
  $limit = variable_get('apachesolr_cron_limit', 50);

  // With each pass through the callback, retrieve the next group of nids.
  $rows = apachesolr_get_nodes_to_index('apachesolr_search', $limit);
  apachesolr_index_nodes($rows, 'apachesolr_search');

  $context['sandbox']['progress'] += count($rows);
  $context['message'] = t('Indexed @current of @total nodes', array('@current' => $context['sandbox']['progress'], '@total' => $context['sandbox']['max']));

  // Inform the batch engine that we are not finished, and provide an
  // estimation of the completion level we reached.
  $context['finished'] = empty($context['sandbox']['max']) ? 1 : $context['sandbox']['progress'] / $context['sandbox']['max'];

  // Put the total into the results section when we're finished so we can
  // show it to the admin.
  if ($context['finished']) {
    $context['results']['count'] = $context['sandbox']['progress'];
  }
}

/**
 * Batch 'finished' callback
 */
function apachesolr_batch_index_finished($success, $results, $operations) {
  $message = format_plural($results['count'], '1 item processed successfully.', '@count items successfully processed.');
  if ($success) {
    $type = 'status';
  }
  else {
    // An error occurred, $operations contains the operations that remained
    // unprocessed.
    $error_operation = reset($operations);
    $message .= ' ' . t('An error occurred while processing @num with arguments:', array('@num' => $error_operation[0])) . print_r($error_operation[0], TRUE);
    $type = 'error';
  }
  drupal_set_message($message, $type);
}
?>

Alright, so now we could import our nodes without worrying about timeouts or our batch failing. But we still had to monitor the process in case our internet connection got cut off, as we would have to refresh the page to make the batch continue.
It would be great to just launch a command and not have to worry about anything. To achieve this we use http://drupal.org/project/drush and the cron functionality on the server.
You could write your own drush command that launches a batch the way the script we already had did. However, that won't work: your memory will get exhausted. But don't worry, there is a solution. You call a drush command that is able to do batches; it makes sure your memory doesn't get exhausted while the batches run. You can see how it is used in the updatedb drush function. Here is my snippet: a custom drush command that calls the same function the "drush batch-process [batch-id]" command would call.

<?php
function import_drush_command() {
  $items = array();

  $items['import'] = array(
    'callback' => 'import_drush_import',
    'description' => dt('Import'),
    'arguments' => array(
      'start' => "start",
      'stop'  => "stop",
    ),
  );

  return $items;
}

function import_drush_import($start, $stop) {
  $result = db_query("SELECT * FROM {our_table_with_records} WHERE id > %d AND id < %d", $start, $stop);

  $batch = array('operations' => array());
  import_drush_import_operations($batch, $result);

  batch_set($batch);

  $batch =& batch_get();
  $batch['progressive'] = FALSE;

  drush_backend_batch_process();
}

/**
 * Creates operations for importing bedrijf nodes.
 */
function import_drush_import_operations(&$batch, &$result) {
  while ($fields = db_fetch_array($result)) {
    array_shift($fields);
    $fields_out = array();
    foreach ($fields as $field) {
      $fields_out[] = $field;
    }
    $batch['operations'][] = array('import_create_bedrijf_nodes', array($fields_out, TRUE, 17));
  }
}
?>

OK, so now in the terminal we type something like drush import 1 1000 and it will create a batch, fire it, import the first 1000 records and create nodes for them.
You could have this function called by cron, so you don't even need to have a terminal open. But as said earlier, we are still creating an operation for every record. Why? When doing the one-operation trick I noticed only 30% of the CPU was used (check this by typing "top" in another terminal window). So I figured we could spawn multiple shells and make them all do work. I did, and I found I could launch six shells with the drush import command; on the seventh the server CPU spiked up to 300% and the server crashed, so six shells was the limit. It is probably possible to measure resources and launch commands accordingly, but for now the server was using all of its resources and importing went as fast as possible, despite this being a manual process.
The final thing I did to automate the process was to set up an importcron.php in the Drupal root installation containing this:

<?php
// Set the path correctly so Drupal knows how to include its files during bootstrap.
$path = '/var/www/html/your_drupal';
chdir($path);

// Bootstrap Drupal.
include_once './includes/bootstrap.inc';
drupal_bootstrap(DRUPAL_BOOTSTRAP_FULL);

// Launch your function.
import('importer_import_progress');

// IMPORT
function import($value) {
  $amount = db_result(db_query("SELECT COUNT(bid) FROM {batch}"));
  $start = (string) variable_get($value, 0);

  // Only fire when fewer than 6 batches are still running and we are not done yet.
  if ($amount < 6 && $start < 985558) {
    // For some reason you need to cast everything to strings explicitly,
    // otherwise it wouldn't launch the command.
    $stop = (string) ($start + 1000);
    variable_set($value, $stop);

    $command = "/var/www/html/your_drupal_site/sites/all/modules/contrib/drush/drush import ";
    $start = (string) ($start + 1);
    $command .= $start . " " . $stop;

    exec("$command", $output, $return);
  }
}
?>

Then type crontab -e to edit the cron listing, type i to insert and add this command:

* * * * * /usr/bin/php /var/www/html/your_drupal_site/cronimport.php

Use the full path to PHP and the full path to your file. This will execute the command once every minute. In the script, a drush command is executed with the next 1000 items to import. To prevent the command from firing too often, it always checks whether the previous batches are finished; our limit was 6 batches at the same time. If your server is more powerful you can increase the 1000 items and the 6 batches. It would be nice to have this process controlled by a function that calculates server resources and launches batches accordingly, but I would have to do some research on how to do that.

Conclusion
The script ran for about 70 hours, and the site contained 3 million nodes. The same principle was used for the indexing, which took about 50 hours to index all nodes. For the indexing we modified some other things to make it go faster, but that's for another blog post.

Jun 29 2010
Jun 29

The rules module is a very powerful module that lets your site react when certain events happen (like a comment being made) under certain conditions (for example, when the user has role x).

A lot of events, conditions and actions are available out of the box or are provided by contrib modules that have rules integration.

But what do you do when you have a module and want to add rules support? And first of all, why should you add rules support? Why not do everything in code?

Example

Let me explain this with an example. We have a custom module that allows administrators to define packages which users can order (the packages are not nodes; why they are not is off topic). When a package is bought, an event is fired. We also have a custom condition that allows checking the type of the package. Finally, we have a custom action that changes the status of the order that was made. (We have a database table with packages and one with orders of packages.)

Now a site administrator can define a set of mails using these rules. For example, a mail could be sent to the acting user when he buys a subscription package, or a system message could be set when the user buys a package that increases his amount of credits. For each type of package bought, a different mail could be sent. You get the idea: imagine having to change the mail for this action in code, or the system messages, every time the site owner wants "Dear customer" instead of "Almighty customer" at the top of his mails/messages.

Now let's look at the code you need for your custom rules.
There are 3 hooks you'll need to implement, so create a custom module and add an extra file called your_module.rules.inc; all our rules-related code will go in there.

Events

The first hook is hook_rules_event_info(), which defines our custom event.

<?php
/**
 * Implementation of hook_rules_event_info().
 * @ingroup rules
 */
function your_module_rules_event_info() {
  return array(
    'your_module_package_bought' => array(
      'label' => t('A package was bought'),
      'module' => 'your_module',
      'arguments' => array(
        'acting_user' => array('type' => 'user', 'label' => t('The user who bought the package.')),
        'package' => array('type' => 'package', 'label' => t('The purchased package.')),
        // Declared so it matches the rules_invoke_event() call below.
        'order' => array('type' => 'value', 'label' => t('The order that was made.')),
      ),
    ),
  );
}
?>

What we have done is define an event that registers the acting user, the package object and the order. Let's trigger that event when we buy a package. (This function will be located in your_module.module.)

<?php
function your_module_buy_package() {
  // Here the code for buying a package will be located.
  // When that code returns that a package was bought, trigger the rule.
  $order = order_load($oid); // $oid will be the id of the order made.
  $package = package_load($pid); // $pid will be the id of the bought package.
  global $user;

  rules_invoke_event('your_module_package_bought', $user, $package, $order);

  watchdog('your_module', t('Member !uid has ordered (order: !oid) package !pid', array('!uid' => $user->uid, '!pid' => $package->pid, '!oid' => $order->oid)));
}
?>

In the rules interface on the site you'll define your triggered rule by selecting this event.
Notice how we log the event in the watchdog. This is good practice, because you want to follow up on what happens on your site. You could also configure watchdog logging in the triggered rules interface when defining actions for your event, but I think watchdog logging is more something that belongs in code, and it is something the site admin should not be able to configure (unlike mails and user messages).

Conditions

Now we'll define the condition, so you can base your reaction on the type of the package that was bought. For this one you'll need to implement hook_rules_condition_info().

<?php
/**
 * Implementation of hook_rules_condition_info()
 */
function your_module_rules_condition_info() {
  return array(
    'your_module_condition_package_type' => array(
      'label' => t('Type of the package'),
      'arguments' => array(
        'package_id' => array('type' => 'value', 'label' => t('The id of the purchased package.')),
        'package_type' => array('type' => 'string', 'label' => t('The type of the purchased package is')),
      ),
      'module' => 'your_module',
    ),
  );
}

/**
 * Condition: your_module_condition_package_type
 */
function your_module_condition_package_type($pid, $type) {
  $package = package_load($pid);
  return ($package->type == $type) ? TRUE : FALSE;
}
?>

When the condition is assessed, the function your_module_condition_package_type() is executed with the pid (package id) argument provided by the event and the type argument defined in the interface.

The function returns TRUE if the package type that was bought is the same as the package type defined in the interface.

Actions

A lot of actions are already defined; for our purposes the mail and message actions are already available to us. We configure the "send a mail to a user" action, and to do this we'll need some tokens.

Replacement tokens

As you may have noticed, there are some handy replacement tokens we can use out of the box, like the acting user. We could, for example, add a replacement token for the user name ([acting_user:user]); in the mail, that token will be replaced by the username of the acting user.
In our example we will build a custom replacement token for the name of the package: using the package id we can retrieve the name of the package.

To get custom tokens in Drupal we'll use the token API functions hook_token_values() and hook_token_list() (you'll put this piece of code in your_module.module).

hook_token_list() provides the data shown in the interface, and hook_token_values() replaces the [package:package_name] token you put in your mails.

<?php
/**
 * Implementation of hook_token_list()
 */
function your_module_token_list($type = 'all') {
  if ($type == 'all' || $type == 'package') {
    $tokens['your_module'] = array(
      'package_name' => t('The name of the package'),
    );
  }

  return $tokens;
}

/**
 * Implementation of hook_token_values()
 */
function your_module_token_values($type, $object = NULL, $options = array()) {
  $tokens = array();
  if ($object) {
    if ($type == 'package') {
      $tokens = array(
        'package_name' => $object->name,
      );
    }
  }

  return $tokens;
}
?>

Only one function is missing to let rules load our package: the package_load() function. It could look like this:

<?php
function package_load($pid) {
  $package = db_fetch_object(db_query("SELECT pid, name, type FROM {packages} WHERE pid = %d", $pid));

  return $package;
}
?>

In our invocation we passed the package as an argument. By doing so, rules handed our package to the your_module_token_values() function in the $object argument, so it could replace the package_name token with the real name of our package. In the configured mail body you can then write, for example: "Thank you for buying [package:package_name]."

Custom actions

The final hook is hook_rules_action_info(), which defines your custom action.

<?php
/**
 * Implementation of hook_rules_action_info().
 */
function your_module_rules_action_info() {
  return array(
    'your_module_action_change_order_status' => array(
      'label' => t('Change the order status'),
      'arguments' => array(
        'order' => array('type' => 'value', 'label' => t('The order object.')),
      ),
      'module' => 'your_module',
    ),
  );
}
?>

The function performing the action uses the $settings variable, so the status value can be configured through the interface. If you don't want that, just hardcode the value; this merely demonstrates that you can have a form in the interface capturing custom values.

<?php
function your_module_action_change_order_status($oid, $settings) {
  $q = "UPDATE {orders} SET status = '%s' WHERE id = %d";
  db_query($q, $settings['status'], $oid);
  // Better would be to create an order_save() function, but you get the idea.
}

/**
 * Action: change order status configuration form.
 */
function your_module_action_change_order_status_form($settings, &$form) {
  $settings += array('status' => '');

  $form['settings']['status'] = array(
    '#type' => 'textfield',
    '#title' => t('Status'),
    '#default_value' => $settings['status'],
    '#description' => t('The order status will change to the one supplied here.'),
  );
}
?>

The order_load() function could look like this:

<?php
function order_load($oid) {
  $order = db_fetch_object(db_query("SELECT oid, status, uid FROM {orders} WHERE oid = %d", $oid));

  return $order;
}
?>

Conclusion

As demonstrated, rules is a powerful module with an equally powerful API, capable of defining custom events, conditions and actions.
Rules integrates with all major modules (and the token API), and when developing modules it is, imho, a point of integration worth considering where possible.

Extra

For more information about the API, see the rules.api.php file in the rules package. It shows some altering capabilities and how to expose custom objects to rules.

Jun 26 2010
Jun 26

How do you submit some hidden values along with a Drupal webform? Our example: we have a job node (nid 1) and we want to go to a webform to apply for the job. So we build a link on our node pointing to the webform node (nid 2).
In the node template, quickly print a URL of the form node/[webform nid]/[job node nid]:

<?php
print l('Apply for this job', 'node/2/1');
?>

In a module you define a hook_form_alter(). (Form alters are used to change forms in Drupal, and naturally they work on webforms too.)
Maybe a little side trick here: we build the id of our webform dynamically. The first part is 'webform_client_form_'; the second part is a variable that is also built dynamically, because we have a multilingual site, and the i18n_get_lang() API function gets us the current user's language. In the end we have something like 'webform_client_form_2', where 2 is the node id. You could also hardcode all of this.
Now, to save the node id of our job in a hidden webform field we do the following. We grab the path, break it up, and under key 2 of the resulting array we find our node id. We use the default value of the hidden field to store our node id, and we use check_plain() to sanitize the input. Submit the form and you'll find the value submitted in your hidden field.

<?php
/**
 * Implementation of hook_form_alter()
 */
function glue_form_alter(&$form, $form_state, $form_id) {
  if ($form_id == 'webform_client_form_' . variable_get('glue_link_to_webform_' . i18n_get_lang(), '2')) {
    array_unshift($form['#submit'], 'glue_webform_add_nid');
    $bpath = explode('/', $_GET['q']);
    if (is_numeric($bpath[2])) {
      $form['submitted']['vacature_node']['#default_value'] = 'node/' . check_plain($bpath[2]);
    }
  }
}
?>


In a webform, hidden fields have some variables available, but you can't really format them properly.

You can define the variables in a settings form like this:

<?php
/**
 * Implementation of hook_menu()
 * http://api.drupal.org/api/function/hook_menu/6
 */
function glue_menu() {
  $items = array();
  $items['admin/settings/custom'] = array(
    'title' => t('custom settings page'),
    'page callback' => 'drupal_get_form',
    'page arguments' => array('glue_settings_form'),
    'access arguments' => array('administer custom settings'),
    'type' => MENU_SUGGESTED_ITEM,
  );

  return $items;
}

/**
 * The settings form for the module
 */
function glue_settings_form() {
  $form['glue_link_to_webform_nl'] = array(
    '#type' => 'textfield',
    '#title' => 'Sollicitatie node nl',
    '#default_value' => variable_get('glue_link_to_webform_nl', '117'),
  );
  $form['glue_link_to_webform_en'] = array(
    '#type' => 'textfield',
    '#title' => 'Sollicitatie node en',
    '#default_value' => variable_get('glue_link_to_webform_en', '122'),
  );
  $form['glue_link_to_webform_fr'] = array(
    '#type' => 'textfield',
    '#title' => 'Sollicitatie node fr',
    '#default_value' => variable_get('glue_link_to_webform_fr', '123'),
  );
  $form['glue_link_to_webform_de'] = array(
    '#type' => 'textfield',
    '#title' => 'Sollicitatie node de',
    '#default_value' => variable_get('glue_link_to_webform_de', '124'),
  );

  return system_settings_form($form);
}
?>
