
Apr 17 2010
Apr 17

My experience theming Drupal, like most of my coding skills, has been developed by digging up useful resources online and some trial and error. I have an interest in graphic design, but never really studied it. I can turn out sites which look good, but my “designs” don’t have the polish of a professionally designed site. I own quite a few (dead tree) books on development and project management. Generally I like to read when I am sick of sitting in front of a screen. The only ebooks I consider reading are short ones.

Emma Jane Hogbin offered her Drupal theming ebook Theming Drupal: A First Timer’s Guide to her mailing list subscribers for free. I am not a big fan of vendor mailing lists; most of the time I scan the messages and hit delete before the bottom. In the case of Emma, rumour has it that it is really worthwhile to subscribe to her list - especially if you are a designer interested in theming Drupal. Emma also offered free copies of her ebook to those who begged, so I subscribed and I begged.

The first thing I noticed about the book was the ducks on the front cover - I’m a sucker for cute animal pics. The ebook is derived from Emma’s training courses and the book she coauthored with Konstantin Kaefer, Front End Drupal. Readers are assumed to have some experience with HTML, CSS and PHP. The book is pitched at designers and programmers who want to get into building themes for Drupal.

The reader is walked through building a complete Drupal theme. The writing is detailed and includes loads of references for obtaining additional information. It covers building a page theme, content type specific theming and the various base themes available for Drupal. The book is a very useful resource for anyone working on a Drupal theme.

Although I have themed quite a few Drupal sites, Emma’s guide taught me a few things. The book is a good read for anyone who wants to improve their knowledge of Drupal theming. Now to finish reading Front End Drupal …

Apr 14 2010
Apr 14

The following post will cover how to create dynamic landing pages. First, let’s define our mission:

Allow landing pages with different content to be created easily. The layout should be a header and footer which are always the same, and the main content should be a three columns layout, with dynamic content.

Before we get too technical, let’s translate the task into plain English.

The layout should be a header and footer which are always the same

Or we can say it differently - don’t show the sidebars. A quick and dirty way of doing it is to override page.tpl.php and remove the printing of the sidebars.

...three columns layout...

That’s easy with Panels’ three column layout (or we can create our own layout using 960).

...with dynamic content.

Dynamic content - that sounds like a job for Views. We can have a content type with CCK fields that describe its location (e.g. left, middle) and the page it should appear on (e.g. /landing-page/foo). Views will get the right nodes for us.

Ok, now that we know how we want to implement it, let’s get our hands dirty (or download the example module and skip to step 7).

  1. Download and enable the necessary modules: ``` drush dl cck views ctools panels ``` followed by ``` drush en optionwidgets text views_ui ctools panels page_manager views_content ```
  2. Starting from the lowest level, we'll create a new content type called "Landing page element" with CCK fields - select list for the "Location" and a textfield for the "Page ID"
  3. Create a View that shows the full nodes, filtered by node type, that gets two arguments - the "Location" and the "Page ID"
  4. Add a new display of "Content pane" type. This display type is what ties Views to Panels in a way that allows us to define how the View is going to get its arguments
  5. Set the Page ID argument to be taken from the panel argument - in other words, if the URL is landing-page/foo then foo is our page ID. The Location argument, on the other hand, shouldn't be taken from the URL - it should be set in the pane configuration
  6. Create a panel page at the path ```landing-page``` with a three column layout, and in the content add our View to each of the columns. Each time, the "Location" argument changes according to the column the View is added to
  7. Next, in our theme, copy page.tpl.php to page-landing-page.tpl.php and remove the printing of the sidebars (a minimal sketch of this override follows the list)
  8. Optional: copy node.tpl.php to node-view-landing_page.tpl.php and delete the printing of the node title
  9. Optional: in admin/build/themes/settings, uncheck "Display post information" for our new content type
  10. Now all that is left to do is to add three Landing page element nodes with the same page ID (e.g. `gizra`) and with different locations
  11. Navigate to landing-page/gizra and see your landing page!
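
A minimal sketch of the page-landing-page.tpl.php override from step 7, assuming a Drupal 6 theme whose page.tpl.php prints the $left and $right sidebar regions alongside the usual page variables; your theme's markup will differ, the point is only that the sidebar regions are never printed:

<?php // page-landing-page.tpl.php - sketch only, adapt to your theme's markup. ?>
<div id="page">
  <div id="header"><?php print $header; ?></div>
  <div id="main">
    <?php print $breadcrumb; ?>
    <?php if ($title): ?><h1 class="title"><?php print $title; ?></h1><?php endif; ?>
    <?php print $messages; ?>
    <?php print $help; ?>
    <?php print $content; ?>
    <?php // $left and $right are deliberately never printed here. ?>
  </div>
  <div id="footer"><?php print $footer_message; ?></div>
</div>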

Another small note about using Panels, Views and nodes together. Although Panels allows us to add an existing piece of content, I normally prefer to use a View that returns that node. Why? Because most of the sites I work on are multilingual, and if I hardcode the node ID the users might see a node in the wrong language. A View, on the other hand, can make sure the node the users see is in their language.

Apr 09 2010
Apr 09

Today I purchased a Motorola DEXT (aka Cliq) from Optus. Overall I like it. It feels more polished than the Nokia N97 which I bought last year. The range of apps is good. Even though the phone only ships with Android 1.6, 2.1 for the DEXT is due in Q3 2010.

The apps seem to run nice and fast. The responsive touch screen is bright and clear. I am yet to try to make a call on it from home, but the 3G data seems as fast as my Telstra 3G service, so the signal should be OK.

The keyboard is very functional, albeit cramped with my fat thumbs. The home screen is a little cluttered for my liking too, but it won’t take much to clean that up. I will miss my funambol sync, which is only available for Android 2.x.

I started writing this post using the Drupal Editor for Android app, which is pretty nice. The GPL app uses XML-RPC and Drupal core’s Blog API module. Overall it feels like a stripped down version of Bilbo/Blogilo. Drupal Editor is an example of an app which does one thing and does it simply, but well. The only thing I haven’t liked about it: when originally writing this post, I bumped the save button and published an incomplete and poorly written post. Next time I will untick the publish checkbox until I am ready to really publish it.

I would still like a HTC Desire, but Telstra is only offering them on a $65 plan with no value. The Nokia N900 was off my list, due to the USB port of death and Nokia’s spam policies. The Nexus One was on the list too, but the lack of a local warranty was a concern.

Apr 01 2010
Apr 01
How to Customize and Theme Post & Date info in Drupal 6 » Danny Englander

Drupal engineer with a passion for learning new things every day

When I was theming my blog in Drupal, I decided I wanted a better way to customize and display post info such as wording used and the way date was displayed. The first step is to have a look around and see where the code is coming from that renders this info. I viewed the files in my custom theme folder and discovered these few lines of code in node.tpl.php

<?php if ($submitted): ?>
  <span class="submitted"><?php print $submitted ?></span>
<?php endif; ?>

In HTML, that is rendered as:

Submitted by Danny Englander on 4-01-10

I decided I wanted to have customized date and post info only for my blog so a standard Drupal convention allows you to have node-blog.tpl.php to tailor the display of the blog content type. My theme did not have this file so I simply copied node.tpl.php and renamed it. Now that I had my custom Node Blog template file, I was all set to start customizing date and post info.

Well, the above code seemed boring and hard to theme, so I found a way to be more specific about how the post and date info gets output in your Drupal theme.

I simply replaced the code above with this code:

<div class="meta post-info">
  <?php if ($submitted): ?>
    <span class="submitted">Posted by <?php print theme('username', $node) ?></span>
  <?php endif; ?>

  <div class="dateblock">
    <span class="month"><?php print $date_month ?></span>
    <span class="day"><?php print $date_day ?></span>
    <span class="year"><?php print $date_year ?></span>
  </div>
</div>

Note: this crucial bit of code below was left out of the original post, so if this did not work for you, that's why. Place the code below in your theme's template.php file, or create one if you don't have one already (change "mytheme" to the name of your theme):

function mytheme_preprocess_node(&$vars) {
  // Grab the node object.
  $node = $vars['node'];
  // Make individual variables for the parts of the date.
  $vars['date_day'] = format_date($node->created, 'custom', 'j');
  $vars['date_month'] = format_date($node->created, 'custom', 'M');
  $vars['date_year'] = format_date($node->created, 'custom', 'Y');
}

By breaking this code down and getting more specific, I was able to use some CSS to style the date into the nice little square blocks you see to the left of every post title. It also allows you to have "Posted by", "Submitted by" or whatever other wording you choose for the author part of the code.


Mar 23 2010
Mar 23

The Idea

A number of months back, a group of us had the idea to create a software co-operative. There were several tenets that we decided to follow:

  • The company wouldn't have any employees -- everybody involved would have 1099 status and would be an independent contractor
  • The company would be formed as a Limited Liability Company - we chose the state of Delaware
  • The company would seek to have a $0 cash and asset value at the end of each year
  • The company would be virtual to keep costs low
  • We would focus on working with open source projects like Drupal

The Setup

We set the company up using Instacorp. The benefit was the speed at which we could set up the company with an automatic legal presence in Delaware. The people at Instacorp made the process incredibly simple, asking a few questions. Within days the legal documents were delivered. That, in itself, really didn't make the company real.

After receiving the legal documents, it was necessary to obtain an FEIN for tax purposes. This is a simple process on the IRS site - it just takes a few minutes and you get the documentation electronically.

We needed to decide how to be taxed.
LLCs report taxes in one of three ways:

  • Disregarded entity (limited to single-member LLCs)
  • Partnership (default if other elections are not made)
  • Corporation, electing to be taxed as a pass-through entity, called an S corporation.

In the case of Vintage Digital, workers are paid for what they do which is reported to them on FORM 1099 as commissions.
Great variation will occur in compensation, since it is entirely based on hours worked and on percentages for those who find clients and shepherd them through the contracting process. The LLC is a virtual corporation, with exceedingly low or non-existent overhead. There is no intent to use the LLC other than as a distribution method for sharing work; profits/losses will be kept to a minimum. There is no intent to hold fixed assets or incur debt.

In the final analysis both partnership and S-corporation reporting would be the same.

Our accountant indicated that the following things were recommended:

  • Use "S" corporation for tax reporting because the laws are better understood and simpler.
  • The majority of LLCs elect to report taxes as pass-through corporations, so even the IRS is more familiar with these tax laws.

We made an S-corp election to indicate how we were going to be taxed - after the election, that document needs to be sent to the IRS.

We needed a bank account and opted for a bank that had free business checking and had online bill pay. The bank required our Articles and two forms of identification. We also needed a copy of our FEIN letter from the IRS.

The Tools
All companies need tools to help run things on a day to day basis. A virtual venture is no different. We needed management tools, communication tools, invoicing/book keeping software, and ways to manage contracts. To that end we sought out different solutions that would provide us with ways to sensibly manage ourselves and our projects.

  • Skype for Communication (both voice and chat)
  • Open Atrium for a client intranet and as an internal planning tool.
  • Bamboo Invoice for invoicing clients (although that might change as we transition to QuickBooks)
  • Drupal for our Web presence
  • dotProject for time record keeping and ticketing
  • Office for estimates, calculating commission shares, and contracts
  • Google Voice for incoming phone calls - the rest of us use our own mobiles

The Team

The team is comprised of:

All of us have contributed (and continue to contribute) to the Drupal project and are heavily involved in our local communities.

The team keeps in pretty much constant contact through Skype. We try to meet together about once a month for member/board meetings. They have occurred in restaurants over brunch, at members' homes, and even at a bowling alley - pretty much anywhere that is quiet and where you can get through company issues. Fortunately for our crew, we all live within 30 miles or so of one another, which makes getting together fairly easy.

Each project gets its own Skype room, project in Open Atrium, project in dotProject, and commission spreadsheet.

As clients come in, we assess who has bandwidth for a given project - the goal being to ensure that each co-op member has enough work (in and outside of the co-op) to make a reasonable living. Co-op members are free to work as much or as little as they want (given the work is available). This arrangement was designed to give our team as much flexibility as possible.

Mar 22 2010
Mar 22

A lot of people have been asking for the files we used to integrate Alfresco CMIS with Drupal Open Atrium (see the ecmarchitect.com blog post). I’ve happily mailed those to whoever asked. I’ve had the intention of testing them with the latest version, cleaning them up, and putting them somewhere more appropriate like the Open Atrium feature server, or at the very least Google Code or GitHub. But it hasn’t happened yet, so I figured I’d make them available here and appeal to the community to give them a good home.

The zip includes a readme file with (very) rough install/config directions.

Good luck!

Mar 13 2010
Mar 13

I use Apache Solr for search on several projects, including a few with Drupal. Solr has built-in support for replication and load balancing; unfortunately the load balancing is done on the client side and works best with a persistent connection, which doesn’t make a lot of sense for PHP based webapps. In the case of Drupal, there has been a long discussion on a patch in the issue queue to enable Solr’s native load balancing, but things seem to have stalled.

In one instance I have Solr replicating from the primary to a secondary, with the plan to add additional secondary backends if the load justifies it. In order to get Drupal to write to the primary and read from either node, I needed a proxy or load balancer. In my case the best lightweight HTTP load balancer that would easily run on the web heads was HAProxy. I could have run Varnish in front of Solr and had it do the load balancing, but that seemed like overkill at this stage.

Now when an update request hits HAProxy it is directed to the primary, while reads are balanced between the two nodes. To get this setup running on Ubuntu 9.10 with HAProxy 1.3.18, I used the following /etc/haproxy/haproxy.cfg on each of the web heads:

global
    log 127.0.0.1   local0
    log 127.0.0.1   local1 notice
    maxconn 4096
    nbproc 4
    user haproxy
    group haproxy
    daemon

defaults
    log     global
    mode    http
    option  httplog
    option  dontlognull
    retries 3
    maxconn 2000
    balance roundrobin
    stats enable
    stats uri /haproxy?stats

frontend solr_lb
    bind localhost:8080
    acl primary_methods method POST DELETE PUT
    use_backend primary_backend if primary_methods
    default_backend read_backends

backend primary_backend
    server solr-a 192.168.201.161:8080 weight 1 maxconn 512 check

backend secondary_backend
    server solr-b 192.168.201.162:8080 weight 1 maxconn 512 check

backend read_backends
    server solr-a 192.168.201.161:8080 weight 1 maxconn 512 check
    server solr-b 192.168.201.162:8080 weight 1 maxconn 512 check

To ensure the configuration is working properly, run wget http://localhost:8080/solr -O - on each of the web heads. If you get a connection refused message, HAProxy may not be running. If you get a 503 error, make sure Solr (under Jetty or Tomcat) is running on the Solr nodes. If you get some HTML output which mentions Solr, then it should be working properly.

For Drupal’s apachesolr module to use this configuration, set the hostname to localhost and the port to 8080 on the module’s configuration page. Rebuild your search index and you should be right to go.
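If you prefer to keep that in code rather than in the UI, something along these lines in settings.php should do the same job. This is a sketch only - it assumes the Drupal 6 apachesolr module reads its connection details from the apachesolr_host, apachesolr_port and apachesolr_path variables, so check the module's README for your version:

// Point the apachesolr module at the local HAProxy frontend (sketch only;
// the variable names are an assumption - verify them against your apachesolr version).
$conf['apachesolr_host'] = 'localhost';
$conf['apachesolr_port'] = '8080';
$conf['apachesolr_path'] = '/solr';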

If you have a lot of index updates, you could consider making the primary write-only and having two read-only secondary backends - just change the IP addresses to point to the right hosts.

For more information on Solr replication refer to the Solr wiki; for more information on configuring HAProxy refer to the manual. Thanks to Joe William and his blog post on load balancing CouchDB using HAProxy, which helped me get the configuration I needed once I had decided what I wanted.

Feb 22 2010
Feb 22

When you run a lot of Drupal sites it can be annoying to keep track of all of the modules contained in a platform and ensure all of them are up to date. One option is to set up a dummy site with all the modules installed and email notifications enabled. This is OK, but then you need to make sure you enable the additional modules every time you add something to your platform.

I wanted to be able to check the status of all of the modules in a given platform from the command line. I started scratching the itch by writing a simple shell script that uses the Drupal updates server to check the status of all the modules. I kept polishing it until I was happy with it. There are some bits which are a little ugly, but that is mostly due to the limitations of bash. If I had to rewrite it I would do it in PHP or some other language which understands arrays/lists and has HTTP client and XML libraries.

The script supports excluding modules by using an extended grep regular expression pattern and nominating a major version of Drupal. When there is a version mismatch it will be shown in bold red, while modules where the versions match will be shown in green. The script filters out all dev and alpha releases; after all, the script is designed for checking production sites. Adding support for per-module update servers should be pretty easy to do, but I don’t have modules to test this with.

To use the script, download it, save it somewhere handy, such as ~/bin/check-module-status.sh, and make it executable (run chmod +x ~/bin/check-module-status.sh). Now it is ready to run - ~/bin/check-module-status.sh /path/to/drupal - then wait for the output.

#!/bin/bash
#
# Check the latest version of a drupal module / template
#
# Written by Dave Hall http://davehall.com.au
# Copyright (c) 2010 Dave Hall Consulting http://davehall.com.au
# Licensed under the terms of the GPLv3 http://www.gnu.org/licenses/gpl.html
#

BASE_URL='http://updates.drupal.org/release-history/'

if [[ 0 -eq $# || 3 -lt $# ]]; then
  echo "Usage: $(basename $0) /path/to/drupal-instance [exclude-pattern [drupal-version]]"
  exit 1
fi

DRUPAL_PATH=$1

if [ ! -d "$DRUPAL_PATH" ]; then
  echo "ERROR: Invalid path - $DRUPAL_PATH"
  exit 2
fi

EXCLUDE_PATTERN='(drupal)'
if [ 1 -lt $# ]; then
  EXCLUDE_PATTERN=$2
fi

DRUPAL_VERSION=6.x
if [ 3 -eq $# ]; then
  DRUPAL_VERSION=$3
fi

INFO_FILES=$(find "$DRUPAL_PATH" -name '*.info')

modules=( )
for info in $INFO_FILES; do
  project=$(sed -e '/project =/!d' -e 's/project = "\(.*\)"/\1/g' < $info | head -n1)
  if [[ "" == "$project" || 0 -eq $(echo $project | egrep -cv $EXCLUDE_PATTERN) ]]; then
    continue
  fi
  cur_ver=$(sed -e '/version =/!d' -e 's/version = "\(.*\)"/\1/g' < $info | tail -n1)
  modules=( "${modules[@]}" "$project|$cur_ver##" )
done

modules=$(echo ${modules[@]} | tr '##' '\n' | sort -u)

echo -e "module\t\tlocal\tcurrent"
for mod in $modules; do
  color='\e[0;32m'
  new_ver=$(wget -O - "${BASE_URL}$(echo $mod | cut -d\| -f1)/$DRUPAL_VERSION" 2> /dev/null | sed -e '/<version>.*<\/version>/!d' -e '/\(dev\|alpha\)/d' -e 's/.*<version>\(.*\)<\/version>/\1/g' | head -n1 );
  if [ "$(echo $mod | cut -d\| -f2)" != "$new_ver" ]; then
    color='\e[1;31m'
  fi
  echo -e $color$mod\|$new_ver | tr '|' '\t'
done

Feb 16 2010
Feb 16

Fellow Optaros colleague Chris Fuller and I want to present on the Alfresco-Drupal integration at DrupalCon San Francisco (April 19-21). If you’re interested in Alfresco, Drupal, or CMIS (any or all of the above), please vote for our session.

Feb 04 2010
Feb 04

Lately I have been trying to avoid installing non-packaged software on production servers. The main reason for this is to make it easier to apply updates. It also makes it easier to deploy new servers with meta packages when everything is pre-packaged.

One tool which I use a lot on production servers is Drupal’s command line tool, drush. Drush is awesome - it makes managing Drupal sites so much easier, especially when it comes to applying updates. Drush is packaged for Debian testing, unstable and lenny backports by Antoine Beaupré (aka anarcat) and will be available in universe for Ubuntu Lucid. Drush depends on PEAR’s Console_Table package and includes some code which automagically installs the dependency from PEAR CVS. The Debianised package includes the PEAR class, which is handy, but if you are building your own debs from CVS or the nightly tarballs, the dependency isn’t included. The auto installer only works if it can write to /path/to/drush/includes, which in these cases means calling drush as root; otherwise it spews a few errors about not being able to write the file and then dies.

A more packaging friendly approach would be to build a Debian package for PEAR’s Console_Table and have that as a dependency of the drush package in Debian. The problem with this approach is that drush currently only looks in /path/to/drush/includes for the PEAR class. I have submitted a patch which first checks whether Console_Table has been installed via the PEAR installer (or another package management tool). Combine this with the Debian source package I have created for Console_Table (see the file attached at the bottom of the post), and you can have a modular, apt-managed instance of drush without having to duplicate code.
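The idea behind the patch is roughly the following (a sketch only, not the committed patch): try whatever Console_Table the PEAR include path provides first, and only fall back to the copy bundled under drush's own includes directory.

<?php
// Sketch of the lookup order (not the actual committed patch).
// Prefer a Console_Table installed via PEAR or a distro package;
// otherwise fall back to the copy drush keeps in its includes directory.
if (!@include_once 'Console/Table.php') {
  include_once dirname(__FILE__) . '/includes/Console/Table.php';
}
?>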

I have discussed this approach with anarcat; he is supportive, and hopefully it will be the approach adopted for drush 3.0.

Update: The drush patch has been committed and should be included in 3.0alpha2.

Jan 29 2010
Jan 29

Packt Publishing seem to have liked my review of Drupal 6 JavaScript and jQuery - so much so that they have asked me to review another title. On my return from linux.conf.au and Drupal South in New Zealand, a copy of the second edition of AJAX and PHP was waiting for me at the post office. I’ll be reading and reviewing the book during February.

I will cover LCA and Drupal South in other blog posts once I have some time to sit down and reflect on the events. For now I will just gloat about winning a spot prize at Drupal South. I walked away with Emma Jane Hogbin and Konstantin Käfer’s book, Front End Drupal. I’ve wanted to buy this title for a while, but shipping from the US made it a bit too pricey even with the strong Australian Dollar. I hope to start reading it in a few weeks, with a review to follow shortly after.

Got a book for me to review? I only read books in dead tree format as I mostly read when I want to get away from the screen. Feel free to contact me to discuss it further.

Jan 25 2010
Jan 25
Sites that show up quickly on a user's screen tend to keep the user's attention for longer and also rank better in Google. So when listening to the excellent Lullabot Podcast 80: Top 40 Drupal Modules Revisited, I was caught by Drupal experts stating that memcache would speed up your site even if it's not a high traffic site, because it can offload the database. I decided to give it a try on a relatively low traffic site belonging to one of my clients. The site is already served by nginx, so the load times are already pretty good. When setting up the experiment I was a bit confused by the thorough but chaotic instructions for setting up memcache, and I ran into a problem installing PECL memcache:
$ sudo pecl install memcache
downloading memcache-2.2.5.tgz ...
Starting to download memcache-2.2.5.tgz (35,981 bytes)
..........done: 35,981 bytes
11 source files, building
running: phpize
Configuring for:
PHP Api Version:         20041225
Zend Module Api No:      20060613
Zend Extension Api No:   220060519
shtool at '/tmp/pear/temp/memcache/build/shtool' does not exist or is not executable.
Make sure that the file exists and is executable and then rerun this script.

ERROR: `phpize' failed
Since I run Debian, I solved this by running apt-get install php5-memcache instead. I restarted php-fastcgi (needed for nginx) and added $conf['cache_inc'] = 'sites/all/modules/memcache/memcache.db.inc'; to settings.php. Then I ran some tests in Hammerhead - not on the front page, since it has an embedded YouTube video, but on a page without any external elements.

I was a bit confused at first and ran several short tests. I had forgotten to log out, so the memcache module was adding a lot of extra information at the bottom of each page, and I thought that was the reason my site was a lot slower with memcache than without it. So I logged out and tried again:

                                count  latest  median   avg
empty cache with memcache           4    1449    1449  1759
primed cache with memcache          4    1277    1186  1184
empty cache without memcache        4    1239    1283  1483
primed cache without memcache       4    1014     900   950

It could be that I'd have to tweak some memcache settings, or run longer tests, but I think these numbers are clear enough: memcache made my site slower instead of faster!

Jan 21 2010
Jan 21

If you are not familiar with the concept of the 960 grid system, you should be. In one sentence: 960 is a CSS framework that divides a page into 12 or 16 columns with a total width of 960 pixels.

960 is a framework. As such, it doesn’t only give us tools, it also encourages us to use them the right way. I think it’s a bit like the Forms API in Drupal, which helps us build our forms correctly. In 960 you have CSS classes that can be used to set an element’s width and position.

For example, working with 16 columns and the following pseudo-code:

<div class="grid-4">
  print $left-sidebar
</div>
<div class="grid-12">
  print $main-content
</div>

Will give us something like this:

The left sidebar takes 4 columns, and the main content takes the remaining 12. As you can see in the above example, we make sure to “fill up columns”, so even if we wanted the main content to span only 11 columns, we would add a suffix (e.g. suffix-1) or a prefix - so the total for the row is still 16 columns. This ensures all elements stay in place and nothing drops out of the grid. Oh, and it’s cross browser compatible. Oh, and it’s already RTLed.

Before we start with the examples, set up your system:

  1. Download zen and zen_ninesixty themes
  2. Download the gizra_ninesixty theme used for this tutorial here (A clean gizra_ninesixty can be found in github)
  3. Enable the above themes and make gizra_ninesixty the default
  4. In the gizra_ninesixty configuration page, enable the "Active 960 grid system image"
  5. Download the example module with the tutorial's Views and Panels, and enable it
  6. In all the examples I assume we are working with a 16 column grid, and that we have blocks in the left sidebar but not the right (so the main content area is 12 columns wide)

Our first example will show, in a node view, the submitted by info and tags on the left, and the node links on its right.

Open node.tpl.php in gizra_ninesixty and compare it to the one in zen_ninesixty. You will notice we added grid-* classes. For example:
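The original snippet did not survive the import into this post, so the following is an illustration only. It assumes a 16-column grid with the node sitting inside the 12-column main content area, hyphenated grid-* classes as used by the NineSixty base theme, and the alpha/omega classes that strip the outer gutters from the first and last units of a row; check gizra_ninesixty itself for the real markup.

<div class="node-meta grid-4 alpha">
  <?php print $submitted; ?>
  <?php print $terms; ?>
</div>
<div class="node-links grid-8 omega">
  <?php print $links; ?>
</div>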

Jan 13 2010
Jan 13

Is your recent Drupal update not taking effect? Drupal still claims to be the old version?

It is probably correct! There's at least one module on your system that claims it is 1) a core module and 2) old. How did this happen? Common scenarios are:

  • You accidentally restored the old modules folder from a backup.
  • You tried to overwrite the older install, but this failed for some reason (common on ftp).
  • You made a backup of core modules inside the modules or sites folder.
  • You copied a core module to sites/default/modules or sites/[site]/modules to override a core module.
  • You are looking at the wrong server (embarrassing, but it happens).

Remember, Drupal prefers core files in sites/all/modules or sites/[site]/modules over those in modules when it finds copies.

To identify the actual files in use, check the filename and info columns for core modules in the system table. If you don't like touching your database, install the Update: DOH! module. It will give you a list of filenames and their versions.
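If you do want to look in the database, a quick check along these lines shows which file Drupal is actually loading for a module and what version that copy reports. This is a sketch only - run it from a throwaway script or the Devel module's "Execute PHP" block, and swap 'system' for whichever core module looks stale:

<?php
// Sketch: where is this core module really being loaded from, and what
// version does that copy claim? In Drupal 6 the info column is serialized.
$row = db_fetch_object(db_query("SELECT filename, info FROM {system} WHERE name = '%s' AND type = 'module'", 'system'));
$info = unserialize($row->info);
print $row->filename . ' => ' . $info['version'];
?>

A filename outside the top-level modules/ directory (for example under sites/all/modules) is the tell-tale sign that a stray copy is overriding core.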

Jan 07 2010
Jan 07

Recently I had the privilege of working on the conversion of Novus Biological's website to Drupal 6, selling scientific supplies using Ubercart. I did this as a freelancer for the talented folks at SpireMedia, working with them as well as with fellow Vintage Digital member Ben Jeavons. It was a fantastically intense experience.

Ubercart's a great system out of the box, but there were a number of features on the site which made this implementation difficult.

  1. There were about 70,000 products, which is probably two or three orders of magnitude greater than the usual Ubercart site.
  2. The site needed to work with three currencies, and four prices, depending on the user's location or selection. Ubercart doesn't handle this well.
  3. The site needed to have two kinds of discounts - ones available to most any user, and ones only available to users who had bought the discount with userpoints.

I'll probably write about all of these in time, but at the moment I want to comment on that second item.

A close-up of the novusbio website, showing the region-change text
If you take a look at www.novusbio.com, or the image above, you'll see a bit of text in the upper right hand corner which says something like 'US site', or 'Europe Site', or 'Great Britain site', or 'World Site', with a little arrow next to it letting you change your region. We would start by pre-selecting a region for the user based on which IP they were browsing from, and then users could change it. If you go to a product page on Novus' site, you'll see different prices based on which of these are currently selected, as well as different currency signs. Since doing something like this is what some people would like to be able to do, I wanted to take a little time to go over how we did this.

Note: If you're not a coder, and you need this functionality, then you're going to want to go find a coder to handle this for you. It's not currently possible to do by flipping a switch or installing a module. I'm going over the changes you need to make here, but the full set of changes, along with the needed patches to Ubercart, are contained in uc_example.zip, attached to this post.

To start with, you're going to need to define the regions or currencies you're doing prices for. I use the term regions, because you could well have two different areas of the world using the same symbol but different price amounts. So, I use a define to mark which regions I'm keeping track of.

<?php
define('UC_EXAMPLE_DOLLAR', 1);
define('UC_EXAMPLE_EURO', 2);
?>

There are basically two things you want to handle when you allow for more than one currency in Ubercart:

  1. You want to change the currency symbol.
  2. You want to change the numeric amount of the price. You probably won't want to have a factor that you apply to the base price - I would rather suggest setting up alternate prices as CCK fields on your product nodes, much the same way that books in the US have set US and Canadian prices printed on them.

(As an aside, although the changes I'm discussing will allow you to have, say, both US and Euro prices on your site, you'll also at some point need to go through and write up some new reports that are multi-currency aware. After all, a report that says you sold 54,382 in the last two months doesn't mean anything if 24,382 was in US Dollars and 30,000 was in Euros. And, if you do up some reports like this, why not share them with the rest of us?)

There are two different places where you need to keep track of the current currency type as well. When a user is browsing the site, putting together an order, or reviewing her cart, you'll want to store the current currency type in the session. On the other hand, if you're viewing an order - either in the very last step before the order is committed (where a row has been written to uc_orders) or when viewing an order in the history - you need to have stored the currency type in the order data itself.

Changing the amount/magnitude of the item

Now, the slightly simpler part is changing the amount of the charge - if you're looking at $360, then we're talking about the 360 part of it. This amount needs to change on cart items or on products when they're viewed. We don't need to worry about changing the amount on the items in completed orders, because that amount quite sensibly is fixed when the order is committed.

I'm not going to go over every change you need to make - you might need to do a little work in your views to decide which price to use when displaying a view (indeed, views_customfield may be a good idea for that), but when you're changing how the product is displayed, you're basically doing a little work with hook_nodeapi:

<?php
/**
 * Implements hook_nodeapi().
 */
function uc_example_nodeapi(&$node, $op, $a3 = NULL, $a4 = NULL) {
  if ($op == 'view') {
    $region_price = uc_example_get_node_price($node, $_SESSION['region']);

    $context['class'][1] = 'sell';
    $context['field'] = 'sell_price';
    $node->content['sell_price']['#value'] = theme('uc_product_price', $region_price, $context, array('label' => !$a3));

    $context['class'][1] = 'display';
    $context['field'] = 'sell_price';
    $node->content['display_price']['#value'] = theme('uc_product_price', $region_price, $context);
  }
}

/**
 * Returns the price from the node, from the region passed in.
 *
 * @param stdClass $node Product node to get price for.
 * @param integer $current_region Region to get the price for.
 */
function uc_example_get_node_price($node, $current_region) {
  switch ($current_region) {
    case UC_EXAMPLE_DOLLAR:
      return $node->field_price_us[0]['value'];
    case UC_EXAMPLE_EURO:
      return $node->field_price_euro[0]['value'];
    default:
      return 0;
  }
}
?>

When viewing the price in a person's cart, we'll want to use hook_cart_item($op, $item) to change the price for products.

<?php
/**
 * Implements hook_cart_item().
 *
 * This lets us change the base price based on the user's current region.
 *
 * See http://www.ubercart.org/docs/api/hook_cart_item
 * @param String $op     Operation being done
 * @param stdClass $item Cart item - usually a node of some sort.
 */
function uc_example_cart_item($op, &$item) {
  //dpm('hook_cart_item '. $op);
  if ($op == 'load') {
    global $user;

    $product_node = node_load($item->nid);
    //dpm($_SESSION);
    $region_id = 0;
    // If we're viewing the cart item on the review page, we want to use the region
    // that's embedded in the order item, not the region that the user is using.
    if ($_GET['q'] == 'cart/checkout/review') {
      $order = db_fetch_object(db_query(
        "SELECT uo.order_id, IFNULL(uom.region, 1) as mc_region,
        IFNULL(uom.currency_sign, '$') AS mc_currency_sign
        FROM {uc_orders} uo
        INNER JOIN {uc_order_products} uop ON (uo.order_id = uop.order_id)
        LEFT OUTER JOIN {uc_order_multicurrency} uom ON (uo.order_id = uom.order_id)
        WHERE uid = %d and order_status = '%s' AND uop.nid = %d ORDER BY uo.order_id DESC",
        $user->uid, 'in_checkout', $item->nid
      ));
      if ($order) {
        $current_region = $order->mc_region;
      }
      else {
        $current_region = $_SESSION['region'];
      }
    }
    else {
      $current_region = $_SESSION['region'];
    }

    if ($product_node) {
      $item->price = uc_example_get_node_price($product_node, $current_region);
    }
  }
}
?>

Changing the Currency Symbol

With that done, let's consider the currency sign. I like storing the user's currently selected symbol in $_SESSION['currency_sign'], and we'll also be storing it in orders as well. The first impulse you might have would be to start mucking about with variable_set('uc_currency_sign') - but changing that changes the symbol for everything at once, and you're no doubt hoping to have more than one customer at a time.

Instead you want to register a price handler with hook_uc_price_handler. Then, whenever a price is being determined by uc_price(), your price alteration function can determine the proper currency sign to use, and provide it.
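For completeness, registering the handler looks roughly like this. It is a sketch only - check uc_price's API documentation for the exact array keys your Ubercart version expects:

<?php
/**
 * Implements hook_uc_price_handler().
 *
 * Sketch only: registers the alter callback shown below with uc_price().
 */
function uc_example_uc_price_handler() {
  return array(
    'alter' => array(
      'title' => t('Multi-currency sign handler'),
      'description' => t('Uses the currency sign stored in the session or on the order.'),
      'callback' => '_uc_example_price_alter',
    ),
  );
}
?>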

<?php
/**
 * All we're doing with this price alteration function is telling it to use,
 * for the currency sign, the sign that we've stashed in $_SESSION.
 */
function _uc_example_price_alter(&$price_info, $context, &$options) {
  // Default currency sign.
  $options['sign'] = $_SESSION['currency_sign'];
  if (isset($context['subject']['order']) && $context['subject']['order']->order_id) {
    $order = $context['subject']['order'];
    if (isset($order->mc_currency_sign)) {
      $options['sign'] = $order->mc_currency_sign; // Use sign from order!
    }
    else {
      $currency_sign = db_result(db_query("SELECT currency_sign FROM {uc_order_multicurrency} WHERE order_id = %d", $order->order_id));
      if ($currency_sign) {
        $options['sign'] = $currency_sign;
      }
    }
  }
}
?>

What I'm doing there is setting $options['sign'] ($options is an array passed in by reference) and changing it to the sign we want it to have. First we set it from $_SESSION, and then we check whether the context - a collection of information about what this price is and where it's being presented - contains the order which this price is a part of. If it is part of an order, then we use the sign saved as part of that order.

If you're wondering how this information gets to be part of the order, unsurprisingly the answer is we implement hook_order in our code.

<?php
/**
 * Implements hook_order().
 *
 * This is where we finalize the region_id and currency_sign.
 */
function uc_example_order($op, &$arg1, $arg2) {
  if ($op == 'new') {
    // $arg1 is a reference to the order object.
    $arg1->mc_currency_sign = $_SESSION['currency_sign'];
    $arg1->mc_region = $_SESSION['region'];
  }
  if ($op == 'save' && $_GET['q'] == 'cart/checkout') {
    // Now we need to save our updated region data to the order!
    if (db_result(db_query("SELECT count(*) FROM {uc_order_multicurrency} WHERE order_id = %d", $arg1->order_id))) {
      db_query("UPDATE {uc_order_multicurrency} SET region = %d, currency_sign = '%s' WHERE order_id = %d",
               $_SESSION['region'], $_SESSION['currency_sign'], $arg1->order_id);
    }
    else {
      db_query("INSERT INTO {uc_order_multicurrency} (order_id, region, currency_sign) VALUES (%d, %d, '%s')",
               $arg1->order_id, $_SESSION['region'], $_SESSION['currency_sign']);
    }
  }
  if ($op == 'load') {
    $multicurrency = db_fetch_array(db_query("SELECT region, currency_sign FROM {uc_order_multicurrency} WHERE order_id = %d",
                                             $arg1->order_id));
    $arg1->mc_region = $multicurrency['region'];
    $arg1->mc_currency_sign = $multicurrency['currency_sign'];
  }
}
?>

And... then this is where we get to the unfortunate part. Remember that price handler above, where we pull the order information out of the context? Unfortunately, Ubercart is really inconsistent about providing us with that needed information, even when the call to uc_price() is happening in a function where $order is already just sitting there, all loaded up. So, I'd like to introduce these patch files, which are provided to you inside of uc_example.zip:

  • ubercart-621494.patch
  • ubercart-display-symbols.patch
  • ubercart-order-price.patch

These three patch files need to be applied to Ubercart, and force it to provide the $order as part of the $context when uc_price is being called. If you look through these files, you'll see that most of the time this $order object is already there - I'm just adding it to the $context['subject'] array. (Unfamiliar with patches? Copy the three files to the Ubercart directory, change your directory to that Ubercart directory, and then execute patch -p0 < {filename} for each one.)

Enclosed is a module that contains the changes I've mentioned above. It assumes that you've used CCK to add 'price_us' and 'price_euro' fields to the product nodes, but otherwise you can use it as is: plug it into a test Ubercart installation, run it, and try it out. The patches are only necessary when viewing a finalized order, particularly in the admin area, so you don't even need them when just experimenting with the code.

Attachment: uc_example.zip (7.1 KB)

Dec 27 2009
Dec 27

Drupal uses CVS for version controlling core and contrib modules. CVS has diffing options, but it becomes annoying when you try to add or delete files. Here’s an example of how to use git to patch a contrib module - Organic groups.

  1. On OG's project page, click on the CVS instructions and select the version you want to patch. ``` cvs -z6 -d:pserver:anonymous:anonymous@cvs.drupal.org:/cvs/drupal-contrib checkout -d og-DRUPAL-6--2 -r DRUPAL-6--2 contributions/modules/og/ ```
  2. Go inside the directory you just checked out from CVS. ``` cd og-DRUPAL-6--2 ```
  3. Make the folder and all sub-folders a git repository, and add all the files. ``` git init git add . git commit -m "Initial commit of the Organic groups module." ```
  4. Now we have a master branch with the original module. We can create a new git branch and work on this new branch - edit files, move files around, create new directories, etc. ``` git checkout -b new-branch ```
  5. The new-branch is now the active branch. Make some changes to the code and commit them. ``` git commit -a -m "A meaningful commit description." ```
  6. In order to create a diff file, we switch back to the master branch and create a diff against our new-branch. Note that we use the ```--no-prefix``` option to follow Drupal's patch creation standard. ``` git checkout master git diff --no-prefix master new-branch > [issue-id]-[patch description]-[comment number].patch ```
  7. We can delete the temporary branch. ``` git branch -D new-branch ```

If you want to work only with git, and skip the CVS part, have a look at git.drupalfr.org.

Dec 08 2009
Dec 08

Aegir is an excellent way to manage multi-site Drupal instances, but sometimes it can be a bit too heavy. For example, if you only have a handful of sites, it can be overkill to deploy Aegir. And if there is an urgent security fix and you have a lot of sites (I am talking 100s if not 1000s) to patch, waiting for Aegir to migrate and verify all of your sites can be a little too slow.

For these situations I have a little script which I use to do the heavy lifting. I keep it in ~/bin/update-all-sites and it has a single purpose: to update all of my Drupal instances with a single command. Just like Aegir, my script leverages drush, but unlike Aegir there is no parachute, so if something breaks during the upgrade you get to keep all of the pieces. If you use this script, I would recommend always backing up all of your databases first - just in case.

I keep my “platforms” in svn, so before running the script I run a svn switch or svn update depending on how major the update is. If you are using git or bzr, you would do something similar first. If you aren’t using any form of version control - I feel sorry for your clients.

So here is the code, it should be pretty self explanatory - if not ask questions via the comments.

#!/bin/sh
# Update all drupal sites at once using drush - aka lazy person's aegir
#
# Written by Dave Hall
# Copyright (c) 2009 Dave Hall Consulting http://davehall.com.au
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
# Alternatively you may use and/or distribute it under the terms
# of the CC-BY-SA license http://creativecommons.org/licenses/by-sa/3.0/

# Change this to point to your instance of drush if it isn't in your path
DRUSH_CMD="drush"

if [ $# != 1 ]; then
    SCRIPT="`basename $0`"
    echo "Usage: $SCRIPT path-to-drupal-install"
    exit 1;
fi

SITES_PATH="$1"
START_DIR=$(pwd)

cd "$SITES_PATH/sites";
for site in `find ./ -maxdepth 1 -type d | cut -d/ -f2 | egrep -v '(.git|.bzr|.svn|all|^$)'`; do
    if [ -f "${site}/settings.php" ]; then
        echo updating $site
        $DRUSH_CMD updatedb -y -l $site
    fi
done

# Lets go back to where we started
cd "$START_DIR"

OK, so my script isn’t anywhere near as awesome as Aegir, but if you are lazy (or in a hurry) it can come in handy. Most of the time you will probably still want to use Aegir.

Notes:

Make sure you make the script executable (hint: run chmod +x /path/to/update-all-sites).

If you don’t have drush in your path, I would recommend you add it, but if you can’t then change DRUSH_CMD="drush" to point to your instance of drush - such as DRUSH_CMD="/opt/drush/drush".

Thanks to Peter Lieverdink (aka cafuego) for suggesting the improved regex.

Dec 03 2009
Dec 03

My blog is now syndicated on Planet Drupal. I am very excited about this - thanks Simon.

For the last 8 years or so I have been running my own IT consulting business, focusing on free/open source software and web application development. My clients have ranged from micro businesses up to well known geek brands like SGI. Until recently I led the phpGroupWare project.

My Drupal profile doesn’t really give much of a hint about my involvement with Drupal. My biggest regret is not signing up for a d.o account sooner. I forget when I started using Drupal 4.7, but I liked it straight away. It was the first CMS which worked the way I thought a CMS should work.

Over time I have learned how to get Drupal to do what I want it to do. Due to the massive range of contrib modules I haven’t got my hands very dirty hacking on Drupal - yet.

This year I have been involved in a major Drupal project which involves hosting around 2100 sites. Aegir has made a lot of this painless, especially with our 3,000 line install profile. Over the Christmas period I hope to find the time to blog about the setup, parts of it are pretty crazy.

I’ll get around to upgrading my site to Drupal 6 one of these days when I get some time, that should coincide with a visual and content refresh. Feel free to check out some of my older Drupal related posts.

Nov 30 2009
Nov 30

I have just finished reading Matt Butcher’s latest book, Drupal 6 JavaScript and jQuery, published by Packt Publishing - ISBN 978-1-847196-16-3. It is a good read. It is one of those books that arrived at the right time and left me inspired.

I have always leaned towards Yahoo’s YUI toolkit when I need an Ajax framework, while the rest of the time I just bash out a bit of JS to get the job done. The more I use Drupal, the more I have been wanting to find time to get into jQuery. This book has got me motivated to play with jQuery - especially in combination with Drupal.

The book is logically structured and flows well from chapter to chapter. I find Matt’s writing style easy to read; he even brought a smile to my face a few times. Matt assumes a basic knowledge of JS and Drupal, but he also provides links so the reader is able to get additional information if their knowledge is lacking. However, a couple of times Matt seemed to switch quite abruptly from assuming a good level of knowledge on a particular topic to explaining what seemed to me to be basic or simple concepts in great detail.

In the first chapter, entitled Drupal and JavaScript, Matt covers the basics of Drupal, its relationship with JavaScript and recommends some essential items for any serious Drupal developer’s toolbox. This chapter provides a nice introduction of what is to come in the rest of the book and allows the reader to become acquainted with Matt’s style.

Working with JavaScript in Drupal covers the basics of the Drupal coding standards and why sticking to them is important. It then moves on to a quick overview of Drupal’s theme engine, PHPTemplate, and integrating JS with Drupal themes. I felt that the development practices part of this chapter could have been expanded a bit more and turned into its own chapter. Understanding the basics of theming is critical for being able to follow the rest of the book, and I think this half of the chapter could also have been developed into a separate chapter. Regardless of how the chapter was arranged, the content is well written and provides solid, practical examples.

In jQuery: Do More with Drupal, Matt gives a detailed overview of jQuery and how it is used in Drupal. Although the code sample has limited real world usefulness, it provides the reader with a very clear idea of the power of jQuery and how easy it is to use with Drupal. By the end of this chapter I was left feeling like I wanted to get my hands dirty with jQuery, unfortunately it was after 1am and I had to work the next day.

In Chapter 4 we move on to Drupal’s Behaviors, which are covered in great detail. Behaviors are a key part of Drupal’s JS implementation and essentially provide an event-based hooks system in JavaScript. Once again Matt spends a lot of time explaining this feature: how it works, how to use it and where to learn more. Matt’s description of this feature had me thinking “OMG, Drupal behaviours are awesome” throughout the chapter.

Lost in Translations is the name of a good movie starring Scarlett Johansson and Bill Murray, which I enjoyed watching a few years ago - oh, and it is also the fifth chapter of the book. I suspect that I am like many English-speaking Drupal developers in that I use the basics of the Drupal translation engine but pay very little attention to how it works, as my target audience is English speaking like me. Not only does Matt explain how Drupal’s translation system works in both PHP and JavaScript, he makes it clear why all Drupal developers should understand and use the system - regardless of their native/target language/s.

The JavaScript Theming chapter was a bit of a surprise for me. I was expecting Drupal to have a JS equivalent to PHPTemplate and for this chapter to outline it and provide some code samples. Instead I learned that Drupal has a very simple, and easy to use, JS theming system. Matt spends some time discussing best practice for theming content in JS and goes on to provide the code for his own simple yet powerful jQuery based theming engine for Drupal.

In AJAX and Drupal Web Services, we learn about JSON, XML and XHR in the context of Drupal. Once again Matt demonstrates the ease of using Drupal and jQuery for quickly building powerful functionality.

Chapter 8 is entitled, Building a Module, and covers the basics of building a JS enabled module for Drupal. Matt also discusses when JS belongs in a theme and when it should be part of a module. The cross promotion of his other book Learning Drupal 6 Module Development ramps up a couple of notches in this chapter. I found the plugs a bit irritating (especially as I own a copy of the book), but overall the chapter is loaded with useful information.

The final chapter, Integrating and Extending, leaves the reader with a solid understanding of what can be done to make jQuery even more useful. This chapter provides a nice motivational finish to the book.

At the start of each chapter Matt recaps what has been covered and outlines where the chapter is heading which makes it easy to get back into the book after putting it down for a few days.

This book is definitely not for the copy and paste coder, nor the developer who just wants ready made solutions they can quickly hack into an existing project. Some may disagree, but I think this is a real positive of this book. Matt uses the examples to illustrate certain concepts or features which he wants the reader to understand. I found the examples got me thinking about what I wanted to use JS and jQuery for in my Drupal sites. Although some of the code samples run to several pages, Matt then spends a lot of time explaining what is happening in bite-sized chunks, which makes it easy to understand. I also appreciated the links to documentation so I could get the information I’d need to write my own code for my projects.

One thing which always annoys me about Packt books is the glossy ink they use. In some lighting conditions it is too shiny, which makes it annoying to read, especially with a bed side lamp. On the positive side, the paper is solid and easy to turn.

Sprinkled through the book is some cross promotion of other Packt titles, which I have no issue with; it is a good opportunity to try to grab some additional sales. In a couple of the later chapters it becomes a bit too much - I think once or twice per chapter is reasonable.

I really enjoyed reading Drupal 6 JavaScript and jQuery; it is easy to read and the chapters are a size that lends itself to being read in a session. I think any Drupal developer who wants to get into using JS in their sites/projects would benefit from reading this book. I finished it feeling like I wanted to start doing some hacking. I plan to update this site in the next few months, and jQuery enabled effects are now on the requirements list. I hope I can bump into Matt Butcher at a DrupalCon or somewhere else in my travels so I can buy him a beer to thank him for putting together a quality book.

Disclaimer: Packt Publishing gave me a dead tree copy of this book to review and keep. I’m glad they gave me a good title to review.

Nov 27 2009
Nov 27

After two years of working from home, I've decided it's time to make a move back into an office and look for some contract-based or perhaps permanent work in London.

I have four years of experience using, developing and helping guide the development of Drupal projects as well as a background and interest in all things geographic, from maps to open data (as you've probably seen from the topics I cover in my blog). With these skills I am looking to find some work as a Drupal developer for an organisation based in London, ideally integrating my geographic interest. Alternatively, I'm open to other opportunities that I may be suitable for.

If you have, or know of, any positions coming up from January onwards, I'd love to hear from you to discuss the details. You can find out more information about me in my CV (pdf) or on my LinkedIn profile.

Oct 13 2009
Oct 13

UPDATE: Screencast now lives here:

[embedded content]

I recorded a quick screencast of a simple integration we did to show Open Atrium leveraging Alfresco as a formal document repository via CMIS. This leverages the CMIS Alfresco module we developed and released on Drupal.org.

As I point out in the screencast, there’s not much to the integration from a technical standpoint. Open Atrium is Drupal and the CMIS module already has a CMIS repository browser. So, all we had to do was expose the module as a “feature”, which is something Open Atrium uses to bundle modules together that create a given chunk of functionality.

Readers familiar with Alfresco Share will instantly recognize the Open Atrium concepts. Instead of “sites”, Atrium uses “groups”. Instead of “pages” or “tools”, Atrium uses “features”. The overall purpose - self-provisioned, team-based collaboration - is the same, and many of the tools/features are the same (blog, calendar, member directory). I’m not advocating using one over the other - as usual, what works best for you depends on a lot of factors. I just thought Atrium provided a nice way to show yet another example of Drupal and Alfresco together (post).

Oct 12 2009
Oct 12

I am not reviewing a book today, but I soon will be. Packt Publishing have asked me to review Matt Butcher's new book Drupal 6 JavaScript and jQuery. The book looks pretty interesting. Although it isn't on the same scale, being asked to review a serious Drupal developer book is a bit like Obama winning the Nobel Peace Prize - OK, maybe I am exaggerating a little there.

I really like YUI, but Drupal has made me interested in jQuery. One of the awesome things about Drupal is that you can benefit from jQuery without ever having to write any jQuery. This has made me lazy about learning it - especially in the context of Drupal. It looks like I have run out of excuses.

The book should arrive in France by the end of the week, but I won't be back there for a couple of weeks; I am heading off to China for 10 days or so to catch up with friends and discuss some ideas about doing cool things with Drupal. Watch this space.

Sep 15 2009
Sep 15

People want intranets that are fun and easy to use, full of compelling content relevant to their job, and enabled with social and community features to help them discover connections with other teams, projects, and colleagues. IT wants something that’s lightweight and flexible enough to respond to the needs of the business that won’t cost a fortune.

That's why Drupal + Alfresco is a great combination for intranets like the one Optaros built for Activision, and why we had a record-breaking turnout for the Drupal + Alfresco webinar Chris Fuller and I did today. Thanks to everyone who came and asked good questions. I've posted the slides. Alfresco recorded the webinar so they'll make it available soon, I'm sure. When that happens, I'll update the post with a link. Until then, enjoy the slides.

[UPDATE: Fixed the slideshare link (thanks, David!) and added the links to the webinar recording below]

1. Streaming recording link:
https://alfresco.webex.com/alfresco/lsr.php?AT=pb&SP=TC&rID=42774837&act=pb&rKey=b44130d69cc9ec5f

2. Download recording link:
https://alfresco.webex.com/alfresco/ldr.php?AT=dw&SP=TC&rID=42774837&act=pf&rKey=c50049ac82e1220a

Sep 11 2009
Sep 11

I've been back in London for almost two years now and haven't met that many people working with Drupal, partly due to working from home I think, but also because there don't seem to be too many events (outside of paid training events and the like) that are aimed at Drupalers in the London area.

As I was looking today to find out if there were meetups happening already that I wasn't aware of, I came across a thread on groups.drupal.org asking about regular meetups, and left a comment to say I'd be interested if there was anything happening. In a city the size of London, there are surely enough people working with Drupal to get a group of people together every now and then for a social event. I for one would love to meet more people in the area who are working with Drupal, and maybe have a pint or two in the process.

By the end of the day, there was a meetup organised: London Drupal Pub Meet - September Meetup. Brilliant!

If you're interested in coming along, sign up to the meetup.com event, and I'll see you there!

edit: the event will be held from 7pm on Monday 28 September, at the Square Pig, Holborn.

Aug 29 2009
Aug 29

Today I should be in Prishtina, Kosovo, running Drupal workshops at the first Software Freedom Conference Kosova. Unfortunately, due to work and family commitments, I had to decline the invitation. I hope to make it there next year.

I will also miss DrupalCon Paris next week.

Sometimes it sucks to be in Australia, especially when Europe is so far away and so many cool things happen there.

Aug 28 2009
Aug 28

Bless me, internet, for I haven't blogged; it has been 274 days since my last post.

I have wanted to blog, but I kept on finding excuses to avoid it - need to upgrade the software, need to finish x, y and z, need to focus on projects a, b and c etc. One of the main reasons is that I have been too lazy to put the effort in. I find it takes time to think of what to blog and then to bash it out, refine it and post it. When I have had the time to put that effort into my blog, I haven’t had the inclination.

While things have been quiet here, I have been microblogging on identi.ca, using the open source Laconica platform.

More recently I have been working with a client in France who has some serious collaboration requirements. At last count they had almost 2100 instances of Drupal running. This has involved a lot of work, and some travel. I will blog about this project soon - it is pretty awesome (even if I say so myself).

We have built a small Drupal-powered site for a local asset management consulting business. They are very happy with the results. Now their customers just log in to download the software. Everything was off-the-shelf Drupal, except for the theme and a 60-line custom permissions module.

We have built the Newstead community website using Drupal. It still needs some polish before the final launch. The community has been heavily involved in the development of the site; so far over 30 locals have been trained in maintaining their page/s on the site. There is no "webmaster" - each local business and community group will maintain their own content.

A couple of months ago I/we joined the Drupal Association. One day the buttons will be added to the site.

Where to next?

I plan to blog more about projects I am involved in. I also plan to switch this site to Drupal 7 as close as possible to the release date - it looks like others will be switching too. Next week I will be commencing the build of the Newstead community wireless network.

Watch this space, lots happening - including more frequent updates from here on in.

May 27 2009
May 27

It is looking like June is going to be quite an interesting month for participation, with a couple of projects being set up to focus on certain parts of Drupal for the month.

Last week, Advantage Labs announced Geo June, a month of focused development on the geo module. The geo module has a lot of potential to become the basis of the GeoCMS that Drupal should be (as long as the module stays generic enough), and Advantage Labs are keen to get more people interested and involved to help make that happen. During the month there are a number of physical events, but you're also encouraged to share your use cases and join in day to day with the IRC chat in #drupal-geo.

The Drupal User Experience Project yesterday announced the launch of Microprojects to encourage user experience (UX) professionals to get involved in small, bounded problems, working with a Drupal developer to implement their designs and suggestions for improvement. This seems like a great idea, not only because it breaks down some quite large problems into bite-size, manageable chunks, but also because it gets some outside experts - who may not have previously used Drupal - involved in the community.

If you're interested in either of these areas (or any of the other sprints which are happening), why not jump in and get involved? Having not spent much time on the Geo project yet, I'm looking to spend some time getting to know it in June and hopefully help push it forwards, as well as starting the rewrite of the KML module to simplify it into a views display type instead of a bundle of custom code.

Apr 28 2009
Apr 28

I’ll be in Chicago tomorrow for the Alfresco Meetup. I’ll be speaking during the Barcamp on Alfresco and Drupal integration with CMIS (module, screencast). I’ll also have the Alfresco-Django integration running on my laptop. I may not have time to show Alfresco-Django during my slot, but I’ll be happy to stick around and do informal demos and talk about either integration if you’re interested because I’d like your feedback on it.

Apr 20 2009
Apr 20

People often need to build a custom user interface on top of the Alfresco repository and I see a lot of people asking general questions about how to do it. There are lots of options to consider. Here are four options for creating a user interface on top of Alfresco, at a high level:

Option 1: Use your favorite programming language and/or framework to talk to Alfresco via REST or Web Services. PHP? Python? Java? Flex? Whatever, it’s up to you. The REST API is nice because if you can’t find a URL that does what you need it to out-of-the-box, you can always roll-your-own with the web script framework. This option offers the most flexibility and creative freedom, but of course you might end up building constructs or components that you may have gotten “for free” from a higher-level framework. Optaros‘ streamlined web client, DoCASU, built on Ext-JS, is one freely-available example of a custom UI on top of Alfresco but there are others.
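
To make Option 1 a little more concrete, here is a rough PHP/cURL sketch of calling Alfresco's REST layer. It is only an illustration: it assumes the stock login web script and one of the sample folder-listing web scripts, so treat the URLs as placeholders and verify them against your Alfresco version (or your own custom web scripts) before using anything like this.

<?php
// Sketch only: the web script URLs below are assumptions/placeholders.
$base = 'http://localhost:8080/alfresco/service';

// 1. Ask the login web script for an authentication ticket.
$login = curl_init($base . '/api/login?u=admin&pw=admin');
curl_setopt($login, CURLOPT_RETURNTRANSFER, TRUE);
$xml = simplexml_load_string(curl_exec($login));
curl_close($login);
$ticket = (string) $xml; // e.g. TICKET_0123456789abcdef

// 2. Call another web script, passing the ticket along on the query string.
$request = curl_init($base . '/sample/folder/Company%20Home?alf_ticket=' . $ticket);
curl_setopt($request, CURLOPT_RETURNTRANSFER, TRUE);
print curl_exec($request);
curl_close($request);
?>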

Option 2: Use Alfresco’s Surf framework. Alfresco’s Surf framework is just that–it’s a framework. Don’t confuse it with Alfresco Share which is a team-centric collaboration client built on top of Surf. And, don’t assume that just because a piece of functionality is in Share it is available to you in the lower-level Surf framework. You may have to do some extra work to get some of the cool stuff in Share to work in your pure Surf app. Also realize that Surf is brand new and still maturing. You’ll be quickly disappointed if you hold it to the same standard as a more widely-used, well-established framework like Seam or Django. Surf is a good option for quick, Alfresco-centric solutions, especially if you think you might want to leverage Alfresco’s browser-based site assembly tool, Web Studio, at some point in the future. (See Do-it-yourself Alfresco Surf Code Camp).

Option 3: Customize the Alfresco “Explorer” web client. There are varying degrees to which you can customize the web client. On one end of the spectrum you’ve got Freemarker “presentation templates” followed closely by XML configuration. On the other end of the spectrum you’ve got more elaborate enhancements you can make using JavaServer Faces (JSF). Customizing the Alfresco Explorer web client should only be considered if you can keep your enhancements to an absolute minimum because:

  1. Alfresco is moving away from JSF in favor of Surf-based clients. The Explorer client will continue to be around, but I wouldn’t expect major efforts to be focused on that client going forward.
  2. JSF-based customizations of the web client can be time-consuming and potentially complex, particularly if you are new to JSF.
  3. For most solutions, you’ll get more customer satisfaction bang out of your coding buck by building a purpose-built, eye-catching, UI designed with your specific use cases in mind than you will by starting with the general-purpose web client and extending from there.

Option 4: Use a portal, community, or WCM platform. This includes PHP-based projects like Drupal (Drupal CMIS Screencast) or Joomla as well as Java-based projects like Liferay and JBoss Portal. This is a good option if you have requirements that match up well with the built-in (or easily added-on) capabilities of those platforms.

It’s worth talking about Java portal servers specifically. I think people are struggling a bit to find The Best Way to integrate Alfresco with a portal. Of course there probably is no single approach that will fit every situation but I think Alfresco (with help from the community) could do more to provide best practices.

Here are the options you have when integrating with a portal:

Portal Option 1: Configure Alfresco to be the replacement JSR-170 repository for the portal. This option seems like more trouble than it is worth. If all you need is what you can get out of JSR-170, you might as well use the already-integrated Jackrabbit repository that most open source portals ship with these days unless you have good reasons not to. I’m open to having my mind changed on this one, but it seems like if you want to use Alfresco and a portal, you’ve got bigger plans that are probably going to require custom portlets anyway.

Portal Option 2: Run Alfresco and the portal in the same JVM (post). This is NOT recommended if you need to scale beyond a small departmental solution and, really, I think with the de-coupling of the web script engine we should consider this one deprecated at this point.

Portal Option 3: Run the Alfresco web script engine and the portal in the same JVM. Like the previous option, this gives you the ability to write web scripts that are wrapped in a portlet but it cuts down on the size of the web app significantly and it frees up your portal to scale independently of the Alfresco repository tier. It’s a fast development cycle once you get it set up. But I haven’t seen great instructions for setting it up yet. Alfresco should document this on their wiki if they are going to support this pattern.

Portal Option 4: Write your own portlets that make services calls. This is the “cleanest” approach because it treats Alfresco like any other back-end you might want to integrate with from the portal. You write custom portlets and have them talk to Alfresco via REST or SOAP. You’ll have to decide how you want to handle authentication with Alfresco.

What about CMIS?

CMIS fits under the “Option 1: Use your favorite programming language” and “Portal Option 4: Write your own portlets” categories. You can make CMIS calls to Alfresco using both REST and SOAP from your own custom code, portlet or otherwise. The nice thing about CMIS is that you can use it to abstract the underlying repository so that (in theory) your front-end code will work with different CMIS-compliant back-ends. Just realize that CMIS isn’t a fully-ratified standard yet and although a CMIS implementation is in the Enterprise version of Alfresco, it isn’t clear to me whether or not you’d be supported if you had a problem. (The last response I saw on this specific question was a Peter Monks tweet saying, “I don’t think so”).

The CMIS standard should be approved by the end of the year and, if Alfresco's past performance is an indicator of the future, they'll be the first to market with a production-ready, fully-supported CMIS implementation based on the final spec.
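
As a hedged illustration of what "making CMIS calls using REST from your own custom code" can look like, the PHP sketch below fetches a CMIS AtomPub service document and prints the collection URLs it advertises. The endpoint URL is an assumption (the draft-era CMIS URL has varied between Alfresco releases), so treat it as a placeholder.

<?php
// Sketch only: read a CMIS AtomPub service document and list its collection URLs.
// The endpoint below is an assumption; check your Alfresco release's CMIS docs.
$serviceDocUrl = 'http://localhost:8080/alfresco/service/cmis';

$ch = curl_init($serviceDocUrl);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
curl_setopt($ch, CURLOPT_USERPWD, 'admin:admin'); // HTTP basic auth
$atom = curl_exec($ch);
curl_close($ch);

// The service document is Atom Publishing Protocol XML: workspaces contain collections.
$xml = simplexml_load_string($atom);
$xml->registerXPathNamespace('app', 'http://www.w3.org/2007/app');
foreach ($xml->xpath('//app:collection') as $collection) {
  print (string) $collection['href'] . "\n";
}
?>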

Pick your poison

Those are the options as I see them. Each one has trade-offs. Some may become more or less attractive over time as languages, frameworks, and the state of the art evolve. Ultimately, you’re going to have to evaluate which one fits your situation the best. You may have a hard time making a decision, but you have to admit that having to choose from several options is a nice problem to have.

Apr 01 2009
Apr 01

Developers are all familiar with the default behavior of the Drupal menu system's "local tasks" (aka tabs). These appear throughout most Drupal sites, primarily in the administration area, but also on other pages like the user profile.

Generally, developers are pretty good about creating logical local tasks, meaning they only create them for menu items which logically live under another menu item (view, edit, revisions, workflow, etc. live under the node/% menu item).

But sometimes, these tabs either don't really make sense as tabs or you simply want to have the flexibility of working with the items as "normal menu items", or those menu items which appear under admin/build/menu.

I recently wanted to move some of the tabs on the user profile page (user/UID) into the main menu so that I could include them as blocks.

For some reason, developers think the user profile page is a great place to put tabs for user-related pages such as a friends list, tracker, bookmarks, notifications and so on. But these types of items are less a part of the user's account information than they are resources for specific users. Personally, I would not think to look at my account information on a site to find stuff like favorites or buddies. I'd expect those items to be presented somewhere much more obvious, like a navigation block.

Initially, this may seem like a trivial task. My first thought was to simply use hook_menu_alter() and change the 'type' value of the menu item from MENU_LOCAL_TASK to MENU_NORMAL_ITEM. However, for reasons I don't understand well enough to explain in detail, this does not work.

In order to achieve the desired result, you must change the path of the menu item and incorporate the '%user_uid_optional' argument, replacing the default '%user' argument.

All very confusing, I know. Let's look at an example.

The notifications module (which provides notification on changes to subscribed to content) uses the user profile page rather heavily. I don't want its links there, I want them in the sidebar where users can always see them.

/**
 * Implementation of hook_menu_alter().
 */
function MODULENAME_menu_alter(&$callbacks) {
  // NOTIFICATIONS MODULE
  $callbacks['notifications/%user_uid_optional'] = $callbacks['user/%user/notifications'];
  $callbacks['notifications/%user_uid_optional']['type'] = MENU_NORMAL_ITEM;
  unset($callbacks['user/%user/notifications']);
  // <SNIP>
}

So I have moved the notifications menu into my own menu, changed the type, used %user_uid_optional instead of %user, and unset the original menu item.

This works fine except for the fact that you'll lose all of the other menu items under user/%user/notifications! You need to account for all menu items in the hierarchy to properly reproduce the tabs in the main menu system, so we add the following:

    $callbacks['notifications/%user_uid_optional/thread'] = $callbacks['user/%user/notifications/thread'];
    unset($callbacks['user/%user/notifications/thread']);

    $callbacks['notifications/%user_uid_optional/nodetype'] = $callbacks['user/%user/notifications/nodetype'];
    unset($callbacks['user/%user/notifications/nodetype']);

    $callbacks['notifications/%user_uid_optional/author'] = $callbacks['user/%user/notifications/author'];
    unset($callbacks['user/%user/notifications/author']);

And of course, we don't want this code executing at all if our module is not enabled, so you'd want to wrap the whole thing in:

  if (module_exists('notifications')) {
    // <SNIP>
  }

Keep in mind that not all modules implement menu items using hook_menu(). It's becoming more and more common for developers to rely on the Views module to generate menu items, and this is a wise choice. Menus generated using Views (as with the Bookmark module) can be modified to get the desired result without any custom code.

Mar 05 2009
Mar 05

OpenBand has today unveiled its new OpenBand Labs website, to better document the work we've been doing over recent years and, hopefully, to invite some discussion.

OpenBand Labs

We hope to make some improvements to the new site over the coming days as we continue to add some more information and blog about some of our experiences here at DrupalCon. We'll also be adding the slides and some writeups about the XMPP talk Darren gave yesterday and the distributed enterprise talk Ben gave today.

We had a great presentation yesterday, with Darren Ferguson talking about the XMPP Framework, and today Ben Lavender got the chance to demonstrate our collaboration platform.

Mar 03 2009
Mar 03

I'm in Washington, DC this week with many of the rest of the OpenBand / M.C. Dean team. We're all here to visit DrupalCon DC and meet our friends and associates in the Drupal community as well as present some of the things we've been working on.

A couple of the sessions that we proposed have been accepted, so we're going to be presenting:

If you're interested in the work we've been doing to help distributed enterprises communicate better, feel free to come up and say hi. We look forward to meeting you at DrupalCon!

Feb 27 2009
Feb 27

I was working the other night to create an integration module that would tie the existing Activity Stream module for Drupal into the Brightkite location-based social network.

The idea is that users can check in at their current location using Brightkite and have their Drupal site update their location based on their last known check-in - handy if you want a little map that shows where you are, for example. But you could do whatever you wanted with those locations, and even use them to extend a social network you might be building up in Drupal.

While I was fighting with the SimplePie feed parsing library to work out why it didn't like the feeds from Brightkite, John McKerrell suggested that some integration for his new Mapme.at service would be nice too.

So, the first two services to be supported by the new Activity Stream location services project are Brightkite and Mapme.at. I'd also like to extend this to other services like Yahoo's Fire Eagle and Google Latitude at some point, but neither of them are quite so simple to integrate with, the former because it has no public location feeds for users and requires authentication, and the latter because it doesn't share any of its data at all (boo!).

In their most basic form, the modules pull in the updates from these services and include them in your activity stream along with your Twitter updates and the like. If you also have the Location module installed and the user locations module enabled (plus, for now, a patch for Activity Stream), your user account will be updated with the latest coordinates from the location service you use.
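
As a rough sketch of the basic idea (not the actual module code), the snippet below uses SimplePie to read the most recent item from a location feed and pull out its coordinates; the feed URL is a placeholder, since Brightkite and Mapme.at each publish per-user feeds at their own addresses.

<?php
// Sketch only: fetch the latest item from a location feed and read its coordinates.
require_once 'simplepie.inc';

$feed = new SimplePie();
$feed->set_feed_url('http://example.com/user/location.rss'); // placeholder feed URL
$feed->init();

$item = $feed->get_item(0); // most recent check-in
if ($item && $item->get_latitude() !== NULL) {
  $lat = $item->get_latitude();
  $lon = $item->get_longitude();
  // A real integration would now store these against the Drupal user,
  // for example via the Location module's user location support.
  print "Last seen at $lat, $lon\n";
}
?>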

Jan 22 2009
Jan 22

The new website for Ballavayre Cottages went live recently, giving a new online presence to this 5-star self-catering accommodation in a 200-year-old cottage in Colby, Isle of Man.

The site, built on Drupal, allows the cottage owner to change content as and when they wish, and also to update their availability calendar to let visitors know when the cottages are available.

To help visitors see at a glance where the cottages are located, and to give directions from the sea terminal and airport, we included a series of custom built maps, designed (with some very helpful tips from Steve Chilton) using OpenStreetMap data.

Directions to Ballavayre Cottages

It is always great to help promote the Isle of Man as a tourist destination, even in a small way, by giving accommodation providers a chance to promote their services to a wider market.

Jan 15 2009
Jan 15

Man, we’re getting old!

Today (January 15th) is the 8th anniversary of the day Drupal 1.0 was released. Although Dries had no idea at the time - it was a move that would not only change his life, but mine too…

January 2009 also marks the 5th anniversary of my starting to work on Drupal full time (after a few years of “hobby” involvement). My first project (at the time, actually a re-launch) still stands as one of my favourites: http://www.terminus1525.ca/ . Since then, Drupal has defined my career: from co-founding Bryght to my current life as a Lullabot. The community is home to some of my best friends and people I love.

Five years - full-time. No wonder I feel old.

Jan 07 2009
Jan 07

It's amazing how much spam is generated when commenting is enabled on a blog. When I first launched the Drupal version of dankarran.com, I had commenting enabled for new posts but had it set so that administrator approval was needed before any new comments went live. Coming from a MovableType blog previously, I was used to doing this as I received tons of spam on that blog and had to moderate there as well. It's not a good user experience to expect people to wait until I approve a comment though, so I was keen to let users post directly.

When the site first launched, there were very few spam comments, so leaving comments open seemed feasible. But very soon - as spammers started to pick up on the changes - they started arriving in droves, and that was just on the few comment-enabled posts that I had created since launching.

Not wanting to impede people's commenting with a captcha for every comment, I avoided Drupal's captcha module and instead opted to try out the Mollom module as an interface to the Mollom spam filtering service created by Dries Buytaert, the founder of the Drupal project.

Mollom is a free service that checks all the comments (and/or other forms) posted for known patterns of spam, blocking it where appropriate, and letting real user-created comments through unhindered. If it's not sure whether it's spam or ham (the term for real content), it then presents the user with a captcha that the user can fill in if they have been mistakenly flagged as possible spam.

So far, the service has been great, with 588 spam messages blocked in the past 16 days, the busiest days being Christmas Eve and Christmas Day with 260 spam messages between them. I'm very happy to be a Mollom user! Sorry spammers, it's nothing personal, honest.

Dec 31 2008
Dec 31

From the start of 2009 I am going to be self-employed, and while I will still spend much of my time on the projects I've been working on for the past three years at OpenBand/M.C. Dean, I would also like to start taking on some small Drupal-based projects alongside that.

Some of the services I am planning to offer in the New Year include:

  • Drupal site creation
    If you are looking to get a site set up for your small business or organisation, and you like the power that Drupal can give to your site, then I can set a site up for you to meet your needs. I'm particularly interested in creating sites based around geographic information or related to the tourism industry, but I will happily consider any project.
  • Drupal site setup support
    If you need advice on how best to achieve your requirements with existing Drupal modules, I can help point you in the right direction and get you started with your Drupal site setup. This can either be on a remote basis, or on a face-to-face basis in the London area if needed.
  • Drupal module development
    If there isn't already a module in the Drupal community to do what you need, I can help you by building a module to meet your requirements.

If you're interested in taking advantage of these freelance Drupal services for your project, please contact me to discuss your needs.

Dec 15 2008
Dec 15

We needed to disable all of Drupal’s CSS files from our theme. Here’s how we did it:

function THEMENAME_preprocess(&$variables) {

  // Get rid of all of Drupal's CSS files, keeping only our own module's stylesheets.
  $css = drupal_add_css();
  foreach ($css['all']['module'] as $file => $status) {
    if (!strstr($file, 'modules/MYMODULE')) {
      unset($css['all']['module'][$file]);
    }
  }
  $variables['styles'] = drupal_get_css($css);
}

We also wanted (though there was no *real* need) to use screen.css rather than style.css, so we edited THEMENAME.info to include this:

stylesheets[all][] = reset.css
stylesheets[all][] = screen.css

… and we removed the line for style.css from it.

Finally, as the superuser we went to /admin/build/modules (or /themes, can't remember now) to refresh the theme cache. We also had to tick the box to enable the theme at /admin/build/themes; although we'd been using the theme quite happily for ages, it wasn't actually ticked before.

And hey presto, it worked. I should probably add that it took waaaay too long to do though, so we thought we'd add this snippet for others to read.

