Apr 13 2010

For those who might not know, Pantheon Mercury is:

... a drop-in replacement for your Drupal website hosting service that delivers breakthrough performance. Mercury can serve two hundred times more pages per second and generate pages three times faster than standard hosting services.

Mercury achieves this by using open-source technologies like so many ingredients of a complex dish: a little Varnish here, a dash of Memcached there, a hint of the Alternative PHP Cache, a healthy dose of Tomcat and Solr, all built on the Pressflow distribution of Drupal. None of it is anything you couldn't do yourself -- many before Chapter Three had done just that. They were, however, the first to tie it all together using BCFG2 and release it as an Amazon EC2 AMI image.

As word spread, many liked the idea of Mercury, but wanted to brew their own non-EC2 instance. While they posted a wiki article on how to do it yourself, they went to work on native support for RackSpace. When I read Josh Koenig's post on the Linode blog stating he wanted to bring Mercury to Linode, I made a mental note. Some time passed, I became much more involved in Drupal, and I decided to volunteer to write the StackScript. Josh said okay, and put me in touch with Greg Coit, their resident sysadmin, and we went to work.

Fast forward a couple weeks, and we've announced a beta! The StackScript is quite complete - it supports Ubuntu Jaunty and Karmic, and can use the current stable branch or the soon-to-be-released 1.1 development branch. Once Lucid is released, we'll test to make sure it works there as well.

I want to thank Greg for all his help. We found some bugs in Ubuntu and some quirks in the memcached init script, and we fixed many bugs and added some features in their BCFG2 Bazaar repo. Thanks also go out to Josh for his oversight and guidance. It was a great time, a great learning experience, and I came out of it with some new colleagues (and some free beers at DrupalConSF).

Feel free to read up on my experiences with Linode, and if you like what you see, click on one of the many links to Linode from my blog. If you sign up and stay a customer for 90 days (trust me, you will), I'll get $20 credited to my account. Feel free to comment below about the StackScript and let me know about any issues you might find.

Apr 13 2010


I once heard that San Francisco has so many restaurants that the entire population could eat out and there would still be empty seats. That's probably apocryphal, but it does have an embarrassment of gastronomic riches. The other truth about San Francisco is that it's expensive, so eating lunch for under $8 can be a challenge -- especially near a touristy area such as Moscone Convention Center.

But I lived there for 17 years before moving to Ohio last April and, being a cheap-ass, have collected a few favorites. They're in an annotated short list in this Google Map, "Good cheapish eats near DrupalCon SF 2010". In brief:

    Jollibee (4th Street at Howard) is the fastest and cheapest.

    Sushi Club is surprisingly fast and cheap, and has take-out.

    Tu Lan is the best for the money, but a bit far.

To add your own favorites, log into your Google account, click "My Maps", then click the Edit button. (Thanks to Shawn DeArmond for the tip.) Please (a) give your own marks a distinctive icon, (b) keep it limited to under-$8 lunch places within a 15-minute walk of the convention center, and (c) don't touch anyone else's marks.

Need a suggestion at the con? Text me at 415-317-1805 and I'll do my best to help.

This con feels like it's shaping up to be a real ground-breaker. I can't wait!

Apr 12 2010
Starting with the community site Urban Vancouver, then as the support cowboy for Bryght and Raincity Studios, and now with an independent practice, I've enjoyed all of my almost six years with the Drupal community. In a couple of weeks, I'll fly to San Francisco to attend my first DrupalCon. With my flight and hotel booked, my conference ticket registered, and a Major League Baseball ticket received in the mail, I look forward to three full days of sightseeing in the Bay Area, including plans to tour North Beach and ride San Francisco's historic streetcars.

The conference itself will present me the opportunity to meet many of the people working on the open source CMS I admire. I'm looking forward to attending sessions and hanging out in the lobby, and in the evenings, drinking some sweet, delicious beer with colleagues.

Tuesday night sees me riding BART out to Oakland-Alameda County Coliseum for the first time in some twenty years; on my last visit I saw my favourite Blue Jay of all time, Jesse Barfield, sock a dinger. This time, without a team to cheer for, I hope to soak it all in, arriving as early as possible to catch hitting and fielding practice, then taunting and booing the Yankees until my throat is sore. Leaving the conference early means missing some fantastic-looking DrupalCon sessions from Narayan Newton and Greg Knaddison. If the baseball gods get angry and rain out the A's game, you'll see me at one of those two sessions.
Apr 12 2010

Last week I had the crazy idea that I could build my own version of Twitpic with Drupal. It turns out that I can: in under 300 lines of code I wrote the Drippic module, which allows posting of photos from Tweetie for iPhone. Here's what it does:

  • Allows posting of photos (and a tweet) via a POST request. Photos are saved as an ImageField and tweets are saved as the title of a node.
  • Allows commenting on photos, posting the comment back to Twitter.
  • User accounts are generated, and verified with Twitter, when posting a photo or comment.
  • Short URLs are created using the Shorten module.

Tweetie for iPhone (recently bought by Twitter, to be rebranded as 'the Twitter iPhone app') allows a custom URL to be set as the image service; Tweetie will then post the photo to that URL when the user tweets a photo. By working from the Tweetie documentation, Drippic supports this function. Since this is just a standard POST request, an HTML form can easily be built to post to the same URL.

There are still a few items left to do: develop a theme, develop a mobile theme, allow posting and tweeting of photos via the Drupal/Drippic website, allow posting via email, and add OAuth user login.

If you want to take a look at Drippic, head over to http://drippic.millwoodonline.co.uk. If you want to use Drippic, set your Tweetie image service to http://drippic.millwoodonline.co.uk/drippic/upload. If you want to download Drippic, go to http://github.com/timmillwood/Drippic. If you want to just post to Drippic (and tweet), use the following HTML form code.

<form enctype="multipart/form-data" action="http://drippic.millwoodonline.co.uk/drippic/upload/tweet" method="post">
username <input name="username" type="text"><br>
password <input name="password" type="password"><br>
Tweet <input name="message" type="text"><br>
Choose a file to upload: <input name="media" type="file"><br>
<input type="submit" value="Upload File">
</form>
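For the curious, the same request can also be scripted. The following is a minimal, hypothetical Python sketch (Drippic doesn't ship this; the field names and upload URL are taken from the form above, everything else is illustrative) that builds the multipart/form-data POST by hand:

```python
import mimetypes
import urllib.request
import uuid

def build_drippic_request(username, password, message, filename, image_bytes):
    """Build a multipart/form-data POST matching the Drippic upload form."""
    boundary = uuid.uuid4().hex
    content_type = mimetypes.guess_type(filename)[0] or "application/octet-stream"
    parts = []
    # Plain text fields: username, password and the tweet text.
    for name, value in (("username", username),
                        ("password", password),
                        ("message", message)):
        parts.append(
            f"--{boundary}\r\n"
            f'Content-Disposition: form-data; name="{name}"\r\n\r\n'
            f"{value}\r\n".encode("utf-8")
        )
    # The photo itself, sent under the "media" field name used by the form.
    parts.append(
        (f"--{boundary}\r\n"
         f'Content-Disposition: form-data; name="media"; filename="{filename}"\r\n'
         f"Content-Type: {content_type}\r\n\r\n").encode("utf-8")
        + image_bytes + b"\r\n"
    )
    parts.append(f"--{boundary}--\r\n".encode("utf-8"))
    body = b"".join(parts)
    return urllib.request.Request(
        "http://drippic.millwoodonline.co.uk/drippic/upload/tweet",
        data=body,
        headers={"Content-Type": f"multipart/form-data; boundary={boundary}"},
    )

# Passing the returned request to urllib.request.urlopen() would send the
# photo and tweet, just like submitting the HTML form.
```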
Apr 12 2010

The Drupal community is a very diverse one. So when the idea came up to make a viral video to promote the DrupalDevDays and invite Dries Buytaert (founder of the Drupal project), we didn't have a shortage of talent in the local community. From video experts to stunt actors, everyone came together to produce this video, and with some key people coordinating everything we managed to launch the whole campaign in just a few days. If you have as much fun watching the video as we had making it, please help us spread the word!

If you were already planning on coming to the DrupalDevDays, or if our video just convinced you to come, make sure to register now. Ticket prices will be raised next week, so you don't want to wait too long.

PS: no, I didn't drink that liter of beer by myself. If I'm making that face in the last scene it's because we added sugar to the beer. It gives the beer a nice foam which is great esthetically, but it doesn't taste good...

Apr 12 2010

This is the third part of the series on Content Complete. If you have missed the introduction of the Content Complete module, please read the first part in this series.

Thanks to the integration with the powerful Rules module, you can set up complex workflows with data completion in a couple of minutes. I'll show you how to set up email reminders sent automatically to authors of incomplete Album nodes (< 100%).

If you're not familiar with the Rules module, please read this introduction and check out the tutorials here.

Workflows on data completion using Rules

You need to download and install the Rules module (Rules, Rules Administration UI, Rules Scheduler), the Views module and the Token module.

First, we will create a rule set that will continuously reschedule itself until the node is completed.

  • Go to "admin/rules/rule_sets/add" and fill out the label with "CC Check (Set)" and the machine-readable name with "cc_check".
  • In the "Arguments" fieldset select "Content" and "User".
  • Back on the Rule Set page, click on the freshly created set and click "Add a new rule". Set the label to "CC Check".
  • Every rule works with conditions (IF) and actions (THEN). Add the condition "Content Complete % is >= than x" and fill out 100 under "% to compare". Click on "Negate" such that our condition translates to "if Content Complete % < 100".
  • Add the action "Send a mail to a user". As recipient, select the content's author. Fill out the subject with for example "Complete your node [node:title]" and the message with something like "Click on [node:site-url]/node/[node:nid]/edit to complete your node." Check out the Token Replacement Patterns for more options.
  • We need to reschedule our rule continuously. Add another action "Schedule 'CC Check (Set)'", give it an identifier "CC Check '[node:title]'" and as scheduled evaluation date put "+1 day". Make sure you have cron configured to have email reminders sent out daily.
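The logic the rule set implements boils down to a simple check-remind-reschedule loop. Here is an illustrative Python sketch of that flow (purely a toy model; in practice the Rules module and cron handle all of this, and every name below is made up for illustration):

```python
from datetime import date, timedelta

def cc_check(node, today, send_mail):
    """Toy version of the 'CC Check' rule set: remind, then reschedule.

    Returns the next scheduled evaluation date, or None once complete.
    """
    if node["completeness"] < 100:  # condition: Content Complete % < 100
        send_mail(node["author"], f"Complete your node {node['title']}")
        return today + timedelta(days=1)  # action: reschedule for +1 day
    return None  # node is complete; stop rescheduling

# Simulate a few daily cron runs on an incomplete node.
sent = []
node = {"title": "My Album", "author": "bob", "completeness": 60}
today = date(2010, 4, 12)
for _ in range(3):
    next_run = cc_check(node, today, lambda to, msg: sent.append((to, msg)))
    if next_run is None:
        break
    today = next_run
node["completeness"] = 100  # the author finally completes the node
final = cc_check(node, today, lambda to, msg: sent.append((to, msg)))
```

After three daily runs the author has received three reminders; once the node reaches 100%, the rule set simply stops rescheduling itself.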

Rule configuration with conditions (node of type 'Album' updated) and actions (run CC Rule Set)

We now have a rule set that checks the completion of the node and, if needed, sends an email reminder to the author of that node. Now we need to trigger that rule set on a Drupal event, for example, every time a node of type 'Album' is updated.

  • Go to "admin/rules/trigger/add" and fill out the label with "CC Check". As a trigger event, we choose "After updating existing content".
  • Add the condition "Content has type" and select the "Album" content type.
  • Add the action "CC Check (Set)" with Content set to "created content" and User set to "content's author".

Rule configuration with conditions (node of type 'Album' updated) and actions (run CC Rule Set)

Test your new workflow by writing an additional message to watchdog and setting the scheduled evaluation date to, for example, "+1 min". You can also enable the Rules module's debugging mode: on the settings page, click "Debug rule evaluation".

Future of Content Complete: a Complete API

Checking completion of content is just one approach to keeping your Drupal websites as "complete" as possible. We have already experimented with other kinds of "completion", e.g. counting user actions as steps in processes that need completing. If you're interested in, for example, having each of your users add 3 nodes of type 'Album', with every node at least 75% complete, check out the Complete module. That module is related to the Content Complete module, but measures completeness of actions (as measured by the Rules module) instead of completeness of nodes. A combination of both modules could lead to a "Complete API"; some preliminary ideas are under discussion here. Looking forward to your feedback!

Apr 12 2010

jQuery for Designers and Themers is a fun interactive session at DrupalCon San Francisco on getting started with jQuery. It is targeted at designers and themers but is suitable for anyone with a decent understanding of HTML and CSS — no programming experience is necessary. It doesn't include any PHP, and only basic programming concepts are introduced.

The session is at 8:30am on Tuesday 20 April, in room 307 (Commerce Guys) at DrupalCon SF.

The sample code is available at Drupal.org/Project/jQ4DaT and slides are available at TinyURL.com/jQuery-Designers (Google Docs).

Some other related or similar sessions include:

Apr 12 2010

DrupalSouth attendees pointing at Angela 'webchick' Byron (Drupal 7 core committer) in the center

DrupalSouth Wellington 2010 was a booming success! And that would be an understatement. 100 Drupallers from NZ, Australia, North America and Europe came together for 2 Wellington-wet days in a brewery and couldn't stop talking about Drupal!

Here is DrupalSouth by the numbers:

  • 1: Code sprints
  • 2: Tracks (simultaneous sessions)
  • 2: Duration in days
  • 2: Lunches provided
  • 2: Organisers
  • 2: Attendees from parliament (Green party)
  • 3: Keynote speakers from North America (Liz Henry, Emma Jane Hogbin & Angela Byron)
  • 3: Platinum Sponsors
  • 3: DrupliBeanBags
  • 4: Attendees from the IRD
  • 5: Gold sponsors
  • 5: Percent of attendees from Hawke's Bay
  • 5: Months to organise
  • 6: Companies involved in the wireless internet
  • 6: Wireless access points
  • 7: Value of each bar token in NZ dollars
  • 8: Silver Sponsors
  • 9: Varieties of beer brewed on-site
  • 10: Start time on Saturday
  • 11: Thousands of dollars turned over in event production
  • 15: Attendees from NZ government agencies (IRD, Greens, NZ Police, various ministries, etc.)
  • 16: Sponsors
  • 16: Percent of attendees from Australia
  • 16: Percent of attendees from Christchurch
  • 18: Age of youngest attendee
  • 20: MBs of synchronous bandwidth
  • 21: Percent of attendees from Auckland
  • 26: Speakers
  • 28: Attendees who also attended LCA the week before
  • 29: Sessions
  • 30: Percent of female attendees
  • 32: Percent of attendees from Wellington region
  • 36: A3 sheets of printed sponsor logos
  • 60: Registration cost
  • 64: Cost of food and snacks per attendee
  • 100: Registrations sold
  • 220: Bar tokens printed

Some of my personal highlights were:

Thank you to:

Read others' post-DrupalSouth write-ups at:

Apr 12 2010

I am looking for confirmations from other Drupal developers regarding details and corroborations. Comments are welcome here. PHBs need not worry, your Drupal site is just fine.

This post is about an inherent problem with Google's recently announced "speed as a ranking feature" and how it interacts with content management systems like Drupal and Wordpress. For an auto-generated website, Google is often the first and only visitor to a lot of pages. Since Drupal spends a lot of time on the first render of a page, Google will likely see this delay. This is due both to a problem with how Drupal generates pages and to Google's metric.

Google recently announced that, as part of its quest to make the web a faster place, it will penalize slow websites in its rankings:

today we’re including a new signal in our search ranking algorithms: site speed. Site speed reflects how quickly a website responds to web requests.

Since Google’s nice enough to provide webmaster tools, I looked up how my site was doing, and got this disappointing set of numbers:

(Screenshot: my site's performance numbers in Webmaster Tools, Apr 11 2010)

I’m aware 3 seconds is too long. Other Drupal folks have reported ~600ms averages. My current site does under 1 second on average, based on my measurements. This is probably because I occasionally have some funky experiments going on in some parts of the site that run expensive queries. Still, some other results were surprising:

Investigating further, it looks like there are 3 problems:

(Screenshot: the three problems reported by Webmaster Tools, Apr 11 2010)

DNS issues & multiple CSS: Since Google Analytics is on a large number of websites, I’m expecting its DNS to be prefetched already. CSS is not an issue either, since the two files are client-media specific (print / screen).

GZip Compression: Now this is very odd. I’m pretty sure I have gzip compression enabled in Drupal (Admin > Performance > Compression). Why is Google reporting lack of compression? To check, I ran some tests, and discovered that since Google usually sees the page before it’s cached, it’s getting a non-gzipped version. This happens due to the way Drupal’s cache behaves, and is fixable. Ordinarily, this is a small problem, since uncached pages are rendered for only the first visitor. But since Google is the first visitor to a majority of the pages in a less popular site, it thinks the entire site is uncompressed. I’ve started a bug report for the uncached page gzip problem.

A flawed metric: The other problem is that Drupal (and Wordpress, etc.) use a "fry" model: pages are generated on the fly, per request. Movable Type and others, on the other hand, "bake" their pages beforehand, so anything served up doesn’t go through the CMS. Caching in fry-based systems is typically done on first render: the first visit to a page is generated from scratch and written to the database/filesystem, and any successive visitor to that page sees a render from the cache.

Since the Googlebot is usually the first (and only) visitor to many pages on a small site, the average crawl hits a large number of pages where Drupal is writing things to cache for the next visitor. This means every page Googlebot visits costs a write to the database. While afaik Drupal runs page_set_cache after rendering the entire page, and hence the user experience is snappy, I’m assuming Google counts time to connection close and not time to the closing </html> tag, resulting in a bad rendering-time evaluation.
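To make this concrete, here is a toy Python simulation of the fry-model caching described above: the first visitor triggers a full render plus a cache write (and, per the gzip quirk noted earlier, gets the uncompressed page), while every later visitor gets the compressed cached copy. This is an illustrative sketch only, not Drupal's actual cache code:

```python
import gzip

class FryModelCache:
    """Toy model: pages are rendered on first request and cached gzipped."""

    def __init__(self):
        self.cache = {}   # path -> gzipped page
        self.renders = 0  # counts expensive first renders (plus cache writes)

    def render(self, path):
        self.renders += 1  # expensive: full page build + DB cache write
        return f"<html>page for {path}</html>".encode("utf-8")

    def serve(self, path):
        """Return (body, content_encoding) for one request."""
        if path in self.cache:
            # Cache hit: cheap, and the stored copy is already gzipped.
            return self.cache[path], "gzip"
        page = self.render(path)
        self.cache[path] = gzip.compress(page)
        # The first visitor gets the freshly rendered, uncompressed page.
        return page, "identity"

cache = FryModelCache()
first = cache.serve("/node/1")   # Googlebot: render + cache write, no gzip
second = cache.serve("/node/1")  # real visitor: gzipped cache hit
```

A crawl over N never-visited pages costs N renders and cache writes, and sees zero compressed responses; that is exactly the worst case Site Speed ends up measuring.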

This means that Google’s Site Speed is not representative of the average user (i.e. the second, third, fourth, etc. visitors, who read from the cache); it only represents the absolute worst-case situation for the website, which is hardly a fair metric. (Note that this is based on my speculation about what Site Speed means, given the existing documentation.)

Apr 10 2010

This video shows you how to create a custom page for your site using the Panels module. You can use Panels to help you make pages (including a front page) that include a variety of content from your site including content that exists in views, blocks and nodes. The Panels module is dependent on the Chaos Tool Suite module.

Note: Click the 'full screen' icon (to the left of the volume control) in order to watch online at full 1280x720 resolution.

Video Links

Flash Version

Apr 10 2010

Sorry, I couldn’t resist the play on the words. This storied Minneapolis nightclub has been a music mecca since the '70s, entering the national spotlight in the mid-'80s when it helped launch Prince and was the venue for the classic movie Purple Rain.

Level OS first launched the site around 6 years ago on a custom ASP.NET platform (cut me some slack, it was my first project with my own company!). It was migrated to Drupal about two and a half years ago, right around v5.2. That version of the site featured a custom Ubercart integration allowing the club to sell both tickets and sightline seats to events, along with regular merchandise, plus heavy use of CCK and Views for features like blogs, forums, galleries, sample audio tracks, and an online community.

In the ensuing two years, both the code base and the design lacked the flexibility the site needed, so we updated both. Side note: it’s amazing what passed in my book for Drupal best practices 3 years ago! The upgrade process was, to be blunt, extremely painful, and this coming from a fairly experienced Drupaler. The challenges we encountered included:

  • Some problems migrating modules from D5 to D6 versions. E.g., Ubercart changed the structure of a product’s data object, but existing data wasn’t updated and the change wasn’t clearly documented. The changes were not complex, but they were mostly discovered through trial and error.
  • Migrating from the Event module to Date/Calendar/CCK.
  • Theme updates
  • Custom module updates
  • Migrating to the use of true best practices
  • The site is fairly heavily trafficked, with lots of new content daily, so nearly all the configuration updates were codified.

The new site, while still having lots of room for improvement, is finally coming into its own and features the following:

  • Still using Ubercart, now in a much more mature state, for selling tickets and reserved table seats for select events, both of which have their own inventories, along with merchandise. The former is accomplished by taking advantage of a few UC hooks along with product attributes.
  • Event system with jQuery carousels on the front page, listings grouped by day with heavily themed rows, and iCal feeds. The usual tools were employed, including the CCK, Date, Calendar, and Views modules.
  • Very active Blog
  • Venues, with mapped locations and a list of events in each venue

Big thanks to Annette Brooks for her amazing design skills and to RJ Steinert for his theming help. I’m happy to answer any questions about the implementation or the tools that were used, and, of course, feedback is very welcome.

Apr 08 2010

This is the second part of the series on Content Complete. If you have missed the introduction of the Content Complete module, please read the first part in this series.

In this post, I'll guide you through setting up views that show the completion of Drupal nodes using the Views module. If you don't know how to work with Views, check out the documentation included with the module.

A list view on node completion

Imagine we have 10 artist nodes filled in by different users on the website and we want to have a simple list of those nodes together with their completion status.

Start by adding a new View of type 'Node' in your website. For a basic administration view, we add the post date and the title of the node to the 'Fields' and add a filter for nodes of content type 'Album'. To show the completion, add the 'Content Complete: Completeness' field from the 'Content Complete' group. You can display the field as a Numeric Value or Bar and select the option to show a field that links to the next field to be completed. The module also provides a raw view of the data with 'Content Complete: Completeness Data' which will be discussed in the next paragraph. We select the Table style and make the completeness field sortable. Check out the result of this simple view in the second figure.

Administering a list view and putting Content Complete fields

Field options: different displays for the Completeness field

A list view of Album nodes ordered by completion

Total completion of a list of nodes

Another way to output the data is to show the total completion of a list of selected nodes, e.g. of all the 'Album' nodes. For this we select the Views style 'Content Complete' and add 'Content Complete: Completeness Data' as a field. The style will process all the data it receives from the nodes and compute the total completeness and the next field to be completed. This is very useful if you want to show only one completion bar for all your content, as shown in the figure below.

A list view of Album nodes ordered by completion

What's next

In the third and last part of this series, I'll show you how to set up a user flow to drive users to complete their nodes by sending out email reminders.

Apr 08 2010

The image at left is a slide from a deck I'm putting together for a possible lightning talk at DrupalCon SF. So far there are 11 slides showing the historical complexity of Drupal in terms of some simple metrics, along with comparisons with the current release of Joomla!. The point of this is to try to understand the complexity of Drupal, both historically and in comparison with other frameworks, in terms of software metrics.

Comments, questions, pointers, and insights/opinions are very welcome.

BTW - the Joomla! metrics (and the Drupal 7.x metrics) are probably not quite right yet - this is a work in progress. I'm working on metrics for Wordpress and a few other platforms also.

Update: I presented this at DCSF - thanks to everyone who commented and discussed this with me. The attached file is the slides as presented Sat Apr 17, 2010.

Attachment size: 207.79 KB
Apr 08 2010

Drupal's profile module is basic, but sometimes it's just right for the job. You can add categories that appear as tabs when you edit your profile, and you can add a variety of basic fields to the categories.

Drupal comes with a user picture and signature that can be displayed in your comments, forum posts, and pretty much anywhere with a little theming. By default the picture and signature are entered on the "account" tab when you edit the profile. But if you have another category with additional personal information, like a biography, it can make more sense to move the pic and sig there, instead of its default placement with the username, email and password.

The API's hook_user() lets you move these fields, and lets you merge any number of tabs, including the account tab. There is a module that blasts them all into one tab: One page profile.

If you need a bit more finesse, here are a few tricks. This all goes in a custom module.

First we'll combine a couple of tabs into the account tab.

function custom_user($op, &$edit, &$account, $category = NULL) {
  if ($op == 'form' && $category == 'account') {
    // Tack the two profile categories onto the account form. Note that we
    // pass $account, the user object supplied to hook_user().
    $form = profile_form_profile($edit, $account, 'Personal Information');
    $form += profile_form_profile($edit, $account, 'Historical Information');
    $form['Historical Information']['#weight'] = 10;
    $form['Personal Information']['#weight'] = 11;
    return $form;
  }
}

This checks that hook_user() is building the 'account' tab, and it tacks on the Personal Information and Historical Information tabs. Those are the names I assigned as categories when I added profile fields.

Depending on how many tabs you have, you might need to adjust the weight.

Next, you probably want to remove the Historical and Personal tabs. You do this in a hook_menu_alter(). Be sure to clear your menu cache.

function custom_menu_alter(&$items) {
  // Unset these tabs on the user/edit page.
  unset($items['user/%user_category/edit/Historical Information']);
  unset($items['user/%user_category/edit/Personal Information']);
}

Next, in a separate example, let's move the picture and signature fields to another tab where they make more sense, my Personal Information category.

This is hook_user() again.

if ($op == 'form' && $category == 'Personal Information') {

  // Build the category's own profile fields first.
  $form = profile_form_profile($edit, $account, 'Personal Information');

  // Load the entire account form into $picture.
  $picture = _user_forms($edit, $account, 'account');

  // Add weights to the Personal Information fields so we can stick the
  // picture and signature where we want them.
  $i = 0;
  foreach (element_children($form['Personal Information']) as $key => $element) {
    $form['Personal Information'][$element]['#weight'] = $i++;
  }

  // We must add the uid for use in the picture file name.
  $form['#uid'] = $account->uid;

  // Merge in the picture field.
  $form['Personal Information'] += $picture['picture'];

  // We need to add in the picture validation.
  $form['#validate'][] = 'user_validate_picture';

  // Set the weights and change the title.
  $form['Personal Information']['picture_upload']['#weight'] = 0.1;
  $form['Personal Information']['picture_delete']['#weight'] = 0.2;
  $form['Personal Information']['picture_upload']['#title'] = 'Upload Personal Photo';

  // Repeat for the signature.
  $form['Personal Information'] += $picture['signature_settings'];
  $form['Personal Information']['signature']['#weight'] = 0.5;
  $form['Personal Information']['signature']['#title'] = 'Personal Motto';
  // We don't want the confusing filter tips.
  unset($form['Personal Information']['signature_format']);

  return $form;
}

Now we need to remove those fields from our account tab, which will be left with just the email, username and password, like it should be.

This is done with hook_form_alter().

function custom_form_alter(&$form, $form_state) {
  switch ($form['#id']) {

    // Remove sig and pic from the account tab; these are the same fields
    // we merged into the Personal Information tab above.
    case 'user-profile-form':
      unset($form['picture']);
      unset($form['signature_settings']);
      break;
  }
}

Finally, Drupal will keep all the profile fields' data in the users table's data column until they are specifically saved to the profile_values table. This happens automatically until you start moving things around like we did above. In your hook_user() implementation you'll need to save the profile specifically for the categories you merged in, but not for the original category. Otherwise, your merged fields will live in limbo and not be available to the Views module.

if (($op == 'update' || $op == 'insert') && $category == 'account') {
  profile_save_profile($edit, $account, 'Personal Information');
  profile_save_profile($edit, $account, 'Historical Information');
}

Apr 07 2010

The OpenLayers suite represents a breakthrough in Drupal mapping solutions. The module allows you to combine maps from different providers (Google Maps, Yahoo!, and many others) and, by using services like CloudMade or MapBox, it is finally possible to have maps perfectly integrated into the website's look and feel. To input geospatial information using OpenLayers, we have to enable the OpenLayers CCK module: its input widget provides a map where users can easily mark the desired location.

Unfortunately, this approach does not scale to deal with more complex scenarios. Imagine a website where users need to geo-locate restaurants in different cities: the input process will result in a series of very tedious drag-and-zoom operations. Ideally, the only information the user would need to provide is the address or the name of the location he or she is looking for.

Google Geocoding Web Service

The Google Geocoding Web Service V3 is a very powerful service that accepts full street addresses, location names and even well-known places (e.g. airports, train stations, etc.), returning a complete set of information about the location we are looking for. For instance, let's have a closer look at the web service response for Pizzeria da Vittorio, Rome:


results in:

{
  "status": "OK",
  "results": [ {
    "types": [ "point_of_interest", "establishment" ],
    "formatted_address": "Pizzeria da Vittorio di Martino Enzo, Via Benedetto Croce, 123, 00142 Rome, Italy",
    "address_components": [ {
      "long_name": "Pizzeria da Vittorio di Martino Enzo",
      "short_name": "Pizzeria da Vittorio di Martino Enzo",
      "types": [ "point_of_interest", "establishment" ]
    }, {
      "long_name": "123",
      "short_name": "123",
      "types": [ "street_number" ]
    }, {
      "long_name": "Via Benedetto Croce",
      "short_name": "Via Benedetto Croce",
      "types": [ "route" ]
    }, {
      "long_name": "Rome",
      "short_name": "Rome",
      "types": [ "locality", "political" ]
    }, {
      "long_name": "Rome",
      "short_name": "RM",
      "types": [ "administrative_area_level_2", "political" ]
    }, {
      "long_name": "Lazio",
      "short_name": "Lazio",
      "types": [ "administrative_area_level_1", "political" ]
    }, {
      "long_name": "Italy",
      "short_name": "IT",
      "types": [ "country", "political" ]
    }, {
      "long_name": "00142",
      "short_name": "00142",
      "types": [ "postal_code" ]
    } ],
    "geometry": {
      "location": {
        "lat": 41.8428020,
        "lng": 12.4858480
      },
      "location_type": "APPROXIMATE",
      "viewport": {
        "southwest": {
          "lat": 41.8344890,
          "lng": 12.4698406
        },
        "northeast": {
          "lat": 41.8511139,
          "lng": 12.5018554
        }
      }
    },
    "partial_match": true
  } ]
}

As you can see, the web service tells us pretty much everything we want to know about the location. It's time to use all this awesomeness to make the user's life easier!
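For context (the request itself is not shown in the original post), such a response comes from a plain HTTP GET against the V3 endpoint. The sketch below builds the request URL for the query above; the endpoint path and parameters follow the public Geocoding V3 documentation, while the quick-and-dirty encoding via sed is just an illustration:

```shell
# Build a Geocoding V3 request URL for a free-text query.
# Spaces and commas in the address parameter must be URL-encoded.
QUERY="Pizzeria da Vittorio, Rome"
ENCODED=$(printf '%s' "$QUERY" | sed 's/,/%2C/g; s/ /+/g')
URL="http://maps.googleapis.com/maps/api/geocode/json?address=${ENCODED}&sensor=false"
echo "$URL"
# The JSON above can then be fetched with: curl -s "$URL"
```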

Meet OpenLayers Geocoder

OpenLayers Geocoder provides a new input widget for OpenLayers CCK fields that makes location spotting a fast and painless experience. All we need to do is select the OpenLayers Geocoder input widget on the CCK field settings page.


After enabling the OpenLayers Geocoder widget, adding a restaurant is all about providing its name or address: OpenLayers Geocoder will provide the user with a list of possible locations.


Get the most out of the web service response

So far we have only used the geospatial information from the web service response: latitude and longitude to center the map, and the bounding box to nicely fit the desired location. Beyond that, the response contains really valuable information about the location we were looking for, like the postal code, administrative area, city, country, etc.

OpenLayers Geocoder can automatically fill CCK text fields on the node submission form with data coming from the response object. This is possible thanks to integration with the Token module: all address parts are exposed and ready to be used as replacement patterns. To enable the autofilling, we visit the OpenLayers CCK field settings page and map which token is going to fill which text field.


When we look up a place, information like the city, country, etc. will automatically be copied into the selected text fields.


This gives us great flexibility: we can display our nodes by city using Views or search through them using Faceted Search.

Future development: Reverse geocoding

Another very interesting feature of the Google Geocoding Web Service is the possibility of performing reverse geocoding. As the name suggests, this time the user spots a location on the map and the web service returns the closest address to the given point. OpenLayers Geocoder will support reverse geocoding in one of its next releases.
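Per the public Geocoding V3 documentation, a reverse-geocoding request simply swaps the address parameter for a latlng pair; a minimal sketch, reusing the coordinates from the response above:

```shell
# Reverse geocoding: look up the closest address to a lat/lng pair.
LAT="41.8428020"
LNG="12.4858480"
URL="http://maps.googleapis.com/maps/api/geocode/json?latlng=${LAT},${LNG}&sensor=false"
echo "$URL"
# Fetch with: curl -s "$URL"
```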

Apr 07 2010
Apr 07

Drupal has some very good modules for dealing with 403 and 404 error pages. This article is about how to use Drupal modules to customize 403 and 404 pages.

Why should we bother? Among the pages of a website, the 403 (access denied) and 404 (page not found) error pages seem to get the least attention. However, it is a little annoying for users to land on an unwelcoming page. I found it frustrating to google for some keywords, click a link, and end up with a page not found. Sometimes it's caused by system changes or article archiving, but whatever the reason, it can cause users to leave the site immediately.

Here are some solutions which Drupal provides for web administrators:

1. Redirect your error page

The simplest thing is to redirect all 403 and 404 errors to different pages of your website that contain more useful content.

At /admin/settings/error-reporting we can set the URL for each of the 403 and 404 pages.

2. Search404 module

Search404 performs a search on the keywords in the URL. For example, if a user goes to http://example.com/does/not/exist, this module will do a search for "does not exist".

I tried this module but opted out. Many of my dead links look like /node/xxxx, so performing a search for those keywords is useless.

3. Make your own custom error page

Important update: since Drupal 7, custom error pages are supported by Drupal core. You can find the setting under Admin - Configuration - Site information. Please see my detailed guide:

Drupal 7 error handling

For Drupal 6, the CustomError module provides many handy options: you can change the page title and make a custom HTML page for the error notification.

After installing this module, first go to the error reporting section and change the 403 and 404 URLs to customerror/403 and customerror/404 respectively.

Then go to /admin/settings/customerror to make your own HTML error page. Something funny can relax users, and we can provide more useful information about the site.

Symphony Themes 404 Page

If you want more advanced options, CustomError allows you to execute code within the error page. Follow this link to Nik's blog; he has some very nice pieces of code for showing different messages to different types of users (visitors or authenticated users).

4. Fast 404

One possible strategy for dealing with 403 and 404 errors is to deliver a message as quickly as we can, saving resources for other visitors. It is particularly useful for high-traffic websites with limited resources (shared hosting, VPSes, ...).


I faced this issue when customizing URLs for better readability, which left many links dead. The problem is that I had placed those URLs in blogs, forums and elsewhere, so people following them could run into a 404 error page.

Thanks to CustomError, I have a nice page that does not push my visitors away.

Apr 06 2010
Apr 06

I'm not a fan of repeating myself, or of doing the same work twice. So when I first got a look at FieldAPI back at DrupalCon Paris, my thought after 'Wow this is going to change everything,' was 'Every module that converts its custom data storage to this is going to be doing the same work, over and over'.

It seemed to me that we have a lot of modules that add things to stuff. The examples that spring to mind include Taxonomy image (add an image to a taxonomy term), Comment upload (add an uploaded file to a comment), User terms (you get the picture), and the daddy of them all, Image (which now we must call Image Oldskool, due to there being a shiny new Image module in Core).

What all these have in common is that on Drupal 7 their things-to-stuffness would be perfectly served by FieldAPI. Users get more flexibility, an expanding universe of formatters and widgets; module maintainers can write less code and in some cases even hang up their hats on a particular project as it can live entirely in configuration space. Everyone wins. But how to get there?

This matter has been bubbling at the back of my mind ever since Paris. On the one hand I could sort of see how this could be all done with a single framework: you tell it what sort of entity you are dealing with (node, comment, term), which bundle (article, page, which vocab), what your fields are, and how to load the old data. The framework creates the fields for you, then runs a batch to load each object, moves values around in clever ways that you don't need to worry about, and then saves the object. Banzai! Your data is now saved in a field.

On the other hand, this was clearly crack. Of the highest order.

And yet it works. It quite possibly still is crack, and some extra pairs of eyes to de-crackify it would be very welcome. But I can confirm that I've run data conversions on both User terms and oldskool Image nodes, and got term fields on my users and beautiful image fields on my nodes.

There remains some work to be done: I'd like to make conversion to file and image fields a bit more straightforward, and to create a nice way to specify how to populate the alt and title fields for images. All this is also largely academic, having only been run on D7 databases with dummy D6 tables imported, since core's 6-to-7 upgrade process is not currently working. So I could use a hand. And if you have a things-to-stuff module, get in touch and give it a whirl.

Apr 05 2010
Apr 05

Ever needed to measure the completeness of your Drupal nodes? Or wanted to motivate your users to complete their content until it reaches 100%? In this post, I'll outline the basic configuration of Content Complete, a new Drupal module that does exactly that. Content Complete started as a simple module to show the completeness of nodes, but over time has grown to include Rules support for managing complex user flows, dynamic caching and Views integration. In this first post, I'll guide you through the steps to configure the module, output the percentage of completeness and show you how to theme the percentage bar. In the second post of this series, I'll show you how to output all this data using Views.

To start, you need to download and install the module from the download page or directly via Drush.

Configuring the module


Completeness of nodes is measured per content type. For each content type, you need to select the fields you want to be included in the percentage. For example, you have a content type "Artist" with the fields "Title", "Release Date", "Label" and "Website". You want all of those fields to be completed by your users, such that the node is at 100%. If, for example, a user only fills in a value for "Title" and "Website", the completeness will be at 50%.

To activate content completion checks, go to Administer > Content Management > Content types, and click 'edit' on the content type you wish to have checked. Enable the checks, save the content type and reload the page to find the fields you can check for completeness, as shown in Figure 1.

If you are using the Content Profile module to manage user profiles, the module will figure out the profile node that belongs to the logged-in user.

Displaying completeness


Content Complete provides various ways to display data like the percentage, a percentage bar and the next field to be completed. These are the different blocks provided by the module:

  • Content Complete: current node: shows completeness for the current node. Can only be shown on the node page.
  • Content Complete: Album (first node): shows completeness for the first node it finds of the specified content type for which the logged-in user has edit permissions. Can be shown on any page. Use this only if you have one node of that content type, for example the associated profile node of that user.
  • Views: will be covered in the next post of this series.

To enable a block, go to Administer > Site building > Blocks and drag and drop Content Complete: current node to the region you want to display the block at. Then, navigate to your node to see the block appear (see Figure 2). You can also configure the block to be hidden when 100% is reached.

Who gets to see what: configuring permissions


The completeness block will be shown to every user including the anonymous user. Often, you will want to provide this block only to users who can edit the content. You can change the permissions at Administer > User management > Permissions.



Theming the percentage bar

The module provides several CSS classes and ids for the themer. For example, you can use '.cck-complete-percent-bar-leq-25' to style the appearance of the percent bar if the percentage is less than or equal (leq) to 25. Likewise, there are classes for leq-50, leq-75 and leq-100. Absolute numbers can be styled using .cck-complete-percent-bar-x (with x replaced by the actual percentage). Here is an example we used in one of our projects:

.content-complete-percent-bar-wrapper {
  background: #333333;
  border: 1px solid #666666;
  margin: .5em 0;
}
.content-complete-percent-bar {
  height: 5px;
  background-color: #8CC101;
}

What's next

In the second part of this series, I will show you how to use Views to set up different views on the completeness of your nodes.

Apr 05 2010
Apr 05

Amitai Burstein

05 April 2010

When I decided I’d use CTools for the Message module, I knew it would save me some time. I mean, everybody knows the “let’s re-use the same API” concept. I started copying the parts I needed from the Context module and added my own logic.

Today, I decided it was time to add a Message UI module. It took me about twenty minutes and ended with the commit message: “Added message UI - A shameless copy/paste from Context module.”

It was much faster than I thought it would be. So it’s not just about using the same API, it is about using the API the same way as others. And if the “others” are yhahn and jmiccolis, then I can sleep better at night (or at least when the baby doesn’t cry).

I’m also thinking (which is easier than actually sitting down and writing a patch) that it would be neat if the CTools exportables plugin also had a uniform UI that modules such as Context, Message and others to come could use. Then even at the UI level there would be no code duplication, and a UX gain as well.

Now with the spare time at hand, all that is left to do is use the super-secret CTools plugin:


Apr 04 2010
Apr 04

After reading "You may want to avoid hacking your open-source CMS" today, I was saddened: not because Drupal fell short (it didn't), not because open-source usage is flawed (it isn't), but because it reminds me of situations I have, unfortunately, endured on the receiving end in the past.

To summarize the story: The Onion forked (as in, adapted to their own wishes) Drupal 4.7, apparently without realizing the cost of maintaining and syncing their own product. Wrong expectations make up a big part of disappointment; good expectations are part of doing your homework.

As a system engineer, I have a hard time understanding how a reputable organization can decide to modify existing software without thinking through the implications. If you set up infrastructure for yourself or others, the maintenance cost is an important criterion in making any decision.

I often see a difference in attitude between typical developers and system engineers: the former focused on the end product and requirements (and likely a deadline), the latter focused on the end solution and maintenance. A programmer's job usually stops where the system engineer's work starts.

Sometimes it is frustrating that the long-term effects are overlooked and a piece of software is forced into production without any regard for maintenance (security, support or development) costs.

So there is a valuable lesson in this story: if you introduce something new, make sure you have a good grip on what the costs are, including after it is successfully launched. In economics, that's the difference between one-time and recurring costs. Don't leave that up to the marketeer, salesman or developer; maybe involve your system engineer next time ;-)

It is up to The Onion to see whether they're better off with Django than Drupal; I guess it depends on how much development is part of their core business. At least I hope they have thought about the development and maintenance costs this time around. And according to this Reddit thread, they know what they are doing and where they are going :-)

Apr 04 2010
Apr 04

A constant annoyance of managing a website today is the level of spam that comes in through comments, forum posts, contact requests, user registrations, and so on. Not only can spam messages make your site look like crap; if you have any sort of comment reply notification (as this site has) you can end up emailing spam to your visitors, which will turn off a LOT of people. There are times when you don't seem to get much, and other times when your site seems flooded with this junk - this week feels like the latter.

There are several ways of dealing with spam:

  1. Allow all content to be posted automatically and moderate it after the fact,
  2. Manually approve every piece of content from unknown sources or unrecognized users,
  3. Add a plugin / code that blocks content based on certain keywords, e.g. swear words, references to Star Trek, etc.,
  4. Add a plugin that requires some sort of proof that the visitor is a legitimate person rather than an automated program, dubbed a CAPTCHA ("Completely Automated Public Turing test to tell Computers and Humans Apart"),
  5. Add a plugin / code that uses advanced algorithms to try to automatically detect spam,
  6. Add a plugin / code that identifies spam using distributed user actions, e.g. someone in a foreign country, like Alaska, sees a message containing "Barney", "submarines", "campfires", "milkshakes" and "UFOs", marks it as spam, and that knowledge then helps identify similar content on your site.

So, the above is all wonderful, but where do you start? The first option is messy, as you end up with a lot of junk to deal with; the second halts the natural flow of conversations, as everything must be approved; and the third is very limited - what if you *wanted* to discuss the effects of watching Barney-like dinosaur puppet TV shows on the reproductive cycle of goats? That conversation would be sure to get a few messages blocked. So that leaves the advanced solutions as the only viable options.

For this site, which is built with the excellent content management system Drupal, I took a look at some different modules that cover these concepts. One in particular piqued my interest: a service built by the creator of Drupal, Dries Buytaert, called Mollom. Combining several of the above ideas, Mollom seemed like a great solution, and with a really good Drupal module available, I gave it a spin.

Cut to a year later, and the Mollom service had been working really well, letting almost no spam through. Unfortunately, in the past ten days it has failed almost completely, with thirty to almost one hundred spam messages getting through daily, which is obviously not what I want.

As a result of the influx of spam getting past Mollom I've changed over to using a service called reCAPTCHA (some details on Wikipedia) which provides a simpler though more reliable CAPTCHA. Installation on Drupal is super-simple, you just install the CAPTCHA dependency and then install the reCAPTCHA module itself, sign up for the free reCAPTCHA service, do a little bit of configuration (admin/user/captcha) and then hopefully just forget about it.

I'll let you know how it goes.

UPDATE: Believe it or not, no sooner had I tweeted about this post than Dries himself responded! He explained that after upgrading to the latest version it was necessary to reconfigure the module, as it seems the settings structure changed. As a result I've switched back to Mollom to give it one last try. That said, I did suggest that an update script be added that leaves a message informing the admin of this. We'll see how it goes!

Apr 03 2010
Apr 03

To back up a Drupal site you need to take care of both the code and the database. The first problem is usually solved by placing the Drupal directory trees under revision control. The second requires more careful analysis, and this post describes optimal strategies for MySQL.

Before you start: common caveats

Use a lock file

While the backup of files is usually fairly predictable, database dumps can take more time than you expect: it will depend on site traffic and activity too. The usual good practice of using a lock file is a must in this case.
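The lock-file practice can be sketched with flock(1); the lock path and the echoed messages below are arbitrary examples, and the actual dump commands would go inside the subshell:

```shell
#!/bin/sh
# If a previous backup run still holds the lock, skip this run instead of
# letting overlapping dumps pile up.
LOCKFILE="/tmp/drupal-backup.lock"
(
  # Try to take the lock without blocking; bail out if another run holds it.
  flock -n 9 || { echo "previous backup still running, skipping"; exit 1; }
  echo "lock acquired, running backup"
  # ... database dump and file backup commands go here ...
) 9>"$LOCKFILE"
```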

Keep your database clean

Drupal has known problems clearing some cache tables, which are left untouched by the normal cache clearing in Drupal or Drush. If you make extensive use of AJAX in forms, your cache tables can very quickly grow to hundreds of megabytes. Solve this by clearing expired cache elements manually; for example, with Drush, just run

drush sql-query "DELETE FROM cache_form WHERE expire < UNIX_TIMESTAMP(NOW())"

in your site directory tree to clean the DB.

What do you need the database dump for?

The best database backup strategy depends on how you actually expect to use the dump.

Use case 1: Verbatim copy of the database

This is not what you ordinarily want. Even if you do a good job of keeping your cache clear and deal manually with the tables that Drupal doesn't clear, the database will still contain a lot of information that is useful only for forensic-like examination, like session information and watchdog messages.

If this is what you want, meet mysqlhotcopy, the most efficient way to perform a verbatim dump.

Examples (assuming that everything happens on localhost, that your database username is drupal, password is xyz, database name is dbname):

mysqlhotcopy --user=drupal --password=xyz dbname

will (re)create a database named dbname_copy and populate it with an exact copy of the dbname database. To restore the site database exactly like it was before the dump you merely need to edit settings.php and replace dbname with dbname_copy.

mysqlhotcopy --user=drupal --password=xyz dbname directory

copies the full database contents into the specified directory. This way you can store an arbitrary number of backups by changing the directory name. Backups are restored by copying the files back to the MySQL database directory with cp -r or rsync -a (recommended, especially for remote use, since it compresses data and won't replace a file with an identical copy, giving the same result in much less time).

Main problems with mysqlhotcopy:

  • You need suitable privileges on the filesystem: not applicable on shared hosting most of the time.
  • The dump is not human-readable: you won't be able to easily compare two dumps.

Use case 2: Copying the meaningful tables from the database

If your backup is more for safety than for archival, then you will probably want to be more careful in selecting what to dump and how to dump it. In most cases the best solution is a selective SQL dump.

Investigate at least the following:

  • Drush (namely, drush sql-dump) can be enough for you. You can tell it to dump only the meaningful tables with the --structure-tables-key=structure-tables option (to use it, rename the file example.drushrc.php to drushrc.php), and you can enable slower but more usefully formatted dumps (one line per insert, which makes comparing different dumps much easier) with --ordered-dump. A typical dump command would thus look like

drush sql-dump --structure-tables-key=structure-tables --ordered-dump --result-file=dumpfile.sql

  • Backup and migrate is an ordinary Drupal module, with very flexible backup options and scheduling. All of this is configured and used from within Drupal, so no command line hacking needed. Depending on your infrastructure, Backup and migrate will take advantage of existing server capabilities like FTP transfers and backup encryption.

Main problems with this approach:

  • You still need command-line access for Drush and enough flexibility for Backup and Migrate; though, you will be able to use Backup and Migrate on most shared hosting solutions.
  • The dump is much more computationally expensive than the "raw" dump of use case 1: you pay for the tuning possibilities and the readable SQL output with significantly longer execution times.

Use case 3: Sharing a database with other sites or developers

Once upon a time, database dumps were the only way to transfer a site, complete with its full configuration, to other developers, or to develop a feature on a staging site and migrate it to production.

Fortunately, this has changed recently: the Features module allows you to export most configuration to code and enable it on a new site as easily as enabling an ordinary module. Take the time to investigate Features, or simpler dedicated modules like the built-in export functionality in CCK for content type definitions, or Drush Views to export and import views between database and code; don't rely on database dumps if at all possible.

If you do want to share a database for this purpose, take a look at dbscripts; configuration is a bit tedious, but it lets you define what to dump in different scenarios and how to merge the dumped database with the "production" one.

This article is released under the Creative Commons Attribution Noncommercial No Derivative Works 3.0 license.

Apr 02 2010
Apr 02

Recently I integrated the nifty jQZoom jQuery plug-in into a client's site built on Drupal 5 and Drupal e-Commerce 5.x-3.6. A jQZoom module for Drupal does exist, but here I'm going to outline the few simple steps required to get the script working with images output via a node template (or the Contemplate module).

For starters, on the site we have the following Imagecache presets:

  • product_full: the "full-size" version, actually scaled to a width of 600px;
  • product_main: the "main image", used for display on a product node;
  • product_thumb: a thumbnail, used in product node teasers and for switching between product images.

The product nodes are templated to output several product images, something like this:

The first image is shown as the "main" image, with all images as thumbnails on the right which, when clicked, replace the "main" image with the (mid-size) image associated with the thumbnail.

The layout is achieved with a product node template outputting the following HTML, where filenamex.jpg are the images associated with the product node and filename1.jpg is the "main" image. There is of course some CSS to achieve the layout and borders in the image above, which I haven't included here.

<!-- the main product image //-->
<div class="product-left">
  <div id="js-main-image">
    <a href="/files/imagecache/product_full/filename1.jpg" class="hoverproduct">
      <img src="/files/imagecache/product_main/filename1.jpg" id="main-product-image" />
    </a>
  </div>
  <p class="tip-text">Hover over the image to view detail.</p>
</div>

<!-- product thumbnail images //-->
<div class="product-right">
  <a href="/files/imagecache/product_main/filename1.jpg" onclick="replaceProductImage('/files/imagecache/product_main/filename1.jpg', '/files/imagecache/product_full/filename1.jpg');return false;">
    <img src="/files/imagecache/product_thumb/filename1.jpg" alt="Product title" title="Product title" />
  </a>
  <a href="/files/imagecache/product_main/filename2.jpg" onclick="replaceProductImage('/files/imagecache/product_main/filename2.jpg', '/files/imagecache/product_full/filename2.jpg');return false;">
    <img src="/files/imagecache/product_thumb/filename2.jpg" alt="Product title" title="Product title" />
  </a>
  <a href="/files/imagecache/product_main/filename3.jpg" onclick="replaceProductImage('/files/imagecache/product_main/filename3.jpg', '/files/imagecache/product_full/filename3.jpg');return false;">
    <img src="/files/imagecache/product_thumb/filename3.jpg" alt="Product title" title="Product title" />
  </a>
</div>

A utility module (which I called ec_zoom) simply adds our JavaScript and CSS when we need it via hook_menu(). Note that the jQZoom files were installed to /sites/all/libraries/.

/* Implements hook_menu(). */
function ec_zoom_menu($may_cache) {
  if (!$may_cache) {
    // NOTE: this is only needed on (product) node pages.
    if (arg(0) == 'node' && is_numeric(arg(1)) && arg(2) != 'edit') {
      drupal_add_js(drupal_get_path('module', 'ec_zoom') . '/product_images.js', 'module');
      // jQZoom
      drupal_add_js('sites/all/libraries/jqzoom/js/jquery.jqzoom1.0.1.js');
      drupal_add_css('sites/all/libraries/jqzoom/css/jqzoom.css');
    }
  }
}

The JavaScript to swap images is contained in the file product_images.js:

function replaceProductImage(image_mid, image_big) {
  // Remove jQZoom elements so we can reload the new image/dimensions.
  $(".jqZoomWindow").remove();
  $(".jqZoomPup").remove();
  $(".jqzoom").remove();
  // Switch the image by removing the node and re-writing the necessary HTML.
  var image = $('#main-product-image').attr("src");
  $(".hoverproduct").remove();
  $('#js-main-image').append('<a href="' + image_big + '" class="hoverproduct"><img src="' + image_mid + '" id="main-product-image" /></a>');
  // Reload jQZoom after switching the image.
  $(".hoverproduct").jqzoom({
    zoomWidth: 250,
    zoomHeight: 200,
    title: false
  });
}

// Load jQZoom on the main image as soon as the page is loaded.
$(document).ready(function() {
  $(".hoverproduct").jqzoom({
    zoomWidth: 250,
    zoomHeight: 200,
    title: false
  });
});

Gotta love quick and dirty integration...

Apr 01 2010
Apr 01

When I was theming my blog in Drupal, I decided I wanted a better way to customize and display post info, such as the wording used and the way the date was displayed. The first step is to look around and see where the code that renders this info comes from. I viewed the files in my custom theme folder and discovered these few lines of code in node.tpl.php:

<?php if ($submitted): ?>
    <span class="submitted"><?php print $submitted ?></span>
<?php endif; ?>

In HTML, that is rendered as:

Submitted by Danny Englander on 4-01-10

I decided I wanted customized date and post info only for my blog, and a standard Drupal convention allows you to have node-blog.tpl.php to tailor the display of the blog content type. My theme did not have this file, so I simply copied node.tpl.php and renamed it. Now that I had my custom node-blog template file, I was all set to start customizing the date and post info.

Well, the above code seemed boring and hard to theme in Drupal, so I found a way to be more specific about how the post and date info gets output in your Drupal theme.

I simply replaced the code above with this code:

<div class="meta post-info">
<?php if ($submitted): ?>
<span class="submitted">Posted by <?php print theme('username', $node) ?></span>
  <?php endif; ?>

<div class="dateblock">
      <span class="month"><?php print $date_month ?></span>
      <span class="day"><?php print $date_day ?></span>
      <span class="year"><?php print $date_year ?></span>
 </div><!--//end dateblock-->
</div><!--//end meta post-info-->

Note: this crucial bit of code below was left out of the original post, so if it did not work for you, that's why. Place the code below in your theme's template.php file, or create one if you don't have one already (change "mytheme" to the name of your theme):

function mytheme_preprocess_node(&$vars) {
  // Grab the node object.
  $node = $vars['node'];
  // Make individual variables for the parts of the date.
  $vars['date_day'] = format_date($node->created, 'custom', 'j');
  $vars['date_month'] = format_date($node->created, 'custom', 'M');
  $vars['date_year'] = format_date($node->created, 'custom', 'Y');
}

By breaking this code down and getting more specific, I was able to use some CSS to style the date into the nice little square blocks you see to the left of every post title. It also lets me use "Posted by", "submitted by" or whatever other wording I choose for the author part of the code.


Apr 01 2010
Apr 01

Last night I browsed 4chan as usual. Besides the usual Pedobears and Lolcats, I found out that the Drupal.org server has been hacked, and someone made a torrent of all of those very expensive modules stored in the "contrib" repository.
This is a very serious issue, since the whole thing is worth around $189,000 if you buy it in the shop...
Our only luck is that the very sophisticated SecuROM copy protection has not been cracked yet, so without it you cannot install that version (some reports said that the "grep" hacker tool is able to circumvent the protection, but we cannot confirm that).

Mar 31 2010
Mar 31

HIRE ME for Drupal Training. Contact Me.


Mar 31 2010
Mar 31

In one of the most epic episodes of Seinfeld, George and Jerry are confronted with the dilemma: "Tuck or No Tuck"?
A brave group has decided that this dilemma must be resolved, and that blankets (like people, data and source) should be free!

They started the liberation front and founded "Free The Blankets!", a site with one clear agenda and goal: untucked blankets by 2020.
Aren't you sick and tired of entering your hotel room and fighting with the tightly tucked blankets, messing up everything in the process?!

Like George, we are free spirits, and can't be confined to the atrocities of maids stretching blankets to confining walls.
Note that it is not the maids' fault. IT IS POLICY! And because of this, it will be resolved by the policy creators: the managers and boards of hotels and hostels.
By joining this effort, they will create a social movement that will free the blankets and let us wake up on a fine morning in 2020 (or sooner!), stretch our legs and sigh...

Mar 30 2010
Mar 30

This is a quick post while it's fresh in my head, regarding porting your contributed modules to the Drupal 7 API. It focuses specifically on database manipulation and provides the like-for-like changes. If anything here is wrong, or could be done better, please let me know!

First up, individual results:

* Remember this?
$value = db_result(db_query("SELECT some_field FROM {some_table}"));
* Forget it! It now looks like this:
$value = db_query("SELECT some_field FROM {some_table}")->fetchField();

Selects still look pretty much the same, at first glance, but wait:

* Let's fetch a database object in Drupal 6:
$result = db_query("SELECT some_field FROM {some_table} WHERE some_other_field = %d", $some_var);
while ($row = db_fetch_object($result)) {
  // Loop through your query results and do some stuff.
}
* In Drupal 7 there is no db_fetch_object() or db_fetch_array():
$result = db_query("SELECT some_field FROM {some_table} WHERE some_other_field = :some_other_field", array(':some_other_field' => $some_var));
foreach ($result as $row) {
  // Loop through your query results and do some stuff.
}

Deletes and updates now look quite different. You don't use db_query() at all any more for those types of SQL query. Each one has its own query building function:

* A delete query in Drupal 7:
db_delete('some_table')
  ->condition('some_field', $some_var)
  ->execute();
* An update query in Drupal 7:
db_update('some_table')
  ->fields(array(
    'some_field' => $some_var,
    'some_other_field' => $some_other_var,
  ))
  ->condition('some_identifying_field', $some_id_var)
  ->execute();

You should already be using drupal_write_record() for inserts anyway, and it has not changed, as far as I can tell.
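For completeness, Drupal 7 also offers a db_insert() builder in the same fluent style as db_update() and db_delete(). A minimal sketch (the table and field names are placeholders, not from the original post, and this depends on a Drupal 7 bootstrap to run):

```php
<?php
// Sketch: inserting a row with the Drupal 7 query builder.
// drupal_write_record() remains a fine choice when the table has a schema.
$id = db_insert('some_table')
  ->fields(array(
    'some_field' => $some_var,
    'some_other_field' => $some_other_var,
  ))
  ->execute(); // Returns the new serial ID when the table defines one.
```

Like the delete and update builders, nothing hits the database until execute() is called.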

The good news is that hook_schema() and the related functions don't seem to have changed at all, so many of you won't need to touch installation scripts, unless, of course, you need to manipulate data as part of the update process.

And that's that for now. Hope it's useful.

Mar 30 2010
Mar 30

My buddy Vito made a few thoughtful comments (It's not just enough to train them) to my post on training Drupalers (Dries was right).

Vito states: You don't want to get into a situation like CS had a few/several/many years ago, with a big influx of people learning Java because that's where the money was, and we end up with a wealth of lackluster, unmotivated, average Java developers who can "get by" but who aren't ever going to build you anything interesting.

This certainly resonates for me.

It's just a paycheck

Back when I ran Europa, and later, Desert Books, I sold the entire O'Reilly catalog at 25% off. Although sales on CS books were good, only about 300 students from the UT CS dept. had ever visited the store. Knowing that was only a fraction of the CS enrollment, I did a little "data mining" to determine the actual enrollment. It appeared that UT's CS dept. had about 2500-3000 students.

I asked my friends, Chris and Omar, who were head of UT sig-linux and UT-ACM, why only 10 percent of the CS dept. visited my store -- especially when I had O'Reilly books cheaper than anywhere else in town. Chris' reply was something that I'll never forget:

The 10% of the students who come to your store -- they're the ones who are interested in computer science. They're the ones who chose that major because they like computing. The rest -- they mostly selected the CS major because they believe that it will get them a job. They only read what they have to. Ten years ago, they would have majored in accounting.

Omar studying before class at Desert Books. Geeks communing in background.

A few years later, when I had moved into IT and began to hire for positions, I frequently interviewed recent CS grads. I was surprised to find many who couldn't distinguish a bubble sort from a binary search, or couldn't give an example of a schema or ER diagram. Clearly, these folks were somewhere in that 90%. There was no passion for the craft, simply the expectation of a steady paycheck.

Propagating the Passion

Among the other things that draw folks to Drupal are the perceived passion and sense of community. No one wants to see this diluted. Yet, all over the world, we hear from notable members of the community, like chx and even Dries, stating that we need more good Drupalers.

Widespread training is not the fix-all for this problem. David Strauss states it simply: Training doesn't create experts for things like Drupal. At best, it shortens the path to becoming an expert.

Even if training were the answer, there aren't currently enough Drupal trainers to address the demand, and given the down economy, many folks simply can't afford a training program.

I can't remember from whom I heard this remark first -- either Eric Raymond or Tim O'Reilly -- it was something like: "It's important not to grow your organization faster than you can propagate its culture." This is a dilemma we Drupalers now face. There is tremendous pressure to grow more qualified Drupal developers and designers. Yet many fear that, in responding to this demand, we may encourage an influx of those for whom Drupal is merely a means to a paycheck.

Communities of Drupal Practice.

I believe the long-term solution to the shortage is to focus on facilitating Drupal learning communities -- what Etienne Wenger and Jean Lave have called: Communities of Practice.

To quote Etienne Wenger:
Communities of Practice are groups of people who share a concern, a set of problems, or a passion about a topic, and who deepen their knowledge and expertise in this area by interacting on an ongoing basis. Engineers who design a certain kind of electronic circuit called phase-lock loops find it useful to compare design regularly and to discuss the intricacies of their esoteric specialty. Soccer moms and dads take advantage of game times to share tips and insights about the subtle art of parenting. Artists congregate in cafes and studios to debate the merits of new style or technique. Gang members learn to survive on the street and deal with an unfriendly world. Frontline managers running manufacturing operations get a chance to commiserate, to learn about upcoming technologies, and to foresee shifts in the winds of power.

In the absence of sufficient trainers, such Communities of Practice give Drupalers the opportunity to bootstrap each other. Someone may not know how to theme a Drupal site, but maybe they know the process to create a multilingual site. Everyone brings something, shares it, and hopefully walks away with something new.

The Drupal Dojo, described by Josh Koenig in Lullabot podcast 33, which has had its ups and downs, is an example of a virtual community of practice. Essentially, experienced Drupalists give online lessons, which are recorded for later study. It lacks the back and forth of a conversation, but some of that can be achieved via IRC.

Ideally, a community of practice meets in a physical space. People can talk faster than they can type -- especially when they're excited -- and in a shared physical space, passion and excitement are much more contagious. In a physical space, you can share a meal, a beer, a glass of wine while having a discussion.

Discussing the slipstream amidst a room of books -- another glass of wine Mr. Sterling?

When I joined the Austin Drupal community, I looked for opportunities to meet with other local Drupalers, to exchange ideas and gain insight on how to proceed. Unfortunately, there was only a monthly meetup with a featured speaker. Given that the Austin Drupal community consists of developers, designers, administrators, and others, rarely would there be a speaker who would attract more than a fraction of the group. Although the current managers of the meetup group are working to change this, no matter what the format, a monthly meetup is not enough.

For all the people who wanted to make the leap from Drupal hobbyist to Drupal pro, the guidance and support was to be found mostly online.

To this end, I started the Austin Drupal Dojo. It's a weekly get together modeled loosely on the online dojo, with a few differences. There is little structure. People come and work on their projects, ask questions about what things are stumping them, or simply hang out to discuss Drupal. Because there is no structured format, the group is attended by designers and developers of all levels as well as beginners. Because the event is held at a cafe, people can drop in on their way home from work, and grab a bite if they're hungry.

The Austin Drupal Dojo is a good start, but more is needed. Becoming a Drupal ninja is no different from becoming a great musician or learning a foreign language. To become fluent in a foreign language, you have to speak it everyday. To become a great (ensemble) musician, you have to have someone to jam with. To become a great Drupalist, you have to have projects to hack on, and people to hack with.

For the last month, I've been meeting with local Drupalers to help figure out what the next steps are. I hope to publish the notes soon. In the meantime, please continue to send me your thoughts, or better yet, post them here.


Mar 30 2010
Mar 30

Movie link (right-click to download)

Twitter's great. SMS messaging is great. Even Instant Messaging is great.

But, there's one thing which ties everyone together online. It's not going away soon, and it's something that pretty much everyone has. If you guessed email (the title of the article was a give away, wasn't it?), then you're absolutely right. Just try to imagine a world without email - even with killer audio/voice technology, it just wouldn't be as refined without email.

So, if you're seeking a solution to have Drupal email your users, one of the best methods is the combination of the Messaging and Notifications modules. You won't find one specific Drupal email module on Drupal.org, because sending email is already built into Drupal. It's the way, and the when, that email, or any other message method, is used that you need to focus on.

Sending out a message via any of the above-mentioned methods, plus Drupal email, is easily accomplished with these modules.

It's easy because they offer all these methods and more. One of my own recent requirements was having Drupal email all users of a given role for one specific content type, and then also support email notifications to admin users in another role.

If this is the type of solution you're seeking for your Drupal site then this video may have what you're looking for.

Messaging [issues] - [usage]

Notifications [issues] - [usage]

Mar 29 2010
Mar 29

The goal of this blog is to envision building interesting applications, beyond a website or a simple mobile application, using the platforms that are already available to us.

Android is a great open source mobile platform. On the local device, it has many sensors (for example GPS, accelerometer, barometer, etc.) to collect information that a user could not otherwise easily gather. The information travels through the internet, works with remote resources, and returns to users as rich, customized results and functionality. This awesomeness, because of the processing power and logic workflow needed in the cloud, demands an equally powerful and open platform on the server end. Building a powerful server framework from scratch is no fun, and too costly to be a favorable solution. Therefore, finding an open source, powerful, extensible and scalable framework should be the first thing on a developer's mind.

What's Drupal Services for? Drupal Services architecture: Drupal with and without the Services module.
Drupal, on the other hand, is used by web developers worldwide to build sophisticated, community-driven interactive websites. Its success is well proven: many diverse organizations use Drupal as their core social publishing system for websites and internal collaboration applications. With add-on modules, Drupal is also a web application framework with tens of thousands of features, such as a flagging system, approximate search, barcode generation, and many other social publishing functions. The Drupal Services module is the add-on module that allows Drupal content to be retrieved and saved through a standard API using pluggable communication mechanisms, including XMLRPC, JSON, SOAP and REST. With pluggable authentication mechanisms, including OAuth and Key Auth, the communication between frameworks is also well secured. Furthermore, developers can easily create their own web services to expose additional integration with contributed Drupal modules or custom code.

Therefore, developers will appreciate a marriage between these two great platforms. Building cloud applications on them immediately gives you:
1.) Both a web portal and a mobile portal.
2.) Predictably minimal development cost, thanks to the support of huge open source communities and hundreds of thousands of existing add-on features.
3.) Business logic and workflow handled in the cloud by Drupal, a platform we already have a lot of experience with.
4.) An application that can reach the bedroom, garage, bathroom, office, subway, ..., anywhere the mobile device goes.

Reference links:
Drupal Services Module: http://drupal.org/project/services
BeerCloud/GreatBrewers.com architecture diagram: a Drupal-Android case study. A case study (written by my colleague and me, and once promoted to the front page of drupal.org): Drupal + Services Module + Beer = BeerCloud on Android and iPhone http://drupal.org/node/659772
Open Source Android Library (work in progress) to work with Drupal Services: http://github.com/skyred/DrupalCloud

Mar 29 2010
Mar 29

On Saturday, April 17th, the weekend before DrupalCon San Francisco, I'm helping to organize the very first Drupal core developer summit. The goal of the Drupal core developer summit is to talk about ways we can improve Drupal core, and the core development processes, all while having a good time socializing with fellow core developers. Meeting in person for a full day and having more focused time to brainstorm about just core, should be really valuable. We can come up with plans to get Drupal 7 released, and we can get initial alignment on Drupal 8.

To make it lively and fun, we'll do a series of 10 minute lightning talks. In addition to the lightning talks, we'll have a number of meatier discussions and breakout sessions in smaller groups. The lightning talks will take the format: "How to make X more awesome?" where X can be anything in Drupal core. The idea behind the lightning talks is to educate core contributors about problems that need to be fixed, to present foundations for solutions, and to bootstrap collaboration. The original plan was to have 16 lightning talks, but based on feedback, I'm now leaning towards more breakout sessions and fewer lightning talks. On Sunday, the day after the Drupal core developer summit, there will be a code lounge where longer breakout sessions can be held too. Suggestions are welcome, as we can still make adjustments. Read on to learn more about how to attend.

The event is open to all, but ... in order to attend, you must be prepared to do a 10 minute lightning talk. To secure a ticket to the Drupal core developer summit, you have to submit a 4 slide presentation. We expect one background slide to provide context or to talk about the history of the problem, one slide with a clear problem statement, and a couple of slides to propose a solution. You can focus on big things (i.e. How to help migrate Drupal.org from CVS to Git) or smaller issue-level things (i.e. Why drupal_get_schema() is slow and how to make it faster). Everyone who submitted slides ahead of time can attend. All slides will be shared publicly, but not everyone will have to present as we'll only have time for a limited number of lightning talks. Some talks will be hand-picked because they are important or particularly intriguing, other talks will be randomly selected the day of the event. Don't worry -- if you don't want to present, we won't force you.

So far, more than 50 people have submitted a presentation, but we're happy to host more. The deadline to submit your short presentation has been extended until April 15th, so there is still time to submit.

Mar 29 2010
Mar 29

The following Photoshop tutorial shows how to apply realistic drop shadows to photos. Once satisfied with your result, you can slice your image up and, using CSS rules, apply creative shadows, hence a 3D effect, to any image, Drupal block, Drupal box, etc. Similar tutorials are all over the web; well, not exactly, but there are many. However, I had difficulty understanding the best of them. I clarified the whole process for myself, and have decided to dump it here. Lovely photo. Attached is my Photoshop CS2 file.

Here is my tutorial, scanned from a Hilroy loose-leaf.

Just kidding! Although this image sums it up really well, you may not be able to read it. You'll end up with four layers (as shown on the scan), with 'effects' applied to the two top layers.

Let's go through the process step by step.

  1. Create a new document in Photoshop. Any ol' version of Photoshop will do. You'll either create your new document as transparent or pre-fill it with some color. It doesn't matter. However, you might want to create it right off in larger proportions than your photo's, to give you some room. Of course, you can fix it all later using Image → Canvas Size (ALT-CTRL-C in Windows). Me, I created an 800 by 600 pixel document at 72 ppi, pre-filled with white.

  2. Go ahead and open your photo in Photoshop. Then place your photo in your Untitled document. Know how? Select the layer where your photo is — that layer is probably locked, that's fine, then drag it and drop it onto the window of your Untitled document. This will effectively add a new layer to your Untitled document, on which your photo will reside. Now I realize, doin' it on my end, that my photo is not in 72 ppi, so it is way too big. So I delete my new layer, and resize my photo first (in its own document) using Image → Image Size (ALT-CTRL-I in Windows) and setting the Resolution to 72 ppi. Then I reselect the layer and drag & drop it onto my other document. Done.

    I now have 2 layers. I rename my bottom layer to 'background' and my top layer to 'photo'.

  3. Now comes the fun part, in which we'll create the shadow. This shadow will reside on its own layer, and we will *not* use any effect to create it. Go ahead and create a new layer (SHIFT-CTRL-N in Windows), and name that layer 'shadow'.

    Here is our next goal: create a selection that we will fill with black-ness, and we'll start by shaping this selection after our photo's outline. So, go and select the 'photo' layer. Then, go to Select → Load Selection. In the 'Load Selection' dialog that'll pop up, select 'photo Transparency' as 'Channel'. Click OK. You now have a blinking selection surrounding your photo.

    The next goal is to create a path out of this selection. We want to change the shape of our selection. Once we're happy with the shape, we'll turn it into a selection again and fill it with black-ness. So, how do we do it? Press 'M' to use the Marquee tool. Place your cursor over your image, right click, and from the context menu select 'Make Work Path...' (with Tolerance set to 2.0 pixels). You've now created a path out of your selection.

    Tools used to play with the shadow path

    To distort the path, we'll use these tools: the Direct Selection Tool and the Convert Point Tool. Press 'A' to grab the Direct Selection Tool. Select the top left point of your path. You might have to click twice. You'll know that your point is selected when its little square has turned gray — as opposed to the other 3 points along the path. Move that point away from the photo. Use the Convert Point Tool (from your Tools box, just below) to change that 90-degrees corner to one with handlebars. Play with the handlebars using the Direct Selection Tool. To break the connection between the 2 handlebars of a point, you have to select one of the 2 handlebars with the Convert Point Tool. (And there is another way to achieve this, but I forget...) Repeat the procedure for the other 3 points. At the end, you'll end up with something like this:

    Notice on the last screen capture that the bottom left point is selected: it is gray.

    Important: make sure that there is 1 or many sides to your photo where the shadow is hidden, not showing *at all*. Why? Because if you don't do that — and there is a shadow showing all around your image — your photo will look as if it is floating on the page. That ain't too realistic.

    Now we will turn this path into a selection again. Access the 'Paths' panel, and click on the selection icon, shown here:

    We will now fill our selection with black-ness. Make sure that your 'shadow' layer is selected. Then go to Edit → Fill. Select 'Black' from the drop-down. Then deselect all (CTRL-D). Set the Opacity of your 'shadow' layer to ~50%, right now or later. Then, apply a Gaussian blur to your layer. Make sure that when you apply this filter your shadow is no longer selected, because if it's still selected your filter will only apply to your selection, and we want to apply the 'blur' to the entire layer. You'll end up with a layer that looks like this, when others layers are hidden from view:

    We now have 3 layers. Reorder your layers so that the shadow is under the photo.

  4. We will now create our white border. This part is easy. Create a new layer (SHIFT-CTRL-N in Windows), and name that layer 'outline'. We need to create a selection based on the photo outline. We did this before in step 3, but we didn't save our selection. Go ahead and select the 'photo' layer. Then, go to Select → Load Selection. In the 'Load Selection' dialog that'll pop up, select 'photo Transparency' as 'Channel'. Click OK. You now have a blinking selection surrounding your photo. Press 'M' to revert to the Marquee tool. Then press the ALT key, and do not lift your finger from that key. Placing your cursor above your current selection, you'll see that your cursor has turned into a plus sign with a smaller minus sign next to it. What does that mean? It means we'll subtract from our current selection. Draw a rectangular selection within your current selection. You're creating a selection in the shape of a border. In the process of creating your border, you'll hide part of your photo; you will effectively crop it. (I wish I'd told you this before, please do not slap me.) Once done, select your 'outline' layer, then go to Edit → Fill, and select White as your fill. Then deselect all (CTRL-D). You now have a white outline, border, whatever, on your 'outline' layer.

    We now have 4 layers, like so:

Time to add 2 Photoshop effects

  1. We'll apply a gray-to-white gradient on the white border. Right-click on the 'outline' layer, in your Layers palette, and choose 'Blending Options...' from the context menu. Check 'Gradient Overlay' in the list on the left in the dialog box. Double-click on 'Gradient Overlay' to access the gradient properties. Make sure you play with the Angle. Look at the preview. Decide for yourself where the Light is coming from.

  2. A final touch will consist in applying some light effect on the photo itself. This step is useless if you slice up the resulting image for use in CSS-styling. Right-click on the 'photo' layer, in your Layers palette, and choose 'Blending Options...' from the context menu. Check 'Inner Shadow' in the list on the left in the dialog box. Double-click on 'Inner Shadow' to access the shadow properties. Make sure you play with the Angle here as well. Look at the preview. Decide for yourself where the Light is falling.

Time to slice up the image for CSS-styling

And time to have supper.

Attachment: advancedShadow.psd (1.39 MB)

Last edited by Caroline Schnapp about 3 years ago.

Mar 29 2010
Mar 29

Ever wanted to add alternating background colors (a.k.a. "zebra stripes") to your webform fields? I had a need to do this a couple of days ago and struggled to find a method for this. Little did I know that the webform module actually has a few template files included with it that you can use to override what's going on in the theme layer.

The particular thing I wanted to do was to add some CSS classes to the webform fields so that I could do some easy CSS zebra striping. I know this is super easy to do with the output of the Views module because of the "row-even" and "row-odd" CSS classes that are provided out of the box. I wanted similar CSS classes on my webform fields. Getting the job done requires taking the "webform-form.tpl.php" default template file in the Webform module directory and copying it to your own custom theme directory.

Here's the code I ended up using:

// If editing or viewing submissions, display the navigation at the top.
if (isset($form['submission_info']) || isset($form['navigation'])) {
  // (Navigation and submission info rendering, unchanged from the default template.)
}

// Print out the main part of the form.
// Feel free to break this up and move the pieces within the array.
$counter = 0; // Keep track of the fields and how many are being displayed.
foreach ($form['submitted'] as $key => $value) {
  if (substr($key, 0, 1) != '#') { // Only do this for the actual fields being displayed.
    $counter++;
    if ($counter % 2) {
      $even_odd = 'odd';
    }
    else {
      $even_odd = 'even';
    }
    // Tack a new even/odd div onto the beginning of the existing div around the form field.
    $form['submitted'][$key]['#prefix'] = '<div class="row-' . $even_odd . '">' . $form['submitted'][$key]['#prefix'];
    $form['submitted'][$key]['#suffix'] = $form['submitted'][$key]['#suffix'] . '</div>';
  }
}

// Always print out the entire $form. This renders the remaining pieces of the
// form that haven't yet been rendered above.
print drupal_render($form);

// Print out the navigation again at the bottom.
if (isset($form['submission_info']) || isset($form['navigation'])) {
  // (Navigation rendering, unchanged from the default template.)
}
You should be able to see from the code above that I basically wrapped an extra div around each webform field to provide the "row-even" and "row-odd" CSS classes. You can then use your regular CSS methods to determine how the rows will display in the end. If you want to see everything that appears within the $form['submitted'] array, I recommend making sure you have the devel module installed, and then try adding a call such as

dpm($form['submitted']);

...into this same template file. That will show you all of the elements that exist within the array provided by Webform.
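The striping itself is then plain CSS. A minimal sketch, assuming the "row-odd" and "row-even" classes added above (the colours here are my own placeholders, not from the original post):

```css
/* Alternate background colours for webform fields. */
.row-odd {
  background-color: #f4f4f4;
}
.row-even {
  background-color: #ffffff;
}
```

Scope the selectors under your form's wrapper class if the generic names collide with other styles on your site.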

While researching, I also discovered that you can use a webform template file, "webform-mail.tpl.php", to theme the output of the emails that Webform sends out. This is super handy to know, although I haven't made use of it as of yet. You can even theme the confirmation page using "webform-confirmation.tpl.php" if you're interested. If you're interested in learning more, just take a look at the "THEMING.txt" file that's included with the module and you'll find a lot of good information. Happy theming!

Mar 28 2010
Mar 28

Once we know who you are, you will be directed to a page displaying all our Drupal services relevant to your professional identity.

Mar 28 2010
Mar 28

Freelinking 1.x was previously announced as feature-frozen and next to unsupported. Given the long development cycle for Freelinking 3 to reach stability, the issue queue was dusted off.

Freelinking 1.9 will fix a number of bugs in 1.8, and smooth out the transition to FL3. Please go on over to the Freelinking homepage and download the dev version to try it out.

Freelinking 1.9 will be released this week.

Terms: freelinking, modules, drupal6
Mar 28 2010
Mar 28

GerryBot - A theme for Drupal

I promised a while back that I would make the theme I use on this site available for free download.

Well, here it is. Called GerryBot, it's  a very minimalist one-column Drupal 6 theme. The original concept behind this theme was to have as little as possible on the page to distract from the content. Some of the features are:

  • Single column design, but with support for blocks in the sidebar underneath the logo.
  • Only two regions for blocks - in the sidebar area or underneath the content. Remember, the goal is to reduce on-page clutter.
  • Primary links at top of page, secondary links at the footer of the page.
  • Content area is 500px wide, uses 14px font size for good legibility. Body is a subtle grey shade, and the inner content area is white to focus the eye on the article.

Now, the theme is in 'beta mode' at the moment. I've never packaged a Drupal theme before, so I really would appreciate your feedback.



Usual Drupal theme installation advice applies:

  • Unzip the archive and upload the gerrybot folder to your Drupal site to sites/*/themes
  • Log on to Drupal as the administrator, go to Site Building -> Themes. Activate the theme (set it as default too) and then click Save Configuration.
  • Visit the homepage of your site and the GerryBot theme should now be applied.


Feedback is welcome on anything: the aesthetic details of the theme, such as font sizes and colours of links, or whether GerryBot is missing support for common styling elements that occur in Drupal. Just let me know, either in the comments to this page, or via my contact form.

Mar 27 2010
Mar 27

I noticed today that one of my sites returned 403 Access denied on various pages with URLs like format/<foo>, although each was an alias for a taxonomy/term/<tid> taxonomy path that was actually available when not aliased. What could be going on?

It turned out that this issue is caused by the http://drupal.org/node/28776 patch, introduced in DRUPAL-6-7 to protect various VCS paths.

That patch modifies the FilesMatch clause in .htaccess to match on ^format$ (an SVN metadata file name), which causes any path containing the format string to be denied, hence the problem. That also points to the solution, if this affects you: either modify the relevant FilesMatch clause or rename every path containing format on your site.
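For reference, the Drupal 6.7 rule in question looks roughly like this (reproduced from memory of the #28776 patch, so treat the exact alternation as an approximation). Dropping format from the alternation restores aliased format/* paths, at the cost of no longer blocking an SVN format metadata file:

```apache
# Approximate Drupal 6.7 .htaccess rule; "format" removed from the list
# so that site paths beginning with "format" are no longer denied.
<FilesMatch "^(\..*|Entries.*|Repository|Root|Tag|Template|all-wcprops|entries)$">
  Order allow,deny
</FilesMatch>
```

Keep the rest of the alternation intact: those names still protect CVS and SVN metadata from being served.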

Note that this specific subpattern has been rolled back and reworked after http://drupal.org/node/581706 for DRUPAL-7-0-ALPHA2, so Drupal 7 does not have this problem.


About Drupal Sun

Drupal Sun is an Evolving Web project. It allows you to:

  • Do full-text search on all the articles in Drupal Planet (thanks to Apache Solr)
  • Facet based on tags, author, or feed
  • Flip through articles quickly (with j/k or arrow keys) to find what you're interested in
  • View the entire article text inline, or in the context of the site where it was created

See the blog post at Evolving Web
