Jun 02 2015

In April 2015, NASA unveiled a brand new look and user experience for NASA.gov. This release revealed a site modernized to 1) work across all devices and screen sizes (responsive web design), 2) eliminate visual clutter, and 3) highlight the continuous flow of news updates, images, and videos.

With its latest site version, NASA - already an established leader in the digital space - has reached new heights as one of the first federal sites to use a "headless" Drupal approach. Though this model was used when the site was initially migrated to Drupal in 2013, this most recent deployment rounded out the endeavor by using the Services module to provide a REST interface, and ember.js as the client-side front-end framework.

Implementing a “headless” Drupal approach prepares NASA for the future of content management systems (CMS) by:

  1. Leveraging the strength and flexibility of Drupal’s back-end to easily architect content models and ingest content from other sources. As examples:

  • Our team created the concept of an “ubernode”, a content type which homogenizes fields across historically varied content types (e.g., features, images, press releases, etc.). Implementing an “ubernode” enables easy integration of content in web services feeds, allowing developers to seamlessly pull multiple content types into a single, “latest news” feed. This approach also provides a foundation for the agency to truly embrace the “Create Once, Publish Everywhere” philosophy of content development and syndication to multiple channels, including mobile applications, GovDelivery, iTunes, and other third party applications.

  • Additionally, the team harnessed Drupal’s power to integrate with other content stores and applications, successfully ingesting content from blogs.nasa.gov, svs.gsfc.nasa.gov, earthobservatory.nasa.gov, www.spc.noaa.gov, etc., and aggregating the sourced content for publication.

  2. Optimizing the front-end by building with a client-side, front-end framework, as opposed to a theme. For this task, our team chose ember.js, distinguished by both its maturity as a framework and its emphasis on convention over configuration. Ember embraces model-view-controller (MVC), and also excels at performance by batching updates to the document object model (DOM) and bindings.

In another stride toward maximizing “Headless” Drupal’s massive potential, we configured the site so that JSON feed records are published to an Amazon S3 bucket as an origin for a content delivery network (CDN), ultimately allowing for a high-security, high-performance, and highly available site.
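The publish-to-S3 step described above can be sketched in a few lines of Python. This is a minimal sketch, not NASA's actual implementation: the bucket name, key layout, and node field names are all invented for illustration.

```python
import json


def node_to_s3_record(node):
    """Build the S3 object key and JSON body for one node.

    The key layout and field names are hypothetical; the real
    NASA.gov feed schema is not shown in the post.
    """
    key = "api/nodes/%d.json" % node["nid"]
    body = json.dumps(node, sort_keys=True)
    return key, body


def publish_node(node, bucket="example-cdn-origin"):
    """Push one node record to the S3 bucket the CDN uses as origin.

    Requires boto3 and AWS credentials; the bucket name is made up.
    """
    import boto3  # deferred import so the pure helper works without AWS

    key, body = node_to_s3_record(node)
    boto3.client("s3").put_object(
        Bucket=bucket,
        Key=key,
        Body=body.encode("utf-8"),
        ContentType="application/json",
    )
```

With the JSON records sitting in S3, the CDN serves them as static objects, which is what makes the setup high-performance and highly available.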

Below is an example of how the technology stack which we implemented works:

Using ember.js, the NASA.gov home page requests a list of nodes of the latest content to display. Drupal provides this list as a JSON feed of nodes:
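The original post illustrated this response with a screenshot. As a rough sketch, such a feed might look something like the following; the field names and values are invented for illustration, not NASA's actual schema:

```json
{
  "nodes": [
    { "node": { "nid": "436041", "type": "ubernode", "promotedToHomepage": "1" } },
    { "node": { "nid": "436040", "type": "ubernode", "promotedToHomepage": "1" } }
  ]
}
```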

Ember then retrieves specific content for each node. Again, Drupal provides this content as a JSON response stored on Amazon S3:

Finally, Ember distributes these results into the individual items for the home page:

The result? A NASA.gov architected for the future. It is worth noting that upgrading to Drupal 8 can be done without reconfiguring the ember front-end. Further, migrating to another front-end framework (such as Angular or Backbone) does not require modification of the Drupal CMS.

Feb 09 2015

TLPL; j'ai changé de logiciel pour la gestion de mon blog, de Drupal à Ikiwiki.

TLDR; I have changed my blog from Drupal to Ikiwiki.

http://anarcat.koumbit.org/ will continue operating for a while to give feed aggregators a chance to pick up this article. It will also give the Internet Archive time to catch up with the static stylesheets (it turns out it doesn't like Drupal's CSS compression at all!). An archive will therefore remain available on the Internet Archive for people who miss the old stylesheet.

I have redirected the http://anarcat.koumbit.org URL to the new blog location, http://anarc.at/blog. This will be my last blog post written on Drupal, and all new content will be available on the new URL. RSS feed URLs should not change.

I have migrated away from Drupal because it is basically impossible to upgrade my blog from Drupal 6 to Drupal 7. Or if it is, I'll have to redo the whole freaking thing again when Drupal 8 comes along.

And frankly, I don't really need Drupal to run a blog. A blog was originally a really simple thing: a web log. A set of articles written on the corner of a table. Now with Drupal, I can add ecommerce, a photo gallery and whatnot to my blog, but why would I do that? And why does it need to be a dynamic CMS at all, if I get so few comments?

So I'm switching to ikiwiki, for the following reasons:

  • no upgrades necessary: well, not exactly true: I still need to upgrade ikiwiki, but that's covered by Debian package maintenance, I only have one patch to it, and there's no data migration! (the last such migration in ikiwiki was in 2009 and was fully supported)
  • offline editing: this is a big thing for me: I can just note things down and push them when I get back online
  • one place for everything: this blog is where I keep my notes, it's getting annoying to have to keep track of two places for that stuff
  • future-proof: extracting content from ikiwiki is amazingly simple. every page is a single markdown-formatted file. that's it.

Migrating will mean abandoning the barlow theme, which was seeing declining usage anyways.

So what should be exported exactly? There's a bunch of crap in the old blog that I don't want: users, caches, logs, "modules", and the list goes on. Maybe it's better to make a list of what I need to extract:

  • nodes
    • title (meta title and guid tags, guid to avoid flooding aggregators)
    • body (need to check for "break comments")
    • nid (for future reference?)
    • tags (should be added as [[!tag foo bar baz]] at the bottom)
    • URL (to keep old addresses)
    • published date (meta date directive)
    • modification date (meta updated directive)
    • revisions?
    • attached files
  • menus
    • RSS feed
    • contact
    • search
  • comments
    • author name
    • date
    • title
    • content
  • attached files
    • thumbnails
    • links
  • tags
    • each tag should have its own RSS feed and latest posts displayed

I had planned to do this before summer 2015, but it turned out to be fairly easy and fun, so I spent two evenings working on a script on February 5th and 6th, and finally turned off the Drupal site on Monday, February 9th.

Well, me; who else? You probably really don't care about that, so let's get to the meat of it.

How to perform this migration... There are multiple paths:

  • MySQL commandline: extracting data using the commandline mysql tool (drush sqlq ...)
  • Views export: extracting "standard format" dumps from Drupal and parsing them (JSON, XML, CSV?)

Both approaches had issues, and I found a third way: talk directly to mysql and generate the files directly, in a Python script. But first, here are the two previous approaches I know of.

MySQL commandline

LeLutin switched using MySQL queries, although he doesn't specify how the content itself was migrated. Comments were imported with this script:

echo "select n.title, concat('| [[!comment   format=mdwn|| username=\"', c.name, '\"|| ip=\"', c.hostname, '\"|| subject=\"', c.subject, '\"|| date=\"', FROM_UNIXTIME(c.created), '\"|| content=\"\"\"||', b.comment_body_value, '||\"\"\"]]') from node n, comment c, field_data_comment_body b where n.nid=c.nid and c.cid=b.entity_id;" \
  | drush sqlc | tail -n +2 | while read line; do
    if [ -z "$i" ]; then i=0; fi
    title=$(echo "$line" | sed -e 's/[    ]\+|.*//' -e 's/ /_/g' -e 's/[:(),?/+]//g')
    body=$(echo "$line" | sed 's/[^|]*| //')
    mkdir -p ~/comments/$title
    echo -e "$body" > ~/comments/$title/comment_$i._comment
    i=$((i+1))
done

Kind of ugly, but beats what i had before (which was "nothing").

I do think it is the right direction to take: simply talk to the MySQL database, maybe with a native Python script. I know the Drupal database schema pretty well (still! this is D6 after all) and it's simple enough that this should just work.
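A minimal sketch of that direct approach might look like this. It is only a sketch, assuming the standard Drupal 6 schema (node joined to node_revisions on vid); the filename function only roughly mirrors the sed expressions in the shell pipeline above.

```python
import re


def title_to_filename(title):
    """Turn a node title into an ikiwiki page name, roughly mirroring
    the sed expressions from the shell pipeline."""
    name = re.sub(r'\s+\|.*$', '', title)   # drop anything after a pipe
    name = name.replace(' ', '_')           # spaces become underscores
    name = re.sub(r'[:(),?/+]', '', name)   # strip punctuation
    return name


def export_nodes(db):
    """Yield (filename, title, body) for published D6 nodes.

    `db` is any DB-API connection (e.g. from MySQLdb or pymysql);
    the query follows the Drupal 6 schema.
    """
    cur = db.cursor()
    cur.execute("""SELECT n.title, r.body
                   FROM node n JOIN node_revisions r ON n.vid = r.vid
                   WHERE n.status = 1""")
    for title, body in cur.fetchall():
        yield title_to_filename(title) + '.mdwn', title, body
```

Writing each body out to its `.mdwn` file (with the meta directives prepended) is then a small loop on top of `export_nodes()`.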

Views export

screenshot of views 2.x

mvc recommended views data export on LeLutin's blog. Unfortunately, my experience with the views export interface has been somewhat mediocre so far. Yet another reason why I don't like using Drupal anymore is this kind of obtuse dialog:

I clicked through those for about an hour to get JSON output that turned out to be provided by views bonus instead of views_data_export. And confusingly enough, the path and format_name fields are null in the JSON output (whyyy!?). views_data_export unfortunately only supports XML, which seems hardly better than SQL for structured data, especially considering I am going to write a script for the conversion anyways.

Basically, it doesn't seem like any amount of views mangling will provide me with what i need.

Nevertheless, here's the failed-export-view.txt that I was able to come up with, may it be useful for future freedom fighters.

Python script

I ended up making a fairly simple Python script to talk directly to the MySQL database.

The script exports only nodes and comments, and nothing else. It makes a bunch of assumptions about the structure of the site, and is probably only going to work if your site is a simple blog like mine, but could probably be improved significantly to encompass larger and more complex datasets. History is not preserved so no interaction is performed with git.

Generating dump

First, I imported the MySQL dump file into my local MySQL server for easier development. It is 13.9MiB!!

mysql -e 'CREATE DATABASE anarcatblogbak;'
ssh aegir.koumbit.net "cd anarcat.koumbit.org ; drush sql-dump" | pv | mysql anarcatblogbak

I decided to not import revisions. The majority (70%) of the content has 1 or 2 revisions, and those with two revisions are likely just when the node was actually published, with minor changes. ~80% have 3 revisions or less, 90% have 5 or less, 95% 8 or less, and 98% 10 or less. Only 5 articles have more than 10 revisions, with two having the maximum of 15 revisions.

Those stats were generated with:

SELECT title, COUNT(vid) FROM anarcatblogbak.node_revisions GROUP BY nid;

Then throwing the output in a CSV spreadsheet (thanks to mysql-workbench for the easy export), adding a column numbering the rows (B1=1,B2=B1+1), another for generating percentages (C1=B1/count(B$2:B$218)) and generating a simple graph with that. There were probably ways of doing that more cleanly with R, and I broke my promise to never use a spreadsheet again, but then again it was Gnumeric and it's just to get a rough idea.
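The cumulative percentages above can also be computed without a spreadsheet. Here is a small sketch in Python; the sample data is invented to resemble the distribution described, not taken from the actual database.

```python
from collections import Counter


def revision_percentiles(rev_counts):
    """Given per-node revision counts, return a dict mapping each
    revision count k to the fraction of nodes with at most k revisions."""
    total = len(rev_counts)
    hist = Counter(rev_counts)
    cumulative, shares = 0, {}
    for k in sorted(hist):
        cumulative += hist[k]
        shares[k] = cumulative / total
    return shares


# Invented sample: most nodes have 1-2 revisions, a few have more.
sample = [1] * 120 + [2] * 40 + [3] * 20 + [5] * 10 + [10] * 4 + [15] * 2
shares = revision_percentiles(sample)
```

Feeding it the real `COUNT(vid)` column from the query above would reproduce the "~80% have 3 revisions or less" style of figures directly.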

There are 196 articles to import, with 251 comments, which means an average of about 1.3 comments per article (not much!). Unpublished articles (5!) are completely ignored.

Summaries are also not imported as such (break comments are ignored) because ikiwiki doesn't support post summaries.

Calling the conversion script

The script is in drupal2ikiwiki.py. It is called with:

./drupal2ikiwiki.py -u anarcatblogbak -d anarcatblogbak blog -vv

The -n and -l1 flags were also used for the first tests. Use this command to generate HTML from the result without having to commit and push everything:

ikiwiki --plugin meta --plugin tag --plugin comments --plugin inline  . ../anarc.at.html

More plugins are of course enabled in the blog; see the setup file for more information, or just enable plugins as needed to unbreak things. Use the --rebuild flag on subsequent runs. The actual invocation I use is more like:

ikiwiki --rebuild --no-usedirs --plugin inline --plugin calendar --plugin postsparkline --plugin meta --plugin tag --plugin comments --plugin sidebar  . ../anarc.at.html

I had problems with dates, but it turns out that I wasn't setting dates in redirects... Instead of doing that, I started adding a "redirection" tag that gets ignored by the main page.

Files and old URLs

The script should keep the same URLs, as long as pathauto is enabled on the site. Otherwise, some logic should be easy to add to point to node/N.

To redirect to the new blog, the rewrite rules on the original blog should be as simple as:

Redirect / http://anarc.at/blog/

When we're sure:

Redirect permanent / http://anarc.at/blog/

Now, on the new blog, some magic needs to happen for files. Both /files and /sites/anarcat.koumbit.org/files need to resolve properly. We can't use symlinks because ikiwiki drops symlinks on generation.

So I'll just drop the files in /blog/files directly, the actual migration is:

cp -r $DRUPAL/sites/anarcat.koumbit.org/files $IKIWIKI/blog/files
rm -r .htaccess css/ js/ tmp/ languages/
rm foo/bar # wtf was that.
rmdir *
sed -i 's#/sites/anarcat.koumbit.org/files/#/blog/files/#g' blog/*.mdwn
sed -i 's#http://anarcat.koumbit.org/blog/files/#/blog/files/#g' blog/*.mdwn
chmod -R -x blog/files
sudo chmod -R +X blog/files

A few pages to test images:

There are some pretty big files in there, 10-30MB MP3s - but those are already in this wiki! so do not import them!

Running fdupes on the result helps find oddities.

The meta guid directive is used to keep the aggregators from finding duplicate feed entries. I tested it with Liferea, but it may freak out some other sites.

Remaining issues

  • postsparkline and calendar archive disrespect meta(date) - filed upstream bug
  • merge the files in /communication with the ones in /blog/files before import - done!
  • import non-published nodes (ignored for now)
  • check nodes with a format other than Markdown (only a few with 3=Full HTML found so far)
  • replace links to this wiki in blog posts with internal links

More progress information in the script itself.

Created at noon on Monday, February 9th, 2015. Edited Thursday afternoon, September 10th, 2015.
Jan 14 2015

I'm updating a Drupal 6 theme to Drupal 8.  One thing I'm doing is making the logo in my Twig template a Twig variable instead of hardcoding the path.  Here's how you do it.  This assumes a theme named 'acton', but you'll change that to your own theme's name.

In 'acton.theme', assuming your logo is 'logo.png' in your theme's root:

function acton_preprocess_page(&$variables) {
  $variables['logopath'] = '/' . drupal_get_path('theme', 'acton') . '/logo.png';
}

In your Twig template, do something like this:

<img class="img-responsive" src="{{ logopath }}" />


Jan 08 2014

Bootstrap is a great Drupal theme that makes it so your form elements and other Drupal things get output with proper Twitter Bootstrap CSS attributes.  One downside to it is that the popular Webform module has elements that don't get styled by the Bootstrap Drupal theme and then they look like unstyled form fields.

To fix it, go to Bootstrap's theme/process.inc file.  Inside it, add 'webform_email' to the 'types' array in the _bootstrap_process_input() function.  This will style Webform's special email field.  Other fields likely have different types.  The reason it doesn't get styled is because the 'types' array is coded to look for only default types and not the special ones that Webform is using.

If you want to see what the #type is on an element, I recommend installing the Devel module and calling "dpm($element);" inside the theme/alter.inc bootstrap_element_info_alter() function.  This will output all of the elements on your current Webform.

Happy Bootstrapping!

Sep 25 2013

If you're doing any theming with Drupal, you'll undoubtedly want to implement template suggestions for some of your fields at some point.  Usually you'll have some undesirable formatting, especially in Panels panes.  This post at Drupal.org has a method of how to find the appropriate template suggestion for your panel pane by working in your template.php file with the Devel module and dpm().

One gotcha would be with pane subtypes that have colons in them.  This can happen with tokens relatively often.  The answer is to substitute the colon with an underscore in the filename.  For instance, I was theming a token price with the pane type 'token' and pane subtype 'node:price'.  The correct template suggestion is 'panels-pane--token--node_price.tpl.php'.
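The colon-to-underscore rule can be expressed as a tiny helper. This is just a sketch of the naming convention; Drupal's theme layer does this mapping internally when it resolves template suggestions.

```python
def pane_template_suggestion(pane_type, subtype):
    """Build a Panels pane template filename, replacing the colons
    (illegal in template filenames) with underscores."""
    subtype = subtype.replace(':', '_')
    return 'panels-pane--%s--%s.tpl.php' % (pane_type, subtype)
```

For the example in the post, `pane_template_suggestion('token', 'node:price')` yields `panels-pane--token--node_price.tpl.php`.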

Note that this technique with the colons only works after applying this patch with the 7.x-3.3 version of Panels!

Feb 22 2013

Simplifying Wordpress and Drupal configuration

At last year's Drupalcon in Denver there was an excellent session called Delivering Drupal.  It had to do with the oftentimes painful process of deploying a website to web servers.  This was a huge deep dive session that went into the vast underbelly of devops and production server deployment.  There were a ton of great nuggets, and I recommend watching the session recording for serious web developers.

The most effective takeaway for me was the manipulation of the settings files for your Drupal site, which was only briefly covered but not demonstrated.  The seed of this idea that Sam Boyer presented got me wondering about how to streamline my site deployment with Git.  I was using Git for my Drupal sites, but not effectively for easy site deployment.  Here are the details of what I changed with new sites that I build.  This can be applied to Wordpress as well, which I'll demonstrate after Drupal.

Why would I want to do this?

When you push your site to production you won't have to update a database connection string after the first time.  When you develop locally you won't have to update database connections, either.

Streamlining settings files in Drupal

Drupal has the following settings file for your site:

sites/default/settings.php

This becomes a read only file when your site is set up and is difficult to edit.  It's a pain editing it to run a local site for development.  Not to mention if you include it in your git repository, it's flagged as modified when you change it locally.

Instead, let's go ahead and create two new files:

sites/default/settings.local.php
sites/default/settings.production.php

Add the following to your .gitignore file in the site root:

sites/default/settings.local.php

This will put settings.php and settings.production.php under version control, while your local settings.local.php file is not.  With this in place, remove the $databases array from settings.php.  At the bottom of settings.php, insert the following:

$settingsDirectory = dirname(__FILE__) . '/';
if (file_exists($settingsDirectory . 'settings.local.php')) {
    require_once($settingsDirectory . 'settings.local.php');
}
else {
    require_once($settingsDirectory . 'settings.production.php');
}

This code tells Drupal to include the local settings file if it exists, and if it doesn't it will include the production settings file.  Since settings.local.php is not in Git, when you push your code to production you won't have to mess with the settings file at all.  Your next step is to populate the settings.local.php and settings.production.php files with your database configuration.  Here's my settings.local.php with database credentials obscured.  The production file looks identical but with the production database server defined:

    $databases['default']['default'] = array(
      'driver' => 'mysql',
      'database' => 'drupal_site_db',
      'username' => 'db_user',
      'password' => 'db_user_password',
      'host' => 'localhost',
      'prefix' => '',
    );

Streamlining settings files in Wordpress

Wordpress has a similar process to Drupal, but the settings files are a bit different.  The config file for Wordpress is the following in site root:

wp-config.php

Go ahead and create two new files:

wp-config.local.php
wp-config.production.php

Add the following to your .gitignore file in the site root:

wp-config.local.php

This will make it so wp-config.php and wp-config.production.php are under version control when you create your Git repository, but wp-config.local.php is not.  The local config will not be present when you push your site to production.  Next, open the Wordpress wp-config.php and remove the defined DB_NAME, DB_USER, DB_PASSWORD, DB_HOST, DB_CHARSET, and DB_COLLATE variables.  Insert the following in their place:

/** Absolute path to the WordPress directory. */
if ( !defined('ABSPATH') ) {
    define('ABSPATH', dirname(__FILE__) . '/');
}
if (file_exists(ABSPATH . 'wp-config.local.php')) {
    require_once(ABSPATH . 'wp-config.local.php');
}
else {
    require_once(ABSPATH . 'wp-config.production.php');
}

This code tells Wordpress to include the local settings file if it exists, and if it doesn't it will include the production settings file. Your next step is to populate the wp-config.local.php and wp-config.production.php files with your database configuration.  Here's my wp-config.local.php with database credentials obscured.  The production file looks identical but with the production database server defined:

// ** MySQL settings - You can get this info from your web host ** //
/** The name of the database for WordPress */
define('DB_NAME', 'db_name');
/** MySQL database username */
define('DB_USER', 'db_user');
/** MySQL database password */
define('DB_PASSWORD', 'db_user_password');
/** MySQL hostname */
define('DB_HOST', 'localhost');
/** Database Charset to use in creating database tables. */
define('DB_CHARSET', 'utf8');
/** The Database Collate type. Don't change this if in doubt. */
define('DB_COLLATE', '');

What's next?

Now that you're all set up to deploy easily to production with Git and Wordpress or Drupal, the next step is to actually get your database updated from local to production.  This is a topic for another post, but I've created my own set of Unix shell scripts to simplify this task greatly.  If you're ambitious, go grab my MySQL Loaders scripts that I've put on Github.

Nov 03 2010

Technological leadership

Active in product innovation and open source software services, we build lasting value for our customers and the Drupal community.

Empowering relationships

Caring about clients and the outcome of their projects, we get involved and make meaningful relationships that give our customers maximum independence.

Community engagement

Dedicated to fostering the Drupal open source community, we catalyze event and development initiatives, resulting in real change and growth.

Sep 21 2010

One of the things that Dries said during his keynote at DrupalCon Copenhagen 2010 really stuck with me, and I've been trying to follow through. What he said was something like "if you don't schedule time to work on Drupal, it won't happen". Sadly, I've been finding this is true, so I decided I'd schedule three hours on Tuesday morning for Drupal. The plan is to wake up early-ish as usual, respond to email from my students, and then devote my morning to Drupal. Of course, I have some other work-related Drupal things happening, but this time will be for personal projects, the Drupal issue queue, and so on.

Mar 23 2010

The Idea

A number of months back, a group of us had the idea to create a software co-operative. There were several tenets that we decided to follow:

  • The company wouldn't have any employees -- everybody involved would have 1099 status as an independent contractor
  • The company would be formed as a Limited Liability Company - we chose the state of Delaware
  • The company would seek to have a $0 cash and asset value at the end of each year
  • The company would be virtual to keep costs low
  • We would focus on working with open source projects like Drupal

The Setup

We set the company up using Instacorp. The benefit was the speed at which we could set up the company with an automatic legal presence in Delaware. The people at Instacorp made the process incredibly simple, asking a few questions. Within days the legal documents were delivered. That, in itself, really didn't make the company real.

After receiving the legal documents, it was necessary to obtain an FEIN for tax purposes. This is a simple process on the IRS site - it just takes a few minutes and you get the documentation electronically.

We needed to decide how to be taxed.
LLCs report taxes in one of three ways:

  • Disregarded entity (limited to single-member LLCs)
  • Partnership (default if other elections are not made)
  • Corporation, electing to be taxed as a pass-through entity, called an S corporation

In the case of Vintage Digital, workers are paid for what they do, which is reported to them on Form 1099 as commissions.
Compensation will vary greatly, since it is entirely based on hours worked and on percentages for those who find clients and shepherd them through the contracting process. The LLC is a virtual corporation with exceedingly low or non-existent overhead. There is no intent to use the LLC other than as a distribution method for sharing work; profits/losses will be kept to a minimum. There is no intent to hold fixed assets or incur debt.

In the final analysis both partnership and S-corporation reporting would be the same.

Our accountant indicated that the following things were recommended:

  • Use "S" corporation for tax reporting because the laws are better understood and simpler.
  • The majority of LLCs elect to report taxes as pass-through corporations, so even the IRS is more familiar with these tax laws.

We had an S-corp election to indicate how we were going to be taxed - after the election that document needs to be sent to the IRS.

We needed a bank account and opted for a bank that had free business checking and had online bill pay. The bank required our Articles and two forms of identification. We also needed a copy of our FEIN letter from the IRS.

The Tools
All companies need tools to help run things on a day to day basis. A virtual venture is no different. We needed management tools, communication tools, invoicing/book keeping software, and ways to manage contracts. To that end we sought out different solutions that would provide us with ways to sensibly manage ourselves and our projects.

  • Skype for Communication (both voice and chat)
  • Open Atrium for a client intranet and as an internal planning tool.
  • Bamboo Invoice for invoicing clients (although that might change as we transition to QuickBooks)
  • Drupal for our Web presence
  • dotProject for time record keeping and ticketing
  • Office for estimates, calculating commission shares, and contracts
  • Google Voice for incoming phone calls - the rest of us use our own mobiles

The Team

The team is comprised of:

All of us have contributed (and continue to contribute) to the Drupal project and are heavily involved in our local communities.

The team keeps in pretty much constant contact through Skype. We try to meet about once a month to have member/board meetings. They have occurred in restaurants for brunch, members' homes, and even a bowling alley - pretty much anywhere that is quiet enough to get through company issues. Fortunately for our crew, we all live within 30 miles or so of one another, which makes getting together fairly easy.

Each project gets its own Skype room, project in Open Atrium, project in dotProject, and commission spreadsheet.

As clients come in, we assess who has bandwidth for a given project, the goal being to ensure that each co-op member has enough work (in and outside of the co-op) to make a reasonable living. Co-op members are free to work as much or as little as they want (given the work is available). This arrangement was designed to give our team as much flexibility as possible.

Jan 15 2009

Today is the eighth birthday of Drupal, the PHP website platform that I'm most involved with. In Drupal creator Dries Buytaert's own words from the news release:

When I started work on Drupal as a graduate student, Drupal was just a little hobby project grown out of my own interest in the web. As you can tell from the original release notes, being the only programmer certainly had its charms. ;-)

Fast forward 8 years, and we're a global project with hundreds of thousands of users, thousands of active contributors and a healthy ecosystem. Along the way, I've always tried to listen to the community, and to trust my own instincts and moral compass. We built an amazing community together, and because of that, working on Drupal continues to be a labor of love. Even after eight years.

A big project can't always do what a small project can; there is more legacy and overhead, but nonetheless, I think what is important is that we stayed true to our initial values: innovation, collaboration and a healthy desire to keep the code as small and simple as possible.

I'm very much looking forward to where the next eight years takes Drupal and the web at large!

If you are interested in finding out more about Drupal, have a look at groups.drupal.org to see if there is a group near you or head to DC in March for Drupalcon. I just attended my first Portland Drupal meeting last night and hope to be more involved with the community here going forward.

Viva la Drupal!


Jan 05 2009

forum icon

At last, I'm happy to announce the Code Sorcery Workshop support forums! These forums will gradually become the official support channel for our Mac products Meerkat and Pukka, as well as a place to discuss what's on your mind with regard to our website, potential future products, our services, or happenings in the Mac & Drupal communities.

The forums have been open for a week or two in unannounced form, but have quite expectedly not garnered much activity, so consider this the official "word". Feel free to go to it!

Feature Run-Down

We are using Drupal for the forum solution, which is what is used for the rest of the website as well. I'd like to take a moment to go over some of the features that this provides. In the near future, I also hope to make another post about the more technical details, such as which modules were used, what kind of custom solutions were implemented, and what administrative features are provided on the backend.

Main Page

The main page gives an at-a-glance view of the latest topics, much like any forum software. Posts are organized into containers, such as Mac OS X Products, and below that, forums addressing a particular product or group of topics, such as Pukka. When new topics are posted under a forum, they bubble up to the top.

Forums main page (click to enlarge)

Your Account

To participate in the forums, you must register for an account. With this account, you can maintain a unique identity across all of the posts. You can include as much or as little information as you like, currently including real name, photo or avatar, physical location, and website. This information is only available to other forum users -- only your username is available publicly.

Account page (click to enlarge)

In addition, you get a box in the right sidebar with easy access to My forum posts (posts created by you) and My forum votes (posts you've voted on).

User box


Search

Just like the blog archives and all of the pages on the site, forum topics are searchable. And these searches can be bookmarked, so you can easily check back frequently for updates related to a topic you are interested in.

Topic Voting

Aside from easy access to any forum topics that you may have created yourself, you can also vote on anyone else's topics using a zero- to five-star rating system. Perhaps the best use for this feature is that you can use this to flag topics that you are interested in periodically checking back on. Another use might be a tip that you really want others to see or a feature request that you'd like to weigh in on.

The popular topics get aggregated to a special page called top forum topics where they can be easily tracked. I'm hoping that this can be a useful way to chart the future direction of our applications, as well as to more easily resolve important issues affecting many customers.

Topic voting

Feeds, Feeds, Feeds

One of the strongest features of the forums is easy and plentiful RSS feeds. Currently, you can access feeds for:

  • Container activity: For example, all posts about Pukka. Just add /feed to the end of any container URL.
  • Topic activity: If you make a post, you subscribe to all comments on the post by clicking the link on the topic page or adding /feed to the URL.

Forum feeds (click to enlarge)


In conclusion, I'm happy to launch the forums and I hope that they will be of benefit to users of our products, Mac, iPhone, and Drupal enthusiasts, and folks interested in our services, for starters. Please, if you have any suggestions or feedback, consider using the General Discussion forum topic.

Enjoy the forums!


Oct 19 2007

A new and very interesting project with Drupal in Munich renewed my interest in learning German, something I've had pending for too long. This time I decided to try harder and finally learn Hermann Hesse's tongue.

I already knew that finding the time and right teacher was not an easy task so I started searching online. I had already tried a few options, including the quite helpful and funny free course from Deutsche Welle, but now I wanted something that could teach me German as fast as possible and I would pay for it.

That's how I found Rosetta Stone, a company offering software used by many people all over the world, including executives and employees from some very big businesses. The system Rosetta Stone uses is called Dynamic Immersion, and it works quite well, connecting you with the new language (they have thirty available) from the very first moment.

Using nice pictures, different voices, and a series of interactive exercises, the student can easily pick up the basic concepts behind the language. I'm almost one week into my Rosetta Stone course and I can say I'm right on track. The project with Drupal went quite well, and at some point I could just program and design without even noticing all the text was in German.

I purchased the online version of Rosetta Stone, a little more than US$100 for a 3-month subscription, to avoid additional shipping and customs costs, and of course to save time. The only problem I've found so far, as a Linux user, is that the software requires Adobe Shockwave, available only for Windows and Mac OS - but that's an Adobe issue, really.

Obviously this is not the end of the road for my German classes. I will take other courses in the future, probably Berlitz Online, much more expensive and, obviously, will keep listening to Rammstein and Tokio Hotel as much as possible.

Auf Wiedersehen!

About Drupal Sun

Drupal Sun is an Evolving Web project. It allows you to:

  • Do full-text search on all the articles in Drupal Planet (thanks to Apache Solr)
  • Facet based on tags, author, or feed
  • Flip through articles quickly (with j/k or arrow keys) to find what you're interested in
  • View the entire article text inline, or in the context of the site where it was created

See the blog post at Evolving Web
