Aug 10 2013

We rely first on a clear understanding of the Drupal site’s requirements, including performance requirements and the expected load on the site. A module that has proven effective through the direct experience of our team is always highly desirable.

The following points describe tests we apply for module selection once we are confident that the requirements are clearly defined and understood.

  • Is the module compatible with known requirements?
  • Does the module provide functional compatibility with other modules or functionalities in the site?
  • Is the module available and suitable for the selected version of Drupal (D6, D7 or perhaps D8)?

If a module appears to be suitable then we further consider our full understanding of the site’s functional requirements, including its ongoing maintenance and expected shelf life. With this context, can the requirements be sufficiently realised with the module out of the box, or would we develop on top of the module’s API? For this consideration we also aim to estimate how much time we would need to implement the required functionality with the proposed module as opposed to implementation via a custom module developed from scratch.

After these considerations, a short list of modules or approaches can be weighed alongside the pros and cons of each, allowing a business decision on which module or integration approach provides the most benefit to our client.

When researching a module, information is found via the following methods:

  • review of the module home page
  • review of home page linked resources (Demos, video walkthroughs, FAQs, use cases, etc.)
  • review of the module README file
  • review of the module API file
  • review of any commentary on external blogs and groups.drupal.org
  • review of the module’s health via
    • Bugs Reports
    • Last modified date
    • Usage statistics and install base
    • Use Coder module to test Drupal Best Practice implementation
    • Use Devel module to test possible module performance problems

We also maintain a library of modules which we know we can comfortably use in almost any D7 project, such as:

  • views
  • panels
  • panels_everywhere
  • ctools
  • elysia_cron

The selection of modules may also be shaped by architecture decisions that we make for the project.

Aug 10 2013

The following provides a range of areas where performance can be improved by using various caching options or server architecture improvements. We assume here that your SQL queries and the web application code are already optimised for performance. There are a number of options outlined below that will assist with supporting load surges. Our usual recommendation is to mix a variety of caching approaches with load balanced web application servers and a persistent DB and file system server.

Option 1: Manage your Drupal cache

By default, Drupal stores all caches in the database (though these are ideally stored in memcached; see below).

The caching strategy can differ from site to site, but these are the likely configurations you would enable:

  • Cache pages for anonymous users (cache_page)
  • Cache blocks (cache_blocks)
  • Cache (cache)
  • User sessions (sessions)
  • Compress pages
  • Aggregate and compress CSS files
  • Aggregate JavaScript files

The minimum cache lifetime and the expiration of cached pages should be set according to the needs of the site (e.g. 30 minutes).
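As a sketch, the settings above can also be locked in outside the admin UI by overriding the standard Drupal 7 performance variables in settings.php (the 30 minute value simply follows the example lifetime above):

```php
<?php
// Sketch: enforce the performance settings from settings.php, overriding
// the admin/config/development/performance UI.
$conf['cache'] = 1;                       // page cache for anonymous users
$conf['block_cache'] = 1;                 // block cache
$conf['cache_lifetime'] = 1800;           // minimum cache lifetime (30 min)
$conf['page_cache_maximum_age'] = 1800;   // max age for external caches (30 min)
$conf['page_compression'] = 1;            // compress cached pages
$conf['preprocess_css'] = 1;              // aggregate and compress CSS files
$conf['preprocess_js'] = 1;               // aggregate JavaScript files
```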

Some modules offer additional caches that reduce the execution time of pages; these are essential to configure.
 
For Example:

The views module:
- caching query results
- caching the display rendering
- caching views blocks

The panels module:
- caching panels on pages

Option 2: Install and use Varnish caching

Install Varnish, which is an HTTP accelerator designed for content-heavy dynamic web sites (ref: www.varnish-cache.org).
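For Varnish to be effective, Drupal needs to know it sits behind a reverse proxy. A minimal sketch of the Drupal 7 settings.php changes (the proxy address is an assumption for a same-host Varnish instance):

```php
<?php
// Sketch: let Drupal 7 cooperate with a reverse proxy such as Varnish.
$conf['reverse_proxy'] = TRUE;
// IPs of your Varnish instance(s); 127.0.0.1 assumes Varnish runs locally.
$conf['reverse_proxy_addresses'] = array('127.0.0.1');
// Drop the Vary: Cookie header so Varnish can cache anonymous page views.
$conf['omit_vary_cookie'] = TRUE;
```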

Option 3: Install and use Memcached

Using Memcached is recommended to move the Drupal cache out of the MySQL database, which is usually already under heavy load, and onto dedicated infrastructure (ref: http://memcached.org/).
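A typical Drupal 7 wiring for this, following the memcache module’s documented settings.php approach (the module path and server address are assumptions for a single-server setup):

```php
<?php
// Sketch: route the default Drupal cache bins to memcached via the
// memcache module (see Option 4 below).
$conf['cache_backends'][] = 'sites/all/modules/memcache/memcache.inc';
$conf['cache_default_class'] = 'MemCacheDrupal';
// Forms must survive cache clears, so keep cache_form in the database.
$conf['cache_class_cache_form'] = 'DrupalDatabaseCache';
// memcached daemon address => cluster name.
$conf['memcache_servers'] = array('127.0.0.1:11211' => 'default');
```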

Option 4: Install and configure as required, additional Drupal cache modules

Memcache – http://drupal.org/project/memcache
Recommended: useful for the administration and implementation of the Drupal cache storage in memcached.

Cache Actions – http://drupal.org/project/cache_actions
Recommended: Allows smart cache invalidation and management

Varnish – http://drupal.org/project/varnish
Recommended: integrates Drupal with the Varnish HTTP accelerator, including cache invalidation.

Auth cache – http://drupal.org/project/authcache
Recommended: offers page caching for both anonymous users and logged-in authenticated users

Performance hacks – http://drupal.org/project/performance_hacks
Helpful but not necessary: provides specific improvements in performance.

Entity Cache – http://drupal.org/project/entitycache
Highly recommended: activates caching for Drupal entities.

Path Cache – http://drupal.org/project/pathcache
Highly recommended: activates caching for path alias lookups.

ESI API – http://drupal.org/project/esi_api
Not recommended because of few users and community activity. However, it uses the menu router ‘theme callback’ and ‘delivery callback’ for integrating to core, which makes the approach efficient and comprehensible (no exit() in AJAX/ESI callbacks).

Option 5: Use MongoDB instead of MySQL

Using MongoDB can be considered as an alternative storage backend to MySQL. This option is effective, but it is not suitable for every project, particularly if you lack the time to configure everything correctly.

MongoDB is a relatively young product and the possible impacts on development processes can be very high. Retaining skill sets for support and maintenance periods also needs to be considered.
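As a rough sketch of what this looks like in practice with the Drupal 7 mongodb module (module path, connection keys and database name are assumptions and depend on the module version you deploy):

```php
<?php
// Sketch: point Drupal's cache bins at MongoDB instead of MySQL
// using the mongodb module's cache submodule.
$conf['mongodb_connections'] = array(
  'default' => array(
    'host' => 'localhost',  // assumed local mongod instance
    'db' => 'drupal',       // assumed database name
  ),
);
$conf['cache_backends'][] = 'sites/all/modules/mongodb/mongodb_cache/mongodb_cache.inc';
$conf['cache_default_class'] = 'DrupalMongoDBCache';
```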

Option 6: Configure architecture to use multiple load balanced web application servers

Link is an advanced consulting partner for Amazon Web Services and has established a number of hosting environments for Drupal sites. Our best practice approach is to have a dedicated DB server with a volume shared over NFS (this instance can also be used for SMTP). The NFS share contains all code and user uploaded files, such as images and PDF documents. A web application server is then created as a template and configured to scale horizontally within an auto scaling group (ASG).

Using a dedicated NFS also helps with the deployment of new code as production web application servers do not need to be updated outside of regular OS or application patching.

Option 7: Vertically scale server infrastructure

Link built, hosted and supported hackerspace.govhack.org using a minimally customised Drupal Commons 3 distribution. The Hackerspace site was highly active for a three day period but received additional traffic in the last hour of the competition deadline. To meet the additional load we scaled up the AWS instance from a single medium server to an extra large when we reached a 75% utilisation rate. Unfortunately, this method required a three-minute outage during a high-load period to scale up the single server. Such a solution is suitable if a planned outage of a few minutes can be taken prior to an event which is expected to generate high load.

Sep 23 2012

In 50 words or less (or, maybe 65)…

I’ll bring a strategic marketing framework to the Board – but don’t confuse this with posters and tweets :) I want to help the DA define its position within a broader economic view of the IT services industry and leverage the flow of value to benefit the Drupal project. 1st up: I’ll aim to bring new funds via Govt memberships and a grants funding initiative.

Now, the extended version…

I aim to bring new members from the user side of the community, create targeted global special interest user groups that help the project continue to thrive, and I aim to bring new funding opportunities to help support all the Association’s existing initiatives.

Where does Drupal sit in the economic value chain?

If you look at the entire IT industry, it is mostly part of a supply engine. Actual production of economic output occurs either via the platforms implemented or is facilitated by them (such as with an eCommerce website).

From what I’ve experienced so far, the Drupal community currently represents the aggregated needs of those working within this supply ecosystem. It has very little involvement with the actual outputs being produced by customers.

What are the long term threats to the Drupal project?

I see two significant threats to the Drupal project if it stands only within the economic footprint of IT services:

  1. The contraction of the IT services chain is real, so the market for Drupal will shrink and ultimately be replaced with services that emerge directly within hosted service platforms and are click-configured directly by user organisations.
  2. The Drupal services ecosystem will begin to cannibalise itself. Groups are already providing hosted distributions that can be deployed directly by clients without any additional partner support. Distribution contributors are pivoting toward monetisation strategies that will also aim to capture the direct customer dollar. In a more competitive atmosphere, where the business process IP of coded modules and distributions is seen as a possible revenue model, groups will begin to hedge their community efforts in favour of entirely commercial efforts.

What is an immediate opportunity for the Drupal project?

Drupal is being adopted by a lot of organisations worldwide that are choosing it over the online publishing platforms of proprietary software vendors. The opportunity exists for the Drupal project to engage with site owners and operators so it can better understand and aggregate their needs. The entire project could benefit significantly by stepping more into the world of end users, understand how to best add unique value to their production capability, then nurture and secure each vertical by investing resources to support that special interest group.

Do the Association goals align with this opportunity?

It’s important to note that the Drupal Association has no authority over the planning, functionality and development of the Drupal software. Even the Drupal.org sites, which are a responsibility of the Association and are in need of significant upgrades in content, structure, platform and functionality, are currently managed and maintained by teams of dedicated volunteers from the Drupal community.

Referring to the statutes for the Drupal Association, the goals are described as:

The purposes of the Association shall be for providing support in developing, communicating, promoting, distributing the Drupal project and in deploying an infrastructure in support of the Drupal project.

The scope of the “Drupal project” shall include the following specific issues: the Drupal open-source software system (http://drupal.org), the community of developers and users of this software and all associated activities and all infrastructures in their broadest sense considered to be required to further develop this project.

What does the community of developers and users of Drupal look like? This diagram helps illustrate the Drupal Community, which in turn helps to show what the scope of the Drupal project is.

The biggest population in the diagram falls into the category of ‘People that use Drupal’.

Who are these people? Depending on your perspective:

  1. They are Drupal site owners and operators
  2. They are the Drupal developers who setup the sites and might still maintain them from a technical perspective.
  3. They are both of the groups above.

No matter your perspective, the population of people that use Drupal is large enough to cover the specific interests of 7.2 million Drupal sites (as estimated in July 2010).

So, the Association’s goals are to support users and the community is defined to be inclusive of users, yet I suspect there are a lot less than 7.2 million interested parties currently being targeted for support by the Association under its current set of initiatives.

Let’s inform, interact and transact with those using Drupal to further develop the project for everyone in the Drupal community.

Where can new funding initiatives emerge?

With my own bias of being located in Canberra, the capital city of Australia, I see providing support to national governments as the closest and most significant opportunity. Collectively, governments already have billion dollar budgets for IT expenditure and they are open to petitions from non-profits to rethink how they spend those dollars to provide better economic and social outcomes.

Further, there are growing Open Government and Open Knowledge movements which the Association can align or partner with to help generate a voice that extends beyond the subject of software development and further into the realm of social publishing.

I will work with others to directly bring together the needs of National Government organizations and encourage them to join the Association as individuals, organizations and supporting partners.

I will encourage this group to seek ways it can work with Drupal developers and commercial organizations to further the Drupal project in alignment with their specific needs. This could initially be run as a grants program which funds specific development outcomes; such as best practice models for lifecycle management, enterprise architecture, security hardening and performance optimization.

If it works, the funding model scales by the formation of additional special interest groups (SIGs) within industry verticals such as finance, retail, entertainment, etc. Ultimately, to be truly scalable, the mechanics to generate SIG project funding could be modeled on existing crowdsourcing approaches used on sites like Kickstarter.com and bidded for by commercial groups or individuals who are registered members of the association. Self-organizing SIGs can make their own rules on deciding on who receives the development grants on offer and Association revenue can be derived through a small percentage of the funds allocated against SIG projects.

Importantly, all of this can be supported by the tasks the Association is entitled to undertake in support of its goals. All it needs is someone to kick things off with a business case to establish the initiative, and that is something I’m already working on.

If you are in support of public money funding significant development initiatives within the Drupal project, please consider casting your vote in my direction :) Either way, I’d love to receive your comments below.

Who can vote in the elections?

You are eligible to vote if you have an account on drupal.org, have logged in during the past 12 months, and created your account before 31 August 2012, when the election was announced.

How to vote?

Aug 22 2012

What is OpenPublic?

Maybe the first question should be “What is Drupal?”.  Drupal is a free open source content management system (CMS) that allows you to easily organize, manage and publish online content. It is secure, scalable, compliant, flexible and capable of more customisation than any one web site will ever need.

OpenPublic is a CMS based on Drupal. It is a pre-packaged product containing modules particularly valuable for government websites. It is created for, and by, industry professionals working with top level government agencies and organisations. As a global, community driven project, OpenPublic takes the open source Drupal CMS to the next level: tailoring it to government within a shared learning environment. Growing this open source CMS with genuine community input ensures its enduring relevance.

An active community

OpenPublic is a community project being spearheaded by US based content management and open data integration specialists Phase2 Technology. Contributors are generally administrators, site builders and developers. A number of web application and data visualisation tools have already been added to the mix and enthusiastic advocates continue to contribute code, tutorials, screencasts, and other documentation for OpenPublic.

But isn’t all open source software at odds with government security needs?

Not at all. OpenPublic is founded on US government security requirements and is also applicable to Australian government requirements as prescribed by the Defence Signals Directorate’s Australian Government Information Security Manual, particularly in regard to: standard operating environments; web and email applications; web application development; and databases.

Additionally, Link’s partnership with Acquia goes a long way towards removing historical and assumed risks around the use of open source software, and means our clients can have 24/7 access to expert Drupal support for break-fixing and advice.

How does Link Digital fit in with the whole OpenPublic CMS and community ethos?

For over ten years Link has been working with government clients to help them meet the demand for greater transparency, participation and collaboration, while satisfying their unique needs, not only in terms of security compliancy but also across accessibility and usability. OpenPublic provides the ideal model for us.

We are aligned with a strategic approach to the establishment of Drupal and OpenPublic sites for a growing number of clients, and closely linked with the activities of global leaders in the community, such as Phase2 Technology and Acquia. We have opened up dialogue with both companies to seek their support in the establishment of best practice deployment for our clients and will, in the coming months be acting on their recommendations and service options in this regard.

What does OpenPublic mean for Australian enterprise and government clients?

The Drupal 7 environment already offers a very budget friendly solution with great flexibility, a highly intuitive interface, and the ideal platform for information sharing and knowledge exchange. By aligning our development approach for enterprise and government clients with the OpenPublic community our clients get access to best-of-breed and emerging website management tools. This provides our clients a more long term CMS solution that is both effective now and has a roadmap aligned with the needs of government and public oriented organisations into the future.

OpenPublic is a direct response to the US Government’s Open Government Initiative and there are direct parallels between this and similar mandates placed on Australian government agencies, particularly in regard to the central recommendation of the Government 2.0 Taskforce’s report. Indeed the Australian Government’s 16 July 2010 Declaration of Open Government made by the Hon. Lindsay Tanner, MP, then Minister for Finance and Deregulation, opens with:

“The Australian Government now declares that, in order to promote greater participation in Australia’s democracy, it is committed to open government based on a culture of engagement, built on better access to and use of government held information, and sustained by the innovative use of technology.”

Link Digital is currently developing the new website for the Department of Prime Minister and Cabinet using OpenPublic. Like us, they are encouraged by, and committed to, the ongoing development roadmap it provides.

What are the key features of Drupal with Open Public?

Responsive Design

The number of environments and platforms available today (and tomorrow) means for a site to be relevant it must also be responsive design ready.  OpenPublic allows customisation of the look and feel of a site with base themes, while implementing the best practices in responsive design. The implementation uses ‘contexts’, whereby the device a visitor is using – mobile phone, iPad etc – is noted as a particular context, such as a smaller screen size. An OpenPublic site can present site content within specific layout regions that are optimised for each context. Entirely different content can also be presented if desired.

Security

OpenPublic is pre-configured to better meet the needs of Government-level security. Passwords comply with Level 2 of the US’s National Institute of Standards and Technology (NIST) Electronic Authentication Guidelines, https is setup from the start, and CAPTCHA comes standard on forms.

Accessibility and WCAG Compliance

Australian government accessibility guidelines and regulations and WCAG compliance are key, if not mandated, foundations to usable government websites. In meeting US Government requirements OpenPublic’s default themes meet ADA guidelines for Section 508 Compliance, which goes a long way to giving site implementers a head start on testing for their own compliance. Link Digital is further mapping the compliance of OpenPublic’s default themes against those adopted within Australia. Localisation is a requirement for our www.dpmc.gov.au work and the benefits will roll into the work provided for other clients.

Workflow

OpenPublic allows for the tailoring of permissions and a customisable workflow that meets organisational needs. This kind of functionality is fairly standard in modern CMS platforms but the implementation provided by OpenPublic is extremely relevant to establishing self-managed facilities for Australian government and enterprise clients. Link is able to implement content which is aggregated from nominated RSS, twitter or user contributed sources and queue these for promotion within the site via automated, yet moderated, workflow rules.

Customisable Look and Feel

OpenPublic, similar to WordPress, offers a selection of standard designed themes, or you can apply custom design. For Link clients we apply uniquely designed themes, but the functionality allows us to consider design variations to accommodate seasonal or temporary opportunities to refresh the site design. For example, a design to celebrate a major event hosted by a government department could be adapted and applied during celebrations.

Intuitive Dashboard

You can have all the greatest modules in the world but, at the end of the day, if you don’t have a user-friendly dashboard that makes administering your site easy, you will have a suboptimal outcome. Thankfully OpenPublic does not fall short here and offers an intuitive dashboard for easy content adding and updating. The vast improvement in usability will be particularly apparent to anyone with previous experience in administering a Drupal site. For a number of content types, improved rich-media management means the addition of graphics and data visualisation elements is as simple as clicking “upload.”

Media Room

OpenPublic provides tools for the quick release of breaking news and information that’s important to your website visitors. You can also invite media to access images, video, press releases, and other valuable content in a single, centrally accessible location.

Directory Management

With directory management features OpenPublic sites can publish via profiles – for example via a Ministerial profile for an agency or via program ambassador for a major initiative. This is an effective method for site visitors to develop a deeper and more personal association with an organisation and its objectives. Tools provide the ability to easily and quickly maintain a directory of contact information for hundreds of profiled people and/or organisations.

Pluggable Features

Out-of-the-box, OpenPublic comes with pertinent features – from integrating a Twitter feed to posting blog entries – that you can enable or disable, depending upon your needs. These pluggable options make customisation a process that is less technical and more focussed on meeting your objectives.

Mar 16 2012

Posted by Alli Price on Friday, 16 March 2012 at 5pm

Since the iPad 3/New iPad was announced, there has been a swirl of questions, which actually boil down to just one – 'how are we going to cater to this?'.

It's a real problem – do I want to, or should I, be serving up images at double the size for pixel-dense displays?

The first thing to ask yourself is: what's most important to your visitors? If it's speed, then this throws doubling your images in file size (at least) right out the window. @brad_frost makes this point in his post: just because we can serve this, and the device supports it (has a high enough pixel density), doesn't mean the user would actually want it. This fits in quite nicely with what Luke Wroblewski outlines in Mobile First: users want what they want, fast – connection speed is a clear constraint, and one that applies to the desktop too.

This is the first segment; we can try and cater to them by figuring out things like connection speed and even if we want to, processor power, but there is no perfect, reliable solution (that I'm aware of) for delivering double resolution images quickly to only users on decent connections.

The next segment will be users who want the best experience hands down, to be blown away. These are the people who are excited by the technology and want the most from anything they view on their device. These people aren't new; they've got iPhones with retina displays and a slew of Android devices with Super AMOLED displays. What's odd is that we've got the first wave of dense display devices, but behind the iPad we might see the big push.

Whilst it's weak, the answer is, as usual, common sense: only serve up doubled images where they're going to be most effective, improving the user's experience and adding value to it.

I don't doubt that detection for connection speed will be improved by someone really smart, but right now it's about striking the balance between fast and impressive. This is by no means a new challenge to us as developers/designers. We choose to optimise images before exporting them from Photoshop, or to use a minified JavaScript file. It's down to us to be smart and deliver smart solutions; the end user won't care about anything except the end result.

Instead of seeing higher density displays as a threat to the web, we'll see innovation. We'd rather have the choice to deliver higher res content than not have the option at all.

The road ahead

I'm a firm believer that resolution independence is the best way to adapt. Let's face it, Apple will no doubt supply a fair slice of the high resolution devices out there, but there's a massive number of resolutions we could cater for.

Resolution independence will let us shrug off the doubling, tripling or quadrupling of our images further down the line. But right now it's not especially easy. I'm putting my stock in SVG + CSS3 as a solution, which is improving but not quite there yet.

Exciting times!

Feb 04 2012

Posted by Graeme Blackwood on Saturday, 4 February 2012 at 6pm

One of the most important and complex aspects of a DrupalCon is the schedule. An enormous amount of work goes into getting it right – from the huge number of session submissions, which have to be reviewed and selected by the track chairs and their teams, to the people whose job it is to carefully consider and decide time slots for all of them.

Once all of this work has taken place, the schedule then needs to be presented, in print, on meter boards, posters and in the delegate guide, as well as on the website and mobile app.

With around 70-80 sessions over three days and eight tracks, with three possible skill levels and multiple presenters, all split up into different time slots, and sometimes sub-time slots, presenting this lot is not a simple task. I had some great people working with me on the London schedule and I think we did a pretty good job.

For Denver, the plan was to take the schedule a bit further, making it responsive so that the layout adjusts to the size of the screen you are viewing it on. This is particularly useful for mobile phones and tablets, on which the user experience would be very poor if the design wasn't responsive. Initially the Denver team were looking at a table format for the schedule, similar to the Chicago approach: http://chicago2011.drupal.org/schedule. This layout is really good, but tables don't do well with responsive design. Tables have no way of rearranging themselves – if the width of the table shrinks, the cells just squash horizontally until they are stopped by the longest word in each. This looks pretty horrible and usually breaks a website's layout on smaller screens.

DrupalCamp Austin used a semi-table layout and, importantly, it doesn't actually use table markup, meaning it can collapse. This worked well because the number of sessions in a given time slot was limited. Denver's maximum is seven sessions in a single time slot, which, even in a 960px layout, would really squash them in on a single row and force them to collapse almost immediately on the slightest resize.

Drupalcamp Austin's horizontal schedule layout

So a different method was needed. Initially taking the approach of a mobile web app, I put together an example schedule using Denver's branding to help demonstrate how it could collapse on smaller screens. The main difference in this layout is that instead of side by side, the sessions are stacked, divided by the time slots. The track icons were produced for Drupalcon Chicago and it felt really right to pick them up again for Denver.

The Denver team then adapted the prototype to fit the website and extended the icon set to cover the new tracks. While implementing, they made some subtle improvements to my prototype, like the track title on hover: http://denver2012.drupal.org/program/schedule

Drupalcon Denver web app prototype

There are definitely more improvements to be made. The hit area isn't very large on the sessions (only the title), so it's not always easy to press with your finger; wrapping everything in an a tag would resolve this. The rooms aren't displayed yet, which would be pretty useful to help you find your way around and some of the sessions don't fall into specific time slots, so we are working on adding these soon. Also the filters are yet to be implemented on the Denver site, but it is worth looking at the prototype on a mobile device to see how I envisaged them working.

This is of course, just one example of a schedule for one event format, but if you are reading this from inside or outside the 'Drupalsphere', I hope you found some of the ideas useful.

Categories: drupal, DrupalCon, Mobile, Responsive, UX

Jul 02 2011

Posted by Tim Deeson on Saturday, 2 July 2011 at 10am

Drupal’s core search technology is a good fit for small to medium sites, or where the search requirements aren’t particularly sophisticated. The benefits of core search are zero setup and no additional server requirements; the node content is indexed in the database.

However, for busy sites, sites with a lot of content, or if features such as faceting are required, then Drupal can be combined with Apache Solr, a specialised search platform.

Apache Solr provides scalability and performance benefits over core database search, as well as providing some features that are difficult to deliver (or difficult to deliver with acceptable performance) using core search.

What is Apache Solr?

Apache Solr is a search platform focused on delivering enterprise class, high performance search functionality. The software was originally created as an internal CNET project and then donated to the Apache Foundation in 2006. The Apache Solr Drupal integration module makes it relatively easy to replace Drupal core search with this external search platform.

Apache Solr runs as a separate service from the web server and the database, so requires some extra resources. Normally this means a dedicated server rather than cheaper shared hosting. The fact that it is separate means that it can scale independently of the other two services, from being run on its own dedicated server through to its own cluster.

Use Apache Solr to help sell content with Drupal and UberCart

Drupal’s e-commerce module, UberCart, has two great features that make selling content online easy. Firstly, you can sell file downloads; for example, you can sell PDFs with training content. We recently launched Soccer Coaching Club for Green Star Media, providing users with the ability to find and purchase the right piece of content from thousands of potential choices.

UberCart also allows you to change a user’s role for a certain period. For example, users can buy a subscription to allow access to premium videos or articles for a month via UberCart and be automatically reverted to a ‘free’ role after this period.

UberCart’s content purchasing functionality, combined with Apache Solr’s powerful content filtering, represents an exciting and entirely open source solution.

Search facets with Apache Solr and Drupal

Facets are attributes of content that allow filtering alongside the user’s search query. A well known example is Amazon: a search for ‘John Grisham’ brings up results from the Book, Film & TV and MP3 Downloads categories. By selecting ‘Book’, you eliminate the results related to DVDs and audio books that are in the other categories. You can also filter for certain delivery options or customer ratings.
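The mechanics behind this are straightforward to sketch. The snippet below is not how Solr implements faceting internally; it is a minimal illustration, with an invented result set, of the two things a facet engine does: count the results under each attribute value, and narrow the result set when a value is selected.

```python
from collections import Counter

# Hypothetical mini result set; titles and categories are illustrative only.
results = [
    {"title": "The Firm (paperback)", "category": "Book"},
    {"title": "The Firm (DVD)", "category": "Film & TV"},
    {"title": "The Firm (audiobook)", "category": "MP3 Downloads"},
    {"title": "The Client (paperback)", "category": "Book"},
]

# A facet engine reports each attribute value alongside its result count...
facet_counts = Counter(doc["category"] for doc in results)

# ...and selecting a value (here 'Book') filters the results down to matches.
books = [doc for doc in results if doc["category"] == "Book"]
```

The value of facets over plain keyword search is exactly this count-then-filter loop: users see how many results each refinement would leave before they commit to it.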

The Apache Solr module provides Drupal facet data to Apache Solr: for example, the content type (e.g. News, Blog post or Research paper), the publication date and, most usefully, all of the taxonomy terms (tags) associated with that piece of content. This means users can combine a keyword search with the site’s taxonomy to further narrow their results, providing a very usable and powerful tool.
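At the HTTP level, each selected facet becomes an extra fq (filter query) parameter on the Solr request, alongside the user's keyword query. The sketch below builds such a request URL with Python's standard library; the field names (content_type, taxonomy_term) and the core path are placeholders, not the actual field names or URL the Drupal module uses.

```python
from urllib.parse import urlencode

# Illustrative Solr select-handler parameters. q, fq, facet, facet.field
# and wt are standard Solr parameters; the field names are invented.
params = [
    ("q", "coaching drills"),            # the user's keyword query
    ("fq", "content_type:article"),      # filter query from a selected facet
    ("fq", "taxonomy_term:defending"),   # selected facets stack as extra fq params
    ("facet", "true"),
    ("facet.field", "content_type"),     # ask Solr to return counts per value
    ("facet.field", "taxonomy_term"),
    ("wt", "json"),
]
query_string = urlencode(params)
url = "http://localhost:8983/solr/select?" + query_string
```

Passing a list of tuples (rather than a dict) to urlencode is what allows the repeated fq and facet.field parameters that stacked facets require.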

Performance improvements with Apache Solr

On high-traffic sites, search queries run against the database can start to degrade the site’s overall performance if the database becomes the bottleneck. The same is true for lower-traffic sites with a lot of content, where complex search queries can be slow to run.

Requirements for features such as faceted search (see above) are becoming increasingly common. This can be delivered in conjunction with Drupal core search using the Faceted Search module, but the scalability and performance implications of that approach are well documented by the module’s maintainers.

Other useful Apache Solr features

Apache Solr also supports indexing and searching across multiple sites (imagine an internal intranet site and an external corporate site), indexing attachments (e.g. PDFs, Excel documents) and recommended-content blocks driven by a node’s taxonomy. The module page and the Acquia Search overview both give a good summary of the Apache Solr features that Drupal supports.

How does Apache Solr fit with Acquia Search?

Acquia Search is a cloud-based ‘Platform as a Service’ (PaaS) delivery of Apache Solr. It is essentially the same software, backed by an SLA; the difference is in where Apache Solr is hosted. The main benefits are ease of setup and scalability: no local installation or management of Apache Solr is required, you just enter a license key into your Drupal site. And because it is hosted by Acquia (on their Amazon EC2 infrastructure), you don’t have to worry about scaling or managing the load of your Apache Solr usage.

In many cases, the fact that Acquia Search simplifies the hosting stack, potentially reduces hosting costs and Just Works™ makes it the default choice over a local Apache Solr install. Projects that require bespoke Solr configuration, or teams unwilling to rely on a third-party service, should consider a local Apache Solr install instead. For example, a project we worked on recently required a custom Solr synonyms configuration file to ‘educate’ the search in a niche subject’s terminology. This currently isn’t possible with Acquia Search.
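To give a flavour of what such bespoke configuration looks like: a Solr synonyms file is a plain-text mapping, applied at index or query time, in which comma-separated terms are treated as equivalent and => defines a one-way rewrite. The terms below are invented for the example, not taken from the project mentioned above.

```text
# synonyms.txt: comma-separated groups are treated as equivalent
small-sided game, ssg, small sided games
# explicit mapping: the left-hand terms are rewritten to the right-hand term
keeper, goalie => goalkeeper
```

Because this file lives in Solr's own configuration directory rather than in Drupal, it is exactly the kind of customisation that needs direct access to the Solr instance.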

The difference between Apache Solr and Apache Lucene

For most people, there isn’t one. Lucene is the indexing and search library that Apache Solr uses to deliver its search functionality; Solr can be considered the ‘service’ wrapper around the Lucene engine. They were originally separate projects but have since merged. Outside of the technical community, people generally use the terms Solr and Lucene interchangeably: if you are using Solr, you are implicitly using Lucene.

Categories: apache solr, drupal, ecommerce, ubercart
