Apr 10 2019

Lots of people in the Drupal community are eager to learn React these days, following Dries's announcement that React is coming to Drupal.

At NEDCamp in 2018 I presented on how to dip your toe into embedding a React application into a Drupal site (video on drupal.tv).

This is the long-delayed blog post following up on that presentation.

Our approach was fundamentally this:

  • we expected to eventually embed multiple React apps on the site, so we wanted to treat our base React libraries as shared across the site
  • we needed to marry React routing and Drupal routing so that we could occupy a whole "namespace" of the site
  • we wanted Drupal to store all the entities managed by the front-end, so we had to settle on storage and an API

React Libraries

We wrote a small react_libraries module to expose the React libraries that we thought we would use everywhere and wanted to be consistent on every site.

Besides the .info.yml file for the module, the only other thing in the module is the libraries' definitions.

# react_libraries.libraries.yml
# Note: the exact version pins were garbled in this archive; pin to the
# release you actually want rather than a bare major tag.
react:
  js:
    https://unpkg.com/react@16/umd/react.production.min.js:
      external: true
    https://unpkg.com/react-dom@16/umd/react-dom.production.min.js:
      external: true
react-dev:
  js:
    https://unpkg.com/react@16/umd/react.development.js:
      external: true
    https://unpkg.com/react-dom@16/umd/react-dom.development.js:
      external: true

All this does is suck in the libraries from a CDN, in both a prod (react) version and a dev (react-dev) version.

Our smaller apps just depend on this module and then attach the react_libraries/react(-dev) libraries as needed (you'll see that next).

One lesson learned: we started with create-react-app, so to make this work we had to eject from it and remove the libraries that are normally part of the built app bundle. Next time, we will build our app up from scratch rather than using the scaffolding.

Routing

React apps typically handle routing themselves: they change the URL with JavaScript and decide what to render for a given route. But, since the page never refreshes, it's all just one path to Drupal. The problem comes in when a user bookmarks a URL and expects it to work (and it should). To handle this scenario, we claimed a "namespace" in the routing by declaring /my-react-app/* to belong to our React app. In Drupal 7, this would've "just worked," since any registered path assumed that anything appearing in the URL after it was just arguments to the route. In Drupal 8, this is no longer true, so we have to fake that old behavior.

To do that, we need a custom module. As part of that module, we can define routing--and we tell our route that there is a single argument passed ({react_route}), with a default value of "" if it is not passed at all (i.e., you navigate to /my-react-app by itself).

# my_react_app.routing.yml
my_react_app.overview:
  path: '/my-react-app/{react_route}'
  defaults:
    _controller: '\Drupal\my_react_app\Controller\MyReactAppController::overview'
    _title: 'My React App'
    react_route: ''
  # A route needs at least one access requirement to be reachable;
  # adjust this to suit your site.
  requirements:
    _permission: 'access content'

But, alas - this does not match the path /my-react-app/pathpart1/pathpart2 - so this is not complete yet.

Next, we need to create an Inbound Path Processor, by dropping a new class in our module's src/PathProcessor/ folder.

<?php

namespace Drupal\my_react_app\PathProcessor;

use Drupal\Core\PathProcessor\InboundPathProcessorInterface;
use Symfony\Component\HttpFoundation\Request;

class MyReactAppPathProcessor implements InboundPathProcessorInterface {

  public function processInbound($path, Request $request) {
    if (strpos($path, '/my-react-app/') === 0) {
      // Collapse everything after the namespace into a single parameter.
      $names = preg_replace('|^\/my-react-app\/|', '', $path);
      $names = str_replace('/', ':', $names);
      return "/my-react-app/$names";
    }
    return $path;
  }

}

What the above code does is strip out the "namespace" part of our route, and then replace all the forward-slashes with colons so that it appears as a single route parameter. Essentially, PHP is just throwing this out anyway, since it's the front-end that will be using this route information in JavaScript.
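One thing the code alone won't do: Drupal only discovers an inbound path processor if it's registered as a tagged service, so the module also needs a services file along these lines (the service name and priority here are ours):

# my_react_app.services.yml
services:
  my_react_app.path_processor:
    class: Drupal\my_react_app\PathProcessor\MyReactAppPathProcessor
    tags:
      - { name: path_processor_inbound, priority: 200 }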

The Controller

The final missing piece is - well, what DOES Drupal actually serve up for markup when we hit anything at /my-react-app/*? That's defined by the controller that your routing.yml file refers to. Your controller class gets dropped in your module's src/Controller folder.

<?php

namespace Drupal\my_react_app\Controller;

use Drupal\Core\Controller\ControllerBase;

/**
 * Controller for My React App.
 */
class MyReactAppController extends ControllerBase {

  /**
   * Renders the React app.
   *
   * @return array
   *   The render array.
   */
  public function overview() {
    $build = [];
    // @TODO - do dev / prod here. (react or react-dev)
    // Ideally, make this configurable somehow.
    $build['#attached']['library'][] = 'react_libraries/react-dev';
    // This is where you attach the additional library from your
    // module that contains the non-React-libraries code.
    $build['#attached']['library'][] = 'my_react_app/actualapp';
    // Finally, drop your main mount point for React.
    // This ID can be whatever you use in your app.
    $build['#markup'] = '<div id="root"></div>';
    return $build;
  }

}
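The 'my_react_app/actualapp' library above is your compiled app bundle, defined in the module's own libraries file. A minimal sketch (the bundle file name is ours), with React itself deliberately excluded since it comes from react_libraries:

# my_react_app.libraries.yml
actualapp:
  js:
    js/main.bundle.js: { preprocess: false, minified: true }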

Fin!

At this point, you're now free to write that whole slick frontend!

One last thing to mention: an alternative to all this is to mount a React application onto a node. Using JD Flynn's React Mount Node module, you simply specify a node, the div ID, and the library you've registered for your React app. You will need React fully bundled, or you'll need to attach your react_libraries on every page or through some other mechanism, and the routing isn't handled with as much elegance - but if you have simpler needs, this is a great way to go!

Apr 08 2019

Christina

Drupal 8 ships with a custom CKEditor build. This build is configured with a build-config.js file. We recently ran into a situation in which we wanted to override this configuration in order to disable a plugin. There is some information in this build-config.js file about replacing it with a non-minified build for development purposes, but nothing about overriding it. Here is how we did it.

The plugin we wanted to disable was the Show Table Borders plugin. This is the feature that provides default dotted borders around all table cells while you're editing a table in CKEditor. We wanted to disable that and, instead, just show the table borders as they would be styled on the front-end. Upon inspection of the build-config.js file that Drupal uses, which is located at core/assets/vendor/ckeditor, we found that the plugins key contained showborders: 1. To disable it, we needed to rebuild CKEditor with this line removed.

To do that, we saved a copy of build-config.js to our theme in a similar location: assets/vendor/ckeditor. We removed the line which enables showborders. Then, we went to CKEditor's online Builder and used the Upload build-config.js feature to download a newly generated copy of CKEditor that would exclude the Show Table Borders plugin. We placed the downloaded files in our theme's assets/vendor/ckeditor directory.

The last step is to override Drupal's core CKEditor build from within the theme's info.yml file. Add the following lines (modified for your theme):

libraries-override:
  core/ckeditor:
    js:
      assets/vendor/ckeditor/ckeditor.js: /themes/custom/YOUR_THEME_NAME/assets/vendor/ckeditor/ckeditor.js

Flush the caches and the plugin should be gone!

Nov 15 2018

Chris

It's almost time for NEDCamp, and I can't wait!

Redfin will be presenting a session there on our toe-dipping foray into the world of "progressively decoupling" Drupal.

Recently, I was on an episode of Talking Drupal to explore a little bit more about React and Drupal together--this should whet your appetite for the session at NEDCamp. Give it a listen!

At the presentation, you can expect a deeper dive into some of the code and the real implementation that wires these two technologies together. I look forward to seeing you there!

Mar 29 2018

It all started with an innocent tweet:

https://twitter.com/mirisuzanne/status/948637526612324352

"Excited to announce our new open-source, Sass-driven pattern-library generator! Go design some systems!"

I follow Miriam on Twitter because I love everything she's ever done. At Redfin, we were huge fans of Susy, right up until she told us not to use it any more. And, like everyone else in the Drupal community and web developer community at large, we're hearing more and more about Atomic Design and the use of pattern libraries to build websites. We're encouraged to build and use canonical and living style guides. Many tools have come forward and, in the Drupal world, it looks like Pattern Lab has been a big winner.

At Redfin, we've tried a number of these tools, including Sam Richard's Style Prototyping approach, and attended trainings for Pattern Lab. We've also experimented with KSS for documenting Sass and generating a style guide.

Why Herman

What attracted me to Herman was the common predicament of the small-to-medium project and its budget's ability (or inability) to deliver on these prototypes. From the Herman announcement on Oddbird:

Creating the beautiful Salesforce Lightning Design System requires an ongoing full-time dedicated team. Those of us doing agency work don't often have that luxury, but our clients still need a system that will work for them.

So how can we make design systems part of an agile process – growing slowly along-side the application, the same way we write and maintain test-coverage as part of any project? How do we make documentation and design consistency the path of least resistance?

I'm a big believer in systems. That is, I'm a big believer in systems that work. If a human doesn't want to use a system because it's too much of a hurdle, the system has failed, not the human. So, the idea of "the path of least resistance" is appealing to me (or perhaps I'm just lazy but, nonetheless, systems should be built for the lazy).

So, Herman came along with all this promise and sparkle and I decided to give it a whirl. For starters, Herman is based largely in the foundations of SassDoc. SassDoc shares a similar purpose with KSS, though, having now played with it, I find its syntax just a bit easier to understand. Perhaps, since I've learned PHP Annotations for Drupal, the annotations in SassDoc feel natural.

Getting Started with SassDoc

To this end, Herman is actually just a "theme" for SassDoc. So, to get started, you are going to initialize a new SassDoc project. Like most of the front-end world today, you initialize a new project using a tool like Yarn or npm. At Redfin, we use Yarn, so we initialized our project using "yarn init" and answering the questions as appropriate.

Once we were initialized, we added in our two dependencies - SassDoc and the Herman theme:

yarn add sassdoc sassdoc-theme-herman

Once that finishes, you have scaffolded out a Herman project… kind of. What you now need is all your Sass! Create a sass folder to get started and put a style.scss file in there. We'll start with something simple:

.button {
  border-radius: 5px;
  background-color: green;
  color: white;
  font-weight: bold;
}

Here's our first simple component we'd like to document. Maybe, if you were lucky, you had SOME kind of note in there before, like // typical button styles or something.

SassDoc uses a "three-slash" syntax to pull comments in as documentation. So, let's enhance that a bit.

/// Components: small, re-useable components used on the site.
/// @group components

/// @name Button
/// @group components
/// @example html
///   <a href="#" class="button">Click me</a>
%button {
  border-radius: 5px;
  background-color: green;
  color: white;
  font-weight: bold;
}

The first comment, which is offset by a newline from the rest, is called a "free-floating comment." It's just "out there," and not attached to anything. However, note that using the "group" annotation (@group components) I was able to assign it to belong to a group. Using other annotations, like name and example, I'm able to generate my style guide (at the end of the day, just a static site).

To generate, you need to be in the root of your project and run:

node_modules/sassdoc/bin/sassdoc sass --theme=herman

And this gives you the following static site (find it by visiting /sassdoc/index.html off your site's root):

Moving On

Let's get something different put together, a little more advanced. Let's throw in a mixin.

@mixin embed-container($width, $height) {
  $ratio: ($height / $width) * 100%;
  position: relative;
  padding-bottom: $ratio;
  height: 0;
  overflow: hidden;
  max-width: 100%;

  iframe, object, embed {
    position: absolute;
    top: 0;
    left: 0;
    width: 100%;
    height: 100%;
  }
}

This mixin is inspired by Smashing Magazine's timeless article on the subject.

Now, let's annotate! Put this directly above your mixin.

/// Mixins: custom re-useable but configurable tools we use.
/// @group Mixins

/// Helper mixin to drop on the wrapper for an iframe
/// that you would like to be responsive.
///
/// @group Mixins
/// @author Smashing Magazine
///
/// @param {Length} $width - Element's width
/// @param {Length} $height - Element's height
/// @output CSS for the wrapper and the inner iframe that maintains the aspect
///   ratio as it is resized.
///
/// @example scss -
///   .embed-container {
///     @include embed-container(400px, 300px);
///   }

The above documentation introduces us to the @param annotation, which allows us to document a parameter: its type, name, default value, and description, using the syntax:

/// @param {type} $name [default value] - description

Also, note that we're not displaying markup here for our @example annotation; rather, we're using scss for the syntax to output. For mixins, this is incredibly helpful as it can show us what the compiled CSS is as a result of using this mixin! Let's go ahead and compile again (node_modules/sassdoc/bin/sassdoc sass --theme=herman).

UH-OH!

» [WARNING] Error compiling @example scss: no mixin named embed-container
  Backtrace: stdin:2

.embed-container {
  @include embed-container(400px, 300px);
}

SIDE NOTE: In addition to being confused, I bet you're already tired of specifying --theme=herman on the command line every time, huh? Let's kill two birds with one stone.

Rather than specifying your Herman parameters on the command line every time, you can specify them in a JSON or YAML file. That way, you only specify -c /path/to/config every time. Of course, at this point, you're just robbing Peter to pay Paul, switching one command-line option out for another.

There's an even better option. Just name your config file .sassdocrc and put it in the root of your project and it will be automatically used. The entirety of that file (so far):

theme: herman

However, we haven't yet solved the problem of "no mixin named." See, the @example annotation from SassDoc doesn't natively support compiling Sass into its CSS counterpart. That's a gift from Herman. In order for Herman to compile the SCSS into CSS, though, each @example must be able to stand on its own, and this was the one area that really tripped me up. Thankfully, Miriam was there to help out.

To make this work, one option is to import the Sass file that we need in order for the example to stand on its own. Change your example to this:

/// @example scss -
///   @import "style.scss";
///   .embed-container {
///     @include embed-container(400px, 300px);
///   }

I'll save you some time before you run off ahead and compile--this still won't work. But, it's easy to fix. Go back to your .sassdocrc and specify a "herman" object with some configuration. (Full details on the herman object configuration.)

Make your .sassdocrc like this now:

theme: herman
herman:
  sass:
    includepaths:
      - 'sass'

The includepaths directive is important so that Herman can resolve your import statements. Want to do one better? You can auto-import a path (or paths) using another declaration but, beware--nothing you auto-include should generate any actual Sass output or it will show up in EVERY example. This is best used for your utility files, like _extends.scss, _mixins.scss, etc. (Refer to our own Bundler Shell theme to see how we lay this out.) For example:

theme: herman
herman:
  sass:
    includepaths:
      - 'sass'
    includes:
      - 'util/mixins'

If you auto-include your util/mixins (really ./sass/util/_mixins.scss) then you can make use of your mixins without needing to put the @import in every @example!

Another Side Note: README

If you are feeling harassed by "[WARNING] Description file not found: `./README.md` given.", it's probably best to have a README.md for your project. It shows up as the text of the index.html page for the top level of your SassDoc project. I just went ahead and created a simple one. This is a SassDoc configuration value and, if you'd rather create an introduction to your style guide that is separate from the main README of your project, you can set descriptionPath in your .sassdocrc file.

Level Up

This is all great but, we need to level up. What else does Herman offer?

No one can say it better than their own README:

In addition to the core SassDoc annotations, our @icons annotation allows you to display SVG icons from a given folder, and we extend the core @example annotation to display compiled Sass/Nunjucks output and render sample components. We also provide a @font annotation for displaying font-specimens, and @colors, @sizes, and @ratios annotations for displaying color-palettes, text and spacing sizes, and modular ratios.

Icons

This one is easy, so we'll start here. Add a comment and use @icons <path-to-icons-relative-to-project-root> and you're there! It auto-generates your icon previews with filenames, even optimizing the SVGs as it goes. (Bear in mind your SVGs should specify a viewBox or they will likely be very, very tiny in the preview.) It expects a folder with one SVG file per icon.
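For example, assuming your icons live in an assets/icons/ folder (the path and group name here are ours):

/// All of the icons available to the application.
/// @group style
/// @icons assets/icons/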

Font Stack Previews

Things get a little trickier from here. For the fonts, colors, ratios, and sizes annotations, you need to generate some JSON that the front-end JavaScript/templates can use. There's a plugin called sass-json doing this for you--taking Sass maps and writing them out as encoded JSON--but you need to export your data for it to work. So, let's dissect the font annotation first.

/// @font key (styles, to, show)

In this case, the 'key' is the variable name of the Sass map holding the information about your font style, and the (styles, to, show) are a list of font weights/styles that you would like to display, for example: (regular, bold, bold italic).

Note that, at least for Google Fonts, the numbers are a more consistent thing to use when outside of the normal keywords of bold and regular. I didn't have success with previews using things like "semibold" or "light." (This is because they only support valid CSS values for font-weight - though there's discussion around that: https://github.com/oddbird/sassdoc-theme-herman/issues/250 ).

Finally, the second line is indented, to show that it's still part of the @font annotation, and it consists of any markup needed for the font to render correctly (JavaScript tag, link tag, etc).

So, in real life, this looks like:

/// @font sans-stack (300, 300 italic, regular, 600)
///   <link href="https://fonts.googleapis.com/css?family=Work+Sans:300,400,600" rel="stylesheet">
$sans-stack: (
  'name': 'Work Sans',
  'source': 'https://fonts.google.com/specimen/Work+Sans',
  'stack': ("Helvetica Neue", "Lucida Grande"),
);

For a web font like this, we use the name (that is, the actual font name you would use if you were to display it in a font-family property), source (this renders as an external link when the preview displays), and stack (which are the fallbacks you've chosen when this font is not available).

Getting that to render, though...

This is all the annotation related to the font, specifically, but now we need to include this Sass map into the "herman" map more globally. There's a handy mixin that Herman provides, called "herman-add" which we can use to do that. After the map, I put:

@include herman-add('fonts', 'sans-stack', $sans-stack);

In order to use this herman-add mixin, you will need to include Herman's utilities (where this mixin is defined), so at the top of my file I put:

@import "../node_modules/sassdoc-theme-herman/scss/utilities/_utilities.scss";

Finally, we need to do a final export of the Herman map into JSON. At the bottom of my Sass file, I put:

@include herman-export;

This ensures that the herman map is exported to JSON so the front-end can pick it up. The Herman team is currently working on improving this process but, for now, this is still a pretty clean way to handle it. If you get a little cuter than I did with your partials, you can have a Sass file that only outputs the herman map JSON so you don't need to pollute your regular CSS with it if you don't want to.

Keep this pattern in mind, because most of Herman's awesomeness depends on it. You'll see as we move on.

Colors

Now that we've established a pattern, we're keen to keep following it. For color palettes to be generated in your SassDoc static site, we'll follow a similar pattern. First, the annotation:

/// @group colors
/// @colors demo-colors
$demo-colors: (
  'alto': #d8d8d8,
  'scorpion': #5b5b5b,
  'tree-poppy': #f36c38,
  'white': white,
  'wild-sand': #f5f5f5,
  'grey-light': #d5dbe4,
);

@include herman-add('colors', 'demo-colors', $demo-colors);

First, I use the @group annotation to put this in the 'colors' navigation over at the left. Then, the actual @colors annotation puts the map key you're going to use to add to the Herman map. We add those colors in a map, and then finally use herman-add to map $demo-colors into $herman. In this way, the herman-export we call at the very end will ALSO now include this color palette in the static site.

Sizes

For text sizes, a great preview can be generated to show you the various headings or sizes you want to use. Sense a pattern yet? Let's look:

/// All the sizes that we have.
/// @group sizing
/// @sizes font-sizes {text}
$font-sizes: (
  'base': 16px,
  'important': 1.8rem,
  'largest': 3rem,
);

@include herman-add('sizes', 'font-sizes', $font-sizes);

Ratios

Ratios behave nearly identically:

/// Ratios we are using.
/// @group sizing
/// @ratios my-ratios
$my-ratios: (
  'line-height': 1.4,
  'gutter': 0.5,
);

@include herman-add('ratios', 'my-ratios', $my-ratios);

The only thing to know is that you can optionally display text sizes (or spacing sizes, or page sizes) as rulers, though the default is to display a text preview. To do this, add the optional "{rulers}" or "{rulers-large}" after the sizes annotation (rather than "{text}" - which is the default).

Nunjucks - Martial arts the templates up a notch

For markup that is more complicated than some simple HTML, you can write a Nunjucks template to generate output for a preview. Let's enhance our button example with a Nunjucks template.

/// @group components
/// @name buttonset
/// @example njk
///   {% set items = [{"name": "do something", "label": "open"}, {"name": "do something else", "label": "close"}] %}
///   {% include 'buttonset.njk' %}
.buttonset {
  li {
    display: inline-block;
    list-style-type: none;
    margin-right: 1em;
  }

  a {
    display: inline-block;
    @extend %button;
  }
}

You'll notice I still put this in the components group but I've turned my regular button into a buttonset. You'll also notice immediately the @example annotation this time specifies the "njk" syntax, meaning "compile Nunjucks code." When using njk in an annotation, you are required to specify a templatepath in your config. (Alternatively, you can specify an entire Nunjucks environment, but to do that you must be using the Node API version, which I am not.) Add this to your .sassdocrc inside herman:

nunjucks:
  templatepath: './templates'

So, I created a "templates" folder off the root of my project and put a simple buttonset.njk file in it. (Dear Drupalists, don't be scared of Nunjucks--it's Django/Jinja-based templates for JavaScript, just the same way Twig is Django/Jinja-based templates for PHP!)

{% block content %}
<ul class="buttonset">
  {% for item in items %}
    <li><a class="button" title="{{ item.name }}">{{ item.label }}</a></li>
  {% endfor %}
</ul>
{% endblock %}

Now that I've configured a templates directory, and my syntax for using the templates is all set up, I get a fully rendered example. It includes (a) the Nunjucks language used to generate it, (b) the fully compiled HTML markup, and (c) a fully rendered example with all of my styles!

For bonus points, check out Nunjucks macros, which should help you further componentize your markup into easily-reproduced snippets. If we do it this way, we can sort of reverse the order. First, we import our njk file which defines our macro:

/// @name buttonset
/// @example njk
///   {% import 'buttonset.macro.njk' as mymacro %}
///   {{ mymacro.buttonset([{"name": "do something", "label": "open"}, {"name": "do something else", "label": "close"}]) }}

...and our Nunjucks template is slightly different, wrapping the block with a macro call. A macro is similar to a "function."

{% macro buttonset(items) %}
<ul class="buttonset">
  {% for item in items %}
    <li><a class="button" title="{{ item.name }}">{{ item.label }}</a></li>
  {% endfor %}
</ul>
{% endmacro %}

The Power

So, by combining all of these elements directly into the Sass you're writing for your agile site, you can document on the fly and have an easy way to:

  • Reference and document your mixins
  • Display typography stacks and document when you should use each
  • Show heading sizes and spacing ratios for vertical rhythm
  • Discuss branding color palettes and describe how each should be used
  • Demonstrate the icon set available to the application quickly
  • ...and so much more!

The Review

So what do I like and dislike about this? What did I learn?

For someone like me, a lot of this was actually quite new coming in. Not just the fundamental concepts that Herman brought, but all of it. I had never used SassDoc before, though I'd played briefly with KSS. I'd never even heard of Nunjucks before, though I had used Twig. But the concepts that give Herman its power also add complexity to an already complex stack. You need to remember that, in a sense, everything is compiling twice: once, your Sass is compiled into output CSS (which you then bring into the static site as custom CSS), and again when all the SassDoc comments are compiled into the static site itself. These two steps are nonetheless sourced largely from the same files, so all the pieces of the puzzle need to fit together just right for everything to work in harmony. Once I fundamentally understood that, the idea of the JSON encoding actually made total sense.

I also spent a lot of time getting bit by the Ruby bug. At Redfin, we sort of skipped over the whole Gulp / Node API thing. We used a lot of Ruby Sass and Compass and Bundler, until we recently switched to a stack based on Dart-Sass. While trying to learn, I tried to strip everything down to its fundamental elements, and that actually got me a few times. I should've started with a modern stack and used the node-sass implementation that I installed with my yarn install, and I wouldn't have had such issues. (With that said, we never would've improved Herman to support Ruby Sass!)

Overall, I believe that this is definitely good enough to go into our next project. Beyond that, I am confident in the Herman team that if I find any bugs as we use it, they will be responded to swiftly, which is hugely important for adoption of something kind of new like this.

UPDATE 04-09-2018: Added additional clarifications from Miriam.

Dec 11 2017

Brett

Have you ever posted a link to Facebook and wondered where the image in the post preview comes from? Ever wondered if that image is even correct or relevant to what you are posting?

Recently, a client of ours pointed out that when users posted links to Facebook in order to share upcoming live shows, the images rendered in the post previews were displaying unrelated content. Specifically, images from the site header (like the site logo) were being defaulted to in the post preview, much to the client's chagrin. Her preference was to have an accurate picture of the artist displayed in the Facebook post instead of a set of unrelated images parsed from other sections of the site.

This is where the Metatag module in Drupal contrib can give you better control over what images get displayed in social media posts. It works by adding meta tags to the page, like <meta property="og:image" content="//mydomain/path/to/desired_image.png" />, for example. To get started, first install the Metatag module and enable the base module. Then, head over to Admin->Config->Search and metadata->Metatag and you will see a set of global properties to change. Alternatively, by clicking the "Add default meta tags" button you can restrict the placement of these tags to the content types of your choice instead of placing tags on all content types globally. Next, click "Override" (or edit) off to the right and expand the "Open Graph" section at the bottom of this page. In the "Image" input box you can enter a token to an image field of your choice from your content type, and the Metatag module will output a tag on content pages with the image URL as seen above. Facebook will scrape this and render the correct image.

Panels and the Metatag: Panels Sub-module

This can also be done if you have panel pages on your site, but you have to enable the Metatag: Panels sub-module first. Instead of going to Configuration to add the image field token as described above, I found that going into Admin->Structure->Panels and editing the panel page from there was the preferred approach. With the sub-module enabled, a "Meta tags" menu item becomes exposed in the vertical tabs. When clicked, it brings you to an identical set of input fields ready to accept site tokens. Make sure your token browser is enabled to search available tokens, and once a token is saved you should inspect your page from the browser to ensure the new meta tag is being output to the head section. With the newly generated image meta tags on your site, anyone posting links to your site will be sharing the relevant images you've designated for the world to see, and that's pretty cool.

There is also a Metatag:Facebook sub-module which can be leveraged to do similar tasks but because our pages were panel pages we just needed the panel sub-module enabled.

If your meta tag is being generated but Facebook isn't picking it up, try the dev tools link below: https://developers.facebook.com/tools/debug

This is a great tool that can tell you the last time the link was scraped among other things. This will help determine if Facebook has cached the link instead of rendering a new post with the newly created open graph image metatag. There are a lot of Metatag sub-modules that come with the main module which can be a bit daunting, but once you find the right one and the right token you can do a lot to better control the use of metatags being generated on your site.

Nov 20 2017

This past weekend, I was honored to be able to present a session at 2017's New England Drupal Camp (NEDCamp) about Drupal 8's Migrate API. Redfin has implemented many Drupal 8 migrations to date both from CSV data sources and legacy Drupal sites (Drupal 6 and 7). As a result, we want to share with you what we've learned in hopes of saving you the time often spent in the trials and errors of data migration.

My Master's degree is in Technology Education, so I understand that people learn things differently. Some people are auditory or visual learners while others like to read. To that end, I wanted to summarize our session here. If you are an audio or visual learner, please check out these resources:

Otherwise, let's peek at the concepts...

The overall process is:

  1. Scaffold out a module (.info.yml file). You can use Drupal Console for this.
  2. Write YAML files to define your migrations.
  3. Set up any global configuration if needed (your legacy database connection credentials in settings.php, a folder for your CSV files, etc).
  4. Extend/alter/write any custom plugins or code to support those migrations.

While you can use the Drupal Migrate UI to do this, I recommend building your Drupal 8 site the way you want (maybe taking advantage of newer paradigms like paragraphs, for example), and then worry about your migrations. There are four main modules that come in to play when not using the UI. Two are in core--migrate, and migrate_drupal. "migrate" lets you get anything into Drupal 8, while "migrate_drupal" extends and supports that to enhance your experience when migrating IN from a Drupal 6 or 7 (or 8!) site.

Two modules in the contrib space help you above and beyond what is provided in core. Migrate Tools provides Drush command integration for managing your migrations, while Migrate Plus provides loads of enhancements for migrating your data (additional plugins, etc). It's important to make sure you're using the right version of the module for your version of Drupal, by the way--but that's easy since you're using Composer, right?

Write Some Migrations

You will need to drop a migration YAML file in your module's /migrations folder. Generally, the YAML file is named after the id you specify for your migration plugin. As a full example, see core's d7_node.yml migration.

The Migrate API follows a traditional software design pattern called Extract, Transform, Load. To avoid some confusion with the concept of "Load" (in this case meaning loading data INTO your Drupal database), there's some different terminology used in Migrate:

  • Extract == Source
  • Transform == Process
  • Load == Destination

One thing that Migrate Plus provides is the concept of a "Migration Group." This allows multiple migrations to share some configuration across all of them. For example, if all migrations are coming from the same MySQL database (say, a Drupal 6 database), then that shared configuration can go into the migration group configuration once rather than into each individual migration.
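With Migrate Plus, a group is just another piece of configuration. A minimal sketch (the group id and database key here are ours):

# migrate_plus.migration_group.legacy.yml
id: legacy
label: 'Legacy Drupal 6 imports'
shared_configuration:
  source:
    key: legacy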

There's some global configuration that goes into each individual migration, for example its "id" key (the unique ID of this migration), and its "label" (friendly name in the UI / Drush).

One thing you can specify also are "dependencies" - for example to a module. You can also enforce "migration_dependencies," which means that before THIS migration is run, THAT one needs to run. This is a great way to ensure references (like entity references, or taxonomy terms) are migrated into the site before anything that uses them.

Each migration then should specify three unique sections--source, process, and destination (sound familiar?).
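Putting those global keys and the three sections together, a skeleton migration might look like this (the ids, group, and node type are ours, not from core):

# migrations/legacy_blog_post.yml
id: legacy_blog_post
label: 'Legacy blog posts'
migration_group: legacy
migration_dependencies:
  required:
    - legacy_tags
source:
  plugin: d7_node
  node_type: blog
process:
  title: title
  created: created
destination:
  plugin: entity:node
  default_bundle: blog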

Source

The source section specifies which plugin to use. These plugins are usually found in the module involved. For example, if you want to migrate data in from Drupal 7 nodes, take a look in core/modules/node/src/Plugin/migrate/source for the implementation / plugin id to use. Often, though, you'll actually find yourself writing a new Class in your own module which extends this one.

Each different source plugin might have some additional configuration that goes along with it. For example, with "plugin: d7_node" you might also specify "node_type: page" to migrate in only basic pages from your Drupal 7 site. You might also specify "key" here to say which database key from the $databases array in settings to use (if that wasn't specified globally in your migration group!).

The purpose of all source plugins is to provide a Row object (core/modules/migrate/src/Row.php) that is uniform and can be consumed by the next part of the process.

If you do write your own migration plugin, two methods I find myself frequently overriding are query() (so I can add conditions to the typical source query, for example - like only grabbing the last year's worth of blog posts), and prepareRow(). The method prepareRow() is your hook-like opportunity to manipulate the Row object that's about to be transformed and loaded into the database. You can add additionally-queried information, or translate certain values into others, or anything you need in prepareRow(). The only thing to beware of is every Row becomes a destination entity, so if you're doing something like joining on taxonomy terms, you're better to do that in prepareRow and add a new property to it with setSourceProperty() rather than, say, LEFT JOINing it on in the query.
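As a sketch of what that looks like in practice (the module, plugin id, and property names here are ours, not from core):

<?php

namespace Drupal\my_migrate\Plugin\migrate\source;

use Drupal\migrate\Row;
use Drupal\node\Plugin\migrate\source\d7\Node;

/**
 * Pulls legacy blog posts, with a narrowed query and an extra property.
 *
 * @MigrateSource(
 *   id = "legacy_blog_post"
 * )
 */
class LegacyBlogPost extends Node {

  /**
   * {@inheritdoc}
   */
  public function query() {
    $query = parent::query();
    // Only grab the last year's worth of blog posts.
    $query->condition('n.created', strtotime('-1 year'), '>');
    return $query;
  }

  /**
   * {@inheritdoc}
   */
  public function prepareRow(Row $row) {
    // Fetch related data here and attach it as a new source property,
    // rather than LEFT JOINing it in query(), since every row becomes
    // exactly one destination entity.
    $tids = $this->select('taxonomy_index', 'ti')
      ->fields('ti', ['tid'])
      ->condition('ti.nid', $row->getSourceProperty('nid'))
      ->execute()
      ->fetchCol();
    $row->setSourceProperty('term_ids', $tids);
    return parent::prepareRow($row);
  }

}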

Destination (yes, I skipped Process)

The destination plugins work largely the same way. You simply specify what entity (usually) you're migrating into. For example, you might have destination: 'entity:node' and any additional configuration related to that destination. One example for entity:node is to add default_bundle: page here so that you don't need to set bundle: 'page' in your process section (which we're about to get to). Similarly, if migrating files, you can specify source_base_path: 'https://example.org' to automatically download images from that path when importing!

Like source plugins, destination plugins have additional configuration here that is tied to the plugin.

There are so many things that are entities in Drupal 8, the possibilities are vast here - you can migrate nodes, users, comments, files, sure. But you can also migrate configuration, form display, entity view modules, anything in contrib, or your own legacy code! Migrate Plus also provides a "Table" destination so you can migrate directly into MySQL tables inside of Drupal (note that this is generally not best practice if you're migrating into entities--you're better off using the entity:whatever plugin so you take full advantage of the entity API).

Process

This is where all the real magic happens, in my opinion. To keep this blog post short (is it too late for that?), I won't go too deep into all the process plugins available, but I will talk about a few special cases and then encourage you to check out the documentation for yourself.

The "get" plugin is the most basic of all. It simply means "take the value off the Row object for property x, and map it to value y." In your real migration's yml file it would look like destVal: sourceVal - which simply means "take what's in $row->sourceVal and put it in the destination's "destVal" property.

The "migration_lookup" plugin goes one simple step further than get and translates the incoming ID to the new ID value on the new site. For example, if you have a migration that migrates person nodes and the nid was 65 for John Smith on the Drupal 6 site, but is 907 on the new Drupal 8 site, a reference to that person (say on the "authors" field of your "Research" content type) would also need to be translated. This plugin transforms the incoming 65 to the correct 907 by referencing a migration that has already been run (remember the migration_dependencies key?).

Multiple plugins can even be chained together to form a "pipeline" of transformations that happen in order. For example, if your old database only had usernames as "Chris Wells" and "Patrick Corbett," but you wanted to make usernames, you could run that through the machine_name plugin to change it to "chris_wells" instead. But, what if there was already a "chris_wells" user? Well, you can then run the new value through dedupe_entity to append a "1" or "2" etc. until it's unique. You can create fairly complex pipelines here in the yml file without having to touch any PHP code.
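Sketched as a pipeline (the source property name is ours):

process:
  name:
    - plugin: machine_name
      source: full_name
    - plugin: dedupe_entity
      entity_type: user
      field: name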

Sometimes a field in Drupal has a "nested value," like the body or link fields. The body field has a "value" and a "format" on it. To map these, you use a slash (/) to separate the field and the sub-field, like 'body/value': description and 'body/format': format -- just be sure and use those "ticks" (apostrophes, single-quotes, whatever you call them) around these types of keys.

Feel free to check out all the core process plugins, and even ones provided by contrib, like: get, migration_lookup, machine_name, dedupe_entity, default_value, concat, explode, flatten, static_map, substr, skip_on_empty, skip_row_if_not_set, menu_link_parent, callback, entity_generate, and entity_lookup!

There's one more special one, formerly known as "iterator" and now called "sub_process." This lets you run a multi-step pipeline against an array of structured data arrays. Make sure to pay special attention to that one.

Put it all together

So by now you've created your shiny new Drupal 8 site just how you want and you've written a module (.info.yml file, really). You've placed all these migrations in it. You can place them in config/install and they will be read in as configuration. You can then edit them as needed using drush config-edit or similar in Drupal Console. You could also uninstall and reinstall your module each time you alter the yml files.

Alternatively, you can place them in /migrations (off your module root) and they will be loaded as plugins instead of configuration. This way is likely preferred, since you can just flush the plugin cache when you make changes to the YML file.

Once you also have your source set up (database, CSV files, XML, whatever), you can start to run your migrations with Drush!

The most commonly-used Drush commands for migrating (in my world) are:

  • migrate-status - where are things at?
  • migrate-import - run a migration (single or whole group)
  • migrate-rollback - "un-run" a migration
  • migrate-reset-status - used to reset a migration to "Idle" state if the PHP code you're writing bombs out or you press Ctrl-C in frustration.

Others I don't use as frequently are:

  • migrate-stop - stops a running migration and resets it to idle (I usually press Ctrl-C and then do a drush mrs (migrate-reset-status))
  • migrate-fields-source - list all the available properties on Row to import (I usually just inspect this in the debugger in prepareRow())
  • migrate-messages - display messages captured during migration (PHP warnings, etc) (I usually just look at the database table where these are stored instead of printing them in terminal)
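In practice, a typical session looks something like this (Drush 8-style commands; the group and migration ids are ours):

# Where are things at?
drush migrate-status

# Run a whole group, or a single migration.
drush migrate-import --group=legacy
drush migrate-import legacy_blog_post

# Roll one back, or reset a stuck one to Idle.
drush migrate-rollback legacy_blog_post
drush migrate-reset-status legacy_blog_post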

WHOA.

So there you have it. Migration in a nutshell! Please do feel free to leave comments and questions below or reach out to us at Redfin if you need help migrating data into your shiny new Drupal 8 site.

Oct 19 2017

Salesforce Suite is a group of modules for Drupal that allows for pulling data from Salesforce into Drupal, as well as pushing data from Drupal to Salesforce. The module API provides some very useful hooks, including hook_salesforce_pull_entity_presave, which the Salesforce Pull module invokes. In this blog post, we'll look at using that hook to pull three Salesforce custom fields (select lists) into Drupal as taxonomy terms in three vocabularies.

Create a custom module to house the hook called <sitename>_salesforce and create a <sitename>_salesforce.module file. In that file, drop in the presave function, as copied from salesforce.api.php in the Salesforce Suite module:

/**
 * Act on an entity just before it is saved by a salesforce pull operation.
 * Implementations should throw a SalesforcePullException to prevent the pull.
 *
 * @param $entity
 *   The Drupal entity object.
 * @param array $sf_object
 *   The Salesforce query result array.
 * @param SalesforceMapping $sf_mapping
 *   The Salesforce Mapping being used to pull this record
 *
 * @throws SalesforcePullException
 */
function hook_salesforce_pull_entity_presave($entity, $sf_object, $sf_mapping) {
  if (!some_entity_validation_mechanism($entity)) {
    throw new SalesforcePullException('Refused to pull invalid entity.');
  }
  // Set a fictional property using a fictional Salesforce result object.
  $entity->example_property = $sf_object['Lookup__r']['Data__c'];
}

Take a look at the example code in the function body but remove it.

The hook gets called during the salesforce_pull_process_records function with this line:

// Allow modules to react just prior to entity save.
module_invoke_all('salesforce_pull_entity_presave', $wrapper->value(), $sf_object, $sf_mapping);

So, that’s where we will intervene with our custom code. With this hook, we have access to the data queried from Salesforce, and the entity that is about to be saved into Drupal, so it's a perfect time to do any translations between the two data sets.

The first problem we have to address is that, by default, the Salesforce Pull module will create a new node as it processes each Salesforce record instead of modifying the existing nodes on your Drupal site. If you don’t want this behavior, add this code:

// first of all, don't create new nodes
if (isset($entity->is_new) && $entity->is_new == TRUE) {
  throw new SalesforcePullException('Tried to create a new node.');
}

You may also want to look at the hook_salesforce_pull_mapping_object_alter hook to aid in prematching nodes.

Then, we need to define our taxonomy vocabularies:

// lookup table
$names_vids = array(
  'exampleVocabularyA' => array('vid' => 1, 'field' => 'field_example_vocabulary_a'),
  'exampleVocabularyB' => array('vid' => 2, 'field' => 'field_example_vocabulary_b'),
  'exampleVocabularyC' => array('vid' => 3, 'field' => 'field_example_vocabulary_c'),
);

Gather the terms from $sf_object like this:

// gather terms
$incoming = array(
  'exampleVocabularyA' => explode(';', isset($sf_object['Terms_A__c']) ? $sf_object['Terms_A__c'] : ''),
  'exampleVocabularyB' => explode(';', isset($sf_object['Terms_B__c']) ? $sf_object['Terms_B__c'] : ''),
  'exampleVocabularyC' => explode(';', isset($sf_object['Terms_C__c']) ? $sf_object['Terms_C__c'] : ''),
);

You’ll want to clean up the incoming data:

// trim whitespace around each term, then drop any empties
// (array_walk_recursive with 'trim' would not actually modify the values)
$incoming = array_map(function ($terms) {
  return array_filter(array_map('trim', $terms));
}, $incoming);

Then, we need to iterate over the incoming terms and create a new term if it doesn’t already exist in Drupal. Finally, we set the tids on the desired nodes:

foreach ($incoming as $vname => $term_names) {
  $tids = array();
  foreach ($term_names as $term_name) {
    // taxonomy_get_term_by_name() returns an array of term objects keyed by tid.
    $terms = taxonomy_get_term_by_name($term_name, $vname);
    if (empty($terms)) {
      // add the term if we don't already have it
      $newterm = new stdClass();
      $newterm->name = $term_name;
      $newterm->vid = $names_vids[$vname]['vid'];
      taxonomy_term_save($newterm);
      $tids[] = $newterm->tid;
    }
    else {
      $tids[] = array_keys($terms)[0];
    }
  }
  // set tids on target nodes
  // first unset all existing tids
  $entity->{$names_vids[$vname]['field']} = array();
  foreach ($tids as $i => $tid) {
    $entity->{$names_vids[$vname]['field']}[LANGUAGE_NONE][$i]['tid'] = $tid;
  }
}

This will keep your Drupal nodes in sync (on each cron run) with any terms added or deleted on the Salesforce objects.

(If you are having trouble getting Salesforce to complete its whole queue on a single cron run, I recommend this blog post for troubleshooting tips: Drupal Salesforce Not Updating Records. In particular, we recommend the Queue UI module.)

Sep 21 2017

Chris

Redfin is happy to announce that thanks to the efforts of vetchneons, we have at long last released a -dev version of the CashNET module for Ubercart in Drupal 7. CashNET is a payment processor used by a lot of institutions in the higher education realm.

We would love for any folks using Ubercart in 7 to test it out, so the module can be promoted to a stable release. 

And of course, if anyone wants to also port this to Drupal Commerce, they'd be more than welcome. :)

Head on down to the UC CashNET project page and give it a download!

Jul 28 2017

Brett

In Drupal 8 there are a handful of ways you can install contrib modules to your project, and here we'll discuss some of the pros and cons of each.

When installing a D8 contrib module "manually," you have to navigate to a site that hosts the module's compressed tar.gz or zip file, like drupal.org/project/project_module or GitHub, and download the file of your choice. Next, the module has to be extracted, usually by double-clicking the compressed icon, and then the new, uncompressed module folder that appears may need to be renamed to just its machine name (for example, from paragraphs-8.x-1.1 to just "paragraphs"). This module folder must then be moved into the site's /modules directory and enabled using either Drush, Drupal Console, or the Drupal admin UI. This manual approach isn't completely terrible because it does work and you have control over each step, but it isn't the recommended strategy, mainly because the entire module codebase has to be committed into version control, which isn't ideal. More on this in a bit.

By utilizing the extend tab in the admin UI to install a D8 contrib module, you can automate some of the steps encountered within the manual strategy but the same major problem exists - the entire codebase has to be checked into version control. It’s worth noting that this functionality allows you to point to an already downloaded tar.gz/zip file or provide a link to one hosted remotely. When the file is then uploaded, Drupal will extract the module for you and will place it into the correct directory as well. For this to work, though, you need to make sure your web user can write to the modules folder, which is generally frowned upon from a security perspective.

The previous methods definitely work and any developer should be aware of them but with D8 came the ability to use Composer to manage Drupal modules. It’s technically a dependency manager that works in much the same way as Node.JS’s "npm" and Ruby’s "Bundler."

Once installed from getcomposer.org, you can find, install, update, and remove modules from the command line, among other things, which is a big win, especially when many dependencies need updating. Also, instead of having to commit the entire codebase of the module(s) into version control, you only have to commit the manifest files (composer.json and composer.lock). The .json file lists out what dependencies the project needs and the acceptable version ranges of each, while the .lock file records what versions actually got installed. It's very bad practice to commit dependencies that aren't yours into your repository, not least because it increases the size of the repository significantly.

To include a new dependency, simply execute `composer require drupal/module_name` and Composer will add the module to the .json file. Then execute `composer install` to get the module codebase from the Drupal packages API (NOTE: not "Packagist," which is the default Composer package repository) and to update the .lock file. Running `composer update` is great because it will update all the modules to their latest versions allowed by the .json file, install them, and update the .lock file. Composer, however, will not enable the modules for you in Drupal, so that is where Drush can come to the rescue. Lastly, you can still use Drush or Drupal Console in D8 to install modules, but Composer is definitely the preferred tool because of these added benefits.
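A typical day-to-day loop, then, looks something like this (the module name is just an example):

# Add a module to composer.json and download it.
composer require drupal/paragraphs

# On a fresh clone, install exactly what composer.lock records.
composer install

# Update everything within the ranges composer.json allows.
composer update

# Composer won't enable modules, so finish with Drush.
drush en paragraphs -y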

Further Reading

May 02 2017

Recently, one of our Enterprise clients asked for some help installing SSL certificates on their Acquia-hosted Stage and Development environments. This is not something that Acquia seems to provide (they do provide basic SSL on dev/stage environments, but not with hostname matching), so we set out to get them set up. They use their dev and staging environments to demonstrate new content and features to stakeholders, and some were getting scared off by the SSL certificate warnings.

Rather than pay, we decided to try it out with Let's Encrypt, which if you haven't heard, is the amazing and relatively-new Certificate Authority that provides FREE CERTIFICATES, and has a mission of enabling SSL everywhere.

[Image: encrypt all the things!]

Get Certbot

The first thing you need to do is download certbot. Certbot is a command line tool from the EFF for managing SSL certificates with Let's Encrypt. At Redfin, we use Macs with Homebrew, so the easiest way to get the tool was to enter `brew install certbot` into a terminal. Now, there's a "certbot" global executable to use.

If you follow the "download certbot" link above, and for example enter "Apache" and "Ubuntu 14.04," you'll get instructions for how to install certbot on other platforms.

Once you have certbot downloaded, you need to run the "manual" method of validation. This feels like the old familiar way of verifying site ownership--adding some files to a particular directory. Let's Encrypt then calls out to that URL, and if it finds you put the right thing there, then it assumes you have control of that website, and provides you with the certificate.

On your local machine, run the certbot command that does manual verification: `sudo certbot certonly --manual -d SITEstg.prod.acquia-sites.com -d  SITEdev.prod.acquia-sites.com` (where SITE is dependent on your specific Acquia setup). You'll keep this command running as you perform the next steps.

The "certonly" and the "--manual" are the main influencers here. Note that you can add as many -d's and domains as you need. If you have more dev environments than the standard stage/dev in Acquia (my client did), you can just keep adding the -d's. Note that also on my Mac I had to run this with 'sudo' in front of it, because it writes to /etc. You can also specify some additional parameters on the command to put these files in a separate location if you need.

Allow in .htaccess

This starts the process of verifying your sites. As you step through, it will give you some long, hash-y looking text strings that need to be available at a particular URL. According to the spec, this is at a .well-known/ folder off your site root. In order to allow Drupal to see this, you may need some changes to your .htaccess file. 

If you're using a Drupal 8.3.x site (newer than Feb 9 2017), the issue has already been fixed. See https://www.drupal.org/node/2408321 for more information.

If you're using Drupal 7, then as of this writing it has not been fixed in core. See https://www.drupal.org/node/2847325 for more information. Essentially you need to allow .well-known in the FilesMatch directive at the top of .htaccess, and then exclude it from the main RewriteRule later down in the file.
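For reference, the rule that landed in Drupal 8 core's .htaccess looks like the line below; for Drupal 7, you'd hand-apply the equivalent change from the issue above (reproduced here for convenience, not as the exact D7 patch):

# Block access to "hidden" directories whose names begin with a period,
# but let ACME challenges under .well-known pass through.
RewriteRule "/\.|^\.(?!well-known/)" - [F]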

Make verification "file" visible

The next thought you might have is, "OK, now I need to put all the files that need to be visible in that .well-known/acme-challenges/ry784yfy7...fdhj directory." Except, you don't really. (Pro tip: you do not need to enable live development mode on 4 environments at once and crash the server.)

The reason why not? The fabulous Let's Encrypt Challenge module. This lets you use the Drupal UI to enter your challenge information, or upload files to sites/default/files, to answer the challenges. Download that module and push it to a branch, and set all of the Acquia environments you're enabling Let's Encrypt SSL for to use that branch. Enable the module on each dev/stage site and, as you walk through the certbot command (it gives you a challenge for each domain), log in to the site, enter the challenge, and hit save. You can then pull up the URL that certbot gives you, in order to verify that the module is doing what it promises. (Important note here: if you accidentally pull up the URL before you've changed .htaccess or enabled the module, Acquia's Varnish is going to cache the bogus response and validation won't work. If you accidentally do this, be sure to flush the Varnish caches at Acquia for the environment where you got an itchy trigger finger.)

When the certbot process completes, it will tell you where you can find the certificate files needed. These are stored locally on the machine where you run certbot, in the case of a Mac with certbot installed with Homebrew, in /etc/letsencrypt/live/FIRSTDOMAIN (where FIRSTDOMAIN is the first domain you passed into your certbot command, above). 

Tell Acquia You Got the Goods

This is a two-part process. By logging into your Acquia console, you must go to each individual environment and go to the SSL section, in the nav at the left-hand side. From there, you can click "Install SSL Certificate" at the top. You will be prompted to enter four pieces of information: (1) a name for your certificate (e.g. "LE 05022017," because it's Let's Encrypt and the day it was created), (2) the private key for the certificate (use privkey.pem from the folder above, where certbot put all this info), (3) the certificate itself (use cert.pem), and (4) the "chain" certificate (often called the "intermediate" certificate), which establishes security from your certificate all the way to a Root Certification Authority (use chain.pem). NOTE: with Acquia, you will not use fullchain.pem. This is simply a file that concatenates all the information together into a single certificate file.

The second part of this process is to click "Activate" next to the certificate once the "installing" activity is completed.

Again, this needs to be repeated for each environment, but with the same certificate information.

In the below screenshot I've tried to call attention to some relevant parts of the SSL screen in the Acquia console:

the Acquia console with a Let's Encrypt certificate successfully installed and activatedNote the SSL navigation on the left column, the name of the certificate "LE 05022017," the "Deactivate" link (this is where you would find the "Activate" link after the certificate installs), and the "Install SSL certificate" link at the top.

We hope this proves helpful in getting some basic SSL certificates installed in your own Acquia environments!

May 02 2017
May 02

Recently, one of our Enterprise clients asked for some help installing SSL certificates on their Acquia-hosted Stage and Development environments. This is not something that Acquia seems to provide (they do provide basic SSL on dev/stage environments, but not with hostname matching), so we set out to get them set up. They use their dev and staging environments to demonstrate new content and features to stakeholders, and some were getting scared off by the SSL certificate warnings.

Rather than pay, we decided to try it out with Let's Encrypt, which, if you haven't heard, is the amazing and relatively new Certificate Authority that provides FREE CERTIFICATES and has a mission of enabling SSL everywhere.

[image: encrypt all the things!]

Get Certbot

The first thing you need to do is download certbot. Certbot is a command line tool from the EFF for managing SSL certificates with Let's Encrypt. At Redfin, we use Macs with Homebrew, so the easiest way to get the tool was to enter `brew install certbot` into a terminal. Now, there's a "certbot" global executable to use.

If you follow the "download certbot" link above, and for example enter "Apache" and "Ubuntu 14.04," you'll get instructions for how to install certbot on other platforms.

Once you have certbot downloaded, you need to run the "manual" method of validation. This feels like the old familiar way of verifying site ownership--adding some files to a particular directory. Let's Encrypt then calls out to that URL, and if it finds you put the right thing there, then it assumes you have control of that website, and provides you with the certificate.

On your local machine, run the certbot command that does manual verification: `sudo certbot certonly --manual -d SITEstg.prod.acquia-sites.com -d SITEdev.prod.acquia-sites.com` (where SITE depends on your specific Acquia setup). You'll keep this command running as you perform the next steps.

The "certonly" and the "--manual" are the main influencers here. Note that you can add as many -d's and domains as you need. If you have more dev environments than the standard stage/dev in Acquia (my client did), you can just keep adding the -d's. Note that also on my Mac I had to run this with 'sudo' in front of it, because it writes to /etc. You can also specify some additional parameters on the command to put these files in a separate location if you need.

Allow in .htaccess

This starts the process of verifying your sites. As you step through, it will give you some long, hash-y looking text strings that need to be available at a particular URL. According to the spec, this is at a .well-known/ folder off your site root. In order to allow Drupal to see this, you may need some changes to your .htaccess file. 

If you're using a Drupal 8.3.x site (newer than Feb 9 2017), the issue has already been fixed. See https://www.drupal.org/node/2408321 for more information.

If you're using Drupal 7, then as of this writing it has not been fixed in core. See https://www.drupal.org/node/2847325 for more information. Essentially, you need to allow .well-known in the FilesMatch directive at the top of .htaccess, and then exclude it from the main RewriteRule further down in the file.
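Sketched out, the .htaccess side of that looks something like this (the regex is illustrative; see the issue above for the exact change):

# Exclude .well-known from the rule that forbids access to dot-files
# and dot-directories:
RewriteRule "(^|/)\.(?!well-known)" - [F]

# Similarly, adjust the FilesMatch block at the top of the file so the
# ^(\..*|...)$ portion of its pattern no longer matches .well-known.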

Make verification "file" visible

The next thought you might have is, "OK, now I need to put all the files that need to be visible in that .well-known/acme-challenges/ry784yfy7...fdhj directory." Except, you don't really. (Pro tip: you do not need to enable live development mode on 4 environments at once and crash the server.)

The reason why not? The fabulous Let's Encrypt Challenge module. This lets you use the Drupal UI to enter your challenge information, or upload files to sites/default/files, to answer the challenges. Download that module and push it to a branch, and set all of the Acquia environments you're enabling Let's Encrypt SSL for to use that branch. Enable the module on each dev/stage site, and as you walk through the certbot command (it gives you a challenge for each domain), log in to the site, enter the challenge, and hit save. You can then pull up the URL that certbot gives you to verify that the module is doing what it promises. (Important note here: if you accidentally pull up the URL before you've changed .htaccess or enabled the module, Acquia's Varnish is going to cache the bogus response and validation won't work. If you accidentally do this, be sure to flush the Varnish caches at Acquia for the environment where you got an itchy trigger finger.)
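Once you've entered a challenge in the module (and only then, given the Varnish caveat above), you can spot-check it from a shell; TOKEN stands in for the filename certbot displays:

curl -i http://SITEstg.prod.acquia-sites.com/.well-known/acme-challenge/TOKEN

The response body should be exactly the string certbot asked you to serve.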

When the certbot process completes, it will tell you where to find the certificate files you need. These are stored locally on the machine where you ran certbot; on a Mac with certbot installed via Homebrew, that's /etc/letsencrypt/live/FIRSTDOMAIN (where FIRSTDOMAIN is the first domain you passed to your certbot command, above).
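Listing that directory shows the files you'll care about in the next step:

ls /etc/letsencrypt/live/SITEstg.prod.acquia-sites.com/
cert.pem  chain.pem  fullchain.pem  privkey.pem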

Tell Acquia You Got the Goods

This is a two-part process. Log in to your Acquia console, go to each individual environment, and open the SSL section in the nav on the left-hand side. From there, click "Install SSL Certificate" at the top. You will be prompted to enter four pieces of information: (1) a name for your certificate (e.g., "LE 05022017" because it's Let's Encrypt and the day it was created), (2) the private key for the certificate (use privkey.pem from the folder above, where certbot put all this info), (3) the certificate itself (use cert.pem), and (4) the "chain" certificate (often called the "intermediate" certificate), which establishes trust from your certificate all the way to a root Certification Authority (use chain.pem). NOTE: with Acquia, you will not use fullchain.pem. That file simply concatenates all the information into a single certificate file.
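If you're curious, you can confirm that from the live/ directory; concatenating the leaf and intermediate certificates reproduces fullchain.pem exactly:

cat cert.pem chain.pem | diff - fullchain.pem && echo "identical"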

The second part of this process is to click "Activate" next to the certificate once the "installing" activity is completed.

Again, this needs to be repeated for each environment, but with the same certificate information.

In the below screenshot I've tried to call attention to some relevant parts of the SSL screen in the Acquia console:

[screenshot: the Acquia console with a Let's Encrypt certificate successfully installed and activated]

Note the SSL navigation in the left column, the name of the certificate "LE 05022017," the "Deactivate" link (this is where you would find the "Activate" link after the certificate installs), and the "Install SSL certificate" link at the top.

We hope this proves helpful in getting some basic SSL certificates installed in your own Acquia environments!

Dec 14 2016
Dec 14

One of my big projects this past summer as an intern at Redfin was to learn about the design software Sketch. This was supposed to culminate in a small presentation just to the office, but I ended up giving a presentation at a “birds of a feather” session at Design4Drupal in Boston. A couple people who missed it asked if I could record it, so I made a video of it once I got back to Portland.

Sketch is made for designing user interfaces. The workspace is great for designing and organizing multiple pages and artboard sizes (so it’s great for responsive design). It’s also really easy to use so when the design goes to the developer they’ll have everything they need with a couple simple tricks (like holding the alt key).

Sketch is like a combo of Photoshop and Illustrator. It wouldn't replace them for their main functionality (photo editing and vector design), but in my opinion, it surpasses them in UI design and workflow. Sketch creates vector elements, scaling without loss, but does it on a pixel-based landscape. You can choose to view vectors or pixels, and switch easily between the two.

[screenshot: Sketch example of pixelation and vectorization]

One of the nice things about it is it’s easy and fast to use, for designers and developers. All the things that slow you down in other design software (importing and exporting, gradients, shadows, etc.) are simple and quick. You can drag in images from a browser, and drag artboards onto your desktop.

There are a lot of great tools and plugins to look at, that can help you get your design process moving more efficiently. Check out the video I made for some of these features, and a demonstration of how the software works.

[embedded content]
Nov 16 2016
Nov 16

While we at Redfin don't yet have a full-on base theme for every project, one thing we do use is our "bundler shell." This forms the basis of any Drupal 7 or 8 theme we build and, in fact, is really just the framework for the front-end (that is, this shell is useful outside the realm of Drupal).

First things first - here's the code.

Let's go ahead and begin the dissection...

The Gemfile

The Gemfile comes from Bundler, which is an amazing dependency-management tool for Ruby gems -- and Ruby gems are how our front-end tooling (Compass, Sass, and friends) is distributed. Here, we point at the repository we need and list which gems to get. You can then run bundle install to install the exact gem versions specified in the Gemfile.lock. (If you're at all familiar with Composer, this should sound familiar, because Bundler was a huge inspiration for Composer.) Then, we can prefix our normal compass commands with bundle exec, which ensures you've installed everything correctly before it starts to watch the sass folder and compile to the css folder.

bundle exec compass watch
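For reference, a minimal Gemfile for this kind of stack might look like the following (a sketch -- the authoritative list is in the repo linked above):

# Gemfile
source 'https://rubygems.org'

gem 'compass'
gem 'susy'
gem 'breakpoint'
gem 'sass-globbing'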

The 'sass' folder

Our Sass folder structure leans on a single, simple style.scss, which imports a number of things.

Up top, we include any compass imports, followed by any external vendor imports, namely Formalize and Susy. Combining Susy with Breakpoint we can rapidly build responsive layouts that just make sense.

We use Chris Eppstein's sass globbing plugin to import an entire folder of partials with one line.

@import "folder/*";

In this way, we then allow the project to grow folders out organically. Typically we end up with folders like:

  • nodes
  • views
  • paragraphs
  • misc (or "components")
  • regions

Even as I write this, we're discussing how to better standardize this list.
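Putting those pieces together, the top of a style.scss might look something like this (a sketch; the vendor imports and folder set are illustrative):

// style.scss
@import "compass";

// vendor
@import "formalize";
@import "susy";
@import "breakpoint";

// project partials, pulled in with sass-globbing
@import "nodes/*";
@import "views/*";
@import "paragraphs/*";
@import "misc/*";
@import "regions/*";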

Configuration

The last thing to check out is the config.rb file. This is the standard sass/compass configuration file you need, but I do want to call particular attention to the last line, where we set sourcemap = true.

This little bit of magic gives your Chrome Inspector (or other Dev Tool of choice) the ability to identify which line of the sass partial--not the compiled css--the styles are coming from, so you know right where to go to change them.
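A typical config.rb for this setup is only a few lines (a sketch; the paths are assumptions):

# config.rb
css_dir  = "css"
sass_dir = "sass"

# emit .css.map files so dev tools can map styles back to Sass partials
sourcemap = true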

Leveling Up

The last piece of our front-end stack is Browsersync, which we'll cover in a future post!

Feb 12 2014
Feb 12

Here at Redfin we've come up with a couple of helpful Drush commands that help us with our everyday workflows. The first shows a database string as a URI for Drush aliases, and the second reverts all views that are in an overridden state without any prompting or confirmation.

The power of Drush aliases is that they help you easily sync databases from your staging or production environments down to your local development environment. When creating a Drush alias, it wants database connection strings in URI format, like mysql://username:password@hostname/database.

The first command in the script is "drush rdb" (or "drush redfin-db" for long). This reads settings.php (works with Drupal 6 and 7) and spits out a URI-formatted version of the connection string. We automate the creation of our aliases, so it's nice to be able to call this from Drush inside a bash script that generates the full alias file.
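For example (output is illustrative):

$ drush rdb
mysql://username:password@localhost/mydatabase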

The second command is "drush rvr" ("drush redfin-vr"). **DANGER, DANGER** OK, now that I have your attention: this command reverts ALL views currently in a state of "Overridden" back to code. We use it as part of a deployment script for sites where the Views UI is completely disabled and all the views are managed in code. This means that when you push up your new view code, it gets pulled into the site. It's also helpful in local development when you've just synced down new views code from your fellow developers. You could use the regular "drush vr" command for this, but that command first prompts you as to which views you'd like to revert, and you have to pick "all" from a menu where it appears as a different number each time, preventing its use in automated scripts (like git post-receive hooks).
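That unattended behavior is the whole point; a deployment script can then be as simple as this sketch:

# deploy.sh -- run after new code lands on the server
drush updb -y    # run any pending database updates
drush rvr        # revert ALL overridden views to code, no prompts
drush cc all     # clear caches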

The code is available as a Gist. Just put this file in your home's .drush folder as redfin.drush.inc and you'll be able to use these two commands as-is, or modify them to suit your own needs.

DOWNLOAD/VIEW CODE

Aug 02 2013
Aug 02

If you provide Drupal development services in Massachusetts, you may need to start taxing your clients. No, seriously -- this is our attempt to understand the new Massachusetts "Sales and Use Tax on Computer and Software Services" law, which was conveniently released on July 25th 2013 and went into effect on July 31st, 2013.

Seemingly out of the blue, Massachusetts has amended the recent legislation, "An Act Relative to Transportation Finance, St. 2013, c. 46," which specifies a tax on "computer system design...modification, integration, enhancement, installation, or configuration of standardized or prewritten software." A TIR was released that attempts to clarify the tax, but not really.

Here's our take on how the law affects us Drupal developers (we are by no means lawyers and you should contact yours to determine how this affects your business):

  • It DOES apply to any open source software services like Drupal sourced (purchased) from Massachusetts-based vendors.
  • It DOES NOT affect Drupal vendors who provide services from outside Massachusetts to clients in Massachusetts.
  • It DOES NOT apply to training, design, hosting, consultation and other practices that aren't directly related to the act of customizing "prewritten" software like Drupal.
  • It DOES NOT apply to sites built from "scratch" using HTML, CSS, JS, etc.
  • It DOES suggest that Massachusetts couldn't think of any better ways of funding their transportation projects (Big Dig anyone?) 

What's clear is that this law will upset more firms than just Redfin -- many of them small businesses already overwhelmed with the challenges of running a business. Some taxes make sense; others just seem like a desperate, last-ditch effort to collect some extra cash using the hope-they-don't-notice technique. While Redfin luckily resides outside of Massachusetts, many of our awesome partners and clients do not and, at some point, we'll all be affected by this poorly-enacted amendment.

We encourage anyone in Massachusetts or who has partners in the state to sign this petition or contact the state and let them know how this legislation will affect you and your clients. If you have any questions, this FAQ proved way more useful to us than any other publication.

Jun 24 2013
Jun 24

This past weekend marked a turning point for the relatively young Design 4 Drupal (D4D) camp. Over the past several years, D4D has suffered from an identity crisis as it attempted to grow attendance from a (seemingly) developer-heavy pool of Drupal enthusiasts. Boston's lack of an existing camp gave developers little choice if they wanted to meet up with others in the Drupal community, so D4D has traditionally offered sessions that span the design and development disciplines. This watered down the camp's main goal: attracting more designers to the Drupal movement and promoting the improvement of the Drupal framework through better user interface (UI), better user experience (UX), and more sound design principles.

This year, that began to change. A conscious decision was made after D4D 2012 to focus session content on design topics (Sass, CSS, theming, principles and workflow, prototyping, UX, UI). Take Amy Kosh's session on the Logic of Color and Design: a welcome departure from traditional camp sessions, focusing on the core principles of color spaces gently wrapped in the Drupal design workflow. Another session highlighted the importance of structured content, while yet another noted that content is, in fact, part of the design and not a separate entity left for clients to enter after the site is complete.

At the end, organizers and those interested in volunteering next year gathered to discuss what worked and what could be improved. Out of this came an idea for attracting developers who are less likely to attend a design-oriented camp: a challenge. What if we facilitated a design sprint with the goal of having a developer work on a Drupal core (or individual) design-related issue? By the end of the camp, designers would have their issue improved or fixed, with exposure to the patching/debugging process, while developers could gain insight into how their work is valued by designers. I think this idea has merit and should be pursued.

I am a developer with an appreciation for design, and D4D 2013 was a welcome refreshment during a hot Boston summer weekend. I recommend more developers come next year to improve their relationships with clients, with designers, and with their own code, which would otherwise remain dormant without a design to bring it to life.

#drupal @d4dboston 

May 16 2013
Amy
May 16

The idea for this series of posts started at lunch one day when I made an off-hand comment about reading Hex numbers and making fast changes in my CSS file. I am somewhat new to developing for Drupal, but amid all the new information and terminology was something familiar, colour-space*. A sword that I understood how to put to use in elegant arcs within the rest of the coding I was learning.

The initial exchange started with my comment about finally seeing something I already understood and could use quickly, but how easy it was, using Hex numbers, to just change colours on a site. That led to an impromptu 15-minute teaching session. Coloured-pencils were involved and a t-shirt and a houseplant. At the end, my boss, who was on the receiving end of all this, suggested that I do a longer version for our lunchtime lecture series. That led to building a session for the upcoming Design 4 Drupal, and creating that led me to create an ongoing series of posts on colour/design/workflow and how all of those somehow influence each other within this world of site design/build and Drupal.

Drupal is a very blue world. The logo is blue (HEX #0173ba), the default build is blue; it's a calm world with a clean design that implies, at least at the user end, precision. The admin menus are designed in shades of grey; the "block demonstration" area is also grey. Tables often have default grey borders; everywhere we look, we swim in a sea of blue-grey. Add to that browser defaults that can put grey borders around our search bars. The warning areas are shades of orange or red, but take a minute to notice that even these, at least in basic Drupal sites, are muted. The warning you get for "module updates" or "files not found" is never a pure hue; it's always somewhat diluted, transparent to a point, so that it is less jarring.

All these colours serve a purpose. The first colour of Drupal, that Drupal blue, may have started out as a whim, a guess, but whole schemes have been created based on it. Themes have a look and feel to them, in large part because of the colours that are used, the themes that have been created.

My intention for this series is an ongoing inquiry into colour: how we can understand the technical aspects of hue and all its attributes; how we can use colours to create exciting and vibrant websites that hold users' attention and move them through the site in exciting and innovative ways; and, not least, how understanding colour can make our work more interesting, more efficient, and far more inspiring.

*Yes, I spell it with a "u" ;-)

Apr 03 2013
Amy
Apr 03

This is something that is so simple to do, once you have figured out the steps to get to the right little click boxes to appear. I wish that it were more intuitive to find, but for all those who struggle with “hiding” the page titles, Display Suite makes it really easy to choose to hide the page title for a specific Content Type or for a particular Node.

Follow these steps:

(I’m starting at the beginning for Newbies)

1. Download and install the Display Suite module. (If you are using Drush, the (current) module name is "ds".)

2. Enable at least the following in Administration Menu>Modules:
            a. Display Suite
            b. Display Suite Extras (This is the key!!)
            c. Display Suite UI
    You can enable more of the Display Suite pieces if you need them.
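If you prefer the command line, the equivalent is a one-liner (the submodule machine names are assumptions; confirm with drush pm-list):

drush en ds ds_extras ds_ui -y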

3. At this point you’ll need to go to whatever content type or particular content (i.e. a particular page like “My page about my dog”) you want to have title control over and create a “Custom Display” for that content.

Here’s how:
   
4. For "Content Types," go to Structure > Content Types > (your content type) > Manage Display. There's a list at the bottom of the page on the left; look for "Custom Display Settings." Enable at least one new display (I recommend "Full content"), and you MUST choose a layout for the display. Use "One column" if you really don't want to change your current layout at all. Save your configuration.

5. You can do the same thing for a particular node/page: go to the Content list and select the particular node you want to work your magic on, then follow step #4.

Once that is done…

6. Go to Structure > Display Suite

7. In the upper right corner look for the “Extras” button. Click the “Extras” button.

8. Look at the list on the left side. You want the 3rd option, "Other." Enable the checkbox for "Page Title Options," then save the configuration.

9. Now go to Structure > Content Types > (choose your content type) > Manage Display, OR Content > (pick your node/page) > Manage Display. Make certain that "Default" is NOT selected in the upper right. When Default is selected, you do not get the same set of options.

10. At the bottom of the Display page you’ll see another list. You want to click on the “Custom Page Title”, and then select either “Hide” or “Show”. Save your configuration.

Sit back and enjoy your new control over page titles.

Mar 19 2013
Mar 19

I am constantly re-working Drupal's tabs to look a little bit more like a pile of bricks, and I've finally decided to stop reinventing the wheel and to document the CSS that makes them display more sanely. (Namely, if you have a narrow main column and a lot of tabs, they start disappearing into the ether over at the right).

I hope this snippet helps some other folks, too.

This fixes your default Drupal tabs by taking them from this:

[screenshot: tabs overflowing off the right edge of a narrow column]

to this:

[screenshot: tabs wrapping neatly onto multiple rows]

CSS Code:

/* Let the tab list wrap instead of forcing a single overflowing row. */
ul.primary { border-bottom: 0; white-space: normal; line-height: 1.6em; padding: 0; margin: 0; }
/* Keep each tab on one line, with a full border so wrapped rows look right. */
ul.primary li a { border-style: solid; white-space: nowrap; margin-right: .1em; }
/* Soften the bottom border of the active tab and hovered tabs. */
ul.primary li.active a { border-bottom: 1px solid #bbb; }
ul.primary li a:hover { border-bottom-color: #ccc; }

Aug 20 2012
jp
Aug 20

For some reason I have a huge mental block when it comes to image captions. I can never quite remember exactly which combination of modules I prefer. Part of this is because I've tried so many different modules that offer this functionality. Also, they all have similar names, such as Image caption, Caption filter, and Image caption formatter.

My go-to for Drupal 6 was Image Caption; unfortunately, the D7 version is still in beta right now. So I recently tried jCaption and found that it can do everything Image Caption did, plus it's more flexible. The jCaption module uses jQuery to change an image's title or alt attribute into a caption, conveniently wrapped in a paragraph tag for easy styling. So it can change something like this:

<img src="http://redfinsolutions.com/blog/easy-image-captions-drupal-7/image.jpg" alt="Image description" title="This is an image caption" class="caption" />

into this:

<img src="http://redfinsolutions.com/blog/easy-image-captions-drupal-7/image.jpg" alt="Image description" title="This is an image caption" /> <p>This is an image caption</p>

One of the things I like about this module is that you use CSS selectors to target specific images. My clients and I are in the habit of targeting images with the class of caption, so here are the steps to get it working that way on your site.

Configuring the modules

  • Install and enable jCaption, Wysiwyg, and your favorite client-side editor. Our typical installation uses TinyMCE, IMCE, and IMCE Wysiwyg bridge. If you're using something else, feel free to describe how to do so by posting a comment.
  • Next, you'll need to edit the Wysiwyg profile(s) that users will use to insert images at Configuration » Wysiwyg profiles (/admin/config/content/wysiwyg).
  • Choose an input format (I typically do this to Filtered and Full HTML), and click Edit

Wysiwyg Profiles Screenshot

  • Click the Buttons and Plugins dropdown and enable the Image, Advanced Image, and IMCE buttons.
  • Click the CSS dropdown and enter "Image Caption=caption" (without quotes) under CSS classes. You can replace caption with any valid CSS selector. I got in the habit of using the class of caption with the Image Caption module so that's what I continue to use.

Wysiwyg CSS Config Screenshot

  • Click Save.
  • Configure the jCaption module at Configuration » jQuery Captions (/admin/config/media/jcaption)
  • There are many options here, which is great. You can leave everything set to the default for now, except that you need to add "img.caption" (without quotes) where it asks for your selectors. I really like that I can choose more than one selector here. In the past, I've had to use template.php to add the classes to images programmatically, which was a bit of a pain.
  • Although there is an option to use the alt attribute for captions, I prefer to use title. The reason is that the alt attribute replaces the image itself for users with visual impairments who may be accessing your website with screen readers. For this reason, I like to use the alt attribute to describe the image and the title attribute, which was originally intended as a tooltip, for the more descriptive caption.

jCaption Config Screenshot

  • Click Save Configuration.

Adding captions to images

  • Create content containing a field with the input format you configured above.
  • Insert an image using the Insert / Edit image button in your text editor.
  • Here's where the Advanced Image button comes in handy. In addition to the General tab, the Advanced Image button will give you two more tabs: Appearance and Advanced.
  • In the first tab, upload or navigate to the image you want to add with IMCE, give it a title (remember that this will become a caption), and it is also a best-practice to give it a description (this will become the alt attribute).

First tab in insert image button

  • In the Appearance tab, you can now select "Image Caption" in the class dropdown.

Advanced Image Tabs

  • Click insert and note that your image will appear in the text editor without a caption. Don't panic; you won't actually see an image caption until after you save your content and the module's jcaption.js file loads.

Text editor screenshot

  • Save your content, and voilà! Your image will have a caption, which you can now style.
Final image with caption
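As a starting point for that styling, here's a sketch based on the markup shown earlier, where the caption is rendered as a plain <p> immediately after the image (adjust the selector to whatever markup jCaption actually produces on your site):

img + p {
  font-style: italic;
  font-size: 0.875em;
  color: #555;
  margin: 0.25em 0 1em;
}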

Happy captioning!

Jun 29 2012
Jun 29

We were having a situation with a site where sessions weren’t being shared between insecure (http) and secure (https / SSL). The one major difference with this site was that we were using Nginx as a reverse proxy, but we weren’t quite sure how it was affecting us. We ultimately found the cause to be a combination of Nginx settings and how Drupal differentiates between different domains.

(HINT: If you want the short version, and are having similar issues, try setting cookie_domain in settings.php!)
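For the impatient, that fix is a single line (the domain is illustrative):

<?php
// settings.php: key sessions on one cookie domain so http, https, and any
// proxied host:port variant all share the same session.
$cookie_domain = 'oursite.com';
?>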

Mixed-Mode SSL

Everyone knows that SSL is a required technology when passing "sensitive" data over the Internet -- that is, anything that, if it got into the wrong hands, could cause someone significant harm (credit cards, social security numbers, passwords, etc.). On this particular client site, we are selling roles (subscriptions to premium content) and need SSL, since we're taking credit cards on-site to provide a seamless user experience. First, let's talk about why there are so many moving parts here. (I should also mention that a lot of this applies to Drupal 7, but our case study is based on a Drupal 6 site.) Generally, there are four ways you can handle SSL (from this Drupal Scout article):

  • Use a module like securepages to force SSL on certain pages. This is all well and good, but since the session cookie can eventually be passed in plaintext later, it doesn’t do much in the way of actual security because the session can be hijacked and someone can gain access to the secure portions of your site.
  • Add hijack prevention to securepages. This additional module keeps two cookies (a secure one and an insecure one) and ensures they match the incoming SSL request; if they don't, it logs the user out. This doesn't necessarily prevent the hijack itself, but it increases security once a hijacked session is detected.
  • Once someone is logged in, serve their entire session over SSL. (The downside is that if someone has the home page bookmarked, for example, they might appear to be logged out though they are logged in on the https side.)
  • Only use SSL for the site. (Don’t even respond on port 80 with a redirect).

We chose model #2 because a history of bad user experience (users becoming/appearing logged out) ruled out #3. Serving SSL-encrypted pages does come with some performance overhead, so generally the practice is to run SSL in “mixed-mode,” meaning to secure only the pages that transmit the sensitive information.

Nginx Reverse Proxy

Drupal and Apache are not the most performant pieces of software, especially as modules are added to the fray. We were originally brought on to improve this site's performance. We decided that using Nginx as a reverse proxy (directly serving static files like CSS, JS, and images) and passing only non-trivial requests back to Apache was the magic bullet. Apache runs on a non-standard port so the two don't conflict. Here's a snippet of the Nginx config, to give you an idea:

location / {
    ...
    root /var/www/oursite.com;
    ...
    proxy_pass http://www.oursite.com:8080;
}

There's some obvious stuff missing (e.g., a regex to match images and serve them directly), but that crucial proxy_pass directive will be a major player here soon. The following diagram should help you visualize the setup:

The Issue

Like I said, once we installed e-commerce and got SSL running with securepages (and securepages_prevent_hijack), we noticed that users appeared logged out when switching from http to https. To work around that, we redirected all traffic to https so users wouldn't get logged out. Unfortunately, our client started to see a decline in traffic, particularly from organic search results (Google), following the switch to https. We checked a few things, chief among them whether the analytics themselves were correct. After an analysis, we concluded that they were -- and that the traffic decline was real.

The Solution

We referred our client to the Drupal SEO professionals at Volacci to help them develop an SEO strategy and implement some best practices. Among the recommendations was that the site go back to mixed-mode SSL. For that to be successful, we needed to get to the bottom of the "logout situation" (that sessions were not being shared between SSL and non-SSL variants).

My motto is always to "go to the code." If you want to know why things are behaving the way they are, you gotta go to the code! (And potentially dump some variables as you go!) The problem is here:

if ($cookie_domain) {
  // If the user specifies the cookie domain, also use it for session name.
  $session_name = $cookie_domain;
}
else {
  // Otherwise use $base_url as session name, without the protocol
  // to use the same session identifiers across http and https.
  list( , $session_name) = explode('://', $base_url, 2);
  …
}

As you can see, the idea is that it uses a cookie domain, if set (in settings.php), to key the session. If you don't set a cookie domain, it uses the HTTP_HOST, stripping the protocol so that sessions will be shared -- but NOT stripping the port number. The port number is passed as part of the HTTP_HOST (remember that proxy_pass directive from the Nginx config?). In our config, Apache listens on port 443 and handles all the SSL traffic directly, since it is minimal. Over SSL, the HTTP_HOST value is "oursite.com", but when proxied back, the HTTP_HOST is "oursite.com:8080" -- which matches the proxy_pass directive from above. This causes two different sessions to be created in Drupal's sessions table and treats http and https as two separate areas.

To fix it -- well, you can see it in the code! By specifying our cookie_domain in settings.php, we were able to keep all the sessions keyed the same, and switching back and forth between http and https maintained the session state. Hypothetically, another solution would be to let Nginx handle the SSL and proxy_pass the SSL requests over localhost also on :8080, so the HTTP_HOSTs would match, but I prefer that all the SSL be handled directly by Apache.

Has anyone else encountered this? Does anyone have alternate solutions or problems with SSL and a reverse proxy config? We'd love to hear about your situations.

Jun 16 2011
Jun 16

Redfin is pleased to announce the 1.0 release of the uc_cashnet module, a payment processor for Drupal's Ubercart 6.x that integrates with CASHNet's external payment service, similar to 2Checkout.

Any feedback is welcome at this point, and we'd welcome the opportunity to further develop this module and create a similar module for Drupal Commerce / 7.x.

Enjoy!
http://drupal.org/project/uc_cashnet

Feb 14 2011
Feb 14

Redfin Solutions was recently quoted in an IT World article: "Joomla vs. Drupal: An open source CMS shootout," by Daniel P. Dern of Newton Center, MA.

The article is primarily a shootout between Joomla and Drupal. In sum, it comes to the conclusion (granted, this is our particular takeaway) that Joomla is a great "starter CMS," geared more for designers and those who need to get a simple site up quickly.

By comparison, Drupal has a lot more extensibility behind it, and is great for larger sites that are more feature-rich, or as a platform for building rich web applications. With that said, one contributor mentioned that they often build smaller sites in Drupal so that they can "future-proof" them for growth. That is to say, with Joomla, once you grow outside of its boundaries, you may find yourself handling a costly rewrite or re-implementation. With Drupal, you can start with a simple blog but later grow and expand the web site.

Redfin's contributions included our thought that Acquia's presence in the Drupal game has secured (read: "put at ease") the minds of the enterprise-level thinkers. Acquia is there as a "safety net" who can always be a go-to for Drupal work if there's something that's above the heads of the in-house developers, or should all the in-house knowledge suddenly disappear. (Incidentally, Redfin can fill those shoes, too!) But Acquia has really put a big name on the face of this open source project.

We also contributed a piece about the vast number of projects and verticals that we've hit with Drupal projects over the years, precisely because Drupal is so developer-friendly, and has been from the get-go. Five years ago we chose Drupal in its 4.7 days because it was so extensible and developer-friendly. We continue to choose Drupal today for that very same reason.

Dec 10 2010
Dec 10

Today I was trying to figure out why, in the name of all that is good in the world, I couldn't use $_SESSION in my form's _submit handler.

As it turns out, I actually CAN put stuff into $_SESSION, it's just that when you want to pull something OUT of $_SESSION later, that's not where it is.

Instead, it's on the $user object in Drupal, in $user->session. After some studying, it seems this is a pipe- and semicolon-delimited list of variable names and their serialized values.
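For example, a stored session string might look like this (made-up values, in standard PHP session-serialization format):

swftools_user_autoplay|s:3:"yes";some_other_var|i:42;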

Here's a function to pull stuff off of $user->session:

<?php
/**
 * Private function to return whether or not the person wants auto-play.
 *
 * @return
 *   either "yes" or "no" or "" (empty string)
 *   yes - if the person has set their session var to autoplay
 *   no - if they have set their session var to NOT autoplay
 *   (empty string) - if they have not set their session var
 */
function _get_autoplay_from_session() {
  global $user;

  $swftools_user_autoplay = '';
  // $user->session is a flat string; split on the pipe delimiters so the
  // entries alternate between variable names and serialized values.
  $the_session = explode('|', $user->session);
  for ($i = 0; $i < count($the_session); $i += 2) {
    if ($the_session[$i] == 'swftools_user_autoplay') {
      $swftools_user_autoplay = unserialize($the_session[++$i]);
      break;
    }
  }

  return $swftools_user_autoplay;
}
?>

You'll notice that this pulls out a particular variable, but it can be easily modified and/or genericized to have you pass in the variable you're looking for.

Dec 06 2010
Dec 06

First of all, let me tell you how much I love and appreciate the Views_Customfield module, which lets you (among other things) write PHP to print out a field using Views. When I need to, I can do some complex conditional writing based on two other Views fields, or I can do things like calculate a thumbnail for a photo gallery by grabbing some fields with a raw db_query that aren't available through Views, and other such craziness (note: be wary of doing this; it can kill your site performance!).

But I've found that I can be quick to download and install that little gem in cases where I don't really need it. Take today, for example.

I was building a table view and I wanted the second column to be a concatenation of three fields. In an ideal world, it would look something like this:

Chris Wells (Redfin Solutions)

The three fields are first name, last name, and business name. Business name is an optional field, though. I needed the parentheses around it to offset it -- "Chris Wells Redfin Solutions" would in fact look pretty unclear, if not downright silly.

So I thought, well, I will exclude the business name and first name fields, and then I will write the last name field to be (paraphrasing):

[first] [last] ([business])

...by using tokens and the "rewrite the output of this field" piece of Views. The problem here, though, is that when you write that, and business is blank (remember, it's an optional field), you get bogus parentheses, like "Chris Wells ()" which also looks kinda silly.

Here's the trick: all you need to do is doubly rewrite. Go to the business name field and rewrite it, putting parentheses around it. Then check "Hide field if empty." Also mark "Exclude from display."

Then, in your later field, you can use the token for the business field WITHOUT wrapping it in parentheses, et voilà! A field that only prints (including the parentheses) when there is content in it. No need for PHP; Views is already working hard for you.
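To recap as a sketch (the bracketed tokens are hypothetical; use the replacement tokens Views lists for your own fields):

Business name field: "Exclude from display" and "Hide if empty" checked, rewritten as:
  ([business])

Last name field, rewritten as:
  [first] [last] [business]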

Oct 26 2010
Oct 26

If you want to modify the fields included in an RSS feed generated by the Views 2 module, you have a few options, including theming the view, but the easiest way to show or hide certain fields and/or their labels is to go to the "RSS" tab when editing a content type's "Display Fields" settings. That's it. Below, I set a couple of fields to be excluded from my RSS feed, since I didn't want, for example, the image name showing up in the feed.

Oct 12 2010
Oct 12

Theming the Date and Calendar modules can sometimes be a little tricky. I wanted to change the title display for a "mini" calendar in a block from "Month" to "Month YYYY" (e.g., "October 2010"). In order to do this, you first need to learn how to override a theme function.

Now that you're up to par on theme overrides, we can override the theme function theme_date_nav_title from /modules/date/theme/theme.inc. In Drupal 6.x, you'll end up with something that looks like the code below.

In my case, I just had to adjust the ternary IF statement in the 'month' case to always use 'F Y' as the date format, but you can adjust whatever you wish here.

<?php
/**
 * Theme the calendar title.
 */
function [your_template_name]_date_nav_title($granularity, $view, $link = FALSE, $format = NULL) {
  switch ($granularity) {
    case 'year':
      $title = $view->date_info->year;
      $date_arg = $view->date_info->year;
      break;

    case 'month':
      // Adjusted: both the full and mini calendars now use 'F Y' ("Month YYYY").
      $format = !empty($format) ? $format : (empty($view->date_info->mini) ? 'F Y' : 'F Y');
      $title = date_format_date($view->date_info->min_date, 'custom', $format);
      $date_arg = $view->date_info->year .'-'. date_pad($view->date_info->month);
      break;

    case 'day':
      $format = !empty($format) ? $format : (empty($view->date_info->mini) ? 'l, F j Y' : 'l, F j');
      $title = date_format_date($view->date_info->min_date, 'custom', $format);
      $date_arg = $view->date_info->year .'-'. date_pad($view->date_info->month) .'-'. date_pad($view->date_info->day);
      break;

    case 'week':
      $format = !empty($format) ? $format : (empty($view->date_info->mini) ? 'F j Y' : 'F j');
      $title = t('Week of @date', array('@date' => date_format_date($view->date_info->min_date, 'custom', $format)));
      $date_arg = $view->date_info->year .'-W'. date_pad($view->date_info->week);
      break;
  }
  if (!empty($view->date_info->mini) || $link) {
    // Month navigation titles are used as links in the mini view.
    $attributes = array('title' => t('View full page month'));
    $url = date_real_url($view, $granularity, $date_arg, TRUE);
    return l($title, $url, array('attributes' => $attributes));
  }
  else {
    return $title;
  }
}
?>
