May 18 2018

Just imagine: a user asks Amazon Alexa to read out the headline of your latest blog post, or to look up a specific section on your Drupal site. Or, even better: quit imagining this and start implementing it instead, right on your website. Here's how you integrate Alexa with your Drupal 8 website via the Alexa integration APIs.

A 7-step tutorial:

  • on how to get Alexa to “talk to” your site users/online customers
  • on turning your site's content into the “raw material” for setting up your custom Alexa skills
  • on how to leverage Drupal 8's outstanding third-party integration capabilities to power your implementation plan

So, here's how it's done:
 

But Why Precisely Amazon Alexa over Other Voice Assistants?

Because Amazon Alexa stands out with its top-notch integration capabilities.

Its integration APIs make it possible for this particular voice service to be “plugged into” various devices and web services.

As simple as that! Alexa's more than just a voice assistant making voice (obviously!) interaction possible:

It's a voice service that empowers you to integrate it even with your Drupal 8 website quickly and smoothly, via its own built-in APIs!
 

Introducing Alexa: The Drupal Module for Amazon Alexa Integration

With Amazon bringing its Alexa integration APIs into the equation, it was only fair that the Drupal community play its part, as well.

The result of their sustained efforts? The Alexa Drupal module:
 

  • it provides an endpoint for your Drupal 8 website, where it receives the user voice requests captured by your Alexa skills
  • those requests get converted into text strings before being sent over to the Alexa module on your Drupal site
     

Note: do keep in mind that the Alexa module is still under development, but with a more than promising, long-term future ahead of it.

For now, it offers basic integration with Amazon's Alexa. And this is precisely why you'll need to build a custom module, as well, to integrate Alexa with your Drupal 8 website.

But more details on this, in the tutorial here below:
 

Integrate Alexa With Your Drupal 8 Website: A 7-Step Guide 
 

Step 1: Make Sure Your Site Uses HTTPS

In other words: make sure Amazon's servers can reach your Drupal 8 website over a secure connection!

The very first step is to switch your site over to HTTPS (a step you can skip if your site's already on HTTPS).
 

Step 2: Install the Alexa Module

Go “grab” the Alexa Drupal module and get it installed and enabled on your website. 
 

Step 3: Set Up Your Alexa Skill 

With your dedicated Drupal module ON, it's time to focus on the setup to be done on the Amazon Developer site. And the very first step to take is to create your own new Alexa Skill in the Skills Kit there.

How to Integrate Alexa with Your Drupal 8 Website: Set Up Your Alexa Skill

Step 4: Copy & Paste Your Application ID

And this is no more than a quick 2-step process:
 

  1. first, you copy the Application ID provided in your “Skill information” section, on the Amazon developer site
  2. then you submit it to your website's configuration at /admin/config/services/alexa
     

Step 5: Configure Your New Alexa Skill

A key 3-part step to take when you integrate Alexa with your Drupal 8 website, where you:
 

  1. give a name to the Alexa skill (in the Alexa app) to be triggered
  2. set up an Invocation Name for your users to utter for “activating” your newly created Alexa skill
  3. set up the custom vocal commands or “intents” that Alexa will need to respond to
     

For this, you'll need to go to the Amazon Developer website again and access the “Skill Information” section.

Note: maximize the odds that Alexa will recognize your users' requests by adding several sample phrasings of the very same question/vocal command.

Another note: this flexibility means that you get to harness the power of... variables when setting up your custom intents. “Variables” that you'll use with the custom module you're going to build at the next step of the process.
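For reference, in the 2018-era developer console, intents were declared as a JSON intent schema. A minimal sketch (“LatestArticleIntent” is a hypothetical name, not one the Alexa module requires):

```json
{
  "intents": [
    { "intent": "LatestArticleIntent" },
    { "intent": "AMAZON.HelpIntent" },
    { "intent": "AMAZON.StopIntent" }
  ]
}
```

Each custom intent then gets its sample utterances, e.g. “LatestArticleIntent what is your latest article” and “LatestArticleIntent read me the newest post”.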
 

Step 6: Create a Custom Module for Triggering The Right Responses to Your Intents

What should happen when your custom intents get invoked and sent through to your Drupal 8 website? 

You'll need to create a custom Drupal 8 module that would handle responses.

For this, start by putting the following info in your module's demo_alexa.info.yml file:

name: Alexa Latest Articles Demo
type: module
description: Demonstrates an integration to Amazon Echo.
core: 8.x
package: Alexa
dependencies:
 - alexa

Note: don't forget to add the Alexa Drupal module as a dependency!

Now, time to build the custom module itself: 
 

  1. create a file at src/EventSubscriber/
  2. name it RequestSubscriber.php 
     

As for the code that will “populate” your module, start with the namespace and use statements:

namespace Drupal\demo_alexa\EventSubscriber;

use Drupal\alexa\AlexaEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;
use Drupal\paragraphs\Entity\Paragraph;


Then, you'll need to set up your main class, as well as a function to trigger the event:

/**
 * An event subscriber for Alexa request events.
 */
class RequestSubscriber implements EventSubscriberInterface {

  /**
   * Lists the events this subscriber listens to.
   */
  public static function getSubscribedEvents() {
    $events['alexaevent.request'][] = ['onRequest', 0];
    return $events;
  }

Next, set up the function “responsible” for giving responses to each one of your custom intents. 
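Here's a minimal sketch of what that function could look like. Note: the intent name, the entity query and the exact response methods (intentName, respond()) are assumptions based on the Alexa module's underlying PHP library; double-check them against the module version you've installed:

```php
/**
 * Called whenever Alexa sends a request to our endpoint.
 *
 * @param \Drupal\alexa\AlexaEvent $event
 *   The event, wrapping the Alexa request and response objects.
 */
public function onRequest(AlexaEvent $event) {
  $request = $event->getRequest();
  $response = $event->getResponse();

  // "LatestArticleIntent" is a hypothetical name: use the intent
  // you actually defined in your skill's interaction model.
  switch ($request->intentName) {
    case 'LatestArticleIntent':
      // Grab the most recently published article node.
      $nids = \Drupal::entityQuery('node')
        ->condition('type', 'article')
        ->condition('status', 1)
        ->sort('created', 'DESC')
        ->range(0, 1)
        ->execute();
      $node = \Drupal\node\Entity\Node::load(reset($nids));
      $response->respond('The latest article is: ' . $node->getTitle());
      break;

    default:
      $response->respond('Welcome to the Alexa demo for Drupal 8!');
  }
}
```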

With the code for your responses at hand, the very last file that you'll need to focus on is the demo_alexa.services.yml:

services:
  alexa_demo.request_subscriber:
    class: Drupal\demo_alexa\EventSubscriber\RequestSubscriber
    tags:
      - { name: event_subscriber }

Note: Remember to enable your demo Alexa module, then to navigate to the Amazon Developer site once again!
 

Step 7: Test Out Your New Alexa Skill 

Another essential step to take when you integrate Alexa with your Drupal 8 website is testing your newly created Alexa skill. 

And there's even a Test tab on https://developer.amazon.com for that!

How to Integrate Alexa with Your Drupal 8 Website: Test Out Your New Alexa Skill

Click on this specific tab, ensure that your new Alexa skill is enabled and thus ready to be tested and... see whether you'll get the right responses!

The END! This is the “how it's made” for getting Amazon Alexa to “talk to” your Drupal 8 website via:
 

  1. the Alexa integration APIs
  2. the Alexa module
  3. a custom-built Drupal 8 module
     
Apr 24 2018

Whether you're "constrained" to migrate content to Drupal 8 or you're just eager to jump on the Drupal 8 bandwagon and harness its much-talked-about advanced features, the most important “warning/advice” to keep in mind is:

Don't migrate mindlessly!

Meaning that before you even get to the point of:
 

  • triggering the Migrate module's capabilities and adjusting them to your migration project's needs and requirements
  • selecting and combining all the needed contrib modules
  • writing down your YAML files for carrying out your content migration process
     

You'll need to think through every little aspect involved in/impacted by this process:
 

  • your goals
  • your growth plan
  • your current site visitors' complaints and suggestions
     

That being said, here's more of a “backbone” or summary of the migration workflow, one that highlights the:
 

  1. main phases to go through
  2. the right approach to the whole process
  3. Drupal-specific concepts and tools to use
     

Do NOT expect a very detailed, highly technical tutorial, though!

As for the Drupal concepts that you'll need to be (more than) familiar with before you launch your migration process, you may want to have a look at this guide here, on Understanding Drupal.

And now, let's delve in:
 

1. The Migration Workflow: 4 Key Phases to Consider 

Here's the entire process in 4 steps (so you know what to expect):
 

  1. first, you'll need to migrate your data into the destination nodes, files and paragraphs on the newly built Drupal 8 site
  2. then you'll migrate data into date, image, taxonomy, address fields and file
  3. next, you'll move your precious data from JSON and CSV files
  4. and finally, you'll complete your migrations from the UI and the terminal
     

2. Are You Upgrading from Drupal 6 or 7 or Migrating From a Different System?

And here's what to expect depending on your answer to the above question:
 

  1. if you migrate content to Drupal 8 from an older version of Drupal (6 or 7), then you're quite “spoiled”: a lot of hard work has been done by the Drupal community to turn this migration process into the official path to Drupal 8; you could say that the solid framework has already been set up, so all that's left for you to do is... take advantage of it!
  2. if it's from a whole different system that you're migrating your site (let's say WordPress or maybe Joomla), then... expect it to be a bit more challenging. Not impossible, just more complex
     

3. Plan Everything in Detail: Think Everything Through!

Now, at the risk of sounding awfully annoying and repetitive, I feel like stressing this:

Don't migrate... mindlessly!

Plan everything in the smallest detail. Re-evaluate the content on your current site and its “load” of features. 

Take the time to define your clear goals and to put together your growth plan (if there's any).

Then, do lend an ear to what your current site visitors have to say, filter through all their complaints and suggestions and tailor your final decisions accordingly.

It's only then that you can go ahead and set up your content architecture.
 

4. Start With the Structure: Build Your Drupal 8 Site First

“But I haven't picked a theme yet!” you might be thinking.

No need to! Not at this stage of the migration process.

You can still build your Drupal 8 site, from the ground up, even without a theme ready to be used. You can add it later on, once you have the final version of your content!

But the site itself, its solid structure, this is a “must do”. It's the very foundation of all your next operations included in your migration workflow!
 

5. Deep Clean & Declutter! Take Time to Audit Your Content

Don't underrate this step! Moving over all that clutter, that heavy load of unused, outdated features and all those chaotic, crummy pages, will only drag down your Drupal 8 site's performance from the start.

So, now it's the right time to do some... deep cleaning!

Audit your content, your features, plugins and other functionalities included in your site's infrastructure and... trim it down by:
 

  1. relevance (are you using it?)
  2. quality: keyword-stuffed, unstructured pages (a heavy pile of them) will surely not give your new Drupal 8 site any significant jumpstart in rankings!
     

6. About the Migration Module Included in Drupal 8 Core

Using this dedicated module in Drupal core to migrate content to Drupal 8 comes down to implementing the:

Extract-Transform-Load process

Or simply: ETL.

In Drupal — as related to the Drupal migrate module — these 3 operations come under different names:
 

  • the source plugin stands for “extract”
  • the process plugin stands for “transform”
  • the destination plugin stands for “load”
     

7. Time to... Migrate Content to Drupal 8 Now!

Now it's time to put some order into that “pile” of content of yours! To neatly structure Google Sheets, XML files, CSV files etc.

And here's the whole “structuring process” summed up to the 3 above-mentioned plugins: source, process and destination.
 

Source: 

  • XML file
  • SQL database
  • Google Sheet
  • CSV file
  • JSON file
     

Process:

  • iterator
  • default_value
  • migration_lookup
  • concat
  • get 


Destination:

  • images
  • users
  • paragraphs
  • nodes
  • files
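Put together, the three plugin types map onto a single migration definition in YAML. A minimal sketch for a CSV-driven article import (the file path, column names and exact source keys are assumptions; they depend on your data and on your Migrate Source CSV version):

```yaml
id: demo_articles
label: Import articles from a CSV file
source:
  plugin: csv
  path: /tmp/articles.csv
  header_row_count: 1
  keys:
    - id
process:
  title: title
  body: body
destination:
  plugin: 'entity:node'
  default_bundle: article
```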

And here's a specific example of how to “glue” data for a neater and ideally structured content architecture:
 

Before the migration:

  • column A (First Name): Kevin
  • column B (Last Name): Thomson
  • column C (Department): Commerce

After the migration:

  • column A (Name): Kevin Thomson
  • column B (Department): Commerce
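This “gluing” is done with the concat process plugin from Drupal core. A sketch, assuming the source columns are exposed as first_name and last_name (hypothetical names):

```yaml
process:
  # Merge columns A and B into a single "Name" value, separated by a space.
  field_name:
    plugin: concat
    source:
      - first_name
      - last_name
    delimiter: ' '
```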
     

8. 4 Contrib Modules to Incorporate Into Your Migration Workflow

As already mentioned, migrating content to Drupal 8 also involves using a combination of contrib modules.

Speaking of which, allow me to get them listed here:
 

  1. Migrate Tools          
  2. Migrate Source CSV
  3. Migrate Spreadsheet 
  4. Migrate Plus 
                 

The END! This is the tutorial on how to migrate content to Drupal 8 trimmed down to its bare essentials.

To its core phases, key steps to take, main Drupal concepts to “juggle with”, right approach/mindset to adopt and best tools/modules to leverage for a smooth process!

Any questions?

Apr 06 2018

With popularity comes trouble... In this case here meaning: security vulnerabilities and risky over-exposure to cyber threats. And this can only mean that securing your website, that's running on the currently third most popular CMS in the world, calls for a set of Drupal security best practices for you to adopt.

And to stick to!

There's no other way around it: a set of strategically chosen security measures, backed by a prevention-focused mindset, pave the shortest path to top security.   

Rest assured: I've selected not just THE most effective best practices for you to consider adopting, but the easiest to implement ones, as well.

Quick note: before I go knee-deep into this Drupal security checklist, I feel like highlighting that:
 

  • Drupal still has a low vulnerability percentage rate compared to its market share
  • the majority of Drupal's vulnerabilities (46%) are generated by cross-site scripting (XSS)
     

And now, here are the tips, techniques and resources for you to tap into and harden your Drupal site's security shield with.
 

1. The Proper Configuration Is Required to Secure Your Drupal Database 

Consider enforcing some security measures at your Drupal database level, as well.

It won't take you more than a few minutes and the security dangers that you'll be safeguarding it from are massive.

Here are some basic, yet effective measures you could implement:
 

  • go for a different table prefix; this will make it trickier for an intruder to track your tables down, thus preventing possible SQL injection attacks
  • change your database's name to a less obvious, harder-to-guess one
     

Note: for changing your table prefix you can either navigate to phpMyAdmin, if you already have your Drupal site installed, or do it right on the setup screen (if it's just now that you're installing your website).
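If you'd rather edit the configuration by hand, both measures come down to a few lines in your settings.php (all the values below are placeholders, to be replaced with your own):

```php
$databases['default']['default'] = [
  'database' => 'd8_main_a1b2',   // a less obvious database name
  'username' => 'db_user',
  'password' => '********',
  'host' => 'localhost',
  'driver' => 'mysql',
  'prefix' => 'x7q_',             // non-default table prefix
];
```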
 

2. Always Run The Latest Version of Drupal on Your Website

And this is the least you could do; underestimating its importance, and neglecting your update routine, can have a significant negative impact on your Drupal site.

Do keep in mind that:
 

  1. it's older versions of Drupal that hackers usually target (since they're more vulnerable)
  2. the regularly released updates are precisely those bug fixes and new security hardening features that are crucial for patching your site's vulnerabilities.
     

Why should you leave it recklessly exposed? Running on an outdated Drupal version, packed with untrusted Drupal modules and themes?

Especially since keeping it up to date means nothing more than integrating 2 basic Drupal security best practices into your site securing “routine”:
 

  1. always download your themes and modules from the Drupal repository (or well-known companies)
  2. regularly check if there are any new updates for you to install: “Reports” → “Available Updates”→“Check manually” 
     

Drupal Security Best Practices: run the latest version of Drupal
 

3. Make a Habit of Backing Up Your Website

And here's another one of those underrated and too often neglected Drupal security best practices!

Why should you wait for a ransomware attack and realize its true importance... “the hard way”?

Instead, make a habit of regularly backing up your website since, as already mentioned:

There's no such thing as perfection when it comes to securing a Drupal site, there's only a hierarchy of different “security levels” that you can activate on your site

And backing up your site, constantly, sure stands for one of the most effective measures you could apply for hardening your Drupal website.

Now, here's how you do it:
 

  1. make use of Pantheon's “one-click backup” functionality
  2. test your updates locally using MAMP or XAMPP or another “kindred” software
  3. harness the Backup and Migrate module's power, currently available only for Drupal 7
  4. export your MySQL database and back up your files “the old way”... manually
     

There, now you can stay assured that, if/when trouble strikes, you always have your backup(s) to retrieve your data from and get back “on your feet” in no time!
 

4. Block Those Bots That You're Unwillingly Sharing Your Bandwidth With

No need to get all “altruist” when it comes to your bandwidth!

And share it with all kinds of scrapers, bad bots and crawlers.

Instead, consider blocking their access to your bandwidth right from your server.

Here's how:

Add the following code to your .htaccess file and block multiple user agents at once:

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^.*(agent1|Wget|Catall Spider).*$ [NC]
RewriteRule .* - [F,L]

Or use the BrowserMatchNoCase directive as follows:

BrowserMatchNoCase "agent1" bots
BrowserMatchNoCase "Wget" bots
BrowserMatchNoCase "Catall Spider" bots

Order Allow,Deny
Allow from ALL
Deny from env=bots

Or use a CDN-level feature, such as KeyCDN's bot blocking, for preventing those malicious bots from stealing your bandwidth!



5. Use Strong Passwords Only: One of the Easiest to Implement Drupal Security Best Practices

More often than not “easy” doesn't mean “less efficient”. 

And in this particular case here, simply opting for a strong username (smarter than the standard “admin”) and password can make the difference between a vulnerable and a hard-to-hack Drupal site.

For this, just:

Manually change your credentials right from your admin dashboard (“People” → “Edit” → “Username”), while relying on a strong password-generating program (KeePassX or KeePass).
 

6. Use an SSL Certificate: Secure All Sensitive Data and Login Credentials

Would you knowingly risk your users' sensitive data? Their card information, let's say, if it's an e-commerce Drupal site that you own?

And how about your login credentials?

For this is what you'd be doing if — though you do recognize the importance of using an SSL certificate —  you'd still put this measure at the back of your list of Drupal security best practices.

In other words, by running your site on HTTPS (preferably on HTTP/2, considering all the performance benefits that it comes packaged with) you'll be:
 

  • encrypting all sensitive data that's being passed on, back and forth, between the server and the client
  • encrypting login credentials, instead of just letting them get sent, in crystal-clear text, over the internet.
     

7. Use Drupal Security Modules to Harden Your Site's Shield

For they sure make your most reliable allies when it comes to tracking down loopholes in your site's code or preventing brutal cyber attacks.

From:
 

  • scanning for vulnerabilities
  • monitoring DNS changes
  • blocking malicious networks
  • identifying the files where changes have been applied
     

… and so on, these Drupal modules will be “in charge” of every single aspect of your site's security strategy.

And supercharging your site with some of the most powerful Drupal security modules is, again, the easiest, yet most effective measure you could possibly enforce.

Now speaking of these powerful modules, here's a short selection of the “must-have” ones:
 

  • Password Policy: enables you to enforce certain rules when it comes to setting up new passwords (you even get to define the frequency of password changes)
  • Coder: runs in-depth checks, setting your code against Drupal's best practices and coding standards
  • Automated Logout: as an admin, you get to define the time limit for a user's session; he/she will get automatically logged out when time expires
  • SpamSpan Filter: enables you to obfuscate email addresses, thus preventing spambots from “stealing” them
  • Login Security: denies access by IP address and limits the number of login attempts
  • Content Access: grant permission to certain content types by user roles and authors
  • Hacked!: provides an easy way for you to check whether any new changes have been applied to Drupal core/themes
  • Security Review Module: it will check your website for those easy-to-make mistakes that could easily turn into security vulnerabilities; here's a preview of this module “at work”
     

Drupal Security Best Practices: the Drupal Security Review Module
 

8. Implement HTTP Security Headers

Another one of those too-easy-to-implement, yet highly effective Drupal security best practices to add to your Drupal security checklist:

Implementing (and updating) HTTP security headers

“Why bother?”

Because:
 

  1. first of all, their implementation requires nothing more than a configuration change at the web server level
  2. their key role is letting the browsers know just how to handle your site's content
  3. … thus reducing the risk of security vulnerabilities and brute force attacks
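For an Apache server, that configuration change can be as small as a few mod_headers directives in your virtual host or .htaccess file (the values below are a common baseline; tune them to your own policy):

```apache
<IfModule mod_headers.c>
  Header always set X-Frame-Options "SAMEORIGIN"
  Header always set X-Content-Type-Options "nosniff"
  Header always set X-XSS-Protection "1; mode=block"
  Header always set Strict-Transport-Security "max-age=31536000"
</IfModule>
```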
     

9. Properly Secure File Permissions

Ensure that your file permissions for:
 

  • opening
  • reading
  • modifying them
     

… aren't too dangerously loose.

Since such negligence could easily turn into an invitation for “evil-minded” intruders! 

And it's on Drupal.org's dedicated page that you can find more valuable info on this apparently insignificant, yet extremely effective security measure.
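As a starting point, here's the commonly recommended baseline (755 for directories, 644 for files, read-only settings.php), demonstrated on a scratch directory; in practice you'd point DOCROOT at your actual Drupal root:

```shell
# Demo on a scratch directory, NOT a real site.
DOCROOT=/tmp/drupal-perms-demo
mkdir -p "$DOCROOT/sites/default/files"
touch "$DOCROOT/index.php" "$DOCROOT/sites/default/settings.php"

# Directories: owner rwx, everyone else rx. Files: owner rw, everyone else r.
find "$DOCROOT" -type d -exec chmod 755 {} \;
find "$DOCROOT" -type f -exec chmod 644 {} \;

# settings.php holds your database credentials: make it read-only.
chmod 444 "$DOCROOT/sites/default/settings.php"
```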
 

10. Restrict Access To Critical Files 

Told you this was going to be a list of exclusively easy-to-implement Drupal security best practices.

Blocking access to sensitive files on your website (the upgrade.php file, the install.php file, the authorize.php file etc.) won't take you more than a few minutes.

But the danger you'd avoid — a malicious intruder gaining access to core files on your Drupal site — is way too significant to overlook.
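On Apache, one way to do it is a FilesMatch rule in your .htaccess, using the same Order/Deny syntax as the bot-blocking snippet earlier (extend the file list as needed):

```apache
# Deny direct access to sensitive Drupal files.
<FilesMatch "^(authorize|install|update|upgrade)\.php$">
  Order deny,allow
  Deny from all
</FilesMatch>
```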
 

END of list! These are probably the easiest steps to take for securing your Drupal site.

What does your own list of Drupal security tips, techniques and resources to tap into look like?

Mar 16 2018

And I'm back, as promised, with 5 more key differences meant to help you solve your Apache Solr vs Elasticsearch dilemma.

To help you properly evaluate the 2 open source search engines and, therefore, to identify the perfect fit for your own use case and your project's particular needs.
 

6. Node Discovery

Another aspect that clearly differentiates the 2 search engines is the way they handle node discovery. That is, whenever a new node joins the cluster or something goes wrong with one of them, immediate measures, following certain criteria, need to be taken.

The 2 technologies handle this node-discovery challenge differently:
 

  1. Apache Solr uses Apache ZooKeeper — already a “veteran”, with plenty of projects in its “portfolio” — requiring external ZooKeeper instances (minimum 3 for a fault-tolerant SolrCloud cluster)
  2. Elasticsearch relies on Zen discovery for this, requiring at least 3 dedicated master-eligible nodes to properly carry out its discovery “mission”

 
7. Apache Solr vs Elasticsearch: Machine Learning

Machine learning has a way too powerful influence on the technological landscape these days not to take it into consideration in our Apache Solr vs Elasticsearch comparison here.

So, how do these 2 open source search engines support and leverage machine learning algorithms?
 

  1. Apache Solr, for instance, comes with a dedicated contrib module built on top of its streaming aggregations framework; this makes it easy for you to use machine-learning ranking models right on top of Solr
  2. Elasticsearch comes with its own X-Pack commercial plugin, along with a plugin for Kibana (supporting machine learning algorithms) geared at detecting anomalies and outliers in time series data
     

8. Full-Text Search Features 

In any Apache Solr vs Elasticsearch comparison, the first one's richness in full-text search related features is just... striking!

Its codebase's simply “overcrowded” with text-focused features, such as:
 

  • the functionality to correct user spelling mistakes
  • a rich collection of query parsers
  • configurable, extensive highlighting support
     

Even so, Elasticsearch “strikes back” with its own dedicated suggesters API. What this feature does is hide implementation details from the user, so that suggestions can be added far more easily.

And, we can't leave out its highlighting functionality (both search engines rely on Lucene for this), which is less configurable than in Apache Solr.
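To give you an idea, a term suggester request looks like this (the articles index and title field are hypothetical):

```json
POST /articles/_search
{
  "suggest": {
    "title_suggestions": {
      "text": "drupl",
      "term": { "field": "title" }
    }
  }
}
```

Elasticsearch would come back with spelling-corrected suggestions drawn from the terms already indexed in that field, without you having to configure any spellcheck components yourself.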
 

9. Indexing & Searching: Text Searching vs Filtering & Grouping

As already mentioned in this post, any Apache Solr vs Elasticsearch debate is a:

Text-search oriented approach vs Filtering and grouping analytical queries type of contrast.

Therefore, the 2 technologies are built, from the ground up, so that they approach different, specific use cases:
 

  1. Solr is geared at text search
  2. Elasticsearch is a far better fit for those apps where analytical queries and complex search-time aggregations need to be handled
     

Moreover, each one comes with its own “toolbox” of tokenizers and analyzers for tackling text, for breaking it down into several terms/tokens to be indexed.

Speaking of which (indexing), I should also point out that the two search engine “giants” handle it differently:
 

  1. Apache Solr has the single-shard join index “rule”; one that gets replicated across all nodes (to search inter-document relationships)
  2. Elasticsearch seems to be playing its “efficiency card” better, since it enables you to retrieve such documents using has_child and has_parent queries
     

10. Shard Placement: Static by Nature vs Dynamic By Definition

Shard placement: the last test that our two contestants here need to pass, so you can have your final answer to your “Apache Solr vs Elasticsearch” dilemma.

In this respect, Apache Solr is static, at least far more static than Elasticsearch. It calls for manual work for migrating shards whenever a Solr node joins or leaves the cluster. 

Nothing impossible, simply less convenient and slightly more cumbersome for you:
 

  • you'll need to create a replica 
  • wait till it synchronizes the data
  • remove the “outdated” node
     

Luckily for you, Elasticsearch is not just “more”, but “highly” dynamic and, therefore, far more independent.

It's capable of moving shards and indices around, while you're granted total control over shard placement:

  • by using awareness tags, you get to control where those shards should/shouldn't be placed
  • by using an API call you can guide Elasticsearch into moving shards around on demand
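A sketch of both mechanisms (the node names, the articles index and the rack attribute are all hypothetical):

```
# 1. Awareness tags: label each node in elasticsearch.yml, then have the
#    cluster spread shard copies across the values of that attribute:
#      node.attr.rack: rack_one
#      cluster.routing.allocation.awareness.attributes: rack

# 2. On-demand moves, via the cluster reroute API:
curl -X POST 'localhost:9200/_cluster/reroute' \
  -H 'Content-Type: application/json' \
  -d '{"commands":[{"move":{"index":"articles","shard":0,"from_node":"node-1","to_node":"node-2"}}]}'
```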

The END! Now, if you come to think about it, my 10-point comparative overview here could be summed up in 2 key ideas worth remembering:
 

  1. go for Apache Solr if it's a standard text-search focused app that you're planning to build; if you already have hands-on experience working with it and you're particularly drawn to the open-source philosophy
  2. go for Elasticsearch if it's a modern, real-time search application that you have in mind; one perfectly “equipped” to handle analytical queries. If your scenario calls for a distributed/cloud environment (since Elastic is built with out-of-the-ordinary scalability in mind)
     
Mar 16 2018

Apache Solr vs Elasticsearch, the 2 leading open-source search engines... What are the main differences between these technologies?

Which one's faster? And which one's more scalable? How about ease-of-use?

Which one should you choose? Which search engine's the perfect fit for your own:
 

  1. use case
  2. specific needs
  3. particular expectations?
     

Obviously, there's no universally applicable answer. Yet, there are certain parameters to use when evaluating these 2 technologies. 

And this is precisely what we've come up with: a list of 10 key criteria to evaluate the two search engines by, revealing both their main strengths and most discouraging weaknesses.

So you can compare, weigh pros and cons and... draw your own conclusions.
 

But First, A Few Words About The Two “Contestants”

I find it only natural to start any Apache Solr vs Elasticsearch comparison by briefly shedding some light on their common origins:

Both open source search engine “giants” are built on the Apache Lucene platform. And this is precisely why they share a significant number of similar functionalities.
 

Apache Solr

Already a mature and versatile technology, with a broad user community (including some heavyweight names: Netflix, Amazon CloudSearch, Instagram), Apache Solr is an open source search platform built on Lucene, a Java library.

And no wonder these internet giants have chosen Solr. Its capabilities for indexing and searching multiple sites are complemented by a full set of other powerful features:
 

  • dynamic clustering
  • faceted search 
  • NoSQL features & rich document handling
  • full-text search 
  • real-time indexing
     

Elasticsearch 

It's a (younger) distributed, open source, RESTful search engine built on top of the Apache Lucene library.

Practically, it emerged as a solution to Solr's limitations in meeting those scalability requirements specific to modern cloud environments. Moreover, it's a:
 

  • multitenant-capable
  • distributed
  • full-text


...  search engine, one that “spoils” its users with schema-free JSON documents and HTTP web interfaces.

And here's how Elasticsearch works:

It includes multiple indices that can be easily divided into shards, each of which can have its own replicas.

Each Elasticsearch node can have multiple shards (or just a single one), and the search engine is the one “in charge” of routing operations to the right shards.

Now, if I am to highlight some of its power features:
 

  • analytical search 
  • multi-tenancy
  • grouping & aggregation 
  • distributed search 
     

1. User and Developer Communities: Truly Open-Source vs Technically Open-Source

A contrast that we could define as:

“Community over code” philosophy vs Open codebase that anyone can contribute to, but that only “certified” committers can actually apply changes to.

And by “certified” I do mean Elasticsearch employees only.

So, you get the picture:

If it's a fully open-source technology that you're looking for, Apache Solr is the one. Its robust community of contributors and committers, coming from different well-known companies and its large user base make the best proof.

It provides a healthy project pipeline and everyone can contribute, so there's no single company claiming a monopoly over its codebase, deciding which changes make it in and which don't.

Elasticsearch, on the other hand, is a single commercial entity-backed technology. Its code is right there, open and available to everyone on Github, and anyone can submit pull requests.

And yet: it's only Elasticsearch employees who can actually commit new code to Elastic.
 

2. What Specific Use Cases Do They Address?

As you can just guess it yourself:

There's a better or worse fit, in any Apache Solr vs Elasticsearch debate, depending exclusively on your use case.

So, let's see first what use cases are more appropriate for Apache Solr:
 

  • applications relying greatly on text-search functionality
  • complex scenarios with entire ecosystems of apps (microservices) using multiple search indexes, processing a heavy load of search-request operations
     

And now some (modern) use cases that call for Elasticsearch:
 

  • applications relying (besides the standard text-search functionality) on complex search-time aggregations, too
  • open-source log management use cases with many organizations indexing their logs in Elasticsearch in order to make them more searchable
  • use cases depending on high(er) query rates
  • data stores “supercharged” with capabilities for handling analytical type of queries (besides text searching)

… and pretty much any new project that you need to jump right onto, since Elasticsearch is much easier to get started with. You get to set up a cluster in no time.
 

3. Apache Solr vs Elasticsearch: Which One's Best in Terms of Performance?

And a performance benchmark must be on top of your list when doing an Apache Solr vs Elasticsearch comparison, right?

Well, the truth is that, performance-wise, the two search engines are comparable. And this is mostly because they're both built on Lucene.

In short: there are specific use cases where one “scores” a better performance than the other.

Now, if you're interested in search speed, in terms of performance, you should know that:
 

  1. Solr scores best when handling static data (thanks to its capability to use an uninverted reader for sorting and faceting, and thanks to its caches, as well)
  2. Elasticsearch, being “dynamic by nature”, performs better when used in... dynamic environments, such as log analysis use cases
     

4. Installation and Configuration

Elasticsearch is a clear winner at this test:

It's considerably easier to install, suitable even for a newbie, and lighter, too.

And yet (for there is a “yet”), this ease of deployment and use can easily turn against it/you. Particularly when the Elasticsearch cluster is not managed well.

For instance, if you need to add comments to every single configuration inside the file, then the JSON-based configuration, otherwise a surprisingly simple one, can turn into a problem.

In short, what you should keep in mind here is that:
 

  1. Elasticsearch makes the best option if you're already using JSON
  2. if not, then Apache Solr would make a better choice, thanks to its well-documented solrconfig.xml and schema.xml 
     

5. Which One Scales Better?

And Elasticsearch wins this Apache Solr vs Elasticsearch test, too.

As already mentioned here, it has been developed precisely as an answer to some of Apache Solr's well-known scalability shortcomings.

It's true, though, that Apache Solr comes with SolrCloud, yet its younger “rival”:
 

  • comes with better built-in scalability
  • is designed, from the ground up, with cloud environments in mind
     

And so, Elasticsearch can be scaled to accommodate very large clusters considerably more easily than Apache Solr. This is what makes it a far better fit for cloud and distributed environments.

And this is the END of PART 1. Stay tuned for I have 5 more key aspects “in store” for you, 5 more “criteria” to consider when running an Apache Solr vs Elasticsearch comparison!

Still a bit curious: judging by these 5 first key features only, which search engine do you think that suits your project best?

Mar 02 2018

The Earth is round, buttered toast always falls butter-side down, and clearing the cache is every Drupal developer's best practice: these are all globally accepted truths! And speaking of the latter, when you discover that the familiar Drush clear cache technique is no longer universally valid, you wonder: why the change?

Why go from clear-cache, to... actually rebuilding cache, starting with Drupal 8?

What's the catch?

This new way to clear Drupal cache must be stemming from a certain limitation that earlier versions of Drupal presented:

Partially completed cache-clearing operations threatening to grow into fatal errors.

And now, let's dig into more details on:
 

  • clear Drupal cache: why & when
  • the 4 methods for clearing your cache in Drupal
  • Drush clear cache vs rebuilding cache: differences, the initiative behind this change, main benefits to expect 
     

So, shall we proceed?
 

Clearing Your Drupal Cache: Why Bother? And When?

First of all, here's the “motivation” that drives Drupal to create a cache in the first place:

Each time a Drupal site has to render a certain web page, it is “forced” to perform specific database queries; and since all these queries have a negative impact on the overall page loading time, Drupal “decides” to store these web pages, once rendered, in a cache for later (streamlined) reference.

OK, now that we've settled this whole “cause and effect” process, let's see why and when you should clear cache on your Drupal site:
 

  1. when you're troubleshooting problems on your website; clear Drupal cache before you undertake any debugging, since this might just confirm to you that the “alerting issue” was nothing but a bad cache entry
  2. whenever you want Drupal to quickly pick up all the updates you've performed via the UI and all the changes you've applied to your code
  3. when you're moving your website to a new host
  4. when you're installing a new theme or module on your Drupal site; just another scenario when Drush clear cache should be the very first step to take while you're troubleshooting
     

In a few words: clearing your cache might just be one of the most frequent actions you'll take while working on (or simply maintaining) a Drupal site. 

And in many cases, the one that will “save the day”, without the need to apply other more complex techniques from your “arsenal”.
 

4 Different Methods to Clear Drupal's Cache 

There are several ways for you to clear your Drupal site's cache. Just go with the one that best suits your work style:
 

1. The Easy Way: Clear the Drupal Cache From the User Interface 

By far the handiest (and some might say “the least efficient”, too) method to clear Drupal cache is via the UI:
 

  1. just go to Administration > Configuration > Development > Performance 
  2. and hit the “Clear all caches” button
     

It won't be long till Drupal displays the “Caches cleared” message for you! And that's it!


2. Drush Clear Cache (Drupal 7) or Drush Cache-Rebuild (Drupal 8)

And now, the second method in your “arsenal”: the clear Drupal cache command line one!

A two-way method, better said, which depends greatly on the version of Drupal on your website: 7 or 8?

In this respect, here's the “magic command” to use for clearing your Drupal 7's cache:

drush cache-clear all 

or

drush cc all

Whereas in Drupal 8, this is the Drush command for tackling your cache:

drush cache-rebuild

or, alternatively, these 2 aliased commands:

drush rebuild or drush cr

And here I'm sure you can already tell which are the specific steps to take for handling your cache in Drupal 8 using Drush (still the most convenient way to do it):
 

  1. first of all, you open a Terminal window and cd into your Drupal 8 website's root (a step you can skip if you're using Drush aliases)
  2. next, you run your “magic formula”, your Drush command (“drush cache-rebuild” or “drush cr”), and wait for it to complete its task before going back to your website
  3. and finally, you just reload the page you were on, in your web browser
     

3. Run the /core/rebuild.php file for Clearing Your Drupal 8 Site's Cache 

Among all the improvements that Drupal 8 “lures” us in with (built-in WYSIWYG, a Twig templating system and so on), there's the /core/rebuild.php file standing out!

And “promising” us to streamline our frequent (and time-consuming) cache tackling tasks that we need to carry out during development:

The Drupal 8 site in question doesn't even have to be working and the whole process doesn't require Drupal Console or Drush either!

How about that?

The one and only requirement (for there still is one) is that your site's configuration supports it. 

And how can you check whether your site's config accepts this functionality? Well, there are 2 methods at your disposal:
 

  1. in case you're working locally, just ensure that $settings['rebuild_access'] = TRUE; in your settings.php (settings.local.php)
  2. or run this script in your command line: /core/scripts/rebuild_token_calculator.sh; then just use the results there as query parameters for /core/rebuild.php (https://goo.gl/qTrJ9d)
     

And voila! This “trick” will rebuild all cache without even requiring the Drupal 8 site itself to be working during the whole process! 

Which makes it the perfect “plan B”, whenever you don't have Drupal Console or Drush installed where you're working!

The only condition is that your website's configuration supports this functionality!
 

4. In the Database: Truncate all Tables Starting With “cache_”

Spoiler alert: by “truncate” I do mean emptying, not removing!

The fourth method to clear Drupal cache involves clearing all the data from the cache-related tables in your database. Meaning all the tables starting with “cache_”.

For this, you just go over to your phpMyAdmin, select all the cache_* tables and then click “Truncate” in the “with selected” drop-down menu at the bottom of the page:

TRUNCATE cache_config;
TRUNCATE cache_container;
TRUNCATE cache_data;
TRUNCATE cache_default;
TRUNCATE cache_discovery;
TRUNCATE cache_dynamic_page_cache;
TRUNCATE cache_entity;
TRUNCATE cache_menu;
TRUNCATE cache_render;
TRUNCATE cache_toolbar;

As for the command line, feel free to scan through and then tap into the valuable info that you'll find here: https://goo.gl/1b4otB. And here's another practical example:

Let's say it's Sequel Pro — an SQL GUI app — that you're using. For truncating those specific tables, connect to the server, track down your site's database, have those specific “cache_” tables highlighted and just choose “Truncate tables”, from the drop-down menu!

Also, in the above-mentioned “scenario”, you could alternatively go to your phpMyAdmin's SQL command field or MySQL CLI and run the above-mentioned commands.
 

From Drush Clear Cache to Cache Rebuilding in Drupal 8: Why the Change?

Here's the challenge that the conventional Drush clear cache (or “drush cc all”) used to make us deal with:

Drupal uses caching intensively and therefore implicitly creates lots of inter-dependencies. Only partially flushing this heavy load of caches used to pose some major risks for any website.

This is where the “cache-rebuild” method stepped in, starting with Drupal 8!

It practically rebuilds (re-bootstraps) the whole Drupal site, after making sure that all cache is perfectly cleared. A “check and double check” technique, you may call it, which makes sure that:
 

  1. your site is up and running
  2. all cache gets flawlessly flushed!
     

Drupal 7's highly popular Drush cache-clear command itself gets retired and replaced with “cache-rebuild” in Drupal 8.

Which (the Drush cache-rebuild command specific to Drupal 8) carries out the following tasks:
 

  1. clearing the APC cache 
  2. bootstrapping Drupal
  3. calling drupal_rebuild() 
  4. removing the Drush cache 
     

Wrap-Up

Summing it up now, the essential info to remember is that:
 

  • “clear cache” should be on top of your “best practices” list as a Drupal developer 
  • you have not just 1, but 4 methods to choose from, depending on your work style and context: via the UI, clear cache using Drush, by truncating your “cache_” database tables, by running the /core/rebuild.php file
  • Drupal 8's cache-rebuild is a step forward from the conventional cache-clear practice; it adds a new “re-bootstrapping” operation to the “cache clearing” process!
Feb 23 2018

When to use REST? What are some practical use cases of REST web services? How does it work? What's the “catch”, why has this new architecture for web services had such an impact on the industry? How is it any different/better than SOAP? Why use RESTful web services after all?

“Tormented” by all these questions related to the REST approach/alternative to building web services?

Relax now, you'll have your answers in a minute (or a few seconds)!

For here are the REST-related “enigmas” that I commit myself to solving in today's post:
 

  • What is REST and how does it work?
  • Which are the specific use cases for building web services using the REST architecture?
  • What's driving it? Why is this technology increasingly popular?
  • What sets REST apart from the traditional SOAP approach to web services?
  • When NOT to use RESTful web services?
     

And now... the answers that I promised you:
 

What Is REST and How Does It Work?

Here are some valid answers to all your “What?” questions: “What is REST?”, “What are web services?”, “What are RESTful web services?”
 

  • REST is the native API of web browsers
  • REST is how one creates web services
  • web services are... the future of everything: creating and publishing APIs that would do CRUD (create, read, update and delete)
  • … thus making machine-to-machine communication possible, making apps' functionality accessible to multiple users and external systems
  • RESTful web services are those resources on the web that can be tapped into for retrieving certain information
     

“And how does it work?”

First of all, we should get one thing straight: REST is not an official standard! It's more of an architectural style, one that organizes a network of systems, where the “systems” are basically servers and clients.

Here's how it works:

Clients make a request to the web servers; the latter process it and, in response, return the appropriate web pages. And in this request-and-response equation, RESTful web services are the very resources on the web that servers tap into for retrieving the requested data.
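
The request-and-response cycle above can be sketched in miniature. Here's a hypothetical, minimal route handler (the resource paths and data are invented for illustration, not any particular framework's API):

```javascript
// A toy "server side": resources exposed on the web, addressed by path.
const resources = {
  '/articles/1': { id: 1, title: 'Hello REST' },
};

// Map an incoming client request (method + path) to a response.
function handleRequest(method, path) {
  if (method === 'GET' && resources[path]) {
    return { status: 200, body: resources[path] };
  }
  // No matching resource: the server answers with a 404.
  return { status: 404, body: { error: 'Not found' } };
}

console.log(handleRequest('GET', '/articles/1'));
```

A client asking for GET /articles/1 gets the article back as structured data; asking for anything the server doesn't expose gets a 404. That, in a nutshell, is the contract.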

Does this definition shed any light on your RESTful web services-related questions? 
 

Why Use RESTful Web Services?

Here's the actual context where the RESTful web services technology emerged and “grew like a beanstalk”, with a huge impact on the industry:

The web “exploded” and, starting with web 2.0, the interaction between websites and client apps, between multiple sites, multiple devices, sites and databases, became increasingly intense. And more and more “demanding”, calling for a new technology that could handle and streamline this communication taking place on the web.

And here's where web services and REST, a new way of building them, emerged!

The REST architecture is designed to build:
 

  • maintainable
  • lightweight
  • scalable 
     

… web services, which make retrieving the data requested and “exposing” all that information far less cumbersome.

As compared to the conventional SOAP/XML-RPC web page-scraping method.

Data's being passed on the web, from one website/app/device/database to another, faster than ever these days. Just think about all those websites incorporating Twitter and Facebook content!

Or of websites “capturing” data coming from multiple sources: financial information, sales data, online communities...

RESTful web services is the technology that streamlines all these intense data “harvesting” processes!

This is the answer to your “Why use RESTful web services?” question.
 

When Should You Use RESTful Web Services? 5 Practical Use Cases

There are specific use cases when you should go “the RESTful way”.

Adopt this approach to building web services if:
 

1. In your distributed app it's crucial to keep the coupling between client and server components to a minimum:
 

  • you'll need to be able to update your server frequently, without having to update the client software, as well
  • your server's going to be used by multiple clients, yet you don't want them to have control over it
     

Just make sure you follow all the REST constraints for achieving this kind of basic level of coupling. Maintaining a purely stateless connection will be challenging, but not impossible if you “follow the rules”.
 

2. It's a custom, on-demand digital product that you're developing

Such as an Ubercart or Drupal online store that you're putting together on a remote cloud server:
 

  • you set it up
  • create a suitable architecture that would scale the environment if/when your custom product goes viral
     

3. You want your game's high scores and user community forums to be displayed both in-game and on the web

Let's say that you're a mobile/console game developer facing the above-mentioned “challenge”. 

In your practical use case you can:
 

  1. have your Drupal site publish an API, using Services (thus doing “CRUD” with the data that needs to be “harvested”)
  2. leverage a RESTful type of communication with the Drupal site in order to retrieve that data and have it displayed in-game, on mobile/console, too
     

4. You want to create a user alert system on your e-commerce website

One that would alert your customers, via your e-commerce mobile app, whenever a product that they viewed becomes available (or its price drops).

Also, you want those alerts to pop up in an iPhone app and on Facebook, too.

And the solution is:

Your Drupal site (for yes, it's a Drupal site that you own in this scenario) will use Services & a custom module to publish the example.com/alerts/uid API. And it's this specific API that the iPhone app and Facebook will use to pull the content shown in the user “alert” message.
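
To picture what such an alerts endpoint might serve, here's a hedged sketch. The payload shape and field names below are invented for illustration; they are not the Services module's actual output:

```javascript
// Build the alert list that a hypothetical example.com/alerts/uid
// endpoint would return for one user, given the products they watch.
function buildAlerts(uid, watchedProducts) {
  return watchedProducts
    .filter(p => p.backInStock || p.priceDropped)
    .map(p => ({
      uid,
      product: p.name,
      reason: p.backInStock ? 'back in stock' : 'price drop',
    }));
}

const alerts = buildAlerts(42, [
  { name: 'Camera', backInStock: true, priceDropped: false },
  { name: 'Tripod', backInStock: false, priceDropped: false },
]);
console.log(alerts);
```

The iPhone app and the Facebook integration would each consume this same JSON, which is exactly the point: one published API, many clients.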
 

5. You want to provide (paid) access to commercially-controlled data

Such as movies, music, stock or trading data.

Imagine that you own an event venue and you publish a ticketing API. People (such as ticket brokers) will be charged for gaining access to it.

In short: RESTful web services can be used for all kinds of commercial activities, as well.

Just use them to create and to publish the API that will do CRUD with precisely that commercially-controlled data that people are willing to pay for gaining access to!
 

What Sets REST Apart from the Traditional SOAP Approach to Web Services?

Or simply put: 

Why use RESTful web services instead of the traditional SOAP-based web services?

Here's a list of strong arguments:
 

  1. with REST, all that intense data interaction is more lightweight; it doesn't weigh as heavily on your web server as a SOAP page-scraping method would
  2. with REST, only the specifically requested information gets retrieved, instead of having whole web pages scraped for the “target” content (like with the SOAP approach)
  3. the architecture is way simpler than the SOAP one and it leverages the standards (and protocols) of modern web
     

And what does this last argument even mean?

It means that heavy SOA (Service Oriented Architecture) is shifting to lightweight WOA (Web Oriented Architecture), since these days apps need to tap into a web that's “packed” with REST resources.

And so, instead of leveraging a few point SOA services, data gets collected and displayed via millions of granular REST resources. Developing arbitrary apps, that interact over the network, has become conveniently easier.

Complex things (systems) get simplified!
 

When not to Use REST Web Services?

There are, as I've just pointed out, use cases when the REST approach is perfectly suitable: business-to-consumer apps.

But there also are specific cases when RESTful web services don't work so well: B2B apps!

Take this practical example here:

A bookstore might find it a bit more challenging to make a volume purchase from an online vendor as compared to a regular customer. 

It would need to “juggle” several apps to track shipments and sales, to determine re-orders etc. Add to that the fact that data from one app might need to be re-entered into other apps, turning the entire process into an overly complex, hard-to-manage one.
 

The END! 

Have I managed to answer your “Why Use RESTful web services?” question or not quite? Or just partially? 

Do be honest and, if it's the case, share your other REST inquiries and dilemmas with me! Or point out those use case examples or explanations presented here that you'd like me to shed some more light on.

Feb 15 2018

Last year's “Should I learn Nodejs?” dilemma has turned into an “I'll strive to become a better Nodejs developer!” resolution this year. Nodejs is already developers' “adored” framework, and building "the next big thing" in terms of Nodejs-backed apps is the new challenge in 2018! And this definitely calls for a shiny new set of coding habits to integrate into your Nodejs app development workflow.

New code practices to stick to when writing Nodejs apps, new coding standards that you should follow and techniques to master for using this framework to its full potential. To your future apps' advantage, of course.

Speaking of which, here's a list of 12 Nodejs development pieces of advice for you to consider if one of your resolutions for 2018 has been: “To become a Nodejs padawan!”
 

1. Start to Learn The Basics of Import and Import()

Think ahead of ES modules' current situation. Which is the following:

  • ES modules have been supported since Node 8.5
  • it's true, though, that they're still wearing their “experimental-modules” flag
  • yet they're already being used, intensively, with the @std/esm library (and transpilers)

Do consider learning the basics and be ready to level up from there. Since it's pretty obvious that 2018 has lots in store for the ES modules.

2. Integrate Asynchronous Programming Into Your Nodejs App Development Workflow

There are 2 ways of carrying out your input/output operations and setting up your Nodejs development environment:

  1. synchronously, having your resources blocked for some time 
  2. asynchronously (Node.js' innovative application of asynchronous programming): where tasks can be carried out simultaneously since the resources don't get blocked 

Now just consider a scenario of multiple operations to be performed, where resources keep getting blocked... This would have a dramatic impact on your Nodejs app's performance!

In other words: embrace the asynchronous code!

Use async-await! Turn it into your own “trump card” for handling async events and embrace the simplified version of the once so overwhelming code bases.
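
Here's a minimal sketch of that async philosophy in action. The two "I/O tasks" below are simulated with timers (the task names are invented), but the shape is the same as real network or file calls:

```javascript
// Simulate an async I/O call: resolves with `value` after `ms` milliseconds.
const delay = (ms, value) =>
  new Promise(resolve => setTimeout(() => resolve(value), ms));

async function fetchBoth() {
  // Both "requests" run concurrently; the event loop is never blocked,
  // and we only wait as long as the slower of the two.
  const [a, b] = await Promise.all([
    delay(50, 'users'),
    delay(30, 'posts'),
  ]);
  return `${a},${b}`;
}

fetchBoth().then(result => console.log(result));
```

Compare that with the synchronous mindset: two sequential blocking calls would cost you the *sum* of their durations, not the maximum.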

3. Modularize Your Code: One of The Very Best Coding Habits to Develop 

Keep it small! Get into the habit of writing “bite-sized” chunks of code replacing the tediously long blocks of code that you might be used to right now.

Here's why:

  1. it will be fairly easier for you to embrace the asynchronous coding philosophy this way
  2. small-sized pieces of code will be easier to handle, adjust and closely monitor both for you and for anyone in your development team 
  3. handling a cluster of bite-sized chunks of code gets particularly convenient when it's a complex Nodejs app development project that you're dealing with
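
The habit above can be sketched in miniature: a hypothetical, single-purpose helper that, in a real project, would live in its own file (say lib/slugify.js) and be required wherever needed:

```javascript
// One bite-sized module, one job: turn a post title into a URL slug.
function slugify(title) {
  return title
    .toLowerCase()
    .trim()
    .replace(/[^a-z0-9]+/g, '-') // collapse any non-alphanumeric run into "-"
    .replace(/^-|-$/g, '');      // strip leading/trailing dashes
}

// Export it so other small modules can compose with it.
module.exports = { slugify };

console.log(slugify('Hello, Node World!'));
```

A teammate can read, test and adjust this in seconds, which is exactly what you lose with tediously long blocks of code.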

4. Master Containerization & Adopt the Microservice Architecture

Since the Nodejs application architecture is a microservices-based one. 

Therefore, one of the best code practices to incorporate into your workflow is using containers. Containerize your Nodejs apps and streamline your services deployment by tapping into these 2 techs this year:
 

Docker:
 

  • the software technology to generate your containers 
  • … which are nothing less than all-in-one pieces of software encapsulating all the resources that they need to run: system tools, code, runtime, system libraries 
  • containers that will increase your deployments' security level
  • and that you even get to use for simulating production environments locally 
     

Kubernetes
 

  • an open-source system that you get to use for automating everything Docker containers-related: scaling, deployment, containerized apps management...
     

Friendly advice: before you jump straight to containerizing your services, take some time to upgrade your existing code; for this, apply the principles included in the 12-factor app methodology.
 

5. Nodejs Application Performance Monitoring: Make It an Ongoing Process

Especially if it's a complex microservice ecosystem that you need to keep a close eye on!

Monitor your system, using the ever-improving toolbox at your disposal, detect and fix errors before they even get to catch your app users' attention. 

Close and on-going monitoring sure is one of the very best Nodejs app development habits that you could develop this year!
 

6. Mind The Bugs in the Code, That You Might Leave Behind

Be alert and avoid those scenarios where you leave trouble-making bugs behind, as you “knit” your web of code. 

And being alert means:
 

  • tracking your error events
  • detecting errors in their early infancy
     

Note: luckily for you, incorporating this practice into your Nodejs app development process is somewhat easier when using this framework. 
 

7. Get Acquainted With HTTP/2

Again: always be one step ahead of the game! And since we can't but anticipate that HTTP/2 will be going from experimental to stable this year in Nodejs, make sure it won't “take you by surprise”.

HTTP/2 brings multiplexing and server push, with a significant impact on native module loading in browsers.

So, there's no doubt about it: it's going to lose the “experimental” flag, that it has been wearing since Nodejs 8.8, and become the new standard with Nodejs this year.
 

8. Use Semantic Versioning: Another Nodejs App Development Habit to Form 

And this practice is what sets apart a Nodejs padawan from a Node.js... enthusiast.

If you've decided to learn Nodejs this year, make sure you delve deep(er) into its set of best practices (don't just scratch the surface): use semantic versioning for letting the users know that you've updated the app.

To inform them about the actions that they should perform in order to update the app to its latest version.

In short: by updating your packages without SemVer you risk breaking your precious app!
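
To see why the major.minor.patch convention matters, here's a tiny, hypothetical comparator (the real `semver` npm package does far more, including ranges and pre-release tags):

```javascript
// Split "major.minor.patch" into numbers.
function parse(version) {
  return version.split('.').map(Number);
}

// Under SemVer, only a major-version bump signals incompatible API changes.
function isBreaking(from, to) {
  return parse(to)[0] > parse(from)[0];
}

console.log(isBreaking('1.4.2', '2.0.0')); // major bump: breaking
console.log(isBreaking('1.4.2', '1.5.0')); // minor bump: backwards-compatible
```

That single convention is what lets your users (and their tooling) decide whether an update is safe to take automatically.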
 

9. Turn Securing Your Nodejs Application Into Your Top Priority

Make sure your Nodejs app is 100% secure above all! Apps' security has been both the vulnerable aspect and the ultimate objective for app developers in 2017.

And 2018 is no different from this standpoint!

Run test after test to “challenge” your Nodejs app's security, tapping into the tools and practices that this framework's ecosystem puts at your disposal:

  • Snyk
  • Data Validation
  • Node Security Platform
  • Brute Force Protection
  • Session Management

If there's a vulnerability there, somewhere, within your app, make sure you track it down before... it gets exploited!
 

10. Adhere to The JavaScript Standard Style for Writing Your Code

Following a set of coding standards will just guarantee you that no big issues will show up later on. In this respect, the JavaScript standard style makes the best choice for your Nodejs app development workflow.

Here's why:

  • you get to “hunt down” style issues and coding errors early in the development process
  • it sets the single space after keywords “rule”
  • it will automate your code's formatting by running standard-fix 
  • it sets the “function name followed by space” standard
  • and the “single quotes for strings” one

11. Put All Your “Require” Statements at the Top 

“Pin” this app development advice right on top of your list of best practices!

It will make all the difference! By grouping all your “require” statements right at the top you'll practically:

steer clear of performance issues, since “require” is synchronous and would simply block the execution (thus avoiding ugly scenarios)

Major tip: use Node's built-in module loading system; it comes with a "require" function which will automatically load the modules existing in separate files.

END of the list! These are the top code practices & new “healthy” habits to get into this year for making the most of this framework.

And thus turn your Nodejs app development projects into the next famous apps built with Nodejs!

Jan 04 2018

To go or not to go serverless... This is one of 2018's most asked questions in the IT industry. And it's true that serverless computing has grown from a niche solution, with a somewhat misleading name, into a tech trend guaranteed by all the industry experts.

Yet, you're still a bit hesitant when it comes to replacing your heavy, yet familiar infrastructure with a serverless framework, right? You feel like “dipping a toe into the water” first.

And who could blame you? It's a more than legitimately prudent approach considering the implications of such a switch. 

You shouldn't make a move, not until you have some unbeatable arguments at hand. Until you gain a deep understanding of all the benefits that derive from adopting a cloud-native approach.

Well, this is precisely what this blog post is all about:

pointing out all the strongest benefits that you will reap from taking the infrastructure management weight off your shoulders and going... serverless.
 

But First: What Is Serverless Architecture More Exactly?

First of all, let's get one thing straight: “serverless computing” doesn't mean, by any means, that there isn't a server, out there somewhere, doing its data workload processing work!

It's just that the user (yes, you!):
 

  1. is no longer burdened with all the server (or fleet of servers) management and monitoring ongoing tasks
  2. doesn't know (or care) where in the world his server is located
     

You'd practically be running your code right in the cloud, taking the need to provision servers on your enterprise's end out of the picture. 

"But how does the data processing work in the cloud?"


A valid question indeed. Basically, you're enabled to set up your individual API endpoints which will fire code and perform certain actions once triggered.

As simple as that.

And if I am to exemplify, probably the best-known example would be the serverless computing AWS: Amazon's AWS Lambda. It has already managed to “seduce” plenty of IT managers in the industry and its popularity is sure to... explode in 2018.

Why? Because it's a serverless computing architecture which:
 

  • scales automatically, granting enterprises the flexibility they need and helping them cut down costs
  • it executes code only when/if certain events occur, when specific endpoints get triggered
     

And it's not as if this serverless framework has no “rivals”. Competition is about to get fierce when other frameworks such as Webtask, Microsoft Azure Functions, Google Cloud Functions, IBM OpenWhisk and Iron.io start to... take off.
 

It's Horizontally Scalable: Relax and Let It Handle Massive Waves of Traffic for You

In other words: you get to worry less about balancing the heavy load, about the waves of traffic that your site/app might face once certain of your endpoints get exposed. And about rushing in to put together the properly robust infrastructure.

The underlying system will handle it for you!
 

You're Free to Set Up Each Endpoint in the Language of Your Choice

Or simply put: your API endpoints will be language-agnostic!

You (or your lucky team of developers) get to write each endpoint in a different language runtime. So, you're free to use the one that you're most familiar with or the one that best fits our work scenario. 

And this is already a major pro for adopting a serverless computing approach in 2018!
 

You Only Pay for What You Use: A Strong Benefit of Serverless Computing

Here's another “irresistible” benefit that you can reap from going serverless: you only pay for what you use!

So, there's no need (not anymore) to pile up on T2 small instances and auto-scaling groups... just in case. The “case” here being: “if I ever hit a surge of traffic”.

When you're using a serverless architecture all this comes without a price tag on!
 

Worry Less About Managing The Underlying Infrastructure: It's Taken Care Of!

Your serverless cloud provider will be managing the entire fleet of servers for you.

And this “management” includes applying security patches as soon as they get released, as well!

So, take all these monitoring and security-related concerns off your back and... focus more on turning great ideas into great digital products!

And this — all the time and effort that you'll get to invest elsewhere — is the main advantage you'll leverage from switching to serverless computing!
 

Turn That Great Idea of Yours Into The Next Big Thing Quicker Than Ever! 

Just think about it: going from having a great idea to actually turning it into the next big... app (let's say) will take you a whole lot less time (and effort).

Serverless computing will smooth the path for you and shorten the process, meaning that:
 

  1. you'll have your idea production-ready a lot quicker
  2. you'll gain more confidence to try on newer technologies, as well
     

Summing Up... Plus a Forecast for 2018

“Serverless IT will move from the niche corners of the cloud estate into the spotlight as it addresses the three key areas that keep IT admins up at night: speed, cost and risk.” (Jason McDonald, President U.S., Contino)

Yet (for there is a “yet”), serverless computing isn't a “one size fits all” type of solution. It won't suit every app architecture or every need (such as the need to closely monitor and control how things get configured).

Nevertheless, if we:
 

  • go beyond its somewhat misleading name
  • see its strong benefits
  • consider it only for those use cases that it's best fitted for
     

... serverless architecture is here to stay and change the way we build software from the ground up!

Nov 23 2017

It's overwhelmingly lengthy, it's discouragingly “crowded”... it's your checklist to follow when choosing the right CMS for your content-heavy website!

And there's no way around it: you need to check them ALL off, all the must-have features and functionalities included there.

For you can't afford to make compromises on security for a boosted performance, for instance. And you sure can't get away with trading high speed for easy authoring, right? Or with accepting anything less than “the very best” editorial experience for the sake of easy-to-customize design, for example.

It should be an all-in-one CMS solution! 

Well, it looks like Drupal is the only platform to fit the profile: it lives up to your legitimately high standards and is capable of meeting your content-packed site's specific needs.

Here's why:
 

1. It's Ideally Flexible & Conveniently Extensible

Dare to dream big, for your Drupal site's content infrastructure is built to grow, seamlessly and almost organically, at the same rate as your future plans!

For any performance, security, content management, or other content-heavy-site- or industry-specific functionality that you might need to add... there is a Drupal contributed module!

… or there is a team of Drupal developers ready to write custom code for you and build your custom-fit Drupal module from scratch!

And here are 2 possible scenarios where you could capitalize on Drupal's impressive flexibility and extensibility:
 

  1. you need to integrate SalesForce with your website: there isn't just one, but several Drupal modules that you can use for injecting this type of functionality into your website
  2. you need to add Apache Solr to your site search for indexing results (a critical integration for any large-scale, content-heavy website): Drupal turns this type of integration into a... breeze
     

Whether it's a blog or a content-packed, high-traffic website that you own or plan to build: Drupal's conveniently extensible to fit any size, any business needs.
 

2. It Provides a Both Flexible and Rich Content Authoring Experience

Here's another strong reason why Drupal's the right CMS for your content-heavy website: it makes content authoring unexpectedly easy!

“Armed” with the WYSIWYG editor — which makes for such an easy-to-use content management and editing interface — with URLs, taxonomy, custom lists and tags, your editorial team gets to:
 

  • craft
  • edit
  • publish
  • perfectly structure
     

… content on your site.

Podcasts, articles, infographics, guides, e-books, case studies... your heavy infrastructure gets ideally easy to manage with Drupal as your site's backbone-CMS.
 

3. It Ships With Impressive Database Accommodation Capabilities

Not only is your Drupal CMS built to seamlessly accommodate your large and growing database, but it ships with organizing and sorting features, as well.

Features/functionalities delivered to you in the form of dedicated modules.

In other words: setting up your customized, ideally structured, perfectly usable library calls for zero custom code writing when using Drupal as your website's CMS!
 

4. It's Open Source, Making It a Perfectly Suited CMS for Your Content-Heavy Website

Drupal's open source nature opens the door to a whole world of possibilities (free of charge) to you!

Just imagine this scenario here:
 

Your content-heavy website has a huge influx of regular visitors and then... all of a sudden... a big nasty bug strikes! And it's just inevitable when we're talking about a content-rich website, with content being added and updated almost daily!
 

What do you do then?

You reach for a patch, digging deep into all the free resources put at your disposal by the Drupal community!

Just think of all the costs that you'll be cutting when building your large-scale project, with so many modules, use-case-specific site elements and features out there for you to just... “grab” and implement.
 

5. It Meets The Highest Government Online Security Standards

High waves of traffic and a robust content infrastructure do come at a cost: the cost of the highest levels of security.

And it's by far the most important point on your checklist to finding the most suitable CMS for your content-heavy website.

Drupal's already built itself a solid reputation as the CMS that powers government and higher education websites.

Need we add more?

If it's powering and safeguarding the White House's website from cyber threats, then it must be built with high security standards in mind, don't you agree?

Add to that that, beyond its robust built-in security features, there's always the large, worldwide Drupal community out there to sound the “alert” if something goes wrong. A community constantly monitoring Drupal's status at a security level.
 

6. It's Highly Customizable in Terms of Design 

How do you design content for heavy websites? The best example in this respect is the Panels module that Drupal puts at your disposal.

Harness its power to create layouts perfectly tailored to each specific use case. 

How? With drag and drop! Put together the custom layout and then just fill it in with its corresponding content.

Hence, you get to personalize each page on your website all while keeping a visual continuity throughout it!
 

The END! Do you find these 6 reasons strong enough for you to start seeing Drupal as the most suitable CMS for your content-heavy website?

What other must-have features (if any) would you add to your checklist?
