One common element that plays a crucial SEO role on today’s ecommerce websites is layered navigation.

Layered navigation (also referred to as faceted navigation, attribute filters, etc.) can have a positive or negative impact on your site’s crawlability, keyword breadth, and content duplication.

Getting it right should be an important part of your SEO strategy.

The Benefits of Layered Navigation

Layered navigation has become a very popular way of organizing ecommerce websites. It allows users to narrow results by specific product attributes from a top-level category page, rather than navigating through separate subcategories. Users can filter results by a nearly endless combination of brands, sizes, colors, shapes, and more.

This type of navigation is very much in vogue right now and is great for giving users more control over the product search process. In addition, layered navigation typically gives users smooth visual transitions as new filters are applied. It eliminates full page reloads and often makes it easier for users to compare similar products.

While layered navigation is great for users, it can hurt your SEO if it’s not done right. Two major ways that layered navigation can affect SEO are URLs (or a lack thereof) and what we refer to as “spider traps”.

1. Layered Navigation URLs

One SEO best practice is to create targeted, unique URLs for each page of your site. That’s SEO 101, but you’d be surprised how often layered navigation throws this concept out the window. This is due to the way many layered navigation plugins utilize AJAX in order to update the DOM (Document Object Model) without changing the URL or forcing a page refresh.

This speeds up page loads for the user, but as far as search engines are concerned, heavy reliance on this style of layered navigation eliminates many keyword-rich subcategory pages. This in turn limits your site’s potential keyword breadth by forcing search engines to rank a top-level category page or product page for keywords that would be better served by a subcategory page.
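If your platform relies on AJAX for filtering, one common mitigation is to update the address bar alongside the DOM, so each filter state still has a unique, shareable URL. Below is a minimal sketch using the browser’s History API; the applyFilter function and the "color" parameter are hypothetical, not from any particular plugin:

    // Minimal sketch: keep a unique URL for each filter state while
    // updating results via AJAX. Filter name/value are hypothetical.
    function applyFilter(name, value) {
      const url = new URL(window.location.href);
      url.searchParams.set(name, value); // e.g. ?color=red
      history.pushState({}, '', url);    // update the URL without a page reload
      // ...fetch and render the filtered product list here...
    }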

2. Graceful Degradation

Some layered navigation systems offer what is referred to as “graceful degradation.” This means that if JavaScript is not available to a browser or bot, old-school href links are used for selecting layered navigation attributes.

In this case, you would have a unique URL for each page, because this JS-independent setup requires the selected attributes to be included in the URL. However, in most cases these attribute URLs are canonicalized to the main category from which they sprang, eliminating them from search engine indexes.
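For example (URLs hypothetical), the head of an attribute URL like https://example.com/shoes?color=red would typically contain a tag pointing back to its parent category:

    <link rel="canonical" href="https://example.com/shoes/">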

So what can be done? If you’re lucky, your ecommerce platform allows you to sculpt your layered navigation, exposing or limiting specific combinations of filters, and provides a way to add custom meta descriptions, titles, etc. for each filter combination. But if you’re like 99% of online retailers, your platform offers no such control. In that case, the answer is simply to create the subcategories that make sense for your site.

Do not (I repeat: do not) create subcategories for every attribute that exists within your layered navigation. That would likely be a waste of time and would create unnecessary pages.

A better approach is to let your users tell you which subcategories to create. Using any number of keyword tools, research the core category term to determine what your users are actually searching for. What stems are they using off the core term? Take all of these stemmed terms and organize them by granular topic. For example, if the core term is “running shoes” and searchers frequently add “trail” or “waterproof,” then “trail running shoes” and “waterproof running shoes” are strong subcategory candidates. Each bucket of similar stems represents a subcategory you should create to address your customers’ needs. By addressing those needs, you’ll directly target the mid-level keywords that drive additional traffic to your site.

3. Layered Navigation Spider Traps

So we’ve just covered the fact that heavy reliance on layered navigation can have a limiting effect on your site’s keyword breadth. But did you know that layered navigation can have the opposite effect on your site as well?

The very nature of layered navigation, which typically involves a huge number of attribute options, often leads to situations where a unique URL is created for each potential combination of attributes (either by default or as a byproduct of graceful degradation). The result: overly long URLs, overly granular pages, and a likely “spider trap.”

If you consider every product attribute combination possible, including variations in the order in which people select the attributes (and thus the order in which they appear in the URL), you’ll end up with an insane number of pages on your site. The overwhelming majority of these pages provide no value to search engines. Think about it: how likely is Google to receive a long-tail query that strings together fifteen adjectives/attributes?
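To make the math concrete (the numbers are purely illustrative): suppose a category exposes 20 filter values and the URL records filters in the order they were clicked. Even if shoppers select at most three filters, that single category spawns 20 + (20 × 19) + (20 × 19 × 18) = 7,240 distinct URLs:

    // Illustrative only: ordered selections of up to 3 of 20 filter values
    const n = 20;
    const urls = n + n * (n - 1) + n * (n - 1) * (n - 2);
    console.log(urls); // 7240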

This scenario is bad for a couple of reasons:

4. Crawl Budget

Google has a set crawl budget for each site (and for the internet in general; Google has gone on record saying it can’t crawl the entire web). So it’s better for your site to present Google’s bots with only the pages that are likely to matter to its users. If your layered navigation URLs are exposed to search engine bots, the bots are likely to expend their crawl budget on these low- or no-value pages rather than pick up the new, fresh content on your site.

5. Index Bloat

Furthermore, if indexing limitations are not put on these layered navigation URLs, search engines can go on an indexing frenzy, resulting in a “bloated” index of your pages. Instead of a concise set of, say, 750 URLs directly related to products, categories, and customer search intent, you suddenly have hundreds of thousands of URLs indexed. With that comes a dilution of your site’s content and purpose, and in worst-case scenarios, a site structure that mirrors the “thin,” duplicate-heavy content farms targeted by Google Panda and manual actions.

6. Prevention and Remediation Toolkit

If your ecommerce site uses layered navigation, you’ll want to make sure it’s set up to take advantage of the benefits while minimizing the potential negatives. The best way to keep Google from getting lost in your navigation is simply to never give bots access to it (unless you have robust control capabilities not found in most layered navigation plugins). There are four important tools that can help you keep Google and other search engines focused on your optimized content instead of wandering down an endless navigation path.

7. Canonical Tags

These tags allow you to specify the primary version of a URL, with the intent that the search engines index only the primary version. They’re great for keeping search engines from indexing duplicate pages/content, especially for slight variations like those created by layered navigation, session IDs, “print” URLs, etc. There are plenty of resources out there for the implementation of canonical tags, including Google’s own documentation.
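As a quick illustration (the URLs are hypothetical), the tag lives in the head of the duplicate or filtered page and points at the version you want indexed:

    <!-- On /shoes?color=red (hypothetical filtered URL), tell engines
         the main category page is the primary version -->
    <link rel="canonical" href="https://example.com/shoes/">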

8. Meta Robots Tags

On a page-by-page basis, meta robots tags let you tell search engines whether a page should be indexed and whether the links on it should be followed. Setting the meta robots value to "noindex" prevents index bloat. Setting it to "noindex,nofollow" prevents index bloat and also keeps bots from following the additional layered navigation URLs that could send them deeper into a spider trap.
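In practice, that’s a single tag in the page’s head. A minimal illustration:

    <!-- Keep this page out of the index, but let bots follow its links -->
    <meta name="robots" content="noindex,follow">

    <!-- Keep it out of the index AND stop bots from following its links -->
    <meta name="robots" content="noindex,nofollow">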

The meta robots tag has been around for a long time (see this explanatory Google post from 2007), yet most developers and site owners don’t understand how crucial this tag can be.

9. rel="nofollow"

Having a nofollow directive in a page’s meta robots tag tells search engines that they should not follow any link on that page. But that wouldn’t be appropriate for most non-layered navigation pages. So what do we do about the links from category and subcategory pages that point to layered navigation URLs that we don’t want to be indexed?

Instead, we add rel="nofollow" attributes only to the links within the layered navigation itself. This causes search engines to essentially ignore those links and steers them elsewhere within the site.
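A filter link in the layered navigation would then look something like this (URL and label hypothetical):

    <a href="/shoes?color=red" rel="nofollow">Red (42)</a>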

10. Robots.txt

You can use the robots.txt file to block crawler access to certain parts of your site based on directory structure or wildcards. Keep in mind, however, that a robots.txt disallow tells search engines they may not crawl a URL’s content; it does not instruct them to leave that URL out of the index.
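For illustration, if you did go this route, disallow rules targeting layered navigation URLs might look like this (the query parameters are hypothetical; Googlebot supports the * wildcard shown):

    User-agent: *
    # Block crawling of hypothetical layered navigation filter URLs
    Disallow: /*?color=
    Disallow: /*?filter=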

Our goal is to keep the engines from following links to layered navigation URLs in the first place. But when an engine does land on one of these URLs, we’d rather it be able to read the HTML and see the canonical and/or meta robots directives there, which is why we’re cautious about blocking these URLs outright in robots.txt.

Layered Navigation: A Double-Edged Sword

While layered navigation can provide a great user experience, every retailer should be aware of the SEO pitfalls outlined above. Take proper care when setting up layered navigation: know what all those admin checkboxes do so that you don’t create a spider trap. And on the flip side, don’t miss keyword targeting opportunities by relying too heavily on layered navigation for your site architecture.

If you’re looking for even more information on how to optimize your website from an SEO perspective, take a look at our special report: The 4 Pillars of a Successful SEO Mindset. The report covers the importance of creating great content, setting realistic expectations, and collaborating as a team to take the next step toward SEO success.