An SEO Case Study in Getting Store Locations Found in Natural Search

By Tad Miller | Nov 22, 2011


Changes in site architecture and search engine algorithms had a decimating effect on our client’s rankings for its more than 600 store locations.

The Problems

Search engines can’t fill out a store locator box.  Googlebot can’t type a zip code.  So even though users can easily use these kinds of interfaces to find their nearest store location, a search engine cannot.

As a result, a directory of store locations had to be developed as a path search engines could follow to reach the store pages.  This was effective until one day in May.  The directory could only be accessed from a single link buried deep in the site map page, which left every store location four clicks away from the home page when the Google May Day update hit.  Ever since May Day, you really want your important content within two clicks of your home page.
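A flat, crawlable directory keeps every store page within that two-click limit: home page, directory page, store page. As an illustrative sketch (the URLs and region names here are hypothetical, not the client’s actual structure):

```html
<!-- Hypothetical directory page, linked from every page's footer.
     Plain HTML links are followable by crawlers; a zip-code search box is not. -->
<ul>
  <li><a href="/stores/virginia/">Virginia store locations</a></li>
  <li><a href="/stores/maryland/">Maryland store locations</a></li>
  <li><a href="/stores/north-carolina/">North Carolina store locations</a></li>
</ul>
```

Each regional page then links directly to the individual store pages, so no page sits more than two clicks from the home page.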

The client also used JavaScript to dynamically pull the addresses, phone numbers and store names from a database and render that content on the store location pages.  Search engines can’t really read JavaScript-generated content.  So even if a search engine did bother to index all the way down to the store location page level (which they didn’t), there wasn’t any readable content describing the location of the stores.
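The problematic pattern looked roughly like this (a simplified, hypothetical reconstruction; the store name and address are invented):

```html
<!-- What the crawler saw: an empty container. The address only appears
     after a browser runs the script, which crawlers in 2011 generally did not do. -->
<div id="store-info"></div>
<script>
  // Name, address and phone number pulled from a database and written client-side
  document.getElementById('store-info').innerHTML =
    '<h1>Acme Hardware, Springfield</h1>' +
    '<p>123 Main St, Springfield, VA 22150<br>(555) 555-0123</p>';
</script>
```

To the search engine, the page’s location content was effectively blank.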

The client also had a couple of different websites that used the exact same store location pages, just with different URLs.  This is duplicate content in the eyes of search engines.  A search engine will usually rank only one of the duplicate URLs, and it may not be the URL that you prefer to rank.

At its worst, only 10% of the more than 600 location pages had top-ten Google rankings for our selected keywords, which were variations of the town name and the store name.  Before the May Day update and the switch to JavaScript for the name, address and phone number, well over 75% of those pages had top-ten rankings for those same keywords.

The Solutions

Search Mojo cleaned up the directory structure of the store locations and convinced the client to add a footer link to that directory on every page of the website.  This had a significant impact on many keyword rankings.  It meant the store pages at least got found and indexed by Google, which wasn’t happening while the directory was buried deep within the site, far from the home page.

The client was unwilling to abandon the use of JavaScript on the store pages for the names, addresses and phone numbers.  However, we convinced them to use noscript tags, whose content is shown only to visitors who do not have JavaScript enabled in their browsers.  The identical names, addresses and phone numbers were put in plain HTML within those noscript tags.  This gave the search engines something they could read containing the name of the store and the town where it is located.
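In outline, the fix looked something like this (store details are hypothetical):

```html
<!-- JavaScript still renders the address for regular visitors -->
<div id="store-info"></div>
<script src="/js/store-locator.js"></script>

<!-- Same content duplicated in plain HTML for crawlers and
     visitors with JavaScript disabled -->
<noscript>
  <h1>Acme Hardware, Springfield</h1>
  <p>123 Main St, Springfield, VA 22150<br>(555) 555-0123</p>
</noscript>
```

The noscript content must match the script-rendered content exactly; showing crawlers different text than users see risks being treated as cloaking.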

Canonical tags, which tell search engines which version of a page you want to rank rather than its duplicate counterparts at other URLs, were implemented to resolve the duplicate content issue.
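A canonical tag is a single line in the head of the duplicate page pointing at the preferred URL (the domains below are placeholders, not the client’s):

```html
<!-- Placed in the <head> of the store page on the secondary site -->
<link rel="canonical" href="http://www.example.com/stores/virginia/springfield.html">
```

Search engines then consolidate ranking signals from the duplicates onto the canonical URL, so the version you prefer is the one that appears in results.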

The Results

Rankings bounced back for the store location pages.  We went from 10% of them having top-ten Google rankings to about 70%, and many more pages are very close to breaking into the top ten.

What Could Have Worked Better

The HTML content within the noscript tags was a good band-aid that helped improve the store pages’ rankings for our selected keywords.  However, our preference was to abandon JavaScript for this content entirely and use plain HTML with Rich Snippets.  Google has specifically said that Rich Snippets markup inside noscript tags will not work.

Rich Snippets give search engines the ability to associate your page with a location, which can be a big deal.  When search engines personalize results based on the searcher’s location (especially on GPS-enabled mobile devices), these store pages can lead to increased store visits and sales.
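For reference, here is a hypothetical store page marked up with the hCard microformat, one of the formats Google’s Rich Snippets supported at the time (the business name and address are invented). As noted above, this markup has to live in the regular HTML of the page, not inside noscript tags:

```html
<div class="vcard">
  <span class="fn org">Acme Hardware</span>
  <div class="adr">
    <span class="street-address">123 Main St</span>,
    <span class="locality">Springfield</span>,
    <span class="region">VA</span>
    <span class="postal-code">22150</span>
  </div>
  Phone: <span class="tel">(555) 555-0123</span>
</div>
```

The class names tell the search engine exactly which piece of text is the business name, which is the town, and which is the phone number, rather than leaving it to guess from unstructured text.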

Make it easy to find your store locations. But not too easy.

  • http://www.seobythesea.com Bill Slawski

    Nice case study, showing that small but intelligent changes to a site can make a big difference.

    Interesting perception of the impact of the Mayday update on the number of clicks away content needs to be to surface in Google Web results, but I’m actually surprised that there wasn’t some kind of “store locator” link prominently placed on the front page of that site to begin with.

    Not so much because it may have impacted whether or not the site appeared in geographically relevant results, but more because the appearance of such a link might influence more people to find a store near them, when they visited the homepage during a web search.

  • http://www.search-mojo.com Tad Miller

    Bill, the store locator was (and still is) a zip code entry box on the old version of the site. Humans obviously gravitated to using it, but the typing skills of the search bots obviously weren’t up to the challenge. Even with the addition of the footer link to the directory structure, less than 1% of the people are using the directory structure to find the store location. The footer link to the directory structure is really just a path to indexing necessary for SEO.

  • http://www.seobythesea.com Bill Slawski

    That makes sense, Tad.

    From a usability perspective, a zip code entry box on the homepage is a shorter and simpler path to what people likely wanted to find than a click to a directory.

    Nice solution though.