Live from Pubcon: State of the Index with Matt Cutts

By Janet Driscoll Miller | Oct 16, 2012


Matt Cutts was back for his tenth year at Pubcon and presented his “state of the index” for Google.

SEO in 2011 was dominated by Panda and duplicate content. In 2012, Penguin launched as an initiative to cut down on spam. This past year has included:

  • an algorithm demoting pages with too many ads above the fold
  • Penguin
  • cracking down on spammy link networks
  • continuing to refine Panda
  • cracking down on spammy use of “exact match” domains

Last year, Googlebot got smarter with AJAX and JavaScript. Additionally, you can now get 90 days' worth of queries and the top 2,000 queries in Google Webmaster Tools. Google added the autocompletetype attribute, which makes it easier for people to fill out forms with Chrome's saved data and which Google believes can help improve conversion (I agree!). They also published recommendations for responsive web design, which is helpful for mobile.
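
As a quick illustration of how that attribute works (this markup is my own sketch, not from Matt's talk; Chrome's experimental implementation spelled the attribute x-autocompletetype, and the field names and values here are placeholders):

  <form method="post" action="/signup">
    <!-- x-autocompletetype hints which saved Chrome value fits each field -->
    <input type="text" name="your-name" x-autocompletetype="name-full">
    <input type="email" name="your-email" x-autocompletetype="email">
    <input type="submit" value="Sign up">
  </form>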

Some other great stuff Google rolled out over the past year includes:

  • Webmaster Academy
  • Recommendations for smartphone-optimized websites and a mobile Googlebot
  • Better crawl/site error reporting
  • Emails for critical issues
  • Better user permissions in the Webmaster Tools console
  • Blog posts about algorithm changes
  • Updated Webmaster Guidelines with examples and such
  • Messages for unnatural links

Matt introduced a new tool that allows webmasters to disavow links. Hurray! How does it work? This is not "magic disavow fairy dust." Do NOT use this tool unless you are sure you need it! Try to remove the links yourself first if you can, because, remember, other search engines also see those links but don't have a disavow tool.

To use the tool, go to http://google.com/webmasters/tools/disavow-links-main and upload a text file into Webmaster Tools. Enter one URL per line, or enter domain: followed by a domain name to disavow all links from that domain. The tool is still in its early stages, though, so Matt warned that most webmasters and sites should NOT use it. Once the file is submitted, it will likely take a few days or weeks for Google to recrawl the links; Google will then annotate each link with your disavow information.
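
For reference, the disavow file itself is just plain text, one entry per line, and lines beginning with # are treated as comments. A sample file might look like this (the comment, domain, and URLs are placeholders of my own, not examples from the session):

  # Contacted the site owner on 10/1/2012 to request removal; no response
  domain:spammydomain-example.com
  http://www.example.net/paid-links-page-1.html
  http://www.example.net/paid-links-page-2.html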

Matt also reminded the audience that, as SEOs, you should always be skeptical, especially of wild promises.

Eric Enge asked Matt if disavowing a link with the new tool is similar to a nofollow. Matt said that it essentially is, but applied from the receiving end of the inbound link. One difference: nofollow drops links, whereas disavow does not yet do that, as the tool is still being tested.

Another audience member asked how the disavow tool works with a reconsideration request. Matt suggested submitting the disavow data first, waiting 2-3 days, and then filing the reconsideration request, mentioning in it that you submitted data via the disavow tool.

Will someone disavowing links from YOUR site hurt you? Matt says not likely. But ideally, if others ask you to take down spammy links, do so if you can.

Another audience member asked why Google can't always seem to identify the original creator of content versus scrapers. Matt recommended checking out analyzethis.ru to track how well Google is handling duplicate content.

What is the timeframe in which duplicate content is assessed? Matt said that duplicate content is identified far upstream, when indexing is done. Duplicate content, however, is less a penalty than a filter: the duplicate version simply isn't shown.
