I’m sure this session will prove very informative: there is so much data available in both Bing and Google’s webmaster tools, but what can you do with it all? The panelists today were Vanessa Fox, Neil Walker of Get Updated, and Duane Forrester of Bing.
Vanessa spoke first and covered Google Webmaster Central. She started by discussing how and why to categorize your XML Sitemaps. Google keeps track for you of how much of each Sitemap is indexed, so if you split your Sitemaps by category, you can see which category of pages is experiencing a drop. That can point to a problem with a specific type of page, and spikes or dips in indexing by Sitemap can surface other issues.
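As a minimal sketch of this idea (the URLs, category names, and file layout here are all invented for illustration), you can group a site’s URLs by category and emit one Sitemap per category, so that per-Sitemap indexing counts in Webmaster Tools map directly onto page categories:

```python
# Sketch: split site URLs into one XML Sitemap per category so that
# indexed-URL counts in Webmaster Tools can be compared per category.
# The URLs and categories below are made up for illustration.
import xml.etree.ElementTree as ET
from collections import defaultdict

urls = [
    ("https://example.com/products/widget", "products"),
    ("https://example.com/blog/launch-post", "blog"),
    ("https://example.com/products/gadget", "products"),
]

def build_category_sitemaps(url_pairs):
    """Return {category: sitemap XML string} for the given (url, category) pairs."""
    by_category = defaultdict(list)
    for url, category in url_pairs:
        by_category[category].append(url)
    sitemaps = {}
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    for category, members in by_category.items():
        urlset = ET.Element("urlset", xmlns=ns)
        for url in members:
            loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
            loc.text = url
        sitemaps[category] = ET.tostring(urlset, encoding="unicode")
    return sitemaps

sitemaps = build_category_sitemaps(urls)
# Each string could then be written out, e.g. to sitemap-products.xml,
# sitemap-blog.xml, and referenced from a sitemap index file.
```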
Vanessa next addressed the search queries report in Google Webmaster Tools. Even with a date range set, the impression count isn’t the total number of impressions for a query; it covers only the days in that period when the query was among the top 1,000 queries. That means you may see skipped dates in the period and an undercount of impressions for that query. How can you use this data? Categorize the queries and then chart them by impressions, clicks, CTR, and so on.
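A toy sketch of that categorize-then-chart step might look like the following; the sample rows and the categorization rules are invented, and in practice the rows would come from a Webmaster Tools export and the categorizer would match your own products and topics:

```python
# Sketch: bucket downloaded search-query rows into categories and total
# the metrics per category, ready for charting. Data and rules are
# illustrative only.
rows = [
    {"query": "blue widgets", "impressions": 1200, "clicks": 90},
    {"query": "widget repair guide", "impressions": 400, "clicks": 35},
    {"query": "acme coupon", "impressions": 300, "clicks": 60},
]

def categorize(query):
    # Toy rules; a real categorizer might match product names, topics, etc.
    if "coupon" in query:
        return "deals"
    if "guide" in query or "how" in query:
        return "informational"
    return "product"

def totals_by_category(rows):
    out = {}
    for row in rows:
        cat = out.setdefault(categorize(row["query"]),
                             {"impressions": 0, "clicks": 0})
        cat["impressions"] += row["impressions"]
        cat["clicks"] += row["clicks"]
    for cat in out.values():
        cat["ctr"] = cat["clicks"] / cat["impressions"]
    return out

print(totals_by_category(rows))
```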
Neil was up next and discussed how his agency uses this data for clients. In 2010 they downloaded data across roughly 100 clients, merged it, and calculated the average organic click-through rate by SERP position. A lot has changed since then, so they downloaded the data again for the same clients and compared against the 2010 results. Surprisingly, CTR went down, likely due to all of the changes we’re seeing in search results.
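The aggregation Neil described can be sketched roughly as below; the pooled rows are fabricated stand-ins for the per-client exports he would have merged:

```python
# Sketch: merge per-client query exports and compute average organic CTR
# by SERP position. The sample rows are fabricated.
from collections import defaultdict

client_rows = [
    # (position, impressions, clicks), pooled across all clients
    (1, 1000, 300),
    (1, 500, 120),
    (2, 800, 96),
    (3, 700, 49),
]

def ctr_by_position(rows):
    agg = defaultdict(lambda: [0, 0])  # position -> [impressions, clicks]
    for position, impressions, clicks in rows:
        agg[position][0] += impressions
        agg[position][1] += clicks
    return {pos: clicks / imps
            for pos, (imps, clicks) in sorted(agg.items())}

print(ctr_by_position(client_rows))
```

Comparing the dictionary produced from a 2010 export with one produced today would show the position-by-position CTR shift Neil reported.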
Does that mean SEO is dead? No. It means there are MULTIPLE ways to earn clicks, not just from page position.
Next he compared mobile CTR with desktop CTR. The top five mobile results earned a slightly higher CTR than the same positions on desktop. In image search, the #1 result gets a large share of clicks, and CTR drops off quickly after that.
What about query length? After first removing branded queries, they found that the longer the query, the less important it was to rank #1: people searching on longer terms appear to look further down the page and scan the results more intently.
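A sketch of that analysis, assuming a hypothetical brand term list and made-up rows, would filter out branded queries and then aggregate CTR by query word count:

```python
# Sketch: after removing branded queries, compare CTR by query length
# (in words). The brand list and rows are invented for illustration.
BRAND_TERMS = {"acme"}

rows = [
    {"query": "widgets", "impressions": 1000, "clicks": 320},
    {"query": "cheap blue widgets online", "impressions": 200, "clicks": 40},
    {"query": "acme widgets", "impressions": 500, "clicks": 250},  # branded
]

def unbranded(rows):
    return [r for r in rows
            if not any(term in r["query"].split() for term in BRAND_TERMS)]

def ctr_by_word_count(rows):
    agg = {}  # word count -> (impressions, clicks)
    for r in unbranded(rows):
        n = len(r["query"].split())
        imps, clicks = agg.get(n, (0, 0))
        agg[n] = (imps + r["impressions"], clicks + r["clicks"])
    return {n: clicks / imps for n, (imps, clicks) in agg.items()}

print(ctr_by_word_count(rows))
```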
He was then able to extrapolate from the data to predict visits and conversions from organic search. Fascinating.
He also mentioned Analytics Canvas as a good tool to help evaluate analytics.
Duane was up next. He first mentioned the announcement of a new Bing Keyword Research tool. It is powered by organic data from Bing, not paid ads; the data is refreshed roughly every two weeks and covers six months.
One of the first points he made was that you, as a webmaster, should allow Bing Webmaster Tools to send you alerts. These are very helpful if you run into situations like malware or indexing issues.
Duane showed a sample of tracking activity on a site. In the Average Impression Position column, note the number for each keyword: the closer it is to 1, the more Bing trusts you on that topic. If you see the number dropping, publish content and engage around that term.
Index Explorer can help you uncover gaps in Bing’s index of your content: drop in a URL, apply the filter, and see whether Bing has it. Make sure Bing is fully indexing what you want it to. You can submit URLs directly to the index, but submissions are limited to 10 per day and 50 per month, so add your best URLs first.
You can also submit your RSS feed as a Sitemap, which helps Bing see your freshest content faster. Bing wants clean Sitemaps: one bad entry is likely fine, but with two bad entries the Sitemap is no longer considered “clean” and may not be crawled as often. RSS feeds may be better for this.
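Duane didn’t define exactly what makes an entry “bad,” but a pre-submission sanity check is easy to sketch; the rules below (valid scheme, on-domain host) are my own illustrative assumptions, not Bing’s published criteria:

```python
# Sketch: flag Sitemap entries that are malformed or point off-domain,
# the kinds of bad entries that could keep a Sitemap from being crawled
# as often. The validation rules here are illustrative assumptions.
from urllib.parse import urlparse

def find_bad_entries(urls, site_host):
    bad = []
    for url in urls:
        parts = urlparse(url)
        if parts.scheme not in ("http", "https") or parts.netloc != site_host:
            bad.append(url)
    return bad

entries = [
    "https://example.com/blog/new-post",
    "example.com/missing-scheme",     # malformed: no scheme
    "https://other-site.com/page",    # off-domain
]
print(find_bad_entries(entries, "example.com"))
```

Running a check like this before each Sitemap push is a cheap way to keep the feed “clean” by whatever rules matter for your site.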
Bottom line: Duane says you need to use the accounts!