
Search Engines and ClickStream Data

The Google/Bing Kerfuffle


Recently there was a big kerfuffle over Bing copying Google results, Bing denying it, the Google FUD machine in full force, and some nice conspiracy theories about who is pulling the strings and why. While this makes for exciting drama and fun Twitter banter, there is some actionable information in it for SEOs: pay attention to, and use, clickstream data.

According to Bing's response, they are 100% using clickstream data from sources such as the IE toolbar and factoring that data into their ranking algorithms. In fact, that clickstream data is at the heart of Google's accusation that Bing is copying them. What really happened is that Google engineers set up fake SERPs for made-up words and made sure Bing got the data by sending clicks through IE. While it's technically not the click fraud that Bing frames it as (since PPC wasn't involved), it was artificial data, and it was convincing enough that Bing believed about 10% of it.
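
As a concrete illustration, here is a minimal sketch (Python, with made-up event data) of how clickstream records from a toolbar might be aggregated. For a nonsense query like "hiybbprqag", one of the synthetic terms reported from the sting, the only clicks that exist are the planted ones, so any engine that trusts the signal will surface the planted URL. None of this reflects Bing's or Google's actual pipelines; the URLs and data shapes are invented for illustration.

    # Hypothetical toolbar clickstream events as (query, clicked_url) pairs.
    from collections import Counter, defaultdict

    events = [
        ("hiybbprqag", "http://example.com/planted-page"),  # synthetic click
        ("hiybbprqag", "http://example.com/planted-page"),  # synthetic click
        ("used cars", "http://cars.example.org/listings"),
        ("used cars", "http://autos.example.net/deals"),
        ("used cars", "http://cars.example.org/listings"),
    ]

    clicks_by_query = defaultdict(Counter)
    for query, url in events:
        clicks_by_query[query][url] += 1

    def top_clicked_url(query):
        """Return the most-clicked URL observed for a query, if any."""
        counts = clicks_by_query.get(query)
        return counts.most_common(1)[0][0] if counts else None

    print(top_clicked_url("hiybbprqag"))  # -> http://example.com/planted-page

With no organic clicks to dilute it, even a small amount of synthetic data looks unanimous for that query, which is part of why some of the honeypot terms showed up in Bing at all.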


The real question you need to be asking yourself is: does Google use clickstream data as part of its ranking algorithm? If you set the wayback machine to 2002, GoogleGuy (aka Matt Cutts) felt using toolbar data could help provide better SERPs (hat tip to Matt McGee). To the best of my knowledge, at the time this post was written Google had not disclosed whether toolbar clickstream data is used in its rankings (if Matt Cutts or any other rep wants to comment or drop me a link to an official statement, I'll append it to this post). That said, in my testing I have seen a lot of evidence pointing to toolbar clickstream data being used, at least on a short-term basis. Pages with a lot of social proof (tweets, stumbles, Reddit submissions, etc.) will pop into the SERPs for extremely competitive terms and then fade away when the clickstream data stops.
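
That pop-then-fade behavior is what you would expect from a boost that decays once the clicks stop. Here is a minimal sketch of that idea, assuming a simple exponential decay; the half-life and numbers are invented for illustration, not anything Google has confirmed.

    import time

    HALF_LIFE_HOURS = 48.0  # hypothetical: how quickly a click's influence fades

    def clickstream_boost(click_timestamps, now=None):
        """Sum of click contributions, each halved every HALF_LIFE_HOURS."""
        now = now if now is not None else time.time()
        boost = 0.0
        for ts in click_timestamps:
            age_hours = (now - ts) / 3600.0
            boost += 0.5 ** (age_hours / HALF_LIFE_HOURS)
        return boost

    # A burst of 100 clicks from two days ago is worth half as much as the same
    # burst happening right now, so the page "fades" as the stream dries up.
    now = time.time()
    print(round(clickstream_boost([now - 48 * 3600] * 100, now)))  # ~50
    print(round(clickstream_boost([now] * 100, now)))              # 100

Under a model like this, a page riding a wave of tweets and stumbles can outscore entrenched pages for a day or two, then slide back as the clicks age out, which matches the pop-and-fade pattern described above.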

In my opinion, the days of being able to ignore social, or dismiss it as a fad that isn't part of SEO, are over. Any serious SEO strategy should have a social component, unless it's a strictly B2B play or something "unsocial" like a funeral home.

So what are the takeaways from this post?

  • Bing is using clickstream data as part of its ranking methodology; it's likely Google is as well
  • Look for ways to get your URLs into the data stream of toolbar users
  • Social websites like Twitter provide an easy way to spoon-feed data to search engines
  • Clickstream data isn't a leading ranking factor and probably never will be, but it is part of the equation

Originally published on Graywolf's SEO Blog by Michael Gray

Comments
  • http://www.autoreverseweb.com Souleye Cisse

    Thank you, Michael, for your article, but if I'm following you right, Google fed the fake data into Bing, then accused them of copying their results, in which case Google would have very low standards. So Google sets a trap, Bing unwittingly falls into it, and somehow comes out clean. Honestly, anybody who puts 'Microsoft' and 'clean' in the same sentence must be living in denial, and let's leave out the fact that very few people will buy that. Microsoft has a long history of copying other people's technologies and ideas; their only innovation is the sophistication of their 'copymanship'. So when I read about that story, my reaction was 'I wouldn't put anything past them.'

  • http://forum.smobot.com SMObot

    Of course clickstream data is and has been a primary ranking factor. It works like this:

    Results 1-10 have a typical CTR distribution pattern for almost any given search term:

    1st result – 40% CTR
    2nd result – 20% CTR
    3rd result – 15% CTR
    …. and so on.

    The ideal spread is known. Any result falling above or below its expected CTR for its rank will either lose or gain position, independent of inbound links. This is how new websites make their splash; if they do not perform on CTR, they will drop in rank. (A rough sketch of this idea follows this comment thread.)

    In the long term, LTR, or link-through-ratio (the rate at which webmasters link to a ranked hyperlink compared with competing links), factors in. The total number of links is both a weighting and damping factor in rankings.

    Now, since Google, being the dominant player, has most of the clickstream data at their direct disposal, Bing certainly benefits from using any clickstream data detected and reported back to Microsoft through the browser. Chrome is set up the same way. Same with the Google toolbar. Anyway, Microsoft can now track SERP click through and bounce rate on ANY search engine, including Google.

    Browsers can go so far as to essentially report back "heat maps" on individual web pages. Google's analytics tools demonstrate this capability. Why should anyone suspect that Google isn't using browser data to "sculpt" PageRank?

    The AdWords PPC engine provides very direct insight into Google's ranking methods. Google prefers to rank ads that have high CTR * Bid performance in order to maximize profits. The same logic applies to organic.

    Pending a linguistics engine breakthrough, search engine technology is leveling off, and Bing will catch up soon enough.

    • smobot

      **barring a linguistics engine breakthrough,
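
To make the mechanism described in the comment above concrete, here is a minimal sketch with invented numbers: each result's observed CTR is compared against a hypothetical expected CTR for its position (not a published curve), and its score is nudged up or down by the deviation. The last function only illustrates the "CTR * Bid" idea mentioned for AdWords, not Google's actual Ad Rank formula.

    # Hypothetical expected CTR by organic position (illustrative values only).
    EXPECTED_CTR = {1: 0.40, 2: 0.20, 3: 0.15, 4: 0.10, 5: 0.07}

    def ctr_adjustment(position, impressions, clicks, weight=1.0):
        """Positive if a result out-performs the expected CTR for its slot,
        negative if it under-performs; magnitude scaled by an arbitrary weight."""
        observed = clicks / impressions if impressions else 0.0
        expected = EXPECTED_CTR.get(position, 0.05)
        return weight * (observed - expected)

    def rank_ads(ads):
        """Order ads by bid * predicted CTR, the 'CTR * Bid' idea from the comment."""
        return sorted(ads, key=lambda ad: ad["bid"] * ad["ctr"], reverse=True)

    # A new site in position 3 pulling a 25% CTR gets a positive nudge;
    # one that only manages 5% gets pushed down, links aside.
    print(round(ctr_adjustment(position=3, impressions=1000, clicks=250), 2))  # 0.1
    print(round(ctr_adjustment(position=3, impressions=1000, clicks=50), 2))   # -0.1

    ads = [
        {"name": "ad-a", "bid": 2.00, "ctr": 0.03},
        {"name": "ad-b", "bid": 1.00, "ctr": 0.08},
    ]
    print([ad["name"] for ad in rank_ads(ads)])  # ['ad-b', 'ad-a'] (0.08 > 0.06)

Whether organic rankings actually respond to CTR deviations this directly is the commenter's claim, not something either search engine has confirmed, but the sketch shows how such a feedback loop could operate alongside link-based signals.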

  • You Know Me

    Burn, Google, burn, you evil, wicked company… burn down to the size of the wicked and evil Microsoft so that we have two devils on even, fiery footing.
