Google’s Not Checking Your Facts Just Yet

Written by Chris Crum
A recently released Google research paper has been drawing some attention throughout the search industry. It proposes a signal for ranking search results based upon “the correctness of factual information provided by the source,” rather than links.

Do you think this would be a good direction for the algorithm to go in? Let us know in the comments.

As we reported before, just having this paper out does not mean that Google has implemented such a ranking strategy, nor does it necessarily mean that it will. Still, some misleading reports have circulated implying that Google is going forward with it.

Just to remove any doubt, Google webmaster trends analyst John Mueller confirmed in a Google+ hangout (via Search Engine Roundtable) that this is not currently part of the Google algorithm.

A little over 49 minutes in, Mueller responded to a question about facts potentially being used as a ranking factor, and how Google would handle inaccurate information that can’t be fact-checked. Mueller didn’t really have an answer for how Google would deal with that, but did say this:

This was just a research paper that some of our researchers did and not something that we are using in our rankings. We have researchers that do fantastic research that publish tons of papers all the time, and just because they are researching something and trying to see which options are happening there, or because maybe they are patenting something or creating new algorithms, it doesn’t mean that is something we are using in search. At the moment, this is a research paper. I think it’s interesting seeing the feedback around that paper and the feedback from the online community, from the people who are creating web pages, from the SEOs who are promoting these pages, and also from normal web users who are looking at this. At the moment, this is definitely just a research paper and not something that we’re actually using.

So there you have it. Now, all of that said…

The paper is still more interesting than your run-of-the-mill Google research paper, for a few reasons. For one, it describes a signal that could be viewed as more valuable than links, which have long been the backbone of Google’s ranking strategy. If implemented, it would represent a fundamental change in how Google ranks web pages.

Secondly, the way the paper is written essentially calls out links as an outdated way of ranking content. If that is indeed the case, why would Google want to continue placing so much emphasis on that signal when it has one it feels is more representative of authoritative content?

The opening paragraph of the paper pretty much discredits links as a valuable signal. It says:

Quality assessment for web sources is of tremendous importance in web search. It has been traditionally evaluated using exogenous signals such as hyperlinks and browsing history. However, such signals mostly capture how popular a webpage is. For example, the gossip websites listed in [16] mostly have high PageRank scores, but would not generally be considered reliable. Conversely, some less popular websites nevertheless have very accurate information.

Fourteen of the fifteen sites the paper refers to, it says, have PageRank scores in the top 15% of websites thanks to their popularity, but all fifteen have a Knowledge-Based Trust (KBT) score, the paper’s measure of the trustworthiness of a source’s information, in the bottom 50% of websites.

“In other words, they are considered less trustworthy than half of the websites,” Google says in the paper.
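
To make that mismatch concrete, here is a minimal Python sketch of how the same site can sit at opposite ends of two percentile rankings. Everything here is hypothetical: the scores are invented for illustration, and the paper does not compute percentiles this way or publish per-site numbers in this form.

    # Hypothetical illustration: a site can rank near the top on a
    # popularity-style score (like PageRank) while ranking in the
    # bottom half on a trust-style score (like KBT).
    # All numbers below are made up for demonstration purposes.

    def percentile_rank(score, all_scores):
        """Return the fraction of sites scoring at or below `score`."""
        return sum(s <= score for s in all_scores) / len(all_scores)

    # Invented scores for ten sites; index 0 is our "gossip site".
    pagerank = [0.95, 0.40, 0.35, 0.30, 0.25, 0.20, 0.15, 0.10, 0.05, 0.02]
    kbt      = [0.20, 0.90, 0.80, 0.70, 0.60, 0.50, 0.40, 0.15, 0.10, 0.05]

    site = 0  # the hypothetical gossip site
    print(f"Popularity percentile: {percentile_rank(pagerank[site], pagerank):.0%}")
    print(f"Trust percentile:      {percentile_rank(kbt[site], kbt):.0%}")

Run as-is, the made-up gossip site lands at the 100th percentile on popularity but only the 40th on trust, which is the shape of the gap the paper describes.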

So again, why would Google want to continue ranking content that isn’t trustworthy just because it has a lot of links? And we’re just talking about popular websites here. That’s not even taking into consideration black hat SEO practices, which Google has to constantly play whack-a-mole with.

Thirdly, Google already uses a lot of “knowledge”-based features. You’re no doubt familiar with Knowledge Graph, and more recently Knowledge Vault. The search engine is constantly trying to deliver information directly in search results. This stuff is clearly of great importance to Google. To me, this just adds to the likelihood that Google will eventually use the signal discussed in the research paper, at least to some extent.

What will really be interesting is whether Google will inform webmasters if it does implement such a signal. Will it announce it the way it did its recent mobile-related signals? Time will tell.

Either way, it can’t hurt websites to strive to publish information that is as accurate as possible, and to do some fact-checking when appropriate. Who knows? Maybe one day it will mean the difference in whether or not your page lands on the first page of search results. The best part is that there is no downside: accuracy lends credibility, which is good for you no matter what.

Oh, by the way, Mueller has also been advising webmasters against link building.

Do you think knowledge-based trust would be a better ranking signal than PageRank? Share your thoughts in the comments.
