Google Expands Hacking Detection To Deeper Pages

By: Chris Crum - June 8, 2012

Google has released its big list of algorithm changes for the month of May. There are plenty of interesting changes of note, but the first one on the list is:

Deeper detection of hacked pages. [launch codename “GPGB”, project codename “Page Quality”] For some time now Google has been detecting defaced content on hacked pages and presenting a notice on search results reading, “This site may be compromised.” In the past, this algorithm has focused exclusively on homepages, but now we’ve noticed hacking incidents are growing more common on deeper pages on particular sites, so we’re expanding to these deeper pages.

It’s interesting that this comes under the codename “Page Quality”. Of course, one of Google’s quality guidelines (the focus of the Penguin update) is:

Don’t create pages with malicious behavior, such as phishing or installing viruses, trojans, or other badware.

Obviously, this could occur as the result of hacking.

A few months ago, Google improved malware detection in its ads too:

Google is also alerting users about their malware-infected machines.

This week, Google even began letting users know when their accounts are being targeted by state-sponsored attacks.

About the Author

Chris Crum has been a part of the WebProNews team and the iEntry Network of B2B Publications since 2003. Follow Chris on Twitter, on StumbleUpon, on Pinterest and/or on Google: +Chris Crum.

  • John Onwuegbu

    Google has proven to be an indefatigable ‘Watch Dog’ of the internet, amidst its leading internet technology innovations which have revolutionized the way users engage in information retrieval and general web experience. They’re the best, ever!

  • Jack

    I am a Certified Ethical Hacker (CEH) and it’s amazing how much Google has been doing to protect people using the internet. More than meets the eye, that’s for sure…

  • No BS SEO

    It’s funny (not) but I have always endeavored to create websites with good content, relevant meta data and easy navigation. If my website is for a Sydney Electrician or a Melbourne Tool Supplier then both the content and the meta data support those search terms. Google however in their wisdom choose to ignore the face value of a site and go looking for some obscure linking process which may well have been instigated by a competitor. Seriously Matt, let’s call a spade a spade and kill the sites which supply these “ghastly” links instead of killing the little guy who in most cases doesn’t even know these links exist, much less how to get rid of them.