Ajax Applications – Immune to Clickbots?


I’ve been playing with Gmail and Google Reader lately and becoming more accustomed to their AJAX/JavaScript interfaces, which got me thinking: how good are they at separating humans from clickbots?

When I talk about beating clickbots in this post, I mean fake-user and zombie bots being used to simulate user behavior, not bots clicking on PPC advertising. I’m still a very strong believer that Google is using user data in some way in its algorithm. Whether it’s tracking outbound clicks in Gmail, or gathering subscriptions, reads, stars, shares, and clicks in Google Reader, the simple fact is that Google knows what I’m doing and who I’m doing it to.

I’ve seen a few zombie programs at work browsing, searching, even digging stuff. Nothing’s too advanced, and most of it is kind of buggy and so high-maintenance that it’s almost worth still doing the work by hand. However, we’re still early in the game and need some R&D to get economies of scale working in our favor. I haven’t seen anybody even trying to build bots for Ajax pages. Looking at the source code of an Ajax page shows an incredible amount of stuff you wouldn’t want or need to fake, and it acts as the gateway to the good stuff you would want. I’m not a programming expert, but I know the data used to build the display, and what to click, has to be in the stream. Grabbing it out and sorting it into meaningful chunks you can work with, however, seems a bit more difficult.
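To make that last point concrete, here’s a minimal sketch in Python. The payload below is entirely invented (it is not what Gmail or Google Reader actually send); the point is that an AJAX response mixes UI state and tokens a bot doesn’t care about with the item data it does, and the bot author has to know the structure well enough to pull out just the actionable chunks:

```python
import json

# Hypothetical AJAX payload for illustration only: real streams bundle
# display state, session tokens, and item data together in one response.
payload = """
{"state": {"token": "abc123", "ui": {"theme": "default", "panes": 3}},
 "items": [
   {"id": "i1", "title": "Post about me", "read": true},
   {"id": "i2", "title": "Industry news", "read": false}
 ]}
"""

def extract_unread(raw):
    """Ignore the UI/state noise and return only the unread item ids."""
    data = json.loads(raw)
    return [item["id"] for item in data["items"] if not item["read"]]

print(extract_unread(payload))  # ['i2']
```

With a plain HTML page, a bot can scrape links with a regex and be done; here it has to reverse-engineer an undocumented, changeable data structure first, which is exactly the extra friction described above.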

While it’s possible I’m misinterpreting this, I think Google is able to tell which items I opened and read (in full or in part), as opposed to those where I scanned the title and marked them read (see the % read column).

I mark everything I’ve scanned as read, “clearing my inbox” so to speak, and if you ever wanted hard stats proving bloggers are an egocentric bunch, notice that the items about me, me, me have the highest actual read percentages. So if you were trying to fake a signal of quality, you’d have to log in regularly, simulate clicking in, and then simulate actually opening the item. Quite a bit harder, don’t you think?
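The multi-step fakery described above can be sketched out. Everything here is hypothetical: the call names, the open rate, and the dwell times are invented for illustration, not any actual Google API. The point is the shape of the work a bot would need to do to look like a real reader:

```python
import random

def fake_read_session(item_ids, open_rate=0.3, rng=None):
    """Hypothetical sketch: a bot faking a plausible read pattern.

    A real reader marks most items read from the title alone and only
    opens a fraction, spending variable time on each, so a convincing
    bot would have to do the same rather than open everything instantly.
    """
    rng = rng or random.Random()
    events = []
    for item_id in item_ids:
        # Step 1: mark the item read (the cheap, easy-to-fake part).
        events.append({"call": "mark-read", "item": item_id})
        # Step 2: "open" only some items, with a human-looking dwell time.
        if rng.random() < open_rate:
            events.append({"call": "open-item", "item": item_id,
                           "dwell_sec": round(rng.uniform(5, 60), 1)})
    return events
```

Even this toy version shows the problem: the bot now needs sessions, timing, and selective behavior per item, instead of a single fire-and-forget click, and any one of those dimensions gives the tracker something to check against human baselines.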


Michael Gray is an SEO specialist and publishes a search engine industry blog at www.Wolf-Howl.com. He has over 10 years of experience in website development and internet marketing, helping both small and large companies increase their search engine visibility, traffic, and sales. Michael is a current member of Internet Marketing of New York (IM-NY.org) and a guest speaker on Webmaster Radio. He is also an editor for the popular search engine news website Threadwatch.org.
