Reputation Manipulation

    February 28, 2007

User-generated ranking systems are increasingly important these days. Would you buy something on eBay without checking the seller’s feedback rating? Would you rent an unfamiliar movie at Netflix without glancing at the number of stars it earned from Netflix members? Driven by the dogma of Web 2.0, today most major sites have a way for users to make themselves heard – ratings, comments, reviews, etc. Many of these have commercial implications – a well-reviewed product will sell better, while people will shun a hotel with bad reviews from past guests. Where there’s commerce, there’s money… and where there’s money, there’s spam. It will be a few days before the March issue of WIRED magazine is online, but it contains an interesting article by Annalee Newitz, “Herding the Mob.”

Community spamming has been around since the beginning of online communities. Whatever the focus of a community, it will attract members who either want to sell something directly or, more subtly, post positive comments about their product or company. Experienced community operators can usually spot these promoters and remove the tainted content, but often this takes a high level of community moderation and considerable experience. The problem is even more difficult for review sites, where virtually every post is a comment about a product or service. Is the guy who posted about finding a cockroach in his hotel bed sharing a legitimately awful experience, or is he working for the hotel’s competition?

Newitz discusses how some of the most influential online communities can be spammed. Digg, for example, can be manipulated by groups of members who vote for each other’s stories; at the extreme, these “buddies” may actually be a network for hire. Sometimes, Digg manipulation is as simple as one user creating many accounts and voting for a story many times. Yahoo’s community rankings are also susceptible to planned manipulation.

eBay users place great weight on its merchant rating system, and merchants jealously guard their perfect ratings. One high-volume, high-ticket eBay seller I know complained about the downside of having a perfect rating – occasionally, bad customers make outrageous demands and threaten to post negative feedback. Despite the seeming integrity of the system, Newitz shows how some sellers manipulated the numbers: they engaged in a large number of low-value sham transactions to build an excellent feedback rating, then moved into selling high-value items where an unsuspecting buyer could be victimized.
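A toy calculation shows why this attack is so cheap. This is not eBay’s actual algorithm – just a sketch of the positive-feedback percentage that review sites commonly display, with invented numbers:

```python
# Toy illustration (not eBay's real scoring) of why cheap sham
# transactions make a feedback rating misleading.

def feedback_percent(positives, negatives):
    """Positive-feedback percentage, as review sites often display it."""
    total = positives + negatives
    return 100.0 * positives / total if total else 0.0

# A scammer "buys" a perfect rating with 500 one-dollar sham sales...
score_before = feedback_percent(positives=500, negatives=0)

# ...then defrauds a single buyer on one high-value item.
score_after = feedback_percent(positives=500, negatives=1)

print(f"before scam: {score_before:.1f}%  after scam: {score_after:.1f}%")
```

Even after the fraud, the seller still shows roughly 99.8% positive feedback – the dollar value of transactions never enters the score.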

Community operators have tools to fight manipulation, and are working on better ones. Digg is aware of a variety of manipulation techniques, and is developing ever-better algorithms for detecting voting patterns that suggest manipulation. One person controlling multiple accounts, nearly identical voting by separate accounts, and users who vote for an article without viewing it all raise flags at Digg.
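The “nearly identical voting” flag can be sketched in a few lines. Digg has not published its detector; the function, data, and threshold below are my own invention, using Jaccard similarity between two accounts’ sets of upvoted stories:

```python
# Sketch of a sockpuppet flag: suspiciously similar voting histories.
# The accounts, stories, and threshold are hypothetical.

def vote_similarity(votes_a, votes_b):
    """Jaccard similarity of two accounts' sets of upvoted stories."""
    a, b = set(votes_a), set(votes_b)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

SUSPICION_THRESHOLD = 0.9  # hypothetical cutoff for review

votes = {
    "alice": ["story1", "story2", "story3", "story9"],
    "sock1": ["story1", "story2", "story3", "story4"],
    "sock2": ["story1", "story2", "story3", "story4"],
}

for u, v in [("alice", "sock1"), ("sock1", "sock2")]:
    s = vote_similarity(votes[u], votes[v])
    print(f"{u} vs {v}: similarity={s:.2f} flagged={s >= SUSPICION_THRESHOLD}")
```

Here the two sockpuppets’ identical histories score 1.0 and get flagged, while an honest overlap of a few popular stories does not. A real system would also weigh timing, IP addresses, and whether the account ever viewed what it voted on.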

In the end, though, online reputation may be the best defense against reputation manipulation (even if that sounds a bit circular!). Communities that not only let members build reputation but also foster interaction between members will, in the long run, let the real experts rise to the top and out the phonies. Mob manipulation is all about shortcuts – how to produce the biggest impact with the smallest expenditure of time and money. It’s simply not economical to build a multi-year posting history with thousands of intelligent contributions just to promote some flash-in-the-pan product. I think most of the sites that have added user ratings, reviews, and comments in the last year will start adding user reputation features in the next year. Manipulation will still be possible, but strong user reputation features will raise the ante for would-be crowdhackers and make quick-hit spamming far less effective.


