In a recent interview, 21-year-old Amine Derkaoui described spending three weeks working in Morocco for oDesk, an outsourcing company used by Facebook to moderate content. Derkaoui’s job, which paid roughly $1 per hour, was essentially to enforce Facebook’s strange content standards – i.e., he was to delete any pictures of “cameltoes, moose knuckles, insides of skulls” or other banned images outlined in oDesk’s “Abuse Standards” operations manual. Derkaoui’s short career shed some light on a seedy facet of the social networking giant, which has been minting hundreds of new millionaires.
Other moderators – primarily young, well-educated people working in Asia, Africa and Central America – all describe similarly, ridiculously low salaries. Adam Levin, owner of the British social network Bebo, says that this kind of outsourcing is “rampant” across Silicon Valley. He adds, “We do it at Bebo. Facebook has so much content flowing into its system every day that it needs hundreds of people moderating all the images and posts which are flagged. That type of workforce is best outsourced for speed, scale and cost.”
About 4 billion pieces of content are shared every day among Facebook’s 845 million users. Most of it falls within acceptable standards, but a lot also falls into the categories of pornography, racism and violence – all of which is policed by an outsourced workforce in third-world countries, for $1 an hour. Graham Cluley of Sophos calls Silicon Valley’s outsourcing culture a “poorly kept dirty secret.” Levin estimates that Facebook employs between 800 and 1,000 workers through oDesk – about a third of its “regular” staff.
With Facebook mainly consisting of acquaintances posting pictures of their breakfast, photos of countless new babies that all look roughly the same, friend requests from strangers users knew for a day 15 years ago, and a generalized misrepresentation of one’s actual life and face, it is striking that the actual moderators of all this content don’t even undergo criminal background screening. According to Derkaoui, his past was never looked into, and there were no security measures stopping him from obtaining user information – nor any barrier blocking him from uploading whatever he wanted onto Facebook himself.
Regardless, Facebook has a statement on the matter: “these contractors are subject to rigorous quality controls and we have implemented several layers of safeguards to protect the data of those using our service. No user information beyond the content in question and the source of the report is shared. All decisions made by contractors are subject to extensive audits.” I’m inclined to go with what Derkaoui said.
Still, I find it hard to believe that any revelation about the reality of Facebook’s weak privacy standards will prompt more than a handful of its 845 million users to actually quit.