Another day, another call for Facebook to remove content.
This time it comes from officials in India, who claim that content hosted on Facebook has led to a mass panic and an exodus of tens of thousands of people from cities in the northeastern part of the country. According to the Wall Street Journal, the content that sparked the panic contained rumors that there would be some sort of attack in retaliation for violence that has rocked the northeastern state of Assam.
The violence in that area has already resulted in more than 78 deaths.
Facebook confirmed to the WSJ that they have been made aware of the requests and are working on a response.
“Facebook will remove content which breaches our terms,” the company said, calling on users to help flag content so it can quickly review and remove anything that violates its Statement of Rights and Responsibilities. The SRR states that content cannot “bully, intimidate, or harass any user.” Content will also be removed if it “is hate speech, threatening, or pornographic; incites violence; or contains nudity or graphic or gratuitous violence.”
So whether the offending pages are removed will come down to that: they must incite violence, or harass or intimidate users. If the pages advocated any sort of violent retaliation, it’s likely that Facebook will in fact remove them.
Of course, India has already done what it can on its own, blocking over 245 pages that contained the “inflammatory and hateful” content. But officials are calling on Facebook, Twitter, and Google to expedite the process. For its part, Google has also said that it’s on the case.
Facebook is no stranger to content removal requests. In March of 2011, Israel’s Minister of Public Diplomacy joined the calls for the company to remove a page called Third Palestinian Intifada. Facebook eventually did, as the page clearly advocated violence against an ethnic population. Facebook also recently removed a page targeting Australian Aborigines after a public outcry.
But Facebook is consistent when it gives everyone the “if it violates our terms” line. A page being merely offensive is not enough to warrant a takedown, according to the company. Take, for instance, the plethora of pages that popped up following the Denver theater massacre, either supporting alleged shooter James Holmes or making jokes about the victims. Facebook refused to remove many of them, saying that while they were “incredibly distasteful,” they didn’t violate its terms.