Facebook Clarifies Policy on Graphic Content

Josh Wolford · Social Media


Facebook says that when it comes to graphic, violent content posted on the site, they're going to start taking a more "holistic" approach.

We told you earlier this week that Facebook had reversed a months-old policy that banned all sorts of graphic videos - mainly beheading videos - and started allowing them to appear on the site. In May, after a lot of external pressure, Facebook yanked a specific video that showed a woman being decapitated, allegedly by her husband after he caught her cheating. Whether or not the backstory is entirely accurate is one thing, but the video itself was incredibly disturbing.

After first defending users' rights to "describe, depict and comment on the world in which we live," Facebook finally succumbed to the pressure and pulled the video.

But earlier this week, the BBC reported that Facebook was now allowing such videos to return to the site. We reached out and got confirmation from Facebook that people have the ability to share graphic, violent videos - just as long as they share them in an attempt to condemn the actions, not glorify them.

“Facebook has long been a place where people turn to share their experiences, particularly when they’re connected to controversial events on the ground, such as human rights abuses, acts of terrorism and other violent events. People share videos of these events on Facebook to condemn them. If they were being celebrated, or the actions in them encouraged, our approach would be different. However, since some people object to graphic video of this nature, we are working to give people additional control over the content they see. This may include warning them in advance that the image they are about to see contains graphic content,” a Facebook spokesperson told me.

Now, the company is making a more public clarification of its policy on graphically violent content:

As part of our effort to combat the glorification of violence on Facebook, we are strengthening the enforcement of our policies.

First, when we review content that is reported to us, we will take a more holistic look at the context surrounding a violent image or video, and will remove content that celebrates violence.

Second, we will consider whether the person posting the content is sharing it responsibly, such as accompanying the video or image with a warning and sharing it with an age-appropriate audience.

According to Facebook, "recent reports of graphic content" have led to the removal of some specific content after the company concluded that it "irresponsibly glorifies" the violence. But moving forward, it appears that Facebook will allow such content as long as people share it in the context of condemning it, and in a "responsible manner."

Still no luck on ever seeing any boobies, however.

Image via The Blaze

Josh Wolford
Josh Wolford is a writer for WebProNews. He likes beer, Japanese food, and movies that make him feel weird afterward. Mostly beer. Follow him on Twitter: @joshgwolf Instagram: @joshgwolf Google+: Joshua Wolford StumbleUpon: joshgwolf