
Is Facebook Getting ‘Freedom Of Expression’ Right?

Written by Chris Crum
    Facebook has recently taken some heat over its approach to content on the social network that depicts, glorifies and/or trivializes violence against women. Groups like Women, Action and The Media and The Everyday Sexism Project, along with no doubt countless individuals, have had enough.

    The aforementioned groups wrote an open letter to Facebook about the issue, and Facebook responded this week, indicating that it is making immediate changes, but some feel Facebook is walking a fine line between enforcing community standards and stifling free speech.

    Do you think Facebook needs to make significant changes to how it handles offensive content? Let us know what you think in the comments.

    In the letter, the groups called upon Facebook to do three things, specifically:

    1. Recognize speech that trivializes or glorifies violence against girls and women as hate speech and make a commitment that you will not tolerate this content.

    2. Effectively train moderators to recognize and remove gender-based hate speech.

    3. Effectively train moderators to understand how online harassment differently affects women and men, in part due to the real-world pandemic of violence against women.

    “To this end, we are calling on Facebook users to contact advertisers whose ads on Facebook appear next to content that targets women for violence, to ask these companies to withdraw from advertising on Facebook until you take the above actions to ban gender-based hate speech on your site,” the letter said.

    The groups formed a Twitter campaign using the hashtag #FBrape.

    Women, Action & The Media have been sharing a number of examples of the kind of content they’re concerned about. There are indeed some vile displays. Here are a couple, including one Facebook refused to remove because it “doesn’t violate Facebook’s Community Standard on graphic violence”.

    [Image: Facebook graphic violence example]

    Keep in mind that Facebook has removed the following: photos of exposed breasts, breastfeeding photos, photos of cartoon breasts, a photo of a woman’s elbows that resembled breasts, and, as recently as this month, artwork depicting actress Bea Arthur’s breasts.

    On a sidenote, Pinterest is going to start allowing nudity (as long as it’s “artistic”).

    Also among the examples given are groups like “Rapist Community,” “Slapping hookers in the face with a shoe,” and “Punching Rihanna”.

    Facebook put out its own letter, in which Marne Levine, VP of Global Public Policy at Facebook, wrote in direct response to the groups and their claims:

    We prohibit content deemed to be directly harmful, but allow content that is offensive or controversial. We define harmful content as anything organizing real world violence, theft, or property destruction, or that directly inflicts emotional distress on a specific private individual (e.g. bullying). A list of prohibited categories of content can be found in our Community Standards at www.facebook.com/communitystandards.

    In addition, our Statement of Rights and Responsibilities (www.facebook.com/legal/terms) prohibits “hate speech.” While there is no universally accepted definition of hate speech, as a platform we define the term to mean direct and serious attacks on any protected category of people based on their race, ethnicity, national origin, religion, sex, gender, sexual orientation, disability or disease. We work hard to remove hate speech quickly, however there are instances of offensive content, including distasteful humor, that are not hate speech according to our definition. In these cases, we work to apply fair, thoughtful, and scalable policies. This approach allows us to continue defending the principles of freedom of self-expression on which Facebook is founded. We’ve also found that posting insensitive or cruel content often results in many more people denouncing it than supporting it on Facebook. That being said, we realize that our defense of freedom of expression should never be interpreted as license to bully, harass, abuse or threaten violence. We are committed to working to ensure that this does not happen within the Facebook community.

    Facebook has vowed to take some new steps, which it said would begin rolling out immediately.

    First, Facebook said it will complete its review and update the guidelines its User Operations team uses to evaluate reports of violations of its Community Standards on hate speech, soliciting feedback from legal experts, representatives of the women’s coalition, and other groups. Second, it will update the training it gives to the teams that review and evaluate reports, again working with legal experts, the women’s coalition, and others.

    The company says it will establish more formal and direct lines of communications with representatives of women’s groups and others “to assure expedited treatment of content they believe violates” Facebook’s standards. Facebook also says it will encourage the Anti-Defamation League’s Anti-Cyberhate working group and other working groups to include representatives of the women’s coalition to “identify how to balance considerations of free expression, to undertake research on the effect of online hate speech on the online experiences of members of groups that have historically faced discrimination in society, and to evaluate progress on our collective objectives.”

    “We will increase the accountability of the creators of content that does not qualify as actionable hate speech but is cruel or insensitive by insisting that the authors stand behind the content they create,” says Levine. “A few months ago we began testing a new requirement that the creator of any content containing cruel and insensitive humor include his or her authentic identity for the content to remain on Facebook. As a result, if an individual decides to publicly share cruel and insensitive content, users can hold the author accountable and directly object to the content. We will continue to develop this policy based on the results so far, which indicate that it is helping create a better environment for Facebook users.”

    Since Facebook’s response, Women, Action, & The Media has put out a statement praising the company’s actions. “Facebook has already been a leader on the internet in addressing hate speech on its service,” it says. “We believe that this is the foundation for an effective working collaboration designed to confront gender-based hate speech effectively. Our mutual intent is to create safe spaces, both on and off-line. We see this as a vital and essential component to the valuable work that Facebook is doing to address cyber-bullying, harassment and real harm.”

    “We are hopeful that this moment will mark an historic transition in relation to media and women’s rights in which Facebook is acknowledged as a leader in fostering safer, genuinely inclusive online communities, setting industry precedents for others to follow,” the statement says. “We look forward to collaborating with these communities on actions both big and small until we live in a world that’s safe and just for women and girls, and for everyone.”

    Facebook is getting a lot of praise in general for its response, but some are worried about the freedom of speech implications. GigaOm senior writer Mathew Ingram, for example, asks, “Do we really want Facebook to decide what qualifies as hate speech and what doesn’t?”

    “The larger problem in making Facebook take this kind of content down, however, is that it forces the network to take an even more active role in determining which of the comments or photos or videos posted by its billion or so users deserve to be seen and which don’t,” he writes. “In other words, it gives Facebook even more of a licence to practice what amounts to censorship — something the company routinely (and legitimately) gets criticized for doing.”

    “It’s an increasingly slippery slope,” he says.

    There’s no question that Facebook has some terrible stuff on it. What else could you expect from a network that provides a home to over a billion people? But do Ingram and others who share this view have a valid point? Share your thoughts in the comments.
