Facebook has created a War Room ahead of the midterm elections, staffed with data scientists and specialists trying to stop the spread of what it considers fake news. However, many conservatives commenting on YouTube see it as an attempt to stop the spread of conservative thought.
“Calling it the ‘Department of Censorship’ would have been a bit too embarrassing,” one commenter said. Another called it “Facebook’s ‘Get Democrats elected in the mid-terms’ campaign HQ and a warm-up for their much larger ‘Get a Democrat in the White House 2020’ campaign.”
Perhaps reinforcing this perception of bias is the fact that the War Room is led by Nathaniel Gleicher, a former Obama Administration appointee who is now Head of Cybersecurity Policy at Facebook.
Here are some other comments from people concerned about Facebook bias against conservatives, as well as from those who simply object to the idea of censorship:
“They are only blocking conservatives.”
“They call it patrolling. I simply call it selective censoring.”
“DNC extension war room😔”
“This is scary. We must have a better way to screen fake news.”
“To be clear: had HRC won in 2016, none of this would be happening.”
“USA learning tricks from Chinese.”
Others are concerned that Facebook’s entire advertising model motivates polarization. “The advertising business model creates the wrong incentives for Facebook,” says Roger McNamee in a discussion on Bloomberg. “Essentially, it forces them to use highly addictive technology and to basically push people to increasingly extreme positions, so polarization is good for their business. Anger and fear are good for their business.”
Facebook is confident that the War Room will be effective in stopping election manipulation, especially by foreign actors. Nathaniel Gleicher, Head of Cybersecurity Policy at Facebook, and Samidh Chakrabarti, Facebook’s Head of Civic Engagement, discussed the War Room in a recent interview:
Nathaniel Gleicher – Head of Cybersecurity Policy at Facebook:
In order to manipulate public debate, first, you have to understand the culture you are targeting. There are always going to be more people inside a country who understand that than outside.
We are talking volume. The interference that comes from overseas can be particularly pernicious because there you have a nation state that’s looking to influence or manipulate or meddle in another country’s public debate.
Part of what we’ve tried to do, particularly as we need to move very quickly, is push as much of the decision making to the teams as possible. But obviously, there’s an escalation chain available so that when we need to move something up to Mark (Zuckerberg) or Sheryl (Sandberg), we can do it quickly.
I think our goal and our responsibility is to ensure that we are helping democracy more than we are hurting it. We are ready. That doesn’t mean that there aren’t going to be challenges. When you have malicious actors like this, there are always going to be unexpected threats.
Samidh Chakrabarti – Facebook Head of Civic Engagement:
Right now we have experts from across the company. Data scientists are looking at dashboards to see, for example, if there is any kind of spike in content that could be related to voter suppression, so we can prevent any of it from going viral.
Our investments in machine learning have actually allowed us to block fake accounts, usually at the moment of creation.