Facebook Moderators Want Remote Work, Hazard Pay

Written by Matt Milano
Facebook moderators are protesting the company’s decision to require them to come back to the office amid the pandemic.

For months, Facebook allowed content moderators to work from home. Recently, however, the company required them to return to the office. Moderators have penned an open letter criticizing executives for not taking their safety seriously and for not paying them enough to justify the risks Facebook requires them to take.

After months of allowing content moderators to work from home, faced with intense pressure to keep Facebook free of hate and disinformation, you have forced us back to the office. Moderators who secure a doctor’s note about a personal COVID risk have been excused from attending in person. Moderators with vulnerable relatives, who might die were they to contract COVID from us, have not.

The moderators take Zuckerberg to task for profiting significantly from the pandemic, his fortune nearly doubling during it, while passing none of those gains on to the people making Facebook’s success possible. While Zuckerberg is worth over $100 billion, the moderators are paid only roughly $18 an hour.

The letter also addresses the toxic nature of the job itself, a burden that has become intolerable under the added pressure of the pandemic.

Before the pandemic, content moderation was easily Facebook’s most brutal job. We waded through violence and child abuse for hours on end. Moderators working on child abuse content had targets increased during the pandemic, with no additional support.

Now, on top of work that is psychologically toxic, holding onto the job means walking into a hot zone. In several offices, multiple COVID cases have occurred on the floor. Workers have asked Facebook leadership, and the leadership of your outsourcing firms like Accenture and CPL, to take urgent steps to protect us and value our work. You refused. We are publishing this letter because we are left with no choice.

The moderators highlight how Facebook’s artificial intelligence algorithms have so far failed to replace the human element, making the moderators more important than ever. It remains to be seen whether Facebook will address the moderators’ concerns.

Their message, however, is clear:

Stop Needlessly Risking Moderators’ Lives
