More than 200 of Facebook Inc.’s content moderators said their lives are being put at risk by the requirement to work in offices in global hot spots during the pandemic.
“Now, on top of work that is psychologically toxic, holding onto the job means walking into a hot zone,” wrote the outside contractors and Facebook employees in a letter to the executives of the social-media company and the contracting companies released Wednesday. Moderators sift through explicit, violent and abusive content to remove it from Facebook’s social network.
“Workers have asked Facebook leadership, and the leadership of your outsourcing firms like Accenture and CPL, to take urgent steps to protect us and value our work. You refused.” Several Covid-19 cases have occurred at moderator workplaces, according to the letter.
At the start of the pandemic, Facebook asked content moderators to work mostly from home. But some content, such as explicit child imagery, legally must be reviewed in a secure environment; in those cases, Facebook either relied on its artificial intelligence to weed out bad content or asked some moderators to work in offices in places such as Austin, Texas.
The company had been relying more heavily on AI moderation in recent months so that human workers could focus on election-related content and Covid-19 misinformation. Now, more of Facebook’s contractors are back to working in offices. The letter writers said Facebook’s AI is simply not good enough to replace human judgment.
“Facebook’s algorithms are years away from achieving the necessary level of sophistication to moderate content automatically,” the moderators said in the letter. “They may never get there.”
The moderators, who work for contracting companies such as Accenture Plc, are asking for full-time employment, hazard pay and safer working conditions in light of the spread of Covid-19. Facebook didn’t immediately respond to a request for comment.