Facebook struggled to remove sensitive content under Covid lockdown
Photo caption: In March, Facebook sent many of its content reviewers home, […] Photograph: Herwin Bahar/Zuma/Rex/Shutterstock. By Alex Hern.
Fewer pieces of suicide, child nudity and exploitation content were removed after staff were sent home
Facebook has admitted it struggled to remove content that promoted suicide or exploited children after global lockdowns forced it to rely more heavily on automated moderation. […]
As well as the standard challenges of working from home, Facebook had to deal with other problems as it gradually built up its moderators’ capacity for remote working. Mark Zuckerberg said in March that the company faced data protection issues, which meant it could not allow some contractors to work on their own devices.
Mental health concerns also limited how much of the work the company could shift to remote settings, Zuckerberg said. Some of the most distressing work would be handled only by full-time staff who could still come into the office, since the infrastructure required to provide mental health support to contractors working remotely was not in place. That constraint appears to be what limited the company’s ability to respond to suicidal content and child exploitation material during the pandemic. […]