After multiple high-profile incidents of violence recently unfolded on Facebook, Mark Zuckerberg said in an announcement this week that the company will add 3,000 people over the next year to review videos and other flagged reports.

“If we’re going to build a safe community, we need to respond quickly,” Zuckerberg writes.

“We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.”

Last month, Facebook faced criticism after a Cleveland video showing a deadly shooting stayed on the site for hours.

Facebook apologised for its handling of the situation and pledged “to do better,” but in another incident in Thailand later that month, disturbing videos showing the murder of a child stayed up for a full day.

Zuckerberg writes that the company will add the 3,000 people to the 4,500 working on the “community operations team.”

The statement does not specify the workers’ exact relationship to Facebook, but the company, like others in the tech industry, has been known to outsource moderation to workers around the world.