Facebook plans to expand its community operations team by adding 3,000 people as part of a growing effort to screen and combat harmful content on the social network.
CEO Mark Zuckerberg made the announcement on his Facebook page Wednesday, pledging that the new hires will help the company “get better at removing things we don’t allow on Facebook like hate speech and child exploitation.”
Zuckerberg also said the company will simplify the process for users flagging questionable content for review, and will make it easier for employees to contact law enforcement when necessary.
The additional 3,000 screeners represent a near doubling of the company’s community operations team, currently 4,500 strong.
Moderators have struggled to take down horrific videos streamed in real time before they go viral. Last month, a 20-year-old man in Thailand killed his infant daughter and streamed the death on Facebook Live. Days before that, a 37-year-old man in Cleveland uploaded a video of himself shooting and killing another man on Easter Sunday. That video remained on Facebook for around two hours before moderators took it down.
If you or someone you know needs help, call 1-800-273-8255 for the National Suicide Prevention Lifeline. You can also text HELLO to 741-741 for free, 24-hour support from the Crisis Text Line. Outside of the U.S., please visit the International Association for Suicide Prevention for a database of resources.