Facebook Inc (NASDAQ:FB) is planning to hire 3,000 people to find and remove problematic content from its website.
In a post on Facebook, Mark Zuckerberg said it is “heartbreaking” to see “people hurting themselves and others” in videos posted on Facebook.
“If we’re going to build a safe community, we need to respond quickly,” Zuckerberg said, adding that the company is working to make it easier for users to report harmful and problematic videos and posts.
“So we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.”
Zuckerberg said that Facebook will hire the 3,000 people for its “community operations team” over the next year. The additional staff will join the 4,500 existing reviewers who handle the millions of reports Facebook receives every week.
The reviewers will also help the social media company remove hate speech and child exploitation from its platform.
In addition, members of Facebook’s community operations team will work with local community groups and law enforcement “to help someone if they need it — either because they’re about to harm themselves, or because they’re in danger from someone else,” Zuckerberg said.
Zuckerberg added that his company is building “better tools to keep our community safe.” The company is making it simpler for users to report problems, and at the same time it is developing tools that help its reviewers determine more quickly which posts violate the company’s standards.
“This is important. Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren’t so fortunate,” Zuckerberg said.
Last month, a man was killed in a video streamed live on Facebook. Later that same month, a Thai man killed his daughter in a live stream before committing suicide.