Facebook to Hire 3,000 People to Remove Violent Content, ‘Hate Speech’

GABRIEL BOUYS/AFP/Getty Images

Facebook is hiring 3,000 new employees for its “community operations team,” whose job will be to remove posts flagged as inappropriate by users.

Though the statement released by Facebook CEO Mark Zuckerberg indicated that the employees are being hired to combat the rise in live-streamed murders and suicides, Zuckerberg added that the expanded team would also allow the company to “get better at removing things we don’t allow on Facebook,” such as “hate speech.”

“Over the last few weeks, we’ve seen people hurting themselves and others on Facebook — either live or in video posted later. It’s heartbreaking, and I’ve been reflecting on how we can do better for our community,” announced Zuckerberg in the post on his page. “If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.”

Over the next year, we’ll be adding 3,000 people to our community operations team around the world — on top of the 4,500 we have today — to review the millions of reports we get every week, and improve the process for doing it quickly. These reviewers will also help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation. And we’ll keep working with local community groups and law enforcement who are in the best position to help someone if they need it — either because they’re about to harm themselves, or because they’re in danger from someone else.

In addition to investing in more people, we’re also building better tools to keep our community safe. We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help. As these become available they should help make our community safer.

This is important. Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren’t so fortunate.

No one should be in this situation in the first place, but if they are, then we should build a safe community that gets them the help they need.

Posted by Mark Zuckerberg on Wednesday, May 3, 2017

“Keeping people safe is our top priority,” added Facebook COO Sheryl Sandberg in a comment. “We won’t stop until we get it right.”

Just last week, an Alabama man live-streamed his suicide on Facebook Live, and a Thai father streamed himself hanging his 11-month-old child before killing himself.

In January, four people were arrested in Chicago after they filmed themselves torturing and beating a tied-up disabled man, who was repeatedly forced to say, “f*ck Trump,” and, “f*ck white people.” On Easter Sunday, Facebook user Steve Stephens filmed himself shooting and killing 74-year-old Robert Godwin Sr., then evaded police for days before committing suicide during a standoff.

Following the revelation that the video of the murder had remained on Facebook for hours before it was removed, the company admitted in a statement that it needed to “do better.”

“As a result of this terrible series of events, we are reviewing our reporting flows to be sure people can report videos and other material that violates our standards as easily and quickly as possible,” said Facebook in its statement. “We disabled the suspect’s account within 23 minutes of receiving the first report about the murder video, and two hours after receiving a report of any kind… But we know we need to do better.”

Zuckerberg’s mention of further tackling hate speech, however, could indicate further censorship of conservative and libertarian users, who have frequently been suspended and sanctioned on the platform.

Last year, Facebook suspended Gab CEO Andrew Torba for sharing a video posted by former Breitbart senior editor Milo Yiannopoulos, while Breitbart reporter James Delingpole was banned for the same reason in March.

In February, Facebook temporarily banned transgender anti-SJW commentator Blaire White, and in October 2016, Lucian Wintrich, the Gateway Pundit’s White House correspondent, was banned for a month after he posted a screenshot of hate mail he had received.

Several comedy pages have also been sanctioned by Facebook over the last two years, including a page that mocked failed presidential candidate Hillary Clinton, a popular anti-feminist page, and various others.

Charlie Nash is a reporter for Breitbart Tech. You can follow him on Twitter @MrNashington or like his page on Facebook.
