Facebook Using Artificial Intelligence to Scan Users’ Posts for Suicide Warning Signs

This Monday, June 19, 2017, photo shows a user signing in to Facebook on an iPad. (AP Photo/Elise Amendola)

Facebook hopes to use artificial intelligence to detect suicidal users on its platform, after successfully testing the feature in the United States.

The AI reportedly detects users who display “suicidal intent.” Facebook began testing the feature in March, “when the company started scanning the text of Facebook posts and comments for phrases that could be signals of an impending suicide,” according to Reuters.

“Facebook has not disclosed many technical details of the program, but the company said its software searches for certain phrases that could be clues, such as the questions ‘Are you ok?’ and ‘Can I help?’” Reuters reported. “If the software detects a potential suicide, it alerts a team of Facebook workers who specialize in handling such reports. The system suggests resources to the user or to friends of the person such as a telephone help line. Facebook workers sometimes call local authorities to intervene.”
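Since Facebook has not published technical details, any reconstruction is speculative. The behavior Reuters describes amounts to scanning post and comment text for signal phrases and routing matches to a human review team. The Python sketch below is a minimal, hypothetical illustration of that kind of phrase matching; the phrase list, function name, and routing step are assumptions for demonstration, not Facebook's actual system, which would presumably rely on far more sophisticated machine-learning classifiers.

```python
# Hypothetical sketch of phrase-based flagging -- NOT Facebook's actual
# implementation. The phrases below are the two examples Reuters cited as
# possible clues; everything else is an assumption for illustration.

SIGNAL_PHRASES = [
    "are you ok",
    "can i help",
]

def flag_for_review(post_text: str) -> bool:
    """Return True if the text contains any signal phrase (case-insensitive)."""
    lowered = post_text.lower()
    return any(phrase in lowered for phrase in SIGNAL_PHRASES)

# Per Reuters, flagged posts are not acted on automatically: they go to a
# queue for a specialist human team, which may suggest help-line resources
# or, in some cases, contact local authorities.
posts = [
    "Had a great weekend hiking!",
    "Are you OK? I'm worried about you.",
]
review_queue = [p for p in posts if flag_for_review(p)]
print(review_queue)  # -> ['Are you OK? I'm worried about you.']
```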

In a statement, Facebook’s Vice President of Product Management, Guy Rosen, claimed that within the past month, “first responders checked on people more than 100 times after Facebook software detected suicidal intent.”

“Speed really matters. We have to get help to people in real time,” Rosen proclaimed, adding that Facebook has specialist employees who are able to contact authorities in various languages.

It is currently unknown where the feature will expand next; however, it will not be used in the European Union “due to sensitivities.”

There have been numerous cases of Facebook users livestreaming their suicides with the platform’s Facebook Live feature.

In May, a man committed suicide by setting fire to himself in a Facebook Live stream, while in April, an Alabama man livestreamed his suicide on the platform following a breakup.

In the same month, a Thai father streamed himself hanging his 11-month-old child before killing himself, and in January, there were at least two cases of suicide livestreamed on Facebook.

In May, police officers were able to save one teenager who attempted suicide on Facebook Live, after concerned viewers called 911.

Earlier this year, Facebook CEO Mark Zuckerberg announced that he would be hiring 3,000 new employees to sort through violent content on the platform, while Facebook also introduced a new feature to combat suicides specifically on Facebook Live.

“We know we need to do better,” said the company in a statement following numerous incidents.

Charlie Nash is a reporter for Breitbart Tech. You can follow him on Twitter @MrNashington and Gab @Nash, or like his page at Facebook.
