Facebook has just introduced new artificial intelligence tools that will help with suicide prevention, according to Facebook Vice President Guy Rosen.
Rosen said in a statement Monday that the new technology will help prevent suicide by recognizing patterns in users’ posts and analyzing the videos they share. He said the company will improve how first responders are contacted and dedicate more reviewers to reports of possible self-harm.
“Facebook is a place where friends and family are already connected and we are able to help connect a person in distress with people who can support them,” he said of the company’s new policies. “It’s part of our ongoing effort to help build a safe community on and off Facebook.”
Rosen wrote that over the past month, Facebook had worked with first responders on more than 100 wellness checks based on reports the company’s systems detected, and that those reports are handled quickly.
“We’ve found these accelerated reports — that we have signaled require immediate attention — are escalated to local authorities twice as quickly as other reports,” Rosen said.
Rosen wrote that Facebook watches for comments such as “Are you ok?” or “Can I help?” that suggest something may be wrong. The company’s AI tools will also help monitor Facebook Live. Rosen encouraged users to report any post that may indicate a risk of self-harm directly to Facebook.
The social media platform is working to make the tools available in the United States and, eventually, most of the world.