
Facebook rolls out new AI software designed to identify and assist suicidal users

Published Nov 27th, 2017 8:01PM EST


In a testament to how entrenched Facebook has become in our day-to-day lives, it’s not uncommon for users who die by suicide to leave telling videos or posts on their wall in the preceding days. In an effort to proactively tackle the problem, Facebook is rolling out a suite of new AI tools designed to detect posts that may suggest a particular user is suicidal. When cryptic posts are essentially cries for help, identifying them sooner rather than later can ultimately help save lives.

As part of Facebook’s new push to identify situations that warrant more attention, the social networking giant’s new software can analyze posts and even live video streams for signs that a user might be having suicidal thoughts. What’s more, the software even takes comments on a worrying post into account, with Facebook noting that comments like “Are you ok?” can be strong indicators that something is wrong. The new AI initiative is rolling out in international markets today and will eventually make its way to the United States. Notably, Facebook says the feature will not be available in the EU.
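Facebook hasn’t published the details of its classifier, but the comment-based signal it describes can be illustrated with a deliberately simplified sketch. The phrase list and scoring function below are entirely hypothetical, not Facebook’s implementation; a production system would use a trained model rather than keyword matching:

```python
# Hypothetical illustration only: score a post's comments for concern
# signals like "Are you ok?", as described in Facebook's announcement.
# The phrase list and threshold are made up for demonstration.
CONCERN_PHRASES = ["are you ok", "are you okay", "is everything alright",
                   "can i help", "please call me"]

def concern_score(comments):
    """Return the fraction of comments containing a concern phrase."""
    if not comments:
        return 0.0
    hits = sum(
        any(phrase in comment.lower() for phrase in CONCERN_PHRASES)
        for comment in comments
    )
    return hits / len(comments)

comments = ["Are you ok??", "great photo!", "Please call me when you see this"]
score = concern_score(comments)  # 2 of 3 comments contain a concern phrase
```

In practice a score like this would be just one feature among many (post text, video signals, reports) feeding a prioritization queue for human reviewers, as Facebook describes below.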

Facebook explains how the new initiative works as follows:

  • We are also using artificial intelligence to prioritize the order in which our team reviews reported posts, videos and live streams. This ensures we can get the right resources to people in distress and, where appropriate, we can more quickly alert first responders.
  • Context is critical for our review teams, so we have developed ways to enhance our tools to get people help as quickly as possible. For example, our reviewers can quickly identify which points within a video receive increased levels of comments, reactions and reports from people on Facebook. Tools like these help reviewers understand whether someone may be in distress and get them help.
  • In addition to those tools, we’re using automation so the team can more quickly access the appropriate first responders’ contact information.

Facebook adds that it has employees working around the clock, 365 days a year, monitoring posts that might suggest a user is suicidal. At the same time, Facebook is working closely with first responders and support groups to help distressed users as quickly as possible.

You can read more about Facebook’s suicide prevention measures in the company’s announcement.

Yoni Heisler Contributing Writer

Yoni Heisler has been writing about Apple and the tech industry at large for over 15 years. A lifelong Mac user and Apple expert, his writing has appeared in Edible Apple, Network World, MacLife, Macworld UK, and TUAW.

When not analyzing the latest happenings with Apple, Yoni enjoys catching Improv shows in Chicago, playing soccer, and cultivating new TV show addictions.