In recent months, a number of Facebook users have taken their own lives on live stream, prompting the popular social media platform to launch an artificial intelligence program to help identify at-risk users and intervene before they harm themselves.
The company said in a blog post that its engineers are devising software to identify concerning posts or videos and, once such content is flagged, automatically offer mental health resources. The post added that suicide is the second leading cause of death among people aged 15 to 29.
However, research has shown that the most effective way to help a suicidal person is through contact and counseling from family members and friends.
Facebook has previously announced features that enable friends to report concerning posts; after receiving such a report, Facebook sends the user a confidential message offering help and assistance in reaching out to friends or professional support.
The new features include allowing users to report concerning live videos, offering help through crisis support groups and live chat support, and using artificial intelligence to identify at-risk users.
Dr. John Draper, director of the United States National Suicide Prevention Lifeline, told the BBC: “It’s something that we have been discussing with Facebook. The more we can mobilise the support network of an individual in distress to help them, the more likely they are to get help. The question is how we can do that in a way that doesn’t feel invasive. I would say though that what they are now offering is a huge step forward.”
This latest addition of artificial intelligence followed the death of a 14-year-old girl in Miami, who live-streamed her suicide in January. Facebook has said it has already begun taking measures to stop such incidents from happening.
If someone watching a live stream finds it concerning, they can report it through a menu; advice is then displayed to them, suggesting different avenues of support for the broadcaster.
Jennifer Guadagno, the lead researcher on Facebook’s project, said: “Some might say we should cut off the stream of the video the moment there is a hint of somebody talking about suicide. But what the experts emphasised was that cutting off the stream too early would remove the opportunity for people to reach out and offer support.”
She added: “So, this opens up the ability for friends and family to reach out to a person in distress at the time they may really need it the most.”