Facebook is expanding its limited tests of suicide and self-harm reporting tools to everyone. To get better at detection, the social network will begin using pattern recognition on posts and Live videos to spot when somebody could be expressing suicidal thoughts. From there, VP of product management Guy Rosen writes, Facebook will focus on improving how it alerts first responders when the need arises. It will also dedicate more human reviewers to posts flagged by its algorithms.
For now, the proactive AI detection tools are only available in the U.S., but they will soon roll out around the world, with the exception of EU countries. In the past month, Facebook has alerted more than 100 first responders to potentially life-threatening posts, in addition to those reported by someone's friends and family.
Apparently, comments like "Are you okay?" and "Can I help?" are good indicators that somebody might be going through a very dark moment. More than that, Rosen says that thanks to those phrases, the algorithms have picked up on videos that might otherwise have gone unnoticed.
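Facebook hasn't published how its models work, but the idea of treating concerned comments as a signal can be illustrated with a deliberately naive sketch. The phrase list, scoring function, and threshold below are all hypothetical, invented for illustration; the real system presumably uses trained classifiers, not keyword matching.

```python
# Illustrative only: these phrases and the threshold are assumptions,
# not Facebook's actual detection logic.
CONCERN_PHRASES = ["are you okay", "can i help"]

def comment_concern_score(comments):
    """Return the fraction of comments containing a concern phrase."""
    if not comments:
        return 0.0
    hits = sum(
        any(phrase in c.lower() for phrase in CONCERN_PHRASES)
        for c in comments
    )
    return hits / len(comments)

def should_flag_for_review(comments, threshold=0.3):
    """Flag a post for human review when enough comments express concern."""
    return comment_concern_score(comments) >= threshold

comments = ["Are you okay?", "love this", "Can I help in any way?"]
print(should_flag_for_review(comments))  # → True (2 of 3 comments match)
```

Even this toy version shows why humans stay in the loop: a phrase match only routes a post to a reviewer, it doesn't decide anything on its own.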
"With all the fear about how AI may be harmful in the future, it's good to remind ourselves how AI is actually helping save people's lives today," CEO Mark Zuckerberg wrote in a post on the social network.
Having come under fire for experimenting with whether tweaking your News Feed can alter your mood, Facebook has some image repair to do these days. Stories like this can help, but until the successes far outnumber the unfortunate happenstances, the social network needs to keep at it.
If you or somebody you know is experiencing suicidal thoughts, don't hesitate to contact the National Suicide Prevention Lifeline at 1-800-273-8255. The line is open 24/7, and online chat is also available if a phone isn't an option.