Facebook is expanding its limited test of suicide- and self-harm reporting tools to everyone. To get better at detection, the social network will begin applying pattern recognition to posts and Live videos to spot when someone may be expressing suicidal thoughts. From there, VP of product management Guy Rosen writes that Facebook will also work on alerting first responders more quickly when the need arises, and it will have more human reviewers looking at posts flagged by its algorithms.
Currently the passive AI detection tools are only available in the US, but they will soon roll out worldwide, except in European Union countries. In the past month, Facebook has contacted over 100 first responders about potentially life-threatening posts, in addition to the posts reported by someone’s friends and family.
Apparently, comments like “Are you okay?” and “Can I help?” are good indicators that someone might be going through a very dark moment. Rosen says that, thanks to the algorithms picking up on phrases like these, Facebook has caught videos that might otherwise have gone unnoticed.
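Facebook hasn’t published how its models work; as a rough illustration of the idea only, a simple phrase-based signal over a post’s comments might look like the sketch below. The phrase list, the `concern_score` and `should_escalate` functions, and the threshold are all hypothetical, not part of Facebook’s actual system, which relies on trained pattern-recognition models and many more signals.

```python
# Minimal sketch (NOT Facebook's actual system): flag a post for human review
# when its comments contain concern phrases like the ones Rosen mentions.
from typing import List

# Hypothetical phrase list; the real system learns signals rather than
# matching a fixed set of strings.
CONCERN_PHRASES = ["are you ok", "are you okay", "can i help"]

def concern_score(comments: List[str]) -> float:
    """Return the fraction of comments that contain a concern phrase."""
    if not comments:
        return 0.0
    hits = sum(
        any(phrase in comment.lower() for phrase in CONCERN_PHRASES)
        for comment in comments
    )
    return hits / len(comments)

def should_escalate(comments: List[str], threshold: float = 0.3) -> bool:
    """Escalate to human reviewers when enough comments express concern."""
    return concern_score(comments) >= threshold

# Example: two of three comments contain concern phrases, so this returns True.
print(should_escalate(["Are you okay??", "Can I help?", "nice photo"]))
```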
“With all the fear about how AI may be harmful in the future, it’s good to remind ourselves how AI is actually helping save people’s lives today,” CEO Mark Zuckerberg wrote in a post on the social network.
Source: Engadget.com