Facebook Has Serious New Suicide Prevention Tools

Yesterday, Facebook announced new measures to help users who might be at risk for self-harm. Too often, we hear stories about Facebook users who post about their distress and end up hurting themselves, or worse, while their friends don’t know what to do.

Facebook’s new tool will allow friends to report statuses that are alarming or concerning. The post will be reviewed by a member of Facebook’s safety team, and then the distressed user will be sent a message from Facebook, alerting them that a friend reported their status and urging them to talk to someone from the National Suicide Prevention Lifeline, or reach out to a friend.

You might recall the failed Samaritans Radar app, a well-intentioned effort to identify tweets that indicated a risk of self-harm. The problem was that Radar allowed people to sign up to monitor their friends’ Twitter accounts without asking those friends for permission to be monitored. A Twitter user who was struggling emotionally could have been tracked by anyone, at any time, without ever knowing. Beyond taking away their agency and invading their privacy, Radar also threatened to make it easy for online bullies to identify people who were struggling and troll them into self-harm.

Facebook’s measures seem, in theory at least, to be a marked improvement on that model. Any status that reaches the safety team will have been posted voluntarily by the distressed user. There’s no tracking or monitoring function; the tool only comes into play when a friend notices a status and sees cause for concern. The report is reviewed before anyone is contacted, and the contact itself is a pre-planned message of support: it states that a friend was concerned, reassures the user that they aren’t alone and that Facebook offers help to many people, and, perhaps most importantly, asks what they would like to do, with the options “Talk to someone,” “Get tips and support,” or “Skip this.” Everything is voluntary, everything is opted in, and the person who’s struggling maintains their agency.


We’ll see how it actually works in practice, and whether it helps to prevent episodes of self-harm or makes Facebook a better experience. Most of the time, I have very few good things to say about Facebook, but this one seems like a genuine win.


[Daily Dot]

[Rolling Stone]

[Inquisitr]

[The Smoking Gun]

[Images via Facebook, Shutterstock]


Send me a line at [email protected].