Facebook Adds Lifelines to Prevent Suicide

3/3/2015 (Updated: 2/1/2016)

In addition to vacation photos and cat videos, people also share details about their personal lives and feelings on Facebook—including occasional posts about despair and even thoughts of suicide. As the world’s biggest social network, with more than 1.39 billion users, Facebook is uniquely able to provide online resources and support to help suicidal people.

That’s the goal of a new collaboration between Facebook and researchers at Forefront: Innovations in Suicide Prevention, an organization based in the University of Washington’s School of Social Work. Working with Forefront and other mental health experts, Facebook has enhanced its suite of tools to support suicidal people and to show those who see and report suicidal posts how they can help.

How It Works

When someone sees a post that suggests its author might be considering suicide, they can report it to Facebook through a dropdown menu on the post.

That reporting activates a series of responses. The person who flags the post will see a screen with links that allow him or her to send a message to the potentially suicidal person, contact another Facebook friend for support, or connect with a trained professional at a suicide helpline for guidance.

Facebook will then review the reported post. If the poster is thought to be in distress, a series of screens will automatically launch when that person next logs onto Facebook, with suggestions for getting help.

The responses link to a number of positive options, including videos from Now Matters Now, an online program started by Forefront research scientist Ursula Whiteside, which uses real-life accounts of people who have struggled with suicidal thoughts to provide research-based coping strategies.

If the author of a reported post is thought to be suicidal, a series of screens will launch to offer help. (Credit: Facebook)

The tools aim both to direct suicidal people to resources and alternatives and to guide concerned friends or family members through a situation most are simply not equipped to handle.

“Often, friends and family who are the observers in this situation don’t know what to do,” says Holly Hetherington, a Facebook content strategist working on the project. “They’re concerned, but they’re worried about saying the wrong thing or somehow making it worse. Socially, mental illness and thoughts about suicide are just not something we talk about.”

Stephen Paul Miller knows that all too well. Now Forefront’s operations manager, Miller lost a friend and college classmate to suicide five years ago. One night, he noticed a Facebook post from his friend saying that things were too much, that he couldn’t take it anymore. Alarmed, Miller resolved to call his friend in the morning. His friend died that night.

“The thing that breaks my heart the most about this is that I think it was just episodic. I don’t think he wanted to die,” Miller says. “But I was not trained. I did not know what to do.”

For immediate, confidential help from a trained counselor for yourself or someone you know, call the Suicide Prevention Lifeline at 1-800-273-TALK (8255).

Offering Support

The initiative began after a summit Facebook hosted about a year ago to discuss how technology companies could most effectively combat suicide. Facebook was already working with researchers on promoting compassion and preventing online bullying and wanted to do something similar around suicide prevention.

“We realized there’s a lot we don’t know. We are by no means experts in this space,” says Jennifer Guadagno, a Facebook researcher.

Guadagno reached out to Jennifer Stuber, an associate professor of social work at the University of Washington, who had started Forefront after her husband died by suicide in 2011.

Teams from Facebook and Forefront began working together last fall, starting with discussions that defined and framed the issue. The conversations included suicide-attempt survivors from the Now Matters Now project, who were instrumental in helping Facebook understand the spectrum of suicidal thoughts and how language commonly used around suicide can be insensitive—for example, saying someone “commits” suicide, the same term used for carrying out a crime.

Whiteside, who has herself struggled with suicidal thoughts, says when family or friends express fear or judgment to a suicidal person, they can unwittingly increase an already overwhelming sense of aloneness.

“People just don’t know what to do, and why would they?” she says. “As a society, we really need support in knowing how to respond to someone who’s suffering, and our work with Facebook is a first step.”

Stuber says Facebook has an opportunity to increase social media’s value as a force for good.

Source: University of Washington. This article was originally published on Futurity.org