In addition to vacation photos and cat videos, people also share details about their personal lives and feelings on Facebook—including occasional posts about despair and even thoughts of suicide. As the world’s biggest social network, with more than 1.39 billion users, Facebook is uniquely able to provide online resources and support to help suicidal people.
That’s the goal of a new collaboration between Facebook and researchers at Forefront: Innovations in Suicide Prevention, an organization based in the University of Washington’s School of Social Work. Working with Forefront and other mental health experts, Facebook has enhanced its suite of tools to support suicidal people and to guide those who see and report suicidal posts on how they can help.
How It Works
When someone sees a post suggesting that its author might be considering suicide, they can report the post to Facebook through a dropdown menu on the post itself.
That reporting activates a series of responses. The person who flags the post will see a screen with links that allow him or her to send a message to the potentially suicidal person, contact another Facebook friend for support, or connect with a trained professional at a suicide helpline for guidance.
Facebook will then review the reported post. If the poster appears to be in distress, a series of screens with suggestions for getting help will automatically launch the next time that person logs in to Facebook.
The responses link to a number of positive options, including videos from Now Matters Now, an online program started by Forefront research scientist Ursula Whiteside, which uses real-life accounts of people who have struggled with suicidal thoughts to provide research-based coping strategies.
