What Facebook's Doing To Help Prevent Suicide
The social media giant has unveiled new features to make it easier for people experiencing suicidal thoughts to find help. (Photo: Facebook)
It’s not an uncommon scenario: You scroll through your Facebook news feed, and you see a post that concerns you — a post indicating potential for self-harm.
But what do you do?
Facebook has actually had suicide prevention strategies in place since 2008, with workers on call around the clock in Ireland, India, and the United States to respond to posts flagged as concerning. But the site never really had expert input to ensure it was getting its response right. “We’re pretty diligent about monitoring reports, but we’re by no means the expert,” Rob Boyle, product manager at Facebook, tells Yahoo Health.
But now that’s changing. Facebook has partnered with Forefront: Innovations in Suicide Prevention, an organization based at the University of Washington’s School of Social Work, to provide resources, advice, and support to anyone who may be struggling with suicidal thoughts, as well as resources for concerned friends to respond to and report a post that worries them.
The updated tools were announced at Facebook’s fifth Compassion Research Day this week, held at Facebook headquarters in Menlo Park, California. You can read the full announcement on the site’s safety page.
Changing Social Media’s Reputation
Social media has a “bad rap” in the world of suicide prevention, as many cries for help on the web go unacknowledged. But Facebook’s approach impressed Jennifer Stuber, a University of Washington professor of social work, who was personally affected by tragedy when her husband took his life in 2011.
“They really wanted to act positively and proactively,” she tells Yahoo Health. “Being socially connected is a big factor in suicide prevention. Especially when you’re a concerned friend, giving very clear messaging about engaging with the person and responding in as compassionate a way as possible was important.”
Related: Talk Therapy Substantially Reduces Suicide Risk, Study Finds
Ursula Whiteside, a research scientist for Forefront, led the development of much of the suicide-prevention material that will be used on Facebook. She also created Now Matters Now, a first-of-its-kind site with innovative intervention techniques for those experiencing suicidal thoughts; its premise is that mere moments matter when it comes to saving a life.
Whiteside helped translate effective suicide-prevention techniques to Facebook. “The content development included the voices of others who have been personally affected by suicide — and that’s never happened before, to have them at the table and say, ‘You are legitimate, and your voice matters,’” she tells Yahoo Health. “So, the experience was informed by and reviewed by those who have lived it.”
Considering 81 percent of people in the U.S. are on Facebook, and the country sees 41,000 suicide deaths each year, these tools could have a huge impact, Boyle says. “Since Facebook is essentially a communication tool, we really just played to our strengths,” he says. “When someone reaches out, we can say, ‘OK, here are your 10 closest friends,’ or ‘Here’s the phone number of someone who can help’ — because we have that information.”
Related: 15 Myths and Facts About Suicide and Depression
How To Flag A Worrying Post
If you’re worried about someone who might be in distress based on their posts or activity, here’s what to do:
On the drop-down menu on a Facebook post, you can report that a post “should not be on Facebook.”
(Photo: Facebook)
You can then mark that it’s “hurtful, threatening or suicidal.” That will bring you to a prompt asking you how the post is harmful.
(Photo: Facebook)
From there, you’ll have several options. If you think the person is in imminent danger, Facebook will ask you to dial emergency services right away.
(Photo: Facebook)
Otherwise, you can choose to message the person in distress.
(Photo: Facebook)
You can also ask a closer friend to reach out, or ask Facebook to look into it.
(Photo: Facebook)
Facebook will offer guided help and advice for dealing with distressing emotions, as developed and reviewed by the experts. This includes immediate tips for handling negative thoughts…
(Photo: Facebook)
… videos from people who have dealt with suicide and suicidal thoughts …
(Photo: Facebook)
… and the opportunity to connect with trusted friends or helplines.
(Photo: Facebook)
“We spent a lot of time crafting this,” Boyle says. “We will basically be interrupting their Facebook experience to ask, ‘How can we help?’”
Although Boyle says Facebook doesn’t release specific data on the volume of these types of distressing posts, it’s an issue that’s been on the site’s radar for years. “What I can tell you is that this is one of the rarest reports we get — but we do prioritize these above others,” he says. “Now that we’ve done the research on providing help, we’re focusing on making the users more aware.”
Right now, these suicide-prevention tools are limited to U.S. Facebook users, but the site has plans to slowly roll them out globally over the next several months.
Other forms of social media do have some distress-response mechanisms in place. For instance, if you search Tumblr for a self-harm term like “cutting,” a screen pops up asking if everything is OK and offering information about seeking help. The site also prohibits content promoting self-harm, self-mutilation, and eating disorders.
The “Everything okay?” message from Tumblr comes up when a user searches for a self-harm-related keyword. (Photo: Tumblr)
Pinterest does not allow posts that promote self-harm or eating disorders, and encourages users to report this kind of content to administrators. On Twitter, you can report a user who has posted concerning tweets, and the platform will reach out and direct the person in distress to help.