How Facebook plans to prevent suicides – it starts with you

At its annual Compassion Research Day, Facebook announced that it would be rolling out new tools to assist in suicide prevention for users who display signs of distress.

February 26, 2015

People use Facebook as a stage to play out their hopes, dreams, fears, anxieties, and sometimes, the lowest possible points in their lives.

There has been a growing call for Facebook to step up its efforts to address suicidal posts, and the company has taken a significant step toward providing help to those who are asking for it.

At the social media site’s fifth Compassion Research Day, Facebook announced new tools to assist concerned friends and family members, as well as those thinking of taking their own lives.

Social media "got a bad rap because there has been a call for help that hasn’t been answered and that’s what Facebook is really trying to do, provide tools to its users to change that,” says Jennifer Stuber, a consultant to the project and faculty director of Forefront, a mental health group in Seattle. “I definitely see this as a really, really good thing. I underscore that because there are roughly 41,000 suicide deaths [a year], and we know that many of those people who died by suicide were Facebook users.”

Facebook teamed up with Forefront, Save.org, the University of Washington, and others, and consulted individuals who have dealt with self-injury or suicide, to overhaul its prevention tools.

Originally launched in 2011, Facebook’s first attempt to address suicidal content required links or screenshots to be uploaded to its suicide prevention page. The updated process is simpler, allowing users to flag a concerning post from within the post itself. From there, the user can choose to message the potentially troubled individual, contact another Facebook friend for support, or be connected with an expert for guidance.

If Facebook’s review finds that the user appears to be suicidal, a series of screens will offer assistance and ways to get help the next time the person logs in, including links to sites such as the National Suicide Prevention Lifeline and Now Matters Now.

For now, the new tool is limited to a few users in the US, but Facebook plans to roll out the service nationally over the next few months and eventually to other countries. Facebook also stressed that the service is not a substitute for local emergency services.
 
Access to help is crucial for those who seek to end their lives, and family and friends are often unaware of how to handle the situation. The stigma surrounding mental health can leave those dealing with a loved one’s depression in the dark, and many do not know how to approach these individuals. This could be an important lifeline not only for those who feel hopeless, but also for those who wish to help but do not know how.

But one crucial aspect for those seeking help for such personal issues is trust.

“My understanding from knowing many people who have lived experiences with being suicidal is that it is very, very important to have trust in a relationship with somebody who’s reaching out to you,” says Dr. Stuber. “A lot of people are my friends on Facebook, [but] if I were feeling suicidal, I would not necessarily be helped by a good portion of them reaching out to me because they’re not my trusted friends and my close community.”

If trust is an essential piece of this feature, Facebook’s past press may make some users wary of the tool.

Last year, Facebook faced a scandal after it revealed that it had conducted a mood manipulation study on 700,000 unsuspecting users. Two years earlier, researchers had tweaked the contents of hundreds of thousands of news feeds to give them a more positive or negative overtone. While the study demonstrated that other users (and Facebook) can toy with people's emotions, the backlash from lawyers, politicians, and Internet activists was swift. The study was labeled "scandalous," "spooky," and "disturbing."

The echoes of past controversies followed Facebook to its annual Compassion Research Day. One audience member asked whether Facebook was worried about criticism of its suicide prevention program from people who did not want to be monitored – to which employees responded that they had worked hard to avoid exactly that.

“My experience has been that they have been very, very concerned about people’s privacy, first and foremost, and not being perceived as Big Brother-ish and definitely not trolling,” says Stuber, who admits she was initially apprehensive about working with Facebook. “I overcame that because the potential here for good in the suicide prevention space is huge because Facebook's mission is about socially connecting people and that could potentially be a really big positive factor in preventing suicide if they can help do that well.”

Stuber added that this was a good start, but the project “definitely” needed further development. She noted that no machine learning was involved in the process, though she said many in the suicide prevention field would welcome automated detection to flag cries for help that otherwise go unnoticed.

Whether or not Facebook is the most trustworthy ally in this fight to save lives, the company is in a unique position.

“I think with issues that are life and death, social media has an ethical obligation, especially if they are dealing with issues with minors and the suicide prevention space ... to be proactively engaged,” says Stuber. “What they’re doing now is providing resources and tools to people who basically are asking for help.”