Among the many posts you see day-to-day on Facebook, some are bound to come from friends who are not feeling their best. An acquaintance may post about her low self-esteem, or a cousin about his depression. Some people, including celebrities, have even posted suicide notes on their pages. Yesterday, Facebook unveiled an update intended to help the platform and its users take a more active role in suicide prevention.
According to The New York Times, these features will include a tool for Facebook users to mark posts from friends that discuss suicide or self-harm. After messages have been flagged, a four-person suicide prevention team will read the post and contact the Facebook user to offer support.
Suicide is the tenth leading cause of death in the United States, and the second leading cause among people between the ages of 15 and 24. The U.S. suicide rate is now at its highest point in thirty years, with the most dramatic increases among women and middle-aged people. A Pew Research study shows that 72 percent of online Americans have a Facebook account, rising to 77 percent among online women. As unrelated as these facts may seem, together they suggest that Facebook's new policies could have a real impact on suicide among its more than one billion users.
With such a large user base, Facebook is well positioned to play a major role in suicide prevention.
According to research conducted by Facebook in February, approximately one-third of posts shared by Facebook users express negative feelings in some way. These negative posts draw more responses from other users, in the form of longer and more emotionally supportive comments. Posts referring to feelings of low self-worth, in particular, received even higher rates of supportive private messages from friends and acquaintances. Facebook's suicide prevention tools are simply an extension of the way users already reach out to help one another.
One of Facebook's new suicide prevention tools is a menu from which users can select the most appropriate form of assistance: flagging the post and sending it to Facebook's suicide prevention team, messaging the user directly, or messaging another friend to coordinate help. Facebook will even provide a suggested message template. The menu also offers suicide prevention help lines and similar resources. The same menu will be presented to the user whose post was flagged the next time they log on to Facebook.
"People really want to help, but often they just don't know what to say, what to do or how to help their friends," Vanessa Callison-Burch, a Facebook product manager, told The New York Times.
Even though this effort seems well-intentioned (lifesaving at best, overbearing but harmless at worst), critics worry that the new tools may give Facebook too much insight into its users' lives. Some of Facebook's most notable scandals have likewise involved meddling in users' online experience, such as ranking positive posts above negative ones in news feeds and allegedly imposing a political bias by suppressing conservative news in the "Trending" section.
The suicide help tools are finally being rolled out worldwide after a year of research conducted in collaboration with suicide prevention organizations including Forefront, NowMattersNow.org and Save.org. These and similar organizations have welcomed the fact that a company as influential as Facebook is taking concrete steps toward preventing suicide.