Facebook's Suicide Prevention Tools: Invasive or Essential? The social network now allows all users to flag friends' posts as potentially suicidal and solicit Facebook's help or intervention.

By Lydia Belanger

Opinions expressed by Entrepreneur contributors are their own.

If you have ever scrolled through your News Feed and stopped on a troubling, borderline suicidal post from a friend, you may have been unsure how to help or wondered if reaching out would be appropriate. Facebook understands that people share these types of negative personal thoughts on the platform and has developed tools to help you help your friends.

Facebook now offers resources for users who perceive a friend's posts as suicidal, allowing them to flag a post for review by a team at the company. Clicking a drop-down menu on the post in question lets users specify their concerns to Facebook's global community operations team, whose employees are trained to evaluate suicidal content. The team may then send the reporting user information about suicide prevention and advice for communicating with the friend. In some cases, Facebook may intervene by contacting local law enforcement where the friend resides, according to The New York Times.

Related: Facebook Updates Its Suicide Prevention Tools

Previously, suicide prevention assistance was limited to some English-speaking Facebook users, but now it is available to everyone.

Among the tools is a page containing a form to report sightings of suicidal content to the team, along with advice for assisting friends who may be considering self-injury or who may have an eating disorder, as well as guidance specific to members of the military, LGBT individuals and law enforcement officers whose posts indicate they may be contemplating suicide. It also offers direct support to at-risk users seeking help for themselves. All of the tools warn users that if a post explicitly states suicidal intent, they should take immediate action by calling law enforcement or a suicide hotline, and they provide the relevant contact information.

Facebook relies on humans on both sides of the process: users report posts, and team members review them. None of the content is detected or evaluated by artificial intelligence or algorithms.

Related: Can We Turn to Our Smartphones During Mental Health Crises?

We asked Entrepreneur's Facebook and Twitter followers whether Facebook should allow users to solicit its employees' help in preventing suicide, or whether the company should refrain from intervening in people's personal lives. Many who responded embraced Facebook's efforts, while others thought the responsibility should rest solely with the users who spot such posts. Some considered the implications for the company's image, and some asked how reporting someone would affect how Facebook targets that user in the future. Read some of their comments below.

Lydia Belanger is a former associate editor at Entrepreneur. Follow her on Twitter: @LydiaBelanger.
