Each year, around 800,000 people worldwide die by suicide. Recognizing the scale of the problem, Facebook has decided to intervene by launching a global suicide prevention tool. Built with input from users and from suicide prevention organizations such as Forefront, Lifeline and Save.org, the tool will help concerned friends or family members report suicidal content to Facebook, according to a recent post published by Facebook Safety.

Among several other causes, there is evidence that the internet and social media, particularly Facebook, can contribute to depression in users. Research has found that people who use social media excessively are more likely to be lonely and depressed. Teenagers, particularly adolescent girls, are especially likely to spend hours on multiple social media networks. Scrolling through a Facebook feed and seeing friends hanging out, going on vacation or getting married can spark jealousy, deepen depression and even prompt self-injury. No one joins Facebook to feel lonely and sad, yet for many users that is exactly the effect. Against this backdrop, Facebook's new tool could do a great deal of good.

The company’s global head of safety, Antigone Davis, and Jennifer Guadagno, a psychologist and researcher at Facebook who works on fostering encouraging, supportive and compassionate interactions online, wrote that these tools were “developed in collaboration with mental health organizations and with input from people who have personal experience with self-injury and suicide.”

The safety feature had been in a trial phase with a select group of users in the US and UK, but Facebook has now launched it globally. The full release comes amid rising global suicide rates and growing incidents of cyberbullying on social media. Concerned friends or family members can reach out directly to someone expressing suicidal ideation or, alternatively, report worrying content to Facebook. The company has teams working around the world, 24/7, reviewing reports from friends and family. Facebook is committed to users’ safety and privacy and bans content that promotes hate speech, bullying and racism. The review teams give priority to the most serious posts, those they believe indicate self-harm or suicidal thoughts. The company has also worked with local partners to provide support in every language in which the site is available, ensuring its users are supported globally.

After a post with worrying content is flagged, its author receives a notification. They are then offered options: reaching out to a loved one, calling a helpline, or receiving safety tips and suicide prevention resources. Facebook will also suggest wording a concerned friend can use when reaching out to someone in distress.

Suicide is a taboo subject, often swept under the rug. Friends and family members are frequently unaware of a person’s suffering and wrongly assume it is a phase that will pass on its own. This initiative will help raise awareness of sensitive issues such as suicide and depression, and may even save lives. According to a survey published a few years ago, one-third of US consumers turn to Facebook and other social media for medical information and healthcare answers. In 2013, Facebook launched online ‘support communities’ to connect individuals battling the same medical conditions, and the company continues to explore healthcare applications for its users.

There is usually an underlying mental illness behind a suicide. Over 90% of people who die by suicide have a diagnosable mental health problem, and up to 15% of people with clinical depression eventually take their own lives. Though most people with depression do not die by suicide, the condition can significantly erode a person’s mental health and increase suicide risk. Other mental illnesses, such as bipolar disorder and schizophrenia, can also lead to suicide.

Facebook further aids concerned friends and family members by making suicide prevention and safety information available on its Safety Tools and Resources page, part of Facebook’s Help Center suite. The page covers everything from self-help tips and reporting suicidal content on Facebook to how to proceed after viewing content that might promote suicide and self-harm. It even provides safety tips for members of the lesbian, gay, bisexual, and transgender (LGBT) community and for law enforcement officers.

It is important to keep in mind that this initiative is not a replacement for established suicide prevention measures such as psychotherapy and professional advice. Facebook itself advises users to call emergency services or law enforcement when a suicide seems imminent. There is also a concern that some users could abuse the service by falsely reporting other users’ posts.