“As a global online community, keeping people safe on our apps is incredibly important to us. Since 2006, we’ve worked with experts from around the world to inform our policies, practices and products supporting those at risk of suicide or self-injury,” said Antigone Davis, Facebook’s Global Head of Safety.

On World Suicide Prevention Day, Facebook shared an update on what it has learned, the steps it has taken over the past year, and the additional actions it plans to take to keep people safe on its apps, especially those who are most vulnerable. The company’s research and expert consultations focused on three areas: how to handle suicide notes, the risks of sad content online, and newsworthy depictions of suicide. Further details of these consultations are available on Facebook’s new Suicide Prevention page in the Safety Center.
How the New Suicide Prevention Measures Will Work
- Tightening the policy around self-harm to no longer allow graphic cutting images, which can unintentionally promote or trigger self-harm even when someone is seeking support or expressing themselves as part of recovery.
- Making this type of content harder to search for on Instagram and keeping it from being recommended in Explore.
- Addressing the complex issue of eating disorder content by tightening the policy to prohibit additional content that may promote eating disorders.
- Displaying a sensitivity screen over images of healed self-harm cuts to help avoid unintentionally promoting self-harm.
- Using CrowdTangle to collect and analyse public data, exploring how information shared on Facebook and Instagram can be used to advance suicide prevention and support (a hypothetical sketch of such a query follows this list).
- Sharing public content that raises awareness of these issues.
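For readers curious what research access to this kind of public data might look like in practice, here is a minimal sketch of querying CrowdTangle’s public-post API. The endpoint, parameter names, search term, and token placeholder are illustrative assumptions for this example, not details from Facebook’s announcement, and search access on CrowdTangle was limited to approved researchers.

```python
import requests

# Hypothetical example of pulling public-post data via the CrowdTangle
# API for research purposes. API_TOKEN, the search term, and the date
# range are placeholders; a real token is issued per CrowdTangle account.
API_TOKEN = "YOUR_CROWDTANGLE_TOKEN"

resp = requests.get(
    "https://api.crowdtangle.com/posts/search",
    params={
        "token": API_TOKEN,
        "searchTerm": "suicide prevention",  # example research query
        "startDate": "2019-04-01",
        "endDate": "2019-06-30",
        "count": 100,  # posts per page
    },
)
resp.raise_for_status()

# Matching public posts are nested under result -> posts in the JSON.
for post in resp.json()["result"]["posts"]:
    print(post.get("date"), post.get("postUrl"))
```

A researcher could page through results like these to study, in aggregate, how publicly shared posts about suicide and self-injury spread, which is the kind of analysis the announcement describes.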
Alongside these measures, resources will be sent to people who post content promoting eating disorders or self-harm, even when that content is taken down. “From April to June of 2019, we took action on more than 1.5 million pieces of suicide and self-injury content on Facebook and found more than 95% of it before it was reported by a user. During that same time period, we took action on more than 800 thousand pieces of this content on Instagram and found more than 77% of it before it was reported by a user,” said Antigone Davis, Global Head of Safety.

Facebook also plans to hire health and well-being experts to help shape its policies and to build better ways for people to reach out for help. This means Facebook is now hiring! The company recognises, too, that people at risk of suicide should not be left alone with their thoughts. To that end, it has added Orygen’s #chatsafe guidelines to Facebook’s Safety Center and to the resources shown on Instagram when someone searches for suicide or self-injury content, so that people can discuss topics like suicide safely. It is indeed heartening that such a vast platform recognises the importance of mental health and is ambitious about spreading the message that caring for mental illness is as vital as caring for any physical illness.