Exploring safe moderation of online suicide and self-harm content
Theme: Mental health
Workstream: Psychological interventions
Status: This project is ongoing
The aim of this project is to assess the impact of blocking online self-harm content. People who access content about self-harm online can find that it affects their mental health. Policymakers are therefore particularly concerned about how to manage online content relating to suicide and content that encourages users to self-harm.
We must be able to moderate information about these topics online safely and appropriately, so we can protect vulnerable users from its effects. However, anecdotal evidence suggests that moderation can have unintended consequences for blocked users: it can make them feel isolated and stigmatised. There has been little research into the best ways to moderate content and reach out to blocked users.
What we did
During our project we analysed data collected for the DELVE study. DELVE collected information from a group of individuals who accessed and created self-harm and/or suicide content. Our analysis focused on their experiences of moderation, including having their own content blocked and fulfilling moderator roles within online groups and communities.
What we found
The individuals who took part in the study reported that:
- It was difficult to report inappropriate content, particularly when their mental health was poor
- Inconsistent moderation and unclear communication left them feeling confused and frustrated when their own content was moderated
- They felt it was important to have moderators with personal experience of self-harm or attempted suicide, but performing this role put these individuals at risk
We have used this information to produce guidance for online industry leaders and policymakers on moderating online self-harm and suicide content.
We worked with Samaritans on this project. Through this partnership, Samaritans have gained deeper insights into users’ engagement with self-harm and suicide content online, which is informing their ongoing policy and safeguarding work.
What next?
In the next phase of the study, we are systematically examining how social networking sites manage mental health content and user wellbeing through content moderation and safety resources. This part of the study focuses on Instagram, TikTok, Tumblr, and Tellmi.
