Molly Russell was a typical teenage girl. She liked Harry Potter and horse-riding. She was juggling homework, starring in an upcoming school play, and keeping up friendships, all with the support of her loving family in northwest London.
But in November 2017, Molly took her own life at the age of 14.
An inquest revealed that Molly had engaged with a huge number of posts relating to depression, self-harm or suicide in the months before her death.
The coroner concluded that Molly died while suffering from depression and “the negative effects of online content”, and that viewing material on social media “contributed to her death in a more than minimal way”.
Sky News has found that at least one post glorifying suicide, identical to content Molly saved before her death, remained on Instagram this week.
The post was found by searching a term related to a method of suicide – a term Instagram promotes through its suggested searches feature, available to all users over the age of 13.
Warning: Readers may find this story distressing.
A digital trail
The inquest examined Molly’s social media activity in the six months prior to her death.
Sky News has chosen not to show the posts Molly engaged with, given some of their harmful content.
Among the 2,100 images related to depression or suicide that Molly saved or liked on Instagram, the most benign show images, phrases and poetry about feeling sad and depressed.
The most disconcerting show graphic images of self-harm or glorify suicide.
Many of the posts refer to worries around a lack of confidence, body image, and failing to meet family expectations – anxieties likely to particularly resonate with teenagers.
They reveal a picture of a young woman struggling with severe depression, suffering in silence while appearing outwardly happy.
They raise a crucial question: whether Molly’s online activity was a reflection of her state of mind, or if the content she was viewing and the algorithms that promoted it were more directly responsible for her distress.
The exact timeline of when and how Molly began engaging with this material is unknown.
Only six months of data from before her death in 2017 was available from Instagram, as older information is no longer held on its servers.
Molly appears to have engaged with suicide-related posts throughout this period. Instagram could also only provide information on the posts she interacted with, not everything she viewed or searched for, meaning she likely came across far more material than the inquest revealed.
Instagram was not the only site through which Molly accessed harmful content. Pinterest, another image sharing social platform, sent emails to Molly highlighting posts under the topic of “depression” and “sad depression quotes”.
It was promoting the type of content she had been viewing on her account – an example of how the algorithms social media companies use to drive engagement risk pushing ever more extreme content onto users.
A Pinterest executive gave evidence to the inquest and admitted that at the time Molly was using the service, it was “not safe”.
Molly also set up a Twitter account, separate from the one her family knew about, which she used to follow celebrities who had spoken out about their struggles with depression. Tragically, it was through this anonymous account that Molly made some of her few public admissions of her own difficulties.
She told JK Rowling, who with almost 14 million followers receives large numbers of mentions: “My mind has been full of suicidal thoughts for a while but reading Harry Potter and the world you created is my escape.”
The debate over freedom of expression
It was suggested during the inquest that some online content related to depression, self-harm, or suicide could have some positive effects.
A representative for Meta, Instagram’s parent company, told the inquest that online spaces touching upon this area may allow those suffering to express themselves and build a community of people experiencing similar struggles.
It is possible Molly found some comfort in following celebrities on Twitter who had been open about their own difficulties and had overcome them.
But Molly’s father told the inquest he believes the content his daughter viewed online, in general, “normalised” the issue of suicide. He felt its unrelenting bleakness would likely worsen the mental health of anyone looking at it.
The differing views reflect a genuine debate over how much freedom people should have to post about their own troubles, and those of others, online – weighed against the risk that such activity could encourage some to harm themselves.
But separate from that debate, the details of Molly’s online activity reveal she was able to engage with harmful posts on Instagram and Pinterest even though they violated the companies’ own policies.
The debate around what is considered harmful becomes redundant if content that violates social media companies’ rules cannot be accurately identified and removed.
This was a worry raised by Frances Haugen, a former Meta employee, in her 2021 evidence to a committee of MPs considering the draft Online Safety Bill, which is still proceeding through parliament.
She told the committee that Facebook, another Meta company, was only able to identify 3-5% of misinformation and that Instagram was the most dangerous social media platform due to its focus on body image and social comparison.
“Facebook’s own reports say that it is not just that Instagram is dangerous for teenagers; it is actually more dangerous than other forms of social media,” she warned.
‘Remember who Molly really was’
Some progress has been made in improving automated systems that pick up dangerous content.
Elizabeth Lagone, Meta’s representative at the inquiry, said online harm was an “evolving” area. Instagram does, for example, point users towards a help page if they search some phrases relating to emotional distress. Some other search terms are blocked completely.
However, Sky News found that one search term relating to suicide, despite being blocked by Instagram, could be reached simply by typing in part of the term and selecting it from the recommended search list that appears.
Worryingly, it means people searching grammatically similar phrases, with no connection to suicide, could be directed towards harmful content.
Through this search, a poem Molly saved to her account shortly before her death – one which glorifies suicide – appeared in the results.
Instagram took down the post and removed the recommended search term after being alerted by Sky News. It is an example of the harmful content that still exists on social media, and of the dark corner of the internet Molly inhabited before her death.
Because of this, Molly’s family made clear at the inquest that the digital trail she left behind should not be mistaken for who she really was.
“We, her family, think it is essential to remember who Molly really was, so we can each hold a picture in our minds of a caring individual, full of love and bubbling with excitement for what should have lay ahead in her life.”
Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org.