Footage from the social media platform showed people jumping in front of trains, hanging from nooses and falling off buildings. Other clips featured graphic images of self-harm.
The videos, all “liked” by 14-year-old Molly, appear to “glamorise suicide,” Coroner Andrew Walker said.
The upsetting material was played at North London Coroner's Court yesterday after Mr Walker issued the "greatest of warnings."
He said: “The video content could be edited, but Molly had no such choice. My view is that the video footage should be played as it stands alone.
“Be warned, the footage glamourises suicide. It is of the most distressing nature. It is almost impossible to watch.
“I say this especially to members of Molly’s family, but in my view the video footage ought to be seen.”
Elizabeth Lagone, head of health and wellbeing at Instagram’s parent company Meta, defended the social media platform’s content policies – saying suicide and self-harm material could have been posted by a user as a “cry for help”.
Ms Lagone told the court it was an important consideration of the company, even in its policies at the time of Molly’s death, to “consider the broad and unbelievable harm that can be done by silencing (a poster’s) struggles”.
Instagram’s guidelines at the time, which were shown to the court, said users were allowed to post content about suicide and self-harm to “facilitate the coming together to support” other users, but not if it “encouraged or promoted” it.
Molly viewed a large volume of bleak and depressing material on self-harm and suicide before she took her own life.
Her father Ian accused social media companies of “helping” to kill his daughter and claimed harmful material is still available.
The court then saw more than a dozen clips with suicide, drugs, alcohol, depression and self-harm content that Molly had liked or saved on social media.
Mr Russell and his family stayed in the courtroom while the material was played. Mr Walker had told them: “There’s no need for any of you to stay.”
Molly, who had two older sisters, died at her home in Harrow, north London, in November 2017.
The court heard how she kept her inner torment hidden from her loving family and appealed for help to strangers and celebrities via a secret Twitter account.
They included an influencer who had attempted suicide whose words gave her some comfort, the court heard.
Oliver Sanders KC, representing the Russell family, asked Ms Lagone whether it was obvious it was not safe for children to see “graphic suicide imagery”.
She replied: “I don’t know… these are complicated issues.”
Experts had informed Meta it was not safe for children to view the material, said Mr Sanders. “Had they previously told you something different?” he asked.
Ms Lagone responded: “We have ongoing discussions with them but there are any number of issues we talk about with them.”
Molly set up an Instagram account in March 2015, when she was 12, and was recommended 34, “possibly more”, sad or depression-related accounts on Instagram, Mr Sanders said.
Of the accounts recommended, one referred to self-injury, one to concealment, four to suicidal feelings, one to “unable to carry on”, two to mortality and one to burial, he added.
Ms Lagone denied Instagram had treated children like Molly as “guinea pigs” when it launched content ranking – a new algorithmic system for personalising and sorting content – in 2016.
Mr Sanders said: “It’s right isn’t it that children, including children suffering from depression like Molly, who were on Instagram in 2016 were just guinea pigs in an experiment?”
She replied: “That is specifically not the way we develop policies and procedures at the company.”
The inquest continues.
- If you need help or support, call the Samaritans for free on 116 123 or visit samaritans.org.