

Davine Lee still has the birthday present she bought for her friend Molly five years ago: an unwrapped box containing a hoodie featuring her favourite TV show, sitting untouched at home.

She learned of her friend’s death, at the age of 14, after seeing her empty chair at school and wondering where she might be.

When she heard the news, nothing made sense.

Warning: Some readers may find the content in this story distressing.

Molly Russell's family have campaigned for better internet safety since her death in 2017.

Molly Russell, a seemingly happy teenager from Harrow, northwest London, was found dead in her bedroom in November 2017, just a day after rehearsing with Davine for a show she had been picked to play a lead role in.

It later emerged she had viewed masses of content related to suicide, depression and anxiety online.

In a landmark conclusion at an inquest in September, a coroner ruled she died not from suicide but from “an act of self-harm while suffering from depression and the negative effects of online content”.


Dealing with the death of a friend in this way, especially at such a young age, is a particularly complex form of grief to process.

“To have to lose a friend at that age, it’s scarring,” Davine quietly explains. “Losing Molly… it’s something we won’t ever be able to forget or entirely move on from…

“I’ve still got her birthday present from 2017 – it would have been her 15th birthday, and of course she never made it to that birthday. That present still sits in my room, I’m just really not sure what to do with it. I obviously can’t give it to her but it feels in some way like I can still hold on to her through that.”

With permission from the Russell family, Davine is giving her first media interview, speaking exclusively to Sky News. Now 20 and at university, she says she was moved to speak publicly to highlight the importance of bringing the Online Safety Bill back before parliament. Reliving that time, she hopes, might prompt anyone who is struggling to realise how much they would be missed.

“It was shocking to see that it was that bad,” says Davine, referring to the graphic material that was shown at Molly’s inquest. “I want people to know that what happened to Molly isn’t an isolated event and the content that she was being pushed, it still exists.”

‘Poor mental health can hide almost in plain sight’

Davine, now 20, wants social media companies to be held responsible for harmful content

Molly and Davine had been friends since they started secondary school together and shared a love of singing and musicals. They starred together in school productions of Les Miserables and Beauty And The Beast.

“[Molly] had just been given like one of the lead roles for the show we were doing that year… she was still doing the things she loved… either depression or poor mental health can hide almost in plain sight in that sense,” says Davine.

Recalling the horrific day that she and her school friends were told what had happened, she remembers the teachers ushering them all into a room. Then came the news that Molly had died.

“My first thought was like, ‘no’. It was like an instant sense of doubt, like, ‘no, Molly wouldn’t’. It just didn’t even make sense.”

Davine says she was told the news with other students. They were all in tears. “And that’s a sound I can’t forget, the sound of that many children just in such emotion.

“To attend a funeral at that age for someone who is a friend… we were just trying to get through each day.”

The coroner’s ruling: How content ‘romanticised’ self-harm

Molly’s family would later learn that, alone in her room, she had been fed a stream of disturbing content by social media algorithms.

The coroner at her inquest ruled the content she had viewed “romanticised” self-harm, “normalised” her depression, and that some content “discouraged” the teenager from seeking “help” – ultimately contributing to her death.

Davine wants to highlight that Molly was not an isolated case, and that young people being drawn into looking at dark content on social media is a huge and damaging issue.

On Instagram, many of the hashtags Molly searched for have now been blocked. However, Sky News’ data and forensics unit found that, while these blocks have been made and some content removed, autofill suggestions or misspellings can still lead users to some of the content Molly viewed; that material was shown at her inquest but is too distressing to publish here.


A spokesperson for Meta, which owns Instagram, responded to Sky News to say the company is committed to protecting young people.

“We’ve already been working on many of the recommendations outlined in [the coroner’s] report, including new parental supervision tools that let parents see who their teens follow and limit their time on Instagram,” the spokesperson said.

“We also automatically set teens’ accounts to private when they join, nudge them towards different content if they’ve been scrolling on the same topic for some time and have controls designed to limit the types of content teens see.

“We don’t allow content that promotes suicide or self-harm, and we find 98% of the content we take action on before it’s reported to us. We’ll continue working hard, in collaboration with experts, teens and parents, so we can keep improving.”

‘I felt so unwell I couldn’t work’: Former social media moderator speaks anonymously


Sky News has spoken anonymously to a former social media moderator who described managing harmful content on social media platforms as “an impossible task”.

She worked for one of the world’s leading social media companies for a year during the pandemic; her job was to carry out a secondary viewing of content that had been flagged as potentially problematic, including posts that could be “extremely violent, homophobic”, or even show paedophilia.

“But there was probably one video in particular that affected me the most,” she said. This was footage of someone taking their own life.

She said she would watch and tag at least 1,000 videos a day. “It more just upset you about the world – seeing so much, sorry to say it, s**t, really. Things that people would do to themselves or others, it gives you a lack of belief in the world, really.”

After a year, she says that mentally and physically she could not carry on. “I felt so unwell that I literally couldn’t work, and I had to call my GP to advise me not to do it anymore.

“The effect it had on me the most was the sleep. I couldn’t sleep because I was so stressed. I was dreaming about some of the videos I could have incorrectly tagged.

“I won’t go into personally exactly what happened, but it wasn’t far off from Molly [Russell]. I can recognise feelings in how I felt seeing all of this content coming at me.”

The enormity of the task of policing all the content was just too much, she says. “The system there was just chaos… no one really knew what was happening.”

“We’re just a lot of young [people], like a lot of [people who have] just finished their degree… sat there trying to figure out how to judge all this content with no legal background.”

The rise of potentially harmful online content


Research by mental health charity Young Minds shared exclusively with Sky News suggests disturbing content is a growing problem.

It found that more than a fifth (22%) of young people are automatically shown distressing content by social media platforms, based on their previous online activity, at least once a week.

Nearly all young people (89%) who have had mental health problems said social media helps drive harmful behaviours, and more than half (52%) of that group said they had sought out content which they knew might make them distressed or uncomfortable.

The government has been accused of dragging its feet when it comes to introducing legislation to regulate social media firms. But now, after years of delay, the Online Safety Bill is back before parliament next week, proposing fines for tech companies of up to 10% of their global turnover if they fail to protect users from harmful content, and criminalising posts that encourage self-harm.

But critics such as Baroness Claire Fox want the bill to be scrapped.

“The danger is that we – on the back of a very emotional response to something like the tragedy of Molly Russell – bring in a piece of legislation that doesn’t just protect children but actually infantilises adults and treats them like children,” she told Sky News. “And if you’re a free speech campaigner, as I am, this bill is a major, serious censorship tool.”

To those campaigning for better protections against potentially dangerous social media algorithms, Molly’s case embodies the horrific consequences of doing nothing.

The long-term impact and the ‘crisis’ in children’s mental health

Molly Russell. Pic: Russell family

Olly Parker, from Young Minds, says: “I’m kind of a researcher in this field, but I’m also a father as well and it absolutely terrifies me.

“I don’t think we’re really going to see what the long-term impacts of this are maybe until 10, 15 years down the line. But one thing we are seeing is a real crisis in children and young people’s mental health. So every month right now we see record numbers of young people being referred to their GPs and doctors for more mental health support.”

When the Online Safety Bill returns to parliament, Molly Russell’s friends and family hope it will be the first step towards holding big tech companies responsible for the content on their platforms.


“It’s big news that they now want to criminalise harmful content and anyone responsible for that but at the same time it does feel like it’s been an awfully long journey,” says Davine. “But I think it’s good to appreciate that we’re here now.”

But while it is something to place hope in, it can never bring back Molly.

“She was so loved by all of us,” Davine says. “I think she genuinely believed we would be better off without her… I think if she saw how much pain we were going through, I don’t think she would have made that choice.”

Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org. Alternatively, letters can be mailed to: Freepost SAMARITANS LETTERS


Matty Healy reacts to Taylor Swift’s ‘diss track’


Matty Healy has reacted to new tracks by supposed ex-girlfriend Taylor Swift that are rumoured to be about him.

The 1975 frontman is never named in any tracks featuring on Swift’s new album, The Tortured Poets Department, but fans have assumed several references are about him.

Many have interpreted the lyrics of the album’s first song, Fortnight, as being about him. She sings: “And I love you, it’s ruining my life, I touched you for only a fortnight.”

It’s widely assumed he’s also the subject of the track Guilty As Sin, where she sings about having “fatal fantasies” about someone from her past while in a relationship.

Fans are also suggesting the song The Smallest Man Who Ever Lived appears to allude to Healy “ghosting” her.

“You tried to buy some pills, from a friend of mine, they just ghosted you, now you know what it feels like,” she sings.

In a video circulating online, Healy was approached by a photographer in Los Angeles and asked how he rates his “Taylor diss track” and how he thought it compared to the other songs on the 31-track double album.

Healy, looking confused, responded: “My diss track?”

The photographer reiterated: “Yeah, Taylor’s new song?”

“Oh!” Healy laughed, adding: “I haven’t really listened to that much of it, but I’m sure it’s good.”


Last May, Healy made a surprise appearance during the Nashville performance of Swift’s Eras tour to play with her support act, indie singer-songwriter Phoebe Bridgers.

Swift also sang two of The 1975’s songs at the band’s London gig in February 2023.

By June last year, reports surfaced that the pair were “no longer romantically involved”, with a source telling US outlet People the relationship was “always casual”.

“She had fun with him, but it was always casual,” the source said.


Drake ordered to delete diss track featuring AI-generated voice of Tupac Shakur


Tupac Shakur’s estate has threatened to sue Drake and ordered him to delete a track featuring an AI-generated copy of the late rapper’s voice.

Drake released the song Taylor Made Freestyle – a diss track aimed at Kendrick Lamar – on his Instagram page on Friday, which features verses created by AI software mimicking both Shakur and Snoop Dogg.

In a cease-and-desist letter seen by Sky News’ US partner NBC News, Howard King, an attorney who represents Shakur’s estate, requested that Drake remove the track from all platforms where it is publicly available.

The letter sent on Wednesday states the Canadian rapper has until midday on Thursday to confirm he will remove it or the estate will “pursue all of its legal remedies” against him.

“Not only is the record a flagrant violation of Tupac’s publicity and the estate’s legal rights, it is also a blatant abuse of the legacy of one of the greatest hip-hop artists of all time,” Mr King wrote.

“The estate would never have given its approval for this use.”

The letter also outlines the estate’s “dismay” regarding the topic of the track, saying Lamar is “a good friend to the estate who has given nothing but respect to Tupac and his legacy publicly and privately” and that this “compounds the insult”.

In the track, the AI-generated voice of Shakur urges Lamar to respond to Drake’s previous diss track about him released several days prior, saying lines like: “Kendrick, we need ya, the West Coast saviour / You seem a little nervous about all the publicity / You asked for the smoke, now it seem you too busy for the smoke.”

Tupac was killed in 1996. Pic: Walik Goshorn/MediaPunch/IPx/AP

The letter claims the track and its popularity have created the “false impression that the estate and Tupac promote or endorse the lyrics for the sound-alike”.

Shakur’s estate is also seeking damages including all profits from the record, which has so far only been posted on Drake’s Instagram page, as well as additional damages for substantial economic and reputational harm caused.


The letter claimed Drake’s non-consensual use of Shakur’s likeness violates Shakur’s right to publicity, an intellectual property right protecting against the misappropriation of somebody’s name or image.

Sky News has contacted representatives of Drake for comment.

The AI-generated voice of prominent rapper Snoop Dogg was also used on the track.

Snoop Dogg posted a video on his Instagram story shortly after the diss track was posted, where he said: “They did what? When? How… What’s going on… I’m going back to bed.”

The use of AI in the music industry has been the subject of heavy debate since last year, when Drake’s own voice was cloned alongside The Weeknd’s by the artist known as Ghostwriter.

The track was taken down from all platforms shortly after it was released in April.


Baby Reindeer: Writer Richard Gadd tells fans to stop speculating about characters


Richard Gadd has urged fans of his hit show Baby Reindeer to stop speculating about who the characters in his show are based on in real life.

The Netflix series is based on the real-life story of its writer Gadd, who also plays the lead character, and his warped relationship with a female stalker.

Fans have been speculating online about the real-life identity of the stalker, played by Jessica Gunning, as well as who another character, seen sexually assaulting Gadd in the series, is based on.

The character, played by Tom Goodman-Hill, is a TV writer who repeatedly sexually assaults Gadd’s character and supplies him with drugs.

Gadd addressed his fans on his Instagram story on Tuesday, saying: “People I love, have worked with, and admire… are unfairly getting caught up in speculation.

“Please don’t speculate on who any of the real-life people could be. That’s not the point of our show. Lots of love, Richard.”



The show is based on the hit Edinburgh Fringe one-man stage play Gadd performed in 2019.

Gadd, who plays Donny Dunn, a character based on himself, said he didn’t expect the show to “blow up” in the way it has since its release on 11 April.

“I’m super proud of it. I really believed in this show, but the fact it’s gone so stratospheric so quickly, for such a cult, quite niche story… it’s kind of amazing. It’s clearly struck a chord,” he said on This Morning.

The writer, actor and comedian is also an ambassador for We Are Survivors, a charity which supports male survivors of sexual abuse.
