Online Safety Bill: Parents assured legislation won’t just target illegal content after amendment criticism

Parents have been assured that the Online Safety Bill will not just hold social media companies responsible for illegal content on their platforms, but also for any material which can "cause serious trauma" to children.

In an open letter, Culture Secretary Michelle Donelan sought to assuage fears that the long-delayed legislation had been watered down after finally returning to parliament.

The proposed law – which aims to regulate online content to help keep users safe, especially children, and to make companies responsible for the material – was amended over concerns about its impact on freedom of expression.

It was tweaked to remove social media sites’ responsibility to take down “legal but harmful” material, which had been criticised by free speech campaigners.

Instead, social media platforms will be made to provide tools to hide certain content – including content that does not meet the criminal threshold but could be harmful.

This includes eating disorder content, which a Sky News investigation found could be recommended through TikTok's suggested searches function even when users had not searched for explicitly harmful content.

Writing to parents, carers, and guardians, Ms Donelan said: “We have already seen too many innocent childhoods destroyed by this kind of content, and I am determined to put these vital protections for your children and loved ones into law as quickly as possible.”

What’s in the Online Safety Bill?

The letter outlines six measures the bill will take to crack down on social media platforms:

• Removing illegal content, including child sexual abuse and terrorist content

• Protecting children from harmful and inappropriate content, such as cyberbullying or the promotion of eating disorders

• Putting legal duties on companies to enforce their own age limits, which for most platforms is 13

• Making companies use age-checking measures to protect children from inappropriate content

• Making posts that encourage self-harm illegal

• Requiring companies to publish risk assessments on the potential dangers their sites pose to children

If companies are found to be falling short, Ms Donelan said they face fines of up to £1bn and may see their sites blocked in the UK.

The updated legislation comes as platforms fight back against a similar online child safety law in the US state of California, which would mandate that users’ ages are verified.

NetChoice, an industry group which counts Meta and TikTok among its members, is suing over what it said was an attempt to require "online service providers to act as roving internet censors at the state's behest".

What have critics said about the Online Safety Bill?

Ms Donelan's letter followed criticism of the amendment, including from the father of Molly Russell, the teenager a coroner ruled had died by self-harm while suffering the "negative effects of online content".

“I am worried – the removal of a whole clause is very difficult to see in other terms,” Ian Russell told Sky News.

“There are promises by the secretary of state that the bill has been strengthened in terms of the provisions to make children safe, but the bill has been changed in other ways.

“If children were to find a way round the dam and get into the adult section, I wonder what would happen.”

Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org. Alternatively, letters can be mailed to: Freepost SAMARITANS LETTERS.
