Social media companies must face tough sanctions if they fail to keep children safe from harmful content, the technology secretary has said.
Speaking exclusively to Sky News, Peter Kyle said age verification for adult material would have to be “watertight”, and that apps that do not protect children will face heavy fines and even jail time for company bosses.
He was talking ahead of new requirements, to be announced by the regulator Ofcom in mid-January, for platforms to protect children from a wide range of harmful content including bullying, violence and dangerous stunts.
Apps for adults only will also be required to introduce tighter age verification, via a credit card or ID.
Mr Kyle said: “If they allow the children who are under the age that is appropriate, to view content, then they can face heavy fines and, in some circumstances, they’ll face prison sentences.
“This is the kind of direction of travel you’re going to have with me because I want to make sure kids are kept safe. These are not rules and powers that I’m bringing in just to sit on a shelf.
“These are powers that we’re bringing in for a purpose. At the moment, I accept that parents don’t believe that their kids are safe online because too often they’re not.”
‘Not enough research’
Mr Kyle said he was “in admiration of what these companies have created” and that lots of organisations, including the government, could learn from the tech sector.
But he added: “I do have a real deep frustration and yes, that could be called anger when it comes to the fact that not enough research has been produced about the impact their products have.
“If I was producing a product that was going to be used ubiquitously throughout society that I knew that children as young as five are going to be accessing it, I would want to be pretty certain that it’s not having a negative impact on young people.”
The Online Safety Act was passed in October 2023 and is being implemented in stages. It allows companies to be fined up to £18m or 10% of global turnover, whichever is greater, and exposes senior managers to criminal charges.
In December, the regulator Ofcom set out which content is illegal – including sexual exploitation, fraud and drug and weapons offences.
Asked whether he would consider a social media ban for under-16s, Mr Kyle said he has no plans for one at this stage. He was speaking as he met a group of teenagers from across the country at the NSPCC children's charity to talk about their experiences online.
Some mentioned the “addictiveness” of social media, and coming across “distressing” content. But all were against a ban, highlighting the positives for learning, and of online communities.
The UK chief medical officers reviewed the evidence on harm to children from “screen-based activities” – including social media and gaming – in 2019.
Their report found associations with anxiety and depression, but not enough evidence to prove a causal link. It backed a minimum age of 13 for using these apps.
But the technology secretary has commissioned more research to look at the issue again by next summer, as countries including France and Norway have raised the minimum age to 14 or 15.
Children ‘getting dopamine hits’
Ofcom research last year found nearly a quarter of five-to-seven-year-olds have their own smartphone, with two in five using messaging services such as WhatsApp despite it having a minimum age of 13.
By the time they are 11, more than 90% of children have a smartphone.
Lee Fernandes, a psychotherapist specialising in addiction, told Sky News at his London clinic that he has been increasingly treating screen addiction in young adults, some of whose problems began in their teenage years.
“In the last five years, I’ve seen a big increase in addictions relating to technology,” he said.
“I think everyone just thinks it’s mindless scrolling, but we’re habituating children’s minds to be stimulated from using these phones and they’re getting these hits of dopamine, these rewards.”
Social media companies privately say teenagers use over 50 apps a week and argue that app stores should develop a “one-stop shop” rather than ID checks for each individual app.
Some platforms already require teenagers to prove their age through a video selfie or ID check if they attempt to change their age to over-18.
There are also AI models being developed to detect under-18s pretending to be adults. Specific teen accounts by providers including Meta restrict certain messages and content.
The man who killed MP Sir David Amess was released from the Prevent anti-terror programme “too quickly”, a review has found.
Sir David was stabbed to death by Islamic State (ISIS) supporter Ali Harbi Ali during a constituency surgery at a church hall in Leigh-on-Sea in October 2021.
The killer, who was given a whole-life sentence, had become radicalised by ISIS propaganda and had been referred to the anti-terror programme Prevent before the attack, but his case had been closed five years before.
Despite Prevent policy and guidance at the time being “mostly followed”, his case was “exited too quickly”, security minister Dan Jarvis told the House of Commons on Wednesday.
Following the publication of a review into Prevent’s handling of Southport child killer Axel Rudakubana earlier this month, Mr Jarvis said a Prevent learning review into Sir David’s killing would be released this week in a commitment to transparency over the anti-terror programme.
Matt Jukes, head of counter-terrorism policing, said it is clear the management and handling of Ali's case by Prevent "should have been better" and it is "critical" the review is acted on "so that other families are spared the pain felt by the loved ones of Sir David".
Image: Ali Harbi Ali was referred to Prevent twice before he stabbed Sir David to death. Pic: Met Police
The review found:
• Ali was referred to Prevent in 2014 by his school after teachers said his demeanour, appearance and behaviour changed from a previously “engaging student with a bright future” with aspirations to be a doctor to failing his A-levels and wanting to move to a “more Islamic state because he could no longer live among unbelievers”
• Prevent quickly took his case on and he was referred to Channel, part of the programme that aims to prevent involvement in extremism
• He was “exited from Prevent too quickly”, Mr Jarvis said, just five months later “after his terrorism risk was assessed as low”
• A review by police 12 months after he was released from Prevent “also found no terrorism concerns” and the case was closed. This was not uploaded for eight more months due to an “IT issue”
• People released from Prevent are meant to have a review at six and 12 months
• The assessment of Ali’s vulnerabilities “was problematic and outdated” as it did not follow the proper procedure, which led to “questionable decision-making and sub-optimal handling of the case”
• Ali’s symptoms were prioritised over addressing the underlying causes of his vulnerabilities – and support provided did not tackle those issues
• Record keeping of decisions, actions and rationale was “problematic, disjointed and lacked clarity”
• The rationale for certain decisions was “not explicit”
• Ali’s school was not involved in discussions to help determine risk and appropriate support – they were only called once to be told the “matter was being dealt with”
• A miscommunication led to only one intervention session being provided, instead of two.
The review found most of the failures in Ali’s case would not be repeated today as the guidance and requirements are much clearer.
It said referrers – in Ali's case his school – are now kept informed and engaged; different departments and agencies, not just police, have clear roles; the records that must be kept are clearly defined; and guidance for detecting underlying vulnerabilities has changed in a way that would have made a difference.
The review said a Prevent "intervention provider" met Ali at a McDonald's to deal with his understanding of "haram" (forbidden under Islamic law). No risk assessment was made, and although the provider suggested one more meeting, a breakdown in communication between police and the provider meant no further meetings took place.
Training for providers is “substantially different” now and the review says this would not be repeated today, with the provider in question saying the process is “a completely different one today”.
However, the review said there are still problems – not just in Ali’s case – with the Vulnerability Assessment Form, an “incredibly complex document that is vital to Channel” and the progression of a case.
It also found a decision by the College of Policing to hold Prevent case data for only five years "may prove to be problematic", and that if Ali's case material had been deleted under that ruling "it would have been nigh on impossible to conduct this review".
Sir David’s daughter, Katie Amess, 39, last week welcomed the announcement to publish a review into Ali’s case but said every victim failed by Prevent deserves an inquiry, not just the Southport victims.
"We potentially wouldn't be in the same situation today with repeat failings of Prevent had somebody just listened to me back when it [her father's killing] happened and launched a full public inquiry," she told LBC.
Ms Amess said she believes if the Southport attack had not happened, the review into Prevent’s handling of her father’s killer would never have been released into the public domain.