Since last year’s general election, Sir Mel Stride has become a familiar face for those of us who like our politics.

During the campaign, he regularly found himself on breakfast TV and radio. So much so that some of our industry colleagues referred to Sir Mel as the “minister for the morning round”.

By our count, he was on Sky News Breakfast at least 10 times during the campaign’s 43 days.

Following the election and his loss in the Conservative leadership race to Kemi Badenoch, Sir Mel now puts questions to Rachel Reeves as shadow chancellor.

Still seen as a safe pair of hands, Sir Mel hasn’t slowed his penchant for the “morning round” either, and he continues to make regular appearances on breakfast TV and radio.

Luckily, he found some time between all that to sit down for an interview with Sky’s Beth Rigby for the Electoral Dysfunction podcast. He spoke about his transition to Opposition, taking on Reform, and the most controversial topic in Westminster – lunch.

Here’s what we learned:

1. Opposition isn’t ‘awful’ – but it is like ‘warfare’

Before the election, Sir Mel served as work and pensions secretary. He said shifting to the Opposition was not “awful”, despite losing the muscle of the civil service.

“But it is like guerrilla warfare,” he said.

“You suddenly lose all the trappings of government. Somebody once said to me, ‘when you get in the back of a car and you sit down and it doesn’t go anywhere, that’s when you realise you’re no longer a minister’.

“So it is that sort of sense of being looked after that disappears.”

There’s also a smaller team of Conservatives in the Commons. Before the election, Rishi Sunak had 343 MPs behind him.

Ms Badenoch currently has only 119.

“When you’re down to 120 MPs – and some set piece events, there might be only a fraction of those people there – it’s much quieter.

“What I actually often do is I can be quite provocative of the Opposition to get them going, because then at least you get something to feed off. Sometimes I do that just to get the energy in the chamber.”

2. Being at the despatch box on big days can be ‘tricky’ – but he has a ‘secret’

You may remember Sir Mel’s lively response to Rachel Reeves’s spring statement in March. He revealed that, on those big political days, he isn’t told what the chancellor will say until about half an hour before it’s said in the Commons.

“It does give you and your team literally 10 or 15 minutes to… work out what the main things are,” he said.

However, he tells Electoral Dysfunction that you do have to be able to think on your feet in that scenario.

He said: “You are thinking about ‘what are the attack lines I’m going to use?’… and amend what you’re going to do.”

He added that he doesn’t get nervous. That might have to do with Sir Mel having been president of the Oxford Union debating society “many, many years ago”.

“Now the secret’s out. The secret is out Beth, and you’re the first to have gleaned that secret from me,” he said.

To be fair, it is on his website.

3. He’s not a huge fan of Reform

Image: Nigel Farage

As the Conservatives battle with Reform for the right, Sir Mel didn’t have many positive words for Nigel Farage’s party.

“With Reform… these are populists, who peddle fantasy economics,” he said.

“Take everybody out of income tax up to £20,000 costs about £80bn according to the IFS [Institute for Fiscal Studies].”

The IFS has said it would need “more detail” to cost Reform’s proposal exactly, but that “it could easily be in the range of £50 to £80bn a year”.

“I think ultimately,” Sir Mel says, “people will see through a lot of the populism that Reform stands for.”

He added that he believed Reform’s 2024 manifesto was, economically, “a work of fiction”.

“I mean, it’s quite dangerous, actually. I think if they’d been elected… the economy would have gone into a very bad place,” he said.

4. His ideal lunch? A cheese and ham toastie

Image: Ms Badenoch and Sir Mel see eye-to-eye on many things – lunch isn’t one of them. Pic: PA

Sir Mel also addressed the most pressing issue of all – lunch.

If you’re unaware, this has proven a controversial subject in Westminster. Ms Badenoch told The Spectator in December she was “not a sandwich person… lunch is for wimps”.

Ms Reeves then told Electoral Dysfunction in March that she whips up a cheddar sandwich in 11 Downing Street when she can.

Sir Mel falls more in line with his opposite number than his leader.

“I’ve always liked a sandwich, particularly a toasted sandwich,” he said.

“I’d go with the Cadillac, the Rolls Royce of sandwiches, a ham and cheese.”

Sir Mel has previously, however, been partial to some more peculiar fillings.

“Do you remember those Breville toastie makers? When I went to university, I had one of those, or whatever the equivalent was,” he said.

“You could put baked beans in, eggs in, and all sorts of things.

“It was fantastic.”

To each their own.

Electoral Dysfunction unites political powerhouses Beth Rigby, Ruth Davidson, and Harriet Harman to cut through the spin, and explain to you what’s really going on in Westminster and beyond.

Want to leave a question for Beth, Ruth, and Harriet?

Email: electoraldysfunction@sky.uk

WhatsApp: 07934 200444

TikTok and Instagram accused of targeting teens with suicide and self-harm content

TikTok and Instagram have been accused of targeting teenagers with suicide and self-harm content – at a higher rate than two years ago.

The Molly Rose Foundation – set up by Ian Russell after his 14-year-old daughter took her own life after viewing harmful content on social media – commissioned analysis of hundreds of posts on the platforms, using accounts of a 15-year-old girl based in the UK.

The charity claimed videos recommended by algorithms on the For You pages continued to serve a “tsunami” of clips containing “suicide, self-harm and intense depression” to under-16s who have previously engaged with similar material.

One in 10 of the harmful posts had been liked at least a million times. The average number of likes was 226,000, the researchers said.

Mr Russell told Sky News the results were “horrifying” and showed online safety laws are not fit for purpose.

Image: Molly Russell died in 2017. Pic: Molly Rose Foundation

‘This is happening on PM’s watch’

He said: “It is staggering that eight years after Molly’s death, incredibly harmful suicide, self-harm, and depression content like she saw is still pervasive across social media.

“Ofcom’s recent child safety codes do not match the sheer scale of harm being suggested to vulnerable users and ultimately do little to prevent more deaths like Molly’s.

“The situation has got worse rather than better, despite the actions of governments and regulators and people like me. The report shows that if you strayed into the rabbit hole of harmful suicide self-injury content, it’s almost inescapable.

“For over a year, this entirely preventable harm has been happening on the prime minister’s watch and where Ofcom have been timid it is time for him to be strong and bring forward strengthened, life-saving legislation without delay.”

Image: Ian Russell says children are viewing ‘industrial levels’ of self-harm content

After Molly’s death in 2017, a coroner ruled she had been suffering from depression, and the material she had viewed online contributed to her death “in a more than minimal way”.

Researchers at Bright Data looked at 300 Instagram Reels and 242 TikToks to determine if they “promoted and glorified suicide and self-harm”, referenced ideation or methods, or “themes of intense hopelessness, misery, and despair”.

They were gathered between November 2024 and March 2025, before new children’s codes for tech companies under the Online Safety Act came into force in July.

Instagram

The Molly Rose Foundation claimed Instagram “continues to algorithmically recommend appallingly high volumes of harmful material”.

The researchers said 97% of the videos recommended on Instagram Reels for the account of a teenage girl, who had previously looked at this content, were judged to be harmful.

Some 44% actively referenced suicide and self-harm, they said. They also claimed harmful content was sent in emails containing recommended content for users.

A spokesperson for Meta, which owns Instagram, said: “We disagree with the assertions of this report and the limited methodology behind it.

“Tens of millions of teens are now in Instagram Teen Accounts, which offer built-in protections that limit who can contact them, the content they see, and the time they spend on Instagram.

“We continue to use automated technology to remove content encouraging suicide and self-injury, with 99% proactively actioned before being reported to us. We developed Teen Accounts to help protect teens online and continue to work tirelessly to do just that.”

TikTok

TikTok was accused of recommending “an almost uninterrupted supply of harmful material”, with 96% of the videos judged to be harmful, the report said.

Over half (55%) of the For You posts were found to be suicide and self-harm related, with a single search yielding posts promoting suicidal behaviours, dangerous stunts and challenges, it was claimed.

The number of problematic hashtags had increased since 2023, with many shared on highly followed accounts which compiled ‘playlists’ of harmful content, the report alleged.

A TikTok spokesperson said: “Teen accounts on TikTok have 50+ features and settings designed to help them safely express themselves, discover and learn, and parents can further customise 20+ content and privacy settings through Family Pairing.

“With over 99% of violative content proactively removed by TikTok, the findings don’t reflect the real experience of people on our platform which the report admits.”

According to TikTok, it does not allow content showing or promoting suicide and self-harm, and banned hashtags lead users to support helplines.

‘A brutal reality’

Both platforms allow young users to provide negative feedback on harmful content recommended to them. But the researchers found they can also provide positive feedback on this content and be sent it for the next 30 days.

Technology Secretary Peter Kyle said: “These figures show a brutal reality – for far too long, tech companies have stood by as the internet fed vile content to children, devastating young lives and even tearing some families to pieces.

“But companies can no longer pretend not to see. The Online Safety Act, which came into effect earlier this year, requires platforms to protect all users from illegal content and children from the most harmful content, like promoting or encouraging suicide and self-harm. 45 sites are already under investigation.”

An Ofcom spokesperson said: “Since this research was carried out, our new measures to protect children online have come into force.

“These will make a meaningful difference to children – helping to prevent exposure to the most harmful content, including suicide and self-harm material. And for the first time, services will be required by law to tame toxic algorithms.

“Tech firms that don’t comply with the protection measures set out in our codes can expect enforcement action.”

Image: Peter Kyle has said opponents of the Online Safety Act are on the side of predators. Pic: PA

‘A snapshot of rock bottom’

A separate report out today from the Children’s Commissioner found the proportion of children who have seen pornography online has risen in the past two years – also driven by algorithms.

Rachel de Souza described the content young people are seeing as “violent, extreme and degrading”, and often illegal, and said her office’s findings must be seen as a “snapshot of what rock bottom looks like”.

More than half (58%) of respondents to the survey said that, as children, they had seen pornography involving strangulation, while 44% reported seeing a depiction of rape – specifically someone who was asleep.

The survey of 1,020 people aged between 16 and 21 found that they were on average aged 13 when they first saw pornography. More than a quarter (27%) said they were 11, and some reported being six or younger.

Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273-TALK.
