A member of rap trio Kneecap was greeted by hundreds of supporters as he arrived at court this morning, charged with supporting a proscribed terrorist organisation.
Liam Og O hAnnaidh, who performs under the name Mo Chara, is accused of displaying a flag in support of Hezbollah at a gig in London in November last year.
Demonstrators waving flags and holding banners in support of the rapper greeted him with cheers as he made his way into Westminster Magistrates’ Court.
Image: The rapper is mobbed by fans and media as he arrives at court. Pics: PA
Supported by his Kneecap bandmates Naoise O Caireallain and JJ O Dochartaigh, the rapper took more than a minute to enter the building as security officers worked to usher him through a crowd of photographers.
Fans held signs which read “Free Mo Chara”, while others waved Irish and Palestinian flags.
As the hearing got under way, O hAnnaidh confirmed his name, date of birth and address, and the court heard that an Irish language interpreter would be present.
During a previous hearing, prosecutors said the 27-year-old is “well within his rights” to voice his opinions on the Israel-Palestine conflict, but said the alleged incident at the O2 Forum in Kentish Town was a “wholly different thing”.
O hAnnaidh is yet to enter a plea to the charge.
Image: Bandmates Naoise O Caireallain (pictured, centre) and JJ O Dochartaigh are supporting O hAnnaidh. Pic: Reuters
Who are Kneecap?
Kneecap put out their first single in 2017 and rose to wider prominence in 2024 after the release of their debut album and an eponymously titled film – a fictionalised retelling of how the band came together and their fight to save the Irish language.
The film, in which the trio play themselves alongside Oscar nominee Michael Fassbender, won director and writer Rich Peppiatt the BAFTA for outstanding debut earlier this year.
A mother-of-two who died after being hit by a falling tree branch on the way home from a family outing would do “everything she could for anyone”, her husband has said.
Madia Kauser, 32, was walking with her family in Witton Park in Blackburn, Lancashire, on 11 August when the incident happened.
She is reported to have pushed her young daughter to safety.
A joint investigation is being carried out by Lancashire Police and the Health and Safety Executive and any witnesses are being asked to come forward.
In a tribute issued by police, her husband Wasim Khan described her as the “most beautiful woman in the world” and said he feels “completely lost without her”.
He said: “My wife, a mother-of-two, a daughter, sister and a friend we lost to a tragic event that came on the way home from a family day out in the park.
“She was the most beautiful woman in the world, she did everything for our two children, she did everything she could for anyone and would bring smiles whenever she entered the room.
“She was my comfort, my partner in life and the love of my life.
“We have so many great memories, went through pain together and started a family together.
“Honestly, I feel completely lost without her and I do not know how to put into words how much I miss her face, her character and her presence. My one and only.”
Detective Inspector Iain Czapowski said: “This is an absolutely tragic incident which has cost a young woman her life and my thoughts are with her loved ones.
“We are working closely with our colleagues from the Health and Safety Executive and with the co-operation of the council to try and establish the full circumstances of what happened, and I would like to speak to anyone with information which could assist with that.
“I am especially keen to speak to anyone who actually saw what happened on that fateful night and I would urge them to contact us.”
A council has won its bid to temporarily block asylum seekers from being housed at a hotel in Essex.
Epping Forest District Council sought an interim injunction to stop migrants from being accommodated at the Bell Hotel in Epping, which is owned by Somani Hotels Limited.
A government attempt to delay the application was rejected by the High Court judge earlier on Tuesday.
The interim injunction now means the hotel has to be cleared of its occupants within 14 days.
Somani Hotels said it intended to appeal the decision.
Several protests have been held outside the hotel in recent weeks after an asylum seeker housed there was charged with sexually assaulting a 14-year-old girl.
Hadush Gerberslasie Kebatu, 38, was charged with trying to kiss a teenage girl and denies the allegations. He is due to stand trial later this month.
Image: Police officers ahead of a demonstration outside The Bell Hotel in July. Pic: PA
At a hearing last week, barristers for the council claimed Somani Hotels breached planning rules because the site is not being used for its intended purpose as a hotel.
Philip Coppel KC, for the council, said the problem was “getting out of hand” and “causing great anxiety” to local people.
He said the hotel “is no more a hotel [to asylum seekers] than a borstal to a young offender”.
Piers Riley-Smith, for Somani Hotels Limited, said a “draconian” injunction would cause “hardship” for those in the hotel, arguing “political views” were not grounds for an injunction to be granted.
He also said contracts to house asylum seekers were a “financial lifeline” for the hotel, which was only 1% full in August 2022, when it was open to paying customers.
Image: Protesters and counter-demonstrators outside The Bell Hotel in July. Pic: PA
The hotel housed migrants from May 2020 to March 2021, then from October 2022 to April 2024, with the council never instigating any formal enforcement proceedings against this use, Mr Riley-Smith said.
Migrants began to be housed there again in April 2025, and Mr Riley-Smith said a planning application was not made “having taken advice from the Home Office”.
At the end of the hearing last week, Mr Justice Eyre ordered that Somani Hotels could not “accept any new applications” from asylum seekers to stay at the site until he had made his ruling on the temporary injunction.
TikTok and Instagram have been accused of targeting teenagers with suicide and self-harm content – at a higher rate than two years ago.
The Molly Rose Foundation – set up by Ian Russell after his 14-year-old daughter Molly took her own life having viewed harmful content on social media – commissioned analysis of hundreds of posts on the platforms, using accounts registered as a 15-year-old girl based in the UK.
The charity claimed videos recommended by algorithms on the For You pages continued to feature a “tsunami” of clips containing “suicide, self-harm and intense depression” to under-16s who have previously engaged with similar material.
One in 10 of the harmful posts had been liked at least a million times. The average number of likes was 226,000, the researchers said.
Mr Russell told Sky News the results were “horrifying” and showed online safety laws are not fit for purpose.
Image: Molly Russell died in 2017. Pic: Molly Rose Foundation
‘This is happening on PM’s watch’
He said: “It is staggering that eight years after Molly’s death, incredibly harmful suicide, self-harm, and depression content like she saw is still pervasive across social media.
“Ofcom’s recent child safety codes do not match the sheer scale of harm being suggested to vulnerable users and ultimately do little to prevent more deaths like Molly’s.
“The situation has got worse rather than better, despite the actions of governments and regulators and people like me. The report shows that if you strayed into the rabbit hole of harmful suicide self-injury content, it’s almost inescapable.
“For over a year, this entirely preventable harm has been happening on the prime minister’s watch and where Ofcom have been timid it is time for him to be strong and bring forward strengthened, life-saving legislation without delay.”
Image: Ian Russell says children are viewing ‘industrial levels’ of self-harm content
After Molly’s death in 2017, a coroner ruled she had been suffering from depression, and the material she had viewed online contributed to her death “in a more than minimal way”.
Researchers at Bright Data looked at 300 Instagram Reels and 242 TikToks to determine if they “promoted and glorified suicide and self-harm”, referenced ideation or methods, or “themes of intense hopelessness, misery, and despair”.
Instagram
The Molly Rose Foundation claimed Instagram “continues to algorithmically recommend appallingly high volumes of harmful material”.
The researchers said 97% of the videos recommended on Instagram Reels for the account of a teenage girl, who had previously looked at this content, were judged to be harmful.
Some 44% actively referenced suicide and self-harm, they said. They also claimed harmful content was sent in emails containing recommended content for users.
A spokesperson for Meta, which owns Instagram, said: “We disagree with the assertions of this report and the limited methodology behind it.
“Tens of millions of teens are now in Instagram Teen Accounts, which offer built-in protections that limit who can contact them, the content they see, and the time they spend on Instagram.
“We continue to use automated technology to remove content encouraging suicide and self-injury, with 99% proactively actioned before being reported to us. We developed Teen Accounts to help protect teens online and continue to work tirelessly to do just that.”
TikTok
TikTok was accused of recommending “an almost uninterrupted supply of harmful material”, with 96% of the videos judged to be harmful, the report said.
Over half (55%) of the For You posts were found to be suicide and self-harm related, with a single search yielding posts promoting suicide behaviours, dangerous stunts and challenges, it was claimed.
The number of problematic hashtags had increased since 2023, with many shared by highly followed accounts that compiled ‘playlists’ of harmful content, the report alleged.
A TikTok spokesperson said: “Teen accounts on TikTok have 50+ features and settings designed to help them safely express themselves, discover and learn, and parents can further customise 20+ content and privacy settings through Family Pairing.
“With over 99% of violative content proactively removed by TikTok, the findings don’t reflect the real experience of people on our platform which the report admits.”
According to TikTok, it does not allow content showing or promoting suicide and self-harm, and says banned hashtags lead users to support helplines.
‘A brutal reality’
Both platforms allow young users to give negative feedback on harmful content recommended to them. But the researchers found users can also give positive feedback on this content and be sent more of it for the next 30 days.
Technology Secretary Peter Kyle said: “These figures show a brutal reality – for far too long, tech companies have stood by as the internet fed vile content to children, devastating young lives and even tearing some families to pieces.
“But companies can no longer pretend not to see. The Online Safety Act, which came into effect earlier this year, requires platforms to protect all users from illegal content and children from the most harmful content, like promoting or encouraging suicide and self-harm. 45 sites are already under investigation.”
An Ofcom spokesperson said: “Since this research was carried out, our new measures to protect children online have come into force.
“These will make a meaningful difference to children – helping to prevent exposure to the most harmful content, including suicide and self-harm material. And for the first time, services will be required by law to tame toxic algorithms.
“Tech firms that don’t comply with the protection measures set out in our codes can expect enforcement action.”
Image: Peter Kyle has said opponents of the Online Safety Act are on the side of predators. Pic: PA
‘A snapshot of rock bottom’
A separate report out today from the Children’s Commissioner found the proportion of children who have seen pornography online has risen in the past two years – also driven by algorithms.
Rachel de Souza described the content young people are seeing as “violent, extreme and degrading”, and often illegal, and said her office’s findings must be seen as a “snapshot of what rock bottom looks like”.
More than half (58%) of respondents to the survey said that, as children, they had seen pornography involving strangulation, while 44% reported seeing a depiction of rape, specifically of someone who was asleep.
The survey of 1,020 people aged between 16 and 21 found that they were on average aged 13 when they first saw pornography. More than a quarter (27%) said they were 11, and some reported being six or younger.
Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273-TALK.