Instagram is releasing a feature that will let users easily reset their recommendation algorithms, as the UK government strengthens its regulation of online safety.

With the new reset feature, users can clear their recommended content from Explore, Reels and their feed, potentially reducing the amount of harmful content they are exposed to.

It’s all part of Meta’s push to make the app safer for young people, after the company announced more private Teen accounts in September.

The feature, which will soon be rolled out globally, was announced as the government outlined its priorities for online safety.

Peter Kyle, Labour’s technology secretary, said Ofcom should ensure the concept of “safety by design” is being followed by tech companies from the outset.

That would ensure more harm is caught before it occurs.

Image: Peter Kyle (L) and Hilary Benn. Pic: Reuters

He also pushed for more transparency from tech giants on what harms are occurring on their platforms.


“From baking safety into social media sites from the outset, to increasing platform transparency, these priorities will allow us to monitor progress, collate evidence, innovate, and act where laws are coming up short,” Mr Kyle said.

While the announcement was welcomed by child protection groups, some cautioned that the government needed to go further.

Ian Russell, chair of trustees at the Molly Rose Foundation, said: “This announcement outlines a much-needed course correction, vital for improved online safety, and to prevent the new regulation falling badly short of expectations.

“However, while this lays down an important marker for Ofcom to be bolder, it is also abundantly clear that we need a new Online Safety Act to strengthen current structural deficiencies and focus minds on the importance of harm reduction.”


Meanwhile, the NSPCC has urged social media platforms to be more transparent and proactive about child safety.

“They should be disrupting ‘safe havens’ for offenders by tackling the hidden abuse taking place through private messaging,” said Maria Neophytou, director of strategy and knowledge at the NSPCC.

“It is right that the government is focusing on driving innovation and new technology that can identify and disrupt abuse and prevent harm from happening in the first place.

“The regulatory framework has the potential to change the online world for children.”


Europe’s new chat police: Chat Control legislation nudges forward in the EU


Representatives of European Union member states reached an agreement on Wednesday in the Council of the EU to move forward with the controversial “Chat Control” child sexual abuse regulation, paving the way for new rules targeting child sexual abuse material (CSAM) on messaging apps and other online services.

“Every year, millions of files are shared that depict the sexual abuse of children… This is completely unacceptable. Therefore, I’m glad that the member states have finally agreed on a way forward that includes a number of obligations for providers of communication services,” commented Danish Minister for Justice Peter Hummelgaard.

The deal, which follows years of division among member states and opposition from privacy groups, allows the legislative file to move into final talks with the European Parliament over when and how platforms can be required to scan user content for suspected child sexual abuse and grooming.

The existing CSAM framework is set to expire on April 3, 2026, and is on track to be replaced by the new legislation, pending detailed negotiations with European Parliament lawmakers.

EU Chat Control laws: What’s in and what’s out

In its latest draft, the Council maintains the core CSAM framework but modifies how platforms are encouraged to act. Online services would still have to assess how their products can be abused and adopt mitigation measures.

Service providers would also have to cooperate with a newly established EU Centre on Child Sexual Abuse to support the implementation of the regulation, and face oversight from national authorities if they fall short.

While the latest Council text removes the explicit mandate to scan all private messages, it extends the legal basis for “voluntary” CSAM detection indefinitely. There are also calls for tougher risk obligations for platforms.


A compromise that satisfies neither side

To end the Chat Control stalemate, a team of Danish negotiators in the Council worked to remove the most contentious element: the blanket mandatory scanning requirement. Under previous provisions, end-to-end encrypted services like Signal and WhatsApp would have been required to systematically search users’ messages for illegal material.

Yet, it’s a compromise that leaves both sides feeling shortchanged. Law enforcement officials warn that abusive content will still lurk in the corners of fully encrypted services, while digital rights groups argue that the deal still paves the way for broader monitoring of private communications and the potential for mass surveillance, according to a Thursday Politico report.

Javier Zarzalejos, lead negotiator and chair of the European Parliament’s Committee on Civil Liberties, Justice and Home Affairs, urged the Council and Parliament to begin negotiations without delay. He stressed the importance of establishing a legislative framework to prevent and combat child sexual abuse online while respecting encryption.


“I am committed to work with all political groups, the Commission, and member states in the Council in the coming months in order to agree on a legally sound and balanced legislative text that contributes to effectively preventing and combating child sexual abuse online,” he stated.

The Council celebrated the latest efforts to protect children from sexual abuse online; however, former Dutch Member of the European Parliament Rob Roos lambasted the Council for acting as if it were the “East German era, stripping 450 million EU citizens of their right to privacy.” He warned that Brussels was acting “behind closed doors” and that “Europe risks sliding into digital authoritarianism.”

Telegram founder and CEO Pavel Durov claimed that EU officials were exempt from having their own messages monitored. He commented in a post on X: “The EU weaponizes people’s strong emotions about child protection to push mass surveillance and censorship. Their surveillance law proposals conveniently exempted EU officials from having their own messages scanned.”


Privacy on trial in broader global crackdown

The latest movement on Chat Control lands in the middle of a broader global crackdown on privacy tools. European regulators and law enforcement agencies have pushed high-profile cases against crypto privacy projects like Tornado Cash, while US authorities have targeted developers linked to Samourai Wallet over alleged money laundering and unlicensed money transmission, thrusting privacy-preserving software into the crosshairs.

In response, Ethereum co-founder Vitalik Buterin doubled down on the right to privacy as a core value. He donated 128 ETH each (roughly $760,000 in total) to the decentralized messaging projects Session and SimpleX Chat, citing their importance in “preserving our digital privacy.”

Session president Alexander Linton told Cointelegraph that regulatory and technical developments are “threatening the future of private messaging,” while co-founder Chris McCabe said the challenge was now about raising global awareness.
