“A slippery slope of unintended consequences”: Apple details reasons to abandon CSAM-scanning tool, more controversy ensues

Safety groups remain concerned about child sexual abuse material scanning and user reporting.

Lily Hay Newman, wired.com – Sep 2, 2023 10:33 am UTC

In December, Apple said that it was killing an effort to design a privacy-preserving iCloud photo scanning tool for detecting child sexual abuse material (CSAM) on the platform. Originally announced in August 2021, the project had been controversial since its inception. Apple first paused it that September in response to concerns from digital rights groups and researchers that such a tool would inevitably be abused and exploited to compromise the privacy and security of all iCloud users. This week, a new child safety group known as Heat Initiative told Apple that it is organizing a campaign to demand that the company detect, report, and remove child sexual abuse material from iCloud and offer more tools for users to report CSAM to the company.

Today, in a rare move, Apple responded to Heat Initiative, outlining its reasons for abandoning the development of its iCloud CSAM scanning feature and instead focusing on a set of on-device tools and resources for users known collectively as Communication Safety features. The company’s response to Heat Initiative, which Apple shared with WIRED this morning, offers a rare look not just at its rationale for pivoting to Communication Safety, but at its broader views on creating mechanisms to circumvent user privacy protections, such as encryption, to monitor data. This stance is relevant to the encryption debate more broadly, especially as countries like the United Kingdom weigh passing laws that would require tech companies to be able to access user data to comply with law enforcement requests.

“Child sexual abuse material is abhorrent and we are committed to breaking the chain of coercion and influence that makes children susceptible to it,” Erik Neuenschwander, Apple’s director of user privacy and child safety, wrote in the company’s response to Heat Initiative. He added, though, that after collaborating with an array of privacy and security researchers, digital rights groups, and child safety advocates, the company concluded that it could not proceed with development of a CSAM-scanning mechanism, even one built specifically to preserve privacy.

“Scanning every user’s privately stored iCloud data would create new threat vectors for data thieves to find and exploit,” Neuenschwander wrote. “It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types.”

WIRED could not immediately reach Heat Initiative for comment about Apple’s response. The group is led by Sarah Gardner, former vice president of external affairs for the nonprofit Thorn, which works to use new technologies to combat child exploitation online and sex trafficking. In 2021, Thorn lauded Apple’s plan to develop an iCloud CSAM scanning feature. Gardner said in an email to CEO Tim Cook on Wednesday, which Apple also shared with WIRED, that Heat Initiative found Apple’s decision to kill the feature disappointing.

“We firmly believe that the solution you unveiled not only positioned Apple as a global leader in user privacy but also promised to eradicate millions of child sexual abuse images and videos from iCloud,” Gardner wrote to Cook. “I am a part of a developing initiative involving concerned child safety experts and advocates who intend to engage with you and your company, Apple, on your continued delay in implementing critical technology… Child sexual abuse is a difficult issue that no one wants to talk about, which is why it gets silenced and left behind. We are here to make sure that doesn’t happen.”

Apple maintains that, ultimately, even its own well-intentioned design could not be adequately safeguarded in practice, and that on-device nudity detection for features like Messages, FaceTime, AirDrop, and the Photo picker is a safer alternative. Apple has also begun offering an application programming interface (API) for its Communication Safety features so third-party developers can incorporate them into their apps. Apple says that the communication platform Discord is integrating the features and that app makers broadly have been enthusiastic about adopting them.

“We decided to not proceed with the proposal for a hybrid client-server approach to CSAM detection for iCloud Photos from a few years ago,” Neuenschwander wrote to Heat Initiative. “We concluded it was not practically possible to implement without ultimately imperiling the security and privacy of our users.”

On Heat Initiative’s request that Apple create a CSAM reporting mechanism for users, the company told WIRED that its focus is on connecting its vulnerable or victimized users directly with local resources and law enforcement in their region that can assist them, rather than on positioning itself as a middleman for processing reports. The company says that offering this intermediary service may make sense for interactive platforms like social networks.

The need to protect children from online sexual abuse is urgent, though, and as these concerns intersect with the broader encryption debate, Apple’s resolve on refusing to implement data scanning will continue to be tested.

Read the full exchange between Heat Initiative and Apple at WIRED. WIRED has redacted sensitive personal information for the privacy of senders and recipients.

This story originally appeared on wired.com.


Science

Germany to Send First European Astronaut Around the Moon on Artemis Mission


Europe has secured its first astronaut seat to orbit the Moon through NASA’s Artemis program, marking a historic milestone for ESA. Director General Josef Aschbacher confirmed that a German astronaut will take the inaugural European lunar-orbit mission, enabled by Europe’s contributions to Orion’s service module and the Lunar Gateway. Veteran astronauts Matthias…


Politics

Lawmakers stumble on stablecoin terms as US Congress grills Fed’s Bowman


US Representative Stephen Lynch pressed Federal Reserve Vice Chair Michelle Bowman on Tuesday over her past remarks encouraging banks to “engage fully” with digital assets, questioning the Fed’s role in advancing crypto frameworks while showing confusion over the definition of stablecoins.

In a Tuesday oversight hearing, Lynch asked Bowman, the Fed vice chair for supervision, about remarks she had made at the Santander International Banking Conference in November. According to the congressman, Bowman said she supported banks “[engaging] fully” with respect to digital assets.

However, Bowman’s comments at the conference referred to “digital assets” broadly rather than to cryptocurrencies specifically. The questioning then turned to the distinction between digital assets and stablecoins.

The Fed official said that the central bank had been authorized by Congress — specifically, the GENIUS Act, a bill aimed at regulating payment stablecoins — to explore a framework for digital assets.

“The GENIUS Act requires us to promulgate regulations to allow these types of activities,” said Bowman.

Representative Stephen Lynch at Tuesday’s oversight hearing. Source: House Financial Services Committee

While the price of many cryptocurrencies can be volatile, stablecoins, like those pegged to the US dollar, are generally “stable,” as the name suggests. Though there have been instances where some coins have depegged from their respective currencies, such as the crash of Terra’s algorithmic stablecoin in 2022, the overwhelming majority of stablecoins rarely fluctuate past 1% of their peg.
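The peg math described above is simple to state precisely. As a minimal sketch (the function names and the 1% band are illustrative, not drawn from any exchange’s or regulator’s methodology), deviation from a peg can be expressed as a percentage of the peg price:

```python
def peg_deviation(price: float, peg: float = 1.0) -> float:
    """Percent deviation of a coin's market price from its peg.

    For a dollar-pegged stablecoin, peg defaults to 1.0 (USD).
    """
    return abs(price - peg) / peg * 100.0


def is_depegged(price: float, peg: float = 1.0, threshold_pct: float = 1.0) -> bool:
    """Flag a coin whose price has drifted beyond the threshold.

    The 1% threshold mirrors the article's observation that most
    stablecoins rarely fluctuate past 1% of their peg; it is an
    illustrative cutoff, not a formal definition.
    """
    return peg_deviation(price, peg) > threshold_pct
```

Under this sketch, a dollar-pegged coin trading at $0.998 is only 0.2% off its peg and stays within the band, while a collapse like Terra’s 2022 crash (UST trading near $0.10) would register a 90% deviation and be flagged immediately.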

Related: Atkins says SEC has ‘enough authority’ to drive crypto rules forward in 2026

Bowman said in August that staff at the Fed should be permitted to hold small “amounts of crypto or other types of digital assets” to gain an understanding of the technology.

FDIC acting chair says stablecoin framework is coming soon

Also testifying at the Tuesday hearing was Travis Hill, acting chair of the Federal Deposit Insurance Corporation. The government agency is one of many responsible for implementing the GENIUS Act, which US President Donald Trump signed into law in July.

According to Hill, the FDIC will propose a stablecoin framework “later this month,” which will include requirements for supervising issuers.