
Apple’s Decision to Kill Its CSAM Photo-Scanning Tool Sparks Fresh Controversy

In December, Apple announced that it was abandoning its effort to develop a privacy-preserving iCloud photo-scanning tool for detecting child sexual abuse material (CSAM) on its platform. Originally announced in August 2021, the project had been controversial since its inception; Apple first paused it that September in response to concerns from digital rights groups and researchers that such a tool would inevitably be abused, compromising the privacy and security of all iCloud users. This week, a new child safety group called the Heat Initiative told Apple that it is organizing a campaign to demand that the company "detect, report, and remove" child sexual abuse material from iCloud and offer more tools for users to report CSAM to the company.
 
Today, in an unusual move, Apple responded to the Heat Initiative, explaining why it has abandoned development of the iCloud CSAM-scanning feature and is instead focusing on a set of on-device tools and resources for users known collectively as Communication Safety features. The company's response to Heat Initiative, which Apple shared with WIRED this morning, offers a rare look not just at its rationale for pivoting to Communication Safety, but at its broader views on creating mechanisms to circumvent user privacy protections, such as encryption, to monitor data. This stance is relevant to the encryption debate more broadly, especially as countries like the United Kingdom weigh passing laws that would require tech companies to be able to access user data to comply with law enforcement requests.
“Child sexual abuse material is abhorrent and we are committed to breaking the chain of coercion and influence that makes children susceptible to it,” Erik Neuenschwander, Apple's director of user privacy and child safety, wrote in the company's response to Heat Initiative. He added, though, that after collaborating with an array of privacy and security researchers, digital rights groups, and child safety advocates, the company concluded that it could not proceed with development of a CSAM-scanning mechanism, even one built specifically to preserve privacy.
“Scanning every user’s privately stored iCloud data would create new threat vectors for data thieves to find and exploit," Neuenschwander wrote. "It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types.”
 
WIRED could not immediately reach Heat Initiative for comment about Apple's response. The group is led by Sarah Gardner, former vice president of external affairs for the nonprofit Thorn, which works to use new technologies to combat child exploitation online and sex trafficking. In 2021, Thorn lauded Apple's plan to develop an iCloud CSAM scanning feature. Gardner said in an email to CEO Tim Cook on Wednesday, August 30, which Apple also shared with WIRED, that Heat Initiative found Apple's decision to kill the feature “disappointing.”
 
“We firmly believe that the solution you unveiled not only positioned Apple as a global leader in user privacy but also promised to eradicate millions of child sexual abuse images and videos from iCloud,” Gardner wrote to Cook. “I am a part of a developing initiative involving concerned child safety experts and advocates who intend to engage with you and your company, Apple, on your continued delay in implementing critical technology … Child sexual abuse is a difficult issue that no one wants to talk about, which is why it gets silenced and left behind. We are here to make sure that doesn’t happen.”
  
Apple maintains that, ultimately, even its own well-intentioned design could not be adequately safeguarded in practice, and that on-device nudity detection for features like Messages, FaceTime, AirDrop, and the Photo picker is a safer alternative. Apple has also begun offering an application programming interface (API) for its Communication Safety features so third-party developers can incorporate them into their apps. Apple says the communication platform Discord is integrating the features, and that app makers broadly have been enthusiastic about adopting them.
 
“We decided not to proceed with the proposal put forward several years ago for a hybrid client/server approach to CSAM detection for iCloud Photos,” Neuenschwander wrote in his letter to the Heat Initiative. “Ultimately, we concluded that it was virtually impossible to implement without compromising user security and privacy.”
 
In response to the Heat Initiative's demand that Apple create a CSAM reporting mechanism for users, the company told WIRED that its focus is on connecting vulnerable or victimized users directly with local resources and law enforcement that can help them, rather than positioning Apple as an intermediary for processing reports. The company says that offering such an intermediary service may make sense for interactive platforms like social networks.
