Apple Pauses Plans To Scan Phones For Images Of Child Sexual Abuse

Mary J. Phillips
5 min read · Sep 5, 2021

Tech company Apple has delayed its plan to implement technology that would scan people’s iPhones and iPads and report images of child sexual abuse and pornography. The rollout, which was announced in August as part of Apple’s heightened protections for children, has not been given a new release date.

“Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of child sexual abuse material,” the company said in a statement Friday.

Apple’s plans came under sharp criticism from privacy advocates, who argued the technology could be repurposed by government agencies or other actors as a form of spyware and would be a stepping stone to broader violations of user and public privacy.

“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the company said in its release.

An iPhone 12 Pro ad is seen at an iSpot inside a shopping mall in Krakow, Poland, on August 26, 2021. Beata Zawrzel/NurPhoto via Getty Images

The planned feature would have made direct updates to the operating systems of iPhones and iPads, allowing the software to detect explicit images involving children stored in iCloud Photos or directly on devices. Accounts the software flagged as containing child sexual abuse material would be suspended and reported to the National Center for Missing and Exploited Children.

The update was part of additional features from Apple to ensure safety for child users. In addition to the scanning feature, the company also promoted a communication safeguard for underage users, which would blur explicit images sent to or received by a child and notify parents if their child opened explicit photos.

Famed former National Security Agency contractor and surveillance whistleblower Edward Snowden wrote in his newsletter that Apple’s potential update “will permanently redefine what belongs to you, and what belongs to them.”

The Surveillance Technology Oversight Project, a New York-based anti-surveillance group, said in a statement Friday that it welcomed the pause of the scanning technology but that the delay alone did not do enough to prevent breaches of privacy by Apple.

“If you’re building a system to search, you can set it to search for anything,” Executive Director Albert Fox Cahn told CBS News. “When you look at something as grotesque as child abuse, we instinctively want to do anything we can to stop it. In the process, we can easily create systems that exceed their mandates.”

“Even worse, they’ve created a model that can be easily hijacked by any foreign government whose jurisdiction Apple operates in to search for political materials or religious tracts or anything else they want to target,” he said.

Apple has not announced when the technology will launch but said it plans to spend the next several months incorporating additional “feedback” into the programming.

Apple Delays Plans To Scan Devices For Child Sexual Abuse Images

Apple has delayed plans to scan iPhones and iPads to look for collections of child sexual abuse images after backlash from customers and privacy advocates, the tech giant said Friday.

The company last month announced features aimed at flagging child sexual abuse images that users store on its iCloud servers. Apple did not say how long it will delay the program.

The system was built to look for images that match those from libraries assembled by law enforcement to find and track the dissemination of child abuse material on the internet.
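The public description is high-level, but the pattern it sketches is matching hashes of a user’s photos against a fixed list of hashes of known abuse images and acting only once a threshold of matches is reached. Below is a minimal, hypothetical sketch of that matching pattern: the function names are invented, SHA-256 stands in for Apple’s perceptual “NeuralHash,” and the 30-match threshold echoes the figure cited later in this article rather than a confirmed implementation detail.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for a vetted database of known-image hashes.
# Apple's system uses a perceptual hash ("NeuralHash"); SHA-256 is used
# here only to illustrate the match-against-a-known-list pattern.
KNOWN_HASHES = set()

# Roughly 30 matches, the figure cited later in this article (assumption).
MATCH_THRESHOLD = 30


def hash_image(path: Path) -> str:
    """Return a hex digest of the file's bytes (illustrative only)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def count_matches(photo_dir: Path) -> int:
    """Count photos whose hashes appear in the known-image list."""
    return sum(1 for photo in photo_dir.glob("*.jpg")
               if hash_image(photo) in KNOWN_HASHES)


def should_flag_for_review(photo_dir: Path) -> bool:
    """Flag for human review only once the match count crosses the threshold."""
    return count_matches(photo_dir) >= MATCH_THRESHOLD
```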

Some child safety advocates were disappointed by Apple’s announcement.

“We absolutely value privacy and want to avoid mass surveillance in any form from government, but to fail children and fail the survivors of child sexual abuse by saying we’re not going to look for known rape videos and images of children because of some extreme future that may never happen just seems wholly wrong to me,” said Glen Pounder, chief operating officer of Child Rescue Coalition, a nonprofit that develops software to help law enforcement identify people downloading child sexual abuse material.

And Nelson O. Bunn Jr., executive director of the National District Attorneys Association, lashed out at privacy advocates, whom he said had failed “to articulate how the protection of children and the prosecution of offenders is unable to be balanced with the privacy concerns of Apple’s customers.”

“Prosecutors and law enforcement continue to face significant hurdles in the fight to end the abuse and exploitation of our society’s most vulnerable victims,” Bunn added.

But privacy advocates, who had feared Apple’s plans could open innocent users to needless inspection, celebrated Friday’s announcement.

“I’m very happy to hear that Apple is delaying their CSAM technology rollout. It is half-baked, full of dangerous vulnerabilities, and breaks encryption for everyone,” tweeted Eran Toch, a professor in the Department of Industrial Engineering at Tel Aviv University who specializes in privacy issues.

“Child safety deserves a solution which is more thoughtful and more impactful.”

Apple’s effort seemed doomed from the start, with tech experts also saying the plan was too limited in scope.

“I’m not OK with it (Apple’s backtrack) but I’m not surprised, Apple bungled the release and the narrative got away from them very quickly,” University of California, Berkeley computer science professor Hany Farid said.

Farid, who in 2009 developed the Microsoft tool PhotoDNA to help police and tech companies find and remove known images of child sexual exploitation, was particularly critical that Apple’s plan focused on images rather than video, which accounts for a majority of this abusive material.

“They’re 10 years late to the game, they were solving at best less than half the problem and you can circumvent the technology very quick” by just not storing 30 or more illegal images on iCloud, Farid said.

CORRECTION (Sept. 3, 2021, 2:26 p.m. ET): A previous version of this article misspelled the first name of the chief operating officer of Child Rescue Coalition. He is Glen Pounder, not Glenn.

David K. Li is a breaking news reporter for NBC News.

Olivia Solon is a senior reporter on the tech investigations team for NBC News.

Sara Mhaidli contributed.

