After a week of criticism over its planned new system for detecting images of child sex abuse, Apple said on Friday that it will hunt only for pictures that have been flagged by clearinghouses in multiple countries.
That shift and others intended to reassure privacy advocates were detailed to reporters in an unprecedented fourth background briefing since the initial announcement, eight days earlier, of a plan to monitor customer devices.
Executives, who had previously declined to say how many matched images on a phone or computer it would take before the operating system notifies Apple for a human review and possible reporting to authorities, said on Friday that the threshold would start at 30 matches, though it could drop over time as the system improves.
Apple also said it would be easy for researchers to make sure that the list of image identifiers being sought on one iPhone was the same as the lists on all other phones, seeking to blunt concerns that the new mechanism could be used to target individuals. The company published a long paper explaining how it had reasoned through potential attacks on the system and defended against them.
Apple acknowledged that it had handled communications around the program poorly, triggering backlash from influential technology policy groups and even its own employees concerned that the company was jeopardising its reputation for protecting consumer privacy.
It declined to say whether that criticism had changed any of the policies or software, but said that the project was still in development and changes were to be expected.
Asked why it had only announced that the US-based National Center for Missing and Exploited Children would be a supplier of flagged image identifiers when at least one other clearinghouse would need to have separately flagged the same picture, an Apple executive said that the company had only finalised its deal with NCMEC.
The rolling series of explanations, each giving more details that make the plan seem less hostile to privacy, convinced some of the company’s critics that their voices were forcing real change.
“Our pushing is having an effect,” tweeted Riana Pfefferkorn, an encryption and surveillance researcher at Stanford University.
Apple said last week that it would check photos before they are stored on the iCloud online service, adding later that it would begin with just the United States.
Other technology companies perform similar checks once photos are uploaded to their servers. Apple’s decision to put key aspects of the system on the phone itself prompted concerns that governments could force Apple to expand the system for other uses, such as scanning for prohibited political imagery.
The controversy has even moved into Apple’s ranks, with employees debating the move in hundreds of posts on an internal chat channel, Reuters reported this week.
© Thomson Reuters 2021