Apple bowed to privacy and civil liberties advocates Friday when it agreed to delay and modify a controversial plan to scan users’ photos for child pornography.
The company’s tool, called “neuralMatch,” would scan images on Apple users’ devices before they’re uploaded to iCloud. A separate tool would sift through users’ encrypted messages for child pornography.
Privacy advocates pushed back swiftly after Apple’s August announcement.
The Electronic Frontier Foundation, a digital privacy organization, collected more than 25,000 signatures on a petition to stop the tool. Meanwhile, the American Civil Liberties Union said in a letter that the tool would “censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”
Critics say the tool could easily be misused by repressive governments to track and punish users for all kinds of content besides child pornography, including political content.
And some privacy activists have pointed to Apple’s seemingly accommodating relationship with the government of China, where the vast majority of its devices are manufactured, as evidence that the company would allow the tool to be used for political repression.
In a call with reporters before Friday’s announcement, an Apple representative was asked whether the company would exit the Chinese market if authorities demanded it use the scanning tool for other purposes. The representative said such a decision would be “above their pay grade,” Vice reported.
In Friday’s announcement, Apple did not provide specifics on how it would change its child protection features, but acknowledged the backlash.
Apple has maintained that the tool would only flag images found in a database of known child pornography.
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” Apple said in a statement sent to media outlets. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
The initial plan was for the feature to be released this year. It’s now unclear when the company plans to release the features or how they’ll be changed.
Apple stated that the tool would not flag images unless they matched the database of known child pornography. Parents who take photos of their children bathing, for example, wouldn’t be flagged.
Another of Apple’s features would have scanned images sent to minors through iMessage for sexually explicit content, blurring such images and sending a warning to the child.
Johns Hopkins University cybersecurity researcher Matthew Green, a critic of Apple’s features, said Friday’s move “looks promising.”
“Talk to the technical and policy communities before you do whatever you’re going to do,” Green wrote in a Twitter thread addressing Apple. “Talk to the general public as well. This isn’t a fancy new Touchbar: it’s a privacy compromise that affects 1 [billion] users.”
Electronic Frontier Foundation Executive Director Cindy Cohn said in a statement to The Post that Friday’s delay, while welcome, doesn’t go far enough.
“The company must go further than just listening and drop its plans to put a backdoor into its encryption entirely,” Cohn said. “These features would create an enormous danger to iPhone users’ privacy and security, offering authoritarian governments a turnkey mass surveillance system to spy on citizens.”
“The enormous coalition that has spoken out will continue to demand that user phones — both their messages and their photos — be protected, and that the company maintain its promise to provide real privacy to its users,” Cohn added.
Groups like the Electronic Frontier Foundation and the American Civil Liberties Union condemned Apple’s move.