Privacy firestorm: Apple announces it’s going to scan all iPhones for images of child abuse

Apple has announced that it plans to scan all of its iPhones for images of child abuse and child pornography, igniting a firestorm of protest from privacy advocates concerned that the tech giant could also be made to scan for unrelated content.

The company confirmed media reports in a blog post on Thursday that its new scanning technology, part of a suite of programs aimed at protecting children, will “evolve and expand.” The suite will be part of iOS 15, which the company plans to release sometime this month.

While Apple has often promoted itself as highly protective of users’ privacy rights, the company noted in its post that the software being deployed will actually strengthen existing protections “by avoiding the need to carry out widespread image scanning on its cloud servers,” The Epoch Times reported.

“This innovative new technology allows Apple to provide valuable and actionable information to [the National Center for Missing and Exploited Children] and law enforcement regarding the proliferation of known CSAM,” the blog post said, using an acronym that stands for “child sexual abuse material.”

“And it does so while providing significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account. Even in these cases, Apple only learns about images that match known CSAM,” the post noted.

Apple also said that its new suite of programs will employ new artificial intelligence and cryptography techniques to locate CSAM stored in iCloud Photos. The programs work by matching iCloud images against a database of known illegal images, according to the blog post. If the number of matching images in an iCloud Photos account reaches a certain threshold, the company will then review them.

If images are found to be illegal, the company will report them to the National Center for Missing and Exploited Children. Apple noted that the software won’t be used to scan videos.
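The matching-and-threshold flow described above can be sketched in a few lines. This is a deliberately simplified illustration, not Apple’s implementation: Apple’s actual system uses a perceptual hash called NeuralHash together with cryptographic techniques (threshold secret sharing and private set intersection), whereas this sketch substitutes an ordinary SHA-256 digest, a placeholder hash database, and an invented threshold value purely to show the logic of counting database matches before any account is flagged for review.

```python
import hashlib

# Placeholder database of known-image hashes (hypothetical entries,
# standing in for the real CSAM hash database, which consumers cannot
# inspect).
KNOWN_CSAM_HASHES = {
    hashlib.sha256(b"example-known-image-%d" % i).hexdigest() for i in range(5)
}

# Illustrative threshold only; the article does not state the real value.
REVIEW_THRESHOLD = 3


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash of the image content.

    A real perceptual hash matches visually similar images; SHA-256 is
    used here only so the example is self-contained and runnable.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(images: list[bytes]) -> int:
    """Count how many of the account's images match the hash database."""
    return sum(1 for img in images if image_hash(img) in KNOWN_CSAM_HASHES)


def should_flag_for_review(images: list[bytes]) -> bool:
    """Only when matches reach the threshold is the account reviewed."""
    return count_matches(images) >= REVIEW_THRESHOLD
```

The key design point the article describes is that a single match does nothing; human review is triggered only after the match count crosses the threshold, which is meant to reduce the impact of false positives.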

“Apple’s expanded protection for children is a game-changer,” John Clark, president and CEO of the center, said in a statement. “The reality is that privacy and child protection can coexist.”

However, The Epoch Times noted, several security researchers and experts aren’t so sure. While they applaud all efforts to root out and combat child abuse in all forms, they note that the program Apple plans to use could in fact result in major privacy breaches.

Apple’s new system is “an absolutely appalling idea,” Ross Anderson, a professor of security engineering at the University of Cambridge, told the Financial Times.

“It is going to lead to distributed bulk surveillance of … our phones and laptops,” he told the outlet.

Matthew Green, a professor and cryptographer at Johns Hopkins University, agreed.

“This sort of tool can be a boon for finding child pornography in people’s phones,” Green tweeted. “But imagine what it could do in the hands of an authoritarian government?”

He added that “even [if] you believe Apple won’t allow these tools to be misused [crossed fingers emoji] there’s still a lot to be concerned about,” noting further that “these systems rely on a database of ‘problematic media hashes’ that you, as a consumer, can’t review.”

Green went on to tell The Associated Press that he is also concerned that Apple could be pressured by authoritarian governments around the world to use the technology for nefarious purposes.

That said, Apple isn’t alone in this endeavor.

“Microsoft created PhotoDNA to assist companies in identifying child sexual abuse images on the internet, while Facebook and Google have implemented systems to flag and review possibly illegal content,” The Epoch Times reported.

Jon Dougherty
