Ailan Evans, DCNF
Apple is delaying the release of a software update that scans iPhones for child pornography after critics said the features violated user privacy, the tech giant announced Friday.
“Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material,” the company said in a statement posted to its website Friday. “Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Apple unveiled a tool in August that automatically performed “on-device” scans of images users uploaded to iCloud, matching them against a database of known sexually explicit content of children. If the number of matches crossed a set threshold, Apple would manually review the images and report the user to a child sexual abuse watchdog.
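The threshold mechanism described above can be illustrated with a simplified sketch. This is a hypothetical toy version for illustration only: Apple’s actual system uses a perceptual hash (“NeuralHash”) and cryptographic threshold secret sharing, none of which is reproduced here; the function names, plain-hash matching, and threshold value below are assumptions.

```python
# Toy illustration of threshold-based matching -- NOT Apple's implementation.
# Apple's system uses perceptual hashing and cryptography so that no single
# match is visible to the company; only the counting idea is shown here.

MATCH_THRESHOLD = 30  # hypothetical value; Federighi cited "on the order of 30" matches

def count_matches(image_hashes, known_database):
    """Count how many of a user's image hashes appear in the known database."""
    return sum(1 for h in image_hashes if h in known_database)

def should_flag_for_review(image_hashes, known_database, threshold=MATCH_THRESHOLD):
    """Escalate an account for human review only once the match count
    reaches the threshold; below it, nothing is reported."""
    return count_matches(image_hashes, known_database) >= threshold
```

The design intent of the threshold is that isolated or accidental matches never trigger review; only accumulating many matches against the database does.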
The tool also included a feature enabling Apple to access encrypted messages by performing on-device scans to detect explicit content in message attachments.
Security experts and privacy advocates warned that the tool was a “backdoor” into users’ private communications and data that could be exploited by governments or other bad actors.
“All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts,” wrote India McKinney and Erica Portnoy of the Electronic Frontier Foundation.
“Americans are already spied on enough—and every Apple user should be acutely aware how much privacy they are losing,” Young Americans for Liberty spokesman Eric Brakey told the Daily Caller News Foundation.
Following the backlash, Craig Federighi, a senior vice president of software engineering at Apple, attempted to assuage privacy concerns in an interview with The Wall Street Journal. Federighi pointed out that Apple could identify users only if enough of their images matched the child pornography database, and he said the company scanned only images uploaded to iCloud.
“We, who consider ourselves absolutely leading on privacy, see what we are doing here as an advancement of the state of the art in privacy, as enabling a more private world,” Federighi said.
Federighi also said Apple’s poor messaging was partly to blame for the scanning tool’s reception.
Matthew Green, a cryptography professor at Johns Hopkins University and the first person to leak details of the scanning tool, called Apple’s announcement “promising.”
“Considering the number of privacy invasions users have learned to live with, the pushback on this line means something,” Green tweeted. “Learn from it.”