
Apple Delays Rollout of Snoopy Phone Scanning—Will It Drop It Entirely?

Sept. 3, 2021 (EIRNS)—In response to an uproar over its plan to implement a system that would scan users’ iPhones for child sexual abuse material, Apple has announced that it will rethink its proposal:

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material [CSAM]. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Although the system was designed to detect content that is almost universally considered detestable, it presented enormous problems: it would be ineffective for its stated purpose, liable to political expansion to other prohibited content, and prone to abuse in generating false reports.

• If Apple relied on a list of hashes (not the images themselves) provided by security agencies, what would prevent those agencies from having phones scan for images representing political positions they opposed?

• If the goal were actually to stop the creation of this material, why not focus on finding the people creating it? To opt out of the system, a user need only turn off iCloud photo syncing, something that everyone holding this objectionable material will now do, completely eliminating the supposed benefit of this snooping system.

• It could easily be spoofed, as demonstrated by a proof-of-concept website that alters pictures of kittens so that they trigger the detection system (a rough sketch of how such hash matching works appears below). Want to cause someone to be investigated? Just send them these kitten pictures and Apple will generate an alert.
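
For illustration only, here is a minimal Python sketch of how an on-device, hash-list matching scheme of this general kind operates. The hash function, blocklist entry, and function names are hypothetical stand-ins, not Apple’s NeuralHash or its actual code. The point is twofold: the phone sees only opaque numbers, so it cannot know what a blocklist entry actually targets; and a “perceptual” hash that tolerates small image changes is exactly what makes collision spoofing possible.

    # Hypothetical sketch of hash-list scanning; not Apple's NeuralHash or its real API.

    def average_hash(pixels_8x8):
        """Toy perceptual hash over 64 grayscale values: one bit per pixel,
        set when the pixel is brighter than the image's mean."""
        mean = sum(pixels_8x8) / len(pixels_8x8)
        bits = 0
        for p in pixels_8x8:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    # Opaque blocklist supplied by an outside party: just numbers, no images.
    # The phone cannot tell whether an entry denotes abuse imagery or a protest poster.
    BLOCKLIST = {0xFFFFFFFF00000000}  # hypothetical entry

    def scan_before_upload(pixels_8x8):
        """Flag a photo if its perceptual hash appears in the blocklist."""
        return average_hash(pixels_8x8) in BLOCKLIST

    # Two visually different images can share the same hash, which is the basis
    # of the kitten-spoofing demonstrations: both of these get flagged.
    bright_image = [200] * 32 + [50] * 32   # high-contrast image
    faint_image  = [90] * 32 + [70] * 32    # much dimmer, different-looking image
    print(scan_before_upload(bright_image))  # True
    print(scan_before_upload(faint_image))   # True

Because the matching is deliberately tolerant of small changes, an attacker who can compute the hash can craft harmless pictures that collide with blocklisted entries, which is what the kitten demonstration exploited.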

Apple, which has heavily built its brand on privacy, should entirely eliminate its plans for on-phone scanning of users’ material.
