
Apple Abandons Plans to Scan Devices for CSAM

Written by Matt Milano
    Apple has completely abandoned one of its most controversial initiatives, one that would have involved scanning all devices for CSAM.

    Tech companies are always looking for ways to identify and root out Child Sexual Abuse Material (CSAM) from their platforms. Google, Microsoft, Meta, and others routinely scan content on their cloud platforms against a centralized database of CSAM content maintained by the National Center for Missing & Exploited Children (NCMEC).

    Apple’s proposed solution was very different. The company designed a two-step process that involved scanning consumers’ devices directly: Apple planned to install a database of hashes representing the files in NCMEC’s database on every iPhone, iPad, Mac, and Apple TV.

    To be clear, Apple was not going to place CSAM material on devices, only mathematical hashes representing it. Any device with iCloud enabled would then run the same mathematical hash on local photos and videos and compare the results to the database of NCMEC hashes. Once a threshold of matches was reached, the case would undergo human review before being forwarded to the authorities if the matches were confirmed. Until that point, all results would remain completely anonymous.
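The threshold idea described above can be sketched in a few lines of Python. This is a simplified illustration only: Apple’s actual design used a perceptual image hash ("NeuralHash") combined with private set intersection and threshold secret sharing, so neither Apple nor the device could see individual match results before the threshold was crossed. The function names, the use of SHA-256 as a stand-in hash, and the threshold value are assumptions for illustration (Apple publicly described a threshold of roughly 30 matches).

```python
import hashlib

# Illustrative threshold; Apple publicly described roughly 30 matches
# before an account would be escalated to human review.
MATCH_THRESHOLD = 30


def file_hash(data: bytes) -> str:
    """Hash a file's bytes. A stand-in for a perceptual image hash:
    real systems use hashes robust to resizing and re-encoding."""
    return hashlib.sha256(data).hexdigest()


def count_matches(local_files: list[bytes], known_hashes: set[str]) -> int:
    """Count how many local files match the known-hash database."""
    return sum(1 for data in local_files if file_hash(data) in known_hashes)


def needs_human_review(local_files: list[bytes], known_hashes: set[str]) -> bool:
    """Flag an account only once the match count crosses the threshold;
    below it, no individual result is surfaced."""
    return count_matches(local_files, known_hashes) >= MATCH_THRESHOLD
```

In the real proposal, the cryptography ensured that the per-file comparisons shown here were opaque to everyone until the threshold was met; this sketch only shows the matching-and-threshold logic.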

    Read More: The Biggest Beneficiary of Apple’s Privacy Crackdown: Apple

    After pushback from the industry and from security and privacy experts, Apple first delayed the rollout and has now abandoned its plans in favor of other, less dangerous methods.

    “After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021,” the company told WIRED in a statement. “We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.”

    The company will instead focus on its opt-in Communication Safety features that parents can activate to flag inappropriate texts, pictures, and videos sent to their children via iMessage.

    “Potential child exploitation can be interrupted before it happens by providing opt-in tools for parents to help protect their children from unsafe communications,” the company continued in its statement. “Apple is dedicated to developing innovative privacy-preserving solutions to combat Child Sexual Abuse Material and protect children, while addressing the unique privacy needs of personal communications and data storage.”

    See Also: Apple’s Privacy Hypocrisy: The $15 Billion Google Deal

    The new approach strikes a far better balance between the responsibilities Apple is trying to shoulder and the preservation of individual privacy. While Apple’s original scanning approach seemed promising in terms of privacy, it also posed a host of problems. Security and privacy experts immediately pointed out the danger of Apple being forced by governments to use its matching algorithm for other purposes, such as political, religious, or human rights surveillance. There are also documented instances of non-CSAM images being placed in the NCMEC database, opening the possibility of false positives.

    Not surprisingly, the EU recently proposed new rules that sound eerily similar to Apple’s method, while simultaneously acknowledging “the detection process would be the most intrusive one for users.”

    Interestingly, Princeton researchers built a similar system shortly before Apple did, ultimately tabled it, and wrote a paper on why it should never be used.

    “Our system could be easily repurposed for surveillance and censorship,” the researchers wrote. “The design wasn’t restricted to a specific category of content; a service could simply swap in any content-matching database, and the person using that service would be none the wiser.”

    Overall, Apple’s announcement is a welcome one. To be fair, however, more time will need to pass to ensure Apple lives up to its promise and has not been forced to implement its scanning technology covertly.
