
CSAM stands for Child Sexual Abuse Material, and it is something tech companies have been trying to catch up with for decades, not very successfully. There are countless horror stories of CSAM content popping up all over the Internet (I am not going to talk about the dark web, since that is a universe of its own where CSAM runs rampant).

9to5Mac neatly summarizes what it is and how Apple is detecting it:

How is it usually detected?

The usual way to detect CSAM is when cloud services like Google Photos scan uploaded photos and compare them against a database of known CSAM images. This database is provided by NCMEC and similar organizations around the world.

The actual matching process uses what’s known as a hash, or digital fingerprint. This is derived from key elements of the image and is deliberately fuzzy so that it will continue to work when images are resized, cropped, or otherwise processed. This means there will sometimes be false positives: an innocent image whose hash happens to be a close enough match to a CSAM one.
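To make the "fuzzy hash" idea concrete, here is a minimal sketch in Swift that compares two hypothetical 64-bit perceptual hashes by Hamming distance. Apple's actual hashing function and matching thresholds are more sophisticated and are not public in this form; the values below are purely illustrative.

```swift
// Minimal sketch of fuzzy hash matching via Hamming distance,
// assuming 64-bit perceptual hashes. Values and threshold are illustrative.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount  // number of bits where the two hashes differ
}

let databaseHash: UInt64 = 0xA3F1_09BC_44D2_17E8   // hypothetical known-CSAM hash
let photoHash: UInt64    = 0xA3F1_09BC_44D2_17EC   // hypothetical hash of an uploaded photo

let threshold = 4  // hashes within a few bits are treated as a "match"
if hammingDistance(databaseHash, photoHash) <= threshold {
    print("Possible match - flag for review")
} else {
    print("No match")
}
```

Because the hash is deliberately tolerant of small differences, a resized or lightly edited copy still lands within the threshold, which is also why occasional false positives are unavoidable.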

How is Apple detecting CSAM?

Apple made an announcement in early August 2021 about its own plans to begin scanning for CSAM.

Apple has chosen to take a somewhat different approach, which it says better protects privacy. This process is:

  • Apple downloads the CSAM database hashes to your iPhone
  • An on-device process looks for matches with hashes of your photos
  • If fewer than 30 matches are found, no action is taken
  • If 30 or more matches are found, low-resolution versions of your photos are manually examined by Apple
  • If the photos are found to be innocent, no further action is taken
  • If manual review confirms them as CSAM, law enforcement is informed
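The list above can be sketched roughly in code. The following Swift snippet is a simplified illustration of that on-device, threshold-gated flow; every name in it (the hash set, the photo map, SafetyVoucher, the helper logic) is a hypothetical placeholder, not Apple's actual cryptographic implementation.

```swift
// Simplified sketch of the on-device flow described in the list above.
// All names here are hypothetical placeholders, not Apple's real API.
struct SafetyVoucher { let photoID: String }   // stand-in for Apple's encrypted voucher

let csamHashes: Set<UInt64> = []               // hash database downloaded to the device
let photoHashes: [String: UInt64] = [:]        // photo ID -> hash computed on-device

// Steps 1-2: match each photo's hash against the local database.
let vouchers: [SafetyVoucher] = photoHashes.compactMap { (id, hash) in
    csamHashes.contains(hash) ? SafetyVoucher(photoID: id) : nil
}

// Steps 3-4: only a count at or above the threshold triggers manual review.
let reviewThreshold = 30
if vouchers.count >= reviewThreshold {
    print("\(vouchers.count) matches - low-resolution copies go to human review")
} else {
    print("Below threshold - no action taken")
}
```

The key design choice is that the comparison happens on your device rather than on Apple's servers, and nothing is escalated until the match count crosses the threshold.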

According to Gordon Kelly from Forbes, “in a new editorial published by The Washington Post, a pair of researchers who spent two years developing a CSAM (child sexual abuse material) detection system similar to the one Apple plans to install on users’ iPhones, iPads and Macs next month, have delivered an unequivocal warning: it’s dangerous”.

The technicalities of how this technology is being implemented are mind-boggling, and Apple has admitted it has been running CSAM-scanning technology since 2019.

Should you be worried about this?

This is a very tricky question, and it all depends on where you stand on privacy. The researchers mentioned in the WaPo article (Jonathan Mayer and Anunay Kulshrestha of Princeton University) conclude that, like anything digital, the technology can easily be weaponized; they point out, for example, that the Chinese government could turn it against political dissidents. Another downside of this implementation is that the system could produce false positives and potentially ruin someone's reputation.

On the other hand, this technology has the potential to detect and curb the circulation of CSAM. It also presents a conundrum for Apple, since it is well known in the cybersecurity community that iPhones/iOS are (or were?) more secure than Android systems.

Privacy experts' main concern is the method Apple has chosen to implement the technology: downloading the hash database to your phone and running the comparison on-device.

CSAM detection will launch with iOS 15, iPadOS 15, watchOS 8, and macOS Monterey in September 2021.