Thousands of victims have sued Apple over its alleged failure to detect and report illegal child pornography, also known as child sexual abuse material (CSAM).
The proposed class action comes after Apple scrapped a controversial CSAM-scanning tool last fall that was supposed to significantly curb the spread of CSAM on its products. Apple defended its decision to kill the tool after dozens of digital rights groups raised concerns that governments could seek to use the functionality to illegally surveil Apple users for other purposes. Apple was also concerned that bad actors could exploit the functionality to harm its users, and it sought to protect innocent users from false content flags.
The child sexual abuse survivors who are suing accuse Apple of using that cybersecurity defense to ignore the tech giant's mandatory CSAM reporting duties. If they win over a jury, Apple could face more than $1.2 billion in penalties. And perhaps most notably for privacy advocates, Apple could also be forced to "identify, remove, and report CSAM on iCloud and implement policies, practices, and procedures to prevent continued dissemination of CSAM or child sex trafficking on Apple devices and services." That could mean a court order to implement the controversial tool or an alternative that meets industry standards for detecting CSAM at scale.
In a statement, Apple did not address the survivors' key complaints about detecting known CSAM. Some survivors, now in their late 20s, were victimized as infants or toddlers and have been retraumatized for decades by ongoing crime notifications, including some showing that images of their abuse had been found on Apple devices and services. But Apple's current safety efforts appear to focus more on detecting grooming or new CSAM than on stopping the retraumatization caused by known CSAM continuing to spread.
"Child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk," Apple's spokesperson said. "We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users. Features like Communication Safety, for example, warn children when they receive or attempt to send content that contains nudity to help break the chain of coercion that leads to child sexual abuse. We remain deeply focused on building protections that help prevent the spread of CSAM before it starts."
One of the plaintiffs, who is suing anonymously to prevent further harm, accused Apple of turning a "blind eye" to CSAM for years while she endures a "never-ending nightmare." One survivor told The New York Times that she is suing Apple because it "broke its promise to protect victims like her" when it failed to implement the CSAM detector.