Apple faces lawsuit over child sexual abuse material detection on iCloud

Apple sued for $1.2 billion over alleged failure to stop child abuse images on iCloud

A lawsuit was filed against Apple on Saturday in U.S. District Court in Northern California.

The lawsuit accuses the tech giant of knowingly allowing its iCloud storage service to be used to store and distribute child sexual abuse material (CSAM).

The suit, filed by a 27-year-old woman on behalf of thousands of victims of child sexual abuse, alleges that Apple's inaction has caused further harm to victims.

The plaintiff was abused in infancy by a relative, who molested her, recorded the abuse, and shared the images online.

The woman continues to receive notifications from law enforcement when these images are found on various devices, including one on which the material had been stored in Apple's iCloud.

According to the lawsuit, which seeks $1.2 billion in damages, Apple developed a CSAM detection tool based on its NeuralHash technology but shelved the program in 2021 after privacy concerns were raised by activists and security researchers.

"Instead of using the tools that it had created to identify, remove, and report images of her abuse, Apple allowed that material to proliferate, forcing victims of child sexual abuse to relive the trauma that has shaped their lives,” the lawsuit stated.

The lawsuit aims to compel Apple to implement robust measures to prevent the storage and distribution of CSAM on its platform.

Additionally, it seeks compensation for a potential group of 2,680 victims who may be eligible to join the case.