Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM). The lawsuit alleges that by not doing more to prevent the spread of such material, Apple is causing further harm to victims and forcing them to relive their trauma. The legal action follows Apple's 2021 announcement of plans to detect known CSAM in iCloud photo libraries using digital signatures supplied by organizations such as the National Center for Missing and Exploited Children. Apple reportedly abandoned those plans after privacy and security advocates warned that such a system could create a backdoor for government surveillance.
The lawsuit was filed by a 27-year-old woman suing under a pseudonym. She says that images of her being molested as an infant were shared online by a relative, and that she continues to receive law enforcement notifications nearly every day concerning the possession of those images. According to attorney James Marsh, there may be a group of up to 2,680 victims entitled to compensation in this case.
Apple responded by saying it is “urgently and actively innovating” to combat CSAM while ensuring the security and privacy of its users. In August, a similar lawsuit was filed by a 9-year-old girl and her guardian, accusing Apple of failing to address CSAM on iCloud.