Apple has been sued over its decision not to implement a CSAM detection system that would have scanned iCloud photos for child sexual abuse material, TechCrunch reports.
Apple first announced the CSAM detection system in 2021. It was meant to match images in iCloud libraries against digital signatures from the National Center for Missing & Exploited Children to identify known CSAM content. The initiative drew criticism over concerns that it could create a backdoor for government surveillance of users, and Apple ultimately abandoned the plan. The company must now defend that decision in court.
The lawsuit alleges that Apple's inaction in combating the spread of illegal content forces victims to relive their trauma. It notes that the company announced "a much-publicized improved design to protect children" but never implemented it and took no steps to detect or restrict such material.
The suit was filed by a 27-year-old woman under a pseudonym. She says she was sexually abused as a child by a family member who shared the images online, and that she still receives near-daily notices from law enforcement that someone has been charged with possessing them.
James Marsh, an attorney involved in the case, says there is a potential group of 2,680 victims who could be entitled to compensation.