A lawsuit has been filed against Apple over its failure to implement a system that would have scanned iCloud images for child sexual abuse material (CSAM).
According to The New York Times, the complaint contends that Apple’s inaction forces victims to relive their trauma as the material continues to circulate. The lawsuit argues that after introducing “a widely touted improved design aimed at protecting children,” Apple failed to “implement those designs or take any measures to detect and limit” this material.
When Apple first announced the technology in 2021, it said the system would identify known CSAM in customers’ iCloud libraries by matching digital signatures supplied by organizations such as the National Center for Missing and Exploited Children. But after privacy and security advocates argued that the system could open a backdoor for government surveillance, the company appeared to abandon the plan.
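To illustrate what matching against “digital signatures” means in practice, here is a minimal, hypothetical sketch in Python. The signature set, the photo_library folder, and the use of plain SHA-256 file digests are all stand-ins chosen for simplicity; Apple’s announced system relied on perceptual hashing of image content rather than exact file digests, so this is an analogy, not the actual design.

import hashlib
from pathlib import Path

# Hypothetical set of digests for known images, standing in for the
# signature lists that clearinghouses such as NCMEC maintain.
# NOTE: real scanning systems use perceptual hashes, which still match
# after resizing or re-encoding; SHA-256 is used here only for simplicity.
KNOWN_SIGNATURES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def matches_known_signature(image_path: Path) -> bool:
    """Return True if the file's digest appears in the known-signature set."""
    digest = hashlib.sha256(image_path.read_bytes()).hexdigest()
    return digest in KNOWN_SIGNATURES

# Example: flag matching files in a hypothetical local photo folder.
if __name__ == "__main__":
    library = Path("photo_library")
    flagged = [p for p in library.glob("*.jpg") if matches_known_signature(p)]
    print(f"{len(flagged)} image(s) matched known signatures")

The appeal of such a design is that the scanner only compares signatures, so it can flag previously identified material without interpreting the content of every photo; the critics’ concern was that the same matching machinery could be pointed at whatever signature list a government supplied.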
According to reports, a 27-year-old woman is suing Apple under a pseudonym. She says she was molested as an infant by a relative who shared images of the abuse online, and that she still receives law enforcement notices nearly every day about someone being charged with possessing those images.
Attorney James Marsh, who is involved in the case, said a potential group of 2,680 victims could be eligible for compensation.
TechCrunch has contacted Apple for comment. A company spokeswoman told The Times that Apple is “urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users.”
In August, a 9-year-old girl and her guardian filed a separate lawsuit against Apple, claiming the company had failed to address CSAM on iCloud.
SOURCE: TECHCRUNCH