Apple Hit With $1.2B Lawsuit Over Abandoned CSAM Detection System

Apple is facing a lawsuit seeking $1.2 billion in damages over its decision to abandon plans for scanning iCloud photos for child sexual abuse material (CSAM), according to a report from The New York Times.

Filed in Northern California on Saturday, the lawsuit represents a potential group of 2,680 victims and alleges that Apple's failure to implement previously announced child safety tools has allowed harmful content to continue circulating, causing ongoing harm to victims.

In 2021, Apple announced plans to implement CSAM detection in iCloud Photos, alongside other child safety features. However, the company faced significant backlash from privacy advocates, security researchers, and policy groups who argued the technology could create potential backdoors for government surveillance. Apple subsequently postponed and later abandoned the initiative.

Explaining its decision at the time, Apple said that implementing universal scanning of users' private iCloud storage would introduce major security vulnerabilities that malicious actors could potentially exploit. Apple also expressed concerns that such a system could establish a problematic precedent, in that once content scanning infrastructure exists for one purpose, it could face pressure to expand into broader surveillance applications across different types of content and messaging platforms, including those that use encryption.
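For context on what "content scanning infrastructure" means in practice, the sketch below shows the simplest form of the idea: comparing hashes of stored images against a blocklist of known material. This is purely illustrative and is not Apple's design; Apple's 2021 proposal relied on perceptual hashing (NeuralHash) and cryptographic matching techniques rather than exact file hashes, and the hash set and function names here are hypothetical.

```python
# Illustrative sketch only -- not Apple's system. Real proposals used
# perceptual hashes (robust to resizing/re-encoding) and on-device
# cryptographic matching; this uses plain SHA-256 lookups for clarity.
import hashlib
from pathlib import Path

# Hypothetical blocklist of hex digests of known-bad images.
KNOWN_HASHES: set[str] = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def scan_library(photo_dir: Path) -> list[Path]:
    """Return image paths whose digests appear in the blocklist."""
    return [p for p in sorted(photo_dir.glob("*.jpg"))
            if file_digest(p) in KNOWN_HASHES]
```

The privacy debate described above centers on exactly this kind of infrastructure: once a matching pipeline exists, the contents of the blocklist, not the code, determine what gets flagged.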

The lead plaintiff in the lawsuit, filing under a pseudonym, said she continues to receive law enforcement notices about individuals being charged with possessing abuse images of her from when she was an infant. The lawsuit argues that Apple's decision not to proceed with its announced safety measures has forced victims to repeatedly relive their trauma.

In response to the lawsuit, Apple spokesperson Fred Sainz underlined the company's commitment to fighting child exploitation, stating that Apple is "urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users." Apple pointed to existing features like Communication Safety, which warns children about potentially inappropriate content, as examples of its ongoing child protection efforts.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Top Rated Comments

19 months ago
Maybe we should sue display manufacturers for having the ability to display illicit content. We could also sue towns, counties, states, and national governments for allowing people who engage in illegal activities to live within their borders. I mean, surely the government should be “scanning” your home to make sure you aren’t engaged in any activity that harms others, right?

If we are OK holding innocent people accountable for the actions of the perpetrators, it kind of seems like we could sue anyone and everyone…
Score: 113 Votes
19 months ago
Sued if you do, sued if you don’t.

Attorneys are the people winning either way.
Score: 104 Votes
BelgianChoklit
19 months ago
Apple had good intentions with CSAM detection, but it was abandoned, and for good reasons. iThink Apple was drunk having had the idea to introduce this thing.
Score: 54 Votes
cjsuk
19 months ago
Oh I'm really going to be popular with this one.

2,680 people's situation does not represent the greater good, which is hundreds of millions of people's communication security being put at risk by non-deterministic reporting and content moderation.

But of course 2,680 people will be happy to live with their problem, which won't be solved either way, if they get some cash for it. They just want money. And so do lawyers.

If people really want to fix this problem, it'll be a case of dealing with individuals via good old-fashioned social methods, i.e. effective policing and rehabilitation. But that's hard, so they'll take some money instead.
Score: 40 Votes
TVreporter
19 months ago
Apple is damned if they do; damned if they don’t.

While I can sympathize if the individual’s claim is true, how can they blame Apple?

The image(s) are likely circulated on far more Android and Windows devices than Apple’s.

And who is to say that, if Apple had implemented its program, it would have detected the victim’s images? All circumstantial; a judge should quickly quash this.
Score: 27 Votes
justperry
19 months ago
This is beyond dumb.
Score: 21 Votes