Apple Removes All References to Controversial CSAM Scanning Feature From Its Child Safety Webpage [Updated]

Apple has quietly nixed all mentions of CSAM from its Child Safety webpage, suggesting its controversial plan to detect child sexual abuse images on iPhones and iPads may hang in the balance following significant criticism of its methods.

Apple in August announced a planned suite of new child safety features, including scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

Following their announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, the privacy whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook's former security chief, politicians, policy groups, university researchers, and even some Apple employees.

The majority of criticism was leveled at Apple's planned on-device CSAM detection, which was lambasted by researchers for relying on dangerous technology that bordered on surveillance, and derided for being ineffective at identifying images of child sexual abuse.

Apple initially attempted to dispel misunderstandings and reassure users by publishing detailed information, sharing FAQs and other new documents, and making company executives available for interviews.

However, despite Apple's efforts, the controversy didn't go away. Apple eventually went ahead with the Communication Safety rollout for Messages, which went live earlier this week with the release of iOS 15.2, but it decided to delay the rollout of CSAM detection following the torrent of criticism that it clearly hadn't anticipated.

Apple said its decision to delay was "based on feedback from customers, advocacy groups, researchers and others... we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

The above statement was added to Apple's Child Safety page, but it has now been removed, along with all mentions of CSAM, raising the possibility that Apple has kicked the plan into the long grass or abandoned it altogether. We've reached out to Apple for comment and will update this article if we hear back.

Update: Apple spokesperson Shane Bauer told The Verge that though the CSAM detection feature is no longer mentioned on its website, plans for CSAM detection have not changed since September, which means CSAM detection is still coming in the future.

"Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," Apple said in September.


Top Rated Comments

Jim Lahey
54 months ago
If I ever find out this has been surreptitiously added without my knowledge then I will sell every Apple device I own and never buy another. Anyone who doesn’t have an issue with this has no clue of the concept of mission creep. If these systems are allowed to exist then it’s only a matter of time before the Feds batter your door in for having a [insert future dictator here] meme in your iCloud library. The road to hell is paved with good intentions.
Score: 62 Votes (Like | Disagree)
entropys
54 months ago
Good. Too much potential for abuse (and I am not saying that ironically).
Score: 46 Votes (Like | Disagree)
DaveFlash
54 months ago
Better. This was bound to fail from the start: you'd only need one bad actor feeding Apple's system wrong hashes, and everyone becomes a potential suspect for whatever a government wants to silence, like criticism, dissent, protestors in Hong Kong, LGBT minorities in certain regions, you name it. Also, as an EU citizen, I'm glad, as the system Apple proposed wouldn't have been allowed here anyway because of the strong protections in our GDPR privacy laws.
Score: 31 Votes (Like | Disagree)
DMG35
54 months ago
This was a mess from the beginning and pulling it was the only logical thing to do.
Score: 25 Votes (Like | Disagree)
Solomani
54 months ago
Good. It is only right that Apple should listen to its userbase.
Score: 23 Votes (Like | Disagree)
Agit21
54 months ago
I'm still not going to activate iMessage again or iCloud backup for pictures. Thank you, Tim.
Score: 23 Votes (Like | Disagree)