Apple Confirms Detection of Child Sexual Abuse Material is Disabled When iCloud Photos is Turned Off - MacRumors

Apple today announced that iOS 15 and iPadOS 15 will see the introduction of a new method for detecting child sexual abuse material (CSAM) on iPhones and iPads in the United States.

iCloud General Feature
User devices will download an unreadable database of known CSAM image hashes and will do an on-device comparison to the user's own photos, flagging them for known CSAM material before they're uploaded to iCloud Photos. Apple says that this is a highly accurate method for detecting CSAM and protecting children.

CSAM image scanning is not an optional feature and it happens automatically, but Apple has confirmed to MacRumors that it cannot detect known CSAM images if the ‌iCloud Photos‌ feature is turned off.

Apple's method works by identifying a known CSAM photo on device and then flagging it with an attached cryptographic voucher when it is uploaded to ‌iCloud Photos‌. After a certain number of vouchers (i.e. flagged photos) have been uploaded to ‌iCloud Photos‌, Apple can interpret the vouchers and performs a manual review. If CSAM content is confirmed, the user's account is disabled and the National Center for Missing and Exploited Children is notified.
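The flow described above (on-device hash matching, per-photo vouchers, and a threshold before any human review) can be sketched roughly as follows. This is an illustrative simplification, not Apple's actual implementation: the real system uses a perceptual hash (NeuralHash) and cryptographic threshold secret sharing so that Apple cannot read any voucher until the threshold is crossed, and the `REVIEW_THRESHOLD` value and all names here are hypothetical.

```python
# Hypothetical sketch of threshold-based voucher flagging.
# Assumptions: exact-match hashing stands in for perceptual hashing,
# and plaintext booleans stand in for encrypted safety vouchers.

REVIEW_THRESHOLD = 30  # hypothetical threshold before manual review

# Opaque database of known CSAM hashes shipped to the device
# (placeholder values for illustration only).
KNOWN_CSAM_HASHES = {"a3f1c2", "9b2c7d"}

def scan_before_upload(photo_hashes):
    """Attach a safety voucher to each photo before it is uploaded.

    In the real design the match result is hidden inside an encrypted
    voucher; here it is a plain boolean for clarity.
    """
    return [{"hash": h, "match": h in KNOWN_CSAM_HASHES}
            for h in photo_hashes]

def server_can_review(vouchers):
    """Only when enough matching vouchers accumulate can the server
    interpret them and trigger a human review."""
    matches = sum(1 for v in vouchers if v["match"])
    return matches >= REVIEW_THRESHOLD
```

Because the matching step runs only as part of the upload pipeline, disabling ‌iCloud Photos‌ means no vouchers are ever produced, which is consistent with Apple's confirmation above.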

Because Apple is scanning ‌iCloud Photos‌ for the CSAM flags, it makes sense that the feature does not work with ‌iCloud Photos‌ disabled. Apple has also confirmed that it cannot detect known CSAM images in iCloud Backups if ‌iCloud Photos‌ is disabled on a user's device.

It's worth noting that Apple is scanning specifically for hashes of known child sexual abuse materials and it is not broadly inspecting a user's photo library or scanning personal images that are not already circulating among those who abuse children. Still, users who have privacy concerns about Apple's efforts to scan user photo libraries can disable ‌iCloud Photos‌.

Security researchers have expressed concerns over Apple's CSAM initiative, worrying that it could in the future be expanded to detect other kinds of content with political and safety implications, but for now, Apple's efforts are limited to detecting known CSAM.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.


Top Rated Comments

iObama
61 months ago

Scanning my photos for kiddie porn has no effect on my privacy, since I don’t have any kiddie porn.
Here's the thing. That's great that you don't have any of that on your device!

But if you ever live in a country that, for some reason, wants to find something on your device and have it flagged in order to charge you with a crime, this sets a dangerous precedent.

Surveillance technology, while often well-intentioned, can easily end up in the wrong hands for nefarious purposes.
Score: 77 Votes (Like | Disagree)
haruhiko
61 months ago
what’s next? scanning your stuff on iCloud for anti government materials for oppressive governments?
Score: 50 Votes (Like | Disagree)
zakarhino
61 months ago

Scanning my photos for kiddie porn has no effect on my privacy, since I don’t have any kiddie porn.
Random warrantless searches of your property have no effect on you because you're not a criminal.

Warrantless surveillance of your entire digital life has no effect on you because you're not a criminal.

^ Those statements are contingent on what the powers that be define as "criminal." If the definition changes tomorrow then they'll have all the infrastructure and law in place to subjugate you as they please. At one point in time it was practically considered criminal to be Japanese in the USA, not that you would care because that has "no effect" on you if you weren't Japanese during that period of time.


If that were to happen I’d be here railing against it.

But as long as they’re just helping catch these sick freaks they have nothing but my support.
If they have all the infrastructure in place then there is no "railing against it" because by that time it's too late. There's no "railing against" nuclear weapons once we've all gone up in smoke. There's no "railing against" climate change once it's already too late and the planet is no longer viable for human life. The time to "rail against" technologies like this is not after they've been abused when it's too late, the time is NOW when the technology is capable of doing those things but hasn't yet gone that far.

Apple are implementing a system that is capable of scanning all the photos on your device against a database of images that can include images not related to child abuse, regardless of whether or not you have iCloud turned on. It doesn't matter that right now it disables itself when iCloud is off and it doesn't matter the database supposedly only includes child abuse images, it is CAPABLE of being an authoritarian tool at a moment's notice via minimal updates in the same way a nuclear bomb is capable of decimating a country with a few button presses even if the bomb is currently sitting in a silo.

You would say there's no issue with the Patriot Act because you're not a terrorist, but it turns out the Patriot Act and its sister policies have been used to harass journalists and climate activists. It's not like there haven't been terror attacks on US soil since the Patriot Act was enacted. There were a thousand other things the US government could have done to prevent more terrorist attacks globally, but they chose the option of spying on every single citizen and violating people's constitutional rights instead. If you're actually interested in stopping "sickos" then support systems that actually combat the core issue rather than the "let's just police the entire public more" solution, which won't actually stop "sickos" (terrorists use their own encrypted chat tools they make themselves according to various reports, and most likely so do child abusers).

Nobody wants terrorists or child abusers in their community. Increasing the reach of warrantless, global spying programs is not the way to tackle the issue. Make no mistake, this system is capable of being a spying tool that bypasses end to end encryption regardless of how it's configured as of right now.
Score: 42 Votes (Like | Disagree)
61 months ago
Sounds like damage control by Apple. It's a bad feature, period. Having a workaround to disable it does not change that.
Score: 36 Votes (Like | Disagree)
TheYayAreaLiving 🎗️
61 months ago
Sounds like I’ll be turning off iCloud.

Apple, go ahead and release that 1TB iPhone.

Please, respect our privacy as consumers. Don’t be creepy. How times have changed!



Score: 35 Votes (Like | Disagree)
ArtOfWarfare
61 months ago

I agree with darcyf that if you're not doing anything wrong, then you have nothing to worry about.
Who decides what's wrong, though? Regardless of where you fall politically, there is likely something you do or have that some politician wants to make illegal.

We're starting with something that is fairly universal in people saying it's wrong, but it's a slippery slope. Now that the tools are there, an authoritarian government can start telling Apple to do whatever with it.

And everyone knows that Apple's commitment to human rights and privacy goes right out the window the moment the Chinese Communist Party asks for assistance in trampling them.
Score: 34 Votes (Like | Disagree)