Apple Open to Expanding New Child Safety Features to Third-Party Apps - MacRumors

Apple Open to Expanding New Child Safety Features to Third-Party Apps

Apple today held a question-and-answer session with reporters regarding its new child safety features, and during the briefing, Apple confirmed that it would be open to expanding the features to third-party apps in the future.

As a refresher, Apple unveiled three new child safety features coming to future versions of iOS 15, iPadOS 15, macOS Monterey, and/or watchOS 8.

Apple's New Child Safety Features

First, an optional Communication Safety feature in the Messages app on iPhone, iPad, and Mac can warn children and their parents when receiving or sending sexually explicit photos. When the feature is enabled, Apple said the Messages app will use on-device machine learning to analyze image attachments, and if a photo is determined to be sexually explicit, the photo will be automatically blurred and the child will be warned.
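The flow Apple describes, on-device analysis followed by an automatic blur and warning, can be sketched roughly as follows. Apple has not published its model or any API for this feature, so `classify_image`, the threshold value, and the return shape here are hypothetical stand-ins for illustration only.

```python
def classify_image(image_bytes: bytes) -> float:
    """Hypothetical stand-in for Apple's unpublished on-device classifier.

    A real implementation would run a machine-learning model entirely on
    the device and return a confidence score; this stub always returns 0.0.
    """
    return 0.0


def handle_incoming_attachment(image_bytes: bytes, threshold: float = 0.9) -> dict:
    """Blur the photo and warn the child when the score crosses the threshold."""
    flagged = classify_image(image_bytes) >= threshold
    return {"blurred": flagged, "warn_child": flagged}
```

The key design point Apple emphasized is that the analysis happens on the device itself, so the photo content never leaves the phone for this check.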

Second, Apple will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies. Apple confirmed today that the process will only apply to photos being uploaded to iCloud Photos and not videos.
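Conceptually, this is a matching step against a database of hashes of already-known images, not a classifier judging new photos. Apple's actual system uses a perceptual hash (NeuralHash), a match threshold, and cryptographic safety vouchers; the sketch below substitutes a plain SHA-256 exact match, which would only flag byte-identical files, purely to illustrate the hash-lookup concept. The hash set here is invented for the example.

```python
import hashlib

# Hypothetical stand-in for the database of hashes of known CSAM images;
# the real hash list is supplied by NCMEC, not constructed like this.
KNOWN_IMAGE_HASHES = {
    hashlib.sha256(b"example known image").hexdigest(),
}


def matches_known_image(photo_bytes: bytes) -> bool:
    """Check a photo against the known-hash set.

    Per Apple, only photos being uploaded to iCloud Photos are checked,
    and videos are excluded.
    """
    return hashlib.sha256(photo_bytes).hexdigest() in KNOWN_IMAGE_HASHES
```

A perceptual hash differs from SHA-256 in that visually similar images (resized, recompressed) produce the same or nearby hashes, which is why Apple pairs it with a threshold before any report is made.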

Third, Apple will be expanding guidance in Siri and Spotlight Search across devices by providing additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.

Expansion to Third-Party Apps

Apple said that while it has no announcement to share today, expanding the child safety features to third parties so that users are even more broadly protected would be a desirable goal. Apple did not provide any specific examples, but one possibility could be the Communication Safety feature being made available to apps like Snapchat, Instagram, or WhatsApp so that sexually explicit photos received by a child are blurred.

Another possibility is that Apple's known CSAM detection system could be expanded to third-party apps that upload photos to services other than iCloud Photos.

Apple did not provide a timeframe for when the child safety features could expand to third parties, noting that it still has to complete testing and deployment of the features, and the company said it would also need to ensure that any potential expansion would not undermine the privacy properties or effectiveness of the features.

Broadly speaking, Apple said expanding features to third parties is the company's general approach and has been ever since it introduced support for third-party apps with the introduction of the App Store on iPhone OS 2 in 2008.


Top Rated Comments

crawfish963
60 months ago
This is getting worse and worse. No way this will backfire….
Score: 85 Votes
TheYayAreaLiving 🎗️
60 months ago
Third-party Apps? Come on now. Apple is doing the MOST now. Imagine if Facebook gets ahold of the photos/information. Doesn't WhatsApp belong to Facebook? SMH.

This is just getting creepier and creepier. What happened to this, Apple?

Score: 78 Votes
Naraxus
60 months ago
Apple used to be about privacy and security. Not any more. Apple has no more highground to stand on.
Score: 71 Votes
60 months ago
Complete blatant invasion of privacy no matter how you spin the benefits of it. Hope this severely backfires.
Score: 67 Votes
HiVolt
60 months ago
Apple wouldn't decrypt a freakin terrorists phone to help the investigation. Yet they are doing this on a massive scale now.
Score: 52 Votes
60 months ago

Or you know, just don’t have iCloud photos turned on.
Or be like 99.999% of people, and don’t be worried about features that Will not ever apply to you
That's not how this works..... that's not how any of this works!

haha... anyways - this argument is super weak and just begging for exploits/issues. Typical weak-ass argument against mass surveillance. "I'm not hiding anything, why do I care if the police randomly pull me over and throw me out of my car and search it." This thinking rapidly escalates and it's a VERY slippery slope and hard to turn back from.
Score: 43 Votes