European Commission to Release Draft Law Enforcing Mandatory Detection of Child Sexual Abuse Material on Digital Platforms - MacRumors

European Commission to Release Draft Law Enforcing Mandatory Detection of Child Sexual Abuse Material on Digital Platforms

The European Commission is set to release a draft law this week that could require tech companies like Apple and Google to identify, remove and report to law enforcement illegal images of child abuse on their platforms, claims a new report out today.

European Commission
According to a leak of the proposal obtained by Politico, the EC believes voluntary measures taken by some digital companies have thus far "proven insufficient" in addressing the increasing misuse of online services for the purposes of sharing child sexual abuse content, which is why the commission wants to make detection of such material mandatory.

After months of lobbying, groups representing tech companies and children's rights organizations are said to be waiting to see how stringent the rules could be, and how they will work without tech companies having to scan the gamut of user content – a practice deemed illegal by the Court of Justice of the European Union in 2016.

Apart from how identification of illegal material would operate within the law, privacy groups and tech companies are worried that the EU executive's proposal could result in the creation of backdoors to end-to-end encrypted messaging services, whose contents cannot be accessed by the hosting platform.

The EC's Home Affairs Commissioner Ylva Johansson has said technical solutions exist to keep conversations safe while finding illegal content, but cybersecurity experts disagree.

"The EU shouldn't be proposing things that are technologically impossible," said Ella Jakubowska, speaking to Politico. Jakubowska is a policy adviser at European Digital Rights (EDRi), a network of 45 non-governmental organizations (NGOs).

"The idea that all the hundreds of millions of people in the EU would have their intimate private communications, where they have a reasonable expectation that that is private, to instead be kind of indiscriminately and generally scanned 24/7 is unprecedented," said Jakubowska.

MEPs are far from aligned on the issue, however. Reacting to the leak of the proposal, centrist Renew Europe MEP Moritz Körner told Politico the Commission's proposal would mean "the privacy of digital correspondence would be dead."

The heated debate mirrors last year's controversy surrounding Apple's plan to search for CSAM (child sexual abuse material) on iPhones and iPads.

Apple in August 2021 announced a planned suite of new child safety features, including scanning users' iCloud Photos libraries for CSAM and Communication Safety to warn children and their parents when receiving or sending sexually explicit photos. The latter, and arguably less controversial, feature is already live on Apple's iMessage platform. Apple's method of scanning for CSAM has yet to be deployed.

Following Apple's announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), Facebook's former security chief, politicians, policy groups, university researchers, and even some Apple employees.

The majority of criticism was leveled at Apple's planned on-device CSAM detection, which was lambasted by researchers for relying on dangerous technology that bordered on surveillance, and derided for being ineffective at identifying images of child sexual abuse.

Apple initially attempted to dispel misunderstandings and allay concerns by releasing detailed information and sharing interviews with company executives. Despite these efforts, however, the controversy didn't go away, and Apple decided to delay the rollout of CSAM detection following the torrent of criticism.

Apple said its decision to delay was "based on feedback from customers, advocacy groups, researchers and others... we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

In December 2021, Apple quietly nixed all mentions of CSAM from its Child Safety webpage, suggesting its controversial plan to detect child sexual abuse images on iPhones and iPads hung in the balance following significant criticism of its methods.

However, Apple says its plans for CSAM detection have not changed since September, which suggests CSAM detection in some form is still coming in the future.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.


Top Rated Comments

contacos
51 months ago
and at the same time another entity in the EU demands end-to-end encryption. Hilarious.

It starts with child porn and ends with having an opinion. Scary future
Score: 24 Votes
51 months ago

and at the same time another entity in the EU demands end-to-end encryption. Hilarious.

It starts with child porn and ends with having an opinion. Scary future
Europe is once again heading towards a very very dark place. Was always obvious that ever more centralisation of power and ever bigger empire was going to lead to misery.
Score: 17 Votes
51 months ago

Eh- not a fan of this.

Inevitably, "think of the children" always wins.
Also not a fan. The EU is doing too much, as per usual. The EU doesn't know how to leave people alone. The "it's for a good cause" argument is a terrible one. Do the people of the EU get to vote for their representation in the EU?
Score: 16 Votes
VulchR
51 months ago
Many posting above assume that Apple's local CSAM-detecting spying software is a response to pressure from governments like the US and EU. Perhaps. However, it is likely that Apple's system has given encouragement to governments that want 24/7 surveillance on our private lives. I can just picture authoritarian legislators now: 'See? Apple has a system that guarantees privacy [sic], so we can move ahead with this requirement for surveillance'.

Criminal investigations should begin with detection of crime. Global surveillance should not be used for the prevention of crime. The cost to liberty is too high.
Score: 11 Votes
Mac Fly (film)
51 months ago

and at the same time another entity in the EU demands end-to-end encryption. Hilarious.

It starts with child porn and ends with having an opinion. Scary future
Careful with having an opinion around here. MR doesn't love that; you may get a temporary ban for it. I live in the EU and hate the EU. Centralised power corrupts. If child porn were really the issue, we'd know Maxwell's client list and the court case transcripts would be made public. Alas, child porn only matters when the perps are not wealthy and powerful.
Score: 11 Votes
Abazigal
51 months ago
I previously opined that Apple's implementation was them trying to have their cake and eat it too - find a way to detect illegal material on one's device without human intervention, thus preserving one's privacy.

I continue to stand by this statement, and I believe that if Apple were ever to roll out said feature, it would be the least invasive means of scanning for CSAM compared to what the other companies are doing.

We also know now why Apple was exploring such a feature in the first place. Totally makes sense now.
Score: 10 Votes