University Researchers Who Built a CSAM Scanning System Urge Apple to Not Use the 'Dangerous' Technology - MacRumors

Respected university researchers are sounding the alarm bells over the technology behind Apple's plans to scan iPhone users' photo libraries for CSAM, or child sexual abuse material, calling the technology "dangerous."

Jonathan Mayer, an assistant professor of computer science and public affairs at Princeton University, and Anunay Kulshrestha, a researcher at Princeton University's Center for Information Technology Policy, penned a joint op-ed for The Washington Post outlining their experience building image detection technology.

The researchers started a project two years ago to identify CSAM in end-to-end encrypted online services. They note that, given their field, they "know the value of end-to-end encryption, which protects data from third-party access." That commitment, they say, is precisely why they are horrified that CSAM is "proliferating on encrypted platforms."

Mayer and Kulshrestha said they wanted to find a middle ground: build a system that online platforms could use to find CSAM while still protecting end-to-end encryption. The researchers note that experts in the field doubted such a system was feasible, but they did manage to build it, and in the process they uncovered a significant problem.

We sought to explore a possible middle ground, where online services could identify harmful content while otherwise preserving end-to-end encryption. The concept was straightforward: If someone shared material that matched a database of known harmful content, the service would be alerted. If a person shared innocent content, the service would learn nothing. People couldn't read the database or learn whether content matched, since that information could reveal law enforcement methods and help criminals evade detection.

Knowledgeable observers argued a system like ours was far from feasible. After many false starts, we built a working prototype. But we encountered a glaring problem.
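The matching scheme the researchers describe can be illustrated with a toy sketch: hash each piece of content and compare it against a database of hashes of known harmful material. To be clear, this is a simplified illustration, not the authors' actual protocol or Apple's NeuralHash; real systems use robust perceptual hashes and cryptographic techniques (such as private set intersection) so that neither side learns anything about non-matching content, and the hash names and threshold below are invented for this example.

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is above the mean."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of bits in which two hashes differ."""
    return bin(a ^ b).count("1")

def matches_database(pixels, known_hashes, threshold=2):
    """Flag content whose hash is within `threshold` bits of a known-bad hash.

    The threshold is what trades recall against false positives -- the
    "glaring problem" the researchers describe: a near-match can flag
    content that is in fact innocent.
    """
    h = average_hash(pixels)
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)
```

Because matching is approximate, a slightly altered copy of known material still matches, but the same tolerance is what opens the door to false positives and adversarially crafted collisions.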

Since Apple announced the feature, the company has been bombarded with concerns that the CSAM detection system could be repurposed to detect other types of photos at the behest of oppressive governments. Apple has strongly pushed back against such a possibility, saying it will refuse any such request from governments.

Nonetheless, concerns over the future implications of the technology being used for CSAM detection are widespread. Mayer and Kulshrestha said they were "disturbed" by how governments could use the system to detect content other than CSAM.

A foreign government could, for example, compel a service to out people sharing disfavored political speech. That's no hypothetical: WeChat, the popular Chinese messaging app, already uses content matching to identify dissident material. India enacted rules this year that could require pre-screening content critical of government policy. Russia recently fined Google, Facebook and Twitter for not removing pro-democracy protest materials.

We spotted other shortcomings. The content-matching process could have false positives, and malicious users could game the system to subject innocent users to scrutiny.

We were so disturbed that we took a step we hadn't seen before in computer science literature: We warned against our own system design, urging further research on how to mitigate the serious downsides....

Apple has continued to address user concerns over its plans, publishing additional documents and an FAQ page. The company maintains that its CSAM detection system, which runs on a user's device, aligns with its long-standing privacy values.

