Here’s why Apple’s new child safety features are so controversial

Last week, Apple, with little warning, announced a new set of tools built into the iPhone designed to protect children from abuse. Siri will now offer resources to people who ask for child abuse material or who ask how to report it. iMessage will now flag nudes sent or received by kids under 13 and alert their parents. Images backed up to iCloud Photos will now be matched against a database of known child sexual abuse material (CSAM) and reported to the National Center for Missing and Exploited Children (NCMEC) if more than a certain number of images match. And that matching process doesn’t just happen in the cloud — part of it happens locally, on your phone. That’s a big change from how things normally work.
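To make that match-and-threshold mechanic concrete, here is a deliberately simplified Python sketch. It illustrates the general idea only, not Apple’s actual protocol: Apple’s published design uses a perceptual hash (NeuralHash) plus cryptographic techniques such as private set intersection and threshold secret sharing, so its servers learn nothing about an account until the threshold is crossed. The names here are hypothetical, and the threshold value reflects the roughly 30 matches Apple has publicly described.

```python
# Simplified illustration of threshold-based hash matching.
# This is NOT Apple's protocol; it only shows the reporting logic
# described above: compare image hashes against a database of known
# CSAM hashes, and flag an account only past a fixed threshold.

MATCH_THRESHOLD = 30  # Apple has said the initial threshold is about 30


def count_matches(photo_hashes: list[str], known_csam_hashes: set[str]) -> int:
    """Count how many of a user's photo hashes appear in the known database."""
    return sum(1 for h in photo_hashes if h in known_csam_hashes)


def should_flag_account(photo_hashes: list[str], known_csam_hashes: set[str]) -> bool:
    """Flag an account for human review only once the threshold is reached."""
    return count_matches(photo_hashes, known_csam_hashes) >= MATCH_THRESHOLD
```

In the system Apple described, no component ever computes a plain per-photo boolean like this: matches travel inside encrypted “safety vouchers” that the server can only decrypt once the threshold is exceeded.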

Apple says it has designed a much more private process, but one that involves scanning images on your phone. And that is a very big line to cross — basically, the iPhone’s operating system now has the capability to look at your photos and match them against a database of illegal content, and you cannot remove that capability. And while we might all agree that adding this capability is justifiable in the face of child abuse, there are huge questions about what happens when governments around the world, from the UK to China, ask Apple to match other kinds of images — terrorist content, images of protests, pictures of dictators looking silly. These kinds of demands are routinely made around the world. And until now, none of that happened on the phone in your pocket.

Riana Pfefferkorn and Jen King
Photo Illustration by Grayson Blackmon / The Stock Market Pioneer

To unpack all of this, I asked Riana Pfefferkorn and Jennifer King to join me on the show. They’re both researchers at Stanford: Riana specializes in encryption policies, while Jen specializes in privacy and data policy. She’s also worked on child abuse issues at big tech companies in the past.

I think for a company with as much power and influence as Apple, rolling out a system that changes an important part of our relationship with our personal devices deserves thorough and frequent explanation. I hope the company does more to explain what it’s doing, and soon.

The following excerpt has been lightly edited for clarity.

It feels like one enormous aspect of this entire controversy is the fact that the scanning is being done on the device at some point. That’s the Rubicon that’s been crossed: up until now, your local computer has not scanned your local storage in any way. But once you hit the cloud, all kinds of scanning happens. That’s problematic, but it happens.

But we have not yet entered the point where law enforcement is pushing a company to do local scanning on your phone, or your computer. Is that the big bright line here that’s causing all the trouble?

Riana Pfefferkorn: I view this as a paradigm shift: moving where the scanning happens from the cloud, where you are making the choice to say, “I’m going to upload these photos into iCloud,” and your data is held in a third party’s hands. You know, there’s that saying that “it’s not the cloud; it’s just somebody else’s computer,” right?

You’re kind of assuming some level of risk in doing that: that it might be scanned, that it might be hacked, whatever. Whereas moving it down onto the device — even if, right now, it’s only for photos that are in the cloud — I think is very different. It intrudes into what we consider a more private space, one that, until now, we could take for granted would stay that way. So I do view that as a really big conceptual shift.

Not only is it a conceptual shift in how people might think about this, but also from a legal standpoint. There is a big difference between data that you hand over to a third party and assume the risk that they’re going to turn around and report to the cops, versus what you have in the privacy of your own home or in your briefcase or whatever.

I do view that as a big change.

Jen King: I would add that some of the dissonance here is the fact that we just had Apple roll out the “Ask App Not to Track” prompt. The underlying setting existed before, but Apple made the dialog box prominent, asking you whether you want an app to track you when you use it. It seems a bit dissonant that they just shipped that feature, and then suddenly, we have this thing that seems almost more invasive on the phone.

But I would say, as someone who’s been studying privacy in the mobile space for almost a decade, there is already an extent to which these phones aren’t ours, especially when you have third-party apps pulling your data off the device, which has been a feature of this ecosystem for some time. This is a paradigm shift, but maybe in the sense that areas of the phone we thought were off-limits are now less so than they were before.

The sense that you’ve been able to control the data on your phone has been nothing more than an illusion for most people for quite a while now.

The idea that you have a local phone with a networking stack that talks to a server and comes back — that is almost a 1990s conception of connected devices, right? In 2021, everything in your house is always talking to the internet, and the line between the client and the server is extremely blurry, to the point where we market the networks themselves. We market 5G not just for speed but for capability, whether or not that’s true.

But that fuzziness between client and server and network means that the consumer might expect privacy on local storage versus cloud storage, but I’m wondering if this is actually a line that we crossed — or if just because Apple announced this feature, we’re now perceiving that there should be a line.

RP: It’s a great point, because there are a number of people doing the equivalent of “If the election goes the wrong way, I’m going to move to Canada” by saying, “I’m just going to abandon Apple devices and move to Android instead.” But Android devices are basically just a local version of your Google cloud. I don’t know if that’s better.

And at least you can fork Android, [although] I wouldn’t want to run a forked version of Android that I sideloaded from some sketchy place. But we’re talking about the possibility that people just don’t necessarily understand how the different architectures of their phones work.

A point that I’ve made before is that people’s rights, their privacy, their free expression, shouldn’t depend upon a consumer choice they made at some point in the past. Whether the data on their phone is really theirs, or whether it actually lives in the cloud, shouldn’t be path-dependent for the rest of time.

But you’re right that, as the border becomes blurrier, it becomes harder to reason about these things from arm’s length, and harder for average people to understand them and make choices accordingly.

JK: Privacy shouldn’t be a market choice. I think it’s a market failure, for the most part, across the industry. One of the assumptions we had going into the internet in the early 2000s was that privacy could be a competitive value. And we do see a few companies competing on it. DuckDuckGo comes to mind, for example, on search. But bottom line, many aspects of privacy shouldn’t be left up to the market.

Full transcript coming soon.
