GioCities

blogs by Gio


Client CSAM scanning: a disaster already

On August 5, 2021, Apple presented their grand new Child Safety plan. They promised “expanded protections for children” by way of a new system of global phone surveillance, where every iPhone would constantly scan all your photos and sometimes forward them to local law enforcement if it identified one as containing contraband. Yes, really.
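To make concrete what “scan and flag” means here: in outline, client-side scanning compares a fingerprint (hash) of each of your photos against a database of fingerprints of known contraband, and flags anything that matches. This is a deliberately minimal sketch with hypothetical names and data; Apple’s actual system uses a perceptual hash (NeuralHash) and a private set intersection protocol, not a plain cryptographic hash lookup:

```python
import hashlib

# Hypothetical blocklist of fingerprints of known contraband images.
# (The real system uses perceptual hashes so that visually similar
# images still match; SHA-256 is used here only for brevity.)
BLOCKLIST = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def scan_photo(photo_bytes: bytes) -> bool:
    """Return True if this photo's fingerprint is on the blocklist."""
    fingerprint = hashlib.sha256(photo_bytes).hexdigest()
    return fingerprint in BLOCKLIST

def scan_library(photos: list[bytes]) -> list[int]:
    """Indices of photos that would be flagged for reporting."""
    return [i for i, p in enumerate(photos) if scan_photo(p)]
```

The point of the sketch is the architecture, not the hash function: the matching runs on *your* device, against a database *you* can’t inspect, with matches reported onward.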

August 5 was a Thursday. This wasn’t dumped on a Friday night to avoid scrutiny; it was published with fanfare. Apple really thought they had a great idea here and expected to be applauded for it. They really, really didn’t. There are almost too many reasons to count why this is a terrible idea. But people still try things like this, so as much as I wish it were, my work is not done. God has cursed me for my hubris, et cetera. Let’s go all the way through this, yet again.

![giovan_h: I am so deeply frustrated at how much we have to repeat these extremely basic principles because people just refuse to listen. Like, yes, we know. Everyone should know this by now. It’s mind boggling. https://t.co/ythah58HHV](https://twitter.com/giovan_h/status/1423407586628493315)

The architectural problem this is trying to solve

Believe it or not, Apple actually does address a real architectural issue here. Half-heartedly addressing one architectural problem of many doesn’t mean your product is good, or even remotely okay, but they do at least do it. Apple published a 14-page summary of the problem model (starting on page 5). It’s a good read if you’re interested in that kind of thing, but I’ll summarize it here.