
Apple's CSAM "solution" was a disjointed, executive-driven fantasy project, rolled out incompetently in pursuit of what they assumed would be career-benefiting accolades. Their own people had no basic answers for how or why it worked, and no plan for hardening it against client-side attack. https://t.co/5X8YNaoNKs

Apple even had a whole little out-of-schedule announcement area on their site, which some bubble-inhabiting PR group was forced to build for whatever executive posse needed it for their own public fellation.

I had people messaging me, asking if Apple was going to flag photos of their own children as CSAM, because the announcement conflated the technologies.
For a firm hailed for its marketing, this was a colossal public fuck-up that hurt everybody, including CSAM victims.
Seriously, you inexcusable morons.

Apple's "Child Safety" announcement may be one of the worst public communications failures in recent memory. It is vague where it should be specific, technical where it should be human, and ultimately either an artifact of rank incompetence or a failed gambit.

Let's analyze it.

Here is the link to the original as it appeared on August 5th, 2021. A Thursday. This was intended to be an applauded topic of policy discussion, not a closeted Friday-night news dump. Apple wanted you to know about this.
Now, the content.
https://t.co/lMk3Uv2Ozg

Here we establish Apple is operating under a theme of protecting children from abuse – an extremely serious subject of immense suffering, social ostracization, and legal penalty. It demands immense precision. The following topics are packaged together for the same news cycle. https://t.co/vMlX8hOK3z

The first, and most important, example: Apple claims to have developed machine learning that detects "sensitive content" in the context of child abuse, without prior knowledge or human input.
A machine is deciding whether a picture is sensitive. What does "sensitive" mean? https://t.co/XAzSxrfSNl
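
To make the vagueness concrete, here is a minimal sketch of what such an on-device check boils down to. Apple published nothing at this level of detail, so every name, score, and threshold below is a hypothetical stand-in; that opacity is exactly the problem.

```python
# Hypothetical sketch of an on-device "sensitive content" check.
# Apple never published this level of detail; every name and number
# here is an assumption, which is the point being criticized.

from dataclasses import dataclass

@dataclass
class Classification:
    score: float      # opaque model output in [0, 1]
    sensitive: bool   # the only verdict the feature acts on

SENSITIVITY_THRESHOLD = 0.5  # who chose this? what does it mean?

def run_model(image_bytes: bytes) -> float:
    # Stand-in for the real model (unknown architecture, unknown
    # training data); returns a fake score for illustration.
    return 0.42

def classify(image_bytes: bytes) -> Classification:
    score = run_model(image_bytes)
    return Classification(score=score, sensitive=score >= SENSITIVITY_THRESHOLD)
```

Whatever "sensitive" means in practice is defined entirely by that model and that threshold, neither of which was explained to anyone.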

Immediately after this revelation about heuristically detecting child abuse, they say a machine will scan your pictures for child abuse and report you to the police. Are these technologies connected? No need to be specific here!
Also, "children" will be ratted on to their parents https://t.co/o0XVhLIt8x

"Child" has multiple definitions.
Biologically, it's before adolescence.
Legally, and in the context of child sexual abuse, it's anyone under the age of 18. That's the context we're operating under.

This literally says Apple will detect teenagers in sexual situations and alert someone. https://t.co/vJkIUfOP4k

(Intermission as I attend to some cooking: Apple should fire everyone involved at this point. What we've seen so far is already customer-endangering drivel. But we've got so much longer to go in this nightmare of English text.)

They follow this by highlighting that your iPhone will know what's sexually explicit, and rat on you. They say it applies to children (whatever that means). The next section gets into the iPhone you bought running Apple's code to report you to the police for committing indefensible crimes! https://t.co/7CFTY5d7c3

Here we get to the real failure. Apple says they will detect known CSAM images. What does that mean? Consumers will think: "Nobody keeps every child abuse image ever discovered; that would be illegal and weird. Why would anyone be allowed to have this? It must just be guessing." https://t.co/ejh1JPY0MH

(Note: I know this abuse-photo database exists; I tweeted about it years ago. I'm speaking in the voice of consumers who have no idea what a mathematical hash is or how this works. I saw their tweets and questions. This confusion is real, and it's a severe deficit in Apple's public communication.) https://t.co/wfkdlFJEkQ
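
For anyone in that consumer camp, here is roughly the idea Apple failed to explain. The real system used a perceptual hash (NeuralHash) plus private set intersection; this sketch substitutes a plain SHA-256 over file bytes purely to illustrate the core concept: matching against fingerprints of *known* images from a clearinghouse database, not guessing at content. The database values below are fake placeholders.

```python
# Minimal sketch of "known CSAM" hash matching, for illustration only.
# Apple's actual system used a perceptual hash (NeuralHash) and private
# set intersection; a plain cryptographic hash stands in here to show
# the concept: fingerprints of known images, not content guessing.

import hashlib

# Fingerprints of known abuse images, as distributed by a clearinghouse
# such as NCMEC. (These values are fake placeholders.)
KNOWN_IMAGE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    # A real system uses a perceptual hash, so resizing or re-encoding
    # the image doesn't change its fingerprint. SHA-256 is a stand-in.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    # The system never "looks at" the photo; it only checks whether the
    # photo's fingerprint already appears in the known-image list.
    return fingerprint(image_bytes) in KNOWN_IMAGE_HASHES
```

That one paragraph of explanation is all it would have taken, and Apple never provided it in consumer-readable terms.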

An important precept in Communications Theory: under no possible endgame should the recipient come away fearful that they could be procedurally accused of sexually abusing their own children by a faceless global corporation, with no conceivable remedy.

Note: This thread isn't done, but I only have the energy to do this in spurts. And yes, I talk like a dork on the Internet for fun; there's nothing you can do to stop me.

Except the Apple Global Security kill-team, they can stop me no matter where I try to hide.

Mon Sep 06 08:53:13 +0000 2021