Adrian Short
@adrianshort
Fri Jan 24 14:50:31 +0000 2020

So @metpoliceuk are launching live #FacialRecognition to supposedly catch people wanted for serious crimes. This is a bad idea in principle: People should be free to walk down the street without being identified. But let's do the math: Does it even work? #LFR @bbw1984 @libertyhq

Let's start with some assumptions: The police say they're only looking for people wanted for serious crimes. So maybe that's 1 in 1,000 people. And let's assume the system is pretty accurate: when it scans someone the police are genuinely looking for, it correctly flags them as a suspect 90% of the time. #LFR

And when the system scans someone who *isn't* a suspect, it correctly waves them through 90% of the time too. Sounds good. So the people the system flags as suspects get stopped by the police, who search and question them. #LFR
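To pin those two assumptions down in standard terms (the labels here are mine, not the thread's):

```python
# The thread's assumptions, stated precisely (labels are mine, not the author's)
PREVALENCE  = 1 / 1000  # fraction of passers-by who are actually wanted
SENSITIVITY = 0.90      # P(flagged | wanted): a real suspect is correctly flagged
SPECIFICITY = 0.90      # P(ignored | not wanted): an innocent passer-by is correctly ignored
```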

We might assume that a system that's 90% accurate means most of the people who get stopped will be genuine suspects: the police will spend most of their time questioning people they're actually looking for and will only rarely stop someone who isn't wanted by mistake. #LFR

Here's what would happen: 99.11% of the people stopped under a system like this would be stopped in error. Scan a crowd of 10,000 containing 10 genuine suspects and the system flags 9 of them, but also 999 of the 9,990 people the police aren't looking for. 1 in 10 of the suspects evades detection too. #LFR https://t.co/bWMyl7dQeh
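You can check that arithmetic yourself. A minimal sketch in Python using the thread's assumed rates and a notional crowd of 10,000 (the crowd size and function name are just for illustration):

```python
def stop_outcomes(prevalence, sensitivity, specificity, crowd=10_000):
    """Expected outcomes when a crowd is scanned by an imperfect matcher.

    prevalence  - fraction of the crowd actually wanted
    sensitivity - P(flagged | wanted), the true positive rate
    specificity - P(ignored | not wanted), the true negative rate
    """
    wanted = crowd * prevalence
    innocent = crowd - wanted
    true_stops = wanted * sensitivity           # suspects correctly flagged
    false_stops = innocent * (1 - specificity)  # innocents wrongly flagged
    missed = wanted * (1 - sensitivity)         # suspects who walk straight past
    error_rate = false_stops / (true_stops + false_stops)
    return true_stops, false_stops, missed, error_rate

tp, fp, missed, err = stop_outcomes(1 / 1000, 0.90, 0.90)
print(f"{tp:.0f} genuine stops, {fp:.0f} wrongful stops, "
      f"{missed:.0f} missed, {err:.2%} of stops in error")
# -> 9 genuine stops, 999 wrongful stops, 1 missed, 99.11% of stops in error
```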

This seems counter-intuitive: How can a system that's supposedly very accurate - making the right call in 90% of individual cases - produce so many errors? Why would so many entirely innocent people be having their lives disrupted by being stopped by police? #LFR

Assuming a system like this would work far better than it actually does is the "base rate fallacy". People get this wrong because they fail to grasp what happens when you apply an imperfect test to a population in which the thing you're testing for is very rare. #LFR
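Formally this is just Bayes' theorem: the chance that a flagged person is really a suspect is P(flagged | suspect) × P(suspect) divided by the overall chance of being flagged at all. A sketch with the same assumed numbers:

```python
# P(suspect | flagged) via Bayes' theorem, using the thread's assumed rates
p_suspect = 1 / 1000        # base rate: 1 in 1,000 passers-by are wanted
p_flag_if_suspect = 0.90    # sensitivity
p_flag_if_innocent = 0.10   # 1 - specificity (the false positive rate)

p_flagged = p_flag_if_suspect * p_suspect + p_flag_if_innocent * (1 - p_suspect)
p_suspect_if_flagged = p_flag_if_suspect * p_suspect / p_flagged
print(f"{p_suspect_if_flagged:.2%}")  # -> 0.89%, i.e. ~99.11% of stops are errors
```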

Low prevalence is a killer for screening tests like this unless they're near-perfectly accurate. Even 90% accuracy isn't good enough: if you're only looking for one person in a thousand, nearly everyone you scan *isn't* someone you're looking for, so even a small false positive rate swamps the handful of genuine matches. #LFR
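To see how hard the base rate bites, hold the accuracy fixed at 90% on both sides and vary only how rare the target is (the prevalence values are picked just to illustrate):

```python
# Fixed 90% sensitivity and specificity; only the base rate changes
for prevalence in (1/2, 1/10, 1/100, 1/1000, 1/10_000):
    ppv = 0.9 * prevalence / (0.9 * prevalence + 0.1 * (1 - prevalence))
    print(f"1 in {round(1 / prevalence):>6}: {ppv:.2%} of stops are genuine")
# 1 in 2: 90.00% ... 1 in 100: 8.33% ... 1 in 1000: 0.89% ... 1 in 10000: 0.09%
```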

You want to drop that 90% accurate facial recognition system down to 70%? Now you're stopping 428 people in error for every genuine suspect, while letting 3 in 10 of the real suspects evade detection too. 99.77% of people stopped will be stopped in error. #LFR https://t.co/33ty6Xv4TV
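The same back-of-envelope arithmetic with accuracy dropped to 70% on both sides reproduces those figures:

```python
# A crowd of 10,000 with 10 genuine suspects, scanned at 70% accuracy
wanted, innocent = 10, 9_990
true_stops = wanted * 0.70      # 7 suspects correctly flagged
false_stops = innocent * 0.30   # 2,997 innocents wrongly flagged
missed = wanted * 0.30          # 3 suspects walk straight past
print(f"{false_stops / true_stops:.0f} wrongful stops per genuine suspect")  # -> 428
print(f"{false_stops / (true_stops + false_stops):.2%} of stops in error")   # -> 99.77%
```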

I'm no expert but this strikes me as a fucking bad idea. It's also literally the kind of thinking and system that a police state would implement, because people who support "catch the bad guys at all costs" know that those costs are always borne by others. #LFR #FacialRecognition