Thursday, 22 August 2013

Bayesian limits to screening populations for rarities e.g. "being a terrorist"

Silicon Valley software engineer Ben Adida (36) certainly doesn't hesitate to grovel before his president in his recent blog post concerning the latest revelations of US state surveillance of the internet: ["...I’m no stubborn idealist... I know you cannot steer a ship as big as the United States as quickly as some would like. I know tough compromises are the inevitable path to progress... The responsibility you feel, the level of detail you understand, must make prior principles sometimes feel quaint. I cannot imagine what it’s like to be in your shoes..."] Yuk! The ship of state, with its all-powerful captain fully implied. Yuk, and yuk again, puke!

But Mr. Adida does make a good point about the unintuitive statistics of true and false positives in population screening programmes, be they medical or criminological. The explanation he offers on his blog is correct, but lacks detail. More seriously, for anyone who'd like to know more, it omits a couple of keywords needed to construct a quality search: "false positive" AND "Bayes". So I posted a wee comment mentioning those, plus a link to the best explanation I found. Perhaps Mr. Adida has just had enough of blog spam, but he seems to accept comments on his blog only very selectively, and mine was not among them. Ho hum!

Anyway, he's quite right to point out that even a good (say 99% accurate) test for a rare condition (e.g. being a terrorist) applied to a large, mostly innocent population will generate WAY MORE false positives than true positives. This has two bad effects: 1) innocent people are wrongly suspected and subjected to further unjustified intrusion and harassment; and 2) law enforcement time is wasted. You might suppose that intelligences sufficient to build data centres capable of archiving the whole internet would also be fully conversant with Bayes' Theorem. But with no meaningful oversight, how can we be sure?
Idiocy is common; and mission creep happens all the time. Just like the Hackney wide boys who can't resist trying out their guns over the back hedge once they've bought them, the temptation to mine all that data must be well-nigh irresistible--but wrong.
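
The base-rate argument above is easy to check with a few lines of arithmetic. A minimal sketch, assuming some purely illustrative numbers: a test that is 99% accurate both ways (sensitivity = specificity = 0.99) and a hypothetical prevalence of one terrorist per 100,000 people screened:

```python
# Sanity-checking the base-rate argument with Bayes' Theorem.
# All figures below are hypothetical, chosen only for illustration.

population = 1_000_000
prevalence = 1 / 100_000       # hypothetical base rate: 1 in 100,000
sensitivity = 0.99             # P(flagged | terrorist)
specificity = 0.99             # P(not flagged | innocent)

terrorists = population * prevalence       # 10 people
innocents = population - terrorists        # 999,990 people

true_positives = terrorists * sensitivity           # ~9.9 correctly flagged
false_positives = innocents * (1 - specificity)     # ~10,000 wrongly flagged

# Bayes' Theorem: P(terrorist | flagged)
posterior = true_positives / (true_positives + false_positives)

print(f"true positives:  {true_positives:.1f}")
print(f"false positives: {false_positives:.1f}")
print(f"P(terrorist | flagged) = {posterior:.2%}")
```

With these (made-up) numbers, a flagged person has well under a 1% chance of actually being a terrorist: the false positives outnumber the true positives by roughly a thousand to one, which is exactly the point about wasted police time and harassed innocents.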
