Dec 07, 2010 09:15
I'm reading "SuperFreakonomics." Those books are MADE for me because I'm borderline-obsessed with that kind of detail-oriented analysis and the unusual relationships between things that the authors have turned up.
One of the chapters is about prostitution. In the course of discussing it, the authors made a statement that was, to borrow Oprah's term, an a-ha moment. It was like choirs of angels descending, and suddenly something I'd known in a vague way was achingly clear. Levitt was able to clarify in a few short sentences why the war on drugs will never succeed.
When attempting to regulate illegal sales, whether it be drugs or guns or sex, the authorities always go after the seller, not the buyer. Why? Because there are far fewer sellers, and they're easier to demonize. But that approach dooms any attempt to curtail illegal sales.
You arrest a supplier, you remove him from the market. Supply goes down. Price goes up. Which entices more sellers to enter the market.
Lather, rinse, repeat. Reducing the supply does not work; all it does is drive up the price and encourage more people to sell. Only removing the demand works. Prostitutes now make vastly less money than they did a hundred years ago. Why? Because it's about a million times easier now for men to get sex for free than it was in the Victorian era.
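Just for fun, here's a tiny toy simulation of that feedback loop in Python. Every number in it (starting sellers, demand, arrest rate, how strongly price pulls in new sellers) is invented purely to show the mechanism; none of it comes from the book.

    # Toy model of the enforcement loop: arrests shrink the supply,
    # the higher price lures new sellers in, and the market never empties out.
    # All numbers are made up for illustration.

    sellers = 1000          # sellers in the market at the start
    demand = 10_000         # buyers per period, held constant (that's the point)
    arrest_rate = 0.20      # fraction of sellers removed each period
    entry_per_dollar = 2.0  # new sellers attracted per dollar of street price

    for period in range(8):
        price = demand / sellers                      # fewer sellers -> higher price
        arrested = int(sellers * arrest_rate)         # enforcement removes sellers
        new_entrants = int(price * entry_per_dollar)  # high prices pull new ones in
        sellers = sellers - arrested + new_entrants
        print(f"period {period}: price ${price:.2f}, {arrested} arrested, "
              f"{new_entrants} entered, {sellers} sellers remain")

With those made-up numbers the market shrinks for a while and then settles at a higher price with plenty of sellers still in it. As long as demand stays put, supply never goes to zero.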
He just spelled out something else amazing that ought to be self-evident but isn't. He's talking about the difficulties in locating terrorists before they act. He discusses an expert in bank fraud detection who's developed some very accurate algorithms for sniffing out suspicious activity, a financial approach that could be applied to terrorism. But this approach is almost impossible to implement. Why?
If his algorithm is 99% accurate, that sounds really good, right? We ought to be able to use that to find some terrorists. If there are 500 of them in the UK, it'll find 495. Super! Let's do it!
Whoa, wait a minute. A 99% accuracy rate has a flip side: a 1% rate of false positives. There are 50 million people in the UK who have nothing to do with terrorism. The algorithm would identify a whopping 500,000 of them as possible terrorists. The false positives totally overwhelm the system. There's no way you could comb through all those people to find the 495 who are actually terrorists. Plus half a million people would get really pissed off. The same problem exists in pushes for across-the-board medical screenings: the rates of false positives eat up so many resources that the system is nearly unworkable.
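Here's that arithmetic as a quick back-of-the-envelope check in Python, using the same round numbers (50 million innocent people, 500 terrorists, 99% accuracy):

    # Back-of-the-envelope math for the screening example. The 50 million,
    # 500, and 99% figures are the rough round numbers above, not real stats.

    innocents = 50_000_000   # people with nothing to do with terrorism
    terrorists = 500         # actual terrorists hiding among them
    accuracy = 0.99          # the algorithm is right 99% of the time

    caught = terrorists * accuracy             # real terrorists correctly flagged
    false_alarms = innocents * (1 - accuracy)  # innocent people wrongly flagged

    flagged = caught + false_alarms
    precision = caught / flagged               # odds a flagged person is the real thing

    print(f"terrorists caught: {caught:,.0f}")
    print(f"innocents flagged: {false_alarms:,.0f}")
    print(f"chance a flagged person is actually a terrorist: {precision:.2%}")

That's 495 real hits buried under half a million false alarms, which works out to roughly a 1-in-1,000 chance that any given flagged person is actually a terrorist.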
I'm only like a tenth of the way through this book. *devours*
books: reading