Book review: Liars and Outliers

Nov 04, 2012 14:19

Yesterday I finished reading a book that I've been hauling around for months now. It wasn't an easy read, but it was useful. It's called Liars and Outliers: Enabling the Trust That Society Needs to Thrive, by Bruce Schneier. Some years back I started following Bruce Schneier's blog; shortly thereafter I subscribed to his newsletter, and I've watched his TED talk. He lives in the Twin Cities, where I lived for 8 years, but I wasn't following him back then, though one of my Twin Cities friends says he's met him and that Bruce is cool. The subjects he tackles are generally the trade-offs in risk and cost between different levels of security. In doing this, he talks about the different things people do to break the system, whether they be political activists, hackers, terrorists, or scam artists. Then he talks about the ways people try to stop them, preserve the system, or avoid being taken advantage of.

I've always wished that he would talk more about the greatest danger to each of us, which is the people close to us. Although his principles can be extrapolated down to a scale that small, his main area of interest is larger: groups of people, and probably groups whose members don't know each other. He's gained some notoriety for speaking out against the US reaction to terror post-9/11. He coined the term "security theatre" to refer to elaborate procedures that give the appearance of heightened security without actually increasing it.

The book is dry. It's 250 pages long and is an excellent illustration of non-fiction writing style at work: he tells you what he's going to tell you, then he tells you, then he tells you what he told you. He'll introduce, describe, and illustrate an idea in one chapter, and then in the next chapter he'll bring it up again, summarize it, and apply it to the topic of the new chapter. I suppose all that repetition is good for something, but I found it hard to slog through.

Here are some ideas it taught me (or at least drilled into my head):
  • Dunbar's number - the number of people in relationships with you whom you can identify and mentally keep track of (generally around 150). It's your personal circle, your 'society', and includes family, lovers, friends, co-workers, neighbors, etc.
  • Red Queen Effect - the arms race of evolution where you have to keep improving just to stay at steady state
  • Deacon's Paradox - the trust required in pair-bonding, where you aren't with your partner constantly and have to believe that they will be faithful to you. To support this, you make a public declaration of your private intentions, enlisting the public to assist you in keeping your partner faithful
  • Prisoner's Dilemma - each of two people has a large incentive to betray the other and only a small incentive to cooperate, with no way of enforcing or predicting the other's cooperation. However, if they both opt to betray, they both lose. So how often will they both cooperate and settle for the smaller reward, versus both betraying and getting nothing? And how often does one try to cooperate while the other betrays?
  • Principal-agent problem - also known as a conflict of interest. It's putting the fox in charge of the hen house.
  • Tragedy of the Commons - each person's personal advantage is to take as much as possible of a public resource, but if everyone does that, then the resource will be ruined for all
  • Hawk-Dove Game - a simple simulation where you imagine that some agents in the game are hawks, who attack and kill anyone they encounter, and some are doves, who cooperate with anyone they encounter (and if they encounter other doves, they reproduce). If there are lots of hawks, their chances of encountering another hawk and being killed go up. If there are lots of doves, a few hawks can go their entire lives without ever having to fight another hawk. Various factors get introduced into the simulation, like the speed and visibility of hawks and doves, the reproduction rate, cooperation between doves to fight off hawks, etc. (A rough sketch of the basic dynamic appears after this list.)
  • Ultimatum game, Dictator game, Trust game, Public Goods game - all variations on a simple Prisoner's Dilemma-style experiment where two people are held in separate rooms and one is given some money that they can choose (or not) to share with the other. In an ultimatum game, the first person chooses the division and the second person decides whether to accept or reject it; if they reject, neither gets any money. In a dictator game, the first person divides and the second person is not allowed to reject (so the first has no incentive to share at all, yet most do anyway). There are a lot of variations on this.
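
For the curious, here's a minimal sketch of that Hawk-Dove dynamic in Python. It's not the simulation from the book, just the textbook replicator-dynamics version; the resource value V, the fight cost C, the starting hawk fraction, and the learning rate are all numbers I've assumed for illustration.

    # Toy Hawk-Dove simulation (not the book's model): replicator dynamics
    # over a well-mixed population. V, C, the starting hawk fraction, and
    # the learning rate are assumed values chosen for illustration.
    V, C = 2.0, 10.0   # value of the contested resource, cost of a hawk-vs-hawk fight (C > V)
    p = 0.05           # fraction of the population playing "hawk"

    for step in range(500):
        # Expected payoff of each strategy against the current population mix.
        hawk = p * (V - C) / 2 + (1 - p) * V      # fight another hawk, or take from a dove
        dove = p * 0.0         + (1 - p) * V / 2  # yield to hawks, share with other doves
        average = p * hawk + (1 - p) * dove

        # Strategies doing better than average grow; worse-than-average ones shrink.
        p += 0.1 * p * (hawk - average)
        p = min(max(p, 0.0), 1.0)

    print(f"hawk fraction after 500 rounds: {p:.3f} (theory predicts V/C = {V / C:.2f})")

The point of the toy version: when hawks are rare, being a hawk pays off, so they spread; once hawks are common enough that they mostly run into each other, the cost of fighting drags them back down, and the mix settles around V/C instead of either side taking over.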

I grew up repressed and utilitarian, with the expectation that other people would turn against me as soon as it was to their advantage to do so (Prisoner's Dilemma - always betray). I assumed that most people were like that, because most people I'd met were like that. They were either users, or they had no use for me. It was that simple. I still believe that's how things are, but the book went over the many factors that influence or incentivize people to harm others and defect from society's interest. It supports my therapist's assertion that some people are predisposed to take advantage and some aren't.

I'm still struggling to reconcile this with my experiences and the people I've talked to personally about their lives. There's always this dilemma I run into in talking with people about humanity - they tell me people are good and my low opinion is misplaced, but when I ask them about those whom they should have most been able to trust - parents, siblings, lovers, spouses, children - over and over it's the same tale of abandonment, abuse, and betrayal. There's the occasional standout good person, made remarkable by their rarity. My own family, which I regard as very supportive, is a case study of why I don't trust.

It was a useful book that left me with a lot to ponder.

books
