belief

May 03, 2017 20:22

I saw a lot of people post an oatmeal comic in the past couple of days. It's called "you're not going to believe what I'm about to tell you" and it makes a stab at discussing the so-called "backfire effect", the (well-known) tendency for a person, when told some fact that contradicts their beliefs, to disbelieve the fact and increase their belief in the contrary. In other words: the new fact backfires, and the person "resists" learning the fact.

The oatmeal comic calls this phenomenon "backwards and batshit-fucking-bonkers" and goes on to discuss a pile of neurological characterizations of people rejecting information they find unfavourable. The point the author is making is that we have this little irrational, defensive part of our brains that doesn't want to hear information we disagree with, and builds walls and fortresses of denial against it.

Now, I'm not going to say that doesn't happen. People do absolutely behave defensively and deny things they know, deep down, to be true. Further, there is a huge and well-documented list of similar human cognitive biases to familiarize ourselves with. It's definitely important to be discussing that aspect of ideological polarization.

However, something rubbed me the wrong way reading the oatmeal comic, and I think it's this: polarization (and confirmation bias in general) is not just a result of irrational cognitive biases. It is also something completely reasonable to do if you are willing to believe that someone might be lying to you.

And in many cases, that's not unlikely! We all know that people lie to us. We might not say "liar" in each case -- and it's surely a recipe for a miserable life to assume too many people are lying to you too often -- but from a very early age, most children figure out that lies are cheap and easy. We go through a gleeful, horrified phase of learning that anyone can lie -- and try it out ourselves! -- and then settle into a grim slog of maturation in which we learn how very pervasive the problem is.

We realize that adults lied to us about various things we were taught, that the media is literally financed by lying to sell us things, that people and companies lie about their intentions, abilities and actions, that think-tanks full of people are paid to lie all day, that politicians lie to acquire power, that histories are written by the victors, that countries manufacture causes for wars. That layer upon layer of lies permeates our world. That is just how it is, and we all arrive at adulthood reasonably aware that, when Person X tells us Fact Y, there is a reasonable chance that X is lying and Y is false.

That's not the amygdala talking. That's not emotional defensiveness talking. It's cold, dispassionate reasoning based on an awareness of misinformation: of how cheap a given bit of misinformation is to produce, balanced against the set of interests that might benefit from you believing it. As much as one might dislike lies -- and to be clear, I much prefer people simply tell the truth! -- it's disingenuous to paint skeptical reception of presented facts as solely the result of cognitive bias. It's also cognition about epistemology working exactly as it should.
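To make that concrete, here's a tiny back-of-the-envelope sketch (my own toy model and toy numbers, nothing more) of what a prior probability of being lied to does to testimony. Assume an honest source asserts Y only when Y is true, and a lying source asserts Y no matter what; p_lying is your prior chance of facing the latter.

```python
# Toy sketch (invented numbers): how much should hearing "Y is true" from
# source X move your belief in Y, if you think X might simply be lying?
#
# Model: an honest X asserts Y only when Y is true; a lying X asserts Y
# regardless of the truth. By Bayes' rule,
#   P(Y | X asserts Y) = P(Y) / (P(Y) + p_lying * (1 - P(Y)))

def posterior(prior_y, p_lying):
    """Belief in Y after X asserts it, given a prior chance that X lies."""
    return prior_y / (prior_y + p_lying * (1.0 - prior_y))

for p_lying in (0.01, 0.25, 0.50, 0.90):
    print(f"P(lying) = {p_lying:.2f}:  P(Y) goes 0.50 -> {posterior(0.5, p_lying):.2f}")
```

With a trusted source the assertion is nearly decisive (0.50 goes to 0.99); with a source you half-suspect of lying, it barely moves you (0.50 goes to 0.67). Nothing defensive about that; it's just the arithmetic of testimony.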

I've little more to add of my own, so I'll just present a chunk of a book that spells out the relationship between "odds of X being true" and "odds of X being a lie" in more detail: E.T. Jaynes' Probability Theory: The Logic of Science.

(Note: I am not some latter-day positivist à la lesswrong; I enjoyed this book but also feel it's unrealistically reductionist, mechanistic and optimistic. But it is charming and lucid!)

[...] in practice we find that this convergence of opinions usually happens for small children; for adults it happens sometimes but not always. For example, new experimental evidence does cause scientists to come into closer agreement with each other about the explanation of a phenomenon.

Then it might be thought (and for some it is an article of faith in democracy) that open discussion of public issues would tend to bring about a general consensus on them. On the contrary, we observe repeatedly that when some controversial issue has been discussed vigorously for a few years, society becomes polarized into opposite extreme camps; it is almost impossible to find anyone who retains a moderate view. The Dreyfus affair in France, which tore the nation apart for 20 years, is one of the most thoroughly documented examples of this. Today, such issues as nuclear power, abortion, criminal justice, etc., are following the same course. New information given simultaneously to different people may cause a convergence of views; but it may equally cause a divergence.

This divergence phenomenon is observed also in relatively well-controlled psychological experiments. Some have concluded that people reason in a basically irrational way: prejudices seem to be strengthened by new information which ought to have the opposite effect.

But now, in view of the above ESP example [in which interpretations diverged], we wonder whether probability theory might also account for this divergence and indicate that people may be, after all, thinking in a reasonably rational, Bayesian way (i.e. in a way consistent with their prior information and prior beliefs). The key to the ESP example is that our new information is not

S = fully adequate precautions against error or deception were taken, and Mrs. Stewart did in fact deliver that phenomenal performance.

It is that some ESP researcher has claimed that S is true. But if our prior probability for S is lower than our prior probability that we are being deceived, hearing this claim has the opposite effect on our state of belief from what the claimant intended.

The same is true in science and politics; the new information a scientist gets is not that an experiment did in fact yield this result, with adequate protection against error. It is that some colleague claimed that it did. The information we get from the TV evening news is not that a certain event actually happened a certain way; it is that some news reporter has claimed that it did.

Scientists can reach agreement quickly because we trust our experimental colleagues to have high standards of intellectual honesty and sharp perception to detect possible sources of error. And this belief is justified because, after all, hundreds of new experiments are reported every month, but only about once in a decade is an experiment reported that turns out later to have been wrong. So our prior probability for deception is very low; like trusting children, we believe what experimentalists tell us [Note: this book was written before the current crisis in reproducibility / academic fraud].

In politics, we have a very different situation. Not only do we doubt a politician's promises, few people believe that news reporters deal truthfully and objectively with economic, social, or political topics. We are convinced that virtually all news reporting is selective and distorted, designed not to report the facts, but to indoctrinate us into the reporter's socio-political views. And this belief is justified abundantly by the internal evidence in the reporter's own product -- every choice of words and inflection of voice shifting the bias invariably in the same direction.

Not only in political speeches and news reporting, but wherever we seek for information on political matters, we run up against this same obstacle; we cannot trust anyone to tell us the truth, because we perceive that everyone who wants to talk about it is motivated either by self-interest or ideology. In political matters, whatever the source of information, our prior probability for deception is always very high.
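To see the divergence Jaynes describes in miniature, here's one more toy sketch (again, numbers invented purely for illustration): two listeners hear the same source claim "S is true" and both update by Bayes' rule; they differ only in their model of the source. Alice thinks the source is mostly honest; Bob thinks it pushes this sort of claim hardest precisely when the claim is false. Same prior, same evidence, opposite updates.

```python
# Toy sketch of the divergence (invented numbers): two listeners hear the
# same claim and both apply Bayes' rule, but with different models of the
# source's behaviour.

def update(prior_s, p_claim_if_true, p_claim_if_false):
    """Posterior P(S) after the source asserts S."""
    num = p_claim_if_true * prior_s
    return num / (num + p_claim_if_false * (1.0 - prior_s))

prior = 0.5  # both listeners start undecided about S

# Alice: the source is mostly honest, and rarely asserts S when it's false.
alice = update(prior, p_claim_if_true=0.9, p_claim_if_false=0.05)

# Bob: the source is a motivated liar that pushes S hardest when S is false.
bob = update(prior, p_claim_if_true=0.6, p_claim_if_false=0.9)

print(f"Alice: 0.50 -> {alice:.2f}")  # rises to about 0.95
print(f"Bob:   0.50 -> {bob:.2f}")    # falls to 0.40
```

Bob isn't malfunctioning; his posterior is exactly what his prior about the source demands. The "backfire" lives entirely in the P(claim | S is false) term.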

This entry was originally posted at http://graydon2.dreamwidth.org/249109.html. Please comment there using OpenID.