Book Thirteen
When Prophecy Fails by Leon Festinger, Henry W. Riecken and Stanley Schachter
You're a good person, right? Of course you are; I never doubted it for a moment. We all like to think we're good people - fair, honest, generous, all that. Very few people, if asked, would say, "Well, I'm a right bastard and I don't care who knows it!"
So imagine that you - a good person - do something bad. Genuinely bad. You cheat on your spouse. You lie to a friend. You steal from your boss. You commit an act that, if someone else did it, you would roundly condemn, heaping public shame and ignominy on the offender. What kind of heel, what kind of cad, what kind of a bastard would do such a thing?
Well, you, as it turns out.
Now you have a problem. The vision of you that you carry in your head - the good, honest, kind, humble (let's not forget humble) person - directly conflicts with the nasty, dishonest thing that you have just done. They're grossly dissonant views, and there is no room for both of them in your head. So what do you do?
Your first option is to reduce your opinion of yourself. Maybe you're not that good a person. Maybe you are a bit of a dick. Maybe, when it comes right down to it, you're just a jerk who knows how to hide it. That right there is some painful truth, and very few people are willing to face up to it.
So you turn to your other option: justify what you did. The spouse you cheated on? Well, maybe if they paid a little more attention to you, you wouldn't have to do it. The friend you lied to? Well, was he honest about that "business trip" that made him miss your annual Memorial Day Meatapalooza Barbecue? Hell, no. He was "in the hospital," visiting "his sick mother." As for work, well, if your boss actually paid you what you were worth, you wouldn't need to steal from the register.
You rationalize what just happened, which not only allows you to move on with your life but also paves the way for similar actions in the future, making it that much easier to cheat, lie, and steal the next time.
Welcome to cognitive dissonance.
The classical view of humankind was that we were, ultimately, rational animals - that if you show a person sufficient evidence, that person will alter his opinion accordingly. So, under that model, our Imaginary You (tm) would admit to your inherent badness when confronted with the evidence of your misdeeds.
In the 20th century, however, psychologists began to notice that this wasn't true at all. In fact, in a lot of cases the direct disconfirmation of a belief merely made that belief stronger. Show a smoker data on how dangerous cigarettes are, and she'll tell you that they help her relax, or that they only take off the bad years at the end. Show a climate change denier data on the warming of the planet, and you know who you'll hear from only minutes after the first snowfall of the season.
Humans, as it turned out, were a lot less rational than we had suspected. By being able to hold two thoughts in our minds that are mutually incompatible, we set ourselves up for mental disaster, and the only way out is to fool ourselves.
In the mid-1950s, the authors of this book were looking into this phenomenon, especially as it applied to groups and millennialism - the belief that the world is in imminent danger of ending. They looked at various historical examples: the early Christian church, which believed that Jesus' return was right around the corner; the Anabaptists of the 16th century; the followers of Sabbatai Zevi in the 17th century; and the Millerites of the 19th. They all believed that the end of the world was at hand, they all collected groups of followers who believed wholeheartedly that they were right, and they were all, without exception, wrong. Despite that, not only were they not swayed from their beliefs, they actually became more convinced that they were, ultimately, right.
What could account for such patently irrational behavior? Festinger and his partners believed they knew what it was, and set out five simple conditions under which the phenomenon could arise. In brief:
1. The believer must believe implicitly, and that belief must have an effect on how he or she behaves.
2. The believer must have committed himself or herself to the belief, performing actions that are difficult or impossible to undo. For example, giving away all of one's money, quitting one's job, etc.
3. The belief must be specific, related to the real world, and able to be proven unequivocally wrong.
4. Evidence disconfirming the belief must occur, must be undeniable, and must be recognized by the believer.
5. (and most important) The believer must have social support for his or her belief system.
Under these conditions, Festinger hypothesized, not only would a person persist in their belief, but they would become more convinced, and likely try to convert more followers. After all, if more people believe that you're right, then maybe you are.
But how to test it out? Their best cases, after all, were at least a hundred years gone, and time travel hadn't been invented yet. Fortunately, they got wind of a group of UFO believers who held that the earth was going to be ravaged by floods and that aliens would rescue the faithful to make them the new enlightened rulers of the species. Led by a woman out of Chicago who was receiving messages through automatic writing, this group held that the event would take place before dawn on December 21, 1954.
Knowing a good chance when they saw one, Festinger and his colleagues managed to infiltrate the group and observe its progress, attitudes, and beliefs up to, during, and after the event that never happened. In the book, they go through the timeline and touch on all the major players - names changed to protect the innocent, of course - watching to see if their hypothesis would hold. Would the media-shy Mrs. Keech do an about-face once the disaster didn't show? What would happen to people like Dr. Armstrong, who sacrificed his job and his good name to ensure that he would be picked up by the aliens? How would the group handle predictions that never came true, follow orders that never worked out, and rationalize this fundamentally irrational behavior?
The study does have some fairly glaring flaws, which the authors themselves point out in the epilogue. For one, they had barely enough time to get involved with the group, and gaining entry was a matter of brute force more than finesse. For another, it was almost impossible not to influence the group. Observers were taken as believers, and expected to act as such. Acting undercover, they couldn't record meetings or, in many cases, take notes until after the fact. Any meeting with the academics had to be carefully arranged so as not to blow their cover, and the long hours, erratic schedule and generally high tension of the group made being an academic double agent very difficult indeed.
Despite that, Festinger and his group present a textbook case of group cognitive dissonance that follows the pattern they expected it to. Believers who met all five criteria were much more likely to seek out new believers than the ones who, for example, were not with the group when the world didn't end.
Of course, I picked up the book because of the May 21, 2011 Rapture prediction by Harold Camping. He had the Rapture scheduled down to the minute and had attracted followers who met the initial criteria set out by Festinger more than fifty years ago. Sure enough, when the Big Day came and went, Camping and his followers kept to the script. They saw that the Rapture hadn't come, then revised their predictions and went out looking for people to convince.
More interesting, though, is how this can apply to other group dynamics. It can be applied to political parties, regional differences, racial differences, bigotry of every flavor and color. It can be connected to celebrity worship and religious fervor, to economic theories, institutional groupthink, and scientific biases. Almost any common belief that can gather a crowd is an open invitation to Festinger's five criteria. Lovers of organic food. Adherents of market capitalism, homeopathy, religions of every size and shape. The antivaxxers, conspiracy theorists, Democrats, Republicans, Tea Partiers, Klansmen, environmentalists, educators... The list is endless.
What slowly dawned on me the day after I originally wrote this review was what the Internet means for Point Five (the need for social support). Let's say it's 1956, and you have a favorite political candidate. For our purposes, let's call her, I dunno, Kara Whelan. You really believe she is a good candidate, and you've spent a good deal of time and energy supporting her. Maybe you've tried to convince friends and family - perhaps encountering resistance, maybe having a few arguments - donated money, or even worked on her campaign in the belief that she is smart and capable, thus fulfilling the first three of Festinger's requirements.
Then she says or does something that is breathtakingly stupid, thereby disconfirming your opinion of her. Point four. In the 1950s, it might have been harder to find people to commiserate with. In the book's case study, people who were away from the group when the flood didn't happen almost invariably gave up on their belief and went back to their lives. Being cut off, or having access only by phone, just wasn't enough to keep their belief supported. So, our 1956 voter might read the paper, think, "Holy cow, Kara Whelan is dumber than a box of dead ducklings," and have no one around to help fight against that realization.
But here in the 21st century, that kind of support is just a click away. You can go to the Kara Whelan website or supporters' forum and talk to dozens of people who are all busy rationalizing the boneheaded thing she just said and finding reasons why it actually makes her a stronger candidate. The Internet makes it easier to find support for whatever you believe, no matter how untethered to reality it may be, and it allows these beliefs to survive and propagate in a way that would have been unthinkable fifty years ago. Working together, your fellow supporters can elevate your belief and trash those who disagree, generating an internal logic that confirms your belief despite evidence to the contrary. If Mrs. Keech had had a website, this would have been a very different story.
So what does this do for us, other than make us skeptical of anything that more than five people believe at a time? Just that: it keeps us skeptical. When you know what to look for, you can figure out who is likely to be persuaded by reason and who is not. You know who is a valid source of information and who is not. You know who you want to trust, and who you do not.
Most importantly, it allows you to check yourself, to see if you're being as skeptical as you should be. None of us are exempt from this little psychological phenomenon, but we are all equipped with the ability to deal with it properly. Let Mrs. Keech and her UFO cult serve as an object lesson.
------------------------------------------------------
"When you stop and think of it, it seems rather cruel to drown all these people just to teach them a lesson, doesn't it? The way to teach people a lesson, or the way to educate people is to educate them slowly; you can't educate them with one big jolt. And it seems rather silly to drown people and hope to educate them in the astral life. It doesn't seem very logical, does it?"
"Fred Purden", in When Prophecy Fails