May 13, 2010 22:53
What makes us honest? The process of natural selection that honed our minds is supposedly one of cutthroat competition. We're quite obviously driven to succeed, but we often choose not to lie or steal or cheat even when we rationally expect no consequences. In the Freudian account, it's the superego that restrains the selfish id. Plato had similar thoughts. We're certainly all used to the metaphor of a devil whispering temptations in one ear and an angel trying to shout him down in the other.
Most people think that, in a situation where they could act dishonestly to get ahead without any fear of getting caught, they would manage to act honestly but would have to actively fight temptation to do so. A competing theory holds that honest people genuinely lack any feeling of temptation: they simply do what comes naturally, which also happens to be the morally right thing to do.
So what's going on in the brains of people when they cheat or act morally? fMRI lie detection has received quite a bit of funding and attention recently, and it's one of the more imminent examples of applied neuroscience research. Despite that, most fMRI lie detection studies share a potentially critical flaw: the experimenters must ask the participants to lie, rather than catching them in a moment of genuine dishonesty.
Greene devised a clever experimental design to test authentic dishonesty. Participants thought they were taking part in a study of paranormal abilities, and their task in the scanner was simply to guess the outcome of a coin flip. Half of the time, they recorded their guess ahead of time; the other half of the time, they simply told the experimenters afterward whether they had been right or wrong (under the guise of testing the effect of privacy on predictive ability).
Unless you put credence in the interpretation that paranormal abilities actually exist, some of the subjects turned out to be dirty rotten cheaters. A group of fourteen participants claimed a success rate above 69%, with a mean of 84%. If you're into statistics, the probability of that happening by chance is less than 0.001. In contrast, another group of fourteen scored 52% on the trials in which they reported their guesses after knowing the outcome. In other words, it's safe to assume that these people were being honest.
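To get a feel for where a p-value like that comes from, here's a quick sketch of the one-tailed binomial test behind such a claim. The trial count below is a hypothetical stand-in (the actual number of flips per participant isn't stated here); the point is that under honest guessing, the number of correct calls follows a Binomial(n, 0.5) distribution, and a 69%-or-better hit rate sits far out in its tail.

```python
from math import comb

# Hypothetical illustration: assume each participant made 100 coin-flip
# predictions (the real trial count isn't given in the post). Under honest
# reporting, correct calls are distributed Binomial(n, 0.5).
n = 100
threshold = 69  # the claimed success rate exceeded 69%

# One-tailed binomial test: probability of getting at least `threshold`
# correct calls out of n by pure luck.
p_value = sum(comb(n, k) for k in range(threshold, n + 1)) / 2**n
print(f"P(at least {threshold}/{n} correct by luck) = {p_value:.2e}")
```

With these assumed numbers the tail probability comes out around 10^-4, comfortably below the 0.001 cited, and it only shrinks as the assumed trial count grows.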
One way of examining the act of lying is to look at reaction times. The honest group showed no significant difference in reaction time between trials in which they made their predictions before the flip and trials in which they reported them afterward. The dishonest group likewise showed no difference between those two trial types when reporting a win, but they took longer to respond when indicating, after the flip, that they had lost.
More revealing, though, were the fMRI data. If honest behavior required the active suppression of a desire to cheat, you would expect to see increased activity in the cognitive control network: roughly, the dorsolateral prefrontal cortex, the anterior cingulate cortex, and the posterior parietal cortex. This network responds in all sorts of studies that ask participants to overcome some prepotent habit or desire. Honest subjects, though, displayed no such effect. Dishonest subjects, on the other hand, showed increased activity in exactly this network, and the more often subjects cheated, the stronger the effect.
It seems relatively safe to say that these results support the somewhat counterintuitive theory that good people are good because they are free of temptation, not because they succeed in quelling it. Certainly, this study opens up lots of really interesting questions. What's different about people who don't feel the allure of the easy buck? Can this experimental design tell us more about how to use fMRI for genuine lie detection? (Most current methods are no more reliable than polygraphy.) And, finally, if we can learn why people act dishonestly, can we come up with better ways of convincing them not to?
moral brain