Cognitive bias

You’re not as rational as you think

Psychologists show how we react in the face of uncomfortable truths. The results are shocking.

by Lee McIntyre

To say that facts are less important than feelings in shaping our beliefs about empirical matters seems new, at least in American politics. In the past we have faced serious challenges — even to the notion of truth itself — but never before have such challenges been so openly embraced as a strategy for the political subordination of reality, which is how I define “post-truth.” Here, “post” is meant to indicate not that we are “past” truth in a temporal sense (as in “postwar”) but that truth has been eclipsed by less important matters like ideology.

One of the deepest roots of post-truth has been with us the longest, for it has been wired into our brains over the history of human evolution: cognitive bias. Psychologists for decades have been performing experiments that show that we are not quite as rational as we think. Some of this work bears directly on how we react in the face of unexpected or uncomfortable truths.

This article is adapted from Lee McIntyre’s book “Post-Truth,” published by the MIT Press.

A central concept of human psychology is that we strive to avoid psychic discomfort. It is not a pleasant thing to think badly of oneself. Some psychologists call this “ego defense” (after Freudian theory), but whether we frame it within this paradigm or not, the concept is clear. It just feels better for us to think that we are smart, well-informed, capable people than that we are not. What happens when we are confronted with information that suggests that something we believe is untrue? It creates psychological tension. How could I be an intelligent person yet believe a falsehood? Only the strongest egos can stand up very long under a withering assault of self-criticism: “What a fool I was! The answer was right there in front of me the whole time, but I never bothered to look. I must be an idiot.” So the tension is often resolved by changing one of one’s beliefs.

It matters a great deal, however, which beliefs change. One would like to think that it should always be the belief that was shown to be mistaken. If we are wrong about a question of empirical reality — and we are finally confronted by the evidence — it would seem easiest to bring our beliefs back into harmony by changing the one that we now have good reason to doubt. But this is not always what happens. There are many ways to adjust a belief set, some rational and some not.

3 classic findings from social psychology

In 1957, Leon Festinger published his pioneering book “A Theory of Cognitive Dissonance,” in which he offered the idea that we seek harmony between our beliefs, attitudes, and behavior, and experience psychic discomfort when they are out of balance. In seeking resolution, our primary goal is to preserve our sense of self-value.

In a typical experiment, Festinger gave subjects an extremely boring task, for which some were paid $1 and some were paid $20. After completing the task, subjects were requested to tell the person who would perform the task after them that it was enjoyable. Festinger found that subjects who had been paid $1 reported the task to be much more enjoyable than those who had been paid $20. Why? Because their ego was at stake. What kind of person would do a meaningless, useless task for just a dollar unless it was actually enjoyable? To reduce the dissonance, they altered their belief that the task had been boring (whereas those who were paid $20 were under no illusion as to why they had done it). In another experiment, Festinger had subjects hold protest signs for causes they did not actually believe in. Surprise! After doing so, subjects began to feel that the cause was actually a bit more worthy than they had initially thought.

To one degree or another, all of us suffer from cognitive dissonance.

But what happens when we have much more invested than just performing a boring task or holding a sign? What if we have taken a public stand on something, or even devoted our life to it, only to find out later that we’ve been duped? Festinger analyzed just this phenomenon in the book “When Prophecy Fails,” in which he reported on the activities of a group called The Seekers, who believed that their leader, Dorothy Martin, could transcribe messages from space aliens who were coming to rescue them before the world ended on December 21, 1954. After selling all of their possessions, they waited on top of a mountain, only to find that the aliens never showed up (and of course the world never ended). The cognitive dissonance must have been tremendous. How did they resolve it? Dorothy Martin soon greeted them with a new message: Their faith and prayers had been so powerful that the aliens had decided to call off their plans. The Seekers had saved the world!

From the outside, it is easy to dismiss these as the beliefs of gullible fools, yet in further experimental work by Festinger and others it was shown that — to one degree or another — all of us suffer from cognitive dissonance. When we join a health club that is too far away, we may justify the purchase by telling our friends that the workouts are so intense we only need to go once a week; when we fail to get the grade we’d like in organic chemistry, we tell ourselves that we didn’t really want to go to medical school anyway. But there is another aspect of cognitive dissonance that should not be underestimated, which is that such “irrational” tendencies tend to be reinforced when we are surrounded by others who believe the same thing we do. If just one person had believed in the doomsday prophecy, perhaps he or she would have committed suicide or gone into hiding. But when a mistaken belief is shared by others, sometimes even the most incredible errors can be rationalized.

In his path-breaking 1955 paper “Opinions and Social Pressure,” Solomon Asch demonstrated that there is a social aspect to belief, such that we may discount even the evidence of our own senses if we think that our beliefs are not in harmony with those around us. In short, peer pressure works. Just as we seek to have harmony within our own beliefs, we also seek harmony with the beliefs of those around us.

In his experiment, Asch assembled seven to nine subjects, all but one of whom were “confederates” (i.e., they were “in on” the deception that would occur in the experiment). The one who was not “in on it” was the sole experimental subject, who was always placed at the last seat at the table. The experiment involved showing the subjects a card with a line on it, then another card with three lines on it, one of which was identical in length to the line on the first card. The other two lines on the second card were “substantially different” in length. The experimenter then went around the group and asked each subject to report aloud which of the three lines on the second card was equal in length to the line on the first card. For the first few trials, the confederates reported accurately and the experimental subject of course agreed with them. But then things got interesting. The confederates began to report, unanimously, that one of the obviously false choices was in fact equal in length to the line on the first card. By the time the question came to the experimental subject, there was obvious psychic tension. As Asch describes it:

He is placed in a position in which, while he is actually giving the correct answers, he finds himself unexpectedly in a minority of one, opposed by a unanimous and arbitrary majority with respect to a clear and simple fact. Upon him we have brought to bear two opposed forces: the evidence of his senses and the unanimous opinion of a group of his peers.

Before announcing their answer, virtually all of the experimental subjects looked surprised, even incredulous. But then a funny thing happened: On 37 percent of the trials, subjects yielded to the majority opinion, discounting what they could see right in front of them in order to remain in conformity with the group.

Another piece of key experimental work on human irrationality was done in 1960 by Peter Cathcart Wason. In his paper “On the Failure to Eliminate Hypotheses in a Conceptual Task,” Wason took the first in a number of steps to identify logical and other conceptual mistakes that humans routinely make in reasoning. In this first paper, he introduced (and later named) an idea that nearly everyone in the post-truth debate has likely heard of: confirmation bias.

Wason’s experimental design was elegant. He gave 29 college students a cognitive task whereby they would be called on to “discover a rule” based on empirical evidence. Wason presented the subjects with a three-number series such as 2, 4, 6, and said that their task would be to try to discover the rule that had been used in generating it. Subjects were requested to write down their own set of three numbers, after which the experimenter would say whether their numbers conformed to the rule or not. Subjects could repeat this task as many times as they wished, but were instructed to try to discover the rule in as few trials as possible. No restrictions were placed on the sorts of numbers that could be proposed. When they felt ready, subjects could propose their rule.

The results were shocking. Out of 29 very intelligent subjects, only six of them proposed the correct rule without any previous incorrect guesses. Thirteen proposed one incorrect rule and nine proposed two or more incorrect rules. One subject was unable to propose any rule at all. What happened?

As Wason reports, the subjects who failed at the task seemed unwilling to propose any set of numbers that tested the accuracy of their hypothesized rule and instead proposed only those that would confirm it. For instance, given the series 2, 4, 6, many subjects first wrote down 8, 10, 12, and were told “yes, this follows the rule.” But then some just kept going with even numbers in ascending order by two. Rather than use their chance to see whether their intuitive rule of “increase by intervals of two” was incorrect, they continued to propose only confirming instances. When these subjects announced their rule they were shocked to learn that it was incorrect, even though they had never tested it with any disconfirming instances.

When a mistaken belief is shared by others, sometimes even the most incredible errors can be rationalized.

After this, 13 subjects began to test their hypotheses and eventually arrived at the correct answer, which was “any three numbers in ascending order.” Once they had broken out of their “confirming” mindset, they were willing to entertain the idea that there might be more than one way to get the original series of numbers. This cannot explain, however, the nine subjects who gave two or more incorrect rules, for they were given ample evidence that their proposal was incorrect, but still could not find the right answer. Why didn’t they guess 9, 7, 5? Here Wason speculates that “they might not have known how to attempt to falsify a rule by themselves; or they might have known how to do it, but still found it simpler, more certain or more reassuring to get a straight answer from the experimenter.” In other words, at this point their cognitive bias had a firm hold on them, and they could only flail for the right answer.
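
To see why the confirming-only strategy is doomed, consider a small simulation of the 2, 4, 6 task (a sketch in Python written for this adaptation, not taken from Wason’s paper; the function names and sample probes are my own). Every “+2” probe is consistent with the hidden rule, so a subject who proposes only such triples never receives the disconfirming feedback that would reveal the rule is broader than “increase by intervals of two.”

```python
# A toy version of Wason's 2-4-6 task (illustrative sketch only).
# The experimenter's hidden rule is "any three numbers in ascending order."

def fits_hidden_rule(triple):
    a, b, c = triple
    return a < b < c

# Probes a subject might offer while trying only to CONFIRM the guess
# "numbers increasing by two": 8-10-12, 14-16-18, and so on.
confirming_probes = [(8, 10, 12), (14, 16, 18), (20, 22, 24), (100, 102, 104)]

# Probes chosen to try to FALSIFY that guess.
falsifying_probes = [(1, 2, 3), (10, 20, 35), (9, 7, 5), (2, 2, 2)]

print("Confirming-only strategy:")
for probe in confirming_probes:
    print(f"  {probe} -> follows the rule? {fits_hidden_rule(probe)}")
# Every answer comes back True, so the too-narrow guess is never challenged.

print("Falsifying strategy:")
for probe in falsifying_probes:
    print(f"  {probe} -> follows the rule? {fits_hidden_rule(probe)}")
# (1, 2, 3) and (10, 20, 35) also come back True even though they do not
# increase by two, which is exactly the feedback needed to abandon the narrow
# guess, while (9, 7, 5) and (2, 2, 2) come back False, marking the boundary.
```

The asymmetry is the whole point: no number of confirming probes can distinguish “increase by two” from “any ascending triple,” while a single well-chosen disconfirming probe can.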

All three of these experimental results — (1) cognitive dissonance, (2) social conformity, and (3) confirmation bias — are obviously relevant to post-truth, whereby so many people seem prone to form their beliefs outside the norms of reason and good standards of evidence, in favor of accommodating their own intuitions or those of their peers.

Yet post-truth did not arise in the 1950s or even the 1960s. It awaited the perfect storm of a few other factors like extreme partisan bias and social media “silos” that arose in the early 2000s. And in the meantime, further stunning evidence of cognitive bias — in particular the “backfire effect” and the “Dunning–Kruger effect,” both of which are rooted in the idea that what we hope to be true may color our perception of what actually is true — continued to come to light.

Implications for post-truth

In the past, perhaps our cognitive biases were ameliorated by our interactions with others. It is ironic to think that in today’s media deluge we could be more isolated from contrary opinion than our ancestors were, forced as they were to live and work among the other members of their tribe, village, or community and to interact with one another to get information. When we are talking to one another, we can’t help but be exposed to a diversity of views. And there is even empirical work that shows the value this can have for our reasoning.

In his book “Infotopia,” Cass Sunstein has discussed the idea that when individuals interact they can sometimes reach a result that would have eluded them if each had acted alone. Call this the “whole is more than the sum of its parts” effect. Sunstein calls it the “interactive group effect.”

When we open our ideas up to group scrutiny, this affords us the best chance of finding the right answer.

In one study, researchers brought a group of subjects together to solve a logic puzzle. It was a hard one, and few of the subjects could do it on their own. But when the problem was later turned over to a group to solve, an interesting thing happened. People began to question one another’s reasoning and think of things that were wrong with their hypotheses, to a degree they seemed incapable of doing with their own ideas. As a result, in a significant number of cases a group could solve the problem even when none of its members could do so alone. (It is important to note that this was not due to the “smartest person in the room” phenomenon, where one person figures it out and tells the group the answer. Nor was it the mere “wisdom of crowds” effect, which relies on passive majority opinion. The effect was found only when group members interacted with one another.)

For Sunstein, this is key. Groups outperform individuals. And interactive, deliberative groups outperform passive ones. When we open our ideas up to group scrutiny, this affords us the best chance of finding the right answer. And when we are looking for the truth, critical thinking, skepticism, and subjecting our ideas to the scrutiny of others work better than anything else.

Yet these days we have the luxury of choosing our own selective interactions. Whatever our political persuasion, we can live in a “news silo” if we care to. If we don’t like someone’s comments, we can unfriend him or hide him on Facebook. If we want to gorge on conspiracy theories, there is probably a radio station for us. These days more than ever, we can surround ourselves with people who already agree with us. And once we have done this, isn’t there going to be further pressure to trim our opinions to fit the group?

Solomon Asch’s work has already shown that this is possible. If we are liberal, we will probably feel uncomfortable if we agree with most of our friends on immigration, gay marriage, and taxes but are not so sure about gun control. If so, we will probably pay a social price that may alter our opinions. To the extent that this occurs not as a result of critical interaction but rather from a desire not to offend our friends, it is likely not a good thing. Call it the dark side of the interactive group effect, which any of us who has ever served on a jury can probably describe: we just feel more comfortable when our views are in step with those of our compatriots. But what happens when our compatriots are wrong? Whether liberal or conservative, none of us has a monopoly on the truth.

Still from one of Leon Festinger’s classic experiments on cognitive dissonance. (MIT Press)

I am not here suggesting that we embrace false equivalence, or that the truth probably lies somewhere between political ideologies. The halfway point between truth and error is still error. But I am suggesting that at some level all ideologies are an enemy of the process by which truth is discovered. Perhaps researchers are right that liberals have a greater “need for cognition” than conservatives, but that does not mean liberals should be smug or believe that their political instincts are a proxy for factual evidence. In the work of Festinger, Asch, and others, we can see the dangers of ideological conformity. The result is that we all have a built-in cognitive bias to agree with what others around us believe, even if the evidence before our eyes tells us otherwise. At some level we all value group acceptance, sometimes even over reality itself. But if we care about truth, we must fight against this. Why? Because cognitive biases are the perfect precursor for post-truth.

If we are already motivated to want to believe certain things, it doesn’t take much to tip us over to believing them, especially if others we care about already do so. Our inherent cognitive biases make us ripe for manipulation and exploitation by those who have an agenda to push, especially if they can discredit all other sources of information. Just as there is no escape from cognitive bias, a news silo is no defense against post-truth. For the danger is that at some level they are connected. We are all beholden to our sources of information. But we are especially vulnerable when they tell us exactly what we want to hear.

This article was originally published at the MIT Press Reader by Lee McIntyre of Boston University.
