It is the evening of the 20th of December, 1954. You believe the world will end later this night. Together with other “Seekers,” you are waiting anxiously for the UFO that will carry you to safety before the Earth is destroyed.
Now imagine you are presented with plain evidence that you are wrong: morning arrives, no spaceship is in sight, and the world hasn’t ended. What would you do?
You could stand up and confront the others: “Hey, the Earth is still here—it looks like we were wrong after all! Thank goodness for that.”
But you don’t. The group’s leader, Dorothy Martin, explains she has received a message from the planet Clarion. “The Earth has been saved because of our devout prayers,” she says. Full of joy, you believe her. You are more certain than ever you are right. You decide to go out and convince others to join the Seekers.
Holding two contradictory beliefs at once—“the world will end tonight” and “the world did not end”—produces an uncomfortable mental state that psychologists call cognitive dissonance. One way to resolve the tension is to rationalize: The world was spared because of our devotion.
The motivation to reduce cognitive dissonance drives much of our irrational behavior. We choose to deceive ourselves rather than admit we were wrong. Being able to recognize when we are in a state of dissonance can help us make better decisions and improve our relationships. Read on to learn more.
Making Better Decisions
Choice-supportive bias: We feel more positive about our choices after we have made them, magnifying the attractiveness of the option we chose and devaluing the options we rejected. For example, after hiring someone, we are convinced they were the best-qualified candidate all along.
The opposite is also true. When we want something but find we cannot have it, we devalue it. If our first-choice candidate takes a job somewhere else, we rationalize that she wouldn’t have been a good fit with the team anyway.
Confirmation bias: We seek information that confirms our beliefs and ignore information that contradicts them. Once we make up our mind, we are more likely to discount evidence to the contrary than to change our opinion. Confirmation bias is widespread, even among professionals for whom objectivity is essential: police officers, judges, scientists, doctors, therapists, and teachers.
Backfire effect: When our deeply held beliefs are challenged by contradictory evidence that we can’t ignore, our beliefs get stronger. This is why it is notoriously difficult to change someone’s mind by presenting them with “the facts.” The stronger the arguments, the more resistance you will encounter through the backfire effect.
Struggle effect: We feel more positive about goals or objects that we voluntarily struggle to achieve. For example, psychologist Elliot Aronson found that people identify more strongly with groups that have a grueling initiation ritual (e.g. fraternities, the military, some business organizations).
Intrinsic value effect: People identify strongly with self-determined goals that have no clear external reward. In the absence of extrinsic rewards, we amplify the intrinsic value of the task or work. In one study, preschool children who were promised a reward for drawing (a gold seal and ribbon) later spent less time drawing than children who received no reward.
Improving Our Relationships
More than 200 years before the term cognitive dissonance entered our vocabulary, Benjamin Franklin had already demonstrated its effectiveness in building relationships. In his Autobiography, Franklin describes how he developed a friendship with a political rival in the Pennsylvania legislature.
“Having heard that [the rival] had in his library a certain very scarce and curious book, I wrote a note to him, expressing my desire of perusing that book, and requesting he would do me the favour of lending it to me for a few days. He sent it immediately, and I return’d it in about a week with another note, expressing strongly my sense of the favour. When we next met in the House, he spoke to me (which he had never done before), and with great civility; and he ever after manifested a readiness to serve me on all occasions, so that we became great friends, and our friendship continued to his death.”
Franklin correctly noticed that a person who has done you a favor in the past will be inclined to do you another favor in the future.
Cognitive dissonance explains this apparently contradictory behavior. After a person does you a favor, he or she will form a more positive opinion of you. The person is likely to reason, “If I did you a favor, you must be OK.”
The oft-used foot-in-the-door sales technique is an application of this principle. By asking someone to comply with a small request, a salesperson makes it much more likely that the other person will comply with a larger request later. A small yes leads to a big yes. Why? We are motivated to act in ways that are consistent with our self-perception and previous behavior.
What’s a Leader to Do?
Ask a rival to do you a favor. If you can get a rival to do you a favor, they may form a more positive opinion of you. Once the ice is broken, repay the favor. Reciprocity will generate good will and strengthen the relationship.
Ask people to agree to a small request. People are more likely to support you in the future if they have (publicly) agreed with you in the past. By agreeing with a request, however small, people will begin to form a bond with you. And they will be motivated to act in ways that are consistent with their previous behavior.
Search for common ground in an argument, rather than restating the facts. Let’s say you are convinced that Technology A is best for your project. Your colleague is convinced that Technology B is best. You will not succeed in convincing your colleague that Technology A is best by beating him over the head with the facts. A better approach would be to “agree on what you agree on”—for example, you both want to deliver the project on time. From there, you might reach the mutual conclusion to use the technology you can implement most quickly.
Be wary of advice from people who have decided in favor of something you are considering. They have a strong positive bias. For example, a CTO who just implemented a major upgrade to her organization’s IT system is motivated to justify her decision by pointing out the benefits and ignoring the pitfalls of the new system.
Festinger, L., Riecken, H., & Schachter, S. (1956/2008). When Prophecy Fails: A Social and Psychological Study of a Modern Group that Predicted the Destruction of the World. London: Pinter & Martin Ltd.
Franklin, B. (2008). The Autobiography of Benjamin Franklin. Project Gutenberg. Retrieved February 24, 2014, from http://www.gutenberg.org/ebooks/148
Kahneman, D. (2011). Thinking, Fast and Slow. London: Allen Lane.
Lepper, M. R., Greene, D., & Nisbett, R. E. (1973). Undermining children’s intrinsic interest with extrinsic reward: A test of the “overjustification” hypothesis. Journal of Personality and Social Psychology, 28(1), 129-137. doi:10.1037/h0035519
Tavris, C., & Aronson, E. (2007). Mistakes Were Made (But Not By Me). New York: Harcourt.