Another look at a particular set of cognitive biases
on February 23, 2016
Mistakes Were Made (But Not by Me) was featured a while ago on an episode of the Rationally Speaking podcast as a recent book about the cognitive biases we face when our opinions are shown to be wrong, particularly confirmation bias and cognitive dissonance. The book's main thesis concerns what the authors call "self-justification": the ways we convince ourselves (rightly or not) that a particular action or belief is correct.
In particular, the authors posit that most problems arise when we make a series of incremental self-justifications. This makes it nearly impossible to know, at the time, when we've passed the point of justifiable belief (the belief can be factual but is more often moral or ethical). For instance, if a pharmaceutical company outright asked a doctor to preferentially prescribe its new drug in exchange for payment, the doctor would probably refuse on ethical grounds. But say the company first hires the doctor to give a community lecture on depression and mental health—this seems acceptable. Next, it hires the doctor to give a community lecture on antidepressants—well, this isn't so ethically different from the last lecture, the doctor thinks. Finally, the company hires the doctor to lecture on *its* new antidepressant. The doctor compares the ethics of this new prospect with the last lecture, rationalizes it as only a small, incremental change, and proceeds to shill for the company's new drug in exchange for payment. Somewhere along the line we would say an ethical line was crossed, but for the doctor, every step was incremental and no single decision stood out as that much worse than the last. A similar example comes from the Milgram experiment. If participants had been asked outright to turn the shock dial to 450 V, they would have refused. But directed to start at 15 V, then proceed to 30 V—well, what's 15 V more? 45 V—what's another 15 V? And so on.
Self-justification is thus the process by which we end up seeking out only the evidence that confirms our prior beliefs, or arrive at a conclusion we would have rejected had we reasoned rationally from the start. Furthermore, to outsiders not afflicted by our self-justification, our irrational conclusions seem to come out of the blue. This discrepancy, the authors suggest, was at work in the slow buildup of Iranian frustration leading up to the violence of the Iran hostage crisis, which caught the US completely by surprise. The authors suggest both parties were to blame: on one hand, the Iranians engaged in a cycle of self-justification until violence erupted; on the other, the US continually poked at Iran but self-justified by downplaying the magnitude of its insults. Either way, both parties found themselves in a situation neither would have initially expected, and, having stuck to their guns through all their incremental justifications, found it cognitively difficult to back down once the crisis erupted.
Overall, while Tavris and Aronson take a fresh tack on two particular psychological phenomena, cognitive dissonance and confirmation bias, for a broader treatment of cognitive biases you can't do much better than the encyclopedia that is Kahneman's Thinking, Fast and Slow. Three stars.