
on March 29, 2009
I highly recommend this book. The insight you will gain into yourself and everyone around you makes it more than worth the price. I find myself muttering "cognitive dissonance" under my breath whenever I observe people espousing ridiculous notions and defending irrational claims. It has helped me as a classroom teacher to understand the motivations behind many of my students' behaviours. If you do not go get this book, a mistake will have been made by you.
7 people found this helpful.
TOP 500 REVIEWER on January 3, 2010
It is difficult to change your mind, and it is especially difficult to admit that you have made a mistake. This book explains how we become committed to our initial decisions, even when all the evidence suggests we were wrong. The authors provide numerous real examples of people who made decisions that turned out to be mistakes, such as prosecutors who successfully pursued criminal cases, only to learn later that the defendant was in fact innocent. It is very hard to admit you were wrong after you have put someone in jail for years.

Mistakes Were Made provides some wonderful insight into how the human mind works -- and how it often fails to work. No system is perfect, and the human mind, for all its abilities, is certainly no exception to that rule. Yet, by understanding the way the mind works, and the mistakes we are prone to make, we can learn to reduce those mistakes and improve ourselves. Everyone should read this book.
2 people found this helpful.
on February 23, 2016
Mistakes Were Made (But Not by Me) was featured a while ago on an episode of the Rationally Speaking podcast as a recent book dealing with the cognitive biases we face when our opinions are demonstrated to be wrong, particularly confirmation bias and cognitive dissonance. The main thesis of the book concerns what the authors call "self-justification": the ways we convince ourselves (rightly or not) that a particular action or belief is correct.

In particular, the authors posit that most problems arise when we make a series of incremental self-justifications. That way, it's nearly impossible to know (at the time) when we've passed the point of justifiable belief (the belief can be factual but is more often moral or ethical). For instance, if a pharmaceutical company outright approached a doctor and asked him/her to preferentially prescribe their new drug in exchange for payment, the doctor would probably refuse on ethical grounds. But say the pharmaceutical company first hires the doctor to give a community lecture on depression and mental health—this seems acceptable. Next, they hire the doctor to give a community lecture on anti-depressants—well, this isn't so ethically different from the last lecture, the doctor thinks. Finally, the pharmaceutical company hires the doctor to give a lecture on *their* new antidepressant. The doctor compares the ethics of this new prospect with the last lecture he/she gave, rationalizes it as only a small, incremental change, and proceeds to shill for the company's new drug in exchange for payment. Somewhere along the line we would say an ethical line was crossed, but for the doctor, every step was incremental and no decision stood out as that much worse than the last. A similar example comes from the Milgram experiment. If participants had been asked outright to turn the shock voltage up to 450 V, they would have refused. But they were directed to start at 10 V, then proceed to 20 V—well, what's 10 V more? 30 V—what's another 10 V? And so on.

Self-justification is therefore the process by which we end up seeking out only evidence that confirms our previous beliefs, or end up at a conclusion we would have rejected had we reasoned rationally from the get-go. Furthermore, to outsiders not afflicted by our self-justification, our irrational conclusions seem to come out of the blue. This discrepancy in conclusions, the authors suggest, was at work in the slow ferment of Iranian frustration leading up to the violence of the Iran hostage crisis, which caught the US completely by surprise. The authors suggest both parties are to blame here: on one hand, the Iranians engaged in a cycle of self-justification until violence erupted; on the other, the US continually poked at Iran but self-justified by downplaying the magnitude of its insults. Either way, both parties found themselves in a situation they would not have initially expected, and, having stuck to their guns through all their incremental justifications, found it cognitively difficult to back down once the crisis erupted.

Overall, Tavris and Aronson take a fresh tack on two particular psychological phenomena, cognitive dissonance and confirmation bias, but for broader coverage of this territory you can't do much better than the encyclopedia that is Kahneman's Thinking, Fast and Slow. Three stars.
on July 1, 2015
I've often wondered how seemingly good, honest people turn into dishonest and self-serving politicians. This book covers a lot of experiments and examples that show how, little by little, small acts of dishonesty eventually lead to the justification of big acts of dishonesty. You get a man to lose his ethical compass one step at a time.

Self-justification is a scary thing we do to preserve our ego and even ourselves. It's more powerful than a lie, and it is absolutely more dangerous than a lie, because we're not conscious that we're doing it.

This is an excellent book for revealing why we do this as humans, helping you see where you might be hiding the truth from yourself and understand how it plays into your attempts to influence others. The research covered in this book is great ... not too scientific, but detailed enough that you understand the point.

If you're a business person, or anyone interested in human psychology but not wanting a hard read, you will find this book highly satisfying!

From business to home (there is an entire chapter dedicated to how this plays out in marriages), this book will equip you with useful insights into the human mind and our behaviors around mistakes and the justifications for them. And you'll be in a better position to learn from your own mistakes and to influence others when they are dead wrong too. :)
on October 12, 2011
I'm doing a Masters of Counselling Psychology and came across cognitive dissonance theory, which is a neatly simple way of explaining the human behaviour we've all been banging our heads against since we were born. I made my way to this book, and in reading it I have not only inwardly shouted "yes! yes!" a thousand times, I've also been made to think, made to squirm, learned a huge amount about the psychological research out there, and been immensely entertained. Of great interest to me are the parts where Tavris and Aronson turn their sights on psychology and psychologists themselves, revealing stuck thinking, obstinacy, and willful blindness towards therapeutic methodologies that don't sit well with their pet theories. It's a stark warning to any of us in the therapy business, some of whom might pat themselves on the back for being helpful without moving forward themselves - and to all those brave and wonderful people seeking therapy as well. Even if you don't fall into either of these categories, this is a really insightful, engaging, hit-the-bullseye book. I devoured it, and will turn to it again and again!
on February 19, 2010
There are so many books out there giving psychology a bad name. Tavris and Aronson get it. They demand the use of the scientific method in psychology and point out the differences between the ill-conceived ideas that lead to pseudo-psychology and the real deal. And the differences are vast. In the process, we are given insightful information (of the scientific variety) and thoughtful presentations. We are asked to think about our motives and our hard-wiring.

Ever wonder why people say things that are obviously not true? How the Nazis could have been so far out there? How people go on believing the end of the world is coming when it didn't end last year (as they predicted)? What about those folks who say they were abducted by aliens? How does the perfect marriage fall apart? Where do the sickos in our society come from? How can juries of 12 honest people listen to the facts through a lengthy trial and then turn around and convict an innocent person or let a murderer go free?

This book provides us with sound, well-researched answers. VERY insightful. I borrowed the book, but having read it, now I have to buy it. It is one I need to own. If you're the least bit interested in the inner workings of the human mind, this book will give you lots of facts and lots to think about.
on November 28, 2009
We all know that taking responsibility for our actions is the right thing to do. But why do so many people -- some of them very influential -- fail to do so?

This engaging book tells dozens of fascinating stories, some of them well-known historical accounts, some of them from the news. They come from medicine, the criminal justice system, marriage, and nations. In all these stories, people deal awkwardly with situations, make bad decisions, behave foolishly or cruelly, or hold strange beliefs. The stories alone would make this book a great read.

The common thread running through all the stories, the common reason for all the behaviours, is a simple subconscious act: self-justification.

The psychological term "cognitive dissonance" is well known: the tension that occurs when a person holds two inconsistent cognitions. Typically, one will be a thought, a belief or a value and the other will be something the person does or did. For instance, "eating a lot will make me fatter" and "I really like my coffee and cake every afternoon".

What few people seem to realize is how difficult it is for us to live with cognitive dissonance, and how self-justification automatically kicks in: the elaborate mental gymnastics we do to justify *to ourselves* what we've done. "The cake puts me in a good mood". "It's instead of a snack". "I just have to have it". "I exercise so much anyway".

The authors go to great lengths to explain that self-justification isn't just about clever excuses or not admitting mistakes. It's a natural subconscious mechanism that helps us go on living. The trouble with it -- which they communicate eloquently and unequivocally -- is that it often backfires, getting us deeper in trouble.

Some of their examples are chilling: the detectives who planted evidence because the evidence they did have was inadmissible; couples whose marriages are quickly going downhill because each partner believes they are acting rationally but the other isn't; the judge who said that convicting the true killer doesn't mean that the wrongfully-convicted person is innocent.

This trouble extends to groups and nations. The example I'm personally familiar with: my homeland, Israel, where we always said how right and moral we were and how wrong and immoral "they" were; it's always the other side who "started it"; and here we are in 2009, still no closer to the solution, which is *both* sides agreeing to put down their self-justification and admit their responsibility for the situation. The book gives the counter-example of South Africa, where tensions had run so high that bloodshed seemed inevitable, but a mutual, unlikely agreement to stop self-justifying led to the most outstanding results.

I'm a little disappointed that only a few pages are dedicated to the research and examples showing that not all is lost: that our approaches to learning, and to the meaning of mistakes, are not ingrained; they are cultural and learned, and can be overcome.

If you'd like to take more responsibility in your life, read this book. If you'd like to figure out others' behaviour better, read this book. And if all you want is an interesting pastime... read this book.
Sometimes, I think that the world is full of hypocrites. The news is full of politicians who preach family values and then are caught in an affair. Every day we see religious advocates who call for peace and in the same breath state that their God is the only true God. Then there's the business world, where lying and cheating seem to be part of the game.

Sometimes, I wonder how these people live with themselves.

Mistakes Were Made (But Not by Me) addresses that exact question. It would seem that the human mind is designed to selectively remember and process information. Thus, the politician, the religious leader, the business person, and even we ourselves often don't realize that we are being hypocritical. Moreover, as our actions and logic become further and further separated, we tend to hold tighter onto our original notions. Instead of admitting that we were wrong, we justify our actions even more strongly.

Mistakes Were Made (But Not by Me) was a huge eye-opener. People don't justify stupid decisions because they are bad people. On the contrary: no one wants to admit they are a fool. Look within - which beliefs do you defend most adamantly?
11 people found this helpful.
on March 16, 2009
Carol Tavris and Elliot Aronson have written a book that explains the most extraordinary human ability to deny the facts, appearing delusional and/or so self-absorbed as to be hiding from reality. Well written and clearly described, with examples from our daily world, it brings to the reader, in an understandable and most human way, the psychological reasons behind the rationalizations we so willingly engage in. Please read "Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions and Hurtful Acts" by Carol Tavris and Elliot Aronson (2007). How you see the world (and indeed yourself) will be changed.
on February 10, 2014
Simply written. Easy read. Well researched.
It is only worth reading if you are ready to loosen your attachments to your own belief systems; otherwise, leave it on the shelf.
None of their examples of "foolish beliefs" include politically correct dogma, which means the authors might want to read their own book.
A book that might help you attain detachment, so in one way it might be called a spiritual book.
I enjoyed it.