Mistakes Were Made (but not by me), by Carol Tavris and Elliot Aronson


I don’t have a background in psychology, but I taught critical thinking when I was young, I’ve done a lot of reading about reasoning and especially how and why people go astray in it, and I’ve had a lifelong commitment to truth and rationality. I think Mistakes Were Made (but not by me) is one of the best, most useful books of its kind I’ve ever read.

The book is about some of the most common psychological tendencies that cause us to acquire false beliefs. Not so much perceptual errors of the kind an optical illusion produces, but self-serving beliefs, beliefs that safeguard our egos. These are the fallacies we fall prey to when believing what is most rationally justified would somehow put us in a bad light or force us to let go of some other cherished belief.

It’s not a technical book, or a book that breaks new ground. It’s more a summary for laymen of some important conclusions that have been reached about human psychology and human behavior.

Mostly the authors are not talking about cases of conscious deception, or of making excuses or intentionally being evasive to avoid some unwanted consequence like losing one’s job, losing an argument with one’s spouse, or going to jail. The psychological phenomena in question are more about self-deception than other-deception.

There are times, though, when they attribute more sincerity to certain folks (e.g., politicians) than I believe is warranted. That is, some of their examples may not be cases of self-deception, as they intend, but of outright lying.

With some of their other examples, I’m not convinced the beliefs in question are unjustified. I think the authors may err occasionally in bending over backwards to be too even-handed, almost to the point of relativism, in finding both parties to disputes equally guilty of spinning things in self-serving ways.

Also, though there’s something to be said for pointing out that no one is exempt from wishful thinking and bias, that point can itself be misleading.

Yes, it’s not just dumb people who rationalize in these self-deceptive ways, or uneducated people, or emotional women, or believers in religion or supernaturalism, but also intelligent people, educated people, men who pride themselves on their rationality, scientists, etc. But at the same time, not everyone is equal. Some people, due to their rational capacities, values, etc., don’t succumb to these psychological temptations to the same degree that some other people do.

The authors don’t deny this; I’m just saying they aren’t as clear on it as I might like, and I could see some readers feeling their relativism—their conviction that everyone’s beliefs are equally justified—has somehow been affirmed.

I’m confident enough in my commitment to truth and critical thinking to say that in terms of avoiding deceiving myself in the ways discussed in this book I’m well ahead of the average person in the general population, and well ahead of where I would have been had I not worked at my rationality for decades. On the other hand, I’m humble enough, and had enough “Uh oh, I think I can see myself in that example” moments reading this book to know that I still have a lot of room for improvement, and that I can be a lot better at avoiding deceiving myself and others if I continue to work on my rationality.

There are a lot of things discussed in this book that I appreciated. I’ll mention just some of them.

Experiments reveal the rather surprising fact (surprising if one assumes any significant degree of rationality in humans) that people routinely become more committed to a belief when confronted with conflicting evidence.

So it’s not just that a person who already has a certain belief requires more evidence to switch to believing the contrary than does a person who previously had no opinion on the matter; the evidence actually pushes him further in the wrong direction.

For example, studies have been done on believers in kooky religions with end-of-the-world claims, and they show that when the facts blatantly disproved the claims of the religious leaders (e.g., the world was absolutely, positively going to end on Friday at noon because God said so, but then the world didn’t end on Friday at noon), many of those believers ended up with a higher level of confidence in the infallibility of their leaders than they had had before the embarrassing failure of the world to end. The more dug in they already were—having proclaimed their belief in this infallibility to others, having sold their possessions and tied up the loose ends of their lives in preparation for the end of the world, etc.—the more they insisted on somehow spinning the disproof into some sort of confirmatory evidence.

At a certain level you can kind of understand why this is so. The farther you’ve gone out on a limb, the bigger a deal it is to be wrong. Of course the best thing in that case is to not be wrong, but psychologically a lot of people discover that the second best thing is to be wrong but not realize you’re wrong.

When something happens to challenge your belief—or even flat out disprove it—you either give up your belief or you find some way to interpret things so as to be able to retain your belief. But then each time you do the latter, it raises the stakes—puts you even further out on that limb—making it that much harder to give up the belief the next time something happens that casts doubt on it.

In poker, it’s the “throwing good money after bad” phenomenon. Your ability to understand the value of your hand gets worse as you commit more money to the pot. You tend to overrate your likelihood of winning the hand if you stay in, because you have already paid a certain amount to get this far and you don’t want to reverse course in the middle and relinquish the money you’ve already committed (even though mathematically it’s often correct to cut your losses that way).
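To make the arithmetic behind “cut your losses” concrete, here is a minimal sketch with hypothetical numbers (mine, not the book’s), assuming a single call-or-fold decision with a showdown right after. Only the price of the call and your chance of winning should matter, because the money already in the pot is lost whether you call or fold:

```python
# Hypothetical hand, purely for illustration: the pot holds $100
# (including the $40 you already put in), it costs $50 more to
# call, and you estimate a 20% chance of winning.
pot = 100          # chips currently in the pot, yours included
call_cost = 50     # additional amount needed to stay in
p_win = 0.20       # your estimated chance of winning the hand

# Expected value of calling: gain the current pot with probability
# p_win, lose the additional call amount otherwise.
ev_call = p_win * pot - (1 - p_win) * call_cost

# Folding earns nothing going forward. The $40 already committed is
# gone either way, so rationally it shouldn't enter the decision --
# that's the sunk cost people can't bring themselves to ignore.
ev_fold = 0.0

print(f"EV of calling: {ev_call:+.2f} -> {'call' if ev_call > ev_fold else 'fold'}")
# EV of calling: -20.00 -> fold
```

On these numbers folding is clearly correct, yet the more of that $40 you’ve already sunk, the more tempting the call feels.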

It’s also kind of like the way the families of dead soldiers can be manipulated. A mother might have had no strong opinion one way or the other on the Vietnam War, but once her soldier son is killed in it she becomes vulnerable to hawks telling her what a noble and necessary cause it is, and how pulling out of the war now would mean her son died in vain. Now that she has an emotional reason to commit to one side of the argument, any evidence she is confronted with from the other side will just make her more angry and defensive, as in her mind it will be coming from people wanting to dishonor her son, and she’ll cling to her comforting beliefs even harder.

The book has a lot to say about memory. As I’ve read elsewhere, and still find quite surprising, memory isn’t just imperfect but routinely wrong in major ways. That is, it’s not just that details get fuzzier the more time passes, but that whole memories can be dead wrong.

For instance, the book tells of a woman who had clear and treasured memories of her father reading her a certain book when she was young, only to find out later that the book was published after her father died.

Even memories like that—ones people have with no prompting and are quite confident of—can be completely wrong. Much more dubious are “recovered memories,” where people are coaxed, cajoled, pressured, and hypnotized until they claim to “remember” such things as sexual abuse when they were small children.

Given how fallible memory is and what pressure certain therapy patients are put under, it’s not surprising that they can be induced to “remember” all kinds of things that may be inaccurate and may even be ridiculous. But the book is at least as interested in why the therapists in such situations can’t see the flaws in their methodology and can’t see why the claims produced by it are highly dubious.

And the answer is that at some level they don’t want to see. Often, believing that the “memories” now being claimed by their patients are accurate fits with their ideology, their worldview. They may have a feminist perspective that men are so routinely abusive toward women and children that it’s no surprise at all, and indeed to be expected, that if you probe in the right way you’ll find massively more cases of abuse than were previously known. It’s easy for them to believe that in addition to all the cases of abuse that victims (or “survivors,” to use the common euphemism) remember, there are countless more whose memories were repressed.

Plus once they’ve committed to using forms of therapy that generate these “memories,” it becomes psychologically harder and harder to later see and admit the flaws in them, because then they wouldn’t just be criticizing some methodology in the abstract but would be admitting that they themselves have in all likelihood perpetrated great harm to both those they convinced to become accusers and those who were accused.

Which is not to deny, by the way, that it can work the same way on the other side. Let’s say you’re a therapist, a memory researcher, an attorney, whatever, and you’ve argued very strongly and very tenaciously that these recovered “memories” are extremely unreliable and often made up out of whole cloth, and then new research establishes that they are a great deal more reliable than they appear to be now. Will you be honest and rational enough to change your opinions as the evidence changes, even if it means going against much of what you’ve based your career on, even if it means admitting you’ve just spent the last few years or decades basically calling people liars who in fact were victims of terrible abuse as children?

There’s also a worthwhile section of the book that talks about the criminal justice system, and why cops and prosecutors wildly overrate the reliability of forced confessions, and in general overrate their ability to intuit who is guilty and who is not, to the point that even when there’s overwhelming evidence they were wrong in a given case (DNA evidence, whatever), they routinely stick to their guns and look for some way to spin things to be able to still insist that the person is guilty.

Such people want to believe that their experience enables them to ascertain guilt or innocence more readily than a layperson can. So interrogation becomes not a means of discovering the truth—since in their mind they already have that—but of securing what they need for a conviction, namely a confession. Then, as in so many other cases, once they’ve thought and conducted themselves that way for a while, they’re committed and the psychological stakes become much higher, making it all the more unlikely they’ll be able to honestly assess evidence showing they’ve been persecuting innocent people.

Then there’s also the matter of why an innocent person would ever confess to a crime he didn’t commit. People who have never been interrogated try to imagine themselves in that situation, and they conclude that if it were them then there’s no way they’d say they murdered or raped somebody or whatever if they hadn’t. Yeah you’re under pressure, they say, but why would you tell some blatant lie against yourself in response, a lie that’s so contrary to your own self-interest?

The authors note that people thinking that way are greatly underrating the inherent coerciveness of such situations. Not to mention, innocent people tend to have a naïve faith that their innocence in and of itself will protect them regardless of what they do or say, which is why they routinely do things like agree to talk to the police without their attorney present, or agree to searches without insisting on a warrant. (“What have I got to hide? I’m innocent!”)

I would add a point I remember hearing from a law school professor. When you’re in an abusive interrogation situation like that, it’s very easy to assume anything you say will not be given any significant credence later precisely because of the situation that produced it. So it seems harmless to sign a false confession, since in the short run it’ll get you out of this extremely unpleasant interrogation, and in the long run all you have to do when you can speak freely is to say you made it up in order to end that abuse. What will people believe after all, you think to yourself, this confession made under blatant duress, or my later uncoerced statement retracting it?

You don’t realize—in part because you’ve almost certainly never experienced this before, and in part because the situation itself has clouded your judgment—that people put enormous weight on confessions, and that juries specifically almost always believe confessions regardless of the circumstances under which they came about and regardless of what the defendant is now claiming.

Another interesting example the book gives of how we deceive ourselves into certain beliefs and attitudes is that of a new, somewhat reluctant member of a gang. (And this can be analogized to war and any number of other situations.) Let’s say the group you’ve just joined or the group you’re kind of hoping will accept and like you is having some fun one day bullying a weaker kid, and you feel you have to join in if you want to remain in good standing with them.

And let’s say you’re troubled by this because at some level you recognize that the victim is just some innocent kid who doesn’t deserve to be beaten or robbed or humiliated in front of his girlfriend or whatever. Initially this will generate ambivalence—you want to behave in the way this group wants, and you also want to refrain from wantonly victimizing an innocent kid.

If you end up joining in the bullying, chances are that psychologically you’ll go back and reinterpret the situation so as to eliminate the ambivalence. Instead of continuing to recognize that there were reasons to be pulled in different directions, your mind will convince itself that actually everything was pulling you in the direction you went in.

So you’ll adopt the attitude that there was something about this person—whether as an individual or as a member of some group (he’s Black, he’s Jewish, he’s a nerd, whatever)—that rendered him deserving of ill treatment after all. Maybe you’ll adopt a kind of hardass philosophy that in a dog-eat-dog world like ours, it’s a person’s own fault if they put themselves in a position to be victimized, making it practically a tautology that people who are abused deserve it. You figure if he’s fool enough to be out here walking down the street without a weapon, without having joined a rival gang, etc., then he deserves what he gets.

Studies show that people who started off neutral or even sympathetic toward someone they—for whatever reason—harmed are apt to retroactively regard that person as contemptible in one way or another. People don’t want to think of themselves as capable of harming good, innocent people, after all.

I think you can see a lot of the kind of self-deception the authors are talking about in the positive thinking aspects of the self-help movement. I would think if anything it’s especially prevalent there, as instead of being something people fall prey to unwittingly, they are actively encouraged to interpret everything in their life so as to put themselves in the best light.

“Guilt” is anathema to such folks. I’ll always remember a woman I was briefly acquainted with who had embraced some sort of positive thinking philosophy that involved insisting over and over to oneself that one is perfect. She loved the philosophy, raving about how it enabled her to free herself of anything she had ever felt guilty about in her life.

I can state with a high level of confidence that anyone encouraged to convince themselves that they are perfect, including this woman, is being encouraged to lie to themselves, since there is no human being who is perfect or even remotely close to it. Furthermore, given all the shitty ways we treat each other in life, I would think the last thing we need is to unplug our consciences and never feel guilt. But people eat this stuff up because it makes them feel good.

Also noted in the book is our natural tendency to seek circumstantial reasons for the bad things we do in life—to excuse them in effect—while not similarly explaining away the good things we do.

When we do a bad thing we think of it as an anomaly, a product of a misunderstanding, not a reflection of our true character, etc. When someone else does something bad (unless it is someone close to us that we identify with), then we take it as precisely reflecting their true character.

Unlike us, other people do bad things because they’re bad people.

When I read that, I thought about David Foster Wallace’s commencement address “This Is Water,” in which he asked his audience to make a conscious effort to find circumstantial explanations for the ill behavior of others the way we naturally do for ourselves.

Toward the end of the book the discussion turns to marriages and relationships, where we see some of these same self-deceptions at work.

The key to a successful relationship, the authors conclude, is to avoid falling into the trap of always feeling defensive, always trying to justify ourselves, always trying to put whatever we did in the best light and show that it can’t really be our fault or reveal some character flaw in us. We should work instead at being empathic toward our partner, giving them the benefit of the doubt as much as or more than we do for ourselves. We need to seek to better understand them, see things from their perspective, appreciate their good points and excuse their flaws, while admitting our own flaws.

As the authors point out, we tend to anticipate that admitting our errors and apologizing will have worse consequences than they actually do: that it’ll be very embarrassing, make us appear weak, make people respect us less, give people openings for an “I told you so,” etc.

In fact, in most circumstances that isn’t true. Honestly acknowledging wrongdoing is more likely to make people feel closer to us, to make them respect us more. It’s pride, not a rational assessment of the evidence, that makes us anticipate bad consequences if we’re honest with others, and with ourselves.

All in all, Mistakes Were Made (but not by me) is a clear winner. There is a lot to think about here, a lot that can help you deceive yourself less, a lot that can, frankly, make you a better person.
