Guest: Carol Tavris discusses her book Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts and its implications for the skeptical community.
Download MP3 (50:12min, 23MB)
“Of course, there are many people who pride themselves on being skeptics, who are not skeptical about the things they believe in. We, all of us, are inclined to read disconfirming research more critically than confirming research. There’s actually been research on this.” – Dr. Carol Tavris
Alex: Welcome to Skeptiko where we explore controversial science with leading researchers, thinkers and their critics. I’m your host, Alex Tsakiris.
Before we jump into today’s interview, I wanted to update you on some of the things that have been going on here at Skeptiko. We really have quite a bit happening. For those of you who are regular listeners, I know that you’re interested in the research projects that we have going on, and I do receive a bunch of e-mails about that. So, I want to update you.
First, the dogs-that-know research that we started about a year ago. I want you all to know that it is still moving along. I’m in contact with the folks at the University of Florida Canine Cognition Lab on a regular basis. We’ve tested a couple of dogs, and probably haven’t found quite the right dog yet, but it takes more time and effort than I think you could imagine. They’ve been fantastic, and they’re definitely pursuing that research. I’m very optimistic that the preliminary results that we’ve seen will be confirmed later in that research.
Next is the medium demonstration project. Several of you picked up on the fact that I recently issued a press release asking for participants for this research. I received quite a few participants and did a little preliminary testing on my own to kind of work out the protocol. We changed the protocol slightly in a way that I think skeptics are going to be very, very happy with. It’s a very minimalistic protocol in terms of the amount of information that goes on between the medium and the person requesting the reading. As a matter of fact, they never even talk. In my preliminary tests, the results have been quite significant, quite amazing. So you’ll hear a lot more about that in the upcoming weeks and months, as we’re about ready to reconnect with the folks at the Skeptics’ Guide to the Universe and see if they are still willing to go forward with this research and find a way to do that. I’d also like to consider moving that into a university setting as well, because I think that’s where all this research really belongs, and that’s really the goal of Skeptiko: to incubate real scientific research into these controversial areas.
Which brings us to the third research project that we’ve initiated, and that’s with psychic detectives. Of course Ben Radford and I are looking for a case that we can investigate together, and I think we’re close to finding that. Ben, I think, is on board, and I’m trying to make sure that I have access to the information that I need to do a good job of investigating whatever case we decide on as well. In addition to that, Noreen Renier has agreed to take part in a university-sponsored research project to test the efficacy of the kind of information that psychic detectives generate. That’s in the very early stages obviously, but stay tuned and we hope to get that research initiative going as well.
So a lot of things are going on, and it’s really going in the direction that I’d like to see it going, which is collaborative research that puts these questions to the scientific test and does it in a way that addresses the legitimate skeptical concerns going in.
But now let’s move on to today’s interview. I still try and stay tuned to a lot of the skeptical broadcasts that are out there, even though they are sometimes a little bit difficult to listen to, as it is hard to listen to people you usually don’t agree with. But our guest today, Dr. Carol Tavris, has appeared on a number of these skeptical shows. Even though she’s a very mainstream psychologist, she has aligned herself with several of the skeptical groups and has been out talking up her book, which is really a very mainstream book about cognitive dissonance. Listening to these interviews, I couldn’t help noticing how all of her principles apply so much to the skeptical community and their blind spots in looking at real research that disconfirms their worldview. So I thought she’d be a great guest to have on, and I think we had a pretty interesting dialog. Here’s my interview with Dr. Carol Tavris.
(Start of Interview with Dr. Carol Tavris)
Alex: We’re joined today by Dr. Carol Tavris who, along with Dr. Elliot Aronson, is the author of Mistakes Were Made (But Not By Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts, which is a wonderful book title and one that really gets to the heart of what the book’s about, and that is the theory of cognitive dissonance. So, Dr. Tavris, I thought we’d start with a little background on you, talk about your training and particularly what sparked your interest in dissonance theory.
Carol: Well, I have a PhD in Social Psychology and a very quirky career as a social psychologist, lecturer, and writer. My interest right along from the beginning was in translating psychological science for the public interest and persuading the public that psychological science is not an oxymoron. This is an uphill battle, it really is; I don’t have to tell you. But it’s been a very interesting career because I love science, I love psychological research. In our culture we have such a huge gap between pop-psych nonsensical notions and the really good and useful work that good psychological science can provide. In effect, that’s what my work has been my whole career.
So my first book, which was on anger, was an examination of many myths and misconceptions about anger that had been perpetuated by the Freudian and other psychoanalytic establishments in our culture, to show how many of those assumptions were wrong. So, that’s what I love to do. Over the years, I have discovered an extremely interesting thing, which is that when you present people with wonderful, good scientific data showing them that they might actually be wrong about some belief that they hold dear, they don’t say, “Thank you so much for giving me this fabulous information. I will change my mind immediately and move on to a better way of life.” No, they tell you what you can do with your data.
You asked what sparked my interest in dissonance theory. My interest has been in mistaken ideas and the harm that they can cause, as in recovered memory therapy or therapeutic fads that have been more harmful than helpful and so on. I was sitting with my very old friend, Elliot Aronson, one day and we got to talking about why it was that George Bush was unable to say – even with the whole country calling on him to say so – “I was wrong about weapons of mass destruction. I was wrong that the war in Iraq would be over in six weeks. I was wrong that it was not going to cost us anything. I was wrong about the consequences of our involvement there. I did the best I could with the information I had at that time, but I was wrong, and now let’s rethink this whole adventure.” Commentators on the left and right were urging Bush to speak straight to the country, and he didn’t do it.
Now Elliot is one of the country’s – the world’s – greatest social psychologists, and he had been studying dissonance theory for many years, starting when he was a graduate student working with his mentor and advisor at Stanford, Leon Festinger, who developed the theory of cognitive dissonance. What Elliot did was transform that theory into one of self-justification. As we got to talking, we realized that his life’s work on self-justification and my interest in the social uses of psychological science – and in how it is that people don’t respond more openly to information that would be beneficial – made a nice match. So we put our interests together and decided that we would write this book.
Alex: Great. It’s interesting, the path that you traced there, because it has some interesting crossroads with me. I have to thank my wife, Dr. Joni Johnston, who is a psychologist and an author, for giving me your book, as well as for a lot of other things.
Carol: I thank her, too.
Alex: But also, for exposing me a long time ago to the work of Leon Festinger in the book When Prophecy Fails, which I think has become something of a touchstone for her and me. I found it just fascinating when you recapped that research in the beginning of your book. So I thought maybe you’d want to share with people a little bit about what he found, because I think it’s obviously very relevant to what you’re talking about.
Carol: Exactly. When Prophecy Fails was indeed a wonderful and interesting book. Festinger and his colleagues infiltrated a little band of believers who predicted that the world would come to an end on December 21st. It was the first field study of the theory of cognitive dissonance that Festinger was developing. So the question was, assuming that the world did not in fact end on December 21st, what would these people do? How would they reconcile their attachment to this group and their belief that the world would end with the evidence that it did not? What he found was – and this is the heart of dissonance theory; it’s why it’s not such an obvious way of understanding how the mind works – that the more people invest in an idea, the more committed they are to it; the more of their time, money, effort and public displays of commitment to the belief they’ve made, the harder it is going to be for them to say, “Boy, what a fool I was. Boy, was I wrong.” So indeed, in this little group of true believers, for the ones who waited for the end of the world quietly in their homes, when the world didn’t end, there was no problem: “Okay, I believed that for a month, it didn’t happen, so no problem.” But the ones who sat there with the woman he called Mrs. Keech, the leader of the group – the ones who had sold their homes, who had given up their jobs, who were waiting to be whisked away in a spaceship to be rescued and so forth – what were they going to do? What he found was that they became even stronger in their belief. In the middle of the night, Mrs. Keech had a vision that her commitment to the prophecy had saved the world. Now that’s a fabulous way of resolving the dissonance.
Now, why is this important? What Festinger found is that cognitive dissonance is the uncomfortable feeling we have when two ideas conflict with each other. Now for him this was a very cognitive theory: “I smoke. I know smoking can kill me. This is uncomfortable. I have to resolve this dissonance in one of two ways: I quit smoking or I justify smoking.” Now, that’s true. Of course the other great contribution of dissonance theory was to show that before we make a decision, we are all extremely open-minded; we get information about both choices. After we make a decision, the mind shuts, and now we will justify the decision that we made. Information about the choice that we didn’t make – the car we didn’t buy, the house we didn’t buy, the spouse we didn’t choose – any information suggesting we were wrong in the decision we made is dissonant, is uncomfortable. The key thing about Festinger’s theory is that dissonance is motivating. It’s as uncomfortable as extreme hunger. We don’t like walking around in a state of dissonance. So to reduce it, we will stop considering information suggesting we made the wrong choice and we will start paying attention only to information supporting the decision we made.
Now you can see why this is basically a good thing. Evolutionarily speaking, it allows us to sleep at night and not beat ourselves up over roads we didn’t take. In fact, we all know people who can’t reduce dissonance quickly enough, and they suffer for it, right? They keep beating themselves upside the head saying, “Why didn’t I? Why didn’t I?” Well, you don’t want to live your life that way either. But you can see how, in science, in personal decisions, in professional choices, if we in fact shut our minds to dissonant information – information showing that we made the wrong choice, that we might be on the wrong path – how are we ever going to get off it? What Elliot did that advanced this theory so profoundly was to say, “Dissonance is greatest when an element of the self-concept is involved” – not when it’s any two cognitions, but when it’s “I’m a smart, good, competent person. You’ve just given me information that I did something stupid, foolish, harmful or wrong. What do I do, change my view of myself or dismiss the evidence that you’re giving me?”
Alex: The other part of that I found fascinating, and that I think the book does a very good job of exploring, is the process. I love the concept of the pyramid. It’s easy to look at these extreme examples and say, “Isn’t that a great case of how people can really get their beliefs twisted around?” But the pyramid that you described really brings home how we can come to that point. Do you want to talk a little bit about that?
Carol: Thank you. I find that pyramid enormously helpful as a way of understanding so many of the polarized stances that people end up with – opposite views that they believe they have always held. I mean, look at the therapists who support or don’t support recovered memory therapy; this war has continued now for some 20 years, but of course it didn’t begin that way. The metaphor of the pyramid works like this: you take two people who are at the top of the pyramid; they’re side by side, a millimeter apart. They have the same kind of attitude, let’s say, about cheating or whatever the subject is. They don’t think it’s a horrible thing, they don’t think it’s a good thing. They’re kind of neutral about it.
This was actually an experiment that was done with school children. Now you give people an opportunity to cheat. One cheats and one doesn’t cheat. What now happens is that each one will justify their behavior to make their behavior consonant with their attitudes. So the one who cheats begins to justify cheating: “Well, cheating isn’t really such a bad thing, and besides, I really needed this grade, and besides, everybody does it and it’s no big deal,” and so on and so forth. The one who didn’t cheat will decide that cheating is really a serious matter after all: “It’s not a victimless crime; everybody suffers as a result of this; this is not a good thing,” and so forth. Over time, each one will move further away from the other, down the two sides of the pyramid, until they are standing at the base of the pyramid miles apart, these two people who were once very alike. Each one will think that their views about cheating are what they have always believed; they will internalize these new beliefs, and they will have pretty much blinded themselves to understanding how the other person sees this problem.
Alex: It’s amazing, and in some ways it really explains a phenomenon we all experience so much, which is how two groups that seemed to be aligned in their interests and in their motivation for knowledge wind up being the most polarized on a particular topic. They know it so well, and they’ve walked down the pyramid. I think that offers an explanation for why there is such disagreement between, let’s say, Protestants and Catholics, or in the psychology example you talked about with recovered memory. These are groups that share a lot of the same information as a starting point and wind up extremely polarized at the end.
Carol: Exactly right. I mean, these are normal biases of the brain. We understand from cognitive science that this is in fact how the brain works – to make our lives easier, if you will. So one of the things that happens is that not only do we look for reasons to justify the decision we made, but there is also the confirmation bias – I’m sure you’ve talked about this on the show before. It’s the universal bias we all have of noticing and remembering information that confirms what we believe, and ignoring, minimizing, trivializing and forgetting information that disconfirms what we believe. Of course the essence of science is to be open to disconfirming evidence, but that’s not a natural habit of the human mind. So if you put together the need to justify decisions we’ve made with the confirmation bias, which keeps us seeing only the evidence that we were right about the decision we made, then you can see further how people might end up in polarized positions, and be unable to move back up that pyramid and say, “You know what, I was wrong about this.”
As Elliot puts it, if you pride yourself on your professionalism, on your skill as a therapist, on your being a good guy as a prosecutor, on your brilliance as a surgeon, on your political acumen; if these are important sources of self-esteem and pride for you, of feeling competent and smart, as is the case for just about all of us – then how much more difficult is it to learn that you, a prosecutor, have put an innocent guy in prison; that you, a therapist, have been practicing a form of therapy that might really have destroyed your clients and your clients’ families; that you, a doctor, have just cut off the wrong arm of your patient? These are profoundly dissonant shocks to the system.
Alex: Right. As you were just talking there – and as you do in the book – you do a really nice job of expanding this theory into many areas: into politics, religion and even personal relationships. But I want to bring it back to science for a minute, because I think it has a particularly strong impact in terms of how we look at science. This is obviously an area where we’ve tried very hard to avoid these kinds of biases, trying to create a system that prevents us from falling into these critical thinking puddles, if you will. So as I read your book and listened to a couple of the interviews, I was alternately inspired and irritated. For example, I listened to your interview on the podcast for the Skeptical Inquirer, and as I was listening to you talking to an audience of skeptics, it was just ironic to me. Number one, CSICOP, now CSI, is not exactly a group that sought out a diversity of opinion or sponsored any collaborative research. Number two, as I’ve explored on this show over the last couple of years, this kind of self-justifying “defend the status quo” stance is exactly what I’ve found as I’ve attempted to bridge the gap between skeptics and controversial science. So I wanted to delve into the skeptical part of it a little bit, and I don’t know how you became aligned with these groups or whether you even consider yourself a skeptic. I mean, is that a title that you hold or are okay with?
Carol: That’s a very good question. Do I consider myself a skeptic? I want to think about this for a moment. What’s the question?
Alex: Yes, let me be clear, because I think these terms get thrown around and they lose their meaning. We can make it real easy. I mean, if you pick up Skeptical Inquirer magazine, the position they take week after week or month after month – I think it’s a monthly publication – is the skeptical position. So, you’re a fellow for CSI, you’ve been aligned with a lot of those folks. Do you consider yourself a skeptic in that vein?
Carol: Okay, this is a very fine question, and I’d not thought of this before in this way. So, I would say this: I’m a skeptical person, but I’m not a skeptic. No one has ever asked me this before. I don’t feel that I am a skeptic as an identity. I’d like to see myself as a person who – listen, I have many passionate beliefs: personal, political, scientific beliefs. The goal for me as a psychologist – in the textbooks I write with Carole Wade, for example; our psychology textbooks were among the first to talk about critical thinking as a way of doing psychology – is that the findings of psychology are going to change from year to year, edition to edition, but if you can understand what it means to be a critical thinker, then that’s the skill we want you to take with you when you leave this course. Meaning that you look for the evidence for a certain claim; that you don’t just accept somebody’s claim because it’s said by an authority or a famous person or someone who’s cute; that you keep an open mind, and if better evidence comes along, you revise your beliefs; that you’re willing to consider other explanations for something that you hold dear; that you have convictions, you have beliefs that guide your life, but you hold them lightly enough so that if you need to change them when better evidence comes along, you can. I would say that is my guiding philosophy. What that means, though, is that I’m skeptical, of course, about a lot of the big claims that people make in our [Audio Gap 00:24:38] claims that people make all the time without evidence. So let’s separate the goal of being able to be skeptical about so many of the kinds of claims people make in our culture that are unscientific – to debunk the psychics, mystics and people who prey on…
Alex: But wait a minute, you started there with something that I agree with but that is very, very hard to do. Then we kind of wandered into another tier here. I’ll give you just a small example of something I’ve talked about on this show before, and it has to do with Michael Shermer, who is a very smart, very capable person, an excellent writer, and someone I enjoy listening to. But this example brings the point to a certain clarity and focus that we can talk about. It’s rather brief, so it won’t take long to go into.
So in 2001 – I don’t know if you’re aware of this – there’s this Dutch researcher named Pim van Lommel, and he publishes this very extensive multi-year study on near-death experience. It’s very well received; it’s published in the Lancet, one of the top medical journals in the world, so it’s obviously extremely well thought of to be published and reviewed there. But the problem is this study suggests that near-death experience can’t be explained by conventional neurology, and it kind of points to survival of consciousness after death as a very real possibility. So here’s the cognitive dissonance part: Michael Shermer – again, who’s a smart guy, doesn’t make mistakes – writes his column in Scientific American, but he cites the study as evidence against near-death experience. This so upsets the original researcher that he publicly makes a rebuttal and says this guy totally got it wrong. What’s the upshot? Shermer never backs down. Shermer never admits that he was wrong. He never admits that the study said the opposite. Now, obviously, I’m probably exposing you to information here that’s somewhat new to you. But I think what I’ve found over and over again is this need to – you made the starting point of saying, “Hey, we all need to be open-minded about disconfirming information,” and then you go into the psychic thing. Well, what is the best evidence that we have in psychic research? It was done by Dr. Julie Beischel at the University of Arizona, and we’re now working on replicating that study, but it’s published and peer-reviewed. The lens has to be very carefully focused when we talk about skepticism and what we need to be skeptical of.
Carol: Yes. First of all, I don’t want to weigh in on any issue or argument that I haven’t been following or don’t know anything about. Second, though, what I would say is that all of us, everybody, without fail, is going to be close-minded about something. Of course, there are many people who pride themselves on being skeptics who are not skeptical about the things they believe in. I’m not referring to Michael in this particular case, but everybody. Whatever the particular issue is – something that they care about – all of us are inclined, and I mean scientists do this as well, to read disconfirming research more critically than confirming research; there’s actually been research on this. It’s really a funny thing. If you believe that men and women have important biological differences in the brain, you look at confirming research one way and disconfirming research another way. So nobody is a purely open-minded thinker. So the goal in our book is not to call anybody names or to say that somebody does this better than others, but to point to the ways in which we all feel dissonance when, for something that we believe is so, suddenly there’s disconfirming evidence. That’s what it means to be in a state of dissonance. Whatever the belief is – that there is a near-death experience or that there isn’t, that there are psychic powers or that there are not – nobody is happy with disconfirming evidence.
So the task then becomes: how do we assess information in a way that allows us to move the conversation forward in a thoughtful and useful way? One of our concerns in our book is that it’s hard enough for people to do this when they don’t have vested interests in the outcome of research. But there is a big change in our society now, where so much research is funded by corporate interests that have a financial stake in the results of the studies. This has a way of tainting the research that people do without their realizing it, because people see themselves as good, independent, scientific, critical thinkers and don’t even want to entertain the possibility that they’re being biased by the people who are funding them. But there is also the intellectual vested interest, of course, in an idea.
Alex: Right. We’ve taken the idea of science being value-free and we can throw that out the window, because as we socially become more and more polarized in our views, it’s clear that we’re not value-free in our science; then we add the corporate element to it and we’re not financially value-free either. So it’s a tough situation, and I think you make a wonderful contribution in terms of prescribing a way to navigate it. Let’s talk about how you think groups that have different beliefs can create a dialog and how they can come together. In some respects, you don’t paint a very optimistic picture for changing people’s minds, especially if their beliefs are firmly entrenched.
Carol: That’s true. For example, one of the conclusions that we draw in talking about the legal system is that because so many prosecutors and police are the ones who have been sliding down the pyramid for a long time to put someone in prison, seeing themselves as the good guys all the way along, it’s extremely difficult for a district attorney to look at DNA evidence and say that the guy they put in prison is in fact innocent. So, as we show in the book, it has often taken an independent commission, outside parties, or a new district attorney coming in to re-examine these cases and make sure that justice prevails. That’s what Morgenthau did in the Central Park jogger case, where they put five innocent kids in prison for years until the real rapist/attacker confessed. When they then went back and looked at the actual evidence against those five kids – those kids had been put away during a time of hysteria and craziness, as often happens when there’s a terrible miscarriage of justice, or a rape and a murder and so forth; public passion runs very high, and the district attorneys are under the gun to get somebody and put them away.
So, in that case, precisely because it is so difficult for a professional to stand up on his or her hind feet and say, “I was wrong and by God, I’m going to fix this,” we need to have correcting mechanisms in place, whether it’s videotaping of interviews, other precautions, or independent commissions. Sometimes the solutions can only be institutional. That said, in many instances, at an individual level, it’s possible indeed for individuals, once they understand the mechanism of dissonance and feel how it works in themselves and in their partners and loved ones, to make changes. It has been so interesting for Elliot and for me to get letters from people saying, “Whoa, I read this and it was like looking into a mirror.” One of the things that happens is that you become very aware of how dissonance works. So what that means is that if you want, for example, to explain to another person, like your beloved, why you’re right and the beloved is wrong, one of the things you don’t say is, “What were you thinking when you did that? What is the matter with you that you can’t possibly see how wrong you are?” You understand that that kind of language is going to put them right smack in dissonance. They’re going to say, “What was I thinking? What I was thinking was that I’m right, of course, and that I’m the smart person here.” They’re going to feel defensive; they’re going to want to protect the decision that they made.
When we understand what dissonance feels like and the mental effort we put into reducing it, it becomes easier to break that habit. As Elliot says, it makes us more mindful, and in that mindfulness is the possibility of change. So, as he says, for example, if you know that after you buy a car you’re going to be feeling dissonance about the car you didn’t get and a little worried about the car you did get, you know what to expect, you know what it feels like, and you know to be careful about not cutting off information that could be helpful to you in the future. That’s the small example of a car, but of course it applies to big decisions as well. It’s a very useful thing to understand, both in managing our own post-decision dissonance and our own intellectual dissonance, and in managing our personal dissonance with our loved ones.
Alex: I can certainly see in the personal realm how that tactful approach can give much better results than otherwise. But I do question whether we should or can always be that way when it comes to science. I mean, for example, it almost sounds like a bit of politically correct speak, and I wonder if we can’t be more direct sometimes. For example, I mentioned this in my e-mail to you and you just mentioned it earlier, but in your book you talk about Sigmund Freud, who is a guy who has now been so thoroughly discredited that if he were a modern-day scholar he’d certainly be facing civil if not criminal charges, and he would be a complete academic pariah. So, how are we supposed to treat him? In your book, I feel like you do a balancing act. On one hand you’re mentioning that, “Hey, a lot of these theories don’t work out so well.” But on the other hand, you don’t go all the way in terms of truth telling; I think a lot of people still don’t know just how exposed his research and his deception were. So, what about this need for truth telling, and how do we draw the line between when we need to be a little bit more tactful and when we really need to stand up and say, “Wait, this is the way things are, here are the facts that I’m basing that on, and I’m willing to withstand whatever blowback I get from that”?
Carol: Well, first of all, boy, I thought we were pretty strong. We weren’t talking about Freud, we were talking about Freudian ideas that have bubbled up in two ways. One was the completely unscientific way that Freud proceeded: here’s something I believe; everything I see confirms what I believe; everything you tell me about why you don’t think I’m right is just evidence that you’re not seeing what I see, and that just shows how bad a perceiver you are. I mean, there was nobody less scientific in that respect than Freud, which I thought we were pretty clear about.
Alex: No, to be fair, you were. But I mean, to go one step further, it’s now been revealed that the way he went about promoting those ideas was to fake…
Carol: He was a self-promoter.
Alex: He faked the data, right? I mean, he generated case studies that were completely fictitious. As a matter of fact, when his patients were getting ready to leave him, saying none of his stuff was working, he turned around and wrote these case studies claiming, "I've cured these people, and here's all this evidence."
Carol: That’s a lie.
Alex: Yes, that's a lie. Imagine that from a modern-day scholar. It wouldn't be, "Well, he still had some interesting contributions, although these theories didn't work out." It would be, "No, we need to really set this person aside and hold him up as exactly the opposite of how science is supposed to work."
Carol: Right. Well, I think there might be a couple of issues that you're raising here. Our book really is not about Freud – we talked about Freud's way of proceeding very briefly as an example of a non-scientific approach, but then we're really hard on the Freudian notion of repression when we talk about recovered memory.
Alex: Sure, and I don't want to get us off track. Here is my point: would you feel comfortable – and maybe you do – but being a psychologist, and being in a community that still has some very tender, exposed feelings about Freudian theory, I just have to feel there's a certain tension you must feel about completely laying it on the line, like the kind of conversation we're having here. So let's take it away from that specific example. What do you do when you feel that tension between "I need to kind of pull back here a little bit to be tactful" and "this is information that needs to come out, and I really can't worry too much about whose feelings are going to be hurt; if it's a taboo subject, I just need to do some truth telling"?
Carol: That’s a terrific example and I have done both in my lifetime because I feel that Freud has done enormous harm. I agree with Sir Peter Medawar, who said, “Psychoanalysis was the greatest intellectual confidence trick of the 20th century.” It’s a dinosaur in the history of ideas.
Over the years, I have written and lectured to different audiences with different goals, and the language I use varies depending on what my goal is. In our psychology textbook we have Peter Medawar's quote, along with a contrasting quote saying, "My God, this guy was a genius right up there with Copernicus and Galileo because he unseated human hubris," and we say to students: you're going to encounter both views of Freud as you travel through your lifetime and through your psychology courses, so let us tell you what kind of theory this was to provoke such polarized attitudes, and why scientists in psychology no longer have very much to do with Freud. But you will encounter a lot of his ideas about repression and denial and this thing and that thing, so you need to know where the evidence is for this, and you need to know what was so compelling about Freudian theory that it lasted for so long.
So, that is a different goal from, for example, what I wrote about the recovered memory movement for the New York Times Book Review, when I reviewed The Courage to Heal, Secret Survivors, and all of those books and said, "These books are scientifically illiterate." They follow Freudian notions that are completely discredited and wrong. They're wrong about repression, they're wrong about memory, and they're wrong about trauma. They wouldn't pass a Psych 101 course. So, you can see that the language I use and the passion I bring to the discussion will vary based on the audience.
In the case of our book, Mistakes Were Made (But Not by Me), we wanted to focus on what happens when some therapists, adopting a notion of repression or ideas about children's testimony and so forth, don't submit those beliefs to the evidence and continue perpetuating ideas that can have such devastatingly harmful consequences, and I think we're pretty strong about that. But our goal is not to make people feel stupid for having these beliefs but to bring readers along. If you're a reader who doesn't know much about Freud, and you have a kind of vague idea that repression is what we do when we have a traumatic experience, then it's our job as educators to bring all of our readers along in the conversation and not to immediately alienate people who come from a Freudian tradition, or who believe some of these things but haven't thought about them very seriously.
Alex: But isn't that just exactly the kind of side-stepping or balancing act that gets us into trouble whenever we talk about important areas where science and the social or public misunderstanding of science come into play? I mean, whether you want to talk about creationism, or Darwinism, or any topic like stem cell research where the science seems to be clearly on one side or the other, these balancing acts seem to just kind of perpetuate more misunderstanding.
Carol: Okay, let's be really clear about the balancing act. I am absolutely with you on this. There are no two points of view about creationism. "Oh, let's do a little creationism here and a little evolution there." On that one, no. And I entirely agree, in fact, with Richard McNally, a clinical scientist at Harvard who has written a spectacular book on repression and memory and so forth. He said, "You know, on this recovered memory business, it's not like it's a little bit right here and a little bit wrong there." This is not a view where we're going to take two circles and find the oval where they overlap or something. One side is right and one side is wrong. Either it is possible to repress the memory that your father raped you every day for 16 years, only you forgot it until you went into therapy, or it's not. I entirely agree with that. I don't think in my career I have been wishy-washy about issues in which the science was clearly to one side. Homosexuality is not a disease, it is not an aberration, it is not a mental illness, even though you can probably still find a psychiatrist somewhere who will tell you that it is. So, I think it is important for scientists in particular to take public stands on issues on which there really is no scientific debate.
Alex: Well, I just think you certainly are unashamed to take a position. I love that; I think that's very refreshing in the book. It's also very challenging, because if you do have beliefs that differ, it can sometimes butt up against that. But I think that's part of the discourse. The problem is that, as the point you just made suggests, there are very, very few issues where the science is so clear. So while we can agree that creationism has no scientific basis – and I do, completely – I don't think we'd agree on whether the fundamental concept of a non-living universe, on which we base everything, is really scientifically proven; or the existence of an afterlife, or of some intelligent force that might have some effect on our lives. I bet all of those are still open to scientific inquiry, and the problem – I guess I'm going to jump to the other side of the issue – is that when we are in the mode of standing up and saying, "Wait, this is truth telling; this is what the science really says," we have to, as your book cautions, be really, really super open to disconfirming information, and that's just a very hard balancing act to manage.
Carol: You bet it is. That's the point. I think our goal is to teach people a little humor and humility; it goes a long way in these debates. You asked before what we can do about these things. I think so often of family rifts, where one side sees the other as the perpetrator and themselves as merely the victim responding to what the perpetrator did, and so forth. These rifts can then go on for years and years, with each side feeling increasingly more righteous and neither side being able to see what might have been right in the other side's point of view.
Well, this is an example of something people can take away with them. If you are willing to put down the burden of being right – just put it down for a moment – you can think about whether the other person might have something to contribute here, and ask: why am I perpetuating this quarrel? Is it important? Do I really need to? What's my motive for this vindictiveness, or for keeping this going? Does it really matter? It can be a very liberating thing to say, "You know, I was wrong. I harmed my patient. I put the wrong guy away. I insulted my mother, my child, my cousin, and I didn't need to." I could actually go to them and say, "I'm so sorry, I was wrong. What can I do to make amends?" When people do that, they find it is often not as onerous and terrifying and scary and depressing as they anticipated it would be. Doctors, too, have learned this. You think it's such a horrible thing to go and say to a patient, "Boy, I made a bad mistake." But the patient is craving honesty from the doctor, as our country is craving honesty from its leaders, as our spouses are craving honesty from us. If you say to your spouse, "Boy, honey, you know that fight we've been having all these years about my memory of that thing? You were right and I was wrong," is the spouse going to be unhappy with you? I don't think so.
Alex: Yes, but only if you mean it.
Carol: Only if you mean it.
Alex: I tell you what, I think that's probably a great place to leave it. I very, very much appreciate the contribution the book makes. I think it's a wonderful book and I highly recommend it to anyone. Again, that's Mistakes Were Made (But Not by Me). Thank you, Dr. Tavris, for joining us today on Skeptiko.
Carol: Thank you, Alex. It’s been a pleasure.