Professor of Law
Cultural Cognition Project, Part 2
Recorded June 10, 2010
Joe: What I was going to do, Dan, would be to quiz you a little bit on the methods and the findings from the 2007 paper, and the more recent 2010 one, that both relate to global warming or climate change.
Dan: We have a paper, a study on perceptions. How might cultural cognition influence perceptions of scientific consensus? I mean, this is often something that people remark on – How can it be that the people who disagree with me on this issue, climate change, or what have you, are ignoring scientific consensus? And, you know, our hypothesis was that, well, maybe people aren’t really ignoring scientific consensus. Scientific consensus is a fact that you have to collect evidence on or accept the words of others about, just like any other kind of fact. In the process of forming impressions of scientific consensus, then your beliefs are going to be influenced by your values in exactly the same way as any other kind of belief that you might form.
So what we did was an experiment, where we…Well, there were two parts to it, actually. One part was a correlational part, where we just took a large sample of people. And after we measured their values and characterized them in the way that our scheme says, we asked them what their positions were – or, rather, what they believed most scientists thought – about a variety of issues. The issues included climate change. You know, so is global warming happening? Is it being caused by humans? That was one issue. Another issue was whether the disposal of nuclear waste in deep geologic isolation was something that could be done safely and effectively. And then there was another one about whether permitting private citizens to carry concealed weapons in public increases crime, or maybe even decreases it because it actually discourages violent people from engaging in predation.
Now we picked these issues because, in fact, the National Academy of Sciences has issued expert consensus reports, they call them, on all of these issues. But we also knew that these were issues, based on previous studies, on which people who have different values tend to have different beliefs.
People who have relatively hierarchical values and individualistic values, they tend to be rather skeptical about claims of environmental risk. The reason is that they perceive that the widespread acceptance of those claims would justify restricting commerce and industry. And those are activities that are important to them, culturally and otherwise. So they have some motivation to resist that.
People who are more egalitarian and communitarian, on the other hand, they tend to be morally suspicious of commerce and industry. They see those as sources of unjust social disparity, right? Or as outlets for untoward self-seeking. And in previous studies, we’ve observed correlations and we’ve done experimental work to show that people with those kinds of views are going to form risk perceptions on climate change that are consistent with those predispositions.
Now we then asked people, well, what are your perceptions of what most scientists believe? And, of course, most egalitarians and communitarians believed that there was overwhelming consensus, among scientists, that climate change is happening and that humans are causing it. That was something that the people who had hierarchical and individualistic views were much more ambivalent about. They didn’t think that a majority of scientists believed that.
I mean, the idea is that people…Nobody is saying, screw scientific consensus. People are saying that, well, what I believe is consistent with scientific consensus. But they’re forming different views of that, that are consistent with their values.
Now the second part of the study was an experimental one, where we tried to show a mechanism for this. And here what we did was we asked the subjects to tell us whether they thought that a fictional scientific expert – somebody who was described as having very respectable credentials: trained at an Ivy League university with a Ph.D. in the relevant subject matter, on the faculty of an elite university, many publications and so forth, member of the National Academy of Sciences – was an expert on the issue, either climate change, nuclear waste disposal, or gun control.
But what we did was, in addition to showing them the CVs of these people, we would show them an excerpt…a short excerpt from a book the expert had written. And here we could experimentally alter…manipulate, as they say, the conclusion that we attributed to the person.

Sometimes, that person would say high risk, sometimes low risk. And our hypothesis was that the subjects’ perception of the pictured author as an expert would be conditional on the fit between the position the author was taking and the subject’s own cultural predisposition. And we found that that was overwhelmingly true.
All right, so what happens is that people, when they observe putative experts making claims about these issues, are much more likely to actually credit the person as being an expert when that person takes a position consistent with their own cultural predispositions than when the person doesn’t. And if that happens systematically, well, then you’re going to end up with systematic differences across people with different values on what most scientists believe. Right? They’ll kind of do a little mental survey.
I say to you, what do you believe about climate change? You’re not a Luddite. You kind of call up a mental image of all the scientists you’ve known who’ve taken a position on this issue, and you count how many on one side and how many on the other. Magically, they’re all on one side, the side that fits your predisposition.
Well, the reason is you’ve excluded the ones who took the other position from your sample. Right? Not consciously, but because of this dynamic of cultural cognition. That will lead, I mean, even in a society where people have a lot of confidence and trust and agree we should be basing policy on good science, to persistent cultural conflict on issues that are amenable to scientific investigation.
And just one last point about this. We picked those issues out because we think that there’s a tendency, no matter what people’s cultural predispositions are, to think that the people who disagree with them are just somehow against science or oblivious to what scientists are saying. This study shows that this is a ubiquitous phenomenon. It happens to everybody. So everybody ought to see that that’s a problem. Everybody ought to have a common interest in resolving it. And maybe everybody ought to stop the kind of recrimination – accusations about stupidity and lack of reason – that oftentimes get involved in these debates. And that makes it a lot harder.
Joe: Sure it does. It was interesting, I was talking to a room full of climate scientists, primarily. And they said, you know, those science skeptics just don’t understand the science. They just aren’t smart enough to understand. And so therefore, they listen to “know-nothings” like Rush Limbaugh and they get their opinions from him. Right?
Dan: Um hmm.
Joe: So I think you just already answered that question in your last paragraph. Anything else you’d like to say about it though?
Dan: Well, you know, they’re right that people aren’t smart enough to understand it. But the people who agree with them about their finding on climate change, they’re also not smart enough to understand it.
Al Gore is not smart enough to understand it. Al Gore did not go to the North Pole and stick a thermometer in the ice. If he had, somebody would have had to tell him what that meant. Al Gore, like the very people who disagree with him, is coming to his opinions in the same way, by following the processes he’s familiar with for identifying who he can believe about what. And there’s no choice. We have to do this.
Now here’s the point. There’s nothing wrong with that. We’re extremely good, as a species…We’re all experts at familiarizing ourselves with, and having a facility in applying, the kinds of techniques and heuristics – cues – that we use to know who knows what they’re talking about and who can be relied on. Right? So that works really well. If it didn’t, we’d all be dead. And it works so well we’re able to accumulate vast stores of knowledge and build it up over time…because nobody can start from the ground level and build up.
The problem is, what happens when the necessary diversity of systems that we’re using to do that, all of which tend to work, for whatever reason, start to work at cross purposes? How does that happen? And what can you do about it? But to say that the other side is stupid, that turns out to be one of the things you shouldn’t do about it. And this is really a basic mechanism – that people react in a dismissive way, if they associate a certain kind of factual claim with an implicit threat on their identity.
I know what I know because I know whom to trust. If you tell me I’m wrong, you’re telling me the people I respect are stupid. And if you say that directly – you say, well, your side is devoid of reason – then, actually, the consequence of my assent, or others’ assent, to that position is a really kind of humiliating defeat. That’s not something that you want to happen. You know, it’s not good. It’s not nice. It’s not civil. But it’s also just completely corrosive of creating a climate where the best science is going to make itself felt. We’re very interested in trying to promote a sense of humility. We all have beliefs about these things, but we’re not trying to convince people of our beliefs. We do, actually, want to convince a lot of people – probably ones who agree with us on a lot of issues – that they’re making a misdiagnosis of what the problem is. And there is a problem: this gap between what we know and how much it affects our policy. They’re making a mistake about that.
It isn’t that the other guys are anti-science. It isn’t that they’re stupid. It’s that we’ve come to a misadventure, a bad situation where diverse citizens, who rely on diverse systems of cultural certification, are in conflict. And that’s not inevitable.
Joe: Let me see if I can make a gross overview of the current state of climate communication science in particular. The research like the Six Americas study – I hope you’re familiar with it? – and others which segment the public seem to then focus on crafting messages that speak to values, in some general sense. There seems to be a great deal of attention being given to adroit messaging. I think you’ve already talked about why that would be. Are there downsides to that approach of focusing on messaging?
Dan: You know, a downside? How about the idea that if we just try to sell people on things and kind of make this into a species of advertising – I think that that does debase debate. But it also has an extreme risk associated with it. I keep wanting to mention…[chuckles]…the HPV vaccine debate. We had certain predictions about which kinds of groups would have a certain kind of predisposition on these issues, and that it would, in fact, affect how they reacted to information on one side or the other of the debate when they saw it. But also that people would be very sensitive to what they perceived the moral values of the people giving them information on the question were. And that turned out to be a much more consequential factor – that kind of cue. You know, who’s saying this mattered a lot more. And when people believed that it was a kind of, oh, that side versus this side, they polarized.
Now…[chuckles]…gee, that’s not that hard to figure out. And Merck, the manufacturer of the HPV Vaccine, what they did was, they got the Governor of Texas, Governor Perry, to issue the Executive Order that would have made Texas the first and only state to mandate the HPV Vaccine for school girls, as the CDC was recommending.
He’s a conservative traditionalist, what we would call a hierarchical-individualist. And they’re right that those are the people who are going to be most skeptical about the benefits and most concerned about the risks. Now, gee, you know, if he’s for it, maybe that means it’s OK, you know…Well, when it came out that Merck had given him campaign contributions, right, not only did that alienate the people they were trying to persuade, it made what we call the egalitarian-communitarians suspicious too. Ah ha, corporations are trying to corrupt democracy. That’s one of the concerns that they have. So there’s a problem. You cannot orchestrate things very effectively that way. You vitiate trust.
The better way to do it is to say, well, given that people of all cultural persuasions would recognize that this is a kind of dilemma that they can all face, what kinds of things would they agree – if they weren’t talking about any issue in particular – should be done to structure deliberations and the provision of information to minimize that? To make it as likely as possible that everybody will give considered attention to good information?
Message framing is part of that. Right? And I think message framing, in the service of that, is a sensible thing and also a moral thing. Does it have a downside? I mean, one downside is that this is a pragmatic area and a developing area of science. You know, we might make a mistake. But there’s also the downside of manipulation. And that should be resisted.
Joe: One of the previous interviewees was Baruch Fischhoff, who spoke at some length about his model of risk communication – he and Granger Morgan and others have a book on that – which is pretty much the opposite of the persuasive communication model that we’ve just been talking about. In fact, Baruch calls it “non-persuasive communication”. And I think by that he generally means that the communication is focused on the user, and takes into account in particular that person’s behavioral needs and constraints, and also focuses on the decisions that they need and want to make.
I read an opinion piece of yours from earlier this year in Nature, [and] your last paragraph included the phrase, “We need a theory of risk communication that takes full account of the effects of culture on decision making.” I don’t know how familiar you are with the model of risk communication and decision making that I was just laying out for you. But, in any case, how would you integrate taking full account of the effects of culture into decision making? What would you do?
Dan: Well, the goal is to try to create a climate in which people of diverse outlooks are most likely to recognize and attend in an open-minded way to information. So at that point, they can make a decision based on their values. And there are a number of important things to do. One is to present the information in a way, if you can, that emphasizes consequences – or implications of the information – that are congenial to, consistent with, people’s values, and not only ones that are threatening to them.
The reason that individualists are skeptical about environmental risks is that it seems to entail that you’re going to have to restrict markets and commerce and private ordering. That’s something that they care about. Well, if it turns out that restrictions aren’t the only thing you’re gonna be doing – that you also have to have innovation, you also have to employ private orderings to surmount the problems – you better make sure that that gets prominent consideration. When people understand…and we’ve done a study like this. When individualists are told that nuclear power is a solution, they’re much more open-minded to facts, the kinds of facts that are in the intergovernmental report about climate change, than when they’re told the only answer is restricting emissions, right, or something like geoengineering.
This is true. It’s not going to be enough to control emissions. We’re going to have to rely on all our ingenuity. And we’re also going to have to avail ourselves of all of the energy of private orderings and markets to come up with technical solutions to this problem. It’s not something new. You know, go back to cholera. We used to think that cities could only be of a certain density before people would…Well, they were literally dying in their own waste. But we figured out how to fix that. Here we need to do it again.
And it’s a story that shows that the common attention to this problem, far from being hostile to a vision of what makes human beings excellent, and societies excellent, is going to have to embrace it. So it’s wrong to leave that out. And some people, I think, leave it out on purpose.
Now, another thing to do is use communicators who are culturally diverse. In our HPV study, we showed that when people see that the message…that the risk position they’re inclined to accept is being advanced by the person whose values they share, and the one they’re inclined to reject is coming from the person whose values seem alien to them, that’s when you get maximum polarization. That happens all too often. And it doesn’t need to.
It’s not the case, though, in climate change that there isn’t a broad array of people of diverse values who can make the claim effectively. Don’t brand it. Right? And certainly, then, don’t present the information in a way that makes acceptance of it tantamount to acknowledgment that there’s some kind of incompetence in a group.
Now, you might say, well, this is obvious. But as my grandmother would say, If you’re so smart, why aren’t you rich? [Chuckles] We’ve not been doing this. We’ve been doing almost the opposite!
And finally, this is very important: narrative. People process information in a narrative way. If they can’t understand it narratively, they can’t understand it. What they try to do is say, well, who are the good guys? Who are the bad guys here? What are they fighting about? What’s the lesson of the story? And they draw their conclusions accordingly.
Well, it turns out that there are different kinds of narrative templates across the different cultural types. You can embed the same information in a narrative that is congenial to a type that otherwise might be closed-minded about it, and make it easier for that person to be receptive to it. Right?
These are techniques that are emerging, and that need to be pursued. They should be pursued, certainly, in climate change. And I hope that they’ll make a difference there. We will get through climate change as a problem. And then we’re going to have more of these issues. We’re going to have these issues on synthetic biology, on nanotechnology. These are things that ought to be thought about at the very outset, so that you don’t find yourself in the position of trying to uncross the entangled lines of cultural certification that people use.
Make sure they don’t get that way in the first place, because they tend to work. And people will tend then to converge on the best science.
[End of Interview: 23:20]
Note: This is an accessible version of a document originally produced for the Web in .pdf format. While it contains all significant content of the original print document, it may omit layout and graphic elements which contribute to the look and feel of the original, and make the .pdf version more suitable for printing.
Contact us: firstname.lastname@example.org