06 March 2008

Ignorance and trust

The reliably fascinating Science Show had some interesting comments about how people evaluate things that they know nothing about. In this case, the example is nanotechnology. Interestingly, according to Dan Kahan:
(A)lthough most people don't know very much about nanotechnology, they're still pretty opinionated about it.

Wait. How can someone hold an opinion on something they know nothing about?
What we found is that that very quick visceral reaction was driven a lot by emotions. So just the term 'nanotechnology' or even a very brief description of it can give somebody an initial sense of whether it's risky or beneficial.

So people are making snap judgments based on very limited information.
(A)s people start to learn about nanotechnology they don't form a uniform opinion. In fact they become culturally polarised. There's a body of research and cultural cognition is the mechanism that describes the phenomenon that shows that people tend to conform their beliefs about risks to their values. So if you're somebody who likes commerce and industry and private initiative you tend to be very sceptical about environmental risks. If you're somebody who believes that commerce and industry does bad things and creates inequality, you'll embrace findings of risk. We found that people who have values like that, when they're exposed to even just a little bit of information about nanotechnology they divide along those lines.

And the thing that interests (and scares) me most is how people view expert opinion.
We did an experiment where we created fictional experts, and we found that people, just by looking at them and by reading a mock CV, would impute to them values just about how society should be organised. Then we assigned to those advocates positions on nanotechnology just randomly -- suspend it pending more research on risk, allow it to continue pending more research on risks -- and we then saw how people reacted to the arguments of these fictional experts. *It turned out that people would adopt whatever argument on nanotechnology was being advanced by the experts whose values were closest to theirs.*

Emphasis added. And a big factor, which the piece talks about later, is religiosity. You don't have to look very hard for evidence of that in America. They call it the culture war.

This, to me, is a very depressing set of facts, but Dr. Kahan did leave me with a little bit of optimism:
(Y)ou shouldn't just assume that people are going to form beliefs about nanotechnology that match the best scientific understandings out there. In the normal course they're going to form beliefs that fit their cultural predispositions but then don't assume that's inevitable. In fact it is possible to devise communication techniques that can help to counteract that bias.

We're left hanging on what those communication techniques are. They're going to be hard to figure out, and increasingly so, as people can more easily avoid information that conflicts with their worldview.
