
Wednesday, February 2, 2011

Felicity Barringer, NYT: Are We Hard-Wired to Doubt Science?


Deborah Tavares, with a sign protesting smart-meter installations, in Sebastopol, Calif. (Photo: Annie Tritt for The New York Times)
In researching Monday’s article about opposition to smart meters, I found myself once again facing a dilemma built into environmental reporting: how to evaluate whether claims of health effects caused by some environmental contaminant — chemicals, noise, radiation, whatever — are potentially valid? I turned, as usual, to the peer-reviewed science.
But some very intelligent people I interviewed had little use for the existing (if sparse) science. How, in a rational society, does one understand those who reject science, a common touchstone of what is real and verifiable?
The absence of scientific evidence doesn’t dissuade those who believe childhood vaccines are linked to autism, or those who believe their headaches, dizziness and other symptoms are caused by cellphones and smart meters. And the presence of large amounts of scientific evidence doesn’t convince those who reject the idea that human activities are disrupting the climate.
What gives? One explanation comes from David Ropeik, a recovering journalist who is an instructor at the Harvard University Extension School and the author of the book “How Risky Is It, Really?”
He uses peer-reviewed science to explain the limits of peer-reviewed science as a persuasive tool.
The cerebral cortex and hippocampus/amygdala (in red) in a normal brain. (Image: Dr. Martha R. Herbert, Massachusetts General Hospital)
Humans, he argues, are hard-wired to reject scientific conclusions that run counter to their instinctive belief that someone or something is out to get them.
Here, slightly edited and condensed, is Mr. Ropeik’s explanation of the role of neuroscience, psychology and anthropology in creating this societal cognitive dissonance about peer-reviewed science. (Or, as my colleague Andrew Revkin says, why we have “inconvenient minds.”)
The assumption that there is a single truth to know that the scientific method can bring us — or a useful truth we will all subscribe to — overlooks large bodies of science showing that there is no such thing as a fact. We are subjective analyzers of data. You and I and everybody in that story will look at “the facts” — no matter how peer-reviewed and scientifically robust they may be — through the lenses that evolution has given us for our survival.
First, the way information comes into and is processed by the brain is part of this. It’s processed sooner by the amygdala, where fear starts, than by the cortex, the seat of reason. We are hard-wired to respond to external or internal information with emotion and instinct first and cognition second; we respond with emotion and instinct more and with reason less.
In the case of radiation, it is invisible. It is a risk about which we have no immediately usable information to protect ourselves. Look at the risks that are scary — chemicals, pesticides, radiation. We are uncertain, and that scares us, because we have less control over what we can’t detect. Even if you had a Geiger counter, the information it gave you would still be partial; most of us would still be a couple of degrees short of knowing what it meant. Meanwhile, your amygdala is screaming: Alert! Alert!
Second, there is the time element when it comes to being averse to loss. If a risk is down the road, we see it through rose-colored glasses: “It won’t happen to me.” Think of smokers, of drivers using cellphones, or of people in Manhattan contemplating something like 9/11. But when something is more immediate, the risk side of the equation carries more weight.
Third — and this is the cutting-edge field of research into risk perception — we tend to sort ourselves into four major groups according to how we want society to be organized and to operate. You and I tend to conform our opinions about the validity of science to match what would be consistent with how our tribe operates.
Two of the groups involved, he said, are simply characterized: individualists (most people would call them libertarians, who want the government to butt out) and communitarians, the two poles on the political spectrum. The two other groups, he said, are called hierarchists and egalitarians. “Hierarchists like the status quo, class, old money,” he said. “They like a nice predictable social ladder with rungs on the ladder. Egalitarians don’t want any rungs.”
Based on their remarks, he said, some of the smart-meter opponents are a blend of egalitarian and communitarian. “They don’t like new technology,” he explained, and they are bothered by an economic status quo that produces things like smart meters.
“They believe that society would be better if it stood up more to the hierarchist status quo,” he said. “When something that represents that status quo comes along, there is a cultural resistance to it. That is the underlying cultural reason they will cherry-pick their symptoms and the facts into their ostensibly rational argument against smart meters.”
The science on which Mr. Ropeik bases his conclusions includes the work of Joseph LeDoux, a neuroscientist at New York University and the author of “The Emotional Brain: The Mysterious Underpinnings of Emotional Life,” and of Paul Whalen, an associate professor of psychology and brain science at Dartmouth who maintains a Web site called “The Whalen Lab, An Affective-Cognitive Neuroscience Laboratory.”
Those who developed the theory of cultural cognition, including scholars at Yale who are writing about the public reception of climate science, convene at the Cultural Cognition Project’s Web site.
The literature on the psychology of risk perception is cited in a chapter of Mr. Ropeik’s book.
Now, why do I think that not everyone is going to agree with him… ?

1 comment:

dk said...

this is a very interesting post. I'd say, though, we are not hardwired to doubt science. in fact, we are *amazingly* good at believing it: think how many people take antibiotics, refrigerate food, etc., all w/o having done any of the experiments themselves. What we *are* "hard wired" to do (actually, that is a very hackneyed way of putting things) is to trust people with whom we have a cultural affinity. That's what makes transmission of scientific knowledge possible. It is also what sometimes leads to public *conflict* over science -- in those cases (they actually are statistically *rare*) in which the diverse communities of trust we belong to end up at odds.