As Felicity Barringer reports at NYT’s Green blog, scientific evidence reinforces something that communicators have long understood as a guiding rule of our craft: “[Humans] are hard-wired to respond to external or internal information with emotion and instinct first and cognition second. With emotion and instinct more and reason less.”

In other words (and this is not the scientific way of putting it, BTW), we react to information with our heart or gut more than with our brains. That, of course, shapes how people respond to our messages about policy solutions and new technologies.

For example, reporting another story for the Times, Barringer ran into opposition to smart meter technology from an unlikely mix of Tea Party conservatives and left-leaning individualists in California, many of whom are convinced that the meters’ “radio frequency radiation” poses a health threat.

But there’s absolutely no scientific evidence linking the meters to health problems. The predicament led Barringer to ask:

How, in a rational society, does one understand those who reject science, a common touchstone of what is real and verifiable?

The absence of scientific evidence doesn’t dissuade those who believe childhood vaccines are linked to autism, or those who believe their headaches, dizziness and other symptoms are caused by cellphones and smart meters. And the presence of large amounts of scientific evidence doesn’t convince those who reject the idea that human activities are disrupting the climate.

What gives?

Barringer looked to David Ropeik, an instructor at the Harvard University Extension School and the author of the book “How Risky Is It, Really?,” for an explanation. (And, yes, self-mockery is appropriate here: he uses peer-reviewed science to explain the limits of peer-reviewed science as a persuasive tool.)

According to Ropeik, there are several reasons humans are born science skeptics.

First problem: As discussed, information is processed first by the brain’s amygdala, where fear starts, and only then by the cortex, the seat of reason. Back to my layman’s terms: our gut kicks in before our rational mind does. (Andrew Revkin calls this our “inconvenient minds.”)

Second problem: Loss now is a more powerful motivator than the risk of loss, or even gain, down the road. “There is the time element when it comes to being averse to loss. If a risk is down the road, we see it with rose-colored glasses. ‘It won’t happen to me.’” This principle applies to cigarette smokers, texting drivers, or just about anybody trying to fathom the likelihood of global climate change. “But,” Ropeik says, “when something is more immediate, the risk side of the equation carries more weight.”

The third problem (and Ropeik says this is the cutting edge of research into risk perception) is that our opinions, about science and a host of other issues, tend to track how our “tribe” operates. It’s a simplification, but Ropeik divides these “tribes” into four major cultures, each with a particular way of organizing, experiencing, or perceiving society and its operation. Social scientists call this “cultural cognition.”

Barringer writes:

Two of the groups involved, he said, are simply characterized: individualists (most people would call them libertarians, who want the government to butt out) and communitarians, the two poles on the political spectrum. The two other groups, he said, are called hierarchists and egalitarians. “Hierarchists like the status quo, class, old money,” he said. “They like a nice predictable social ladder with rungs on the ladder. Egalitarians don’t want any rungs.”

In other words, based on these underlying cultural understandings, different people will cherry-pick symptoms and facts to fit into their ostensibly rational arguments. (For similar reasons, they, and all of us, probably also consume media that backs up our predetermined opinions and worldviews.)

Another researcher, Dan M. Kahan of Yale Law School, describes cultural cognition this way in an article called “Fixing the Communications Failure”: “People endorse whichever position reinforces their connection to others with whom they share important commitments. As a result, public debate about science is strikingly polarized.” So, there is a strong emotional pull to reject information that drives us away from our community or “tribe.”

What do we do with this information?

Dumping more facts on people to convince them of the scientific consensus is probably NOT the answer. Why? Kahan points out that groups with opposing values often become even more polarized, not less, when exposed to scientifically sound information. Yep. We actually build up our defenses and entrench our opinions when confronted with new information that threatens our worldview!

So, what if someone from within our community or “cognitive culture” is the messenger for new information, even if it conflicts with our worldview? That’s a good start. Kahan and his colleagues have found that the experts whom laypersons see as credible are ones whom they perceive to share their values.

Are there tips for communicators, journalists, scientists, policymakers and the like for combating this? Kahan and his colleagues admit that research on how to control cultural cognition is less advanced than research on the mechanisms behind it. Nevertheless, they offer two techniques of science communication that may help.

First, understand people’s cultural cognition, and tap into the core values that drive it. As Drew Westen would say (and as we’ve said before, again and again), go for the gut. Here’s Kahan:

For instance, people with individualistic values resist scientific evidence that climate change is a serious threat because they have come to assume that industry-constraining carbon-emission limits are the main solution. They would probably look at the evidence more favourably, however, if made aware that the possible responses to climate change include nuclear power and geoengineering, enterprises that to them symbolize human resourcefulness. Similarly, people with an egalitarian outlook are less likely to reflexively dismiss evidence of the safety of nanotechnology if they are made aware of the part that nanotechnology might play in environmental protection, and not just its usefulness in the manufacture of consumer goods.

Second, make sure that sound information is vouched for by a diverse set of experts. In more than one experiment, polarization on scientific information was substantially reduced when people encountered advocates with diverse values on both sides of the issue. “People feel that it is safe to consider evidence with an open mind when they know that a knowledgeable member of their cultural community accepts it.”

As Kahan points out, this isn’t about more sophisticated “spin” or better marketing for science (though good marketing wouldn’t hurt). Rather, the idea is to create the best possible context for the most open-minded, unbiased consideration of the best available scientific information, across cultures of cognition.

Photo courtesy of Clarita, MorgueFile.