As Felicity Barringer reports at NYT’s Green blog, scientific evidence reinforces something that communicators have long understood as a guiding rule of our craft: “[Humans] are hard-wired to respond to external or internal information with emotion and instinct first and cognition second. With emotion and instinct more and reason less.”
In other words (and this is hardly the scientific way of putting it), we react to information with our heart or gut more than our brains. Naturally, this shapes how people respond to our messages about policy solutions and new technologies.
For example, reporting another story for the Times, Barringer ran into opposition to smart meter technology from an unlikely mix of Tea Party conservatives and left-leaning individualists in California, many of whom are convinced that the meters’ “radio frequency radiation” poses a health threat.
But there’s absolutely no scientific evidence of a link to health problems. The predicament led Barringer to ask:
How, in a rational society, does one understand those who reject science, a common touchstone of what is real and verifiable?
The absence of scientific evidence doesn’t dissuade those who believe childhood vaccines are linked to autism, or those who believe their headaches, dizziness and other symptoms are caused by cellphones and smart meters. And the presence of large amounts of scientific evidence doesn’t convince those who reject the idea that human activities are disrupting the climate.
Barringer looked to David Ropeik, an instructor at the Harvard University Extension School and author of the book "How Risky Is It, Really?", for an explanation. (And, yes, there is some irony here: he uses peer-reviewed science to explain the limits of peer-reviewed science as a persuasive tool.)
According to Ropeik, there are several reasons humans are born science skeptics.
First problem: As discussed, the brain's amygdala, where fear starts, processes information before the cortex, the seat of reason, does. Back to my layman's terms: our gut kicks in before our rational mind. (Andrew Revkin calls this our "inconvenient minds.")
Second problem: Loss now is more powerful a motivator than risk of loss—or even gain—down the road. “There is the time element when it comes to being averse to loss. If a risk is down the road, we see it with rose-colored glasses. ‘It won’t happen to me.'” This principle applies to cigarette smokers, or texting drivers, or just about anybody trying to fathom the likelihood of global climate change. “But,” Ropeik says, “when something is more immediate, the risk side of the equation carries more weight.”
The third problem (and Ropeik says this is the cutting edge field of research into risk perception) is that our opinions—about science and a host of other worldviews—tend to match what is consistent with how our “tribe” operates. It’s a simplification, but Ropeik divides these “tribe” cultures into four major groups, each with a particular way of organizing, experiencing, or perceiving society and its operation. It’s called “cultural cognition” by social scientists.
Two of the groups involved, he said, are simply characterized: individualists (most people would call them libertarians, who want the government to butt out) and communitarians, the two poles on the political spectrum. The two other groups, he said, are called hierarchists and egalitarians. “Hierarchists like the status quo, class, old money,” he said. “They like a nice predictable social ladder with rungs on the ladder. Egalitarians don’t want any rungs.”
In other words, guided by these underlying cultural commitments, different people will cherry-pick symptoms and facts to fit an ostensibly rational argument. (For similar reasons, they, and we, probably also consume media that backs up our predetermined opinions and worldviews.)
Another researcher, Dan M. Kahan of Yale Law School, describes cultural cognition this way in an article called "Fixing the Communications Failure": "People endorse whichever position reinforces their connection to others with whom they share important commitments. As a result, public debate about science is strikingly polarized." So there is a strong emotional pull to reject information that drives us away from our community or "tribe."
What do we do with this information?
Dumping more facts on people to convince them of scientific consensus is probably NOT the answer. Why? Kahan points out that groups with opposing values often become even more polarized, not less, when exposed to scientifically sound information. Yep. We actually build up our defenses and entrench our opinions when confronted with new information that threatens our worldview!
So, what if someone from within our community or “cognitive culture” is the messenger for new information—even if it conflicts with our worldview? That’s a good start. Kahan and his colleagues have found that the experts whom laypersons see as credible are ones whom they perceive to share their values.
Are there tips for communicators, journalists, scientists, policymakers and the like for combating this? Kahan and his colleagues admit that research on how to control cultural cognition is less advanced than research on the mechanisms behind it. Nevertheless, they offer two techniques of science communication that may help.
First, understand people's cognitive culture, and tap into the core values that drive it. As Drew Westen would say (and as we've said before, again and again), go for the gut. Here's Kahan:
For instance, people with individualistic values resist scientific evidence that climate change is a serious threat because they have come to assume that industry-constraining carbon-emission limits are the main solution. They would probably look at the evidence more favourably, however, if made aware that the possible responses to climate change include nuclear power and geoengineering, enterprises that to them symbolize human resourcefulness. Similarly, people with an egalitarian outlook are less likely to reflexively dismiss evidence of the safety of nanotechnology if they are made aware of the part that nanotechnology might play in environmental protection, and not just its usefulness in the manufacture of consumer goods.
Second, make sure that sound information is vouched for by a diverse set of experts. In more than one experiment, polarization on scientific information was substantially reduced when people encountered advocates with diverse values on both sides of the issue. “People feel that it is safe to consider evidence with an open mind when they know that a knowledgeable member of their cultural community accepts it.”
As Kahan points out, this isn’t about more sophisticated “spin” or better marketing for science (though good marketing wouldn’t hurt). Rather, the idea is to create the best possible context for the most open-minded, unbiased consideration of the best available scientific information—across cultures of cognition.
Photo courtesy Clarita, MorgueFile.
Hey Anna,

Nice work on this one. It's critical that more people come to understand the role of human cognition in shaping political and cultural outcomes. Those of us in the field advocating for this work are enjoying the outpouring of new research supporting what we do.

Best,
Joe Brewer
Founder & Director
Cognitive Policy Works
This is good as far as it goes, but ignores one other important reason for science-doubting. People who’ve followed any breaking issues in science more than a few years have seen numerous examples of science topics that are first derided by the established experts, then debated (sometimes for years), then finally accepted as part of the new paradigm (old example: the dangers of DDT; new example: new acceptance of the downsides to BPA). When someone knows the current science model may be adjusted or reversed with further studies they’re less likely to wholeheartedly adopt a new finding.
Since scientific research is generated mostly by males, I am leery about it until I really investigate who bankrolled the research, for example, drug companies. The mega media corporations understand what this article is all about. They understand that, particularly in an economic downturn, people react with their emotions. It is the ability to sense the feelings of others that makes companionship, and eventually coupling into close partnerships, successful. With so many people escaping to Facebook, we have lost the face-to-face communication that tells all.
Great work Anna!

I think it's also important to remember that science is, by nature, in a constant state of development and refinement. What is considered hard scientific fact today may very well be disproven tomorrow.
An interesting case study of how contradictory scientific evidence is ignored can be seen in Victoria, BC, with the long-standing sewage plant issue. Advocates, environmental lawyers, and some journalists have assumed that Victorians are negligent in maintaining our marine-based sewage treatment system. However, scientists such as Dr. Peter Chapman, an editor of the peer-reviewed journal Marine Pollution Bulletin, have done a superb job of dissecting the unscientific basis for criticising our sustainable marine-based sewage treatment system: https://sites.google.com/site/sewageplantsvictoria/Home/science_politics_ideology_Chapman1.pdf
I find the quote, "But there's absolutely no scientific evidence of a link to health problems" interesting (AND non-scientific).

Interesting because there was no (accepted) scientific evidence of harms from coal-mine dust, asbestos, radium, tobacco smoke, etc., etc., etc. ...until there was... For that matter, emotional positions are arrived at in some way or other, too; they might be less valid on average than a thorough, well-done scientific approach, but that doesn't mean they are necessarily inaccurate (particularly relative to *poorly* done science).

Non-scientific because there *IS* scientific evidence of a link between health problems and EMFs/EMR, and that body of research is growing steadily. There might not be a link to SmartMeters, per se, but the general properties are linked.

So I agree with Barringer: "What gives?" What gives, when someone claiming to act as an authoritative evaluator of science evaluates information in a non-scientific way?

What if people aren't BORN science skeptics? What if they become skeptics of "science" because of the limits of science and because of all the poorly done "science" out there?
This is a thoughtful piece; however, our thinking on these topics seems to overlook the formidable body of work generated in clinical psychological contexts, work that recognizes the power of emotions, and specifically affect. Affect here concerns the (largely) unconscious energetic responses we may have, i.e., anxiety or fear, that may not even relate to the specific content at hand. In other words, it is about what these topics may mean, and what associations they generate.

We've known for a very, very long time that humans have creative strategies for managing uncertainty and, particularly, anxiety. These strategies may be called defense mechanisms; this is no longer solely the domain of psychotherapy but has been extensively researched in evidence-based studies. What we know is that anxiety, fear, etc. can distort our perceptions of data or information. Hence a possible "distrust" of science.

It seems that until we can take account of these powerful dimensions, which are totally irrational and yet have a profound logic in the psychic sense, we shall be spinning our wheels and not addressing the roots of why we continue to avoid confronting what we must. There are good reasons. It's time to think thoughtfully about these reasons, and perhaps bring in the clinical psychotherapists and practitioners, who work on the front lines with people, addressing resistance, defenses, and all the myriad ways we avoid facing what is true.

Renee Lertzman
www.reneelertzman.com
Well, well, what a load of crap this article was. First there is mention of resistance to new (untested) technology, i.e., "smart" meters. Then an abrupt turn toward discrediting the skeptics takes center stage, wrapping up with a general put-down of those (unwashed?) critics.
No science? Only emotion? Bah! You grossly overlook the "canary in the coal mine." Simply put, the birds have left. Why? The RF waves put out by the strong, pulsed "smart" meters mess up their navigation systems.