Remember those old anti-drug television commercials with an egg sizzling in a frying pan? Here’s a new twist: This is your brain. This is your brain on climate change.
I’ve written before that, with or without multimillion-dollar campaigns to discredit climate science (and scientists), our brains don’t seem well equipped to fathom the scope or urgency of global warming: we are chronic optimists, fear shuts us down, we cherry-pick information that confirms what we already believe and reject information that challenges our take on reality, we tend to trust messengers who share our worldview and distrust those who don’t, and so on. To be clear, I do not point this out to excuse our inaction. Rather, I aim to arm us with an understanding of the brain so we can communicate more effectively about the problem and the myriad benefits of climate and energy solutions. To that end, I recommend Michael Shermer’s Scientific American article, “The Believing Brain.”
Drawing on extensive research for his book about how the brain constructs beliefs and reinforces them as “truths,” Shermer’s article also doubles as another primer on your brain on climate science.
Here are the basics as Shermer describes them:
We form our beliefs for a variety of subjective, emotional and psychological reasons in the context of environments created by family, friends, colleagues, culture and society at large. After forming our beliefs, we then defend, justify and rationalize them with a host of intellectual reasons, cogent arguments and rational explanations. Beliefs come first; explanations for beliefs follow. In my new book The Believing Brain (Holt, 2011), I call this process, wherein our perceptions about reality are dependent on the beliefs that we hold about it, belief-dependent realism. Reality exists independent of human minds, but our understanding of it depends on the beliefs we hold at any given time.
According to Shermer, “once we form beliefs and make commitments to them, we maintain and reinforce them through a number of powerful cognitive biases that distort our percepts to fit belief concepts.” Here they are, in his words:
- Anchoring Bias. Relying too heavily on one reference anchor or piece of information when making decisions.
- Authority Bias. Valuing the opinions of an authority, especially in the evaluation of something we know little about.
- Belief Bias. Evaluating the strength of an argument based on the believability of its conclusion.
- Confirmation Bias. Seeking and finding confirming evidence in support of already existing beliefs and ignoring or reinterpreting disconfirming evidence.
Shermer also describes something called “in-group bias,” in which we place more value on the beliefs of those we perceive to be fellow members of our group and less on the beliefs of those from different groups. (Along these lines, we are apt to “demonize and dismiss” out-group beliefs as “nonsense or evil, or both.”)
Finally, there’s the bias blind spot, which Shermer describes as “the tendency to recognize the power of cognitive biases in other people but to be blind to their influence on our own beliefs.”
No one is immune.