March 7, 2013 — In 1784, philosopher Immanuel Kant penned a now-famous essay titled “What Is Enlightenment?” He called out people’s tendency to blindly follow thought leaders. “If I have a book that thinks for me, a pastor who acts as my conscience, a physician who prescribes my diet, and so on — then I have no need to exert myself. I have no need to think, if I can only pay; others will take care of that disagreeable business for me,” he wrote. His thesis for the essay became a rallying cry for the era. “Sapere aude! (Dare to know.) ‘Have the courage to use your own understanding,’ is therefore the motto of the enlightenment.”
Kant’s take on the way people think — or don’t — has been borne out by decades of research. It turns out our minds use shortcuts, or “heuristics,” when making decisions and forming judgments. “If we’re buying toothpaste, we go by price or brand. We don’t sit there and read the label for 20 minutes,” says University of Wisconsin-Madison communications professor Dietram Scheufele. Similarly, most people have formed opinions on what to do about climate change without having reviewed the scientific peer-reviewed literature.
“We trust scientists, we trust a political party,” Scheufele says, because we don’t have time to dig into what someone else spent 20 years researching. It’s rational, he says, for people to act as “cognitive misers” in a world that contains far more information than one can personally process.
Problems arise, however, when such shortcuts work against our own interests.
Anchors and Aversion
In the 1970s, psychologists Daniel Kahneman and Amos Tversky identified a couple dozen “cognitive biases” people consistently display that fly in the face of the rational decision-maker assumed by laissez-faire and neoclassical economics. Just as humans are prone to optical illusions, Kahneman and Tversky found, we also consistently make certain errors of judgment and thinking, and our behaviors follow suit.
One example involves a concept known as anchoring, which holds that people place greater emphasis on the first information they receive about something, even when later information contradicts it. Consider these individuals:
Alan: intelligent, industrious, impulsive, critical, stubborn, envious.
Ben: envious, stubborn, critical, impulsive, industrious, intelligent.
Most people think more highly of Alan than of Ben, even though the same adjectives were used for both, because for Alan the positive adjectives come first. Similarly, research shows that job interviewers make up their minds about interviewees within moments of meeting, then look for facts and ask directed questions that support their initial opinion.
Additionally, people place more weight on a single personal anecdote than on data — which is why, for example, media reports of one individual who died or was cured lead to strong and potentially incorrect opinions about a medical procedure.
Kahneman and Tversky also studied loss aversion, the tendency to weigh losses more heavily than equivalent gains. People would much rather avoid a $5 charge than receive a $5 discount, and consumer choices reflect this.
A recent study by Toby Bolsen and James Druckman, professors of political science at Georgia State University and Northwestern University, respectively, found that facts were of limited value to people who had formed preexisting opinions about genetically modified foods and nanotechnology. Participants’ initial opinions were not only influenced by their political views, but also colored their subsequent evaluation of new facts. When participants were told about studies showing that GM beets harmed biodiversity or that carbon nanotubes injected into mice produced adverse health effects similar to those of asbestos, the new information did not change their previously held opinions, whether positive or negative. Their prior opinions even shaped how they gauged the quality of the studies themselves.
Psychologists call this “motivated reasoning”: our preformed opinions distort our evaluation of new information. “This tendency becomes more pronounced as people become more knowledgeable,” says Bolsen. Not only that, studies have shown that people become more, not less, entrenched in their opinions when presented with information that opposes their views.
Solution: Slow Thought
Kahneman summarized his work in the 2011 book Thinking, Fast and Slow. Just as “slow food” involves thoughtful meals crafted from sustainable ingredients and stands in contrast to the industrialization embodied by “fast food,” slow thinking involves deliberate, reflective and critical assessment of one’s own beliefs rather than reliance on largely unconscious thought processes. Kahneman argues that by understanding our inherent biases and questioning our intuitive beliefs, preferences, impressions and feelings, we can limit the damage of bad judgments and decisions. Critical thinking, another term for slow thinking, is defined as “the mind continually engaged in states of self-direction, self-discipline, and self-command,” according to the Center for Critical Thinking.