In 1784, philosopher Immanuel Kant penned a now-famous essay entitled “What Is Enlightenment?” He called out people’s tendency to blindly follow thought leaders. “If I have a book that thinks for me, a pastor who acts as my conscience, a physician who prescribes my diet, and so on — then I have no need to exert myself. I have no need to think, if I can only pay; others will take care of that disagreeable business for me,” he wrote. His thesis for the essay became a rallying cry for the era. “Sapere aude! (Dare to know.) ‘Have the courage to use your own understanding,’ is therefore the motto of the enlightenment.”

Kant’s take on the way people think — or don’t — has been borne out by decades of research. It turns out our minds use shortcuts, or “heuristics,” when making decisions and forming judgments. “If we’re buying toothpaste, we go by price or brand. We don’t sit there and read the label for 20 minutes,” says University of Wisconsin-Madison communications professor Dietram Scheufele. Similarly, most people have formed opinions on what to do about climate change without having reviewed the scientific peer-reviewed literature.

“We trust scientists, we trust a political party,” Scheufele says, because we don’t have time to dig into what someone else spent 20 years researching. It’s rational, he says, for people to act as “cognitive misers” in a world that contains far more information than one can personally process.

Problems arise, however, when such shortcuts become counterproductive to our self-interests.

Anchors and Aversion

In the 1970s, psychologists Daniel Kahneman and Amos Tversky documented a couple dozen “cognitive biases” that people consistently display and that fly in the face of the rational decision-maker assumed by laissez-faire and neoclassical economics. Just as humans are prone to optical illusions, Kahneman and Tversky found, we also consistently make certain errors in judgment and thinking, and our behaviors follow suit.

One example involves a concept known as anchoring, which holds that people place greater emphasis on the first information they receive about something, even when later information contradicts it. Consider these individuals:

Alan: intelligent, industrious, impulsive, critical, stubborn, envious.
Ben: envious, stubborn, critical, impulsive, industrious, intelligent.

Most people think more highly of Alan than of Ben, even though the same adjectives were used for both, because for Alan the positive adjectives come first. Similarly, research shows that job interviewers make up their minds about interviewees within moments of meeting, then look for facts and ask directed questions that support their initial opinion.

Additionally, people place more weight on a single personal anecdote than on data — which is why, for example, media reports of one individual who died or was cured lead to strong and potentially incorrect opinions about a medical procedure.

Kahneman and Tversky also studied loss aversion, the tendency to weigh losses more heavily than equivalent gains. People would much rather avoid a $5 charge than get a $5 discount, and consumer choices reflect this.

A recent study by Toby Bolsen and James Druckman, professors of political science at Georgia State and Northwestern, respectively, found that facts were of limited value to people who had formed preexisting opinions about genetically modified foods and nanotechnology. Study participants’ initial opinions not only were influenced by their political views, but also colored their subsequent evaluation of new facts. When participants were told about studies that revealed GM beets had a harmful effect on biodiversity or that carbon nanotubes injected into mice resulted in adverse health impacts similar to those of asbestos, the new information did not change participants’ previously held opinions, whether positive or negative. Their prior opinions even colored how they gauged the quality of the studies on these technologies.

Psychologists call this “motivated reasoning” — when our preformed opinions distort our evaluation of new information. “This tendency becomes more pronounced as people become more knowledgeable,” says Bolsen. Not only that, studies have shown that people become more, not less, entrenched in their opinions when presented information that opposes their views.

Solution: Slow Thought

Kahneman summarized his work in the 2011 book Thinking, Fast and Slow. Just as “slow food” involves thoughtful meals crafted from sustainable ingredients and stands in contrast to the industrialization embodied by “fast food,” slow thinking involves deliberate, reflective and critical assessment of one’s own beliefs rather than reliance on largely unconscious thought processes. Kahneman argues that by understanding our inherent biases and questioning our intuitive beliefs, preferences, impressions and feelings, we can limit the damage of bad judgments and decisions. Critical thinking, another term for slow thinking, is defined as “the mind continually engaged in states of self-direction, self-discipline, and self-command,” according to the Center for Critical Thinking.

As a simple example of fast thinking, Kahneman offers the math problem 2 + 2. The number 4 flashes to mind without having to ponder or calculate. Other examples include an instinctual comprehension of facial expressions, understanding simple sentences and driving a car on an empty road.

An example of slow, or critical, thinking involves the calculation 17 x 24. “If you do carry out that multiplication, it is a deliberate act,” Kahneman said in a lecture at the Arthur M. Sackler Colloquia on the Science of Science Communication in May 2012. “It is intentional. It is effortful. It involves mental work.” Other examples include focusing on one person’s voice in a crowded room, filling out a tax form and checking the validity of a complex logical argument.

“To the degree that one can recognize his or her own tendencies to engage in biased information acquisition — especially as the political context around a given issue becomes polarized — one may be better equipped to override automatic processing rules,” Bolsen says.

“Most of the beliefs, feelings, decisions and responses that describe us come to us very quickly, without what you might call thoughtful reflection,” says Arthur Lupia, a political science professor at the University of Michigan. “Critical thinking, per se, is the exception to a rule.”

Fortifying Ourselves

“[I]f we are willing, how exactly do we go about fortifying ourselves against these biases?” asks Columbia Business School professor Sheena Iyengar in her book The Art of Choosing. We have to be willing to make ourselves uncomfortable, Iyengar posits, through vigilance, persistence and a healthy dose of skepticism. “Ask yourself how you arrived at a particular preference. Were you overly influenced by a vivid image or anecdote? Did you discard an option too quickly because it was framed as a loss? Is it possible you imagined a trend or pattern that doesn’t really exist? Try to find reasons not to choose what you’re immediately drawn to. Gather evidence against your own opinion.”

While Iyengar’s focus is on evaluating how we form our own opinions, the authors of The Age of Propaganda list several questions one can ask when evaluating information from others: “What does the source of information have to gain? Are there other options and other ways of presenting these options? What are the arguments for the other side? … Understanding the motivation behind a particular source of information can reveal much about its trustworthiness.” The Foundation for Critical Thinking’s Mini-Guide to Critical Thinking and Purdue University Library’s handout Evaluating Information Sources similarly offer advice on how to employ critical thinking.

During graduate school, I developed a curriculum for teaching college students how to evaluate the reliability of information. In one exercise, students read a book chapter, a magazine article, a website article and a peer-reviewed journal article, then answer these questions in a spreadsheet that ranks the sources according to their reliability (a rough sketch of such a scoring scheme follows the list):

  • Are the authors’ names and credentials listed on the publication?
  • Does the information source provide more than one viewpoint or hypothesis?
  • Are specific studies mentioned or cited in the text? (Do not give a Yes answer for general phrases like “studies indicate” or “research shows” unless a specific study and researcher are named.)
  • Does the paper test a prediction? (Usually only peer-reviewed academic articles do, and they get higher points.)
  • Are the authors trying to persuade you to accept a particular viewpoint?
  • When was it published? (Older publications receive fewer points.)
  • Is the article or website sponsored by a company or by advertising?
  • Do the authors blatantly or implicitly insult other perspectives?
  • Does the perspective presented seem exaggerated or extreme?
  • What is the purpose of the publication? (Is it intended to inform/share data, disclose information, persuade, sell or entice?)
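
As a rough, hypothetical illustration of how such a checklist can be turned into a ranking, the Python sketch below tallies a simple reliability score for each source. The Source fields, the point values and the scoring function are illustrative assumptions of mine, not the weights used in the actual classroom spreadsheet, and the final question about a publication's purpose is omitted because it does not reduce to a yes-or-no answer.

  from dataclasses import dataclass

  @dataclass
  class Source:
      title: str
      credentials_listed: bool      # authors' names and credentials on the publication?
      multiple_viewpoints: bool     # more than one viewpoint or hypothesis presented?
      cites_specific_studies: bool  # names a specific study and researcher, not just "studies show"
      tests_a_prediction: bool      # usually only peer-reviewed academic articles do
      tries_to_persuade: bool       # pushes the reader toward a particular viewpoint?
      years_old: int                # older publications receive fewer points
      sponsored_or_ad_driven: bool  # sponsored by a company or by advertising?
      insults_other_views: bool     # blatantly or implicitly insults other perspectives?
      seems_extreme: bool           # exaggerated or extreme framing?

  def reliability_score(s: Source) -> int:
      """Tally a rough score; higher suggests a more reliable source."""
      score = 0
      score += 1 if s.credentials_listed else 0
      score += 1 if s.multiple_viewpoints else 0
      score += 1 if s.cites_specific_studies else 0
      score += 2 if s.tests_a_prediction else 0   # testing a prediction earns extra points
      score -= 1 if s.tries_to_persuade else 0
      score -= 1 if s.years_old > 10 else 0       # assumed cutoff for "older" publications
      score -= 1 if s.sponsored_or_ad_driven else 0
      score -= 1 if s.insults_other_views else 0
      score -= 1 if s.seems_extreme else 0
      return score

  sources = [
      Source("Peer-reviewed journal article", True, True, True, True, False, 3, False, False, False),
      Source("Website article", False, False, False, False, True, 1, True, True, True),
  ]
  for s in sorted(sources, key=reliability_score, reverse=True):
      print(f"{s.title}: {reliability_score(s)}")

Running the sketch prints the sources from most to least reliable under these assumed weights.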

Scientific papers are considered highly reliable because the scientific method is explicitly designed to remove biases from investigations of the natural world. Developing a hypothesis, putting it to the test, running statistics on the results and then having a paper reviewed by one’s peers before publication all work to keep human biases out of scientific findings. While mistakes occur, history shows that the scientific process reveals reliable truths about the natural world.

However, most people get their information about science from the media — a shortcut that lets them get up to speed on important issues in a limited amount of time and draw on multiple primary sources in one fell swoop. To increase the likelihood that information gained this way is reliable, Lupia suggests we expand our repertoire of sources and consider whether the TV, radio, blog and print outlets we rely on are biased toward our own political values or offer an objective portrayal of the facts.

“In politics, debates are not only about facts but also about values. In such cases, many people prefer to follow information sources that share their values — even if such sources are not the most accurate,” he says.

Whether the decision is between organic and conventionally grown produce or the opinion concerns a politician or a medical procedure, some scholars consider critical thinking a moral imperative in a world replete with misinformation, because failing to think critically can have tragic consequences, both personal and societal.

“We have no choice but to return again and again to critical thinking in the strongest possible sense,” says Richard Paul, director of research and professional development at the Center for Critical Thinking. Critical thinking in that strong sense demands not just active reflection on one’s own thinking, but a commitment to actually use those skills to guide behavior. “Critical thinking is one of the few hopeful forces in the world.”