Confirmation Bias
Confirmation Bias means we actively seek out information that agrees with our beliefs and dismiss contrary information.
Confirmation bias is our innate tendency to pay more attention to evidence that supports our pre-existing notions. When drawing on evidence from the world, decision-makers tend to seek out, and place greater value on, information that corroborates their existing ideas. The bias can therefore distort how we collect evidence in the first place, and conclusions drawn from a biased body of evidence are less likely to be accurate than conclusions drawn from an unbiased one, which sits closer to reality. This is how confirmation bias leads us to make poor decisions.
The trouble with individual confirmation bias, in the aggregate, is that it can have serious consequences. If we are so entrenched in our preconceptions that we only consider evidence that supports them, this may impede societal cooperation (which often requires taking account of other perspectives). Social divides and policy stagnation may result from our tendency to favor information that corroborates our existing opinions and ignore data that doesn't.
Our brains look for cognitive shortcuts when collecting and processing information, because evaluating evidence carefully takes time and effort.
Whether confirmation bias can be formally categorized as a heuristic is uncertain, but either way it works like one: we seek out evidence that supports our prior beliefs because the most readily available hypotheses are the ones we already hold.
We do this because it makes sense. We frequently need to understand information quickly, and developing new ideas or beliefs takes time. We often take the easiest route out of necessity, and that's what we've adapted to do.
When a furious animal charges at one of our ancestors, they have only a few seconds to decide whether to stand their ground or flee. There is no time to weigh every variable involved in a fully informed choice. Instinct and past experience might lead them to size up the animal and run, or the presence of a fellow hunting party might tilt the odds of a successful confrontation in their favor. According to many evolutionary scientists, our reliance on such shortcuts for quick decisions in modern life is a legacy of this survival instinct. 1
We sometimes exhibit confirmation bias because it protects our self-esteem.
It can be painful to realize that a cherished belief is incorrect, especially if our identities are built upon it. We often look for corroborating information in place of disconfirming information, because we fear that being wrong might be interpreted as a sign of stupidity. 1
In a 2002 peer-reviewed paper, social psychologist Jennifer Lerner and political psychologist Philip Tetlock theorize that when we connect with other people, we tend to adopt their ideas in order to fit in with the group.
The authors describe confirmatory thought as “a one-sided attempt to rationalize a certain perspective” and exploratory thought as “a balanced consideration of alternative perspectives.” Groupthink, in which the pull of conformity leads to ineffective decision-making, is an example of confirmation bias in social situations. Confirmation bias, in other words, is not only an individual phenomenon; it can also take hold within groups. 3
Confirmation bias can occur either individually or collectively, and both situations require careful deliberation.
At the individual level, confirmation bias distorts our decision-making. Our decisions cannot be fully informed if we only consider evidence that confirms our assumptions; this can cause us to overlook important information in both our professional and everyday lives, leaving our choices blind to the context around them. A voter might continue to support a candidate even after learning of their misconduct; a CEO might dismiss a promising new opportunity because of one bad experience with a comparable plan. To approach problems and decisions with an open mind, we must first be aware of confirmation bias.
At the group level, confirmation bias can sustain groupthink: a culture in which the belief that group harmony and cohesion matter most for success suppresses dissent and undermines sound decision-making.
An employee at a technology company might withhold a revolutionary discovery out of fear of reorienting the firm's direction. In many democracies, the same bias keeps people from becoming informed about the differing views of their fellow citizens, and consequently from engaging in constructive discussion.
When we are gathering information, this bias is most likely to influence our decision-making. It may also occur subconsciously, making us unaware of its impact.
The first step to avoiding confirmation bias is recognizing that it is a problem. By understanding how it operates and what effects it has, we can better spot it in our own decision-making. Psychology professor and author Robert Cialdini suggests two approaches to recognizing when these biases are affecting our decisions:
First, because the bias is most likely to take hold early in the decision-making process, we should start from a neutral factual base. Having a disinterested third party (or, even better, several) gather the facts produces a more objective body of information. 4
Decision-makers should also discuss their hypotheses with others when interpreting the assembled data, so that individual cognitive biases in hypothesis selection and evaluation can be identified. These steps cannot eliminate cognitive bias entirely, but they can help manage it and lead to better decisions.
The classical historian Thucydides observed this tendency long ago: people entrust what they wish to be true to “careless hope,” and use “reason” to thrust aside what they do not want to be real.
Peter Wason first described the phenomenon of confirmation bias in 1960. In his Rule Discovery Test, participants were asked to discover a rule that applied to sets of three numbers. Wason told them that the set '2-4-6' satisfied the rule, and asked them to propose other sets of numbers to work out what the rule was; the examiner would tell them whether each proposed set satisfied it.
To test their hypotheses, subjects typically proposed sets of increasing even numbers, for example by doubling the given numbers. Wason's rule, however, was simply that the numbers in the set increase.
Subjects tended to form the same hypothesis and only tried numerical sequences that corroborated it, rather than sequences that might disprove it. 4 They sought to confirm their conjectured rule rather than refute it.
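The logic of the task can be sketched in a few lines of code. The hidden rule and the 2-4-6 example come from Wason's study; the specific "doubling" hypothesis and the test sequences are illustrative assumptions about how a typical subject might reason:

```python
# Sketch of the Wason 2-4-6 task.

def wasons_rule(seq):
    """The experimenter's actual rule: the numbers strictly increase."""
    return all(a < b for a, b in zip(seq, seq[1:]))

def subjects_hypothesis(seq):
    """A typical subject's narrower guess: each number doubles the last."""
    return all(b == 2 * a for a, b in zip(seq, seq[1:]))

# Confirming tests: sequences chosen to FIT the subject's hypothesis.
# Every one also satisfies Wason's real rule, so the feedback is always
# "yes" and the (wrong) hypothesis is never challenged.
for seq in [(2, 4, 8), (5, 10, 20), (1, 2, 4)]:
    assert subjects_hypothesis(seq) and wasons_rule(seq)

# A disconfirming test: a sequence that VIOLATES the hypothesis but
# still satisfies the real rule. This is the informative "no" that
# subjects rarely sought out.
assert not subjects_hypothesis((1, 2, 3)) and wasons_rule((1, 2, 3))
```

The point of the sketch: every sequence that confirms the doubling hypothesis also confirms the real rule, so confirming tests can never distinguish the two. Only a sequence chosen to break the hypothesis, like 1-2-3, reveals that the guessed rule is too narrow.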
In 1979, researchers at Stanford University investigated the psychological processes involved in confirmation bias in a major study. Participants were undergraduates with opposing views on the topic of capital punishment, and they evaluated two fabricated research studies on the subject.
Participants were given two fabricated studies supporting opposite conclusions: one finding that capital punishment deters crime, and one finding that capital punishment has no appreciable impact on the overall level of crime.
The statistics in both studies were entirely fabricated by the Stanford researchers, but the two were designed to appear equally compelling. The researchers then grouped responses according to participants' pre-existing views.
The findings revealed that both groups became even more staunchly committed to their original position after being confronted with evidence either supporting or refuting capital punishment. 5
“We are able to identify the flaws in somebody else's argument, but we are usually unable to see the flaws in our own position.”
- Elizabeth Kolbert, author of The Sixth Extinction, writing in The New Yorker
One clear example of technology amplifying and facilitating confirmation bias is the filter bubble effect: cognitive isolation produced by algorithmic curation. Internet activist Eli Pariser coined the term to describe how websites use algorithms to predict what information a user wants to see, and then serve them exactly that. 7
Filter bubbles show us content that matches our interests and existing views while screening out content that runs contrary to them. We typically prefer content that confirms our beliefs because it demands less critical reflection; as a result, our online experience comes to favor information that agrees with what we already think and to exclude information that contradicts it.
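The feedback loop behind this can be illustrated with a toy ranking function. This is a deliberately minimal sketch, not how any real platform works; the user profile, tags, and item titles are all hypothetical:

```python
# Toy filter-bubble loop: rank items by similarity to what the user
# has already clicked, so agreeable content keeps rising to the top.

def score(item_tags, user_profile):
    """Sum the user's past engagement counts for the item's tags."""
    return sum(user_profile.get(tag, 0) for tag in item_tags)

def rank(feed, user_profile):
    """Order the feed so profile-matching items come first."""
    return sorted(feed,
                  key=lambda item: score(item["tags"], user_profile),
                  reverse=True)

# Hypothetical user who has mostly clicked content favoring position X.
profile = {"pro_x": 5, "anti_x": 1}
feed = [
    {"title": "Why X is wrong",          "tags": ["anti_x"]},
    {"title": "More reasons to back X",  "tags": ["pro_x"]},
    {"title": "X: the case in favor",    "tags": ["pro_x"]},
]

ranked = rank(feed, profile)
# Confirming items dominate the top of the feed; each new click then
# reinforces the profile, narrowing future results even further.
print([item["title"] for item in ranked])
```

The design choice that creates the bubble is simply that past engagement is the only ranking signal: the more agreeable content a user clicks, the higher agreeable content scores next time, with nothing pushing in the other direction.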
Pariser describes the filter bubble in "The Filter Bubble: What the Internet Is Hiding from You" using the example of internet searches for an oil spill:
"In the spring of 2010, while the Deepwater Horizon oil rig was spilling oil into the Gulf of Mexico, two of my friends, both Northeastern, white, well-educated women, searched for 'BP.' Despite their similarities, the results they saw were quite different. One saw investment information about BP; the other saw news. For one, the first page of results contained links about the oil spill; for the other, there was nothing about the spill except a promotional ad from BP." 8
The search engine had tailored each woman's results to the preferences inferred from her previous searches, shaping what each of them saw of the BP oil spill. Had this been their only source of information, their conceptions of the spill would likely have diverged sharply. Confirmation bias had been facilitated unintentionally.
While this particular filter bubble was relatively innocuous, filter bubbles on social media platforms have been shown to influence elections by tailoring campaign messages and political news to specific groups of voters. 9 Beyond fragmenting democratic debate, this sort of personalized news stream can entrench different demographic groups in their political views by presenting each with a filtered body of evidence that supports those views.
Our tendency to notice, focus on, and give greater credence to evidence that corroborates our pre-existing beliefs is known as confirmation bias.
Our brains look for cognitive shortcuts when gathering and processing information to save time and energy, and confirmation bias is one such shortcut. Evaluating data takes effort, and the most readily available hypotheses are the ones we already hold, so we seek evidence that supports them. We also favor confirming over disconfirming information to protect our self-esteem: nobody wants to feel bad about themselves, and discovering that a cherished belief is false can have exactly that effect.
In a 1979 Stanford study, subjects became even more committed to their original position on capital punishment after being presented with equally compelling evidence both supporting and refuting it.
The filter bubble effect is a result of the way modern preference algorithms amplify and reinforce our tendency to rely on confirmation bias. Websites select which information a user wants to see based on algorithms and provide it to them. Because it requires less critical thinking, we typically favor content that confirms our beliefs. Filter bubbles might prevent you from accessing information that clashes with your existing opinions while browsing online. Filter bubbles and the confirmation bias they produce have been shown to influence elections and may therefore obstruct the important discussion democracy requires.
Confirmation bias is most prone to occur while we are gathering information, and it often operates subconsciously, without our noticing its influence on our decision-making. To avoid it, we must first be conscious of its presence. Because it is most likely to take hold at the beginning of the decision-making process, we should begin from a neutral factual foundation; having a third party (or several) gather the information helps build a more objective base.
The purpose of this article is to present evidence that firm performance is positively affected by gender diversity. By capitalizing on confirmation bias (among other psychological principles), firms may increase diversity and thereby improve performance.
According to this piece, the reliance on ‘trigger warnings’, preference algorithms, and similar cues results in a highly curated information stream, which in turn promotes cognitive biases such as confirmation bias. We are prevented from empathizing with others and consolidating our opinions in the light of opposing opinions, the author contends.
People tend to make decisions based on the way a problem is phrased rather than on its content
We prefer options that are certain rather than those that are vague or missing information
We tend to focus on trivial details and delay addressing complex subjects