Our ability to deceive ourselves seems to be quite resilient. There is this thing called “confirmation bias” where we search for, interpret, favor, and recall information in a way that validates what we already think. And researchers in Europe recently found the following:
We tend to pay more attention to information that confirms our own beliefs and biases, and we are prepared to lose money to stick to our guns. We ignore what doesn’t fit with our biases – even if it costs us | New Scientist
We appear to dismiss the obvious costs of bad choices, and this presumably extends beyond financial incentives. The tendency might help explain why we keep defending appalling behavior from elected officials we voted for, or why we can't cut loose from time commitments that are clearly burying us.
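As I understand the cited paper, the underlying mechanism is modeled as an asymmetric learning rule: feedback that confirms your current preference moves your beliefs more than feedback that contradicts it. Here is a minimal toy version of that idea — my own sketch, not the authors' code, and every name and parameter in it is made up for illustration:

```python
import random

def run_learner(alpha_confirm, alpha_disconfirm, trials=1000, p_good=0.6, seed=0):
    """Toy two-option learner with asymmetric learning rates.

    The learner repeatedly picks whichever option it currently values more.
    Feedback that confirms its preference (a positive surprise on the chosen
    option) is weighted by alpha_confirm; disconfirming feedback is weighted
    by alpha_disconfirm. Equal rates give an unbiased learner.
    """
    rng = random.Random(seed)
    q = [0.0, 0.0]  # current value estimates for the two options
    for _ in range(trials):
        choice = 0 if q[0] >= q[1] else 1
        # option 0 pays off with probability p_good, option 1 with 1 - p_good
        p = p_good if choice == 0 else 1 - p_good
        reward = 1.0 if rng.random() < p else 0.0
        error = reward - q[choice]
        # confirming feedback is weighted more heavily than disconfirming
        alpha = alpha_confirm if error > 0 else alpha_disconfirm
        q[choice] += alpha * error
    return q
```

Run long enough, the biased learner (high confirm rate, low disconfirm rate) inflates its estimate of the option it already prefers well above that option's true payoff rate, while the even-handed learner settles near the truth. That is the "prepared to lose money to stick to our guns" effect in miniature.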
The one comment I would make about the study is that someone ought to design broader tests for the outcome (if this hasn't already been done). The study was based on two experiments with groups of 20 participants each, and the description of participant demographics was a bit thin: sex was identified, and beyond that the participants were adults with a mean age in the early twenties and no reported neurological or psychiatric issues.[**]
I work with large data sets on a regular basis and I am always a bit cautious about conclusions drawn from small numbers. Interpreting small scale results is a bit like navigating in the dark with a flashlight. What you see can be an artifact of where you happen to be looking.
Or, as the study results suggest, where we choose to shine the light. If this effect holds, it should be repeatable by other researchers using different populations, and at larger scales. I suspect that it probably does hold, which might suggest some caution about the purported wisdom of crowds.
Information in large groups tends to stovepipe around occupational specializations and areas of interest. What is known inside the stovepipe becomes self-reinforcing, and bad information becomes highly resistant to change. The aggregation of confirmation bias might be part of why it sometimes seems to take the retirement and/or death of an entire generation of theorists, researchers, and practitioners to erase a bad idea, even when the costs of being wrong are high.
It is possible that confirmation bias had a survival benefit in our deep history before civilization. We still second-guess ourselves and our group decisions. But second-guessing slows the making and implementing of critical decisions that have to be acted on immediately, or that are at least urgent and irrevocable once adopted. Second-guessing doesn't contribute anything particularly useful when a mistake gets you or your small band of hunter-gatherers killed. As in:
This was a really bad valley in which to camp for the winter. We’ve run out of food.
If you and your group survived to pass on the benefits of experience, your choices were clearly correct, or at least among several acceptable alternatives. But in the complexities of the modern world the bad consequences tend to be less lethal and less immediate. Confirmation bias might not serve us very well any more.
[**] The citation for the paper referenced in the news clip is below, for those of you with the math to understand the statistical work. While I work with data, I lack the background required to get very far into this.
Palminteri S, Lefebvre G, Kilford EJ, Blakemore S-J (2017) Confirmation bias in human reinforcement learning: Evidence from counterfactual feedback processing. PLoS Comput Biol 13(8): e1005684. https://doi.org/10.1371/journal.pcbi.1005684