Our Bias Can Cost Us

Our ability to deceive ourselves seems to be quite resilient.  There is this thing called “confirmation bias” where we search for, interpret, favor, and recall information in a way that validates what we already think.  And researchers in Europe recently found the following:

We tend to pay more attention to information that confirms our own beliefs and biases, and we are prepared to lose money to stick to our guns.  We ignore what doesn’t fit with our biases – even if it costs us | New Scientist

We appear to dismiss the obvious costs of bad choices, and this presumably extends beyond financial incentives.  The tendency might help explain why we keep defending appalling behavior from elected officials we voted for.  Or why we can’t cut loose from time commitments that are clearly burying us.

The one comment I would make about the study is that someone ought to design broader tests for the outcome (if this hasn’t already been done).  The study was based on two experiments, each with 20 participants.  The description of participant demographics was a bit thin: sex was identified, and beyond that the participants were adults with a mean age in the early 20s and no reported neurological or psychiatric issues.[**]


I work with large data sets on a regular basis, and I am always a bit cautious about conclusions drawn from small numbers.  Interpreting small-scale results is a bit like navigating in the dark with a flashlight.  What you see can be an artifact of where you happen to be looking.

Or, as the study results suggest, where we choose to shine the light.  If this effect holds, it should be reproducible by other researchers using different populations and at larger scales.  I suspect that it probably does hold, which might suggest some caution about the purported wisdom of crowds.
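To put a rough number on the flashlight problem, here is a minimal simulation sketch.  It is my own illustration, not anything from the paper: the true effect size and noise level are assumed for the sake of the example, and only the group size of 20 comes from the study.

```python
import numpy as np

# Hypothetical illustration (not from the paper): how much a measured
# effect can swing when each experiment uses only 20 participants.
rng = np.random.default_rng(42)

true_effect = 0.3   # assumed true group difference, in SD units
n = 20              # participants per experiment, as in the study
trials = 10_000     # simulated replications of the experiment

# Each replication: sample n participants, estimate the mean effect.
estimates = rng.normal(loc=true_effect, scale=1.0,
                       size=(trials, n)).mean(axis=1)

print(f"true effect:       {true_effect:.2f}")
print(f"mean estimate:     {estimates.mean():.2f}")
print(f"std of estimates:  {estimates.std():.2f}")
print(f"sign flipped:      {(estimates < 0).mean():.1%} of replications")
print(f"overestimated 2x+: {(estimates > 2 * true_effect).mean():.1%}")
```

Under those assumptions the estimate wanders by roughly ±0.2 standard deviations from run to run, and close to one replication in ten gets the sign of the effect backwards.  That is exactly the sort of artifact a larger replication would catch.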

Information in large groups tends to stovepipe around occupational specializations and areas of interest.  What is known inside the stovepipe becomes self-reinforcing, and bad information becomes highly resistant to change.  That suggests the aggregation of confirmation bias might be part of why it sometimes seems to take the retirement or death of an entire generation of theorists, researchers, and practitioners to erase a bad idea.  Even when the costs of being wrong are known to be high.

It is possible that confirmation bias had a survival benefit in our deep history before civilization.  We still do second-guess ourselves and our group decisions.  But second-guessing slows down making and implementing critical decisions when they have to be acted on immediately.  Or decisions that are at least urgent, and irrevocable once adopted.  Second-guessing doesn’t contribute anything particularly useful when a mistake gets you or your small band of hunter-gatherers killed.  As in:

This was a really bad valley in which to camp for the winter.  We’ve run out of food.

If you and your group survived to pass on the benefits of experience, your choices were clearly correct.  Or at least they were among several acceptable alternatives.  But in the complexities of the modern world the bad consequences tend to be less lethal and less immediate.  Confirmation bias might not serve us very well anymore.

—————

[**] The citation for the paper referenced in the news clip follows, for those of you with the math background to evaluate the statistical work.  While I work with data, I lack the background required to get very far into this.

Palminteri S, Lefebvre G, Kilford EJ, Blakemore S-J (2017) Confirmation bias in human reinforcement learning: Evidence from counterfactual feedback processing. PLoS Comput Biol 13(8): e1005684. https://doi.org/10.1371/journal.pcbi.1005684

 

The Wisdom of Jack

There is this scene in the first Pirates of the Caribbean movie where Jack Sparrow tells Will Turner his father was a pirate.

Will experiences a bit of cognitive dissonance with the revelation.  This is not what he believed his father to be.  He reacts by drawing his sword, and Jack responds by knocking Will off his feet with an adroit flip of a sail boom.  Will finds himself dangling over the water with Jack now holding Will’s sword.  Jack proceeds to educate him: “…the only rules that really matter are these: what a man can do, and what a man can’t do.”

Jack expands his meaning to include “will” and “won’t” when he asks his captive audience, “…can you sail under the command of a pirate, or can you not?” The idea being that squaring with the world as it exists is a matter of intent.

Unfortunately this is usually not easy in real life. We come equipped with robust faculties for self-deception to avoid engaging with unpleasant discoveries. Most of what we believe we either absorb from those around us, or develop as a result of intuitions or exposure to events with emotional content. And when confronted with contrary ideas we tend to back-fill our beliefs with information and reasoning to support holding them.

In “I told me so” Gregg A. Ten Elshof identifies this as rationalization and describes it as “…the most recognizable of our strategies for self-deception.” Ten Elshof defines the process as constructing “…a rational justification for a behavior, decision, or belief arrived at in some other way.”* It doesn’t follow that a belief is wrong if it was arrived at through gut intuition. But as Ten Elshof comments, we’re very reluctant to admit that as a basis and feel compelled to come up with reinforcing justifications.

Which is a problem if the belief actually is wrong: rationalization brings with it a powerful urge to construct a false context to support the false belief. Such as a father being a law-abiding merchant sailor instead of a pirate, contrary facts notwithstanding. In Will’s case the contrary facts included a gold medallion with a skull on one side. A major theme in the movie is Will’s struggle to engage with those contrary facts.

Jack’s speech is a good personal reminder. I’ve worked for a couple of large organizations, and on more than one occasion I’ve been yanked out of one project and dropped into another. The process is uncomfortable, particularly when there is a gut-level sense of personal investment. It’s hard not to back-fill that with reasons why what I’m suddenly not doing is still relevant to the broader organization.

But adapting to the changes comes far more quickly if I am properly engaged with practical realities rather than avoiding them.

—————-

*Ten Elshof, Gregg A. (2009). I Told Me So: Self-Deception and the Christian Life. Grand Rapids, Mich.: W.B. Eerdmans. (p. 54, Kindle edition).