Don’t forget to seek refuting evidence

“It is hard for us, without being flippant, to even see a scenario within any kind of realm of reason that would see us losing one dollar in any of those transactions.”[1]
–AIG Executive, August 2007, one year before the financial meltdown of AIG
The first thing to do when you can’t find data that contradict your conclusions is to look harder.
The other day, I received a request to schedule my virtual workshop on Rethinking Data. The workshop is broken into four consecutive modules.  The person making the request asked for the following dates:
Module 1 – July 3
Module 2 – July 17
Module 3 – July 14
Module 4 – July 31
It was clear that she had made a typo on one of the dates, since the sessions need to occur in order.  So, what was the mistake?  Can you see it?
It seemed obvious to me that she wanted July 24th for module 3 rather than July 14.  I was wrong. The mistake was on module 2.  She wanted July 7 rather than the 17th. In retrospect, it’s clear that the mistake could have been on either date. The fact that I guessed wrong isn’t the problem. The problem is that I never even considered module 2 as possibly having the error.  Therefore, I never had a chance of guessing right.
I fell victim to confirmation bias.  Confirmation bias is our tendency to 1) seek information that supports our beliefs and 2) stop seeking information once our hypothesis is confirmed.  In the date example, I saw an ascending pattern that had one number (July 14) out of place.  Changing it to July 24 made the sequence work and proved that my hypothesis was “correct.” So I stopped looking for alternatives.
Confirmation bias is an interesting problem. It’s almost impossible to see it when it is happening to you.  In fact, when it is happening, things often appear to be going quite well.  You have a hypothesis (or belief or conclusion) and your data appear to support it.  In many cases you are probably relieved that you found an answer.
The problem with confirmation bias is that it decreases your understanding of your data while increasing your confidence in that understanding.

One of the more famous studies of confirmation bias came from Peter Wason [2] in 1960.  Wason provided people with a sequence of three numbers,
2-4-6
The participants’ job was to figure out the rule that Wason had used to create the sequence.  Participants were allowed to test as many additional three-number sequences as they wanted.  He would tell them whether or not each sequence followed the rule.  Surprisingly, despite the fact that most participants came up with sequences that followed the rule, only about 20% correctly guessed it.
Confirmation bias prevented them from figuring out what turned out to be a very simple rule.  An analysis of the participants’ test sequences revealed two things about their strategies for guessing the rule.  First, most people offered very few test cases before guessing.  In other words, once they discovered just one or two data points that fit what they believed the rule to be, they stopped testing it with data.
Second, when coming up with their test cases, many people tended to offer only positive examples – those that supported what they thought the rule to be.  For example, if they thought the rule was “add two,” they might try the sequence 8-10-12 or 22-24-26.  If they thought the rule was “even numbers only,” they might try 8-16-28.
While this seems reasonable on the surface, it’s flawed.  You can’t test your hypotheses or conclusions by using only positive cases.  You need to test cases that contradict your hypothesis to see whether it still holds up.  Doing so in this case would have quickly and clearly shown that they were wrong about the rule.  Instead, they kept collecting apparent confirmations of what they already believed the rule to be.
People who tested sequences that didn’t fit their hypothesized rule tended to discover the actual rule much more quickly than those who only tested sequences that confirmed their belief.  In case you are wondering, the rule is that the numbers need to occur in ascending order.
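To make this concrete, here is a minimal sketch of the 2-4-6 task in Python (the function names and test sequences are my own illustration, not part of Wason’s study).  It compares the actual rule, ascending order, against a common guess, “add two,” and shows why positive tests alone can never separate the two.

# The hidden rule is simply "ascending order"; a common guess is "add two".
def hidden_rule(seq):
    # Actual rule: each number is larger than the one before it.
    return all(a < b for a, b in zip(seq, seq[1:]))

def guessed_rule(seq):
    # Hypothesized rule: each number is exactly two more than the previous one.
    return all(b - a == 2 for a, b in zip(seq, seq[1:]))

# Positive tests (chosen to fit the guess) pass both rules,
# so they provide no evidence against the hypothesis.
for seq in [(8, 10, 12), (20, 22, 24)]:
    print(seq, "actual:", hidden_rule(seq), "guess:", guessed_rule(seq))

# A negative test (one that should fail if the guess were the rule)
# is what actually separates the two hypotheses: 1-5-100 follows the
# real rule but violates the guess.
for seq in [(1, 5, 100), (3, 2, 1)]:
    print(seq, "actual:", hidden_rule(seq), "guess:", guessed_rule(seq))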

Here is another of Wason’s experiments.  See if you can figure out the answer.
The following four cards have a letter on one side and a number on the other:
A     K     2     7
The rule for labeling the cards is that if there is a vowel on one side of the card, there must be an even number on the other.  Which cards must you turn over to determine if the rule is true?
Do you think you know?  According to Simone Duca [3], there is a very high probability that you missed it.
If confirmation bias got the best of you (as it does for around 33% of people), you chose card “A”.  Your logic is that if card “A” has an even number on the back, then the rule is confirmed and you can stop.  But remember, it’s important to check refuting data as well.  So while card “A” is consistent with the rule, checking it alone is not sufficient given the other cards.
Since you knew I was up to something, you might have looked a bit deeper.  Confirmation bias combined with a simple logic error might have pushed you (and 46% of people) toward cards “A” and “2”.  There are two problems with this.  First, you are still testing only positive cases.  Second, and more importantly (this is where the logic error comes in), the rule doesn’t say that consonants can’t have an even number.  It just states that a vowel must have one.  Therefore, turning over the “2” card doesn’t actually confirm or refute the rule being tested.  The same is true for testing the “K” card.
The correct answer is card “A” and card “7”.  Don’t feel bad if you missed it; Duca states that only about 4% of people get this right. The “7” card is the negative test.  If it has a vowel on the other side, then the rule would be broken.  You can’t conclusively confirm or refute the rule without testing both a positive and a negative case.
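If you want to check the logic mechanically, here is a small Python sketch (the card encoding and helper name are illustrative, not part of Wason’s materials).  It asks, for each visible face, whether anything on the hidden side could break the rule “if vowel, then even number.”  A card is only worth turning over if its hidden side could falsify the rule.

VOWELS = set("AEIOU")

def could_falsify(visible):
    # A visible vowel is falsified by an odd number on the back.
    if visible.isalpha():
        return visible in VOWELS          # "A" yes, "K" no
    # A visible odd number is falsified by a vowel on the back;
    # an even number is consistent with anything on the back.
    return int(visible) % 2 == 1          # "7" yes, "2" no

for card in ["A", "K", "2", "7"]:
    print(card, "worth turning over:", could_falsify(card))
# Prints True only for "A" and "7": the positive check and the negative test.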
Our brains seek data that support our beliefs.  Once we find that data we tend to stop looking.  That gets us in trouble.  When trying to determine if your hypotheses, conclusions, and beliefs are true, don’t just test data that agree with the hypothesis.  Be sure to also consider data that don’t support the hypothesis. 
Brad Kolar is an executive consultant, speaker, and author.  He can be reached at brad.kolar@kolarassociates.com.



[1] Morgenson, Gretchen. “Behind Insurer’s Crisis, Blind Eye to a Web of Risk.” The New York Times 27 Sept. 2008, sec. Business Day. Web. 3 June 2014. (http://www.nytimes.com/2008/09/28/business/28melt.html?em&_r=0)
[2] Wason, Peter. “On the failure to eliminate hypotheses in a conceptual task.” Quarterly Journal of Experimental Psychology 12.3 (1960): 129–140. Print.
[3] Duca, Simone. “Rationality and the Wason Selection Task: A Logical Account.” PSYCHE 15.1 (2009): 109–131. Print.