Confirmation Bias

I’m often guilty of confirmation bias. I hold a particular world-view that favours balance across multi-levelled patterns over extreme positions at any one level, and since that is an unfashionable position (in the blogosphere) I often latch onto examples that support it. I was expecting Kahneman’s best-selling “Thinking Fast and Slow” to be one long confirmation. In a sense it is, but it’s also a major disappointment in that it falls short and remains remarkably naive for a Nobel-prize-winning effort.

OK, so there’s nowt so queer as folk, and psychology is everything when it comes to human behaviour in the world, even the world of economics. My agenda is the psychology of decision-making: how we use what we know to make (moral) decisions that govern our activities. So “the most important psychologist alive today”, who is also an economics Nobel laureate, ought to be a winning combination.

Kahneman’s book is full of “cognitive biases” that confound simple rational logic. The trouble is, the book is a summary of 30+ years of his research, much of it with Amos Tversky, and popularised by the likes of Nassim Taleb. So if it’s a subject you’re interested in, you’ve probably heard of most of them before, and seen references to most of the key experiments and case studies, in works by others. If the concept of “cognitive biases” messing with our economic decisions as “rational agents” is new to you, then Kahneman’s collection is definitely worth reading. His “prospect theory” overturns most utility-theory-based economics textbooks. But any economics that favours statistical objectivity over the subjectivity of its “subjects” has long been branded “autistic” – economics without social skills. Personally, I’ve already moved on.

In fact, as I write this I’m a chapter or two from completing “Thinking Fast and Slow”, and I’m documenting some criticisms in the hope that Kahneman is about to overturn them in his conclusions.

(1) So many of the case studies and experiments involve academics using their own students as source material. One issue is that so many such “experiments” are questionnaire-based, or, where they are not, the “real value” in the choices is nevertheless confined to an experimental context. The bigger criticism, though, concerns the participants themselves: intelligent and educated, but still students – there is a lack of “wisdom” involved.

(2) Much of the book – the title is a reference to the fact – is about System 1 and System 2 thinking: 1 being intuitive and immediate (fast), 2 being considered and calculating (slow). Clearly the subjective psychological angle of 1 is constantly traded against the objectively reasoned angle of 2, and in particular – we are often talking about academic subject-matter experts here – 1 interferes with even expert judgements applied to the inputs and outputs of 2. It seems strange to me to completely miss any opportunity to link this to right-left-right brain behaviours (see Iain McGilchrist, “The Master and his Emissary”), where R-L-R = 1-2-1: that is, inputs (and outputs) are filtered and interpreted by our intuition even when the process of deliberation itself is explicitly objective and rationally considered.

(3) He makes a couple of asides about “something your grannie could have told you” when commenting on empirically demonstrated effects, but doesn’t seem to pick up on the existence of wisdom in adages like “a bird in the hand” or “possession is nine-tenths”. The biases in accounting for cost and risk, as against strict statistical odds for losses and gains, are not some perversion of rationality – they are refinements of rationality. Long-run odds may be relevant to actuaries, but they are irrelevant to individuals: we don’t live by endless streams of binary choices with clear odds, such as those presented in the experimental tests; not every swing has a roundabout. Heuristics about what really matters are probably built into System 1 behaviours – e.g. “I’ve done the calculation, but on balance I’d prefer …”

Wisdom says life’s just complicated enough for simple logic not to be quite enough.

It’s almost as if (as I said before) we don’t actually want to believe what’s right before our eyes.
