The Optimism Bias (2)
So we know now (and in fact we knew already) that we have this optimism bias, and consistently and predictably expect things to turn out better for ourselves (though not for others) than they actually do. What follows?
Tali Sharot suggested we can incorporate this knowledge into our planning decisions, and indicated that the Government have indeed done so in their Green Book, but somehow this feels too simplistic.
For starters, it sounds suspiciously like contingency planning with a bit of extra scientific backing. You know that you consistently mis-predict, mis-assess and so forth, so you factor that in. It is different from simply having a bit extra in reserve for unexpected events, but not that different.
In any case there is a deeper problem.
During a book tour of his own a few weeks ago Daniel Kahneman was speaking about cognitive biases more generally. In an interview with Oliver Burkeman he made the telling remark: “It’s not a case of: ‘Read this book and then you’ll think differently,’” he says. “I’ve written this book, and I don’t think differently.”
Tali Sharot’s argument, combined with Kahneman’s comment, reminded me of the wonderful Hofstadter’s law:
“It always takes longer than you expect, even when you take Hofstadter’s law into account.”
In other words, it is not so easy to trick ourselves into not tricking ourselves. Sharot seems to suggest that the optimism bias is adaptive, and that it is broadly a good thing, but again this feels like an answer designed to reduce dissonance rather than being fully thought through. In this respect I have sympathy with Jules Evans who argues that The Optimism Bias is unduly pessimistic about our ability to change ourselves.
The issue, of course, is HOW do we go about changing? (And how much does this matter?)
My first set of scribbles in response to Sharot’s book was “This is about a deluded sense of self rather than optimism…”
This point goes beyond the scope of this blog, and I have written about it before, but my impression is that our best hope of addressing biases lies in forms of psychological or spiritual practice that lead us to transform our fundamental sense of who we are. There may be no short-cut out of delusion.
One finding of many that might support this claim is the curious discovery that Buddhist meditators are more conventionally ‘rational’ in classic behavioural economics experiments, i.e. they are more self-interested, and care less about norms of fairness and reciprocity. The stock response to this finding is that Buddhists are not so kind and compassionate after all! However, it looks to me more as though they are much more aware of what is going on than most participants: they fully grasp that this is a game they are playing, not a proxy for the human feelings and relations that actually matter, and which they experience more acutely than most. If you are genuinely altruistic, you have less need of altruistic punishment. Similarly, if you have an experiential (rather than merely conceptual) grasp of how the mind distorts reality, you may be better able to prevent it doing so in practice.
The issue of cognitive bias matters hugely in general, but consider two of the major issues of our time: the climate crisis and the debt crisis are both arguably grounded in problems relating to optimism.
I am not saying that we should all just meditate and everything will be fine (that would be too optimistic!), but it might be a more fruitful ‘so what’ to fall out of our awareness of the optimism bias.