For the last six months I’ve been reading Thinking, Fast and Slow by Daniel Kahneman. It took me a long time to finish, not only because I’m a slow reader, but because every chapter could have been a book in itself, given all the experiences it resonated with and the thoughts it triggered.
In short, our fast thinking is automatic and jumps to conclusions (impulses and intuitions), while our slow thinking is more systematic and requires effort and concentration.
For this post, I have taken theories and ideas mainly from the part titled Overconfidence and reflected on them in the light of my own experiences.
The Anchoring Effect
Was Gandhi more or less than 144 years old when he died? How old was Gandhi when he died*?
The mention of the number 144 triggers a suggestion in the mind, an association to a very old person. You won’t guess that Gandhi actually lived to 144, but you will most likely anchor your estimate to that number. The anchor is a starting point from which you make adjustments.
Experiments where groups are given different anchors can demonstrate the anchoring index, a measure of the impact of the anchors. (Defined as the ratio of the difference between the average estimates to the difference between the anchors.) The anchoring effect is very strong; an index of 55% is typical across various kinds of experiments.
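The arithmetic behind the index fits in a few lines of Python. The numbers below are from the book’s redwood-tree experiment as I remember it (anchors of 180 and 1,200 feet; mean estimates of roughly 282 and 844 feet), so treat them as approximate:

```python
# The anchoring index measures how strongly a group's estimates follow
# the anchor they were given. The redwood numbers below are quoted from
# memory of the book and may be approximate.

def anchoring_index(low_anchor, high_anchor, low_mean, high_mean):
    """Ratio of the difference between the groups' average estimates
    to the difference between the anchors, as a percentage."""
    return 100 * (high_mean - low_mean) / (high_anchor - low_anchor)

# Visitors asked about the height of the tallest redwood, anchored at
# 180 ft vs 1,200 ft, gave mean estimates of about 282 ft and 844 ft:
print(f"{anchoring_index(180, 1200, 282, 844):.0f}%")  # prints "55%"
```

An index of 0% would mean the anchor had no effect at all; 100% would mean people simply adopted the anchor as their estimate.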
To counter the anchoring effect, deliberately try thinking the opposite.
Reflection: the first presented version of a project plan easily becomes a dangerous anchor, a misleading starting point for a conversation or negotiation. Challenge it radically!
The Illusion of Validity
When unpredicted events occur, our minds search for facts and stories to explain them. We build these explanations on the, often very limited, information that we have, creating an illusion that we understand the past.
This illusion gives us an overconfidence to also predict the future. Overconfidence can become even stronger in areas where highly skilled and experienced experts analyse complex data. By focusing on the role of skill while neglecting the role of luck we are also creating an illusion of control.
Errors of prediction are inevitable because the world is unpredictable.
Reflection: when projects go bad, too much effort is put into making up stories to justify the history, instead of accepting the unknown unknowns (all the things you just can’t predict). We try to find the single step missing from the process, or the piece of information we forgot to schedule, to convince ourselves that next time we will get it all right.
Intuitions vs Formulas
The Apgar Score is used in delivery rooms across the world to assess the condition of a newborn baby. It’s a simple but systematic method where five variables are each rated from zero to two. A baby with a score below five needs immediate care.
This test replaced earlier individual expert judgement that too often missed out on severe signals.
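The beauty of the method is how little machinery it needs. Here is a minimal sketch; the five variable names follow the usual Apgar criteria, but the function itself is my own illustration, with the below-five threshold taken from the text above:

```python
# A minimal sketch of an Apgar-style score: five variables, each rated
# 0, 1 or 2, summed into a total of 0-10. Illustration only, not a
# clinical tool.

def apgar_score(heart_rate, breathing, reflexes, muscle_tone, colour):
    ratings = (heart_rate, breathing, reflexes, muscle_tone, colour)
    if any(r not in (0, 1, 2) for r in ratings):
        raise ValueError("each variable is rated 0, 1 or 2")
    return sum(ratings)

score = apgar_score(2, 2, 1, 1, 2)         # total of 8: in good shape
critical = apgar_score(1, 0, 1, 1, 1) < 5  # below 5: immediate care
```

No judgement calls, no weighing of impressions: the same inputs always give the same answer, which is exactly the consistency the individual expert lacked.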
The human mind makes inconsistent decisions when given complex information. Experiments have shown that experts evaluating identical information twice often reach different conclusions.
Whenever we can replace human judgement with a formula, we should at least consider it.
Reflection: the classic RAG status project report could serve as a similar checklist, a quick but systematic assessment producing an overall score. However, at least in my experience, the action part is too weak. What consequences should follow from an alarming score? What score should trigger an immediate closure? How do we stop investing in bad projects (avoiding the sunk cost fallacy)?
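One way to strengthen the action part is to wire the consequence into the checklist itself. The sketch below is entirely hypothetical: the area names, the ratings and both thresholds are made up for illustration.

```python
# A hypothetical RAG checklist with the action built in: the score does
# not just get reported, it triggers a consequence. All names and
# thresholds here are invented for illustration.

RATING = {"green": 0, "amber": 1, "red": 2}

def assess(statuses, escalate_at=3, close_at=5):
    """statuses maps an area (scope, budget, ...) to 'green'/'amber'/'red'."""
    score = sum(RATING[s] for s in statuses.values())
    if score >= close_at:
        return score, "propose closure"      # stop investing in a bad project
    if score >= escalate_at:
        return score, "escalate to sponsor"
    return score, "continue"

print(assess({"scope": "amber", "budget": "red", "schedule": "red"}))
# prints (5, 'propose closure')
```

The point is not the particular thresholds but that an alarming score leads somewhere by rule, not by renewed negotiation.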
The Outside View
Basing forecasts and estimates on your own experience and knowledge is taking an inside view. Even if you are a team sharing your views, taking some average and then extrapolating, it’s still an inside view. And in our minds, the inside view beats the outside view.
In an outside view, you look at reference data and from it derive an objective baseline. The baseline prediction is the anchor from which specific adjustments are made according to additional information.
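As a sketch of that procedure, under entirely hypothetical numbers (the reference durations and the adjustment are invented for illustration):

```python
# An outside-view estimate: take the anchor from a reference class of
# similar past projects, then make a bounded, case-specific adjustment.
# All numbers here are hypothetical.

from statistics import median

reference_durations = [14, 18, 22, 25, 30]  # months, similar past projects
baseline = median(reference_durations)      # the objective baseline: 22

# Adjust from the baseline using what makes this project different,
# instead of building the estimate bottom-up from the inside view.
adjustment = -2  # e.g. a more experienced team; keep adjustments modest
estimate = baseline + adjustment
print(estimate)  # prints 20
```

Note that the baseline plays exactly the role the anchor played earlier in this post, only now the anchor is chosen deliberately from data rather than suggested to us.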
The Planning Fallacy
People are in general very bad at estimating how long it will take to complete a task. The Planning Fallacy states that plans are unrealistically close to best-case scenarios, and that they could be improved by learning from the statistics of similar cases.
Reflection: business cases are usually constructed by someone who really believes in the idea, who believes the investment is the right one to make. Consequently, to get the business case approved, it ends up too optimistic: overestimating benefits and underestimating costs.
However, I don’t see the optimism itself as the main concern; overconfidence, and the failure to acknowledge unpredictability, is.
Now, take these aspects of overconfidence and apply them to a project portfolio with multiple dependencies. Think, fast and slow!
*Gandhi was shot dead in 1948, at the age of 78.