A Textbook Example

In his book Thinking, Fast and Slow, academic psychologist and Nobel laureate Daniel Kahneman tells a story about how he, his long-time collaborator Amos Tversky and other faculty members began to write a textbook about judgment and decision-making. After a brisk year in which the team constructed a detailed outline of the syllabus, wrote a couple of chapters and generally felt pretty good about life, Kahneman asked his team members to estimate how long it would take them to finish the textbook. Their estimates were all very positive. Everyone thought the job would take about two years. The most optimistic estimate was 18 months; the most pessimistic was two and a half years.

Kahneman had a further idea. He asked the team’s curriculum expert whether he could think of other writing teams that had developed a curriculum from scratch. After a little thought the expert fell silent. When he spoke at last he was embarrassed to report that not every team that had reached their own stage of readiness had even finished the job. In fact, about 40% of them had failed to complete the task at all, and of those that had finished, none had done so in less than seven years or more than ten. Worse still, the expert judged that Kahneman’s group was less well endowed with skills and resources than the teams that had gone before.

No-one in the team would have been willing to spend another six years working on a project that had a 40% chance of failure. But what did Kahneman and his colleagues do? They did what most people do when presented with an unfavourable forecast for a pet project, of course: they decided that nothing so awful could possibly apply to them and continued regardless.

The book was finally completed eight years later. By then Kahneman had left that job and had emigrated to another country.

Inside View & Outside View

This would just be an amusing story that happened to someone else were it not so embarrassingly familiar to anyone who has been involved in the business of planning and forecasting, or forecasting and planning in business, which is to say just about everyone. Kahneman and Tversky later distilled three lessons from their experience:

  1. There are two different approaches to forecasting. When we take the inside view we focus on specific circumstances and search for evidence in our own experiences. We fail, gloriously fail, to take account of what Donald Rumsfeld memorably described as the ‘unknown unknowns,’ the vicissitudes beyond anyone’s planning horizon – the illnesses, the bureaucratic cock-ups, the heartaches and the thousand natural shocks that flesh is heir to – that play havoc with our plans. The outside view of planning, in which one anchors one’s forecast upon a baseline or reference estimate of projects similar to one’s own, is much more likely to indicate how those projects generally turn out.

  2. More generally, the product of the inside view – in the story about the textbook, the two-year estimate – is usually a very optimistic forecast. Kahneman and Tversky named the tendency for the inside view to produce an unrealistically optimistic forecast the Planning Fallacy. They later refined that term to mean plans and forecasts that:

    • are unrealistically close to the best-case scenario; and
    • could be improved by consulting statistics of similar cases.
  3. It is hard, very hard, to act upon rational, outside view assessments. Once you have invested a great deal of effort in an enterprise it is very easy to find reasons to continue with it, even though there is good evidence available to you that you should abandon it and briskly walk away.

Improving Planning

These three phenomena go a very long way to explain why so many bad projects get started and, once they’re under way, why they continue far longer than they should. Planners take the inside view too often; they produce very optimistic forecasts; and they continue with flawed projects for too long. And although this may not be a major problem for society when we’re talking about writing a new psychology textbook, things are much more serious when the project in question is a major governmental development such as a multi-billion pound railway line, a new power station or a ‘transformational’ programme. What can we do about it?

Imagine a spectrum of ‘predictability.’ At one extreme are circumstances of zero predictability – that is, where you can never tell in advance how things will turn out – and so one’s best estimate about a given outcome (for example: sales of a book, total fees for a drafting job, time taken to complete a negotiation) is the average result. At the other extreme are circumstances where prediction for a given outcome is perfect, so one may ignore the average result completely. Most real-world circumstances fall somewhere in the middle, so most of the time our predictions should be regressive; that is, they should fall somewhere between the average outcome and one’s estimate for the case at hand. We should ‘drag’ our intuitive estimates towards the historical average.
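The ‘dragging’ idea can be made concrete with a little arithmetic. A minimal sketch in Python, assuming we summarise predictability as a single coefficient between 0 (pure chance, so use the average) and 1 (perfect, so trust the intuitive estimate) – the figures here are invented purely for illustration:

```python
def regressive_estimate(intuition, average, predictability):
    """Drag an intuitive estimate towards the historical average.

    predictability: 0.0 = outcomes are pure chance (use the average),
                    1.0 = outcomes are perfectly predictable (trust intuition).
    """
    return average + predictability * (intuition - average)

# Invented figures: we guess 2 years, similar projects average 8 years,
# and we judge this class of problem to be weakly predictable (0.3).
print(regressive_estimate(2.0, 8.0, 0.3))  # 6.2 years
```

At the extremes the formula behaves as the spectrum suggests: with predictability 0 it returns the average (8 years), and with predictability 1 it returns the intuitive estimate unchanged (2 years).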

Note that this is not the same thing as simply adding some contingency to a forecast to cover known risks. In Donald Rumsfeld’s terminology, those are ‘known unknowns.’ They are, to a large extent, tractable. The unknown unknowns – including ‘Black Swans’ – are wild cards.

Years after the textbook debacle Kahneman and Tversky devised a five-step procedure to generate properly regressive predictions. It goes as follows:

  1. Select an appropriate reference class (i.e., the class of problem similar to the one at hand).
  2. Assess the distribution of the reference class.
  3. Make one’s intuitive estimate.
  4. Assess the ‘predictability’ of this class of problem.
  5. Correct the intuitive estimate by ‘dragging’ it towards the reference class distribution.
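The five steps above can be sketched as a short Python routine. The reference-class figures and the predictability value are assumptions chosen for illustration, not data from any real project:

```python
from statistics import mean

# Step 1: select a reference class -- completion times (in years) of
# hypothetical comparable projects, invented for illustration.
reference_class = [7.0, 7.5, 8.0, 9.0, 9.5, 10.0]

# Step 2: assess the distribution of the reference class
# (here summarised by its mean).
baseline = mean(reference_class)

# Step 3: make the intuitive, inside-view estimate.
intuition = 2.0

# Step 4: assess the 'predictability' of this class of problem
# (0 = none, 1 = perfect); inevitably a judgement call.
predictability = 0.3

# Step 5: correct the intuition by dragging it towards the baseline.
forecast = baseline + predictability * (intuition - baseline)
print(round(forecast, 2))
```

Note where the numbers land: the corrected forecast sits between the intuitive two-year estimate and the reference-class average, and the less predictable the class of problem is judged to be, the closer it sits to the average.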

Although a detailed discussion of the mathematics involved is beyond the scope of this article, the essential idea can be seen pretty clearly in diagrammatic form:

(Table not reproduced in this version.)

This regressive correction allows your forecast to ‘cut to the chase.’ It factors in the effect of unknown unknowns, given how they’ve affected projects similar to yours. (It does not – and by definition, cannot – identify the unknown unknowns that will afflict your project, or tell you how to manage them if and when they arise.) This technique is the basis of the Reference Class Forecasting method for major programmes championed by Professor Bent Flyvbjerg of Oxford University’s Saïd Business School.