Keeping Score

What you learn from measuring predictions

November 2nd 2015

Minds on the future

Picture a sprawling organisation with an annual budget of $50 billion. As in all organisations, its executives make decisions based on what they think will happen in the future—forecasts, in other words—but forecasting is particularly important for this organisation, because much of its money is spent generating forecasts which it shares with others. This is America's intelligence community, consisting of 17 government agencies. One might assume, given its vast resources and experience, that its forecasts are as accurate as humanly possible. Or perhaps, given its high-profile mistakes, one might think its forecasts are hopeless. The truth? Until recently, nobody really knew.

That's not unusual. All organisations make forecasts and pay attention to the forecasts of others. Yet the accuracy of much of this forecasting—maybe most—has not been determined. It's an astonishing oversight.

People routinely make quick, intuitive judgments using only minimal information. That's not a bad way to operate in many circumstances. But when it comes to judging forecasting, it means that memorable successes and failures—the analyst who correctly warned of a market collapse, the executive surprised by a competitor—will excessively influence our perception. This often gives us the feeling of knowing when in fact we do not know. To know, the accuracy of forecasters would have to be systematically tested, and it hasn't been.

We understand this in other domains. The only reasonable way to judge an athlete is to review performance statistics. No rational person would swallow medicine unless it had been scientifically tested for efficacy and safety. Yet when it comes to the forecasting that informs critical decisions, organisations routinely pay without demanding proper evidence of quality.

It's in the interests of organisations to test forecasting, but it's often not in the interests of those who run them. What if Dave in the mail room turns out to have a better sense of how the market is developing than Bob in the CEO suite?

Psychology and self-interest are formidable barriers—but not insurmountable. In 2011 a branch of US intelligence funded a four-year tournament that saw five research teams use whatever methods they wished to forecast the sorts of questions intelligence analysts tackle. Will Russia annex Crimea? Will Greece default? Will the Chinese economy slow? From this we could identify the winning methods and the strategies used by leading forecasters—or "superforecasters", as we call them. None requires a PhD in econometrics.
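
Picking winners does, of course, require a yardstick. A standard one for probabilistic yes/no forecasts is the Brier score: the squared gap between the probability offered and what actually happened, averaged across questions, with lower scores better. Below is a minimal sketch of the arithmetic, using invented forecasts rather than tournament data.

```python
# A minimal sketch of Brier scoring for yes/no forecasts.
# The questions, probabilities and outcomes below are invented, not tournament data.

def brier_score(forecasts, outcomes):
    """Mean squared gap between forecast probabilities and what happened (0 is perfect)."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

forecasts = [0.80, 0.15, 0.60]   # probability of "yes" given to three hypothetical questions
outcomes = [1, 0, 1]             # what actually happened: 1 = yes, 0 = no

print(f"Brier score: {brier_score(forecasts, outcomes):.3f}")
# Lower is better; always answering 50% scores exactly 0.25 on every question.
```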

How to be a superforecaster

One lesson: don't overuse what statisticians call the "ignorance prior", the 50% answer people give whenever they feel they don't know enough to guess. You often do know enough. Most of us know nothing about the inside workings of the regimes in Egypt or Saudi Arabia, so, if asked how likely they are to be deposed, we might shrug and say 50%. Not so fast. Take the outside view instead: how often are authoritarian regimes overthrown in any given year? Do that and you will wind up with a better guess—in this case, closer to zero than to 50%.
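
To make the outside view concrete, here is a minimal sketch of a base-rate starting estimate; the counts are hypothetical placeholders, not real figures on regime survival.

```python
# A minimal sketch of an outside-view starting estimate.
# The counts are hypothetical placeholders, not real data on regime survival.

regime_years = 1000   # regime-years observed across some reference class of authoritarian states
overthrows = 20       # how many of those years ended with the regime being deposed

base_rate = overthrows / regime_years
print(f"Outside-view starting estimate: {base_rate:.1%} per year")
# Roughly 2% here: far closer to zero than to 50%, before any case-specific detail is weighed.
```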

Another lesson: don't fall in love with your first estimate. The best forecasters readily update their calls in light of new information. But they also avoid the opposite mistake of assuming that each new development changes everything. They know that truly big changes—revolutions, secessions, forced currency conversions—are rare, so their updates tend to be tweaks. Making many small updates is often an effective way to strike a balance between under- and over-reacting.
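
Superforecasters need no formula for this, but Bayes' rule in odds form captures the idea: each piece of news multiplies the odds by a modest factor, nudging the probability rather than overturning it. A minimal sketch with invented numbers follows.

```python
# A minimal sketch of incremental updating using Bayes' rule in odds form.
# Each likelihood ratio is a modest factor, so the probability moves in small steps.
# The starting estimate and the ratios are invented for illustration.

def update(prob, likelihood_ratio):
    """Posterior odds = prior odds * likelihood ratio; return the new probability."""
    odds = prob / (1 - prob)
    odds *= likelihood_ratio
    return odds / (1 + odds)

p = 0.02                       # outside-view starting estimate
for lr in [1.5, 2.0, 0.8]:     # three news items: two mildly supportive, one mildly contrary
    p = update(p, lr)
    print(f"Updated probability: {p:.1%}")
```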

Bottom line: superforecasting takes practice at distinguishing shades of grey, the difference between a 55/45 bet and a 45/55 one. It requires unusually fine-grained assessments of uncertainty. Others besides America's intelligence agencies are taking note. From Wall Street to Silicon Valley, executives are starting to ask how good their forecasting really is and whether it could be better. UBS, a bank, has even launched its own internal forecasting tournament. As more people get serious about forecasting in 2016, the probability is high that their forecasting will improve—and so will their decision-making.
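
For an organisation that wants to start keeping score, one simple first step is a calibration check: group past forecasts by the probability given and see how often those events actually happened. A minimal sketch with invented records is below.

```python
# A minimal sketch of a calibration check on past forecasts.
# Each record pairs a stated probability with what happened (1 = yes, 0 = no); all invented.
from collections import defaultdict

records = [(0.6, 1), (0.6, 0), (0.6, 1), (0.9, 1), (0.9, 1), (0.1, 0), (0.1, 0), (0.1, 1)]

buckets = defaultdict(list)
for prob, outcome in records:
    buckets[prob].append(outcome)

for prob in sorted(buckets):
    hits = buckets[prob]
    print(f"Said {prob:.0%}: happened {sum(hits) / len(hits):.0%} of the time (n={len(hits)})")
# Well-calibrated 60% calls should come true roughly 60% of the time over many forecasts.
```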

*Are you a superforecaster? Find out by taking part in The World in 2016 challenge, which is running at
