Via Planetizen, some depressing news: an international study has found that transportation planners regularly get their traffic and rail ridership forecasts wrong. And not just by a little. Half of all road traffic forecasts are wrong by at least 20 percent (though road projects tend to get a little more traffic than forecasted). Rail ridership, on the other hand, is typically less than half (ouch!!) of what the planners forecast. The study’s authors have seen no improvements in the accuracy of forecasts over the last 30 years. [Note – I got the road facts wrong the first time I posted this. Sorry.]
One of the study’s authors also coauthored Megaprojects and Risk (a book that’s been on my reading list for a while now), which documents how transportation planners systematically—and self-servingly—underestimate the costs of big projects, while overestimating the benefits.
Now, I have no idea what this means for the many transportation megaprojects now planned or underway in the Northwest. Some planners have assured me, for example, that the estimated cost of rebuilding the Alaskan Way Viaduct was based on a responsible forecast that factored in most of the relevant risks of cost overruns. But with big gas tax hikes proposed in the Washington statehouse to pay at least part of the cost of new megaprojects such as the Viaduct, it would be nice to have a little more confidence in the forecasting process. And the same goes for the new bridge project the BC government is trying to foist on the Greater Vancouver region.
Of course, the problems with megaproject forecasting point to a larger pattern: humans are really, really bad at predicting the future. Stock market professionals consistently overestimate both the magnitude of future market gains and their ability to outperform the market overall. Oil markets did a terrible job of predicting the current price run-ups. People even do a terrible job of predicting what kinds of things will make them happy, systematically believing, for example, that material goods will make them happier than they actually do. And so on. (More examples here, and all over the web.)
You’d think that, by now, we would have learned that we’re bad at predicting the future. But it seems that when facts and overconfidence collide, the facts bounce off, and overconfidence stands firm.