Dan Gardner of the Ottawa Citizen is one of my very favourite journalists. I first came across his work during the 2008 election campaign, in which he distinguished himself by writing what turned out to be the only article on carbon taxes that made no reference to Stéphane Dion's accent. His piece was a beacon of intellectual curiosity in what was otherwise a very bleak media landscape, for which I was almost pathetically grateful.
Happily, he has also found time to write books. His first book, Risk, dealt with the psychology of how we handle uncertainty, and is well worth tracking down, buying and reading. His most recent book, Future Babble, notes that one of the ways we deal with uncertainty is to rely on the predictions of experts. Unfortunately, experts aren't very good at forecasting, but no-one seems to notice or care. Many of the examples Dan uses are, inevitably, from economics, so this is a good book for economists to read and to take seriously.
I think it's worth elaborating on this point a bit. If experts can't produce reliable forecasts, then how can they be experts? The answer is that their expertise is in their ability to make conditional forecasts about a given phenomenon, holding everything else constant - the famous 'ceteris paribus' qualifier. And when we are able to control for these factors, economists' predictions aren't all that bad: when Hurricane Katrina knocked out 10% of North America's refinery capacity, the resulting increase in gasoline prices was pretty much in line with available estimates for gasoline demand elasticities. What economists can't do is predict the arrival of hurricanes that knock out 10% of refinery capacity.
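To see roughly how that check works, here is a minimal back-of-the-envelope sketch. The elasticity below is an assumed, purely illustrative value, not an estimate from the Katrina episode or from the literature.

```python
# Back-of-the-envelope sketch: what price increase does a 10% supply loss imply?
# The elasticity is an assumed, illustrative value, not a cited estimate.
supply_shock = -0.10       # 10% of refinery capacity knocked out
demand_elasticity = -0.25  # assumed short-run price elasticity of gasoline demand

# Quantity demanded must fall by 10% to clear the market:
# %dQ = elasticity * %dP  =>  %dP = %dQ / elasticity
price_change = supply_shock / demand_elasticity
print(f"Implied gasoline price increase: {price_change:.0%}")  # roughly 40%
```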
The problem facing economic forecasters is that they are invariably asked to produce unconditional forecasts. This would be a daunting task, even if they could make conditional forecasts with a fair amount of precision. We might be able to call one or two coin tosses. But as the forecast horizon gets longer, there are more coin tosses to call, and accuracy deteriorates. Dan makes the point that meteorologists can make reliable forecasts for the next 24 or 48 hours, but not beyond horizons of 10 or 14 days.
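Here is a minimal sketch of why accuracy deteriorates with the horizon; the per-toss probability is assumed purely for illustration. Even if each independent shock is called correctly with some fixed probability, the chance of calling the entire path shrinks geometrically.

```python
# Minimal sketch: the chance of calling every independent "coin toss" correctly
# shrinks geometrically with the forecast horizon. p is assumed for illustration.
p = 0.5  # probability of calling any single toss correctly
for horizon in (1, 2, 5, 10, 14):
    print(f"{horizon:2d} periods ahead: P(whole path called right) = {p ** horizon:.4f}")
```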
These inherent uncertainties are an integral part of a proper forecast, and it's hard to fault an analyst who goes to great lengths to document the ways and likelihoods in which a certain projection could go awry.
The real targets of Future Babble are those who eschew the ways of responsible forecasting and who make confident, unconditional predictions based on not much more than their brand of Personal Credibility. There are lots of them, and the most delightful parts of the book are those where they get smacked down. I'll resist the temptation to go over them; it would be like giving away the best jokes of a comedy. And some of them are pretty funny.
Even though many of the examples are from economics, the real culprits are what Declan once called 'newspaper economists': the sort of people who issue press releases and who are quoted in the media for really no better reason than because they issue press releases.
In a recent column, Dan drew the distinction between how Bank of Canada Governor Mark Carney talked about the Bank's forecast and how a Wall Street Master of the Universe talked about his. Carney's was nuanced, guarded, and conditional. The Wall Street Master of the Universe was emphatic and unconditional, and made for much better TV.
And that's the real lesson of Future Babble. What people want from forecasters is certainty. Conscientious forecasters cannot provide it, because the future is inherently uncertain. But as long as suckers are willing to pay for a forecast that promises certainty, there will be snake oil salesmen willing to satisfy that demand.
Before I finally got around to reading it, my main concern about the book was that its message would be interpreted as "experts know nothing" by cranks and assorted do-it-yourself theorists. And it probably will - that's how cranks think.
They already are. I've spotted one climate change denier who is using Dan's book to bolster their denial claims.
Posted by: Robert McClelland | January 03, 2011 at 07:15 PM
And yet he specifically makes the point that uncertainty about long-run climate change forecasts does not invalidate the case for carbon taxes.
Posted by: Stephen Gordon | January 03, 2011 at 07:21 PM
I'd say that Gardner is quite clearly the best journalist in Canada. I first came across his series on the war on drugs in the Citizen sometime around 2000. He won a bunch of awards for that, IIRC, and he's been consistently good ever since.
Posted by: Rob | January 04, 2011 at 12:05 AM
"And that's the real lesson of Future Babble. What people want from forecasters is certainty."
Prediction/betting markets can be very valuable here, since in most circumstances their output is quite close to a consensus forecast. I'd trust a liquid financial market more than any single expert. This goes all the more for large, well-established markets, such as interest rates, inflation, credit events etc.
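As a rough illustration of how a market price is read as a consensus forecast (the contract and price here are hypothetical): a binary contract paying $1 if the event occurs, trading at 62 cents, implies a consensus probability of about 62%, ignoring fees and risk premia.

```python
# Hypothetical binary prediction-market contract paying $1 if the event occurs.
# Ignoring fees and risk premia, the traded price is read as the market's
# consensus probability of the event.
payout = 1.00  # dollars paid if the event occurs
price = 0.62   # assumed last traded price, purely for illustration
implied_probability = price / payout
print(f"Market-implied probability of the event: {implied_probability:.0%}")  # 62%
```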
Posted by: anon | January 04, 2011 at 04:15 AM
I like your point about experts making conditional forecasts. This, unfortunately, highlights the biggest failure of the economics profession during the financial crisis.
Yes, it's difficult to predict what U.S. house prices might do in the future, and it was especially difficult from 2003-2007. By 2007, however, it was clear that house prices had at least flattened out. When I say it was "clear", I mean it was accepted by most economists.
So what was the conditional probability of subprime securities experiencing large losses once house prices flattened out (eliminating refi cash-outs as a dampener of high subprime delinquencies)? It was much higher than economists predicted.
Once subprime losses escalated, what was the probability that lenders would tighten underwriting standards, and that this would result in steeper house price declines? Much higher than predicted.
And what was the conditional probability of shadow bank liabilities -- backed by subprime collateral -- experiencing runs once subprime losses escalated? Apparently much higher than economists predicted.
And once runs on the shadow banking system began in August of 2007, what was the probability that we might see the failure of large financial institutions, and a more systemic run -- especially since shadow banks were short of a liquidity cushion? Again, much higher...
The biggest lie about the financial crisis is that no one could have predicted it. Perhaps that holds for house prices, but everything after that was a matter of gauging conditional probabilities. I am not saying all economists should have pegged it exactly, but they massively underestimated the risk once we knew what the housing market was doing.
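To make the arithmetic concrete, here is a sketch of how those conditionals compound into an unconditional risk; every number is invented purely for illustration, not an estimate of what the probabilities actually were.

```python
# Sketch of how the chain of conditional probabilities above compounds.
# All values are invented purely for illustration.
p_flatten  = 0.8  # P(house prices flatten)
p_losses   = 0.5  # P(large subprime losses | prices flatten)
p_tighter  = 0.7  # P(tighter underwriting, steeper price declines | losses)
p_runs     = 0.4  # P(runs on shadow-bank liabilities | losses escalate)
p_systemic = 0.5  # P(failures of large institutions | runs)

p_crisis = p_flatten * p_losses * p_tighter * p_runs * p_systemic
print(f"Implied unconditional probability of a systemic crisis: {p_crisis:.1%}")
# With these made-up numbers the answer is about 5.6% -- hard to call negligible.
```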
Posted by: David Pearson | January 04, 2011 at 11:32 AM
Well, if that's our biggest failure, then we're doing pretty well; the proportion of economists who follow the US housing market closely enough to offer an informed opinion is very small. And perhaps the real problem is that those who do were too diffident.
Posted by: Stephen Gordon | January 04, 2011 at 12:24 PM
Stephen,
If something is about to wallop the real economy, I think it falls into the scope of macroeconomics. To use a metaphor: if I'm charged with catching a trout, I'll travel to the nearest well-stocked stream, not cast a line into my swimming pool. As I said, you didn't have to make predictions about housing, but rather use what was known about housing to make predictions about the real economy -- based on conditional probabilities.
If I had a macroeconomist on staff at a large corporation, and the biggest financial crisis in 80 years cost my firm dearly, I don't think I'd be too happy with the excuse of, "well, I'm not a housing economist, so how could I have been any help?"
The real question is whether you agree that the conditional probabilities above should have been apparent, or not.
Posted by: David Pearson | January 04, 2011 at 01:18 PM
Even in hindsight, it's entirely possible that they got the probabilities right and were simply unlucky.
Posted by: Stephen Gordon | January 04, 2011 at 02:03 PM
Doubtful. The job of economists is not to make point estimate predictions, but to highlight risks that fall within a range of probabilities. I believe that, as a group, they saw the financial crisis as an exceedingly low-probability event. In this, they were clearly wrong, and not just in hindsight. It was evident from the conditional probabilities I outlined above that there was a material risk of a crisis caused by a plateau in house prices, by an escalation in subprime losses, etc.
Besides, you proposed that we use economists' ability to analyze conditional probabilities as a gauge of the worth of their research and analysis. Saying that they were "unlucky" in analyzing those probabilities implies using some other gauge instead.
Posted by: David Pearson | January 04, 2011 at 02:20 PM
"I believe that, as a group, they saw the financial crisis as an event with an exceedingly low-probability event."
As it turns out, market forecasts (as implied by option prices and other indicators) consistently overestimate the risk of asset market crashes. The 2007-2010 crisis was a partial exception, with volatility indexes at historical lows before the crisis.
On the other hand, if Sumner is right and tight monetary policy is to blame, one could argue that the forecasts were reasonable, since the Fed's response turned out to be unexpectedly disappointing given, e.g., the Chairman's academic background.
Posted by: anon | January 04, 2011 at 03:47 PM
You're trying too hard to be liked by the twitter crowd.
Posted by: Just visiting from Macleans | January 04, 2011 at 11:42 PM
I was assuming this would be a repeat of Tetlock's argument.
Posted by: Wonks Anonymous | January 05, 2011 at 11:16 AM
"They already are. I've spotted one climate change denier that is using Dan's book to bolster their denial claims."
"And yet he specifically makes the point that uncertainty about long-run climate change forecasts does not invalidate the case for carbon taxes."
But does his argument about experts suggest that climate scientists and the IPCC have gone about advocating for climate change policy all wrong? The results in the actual journal articles are couched in conditional probabilities, but whenever you see climate scientists in the media or talking to politicians, they frame the worst-case scenarios as certain events.
Posted by: JDUB | January 05, 2011 at 11:44 AM
I don't like how the G&M experts claim the CPC is protecting sensitive environments. GHG-intensive peat is not being sequestered in AB and the mountain pine beetle is devastating (again, maybe even to peat and permafrost). When Gini becomes too high, experts become captured and there are productivity losses until either a USSR style collapse or ?
In the USA, their executive branch is rich-people captured (which would be all right if they weren't lobbying for a global holocaust), but not their crowns, yet. There is no outrage here because of high commodity prices. But what is the incentive for anyone to contribute to preventing rich Canadians from enduring loss of civil order? Harper kills daycare like a RW robot while giving to oil/tar/banks/insurance and the mentally ill. Poor people have to eat, fuck, live with and tolerate loud AB/CPC subsidized mentally ill people while productive minds are fucked over because AB blames Ignatieff for Great Depression undergovernment. After 1957, the Holocaust in the future became profitable.
Posted by: selective bias is lies if intentional | January 06, 2011 at 11:37 AM