Reporting Uncertainty

In chapter 6, Silver discusses the reporting (or lack thereof) of uncertainty in predictions. He points out that failure to report uncertainty can have catastrophic consequences. For example, in 1997 the National Weather Service forecast that, after a snow-heavy winter, the Red River would crest at 49’ at Grand Forks, North Dakota. Since the levees that held water out of the city were built to withstand 51’, officials did not believe flooding would be an issue. What the weather service did not report is that the prediction had a margin of error of about +/- 9’, which meant there was still a substantial chance the river would rise above 51’ and overtop the levees. As a result, locals did not prepare for the flood, which went on to damage or destroy 75% of the city’s homes. With proper preparation, the floodwaters might have been diverted away from the city and much of the damage avoided. When asked why they didn’t report the margin of error surrounding the prediction, the weather service responded that they were afraid the public would lose confidence in the forecast if uncertainty was reported.
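
To see why a 49’ forecast with a +/- 9’ margin of error leaves real room for disaster, here is a minimal back-of-the-envelope sketch in Python. It assumes (my assumption, not necessarily the forecasters’ actual error model) that the +/- 9’ figure describes a symmetric 90% interval and that forecast errors are roughly normal. Under those assumptions, the chance of the crest topping the 51’ levee comes out to roughly one in three:

from math import erf, sqrt

# Assumed error model (not the weather service's published one):
# the +/- 9 ft margin of error is a symmetric 90% interval around
# the 49 ft forecast, and forecast errors are normally distributed.
forecast_ft = 49.0
levee_ft = 51.0
margin_ft = 9.0

# A 90% normal interval has a half-width of about 1.645 standard
# deviations, so back out the implied standard deviation.
sigma = margin_ft / 1.645

def normal_cdf(x, mean, sd):
    """Normal CDF computed via the error function."""
    return 0.5 * (1.0 + erf((x - mean) / (sd * sqrt(2.0))))

# Probability that the crest exceeds the levee height.
p_overtop = 1.0 - normal_cdf(levee_ft, forecast_ft, sigma)
print(f"P(crest > {levee_ft} ft) = {p_overtop:.0%}")  # roughly 36%

A roughly one-in-three chance of catastrophic flooding is exactly the kind of information residents would have wanted before deciding not to prepare.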

Unfortunately, this fear seems well founded. The general public is not accustomed to seeing uncertainty reported alongside predictions or statistics; people tend to take the numbers at face value. A layperson reading a scientific journal article would most likely be overwhelmed by all the caveats and margins of error presented with the data. The public seems much happier reading brief popular-science articles that state findings without any accompanying uncertainty or error to cloud the results. While this may make people feel more confident in the reported numbers, it actually obscures the true signal by folding the noise into the headline figure without acknowledging it. This is dangerous because it discourages critical thinking. A critical thinker seeing the North Dakota forecast might have noted that 49’ is awfully close to the top of a 51’ levee and concluded that precautions should be taken in case the forecast was not exactly right. Instead of withholding uncertainty to prop up confidence, reporting of uncertainty should become ubiquitous, so that people get used to weighing it when interpreting a forecast.

Another problem is false reporting of uncertainty. As Silver points out, economists often attach a 90% confidence interval to their predictions. A 90% interval should miss the true value only about 1 time in 10; in practice, economists’ intervals have missed roughly 1/3 to 1/2 of the time, which makes the stated coverage hardly better than a coin flip. I would certainly not stake my financial future on such figures! The general public, however, may see “90%” and feel very confident about the forecast. This miscalibration is particularly troubling because most people assume that what is reported to them is correct, and they won’t even think to question the stated uncertainty.
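
To see what such miscalibration looks like in numbers, here is a small simulation. The parameters are purely illustrative assumptions, not Silver’s data: it imagines forecasters whose errors are twice as spread out as their reported intervals admit, then checks how often their nominal 90% interval actually contains the truth.

import random

random.seed(42)

# Illustrative assumptions: forecasters report intervals sized as if
# their error standard deviation were 1.0, but their actual errors
# have a standard deviation of 2.0.
TRIALS = 100_000
true_sd = 2.0        # actual spread of forecast errors (assumed)
claimed_sd = 1.0     # spread implied by the reported interval (assumed)
z90 = 1.645          # half-width of a 90% normal interval, in SDs

hits = 0
for _ in range(TRIALS):
    error = random.gauss(0.0, true_sd)   # realized forecast error
    half_width = z90 * claimed_sd        # reported "90%" half-width
    if abs(error) <= half_width:
        hits += 1

print(f"Nominal coverage: 90%, empirical coverage: {hits / TRIALS:.0%}")
# With errors twice as spread out as claimed, coverage drops to about
# 59%: the interval misses roughly 4 times in 10 instead of 1 in 10.

Under these assumptions the “90%” interval holds up only about six times in ten, which is the same order of miss rate Silver reports for economists’ GDP forecasts.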
