Monday, June 1, 2009

Hurricane Maps and Standard Deviations

The National Weather Service is changing the way it shows hurricane predictions to the public this year. In years past, hurricane forecast maps have shown a "cone" of likely movement for the storm, corresponding to roughly one standard deviation of their model results, along with a line showing the mean predicted path. Essentially, forecasters run simulations hundreds of times on several different numerical codes and then compute statistics on the results to produce a "cone and line" for the storm's likely path. For example, below is the "cone and line" graphic for Hurricane Katrina. As you will note, the mean of the models had Katrina making landfall on the Mississippi side of the Mississippi/Alabama border. As we all remember, that turned out to be wrong - the storm hit New Orleans dead on.

Technically, the plot was spot on with its prediction: Katrina stayed well within the forecast cone. Your average meteorologist (or scientist, for that matter) understands that weather forecasts are statistical rather than exact; they can't say what a storm will do, only what it is likely to do.

Here's where things are changing this year. The National Weather Service has announced that it is removing the mean line from its forecast plots. It turns out that something like 3 in 4 people surveyed said they would make evacuation decisions in the face of a hurricane based on the line rather than the cone. In fact, most people completely ignore the cone when looking at the forecast maps and focus solely on the line. Essentially, most people don't appreciate the chaotic nature of real weather.
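The cone-and-line construction described above can be sketched in a few lines: given an ensemble of simulated tracks, the line is the mean position at each forecast time and the cone's width is roughly one standard deviation of the ensemble. This is a toy illustration with made-up numbers, not the NWS's actual procedure or data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ensemble: 200 simulated storm tracks, each giving (lat, lon)
# at 5 forecast times. Real forecasts combine hundreds of runs from
# several numerical models; everything here is invented for illustration.
n_runs, n_times = 200, 5
base_track = np.linspace([25.0, -85.0], [30.0, -89.0], n_times)  # hypothetical path
spread = np.linspace(0.1, 1.5, n_times)[:, None]  # uncertainty grows with lead time
tracks = base_track + rng.normal(0.0, 1.0, (n_runs, n_times, 2)) * spread

mean_line = tracks.mean(axis=0)                 # the "line": mean position per time
cone_radius = tracks.std(axis=0).mean(axis=1)   # the "cone": ~1 sigma per time

for t in range(n_times):
    print(f"t={t}: center=({mean_line[t, 0]:.2f}, {mean_line[t, 1]:.2f}), "
          f"cone radius ~{cone_radius[t]:.2f} deg")
```

Note that the cone widens with lead time because the ensemble spreads out, which is exactly why a landfall near the cone's edge (like Katrina's) is still a "correct" forecast.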


  1. Actually, the prediction you have posted there (the map) was almost dead on. The eye of the hurricane made landfall in Mississippi, where it did the most physical damage (I actually drove through there a year later, and there wasn't much left of the Mississippi beachfront).

    If you want a good erratic hurricane to look at, try Mitch - it moved around a lot.

  2. From Wikipedia:

    "Given the current speed of progress, Supercomputers are projected to reach 1 Exaflops in 2019. Erik P. DeBenedictis of Sandia National Laboratories theorizes that a Zettaflop computer is required to accomplish full weather modeling, which could cover a two week time span accurately. Such systems might be built around 2030."

    Good, within the next ~20 years I can plan on what to wear 10 days in advance.

  3. They might want to draw both one- and two-sigma cones. I mean, every year there are several hurricanes (just not all of them hit the US), and a storm should stray outside a one-sigma cone roughly a third of the time.

    One day we will have 4 hurricanes hit land in one year, one will be off by more than one sigma, and the headlines will read "How did our meteorologists get things so wrong?"

  4. The problem with weather forecasting is that the equations used are chaotic, so very small changes to initial conditions lead to very different predictions. The way weather models try to overcome this is by running hundreds of simulations, adding small (within measurement error) random variations to the initial conditions each time, and then compiling statistics on the results. Thus when a weatherman says there is a 70% chance of rain, what he or she means is that when they ran their models with current conditions as inputs, they got rain 70% of the time.

    Of course bigger computers will produce better forecasts, but that will mainly be in their ability to use a larger physical domain. Currently NOAA's standard weather models cover 160,000 square kilometer boxes (roughly the size of Montana). Right now people are working on coupling local weather models to global climate models in order to improve accuracy, but that is probably 5 years away from being reliable for things like tracking hurricanes.
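The ensemble procedure described in comment 4 can be sketched with a toy chaotic system. Here the logistic map stands in for a weather model (purely an assumption for illustration - real forecasts solve fluid-dynamics equations): we perturb the initial condition within a made-up "measurement error", rerun the model many times, and report the fraction of runs that end in a "rain" state.

```python
import numpy as np

rng = np.random.default_rng(42)

def toy_model(x0, steps=30, r=3.9):
    """Logistic map: a chaotic stand-in for a weather model."""
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

measured_x0 = 0.3          # the "current conditions"
measurement_error = 0.001  # perturbations within instrument error (assumed)
n_runs = 500

# Ensemble: rerun the model with slightly perturbed initial conditions.
finals = np.array([
    toy_model(measured_x0 + rng.uniform(-measurement_error, measurement_error))
    for _ in range(n_runs)
])

# Call it "rain" if the final state exceeds an arbitrary threshold.
chance_of_rain = (finals > 0.5).mean()
print(f"Chance of rain: {chance_of_rain:.0%}")
```

Even though every run starts within measurement error of the same state, chaos amplifies the tiny differences until the outcomes spread widely, which is why the honest answer is a probability rather than a yes or no.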

