Wednesday, August 11, 2010

Global Warming Could Be Bad... Or It Could Be Worse

The data on global climate change from 1950 to the present leads to some pretty solid conclusions - the Earth's climate is warming, and human greenhouse gas emissions are at least one of the leading causes.  What everyone really wants to know, however, is not what has happened but what will happen to our climate if we keep doing what we're doing.  That's a much tougher question, because it relies almost totally on computer models that use hundreds of parameterizations, simplifications, and approximations, some of which are poorly understood.  However, the models aren't stupid - when one feeds in historical data one generally reproduces historical results - and climate scientists generally know what the uncertainty on each of these estimates is.

Based on these models, the IPCC estimates a 1.8 to 4 degree Celsius rise in average global temperature by the year 2100.  Given all those uncertainties, some people might be inclined to say that there's a good chance it really won't be that bad.  They're right - it turns out it might be worse.

From NASA's ever-useful Earth Observatory come the results of running a suite of climate models thousands of times, tweaking various parameters randomly (but in proportion to their uncertainty) on each run.  By doing this, scientists get a feel for just how much the things they don't know about their models can impact the results.  The output is a probability distribution function, which shows the relative likelihood of each possible global average temperature increase by 2100.  Here's the result:

Clearly the highest probability is for an increase of between 1.8 and 4 degrees, as the IPCC estimated.  But according to the models there is almost no chance of less than 1.8 degrees of warming, while there is a decent chance of between 4 and 6 degrees of warming, and a slim chance of as much as 10 to 12 degrees.
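For the curious, the ensemble method described above can be sketched in a few lines of code.  This is a minimal toy illustration of the bookkeeping, not a real climate model: the one-line "model" and the parameter means and spreads are all made up for the sake of the example.

```python
# Toy parameter-perturbation ensemble, loosely mimicking the method
# described above. The "model" is a hypothetical one-line stand-in
# (warming proportional to climate sensitivity), NOT real physics.
import random

random.seed(42)

def toy_model(sensitivity, feedback):
    # Hypothetical: warming by 2100 for a given climate sensitivity
    # (deg C per CO2 doubling) and a feedback multiplier. Real models
    # integrate physics; this just illustrates the ensemble machinery.
    return sensitivity * feedback

def run_ensemble(n_runs=10000):
    results = []
    for _ in range(n_runs):
        # Perturb each parameter randomly, in proportion to its
        # (assumed, made-up) uncertainty, on every run.
        sensitivity = random.gauss(3.0, 0.8)
        feedback = random.lognormvariate(0.0, 0.25)
        results.append(toy_model(sensitivity, feedback))
    return results

def histogram(samples, bin_width=1.0):
    # Bin the runs to get an empirical probability distribution.
    counts = {}
    for t in samples:
        b = int(t // bin_width)
        counts[b] = counts.get(b, 0) + 1
    total = len(samples)
    return {b: c / total for b, c in sorted(counts.items())}

warming = run_ensemble()
for b, p in histogram(warming).items():
    print(f"{b:>3} to {b + 1} deg C: {p:.3f}")
```

Even with a fake model, the shape of the output is instructive: a skewed distribution with a long tail toward high warming, which is qualitatively what the NASA study found.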


  1. Very interesting graph. Naively I would have thought it would be skewed toward the small temperature changes. I mean, this plot makes it look like it is more probable that there is an 11 degree Celsius increase than a 1 degree increase. That's incredible!

  2. Joseph,

    It turns out it's almost impossible not to cause some serious warming when CO2 levels rise as far as they have in the past 100 years. I should stress that these models aren't perfect, but as far as we understand their imperfections it looks like things are going to get warmer.

  3. One thing to keep in mind is that in the more extreme cases we should start seeing stronger negative feedback mechanisms. One thing I saw (I wish I could find the reference again) is that all of the models used by the IPCC assumed that the absorption of CO2 by plants would remain constant, independent of the temperature (essentially they assumed a linear relationship between the total amount of biomass and its absorption capabilities). But there is apparently evidence to show that this is not true. Which means that if you increase the temperature then plants will absorb more CO2 per unit of biomass than they currently do. Thus this would act as a strong negative feedback, which apparently is not taken into account by the models (and thus could not affect their outcomes).

    To borrow a phrase, it's the "unknown unknowns" that are important.

  4. "However the models aren't stupid - when one feeds in historical data one generally reproduces historical results"

    This raises a red flag with me. I don't believe that a model being able to reproduce historical results using historical data is sufficient evidence that a model is not stupid or otherwise deficient.

  5. Ben, on the surface your objection does not make sense. If you feed in the data from 1960 and before, and the model accurately predicts what happened from 1960-1980, that is a good indication of the usefulness of the model. But (big emphasis on the but), this is not a perfect green light for the model, and I think that is what you are getting at. Just because the model is valid over the range 1900-2000 does not mean that it will be valid over the range 2000-2100, because obviously something has changed (i.e. more CO2, higher average temperatures), which may mean that the model must be re-tuned to work properly - and assuming that no new tweaking is needed is hazardous.

  6. Quantumleap42,

    I would expect a given model to allow for changing values of variables such as average temperature and so on, so I'm not too worried about that (indeed, if a model that purports to accurately represent global temperature and chemical dynamics on large time scales can't handle that, then what good is it?). My concern has to do with the robustness of the historical agreement. Good agreement from e.g. 1980-2000 is fine, but what if you feed the data from before 1960 into the same model? Does it give you the right results for 1960-1980? Does feeding in the data through 1990 give accurate results for 1990-2010? If a model is unable to do this, then how can one claim that the model accurately represents the real world, or that its projections into the next 100 years have any legitimacy?
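The rolling hindcast test the commenters are debating - fit on data up to a cutoff year, then check predictions against the held-out later years - can be sketched simply.  Everything here is a hypothetical stand-in: the "model" is a plain linear trend fit and the "temperature record" is synthetic, chosen only to show the train/test bookkeeping.

```python
# Sketch of a rolling hindcast: fit a model on data up to a cutoff
# year, then measure its error on the years after the cutoff.
# The model and data below are illustrative stand-ins, not real
# climate models or records.

def fit_trend(years, temps):
    # Least-squares linear trend (a deliberately simple toy model).
    n = len(years)
    my = sum(years) / n
    mt = sum(temps) / n
    slope = (sum((y - my) * (t - mt) for y, t in zip(years, temps))
             / sum((y - my) ** 2 for y in years))
    return slope, mt - slope * my

def hindcast_error(years, temps, cutoff):
    # Train on years <= cutoff, evaluate on the held-out later years.
    pairs = list(zip(years, temps))
    train = [(y, t) for y, t in pairs if y <= cutoff]
    test = [(y, t) for y, t in pairs if y > cutoff]
    slope, intercept = fit_trend([y for y, _ in train],
                                 [t for _, t in train])
    errors = [abs((slope * y + intercept) - t) for y, t in test]
    return sum(errors) / len(errors)

# Synthetic "record": a steady 0.01 deg C/yr trend (made up).
years = list(range(1900, 2011))
temps = [0.01 * (y - 1900) for y in years]

# Repeat the test with several cutoffs, as comment 5 suggests.
for cutoff in (1960, 1980, 1990):
    print(cutoff, round(hindcast_error(years, temps, cutoff), 4))
```

On this perfectly linear fake record every cutoff gives essentially zero error; the interesting case, and the point of the commenters' debate, is real data where the governing conditions change between the training and test periods.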

