A common assumption about climate change is that increasing temperature goes hand in hand with increasing climatic variability. One problem with this claim is that, while temperature is a well-defined concept, variability can mean different things to different people. It is clear that there are mechanisms (e.g. hurricane systems) that link temperature to higher rates of extreme events in some systems. It is also clear that an increasing mean temperature leads to a higher probability that maximum temperatures will exceed a fixed value (which is sometimes how “extreme events” are defined). Yet, if we define variability strictly as the extent of deviations from a (running) mean (which I have also often seen used as a description of climate change in ecological models), things are far less clear, both in terms of empirical data and in terms of mechanistic understanding. It’s therefore good to see that Chris Huntingford and colleagues take a closer look at the question of increasing temperature variability in this week’s issue of Nature. Perhaps surprisingly to some, they find no consistent increase in temperature variability after accounting for a number of technical artifacts. From the end of their abstract:
The normalization of temperature anomalies creates the impression of larger relative overall increases, but our use of absolute values, which we argue is a more appropriate approach, reveals little change. Regionally, greater year-to-year changes recently occurred in much of North America and Europe. Many climate models predict that total variability will ultimately decrease under high greenhouse gas concentrations, possibly associated with reductions in sea-ice cover. Our findings contradict the view that a warming world will automatically be one of more overall climatic variation.
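The distinction between a shifting mean and changing variability is easy to illustrate numerically: if daily maximum temperatures are (for the sake of argument) normally distributed with a fixed spread, raising only the mean increases the probability of exceeding a fixed threshold, even though the variability around the mean is unchanged. The sigma and threshold values below are purely illustrative, not taken from the paper.

```python
from statistics import NormalDist

sigma = 2.0        # spread of daily maxima (deg C), held constant
threshold = 35.0   # fixed "extreme heat" threshold (deg C)

# Shift only the mean; variance (our strict "variability") stays the same.
for mean in (30.0, 31.0, 32.0):
    p_exceed = 1.0 - NormalDist(mean, sigma).cdf(threshold)
    print(f"mean={mean:.0f} C  P(T > {threshold:.0f} C) = {p_exceed:.4f}")
```

Each one-degree shift in the mean roughly triples the exceedance probability here (about 0.006, 0.023, 0.067), so "more extremes" in this threshold sense follows from warming alone and says nothing about whether deviations from the (running) mean have grown.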
Food for thought, also for some ecological research I suppose.