The World Meteorological Organization (WMO) held the third World Climate Conference (WCC-3) in Geneva at the beginning of September 2009. One of the sessions at the conference looked at "Advancing climate prediction science":
The advances in climate prediction and the associated challenges will be demonstrated. The full range of timescales from seasonal to centennial will be covered including how synergy between the different timescales can achieve seamless prediction.
The popular press latched onto the presentation by Mojib Latif, seriously distorting what he said. For some background, watch "Birth of a Climate Crock".
Climate change odds much worse than thought
New analysis shows warming could be double previous estimates
The most comprehensive modeling yet carried out on the likelihood of how much hotter the Earth’s climate will get in this century shows that without rapid and massive action, the problem will be about twice as severe as previously estimated six years ago – and could be even worse than that. …
Other research groups have estimated the probabilities of various outcomes, based on variations in the physical response of the climate system itself. But the MIT model is the only one that interactively includes detailed treatment of possible changes in human activities as well – such as the degree of economic growth, with its associated energy use, in different countries.
The new projections, published this month in the American Meteorological Society’s Journal of Climate, indicate a median probability of surface warming of 5.2 degrees Celsius by 2100, with a 90% probability range of 3.5 to 7.4 degrees. This can be compared to a median projected increase in the 2003 study of just 2.4 degrees. The difference is caused by several factors rather than any single big change. Among these are improved economic modeling and newer economic data showing less chance of low emissions than had been projected in the earlier scenarios. Other changes include accounting for the past masking of underlying warming by the cooling induced by 20th century volcanoes, and for emissions of soot, which can add to the warming effect. In addition, measurements of deep ocean temperature rises, which enable estimates of how fast heat and carbon dioxide are removed from the atmosphere and transferred to the ocean depths, imply lower transfer rates than previously estimated.
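The structure of such a probabilistic forecast can be sketched with a toy Monte Carlo calculation. To be clear, this is not the MIT Integrated Global System Model: the response function and both input distributions below are invented purely to show how a median and a 90% range fall out of sampling uncertain inputs.

```python
# Toy illustration of a probabilistic climate projection (NOT the MIT IGSM):
# sample uncertain inputs, run a trivial response model for each draw, and
# summarize the resulting warming distribution by its median and 90% range.
import random
import statistics

random.seed(42)

def toy_warming(sensitivity, emissions_factor):
    """Hypothetical warming by 2100 (degrees C) for one draw of the inputs."""
    return sensitivity * emissions_factor

draws = []
for _ in range(100_000):
    # Both distributions are made up for illustration only.
    sensitivity = random.lognormvariate(1.0, 0.25)  # climate-sensitivity proxy
    emissions_factor = random.uniform(1.2, 2.2)     # emissions-pathway proxy
    draws.append(toy_warming(sensitivity, emissions_factor))

draws.sort()
median = statistics.median(draws)
lo, hi = draws[int(0.05 * len(draws))], draws[int(0.95 * len(draws))]
print(f"median: {median:.1f} C, 90% range: {lo:.1f}-{hi:.1f} C")
```

The real study propagates uncertainty through a coupled economics-climate model rather than a one-line product, but the summary statistics (median plus a 5th-95th percentile band) are computed from the resulting ensemble in just this way.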
“There’s no way the world can or should take these risks,” Prinn says. And the odds indicated by this modeling may actually understate the problem, because the model does not fully incorporate other positive feedbacks that can occur, for example, if increased temperatures caused a large-scale melting of permafrost in arctic regions and subsequent release of large quantities of methane, a very potent greenhouse gas. Including that feedback “is just going to make it worse,” Prinn says.
A.P. Sokolov, P.H. Stone, C.E. Forest, R. Prinn, M.C. Sarofim, M. Webster, S. Paltsev, C.A. Schlosser, D. Kicklighter, S. Dutkiewicz, J. Reilly, C. Wang, B. Felzer and H.D. Jacoby (2009). Probabilistic forecast for 21st century climate based on uncertainties in emissions (without policy) and climate parameters. Journal of Climate, doi:10.1175/2009JCLI2863.1
Selecting global climate models for regional climate change studies, PNAS, doi:10.1073/pnas.0900094106
Regional or local climate change modeling studies currently require starting with a global climate model, then downscaling to the region of interest. How should global models be chosen for such studies, and what effect do such choices have? This question is addressed in the context of a regional climate detection and attribution (D&A) study of January-February-March (JFM) temperature over the western U.S. Models are often selected for a regional D&A analysis based on the quality of the simulated regional climate. Accordingly, 42 performance metrics based on seasonal temperature and precipitation, the El Niño/Southern Oscillation (ENSO), and the Pacific Decadal Oscillation are constructed and applied to 21 global models. However, no strong relationship is found between the score of the models on the metrics and results of the D&A analysis. Instead, the importance of having ensembles of runs with enough realizations to reduce the effects of natural internal climate variability is emphasized. Also, the superiority of the multimodel ensemble average (MM) to any one individual model, already found in global studies examining the mean climate, is true in this regional study that includes measures of variability as well. Evidence is shown that this superiority is largely caused by the cancellation of offsetting errors in the individual global models. Results with both the MM and models picked randomly confirm the original D&A results of anthropogenically forced JFM temperature changes in the western U.S. Future projections of temperature do not depend on model performance until the 2080s, after which the better performing models show warmer temperatures.
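The error-cancellation effect behind the multimodel mean's superiority is easy to demonstrate with synthetic data. Everything below (the stand-in "observed" series, the bias and noise magnitudes, the number of models) is made up; only the statistical mechanism matches the paper's explanation.

```python
# Toy demonstration of why a multimodel ensemble mean can beat any single
# model: independent, offsetting systematic errors partially cancel when
# averaged. All numbers here are arbitrary; only the effect is the point.
import random
import math

random.seed(0)
truth = [math.sin(0.1 * t) for t in range(200)]   # stand-in "observed" series

n_models = 21
models = []
for _ in range(n_models):
    bias = random.gauss(0.0, 0.3)                 # each model's systematic error
    models.append([x + bias + random.gauss(0.0, 0.1) for x in truth])

def rmse(series):
    return math.sqrt(sum((s - o) ** 2 for s, o in zip(series, truth)) / len(truth))

individual = [rmse(m) for m in models]
ensemble_mean = [sum(vals) / n_models for vals in zip(*models)]

print(f"mean single-model RMSE: {sum(individual) / n_models:.3f}")
print(f"ensemble-mean RMSE:     {rmse(ensemble_mean):.3f}")
```

Because the per-model biases are drawn independently around zero, averaging 21 models shrinks the systematic error roughly by a factor of the square root of 21, so the ensemble mean scores far better than a typical member.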
There is a brief introduction to climate modeling at Wikipedia, here.
If you’d like to contribute to climate modeling by running models in the background on your PC, visit: climateprediction.net
You can see a sample model at the Japan Agency for Marine-Earth Science and Technology:
Reproduction of the Climate Change in the 20th Century by a Numerical Climate Model: Attribution of Causes of the Global Mean Surface Air Temperature Change (Temperature Rise in Late 20th Century Attributed to Human Activities)
Here’s an extract from their commentary:
The experiments cover the past 150 years, including the 20th century. The following eight factors of climate variation were included:
(1) Variations of solar insolation
(2) Changes in aerosols that reached the stratosphere due to large-scale volcanic eruptions
(3) Increase in greenhouse-gas concentrations (carbon dioxide, methane, nitrous oxide, and halocarbons)
(4) Decrease in stratospheric ozone concentration since the mid-1970s
(5) Increase in tropospheric ozone concentration due to human activities
(6) Increase in sulfur dioxide emissions (precursor of sulfate aerosol) due to industrial activities
(7) Increase in carbonaceous aerosol emissions such as soot, due to human activities
(8) Change in land use
Factors (1) and (2) are natural causes of climate variation, while factors (3) through (8) arise from human activities. Runs were also made with subsets of the factors (for example, natural factors only, or human-activity factors only) in order to understand the contribution of each factor to the observed changes in mean surface air temperature.
When all the factors of climate variation are included (Figure 1, top panel), the model reproduces the warming trend very realistically, both in the early 20th century (1910 to 1945) and in the late 20th century (the last 30 years of the century).
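The single-factor experimental design described above can be sketched with a toy zero-dimensional energy-balance model: run the same model once with natural forcings only and once with all forcings, then compare. All forcing histories and parameter values below are invented for illustration; only the method is the point.

```python
# Sketch of an attribution experiment: drive a toy zero-dimensional
# energy-balance model (C dT/dt = F - T/lambda) with different subsets of
# forcings and compare the simulated temperature responses. Every forcing
# series and parameter here is invented for illustration.
def simulate(forcings, years=150, sensitivity=0.8, heat_capacity=8.0):
    """Step the toy EBM annually; returns the temperature-anomaly series (C).

    sensitivity is in K per (W/m^2); heat_capacity sets the response lag.
    """
    temps = [0.0]
    for year in range(years):
        total_f = sum(f(year) for f in forcings)      # total forcing, W/m^2
        t = temps[-1]
        temps.append(t + (total_f - t / sensitivity) / heat_capacity)
    return temps

# Invented forcing histories (W/m^2) as functions of model year 0..149:
solar = lambda y: 0.1 * (1 if (y // 6) % 2 else -1)       # small solar cycle
volcanic = lambda y: -2.0 if y in (33, 83, 111) else 0.0  # sporadic eruptions
ghg = lambda y: 2.5 * (y / 149) ** 2                      # accelerating GHG forcing
aerosol = lambda y: -0.8 * (y / 149)                      # offsetting aerosol cooling

natural_only = simulate([solar, volcanic])
all_forcings = simulate([solar, volcanic, ghg, aerosol])

print(f"final warming, natural only: {natural_only[-1]:.2f} C")
print(f"final warming, all forcings: {all_forcings[-1]:.2f} C")
```

Even in this caricature, the natural-only run wanders near zero while the all-forcings run ends substantially warmer, which is the qualitative signature the JAMSTEC experiments (and the IPCC figures below) exhibit with real forcings and a real model.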
IPCC AR4 WG1 Figure 9.5. Comparison between global mean surface temperature anomalies (°C) from observations (black) and AOGCM simulations forced with (a) both anthropogenic and natural forcings and (b) natural forcings only. All data are shown as global mean temperature anomalies relative to the period 1901 to 1950, as observed (black, Hadley Centre/Climatic Research Unit gridded surface temperature data set (HadCRUT3); Brohan et al., 2006) and, in (a) as obtained from 58 simulations produced by 14 models with both anthropogenic and natural forcings. The multi-model ensemble mean is shown as a thick red curve and individual simulations are shown as thin yellow curves. Vertical grey lines indicate the timing of major volcanic events. Those simulations that ended before 2005 were extended to 2005 by using the first few years of the IPCC Special Report on Emission Scenarios (SRES) A1B scenario simulations that continued from the respective 20th-century simulations, where available. The simulated global mean temperature anomalies in (b) are from 19 simulations produced by five models with natural forcings only. The multi-model ensemble mean is shown as a thick blue curve and individual simulations are shown as thin blue curves. Simulations are selected that do not exhibit excessive drift in their control simulations (no more than 0.2°C per century). Each simulation was sampled so that coverage corresponds to that of the observations. Further details of the models included and the methodology for producing this figure are given in the Supplementary Material, Appendix 9.C. After Stott et al. (2006b).
IPCC AR4 WG1 FAQ 9.2 , Figure 1. Temperature changes relative to the corresponding average for 1901-1950 (°C) from decade to decade from 1906 to 2005 over the Earth’s continents, as well as the entire globe, global land area and the global ocean (lower graphs). The black line indicates observed temperature change, while the coloured bands show the combined range covered by 90% of recent model simulations. Red indicates simulations that include natural and human factors, while blue indicates simulations that include only natural factors. Dashed black lines indicate decades and continental regions for which there are substantially fewer observations. Detailed descriptions of this figure and the methodology used in its production are given in the Supplementary Material, Appendix 9.C.
Predicting the Unpredictable
Climate Modeling Has Limits, but Without It, We’re Underwater
Pity the poor climate modeler. Here’s someone whose contributions are chronically underappreciated, whose methodology is under constant scrutiny and, worse, whose findings are often questioned, if not directly undermined. What’s a modeler to do when it often seems like all his or her work—the entire basis for the discipline, really—gets a bum rap from fellow scientists? Now, at a time when the global community arguably needs more accurate models and data than ever to predict future climate change and weather patterns, it certainly looks as though we should be embracing modelers’ efforts—not denigrating them—and providing them with all the necessary tools to help them improve their output. So what gives?
Hansen et al. (2007). Climate simulations for 1880-2003 with GISS modelE. Clim. Dynam., 29, 661-696, doi:10.1007/s00382-007-0255-8.
We carry out climate simulations for 1880-2003 with GISS modelE driven by ten measured or estimated climate forcings. An ensemble of climate model runs is carried out for each forcing acting individually and for all forcing mechanisms acting together. We compare side-by-side simulated climate change for each forcing, all forcings, observations, unforced variability among model ensemble members, and, if available, observed variability.
The following is "Fig. 6" from the study:
General Circulation Models of Climate
The climate system is too complex for the human brain to grasp with simple insight. No scientist managed to devise a page of equations that explained the global atmosphere’s operations. With the coming of digital computers in the 1950s, a small American team set out to model the atmosphere as an array of thousands of numbers. The work spread during the 1960s as computer modelers began to make decent short-range predictions of regional weather. Modeling long-term climate change for the entire planet, however, was held back by lack of computer power, ignorance of key processes such as cloud formation, inability to calculate the crucial ocean circulation, and insufficient data on the world’s actual climate. By the mid 1970s, enough had been done to overcome these deficiencies so that Syukuro Manabe could make a quite convincing calculation. He reported that the Earth’s average temperature should rise a few degrees if the level of carbon dioxide gas in the atmosphere doubled. This was confirmed in the following decade by increasingly realistic models. Skeptics dismissed them all, pointing to dubious technical features and the failure of models to match some kinds of data. By the late 1990s these problems were largely resolved, and most experts found the predictions of overall global warming plausible. Yet modelers could not be sure that the real climate, with features their equations still failed to represent, would not produce some big surprise. …
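Manabe's doubling result quoted above can be roughly reproduced with the widely used simplified expression for CO2 radiative forcing, dF = 5.35 ln(C/C0) W/m². The climate sensitivity parameter of 0.8 K per W/m² used below is an assumed, illustrative value, not a number from the passage.

```python
# Back-of-envelope check of the CO2-doubling result: the simplified forcing
# expression dF = 5.35 * ln(C/C0) W/m^2, combined with an ASSUMED climate
# sensitivity parameter of 0.8 K per (W/m^2), chosen for illustration.
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Approximate radiative forcing (W/m^2) from a CO2 concentration change."""
    return 5.35 * math.log(c_ppm / c0_ppm)

sensitivity = 0.8                   # K per (W/m^2), assumed
delta_f = co2_forcing(560.0)        # doubled pre-industrial CO2
print(f"forcing: {delta_f:.2f} W/m^2, warming: {sensitivity * delta_f:.1f} K")
```

Since 5.35 ln(2) is about 3.7 W/m², the implied equilibrium warming is about 3 K: "a few degrees", consistent with the calculation Manabe reported.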
FAQ on climate models, 3 November 2008
FAQ on climate models: Part II, 6 January 2009
We discuss climate models a lot, and from the comments here and in other forums it’s clear that there remains a great deal of confusion about what climate models do and how their results should be interpreted. This post is designed to be a FAQ for climate model questions – of which a few are already given. If you have comments or other questions, ask them as concisely as possible in the comment section and if they are of enough interest, we’ll add them to the post so that we can have a resource for future discussions.
The DCESS Earth System Model contains at present atmosphere, ocean, ocean sediment, land biosphere and lithosphere modules. An additional methane hydrate module is under development and an ice sheet module will be developed in the future. Sea ice and snow cover are diagnosed from estimated meridional profiles of atmospheric temperature. The model is divided into two 360° wide zones by 52° latitude. The model ocean is 270° wide and extends from the equator to 70° latitude. In this way the ocean covers 70.5 % of the Earth surface and is divided into low-mid and high latitude sectors in the proportion 84:16. Each ocean sector is divided into 55 layers with 100 m vertical resolution to maximum depths of 5500 m. Each of the 110 ocean layers is assigned an ocean sediment section. Ocean layer and sediment sector widths are determined from observed ocean depth distributions. We expect to make the MATLAB code for the DCESS model available on this homepage before the end of 2009.
Link to DCESS model description publication (pdf).
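The DCESS geometry quoted above is internally consistent, which a few lines of spherical arithmetic confirm: the 70.5% ocean coverage and the 84:16 sector split follow directly from the stated 270° longitude width and the 52° and 70° latitude limits.

```python
# Arithmetic check of the DCESS geometry: an ocean 270 degrees of longitude
# wide, spanning the equator to 70 degrees latitude in both hemispheres,
# split into low-mid (0-52 deg) and high (52-70 deg) latitude sectors.
import math

def band_area_fraction(lat_deg):
    """Fraction of a hemisphere's area between the equator and lat_deg."""
    return math.sin(math.radians(lat_deg))

lon_fraction = 270.0 / 360.0
ocean_fraction = lon_fraction * band_area_fraction(70.0)
low_mid = band_area_fraction(52.0) / band_area_fraction(70.0)

print(f"ocean covers {100 * ocean_fraction:.1f}% of the Earth surface")
print(f"low-mid : high split = {100 * low_mid:.0f} : {100 * (1 - low_mid):.0f}")
# -> ocean covers 70.5% of the Earth surface
# -> low-mid : high split = 84 : 16
```

The layer count works the same way: 55 layers of 100 m vertical resolution reach the stated 5500 m maximum depth, and two sectors of 55 layers give the 110 ocean layers, each with its sediment section.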
NASA Earth Observatory: Building a Climate Model
To understand how sunlight, air, water, and land come together to create Earth’s climate, scientists build climate models—computer simulations of the climate system. Climate models include the fundamental laws of physics—conservation of energy, mass, and momentum—as well as dozens of factors that influence Earth’s climate. Though the models are complicated, rigorous tests with real-world data hone them into robust tools that allow scientists to experiment with the climate in a way not otherwise possible. …
So far, the only way scientists can get the models to match the rise in temperature seen over the past century is to include the greenhouse gases that humans have put into the atmosphere. This means that, according to the models, humans are responsible for most of the warming observed during the second half of the twentieth century.