Everyone watches the nightly news to prepare for the upcoming week's weather. No matter how we prepare, the forecasts sometimes fall short of actual events. But weather prediction is a complex problem. Meteorologists have to take into account pressure systems, wind strength and direction, water vapor, temperature conditions at the ground and in the atmosphere, and much more. Predicting climate conditions in the next century as the Earth's climate changes is also a challenge. Having more and better observations is essential to improving both weather and climate forecasts.
Researchers Gil Compo, Prashant Sardeshmukh, and Jeff Whitaker set about developing a long-term dataset of the twentieth century to place current climate and weather anomalies in a historical context. The three researchers are part of the NOAA Earth System Research Laboratory and the CIRES Climate Diagnostics Center at the University of Colorado at Boulder. They were able to accomplish their analysis through the use of powerful DOE Office of Science-supported supercomputers at Lawrence Berkeley and Oak Ridge National Laboratories.
Their methodology is described in a 2006 paper entitled "Feasibility of a 100-year reanalysis using only surface pressure data," published in the Bulletin of the American Meteorological Society.
To gauge how well computer simulations capture real climate events, the research team turned to historical weather data. Weather conditions have been observed and recorded on land and at sea from the nineteenth century to the present. The team entered historical barometric pressure records, obtained from weather stations and shipboard measurements, into a computer program that combines the data with computer simulations of the weather. These simulations were then compared to actual weather events to test the program's accuracy. But let's take a step back.
The current story began when NOAA digitized a series of old weather maps that had been made in the 1940s, as part of the war effort, drawing on archived data from earlier periods.
Compo wondered, "Wouldn't it be great to improve on these weather maps with new marine data recently recovered by NOAA?" Compo and his colleagues collected historical data, especially barometric pressure readings, from marine platforms and weather stations. Barometric pressure is a measure of the mass of air pressing down on the Earth's surface. Warm air is light and often produces low pressure conditions, while cold air is heavy and often produces high pressure conditions.
"Barometric pressure is particularly important because it provides information about the air mass over a region. In addition, by comparing barometric pressure measurements at two neighboring sites, we can ascertain wind speed and direction, which can be used to infer changes in temperature at ground level and at different heights in the atmosphere," said Compo.
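The physical principle Compo describes is geostrophic balance: away from the surface, the wind blows roughly parallel to lines of constant pressure, at a speed proportional to the pressure difference between neighboring sites. As an illustration (not the team's actual assimilation code), a minimal sketch of that calculation, with function name, station values, and the standard sea-level air density all chosen here for the example:

```python
import math

def geostrophic_wind(p1_pa, p2_pa, distance_m, lat_deg, rho=1.225):
    """Estimate the geostrophic wind speed (m/s) blowing perpendicular
    to the line between two pressure readings a known distance apart.

    rho is air density (kg/m^3); 1.225 is a standard sea-level value.
    """
    omega = 7.2921e-5  # Earth's rotation rate (rad/s)
    f = 2.0 * omega * math.sin(math.radians(lat_deg))  # Coriolis parameter
    dpdx = (p2_pa - p1_pa) / distance_m  # pressure gradient (Pa/m)
    # Geostrophic balance: v = (1 / (rho * f)) * dp/dx
    return dpdx / (rho * f)

# A 4 hPa pressure drop over 300 km at 45 degrees north latitude
v = geostrophic_wind(101600.0, 101200.0, 300e3, 45.0)
print(abs(v))  # roughly 10 m/s
```

The sign of the result indicates direction (in the Northern Hemisphere the wind blows with low pressure to its left); the magnitude is what two barometers alone can reveal about the wind between them.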
The original goal of this study was to determine whether modern data assimilation systems could use the available historical meteorological data to produce accurate daily surface weather maps for the period from the late nineteenth century to the present. "We found that we could use the historical pressure observations to produce accurate weather maps up to the level of the jet stream, even in the late nineteenth century, when there aren't many observations," said Compo.
Thirty international partners have contributed data to the project, providing Compo, Whitaker, Sardeshmukh, and their team with the largest dataset available to produce the most accurate weather reconstructions and simulations possible.
Using the computer model, the team reproduced weather maps for a period spanning 1891 to 2008 at a resolution of approximately 200 kilometers. The team is currently generating maps from 1871 to 1890.
In addition to the maps, the scientists used the general circulation model to forecast weather conditions 27 hours in advance of the current time being reconstructed. "Comparing these simulations to the actual observations helps us know whether our reconstructed weather maps are of value," says Compo.
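The comparison Compo describes is a standard forecast-verification step: score the 27-hour forecasts against the observations they were trying to anticipate. A toy sketch of two common scores, root-mean-square error and correlation, with all names and sample pressure values invented for the example:

```python
import math

def rmse(forecast, observed):
    """Root-mean-square error: typical size of the forecast miss."""
    n = len(forecast)
    return math.sqrt(sum((f - o) ** 2 for f, o in zip(forecast, observed)) / n)

def correlation(forecast, observed):
    """Pearson correlation: values near 1 mean the forecast tracks reality."""
    n = len(forecast)
    mf = sum(forecast) / n
    mo = sum(observed) / n
    cov = sum((f - mf) * (o - mo) for f, o in zip(forecast, observed))
    sf = math.sqrt(sum((f - mf) ** 2 for f in forecast))
    so = math.sqrt(sum((o - mo) ** 2 for o in observed))
    return cov / (sf * so)

# Toy 27-hour surface-pressure forecasts (hPa) vs. station observations
fcst = [1012.0, 1008.5, 1005.2, 1009.8]
obs = [1011.4, 1009.1, 1004.6, 1010.3]
print(rmse(fcst, obs), correlation(fcst, obs))
```

Low error and high correlation against withheld observations are what give the reconstructed maps their credibility; the real project applies this idea over millions of observations rather than four.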
Undertaking the simulations requires an enormous amount of computing time. "To produce the maps, we required more than four million computer hours at the National Energy Research Scientific Computing Center (NERSC) and 1.1 million computer hours on Jaguar, the supercomputer run and maintained at Oak Ridge National Laboratory," said Compo.
The maps may provide scientists a new opportunity to explore recent weather fluctuations. Compo points out four topics he intends to study using the newly produced maps. "I hope to use the maps to better understand storminess over the past century, explore a period of Arctic warming in the 1920s, examine the influence of volcanic eruptions on weather patterns across the globe, and assess the contribution of atmospheric conditions to climate catastrophes such as the Dust Bowl of the 1930s."
The long time range of the reanalysis dataset allows scientists to examine, for the first time, long-time-scale climate processes. Version 1 of the dataset is available at http://www.esrl.noaa.gov/psd/data/gridded/data.20thC_Rean.html
Computer time was provided by SC's Office of Biological and Environmental Research as well as the Department's Innovative and Novel Computational Impact on Theory and Experiment program. Support for the project is provided by the NOAA Climate Program Office.
This article was written by Stacy W. Kish.