By John B. Drake

Few scientific topics evoke such general interest and public discussion as climate change. It is a subject that has been highly politicized. New results enter the environmental debate as evidence supporting a position, usually with the qualifiers, background, and perspective needed to understand them stripped away to form a convenient sound bite. The attention is understandable given the importance of climate to agriculture and energy use. Fear of global warming and the greenhouse effect has been used as justification for reducing use of fossil fuels and increasing use of nuclear energy and alternative energy sources. It has even been suggested that avoiding climate change would require a return to preindustrial levels of emissions.

JOHN B. DRAKE, shown here studying a computer simulation of future climate under a global warming scenario, is a mathematician in the Mathematical Sciences Section of ORNL's Computer Science and Mathematics Division. He leads a team that has been developing parallel algorithms for climate modeling and implementing the Community Climate Model-2 of the National Center for Atmospheric Research in Boulder, Colorado, on the Intel Paragon supercomputer at ORNL. ORNL is one of several institutions attempting to predict impacts on climate of greenhouse gases emitted by fossil-fuel combustion and other human activities.

As a result of the Rio Conference of 1992, representatives of many nations of the world agreed on emission caps that would allow economic growth while still curbing undesirable increases in atmospheric carbon dioxide. As ecological awareness has grown, people want to know the predicted impact of human activities, including fossil-fuel combustion and forest burning, on future climate. Thus, climate prediction through computer modeling has been thrust into the limelight of policy debate.

The subject of this article is not the policy implications of greenhouse warming, or even the validity of the premise that global warming caused by the greenhouse effect is occurring. The subject is the current array of concepts and tools available to understand and predict the earth's climate based on mathematical models of physical processes. These tools for climate simulations include some of the world's most powerful computers, including the Intel Paragon XP/S 150 at ORNL. With these tools, we are attempting to predict the climate changes that may occur 100 years from now as the earth's surface warms in response to rising levels of carbon dioxide in the atmosphere.

A parallel version of the Community Climate Model-2 was developed for ORNL's Intel Paragon XP/S 150 supercomputer to provide answers to energy-related climate questions. For example, what are the effects on climate of a century of surface warming driven by burning fossil fuels at a rate high enough to double atmospheric CO2 concentrations? The parallel climate model, as modified at ORNL, can be used to predict climate states resulting from global warming and allow researchers to examine particular quantities such as the precipitable water available in the atmosphere. These fields can be compared with observed data, providing a better understanding of the type and severity of climate impacts. Scientific visualizations of the computed patterns, shown as clouds of varying thicknesses moving over the oceans and continents for each day of a year, have been developed and put on videotape at ORNL.

Facts about Climate

First, consider some of the observational data available about climate. The data help researchers frame the right questions and acquire the best understanding of climate and climate change.

Buried in the layers of Antarctica's ice is a record as revealing of the earth's climate as tree rings are of the environmental conditions during the growth of an oak. As ice accumulated from the yearly snowfall, oxygen-18 and deuterium from the atmosphere were trapped and preserved in their proportions to other atmospheric gases and elements. The isotopic fractions of these tracers are strongly correlated with the annual mean temperature of the earth's surface. By drilling 2083 meters into the ice at Vostok, Antarctica, in 1987, a collaborating group of scientists from France and the former Soviet Union uncovered the first full glacial-interglacial cycle of the earth's temperature.

The data indicate that over the past 250,000 years, as glaciers advanced over the land and then retreated, the earth's temperature oscillated, varying by as much as 11°C. Such data remind us that climate change is a relative concept and that wide variations have occurred in the past. The record shows that the global temperature has at times been about 2°C warmer than it is at present and that a difference of as little as 5°C in the global temperature separates the ice ages from interglacial climates.

The Vostok ice core temperature variation in degrees Celsius as a difference from the modern surface-temperature value (top) and solar insolation as a function of time (bottom).

Although the climate of the past several centuries has been nearly constant, the longer time scales of the ice core data show that natural cycles may play out over thousands of years. A look at the pattern of temperature oscillations over time invites climate projections, much as fluctuations in the stock market invite some speculators to invest. The temperature record is analogous to the heart's signal in an electrocardiogram; in both cases, even the most regular of patterns is punctuated by irregular fluctuations.

One cycle of the earth's surface temperature is related to the change in solar input induced by the earth's orbital precession. The precession is the slow change in the direction of the earth's axis of rotation, analogous to the "wobble" of a spinning top, that repeats over a period of about 20,000 years (20 kyr). The change in temperature induced by orbital precession has approximately the same 20-kyr period.

The data present only a small sample. Only one or two periods of a 100-kyr cycle are apparent, and the current climate is not at a very dependable point for prediction. A longer record might offer the internal consistency required to analyze the structure of the climate time series, but lacking such self-consistent data we must turn to the physical relationships between variables to understand processes that affect climate.

The relationship between the amount of atmospheric carbon dioxide (CO2) and the earth's surface temperature has been much publicized, although the important graph is rarely seen. The former Soviet scientist M. I. Budyko is credited with first graphing the changes in historical temperature against changes in atmospheric CO2 concentrations. For the Vostok record this plot is given in the following figure. The graph suggests a direct, or linear, relationship between the average atmospheric temperature and the amount of CO2 in the atmosphere—that is, temperature rises as CO2 concentrations rise. The atmospheric temperature also rises as atmospheric methane increases. However, increases in the amount of dust in the atmosphere are linked to colder climates.


The Vostok ice core temperature variation in degrees Celsius plotted against the atmospheric CO2 concentration (ppmv).

Modern historical data coincide with the beginning of modern weather forecasting, which started with atmospheric pressure measurements following the invention of the barometer in 1644. Temperature, pressure, wind speed and direction, and precipitation measurements have accumulated from an increasing number of sites. The historical temperature record shows an increase in global average temperature by about 0.5°C over the past 100 years. Atmospheric concentrations of CO2 measured and recorded at the Siple Station in Antarctica over the same period show an increase from 285 parts per million by volume (ppmv) in 1850 to 312 ppmv in 1953. Measurements made at Mauna Loa (a large volcanic mountain in Hawaii) show that atmospheric concentrations of CO2 have increased from 315 ppmv in 1958 to 360 ppmv in 1993—the highest atmospheric CO2 value in the 200-kyr climate record (see asterisk in figure above).

If a strict linear relationship held between the amount of CO2 in the atmosphere and the temperature, we should be experiencing a much warmer earth than exists now. However, because the current data point does not fall nicely within the time series data, solving the problem of predicting our climate requires a more penetrating look at the climate system.
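The mismatch can be made concrete with a little arithmetic. The sketch below uses the CO2 figures quoted above; the slope is an illustrative assumption, roughly the scale of the glacial-interglacial swings in the Vostok record, not a value taken from this article:

```python
# Back-of-the-envelope check of a strictly linear CO2-temperature relation.
# The slope is an illustrative assumption: roughly 10 degC of temperature
# change per 100 ppmv of CO2 change, in the ballpark of the
# glacial-interglacial swings seen in the ice core record.
slope = 10.0 / 100.0                 # degC per ppmv (assumed)

co2_1850 = 285.0                     # ppmv, Siple Station record
co2_1993 = 360.0                     # ppmv, Mauna Loa record
observed_warming = 0.5               # degC over the past century

linear_prediction = slope * (co2_1993 - co2_1850)
# linear_prediction is about 7.5 degC -- more than an order of magnitude
# above the observed 0.5 degC, which is the puzzle the text describes.
```

The point of the sketch is not the particular slope but the order-of-magnitude gap: no plausible straight line through the ice core data reproduces both the glacial-interglacial swings and the modest modern warming.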

Global Dynamics: Backbone of the Climate System

The earth's climate system responds primarily to conditions imposed from outside, such as the influx of solar radiation. Shortwave radiation from the sun is absorbed only slightly as it travels through the atmosphere. Upon striking the earth's surface (land, sea, or ice), some of the radiation is reflected back as shortwave radiation; the rest is absorbed and eventually reradiated as long-wave radiation. This long-wave radiation is absorbed more readily than shortwave radiation by the water vapor in the atmosphere and by a number of other greenhouse gases (mainly carbon dioxide, methane, and chlorofluorocarbons).

A significant amount of energy is also transferred from the earth's surface in the form of a latent heat flux induced by evaporation or by precipitation. The interaction of the land surface with the atmosphere is strongly influenced by the surface's moisture content. Finally, the energy variation over the earth's surface gives rise to winds and ocean currents that transport the energy.

The energy that accumulates in the warm tropical areas north and south of the equator moves toward the earth's frozen poles. This movement defines the main structure of the atmospheric flow. Together with the salinity of the ocean, this poleward flux of energy also drives much of the oceans' circulation. The atmospheric and ocean circulations that control the earth's climate are perhaps the most challenging of fluid flow problems. Even though such a fluid flow problem can be formulated by classical equations, many mathematical and physical questions arise.

John von Neumann, the brilliant mathematician, physicist, and pioneer of the computer age (as well as a classmate and best friend of ORNL's former research director Eugene Wigner), was fascinated by the climate problem. In 1955, while addressing a conference of early weather modelers, he outlined an approach to climate research. "The approach," he said, "is to try first short-range forecasts, then long-range forecasts of those properties of the circulation that can perpetuate themselves over arbitrarily long periods, and only finally to attempt to forecast medium-long time periods. . . ." Von Neumann separates the time scales of weather and climate, stressing the weather's dependence on initial data. However, climate does not depend on particular initial data; rather, it is an inherent behavior of the system that manifests itself only over long time periods, much as the asymptote of a function is approached. He indicates that the hardest problem is in the region between the weather time scale of a few days and the climate scale of years or centuries. This is the region of the "butterfly effect," in which the initial conditions (e.g., the fluttering of a butterfly's wings somewhere in Latin America) are seen to influence the specific progression of atmospheric flow.

That there is something worth calling a climate is taken for granted. That the system has asymptotic states has still not been proved mathematically. Von Neumann understood the basic concepts of nonlinear dynamical systems and chaos theory as applied to the atmospheric circulation long before these became subjects in their own right. The current jargon refers to sets in the space of possible weather states, known as attractors. This understanding is evident from his statement, "One generally believes that the various possible initial states which the atmosphere passes through fall somehow into groups, such that each group leads in the long run to the same statistical average. . ." These attractors do just what their name suggests: they attract nearby climate states to ever closer proximity and, if they were really known, would completely characterize climate. Knowing the attractors would be similar to having a topographic map of possible climate states. We would know the dimensions of the valleys into which the states would settle and also how easily the system might pass to another valley.

Although the general fluid flow equations have been known since the time of Leonhard Euler (1707-83), the mathematical theory of the existence and uniqueness of their solutions is still developing. Edward Lorenz, a meteorologist at the Massachusetts Institute of Technology, brought the study of chaos to maturity with simple dynamical systems derived from atmospheric dynamics. The discovery of analyzable "deterministic chaotic" systems and the development of nonlinear dynamical systems theory have helped expand the notion of what is meant by a solution, and by climate itself, and have provided new tools for approaching the equations. These advances stem not from the application of a new mathematical theory to the climate problem but from an ongoing interaction among climate scientists, physicists, and mathematicians.
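Lorenz's sensitivity to initial conditions can be demonstrated with his famous three-variable system. The sketch below (parameter values are Lorenz's classical choices; the step size, perturbation size, and integration length are illustrative) integrates two trajectories that start almost identically and measures how far apart they end up:

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system."""
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

def integrate(state, n_steps):
    for _ in range(n_steps):
        state = lorenz_step(state)
    return state

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-8, 0.0, 0.0])       # a butterfly-sized perturbation

initial_gap = np.linalg.norm(a - b)       # 1e-8
final_gap = np.linalg.norm(integrate(a, 3000) - integrate(b, 3000))
# After 30 model time units the tiny perturbation has grown by many
# orders of magnitude: the two "weather" trajectories bear no pointwise
# resemblance, even though both remain on the same attractor.
```

This is exactly the separation of time scales von Neumann described: the specific trajectory (the weather) is unpredictable beyond a short horizon, while the attractor both trajectories trace out (the climate) is the same.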

Spatial scales are also important to consider in the climate problem. The fundamental physical processes, "the minutiae of computation" as von Neumann put it, must be incorporated in some fashion for the small scales to influence correctly the large scales. The interactions of components of the climate system form the backbone of the climate system and follow the routes of energy, mass, and momentum through the atmosphere and ocean. It is, after all, the climate system's job to transfer the energy received in the warmer equatorial regions to the cooler polar regions. As with many jobs, there may be more than one way to do it.

Mathematical Models of the Climate

To enable better understanding of the complex climate system, computer programs have been developed to model interactions of climate components. These general circulation models (GCMs) have been used extensively to understand climatic shifts observed in the past and to try to identify possible future responses of the climate system to changing conditions. Can the shifts occur over a short time, such as a decade or century? Will a shift be heralded by phenomena such as an increase in the frequency of El Niños and their surge of warm, western Pacific Ocean water toward South America? What are the different mechanisms of poleward heat transport that might provide the backbone of other climate states? These questions, and many others, indicate the complexity of current climate studies. Simple cause-and-effect arguments are usually not effective explanations in this arena. Complex computer models are practically the only tools available, so they are typically used to justify statements about climate and global dynamics.

For 20 years, climate-modeling researchers have been using some version of the Community Climate Model (CCM) of the National Center for Atmospheric Research (NCAR). CCM1, which was produced in 1987, was operated on large serial supercomputers. Now, many of these researchers are using CCM2—a step forward that has been characterized as moving from some other planet to the earth. This step roughly corresponds with the advent of large, shared-memory, parallel, vector computers such as the Cray Y-MP. Parallel computers allow more detailed modeling of climate. As more detail is included, the balance of physical processes in the models moves closer to the observed state, building confidence that the physics is being captured.

Current atmospheric climate models capture very well the qualitative structure of the global circulation. The transport of energy from the warm equatorial regions to the cold poles and the split of the associated winds into cells are reproduced in simulations both qualitatively and quantitatively. The tropical Hadley cell and the mid-latitude Ferrel cells and jet streams are in good agreement with observations. These are the basic structures of the atmospheric circulation felt on the earth's surface as the doldrums, trade winds, mid-latitude westerlies, and polar highs.

The ability of the models to reproduce the current climate builds confidence in their physical validity. This validation, however, is not license to use the models for future climate predictions. Another important justification for use of the models has been their application to past climatic regimes. The NCAR CCM has been used to simulate climate effects resulting from increases in solar radiation in the northern summer because of changes in the earth's orbit. One of the effects was warmer land temperatures that gave rise to more intense monsoons. Increases or decreases in solar radiation resulting from changes in the earth's orbit are believed to be responsible for conditions that produced climates of past ages. According to Stephen Schneider of NCAR, "The ability of computer models to reproduce regional climatic responses to the changes in solar radiation brought about by variations in the earth's orbit lends a major element of confidence to the reliability of these models as forecasting tools for the future climate resulting from increasing greenhouse gases."

CCM2, the most recent code in a series of climate models developed by NCAR, captures the intricate interactions of the physical processes outlined here. This climate model, which is available to academic and industrial research users, simulates the time-dependent response of the climate system to the daily and seasonal variations of the solar input and of sea surface temperatures. For the past 10 years, these models have formed the basis of a broad range of climate research and scenario testing in support of decision makers who formulate national energy and environmental policies, and they will continue to do so for the foreseeable future.

Parallel Computing Used with Global Circulation Models

Advances in computer technology have been welcomed by climate researchers because long climate simulations can take months of computing time to complete. The latest generation of supercomputers is based on the idea of parallelism. The Intel Paragon XP/S 150 can solve a single complex problem using the combined speed of 2048 processors. This computer differs from other supercomputers in that the memory of each processor is not accessible by the other processors. Such a system is called a distributed memory computer rather than a shared memory computer. This computer design allows for massive parallelism to be applied to problems but complicates formulation of the calculations.

The CCM2 is used almost exclusively on parallel supercomputers. The large computational requirements and the heavy volume of output generated by the model exclude its effective use on workstation-class systems. The heart of the dynamics algorithm in the CCM2 is based on spherical harmonics, the favorite functions of mathematicians and physicists who must represent functions with values on the surface of a sphere. The method transforms data on the sphere into a compact, accurate representation. Data for a 128 × 64 point grid on the earth's surface could be represented with only 882 numbers (coefficients) instead of 8192. The spectral transform has had a very long reign as the method of choice for weather and climate models because of the accuracy of the spherical harmonic representation and the efficiency of the methods used to compute the transform. The transform is a "global" method in the sense that it requires data from the entire globe to compute a single harmonic coefficient. For distributed memory parallel computers, these calculations require communication among all the processors. Because communication is expensive on a parallel computer, many thought that the transform method had seen its day.
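The idea behind the compact spectral representation can be seen in a one-dimensional analogue. The sketch below uses a plain Fourier transform around a latitude circle, standing in for the full spherical harmonic transform, to show that a smooth field sampled at 128 points is captured exactly by a handful of coefficients:

```python
import numpy as np

# A smooth field sampled at 128 points around a latitude circle
# (illustrative: a sum of a few low wavenumbers).
n = 128
lon = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
field = 1.0 + 0.5 * np.cos(lon) + 0.2 * np.sin(3 * lon) + 0.05 * np.cos(7 * lon)

# Forward transform, then truncate: keep only wavenumbers 0 through 10.
coeffs = np.fft.rfft(field)
truncated = coeffs.copy()
truncated[11:] = 0.0

# Inverse transform from the 11 retained coefficients.
reconstructed = np.fft.irfft(truncated, n)

max_error = np.max(np.abs(reconstructed - field))
# max_error sits at machine precision: 11 coefficients reproduce all 128
# grid values because the field is smooth (band-limited).
```

The real method adds a Legendre transform in latitude, but the principle is the same: smooth atmospheric fields need far fewer spectral coefficients than grid points.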

Before ORNL researchers became involved, parallelism in the models was limited to a shared memory paradigm in which only a few—1 to 16—processors were used. Because of the global communication required for the spectral transform, the distributed-memory parallel computers did not look promising. However, further study at ORNL revealed ways to organize the computation, completely changing our view and making it possible to implement the CCM2 on massively parallel computers.
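One way to organize such a computation (a toy sketch of the general pattern, not the actual ORNL algorithm) is to assign each processor a band of latitudes, let each form a partial weighted sum toward a spectral coefficient from purely local data, and then combine the partials with a single global reduction. The sketch below simulates that pattern serially with P pretend processors:

```python
import numpy as np

P = 8                                     # pretend processors
nlat = 64                                 # latitudes, as in a 128 x 64 grid
rng = np.random.default_rng(0)
field = rng.standard_normal(nlat)         # one wavenumber slice of the data
weights = rng.standard_normal(nlat)       # quadrature-style weights (assumed)

# Serial reference: the full latitudinal sum for one spectral coefficient.
serial = float(np.dot(weights, field))

# "Distributed" version: each processor owns a contiguous band of
# latitudes and computes a partial sum from its local data alone ...
bands = np.array_split(np.arange(nlat), P)
partials = [float(np.dot(weights[band], field[band])) for band in bands]

# ... and a global reduction (the communication step) combines them.
distributed = sum(partials)
```

Organized this way, each processor does most of its arithmetic locally, and the expensive all-processor communication is confined to one reduction per coefficient, which is what keeps the transform method competitive at large processor counts.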

Our research identified several parallel algorithms that keep the transform method competitive, even when using large numbers of processors as on the Intel Paragon XP/S 150 at ORNL. This powerful machine has 1024 node boards, each having two computational processors and a communication processor. The full CCM2 climate model was implemented for this parallel computer by a collaboration of researchers from ORNL, Argonne National Laboratory, and NCAR. It is currently being used by ORNL's Computer Science and Mathematics Division as the basis for the development of a coupled ocean-atmosphere climate model under the sponsorship of the Department of Energy's Office of Health and Environmental Research.

With the increase in computing capacity offered by the new generation of parallel computers, many researchers are working to improve the models by coupling the ocean and atmosphere. This exciting advance in the models brings us a step closer to a comprehensive model of the climate system. With this type of integrated model, a number of areas of climate study will open up. First, an improved method will emerge to simulate the earth's carbon cycle. Ocean and land processes (e.g., forests and soils) act as sources of and sinks for carbon in the atmosphere. Second, inclusion of high-resolution, eddy-resolving ocean models with atmospheric models will allow scientists to address previously unapproachable questions of the climate's predictability. The models will exhibit the characteristic modes of interaction between the ocean and the atmosphere; El Niño is but one such mode. Discovering and identifying these modes may hold the key to the question of the climate's predictability.

Our models could be used to predict the overall impact on climate of counteracting atmospheric effects from both manmade and natural emissions—the warming effects of greenhouse gases and the cooling effects of sulfate aerosols. By using the increased computing power of the Intel Paragon, the IBM SP2, or the Cray Research T3D, researchers should advance one step further in understanding the complex interrelations among natural processes, human activities such as fossil fuel combustion, and the climate of our terrestrial home.

