Modeling the Earth's Climate

Today, several scientific endeavors are attempting to model the Earth's weather and climate for purposes ranging from farming, urbanization, and emergency preparedness to economic, scientific, political, and humanitarian planning. NASA's Goddard Institute for Space Studies (GISS) has taken a lead and become one of the premier groups modeling the climate in order to better understand it.

One of the main goals of the researchers is to be able to anticipate the effect climate change will have on society and the environment. Although they are involved with several types of models, they are currently focusing most on global climate models (GCMs). These are large-scale models with the ability to simulate the entire Earth and all the forces that affect it, both human-induced and natural. For example, natural forces include volcanic eruptions, variations in insolation (incoming solar radiation), and changes in the Earth's orbital path. Human-induced forces include pollution (increasing greenhouse gases from burning fossil fuels), adding aerosols to the atmosphere, ozone depletion, some types of farming practices, and deforestation.

When scientists create climate models, they strive to ensure that physical phenomena are represented as accurately and consistently as possible and that all components of the cycle respond realistically to changes in the system. This is not an easy task. Because so many variables depend on other variables, a single component that behaves poorly can skew the results or cause the model to fail. Complex GCMs are validated by feeding in the initial conditions of an actual past climate response and checking that the model reproduces the known outcome. In other words, a model is a success when it can accurately simulate changes that have already occurred.

Scientists desire to build models that can look back millions of years, which is why proxy data are so important. By using proxy data that portray ancient climate accurately, a model can be written and tested until the correct outcome is achieved. Once this happens and the model is validated, current data can be input to make projections of future climate. GISS researchers have already simulated many of the Earth's past climate periods to validate their models and to better understand the Earth's climate history. When they model global climate events, such as ice ages, they gain a clearer insight into the worldwide effects of global warming today.

Climate scientists at GISS, such as Gavin Schmidt, James Hansen, Allegra LeGrande, Drew Shindell, Nadine Unger, Leonard Druyan, and Matthew Fulakeza, have been highly successful in developing mathematical models that illustrate how changes on the Earth's surface and in the atmosphere are affecting the climate today. Because the climate is such a complex system, the models are highly sophisticated and complex, with countless variables that have to be taken into account. When even one variable undergoes the slightest change, it can affect many other variables in a domino effect, all of which must be taken into account and provided for through representative equations and algorithms. As an illustration, consider these four simple changes.

1. An area experiences less cloud cover over a given period and receives more direct sunlight.

2. If the area is normally covered with snow, the snow could melt.

3. The albedo of the region could drop from higher to lower values because of the snowmelt.

4. The temperature of the area would increase.

Just these changes for one area would have to be accounted for in the model, and this does not even address the other issues that would apply at that same site, such as humidity, slope, aspect, soil type, elevation, latitude, continental or coastal location, and other important aspects of that particular area.
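The four-step cascade above can be sketched as a toy feedback loop. This is not a real climate model; every coefficient below is invented purely for illustration:

```python
# Toy sketch of the four-step feedback above, NOT a real climate model:
# 1. less cloud cover lets in extra sunlight (the 1.2 factor),
# 2. snow melts once the temperature rises above freezing,
# 3. melting lowers the regional albedo,
# 4. lower albedo raises the temperature, which melts more snow.
# All coefficients are invented for illustration only.
temp, snow = -1.0, 1.0      # degrees C, fraction of area snow-covered
for _ in range(12):
    albedo = 0.8 * snow + 0.15 * (1 - snow)  # snow reflects ~80%, bare ground ~15%
    absorbed = 1.2 * (1.0 - albedo)          # extra direct sunlight from less cloud
    temp += 3.0 * absorbed - 0.5             # crude heating-minus-cooling balance
    if temp > 0.0:                           # above freezing: snow melts
        snow = max(0.0, snow - 0.1)
```

Each pass through the loop feeds into the next: once the temperature crosses 0°C, the shrinking snow fraction lowers the albedo, which accelerates the warming, which is exactly the kind of cascade a model must capture through representative equations.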

Fortunately, within the last decade, computing power has increased tremendously. More than three decades ago, when NASA successfully put astronauts on the Moon, the computer that was used filled an entire room; some of today's desktop computers are more powerful. Even so, although computing power has grown by a factor of a million in the past 30 years, the mathematical models and computing resources necessary to run a GCM remain enormous even by today's standards. Modeling systems must be able to handle a multitude of simulations simultaneously on both global and regional scales, taking into account characteristics on land, in water, and in the atmosphere. They must also be able to handle various timescales (years, decades, centuries, millennia, and so forth) in order to generate reliable scenarios of climate change. Supercomputers must also have significant storage capacity.

According to scientists at NASA, the most sophisticated models developed represent the Earth as a three-dimensional grid, with the atmosphere split into 10 different grid layers. Each one of these grid layers in itself is an enormous dataset containing 65,000 reference points. When the model is run, each of the 65,000 points has a data value associated with it that is used in the model. To make it even more difficult, each point has more than one data variable assigned to it, such as a value for CO2, temperature, aerosol, pollution, albedo, and so on.
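A rough sketch of how such a gridded state might be laid out in code is below. The 1-degree resolution is an inference, not from the article: a 360-by-180 latitude-longitude grid gives 64,800 cells, which matches the roughly 65,000-point figure; the variable names are illustrative:

```python
# Hypothetical layout for the gridded state described above: 10
# atmospheric layers, ~65,000 points per layer, several variables per
# point. A 1-degree latitude-longitude grid (360 x 180 = 64,800 cells)
# is assumed here because it matches the ~65,000-point figure.
LAYERS, LAT, LON = 10, 180, 360
VARIABLES = ("co2", "temperature", "aerosol", "albedo")

# One nested list per variable, indexed [layer][lat][lon].
state = {
    var: [[[0.0] * LON for _ in range(LAT)] for _ in range(LAYERS)]
    for var in VARIABLES
}

points_per_layer = LAT * LON                               # 64,800
values_total = LAYERS * points_per_layer * len(VARIABLES)  # 2,592,000
```

Even this stripped-down sketch already holds about 2.6 million values, which hints at why a full model with many more layers, variables, and time steps demands supercomputer resources.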

The power of the model lies in its ability to predict how the entire system will respond as values vary. For instance, if scientists want to see what a doubling of CO2 will do, the model doubles CO2 at all 65,000 points and predicts an outcome in relation to all the other variables. According to scientists at NASA, climate models are so computationally intensive that they have to be run on supercomputers that can handle more than 80 million calculations per hour. Simply to run a single model, the supercomputers must solve billions of calculations.
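Taking the figures quoted above at face value, the implied runtime is easy to estimate. The 10-billion total below is an arbitrary illustrative choice for "billions of calculations":

```python
calcs_per_hour = 80_000_000    # throughput quoted in the text
total_calcs = 10_000_000_000   # "billions": 10 billion chosen arbitrarily
hours = total_calcs / calcs_per_hour   # 125 hours
days = hours / 24                      # just over 5 days
```

At that rate, a single 10-billion-calculation run would occupy the machine for about five days, which illustrates why model runs are carefully planned rather than casually repeated.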

Even though the scientists at NASA have made great strides in modeling the climate, they still have not answered all the questions they would like to. According to Gavin Schmidt, a climate modeler at GISS, one thing they have not been able to model is abrupt climate change. He hopes that by combining paleoclimate and satellite image data NASA will eventually be able to build a model that will reveal aspects of the Earth's climate that have not yet been discovered and that will shed light on current climate mysteries. While he is interested in modeling paleoclimate, he says that "a further way to combine the models and the data is to 'forward model' the signal that would be recorded in the sediments or corals given a modeled climatic event."

In order to validate their models, NASA scientists put one to the test using the 1991 eruption of Mount Pinatubo in the Philippines. One of the most violent eruptions of the 20th century, Mount Pinatubo ejected ash 21 miles (34 km) into the atmosphere. Scientists calculated that it also sent 17 million tons (15 million tonnes) of sulfur dioxide into the stratosphere. Because the stratosphere (5-31 miles, or 8-50 km) lies above the layer of the atmosphere in which Earth's weather takes place, global atmospheric circulation spread the resulting sulfate aerosols (small reflective particles) around the planet, encircling it entirely within three weeks. Because rain could not wash them away, the aerosols remained in the stratosphere for more than a year, effectively shielding the Earth from part of the Sun's energy. This caused a measurable decrease in global temperature of 0.9°F (0.5°C). In addition, atmospheric water vapor decreased during this time. The end result was that the Earth measurably cooled for several years.

Because this eruption occurred at a time when satellite technology was available to observe and record it, climate modelers collected extensive amounts of data, such as sulfate measurements. Scientists hoped that feeding the recorded amounts into the models they had developed would yield results matching those collected in the field, which would demonstrate that the models were successful.

They found that the models generally predicted the results very well. The only problem was that they were not able to predict that Eurasia would warm slightly in the winter, as it actually did after the eruption. One explanation was that most global climate models do not usually deal with the stratosphere because it does not directly affect the weather. When the data were run on models that specifically included the stratosphere, however, they did achieve reliable results.

Scientists determined that Eurasia experienced a winter warming because the sulfate cloud from Pinatubo had affected the North Atlantic oscillation (NAO), the pressure system that largely determines how severe each winter in Europe will be. A positive NAO makes Eurasia warm, and a negative NAO causes a cold winter. They determined that the eruption triggered a positive NAO, which in turn made Eurasia warmer. To validate this, they ran similar related data reflecting conditions during the Maunder Minimum, which lasted from about 1645 to 1715. By adjusting the variables representing the effect of the Sun's ultraviolet rays in the stratosphere and ozone production levels to match the conditions of that time, they showed that the NAO shifted to a negative phase and made Eurasia colder, which is what actually happened.

Therefore, discovering the connection between the stratosphere and the NAO answered several complicated questions about how various climate factors interact with one another. Hence, paleoclimatology can be very useful by providing information, validating and refining models, seeing into the past, understanding better the ways that complicated climate systems work, and predicting what the future may bring both in terms of natural phenomena and human-induced effects.

Over the past 15 years, government researchers, private organizations, and academic institutions, in attempting to predict future climate change, have developed several global climate models. Each of these highly sophisticated 3-D models must have grids of cells programmed to solve for mass, momentum, and energy through timed sequences so that the climate system can be observed as it is modeled. A model is validated against observations and can also be run backward to see how well its output for a particular location matches the known behavior of that site. Because there are so many input and output variables involved, they have to be continuously checked against experimental data to verify the results. Because of the importance of global warming, a common application of climate models today involves the effects of changing amounts of CO2 in the atmosphere. Models have been run, for instance, to illustrate the potential effects of a doubling of atmospheric CO2, as is expected sometime within the next century. If this were to happen, the results predict that the Earth would become much hotter and more humid and that sea level would increase 20 feet (6 m) over the next 100 years. The same models predict a temperature increase of 7.2°F (4°C) over the same 100-year interval.
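For context, a widely used back-of-the-envelope estimate (not taken from this article, and far simpler than a GCM) relates a CO2 doubling to warming through the standard simplified forcing expression dF = 5.35 ln(C/C0) W/m^2. The 0.8°C-per-W/m^2 sensitivity used below is an assumed, illustrative value:

```python
import math

def co2_forcing(c_ppm, c0_ppm):
    # Widely used simplified expression for CO2 radiative forcing,
    # in watts per square meter: dF = 5.35 * ln(C / C0).
    return 5.35 * math.log(c_ppm / c0_ppm)

forcing = co2_forcing(560, 280)  # doubling from the preindustrial 280 ppm
# Assumed illustrative climate sensitivity of 0.8 degrees C per W/m^2:
warming = 0.8 * forcing          # roughly 3 degrees C
```

A doubling gives a forcing near 3.7 W/m^2 and, with the assumed sensitivity, roughly 3°C of warming, which is broadly consistent with the few-degree projections quoted above; full GCMs arrive at such figures by resolving the underlying physics rather than by applying a single global formula.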
