The Greenhouse Effect and Climate Change Revisited
Abstract
On any planet with an atmosphere, the surface is warmed not only by the Sun directly but also
by downward-propagating infrared radiation emitted by the atmosphere. On the Earth, this
phenomenon, known as the greenhouse effect, keeps the mean surface temperature some 33 K
warmer than it would otherwise be and is therefore essential to life.
The radiative processes which are responsible for the greenhouse effect involve mainly
minor atmospheric constituents, the amounts of which can change either naturally or as a
by-product of human activities. The growth due to the latter is definitely tending to force
a general global surface warming, although because of problems in modelling complicated
feedback processes, for example those involving water vapour, ozone, clouds and the oceans,
the precise rates of change and the local patterns which should be expected are not simple to
predict.
This article updates an earlier review which discussed the physical processes involved in
the greenhouse effect and theoretical and experimental work directed towards an understanding
of the effect on the climate of recent and expected changes in atmospheric composition. In the
last ten years, progress in data acquisition and analysis, and in numerical climate modelling, has
tended to confirm earlier predictions of the likelihood of significant rises in the mean surface
temperature of the planet in the next 50–100 years, although this remains controversial.
Contents
1. Introduction
1.1. Plan of the paper
1.2. Climate change and global warming
1.3. Factors controlling climate
1.4. Climate and climate change on the terrestrial planets
2. Atmospheric composition and structure
2.1. Origin of the atmosphere
2.2. Vertical structure
2.3. Atmospheric composition
2.4. Anthropogenic emissions
2.5. Feedback cycles affecting atmospheric composition
2.6. Atmosphere–ocean coupling
2.7. Clouds and aerosols
2.8. The biosphere
3. Models of climate change
3.1. Simple models
3.2. Complex models
4. Latest model predictions
5. Experimental studies
5.1. Ground-based programmes
5.2. Satellite measurements
6. Conclusion
References
1. Introduction
Knowledge of the climate system remains sufficiently incomplete that, at the present time, only modest progress has been made in understanding the
mechanisms underlying natural variability and no physical model exists which can explain from
first principles all of the fluctuations which appear in any kind of climate record. Reasonable
success has now been obtained in accounting approximately for the smoothed changes over
the last hundred years, as we shall see below.
With ‘hindcasting’ as calibration, the overwhelming task facing modellers remains
forecasting possible anthropogenic climate modifications, that is, quasi-permanent changes
driven by human activities. It is generally, although still not universally, recognized that
observed changes are the consequence of present-day levels of atmospheric pollution, and that
further changes are inevitable and likely to lead to serious environmental problems within the
lifetimes of people alive now. Options for avoidance strategies are limited and costly, putting a
high premium on the credibility of any prediction and leading to a massive effort to understand
the geosystem better, with more and better measurements and realistic, physics-based computer
models as two of the top priorities.
Figure 1. The relative contributions of the principal greenhouse gases and other factors to the
radiative forcing of global warming, expressed as the difference between the years 2000 and 1750
(IPCC 2001).
The poorly quantified cooling contributions of aerosols have to be weighed against
the (better understood) warming effects of greenhouse gases like CO2 and CH4 . This is the
global warming controversy in a nutshell.
CO2 is the most important pollutant for greenhouse purposes, not because of its efficiency
(the strong CO2 bands are already saturated, as discussed above) but because there is so much of
it being produced, and because its second-strongest spectral band, the bending fundamental at
15 µm, falls right at the peak of the Planck function for typical terrestrial temperatures. Figure 2
shows how the three principal greenhouse gases have increased over the last 1000 years. CO2
has increased by 25% since the industrial revolution, and continues to rise at a rate of 0.5% p.a.;
methane concentrations have doubled in the same period and are increasing at 1% p.a.
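As a quick illustration of why the 15 µm band matters, the minimal Python sketch below (not part of the original article) evaluates the Planck spectral radiance at two representative terrestrial temperatures; the round values of 255 K (effective) and 288 K (surface) are assumptions chosen for illustration.

```python
import numpy as np

def planck_lambda(wavelength_m, temperature_k):
    """Planck spectral radiance B_lambda(T) in W m^-2 sr^-1 m^-1."""
    h, c, k = 6.626e-34, 2.998e8, 1.381e-23
    x = h * c / (wavelength_m * k * temperature_k)
    return 2.0 * h * c**2 / wavelength_m**5 / np.expm1(x)

for T in (255.0, 288.0):                 # assumed effective and surface temperatures
    lam_peak = 2.898e-3 / T              # Wien displacement law, metres
    ratio = planck_lambda(15e-6, T) / planck_lambda(lam_peak, T)
    print(f"T = {T:.0f} K: Planck peak at {lam_peak*1e6:.1f} um, "
          f"B(15 um)/B(peak) = {ratio:.2f}")
```

In this simple estimate the 15 µm band sits on the broad maximum of the terrestrial emission curve, carrying of order 70–85% of the peak spectral radiance. Incidentally, growth at the quoted 0.5% per annum corresponds, if compounded, to a doubling time of roughly ln 2/ln 1.005 ≈ 140 years.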
The ‘secondary’ greenhouse agents, water vapour and cloud, which have an even larger
effect but which do not depend directly on human activities, are not included in these figures.
The mean level of water vapour is in a complex state of dynamic equilibrium with the liquid
and solid water in the oceans and cryosphere; any increase in mean surface temperature due
to increases in other gases such as CO2 will tend to be amplified by the increased capacity of
the lower atmosphere for water vapour. H2 O is in fact the most important greenhouse gas in
terms of its contribution in degrees to the surface temperature, since it is present in enormous
quantities. However, anthropogenic release of water vapour is not itself a direct contributor to
enhanced greenhouse warming, since the role of water vapour is one of amplifying changes
forced by other gases. Water vapour is the most effective component of the unperturbed
greenhouse, contributing about 65% of the 33 K of ‘natural’ warming which the planet enjoys.
Figure 2. Global concentrations of three greenhouse gases as a function of time over the last
millennium.

The areas of greatest uncertainty in understanding the changing greenhouse effect are
undoubtedly those which involve clouds and aerosols. These have a large effect on the albedo
of the Earth and it has frequently been pointed out by sceptics that a tendency towards a hotter
Earth could be offset by a related tendency towards increased cloud cover, which reduces
the solar heating by reflecting more radiation to space. In fact, cloud–climate feedback is
extremely complex, and changes in the mean cloud cover or thickness can also increase any
warming which produced it by reducing the net thermal emission of the Earth to space. It
all depends on the height and composition of the cloud, and its microphysical properties such
as particle or drop size. Modelling the occurrence and properties of clouds and background
aerosols (i.e. the turbidity due to airborne particles which is present even when clouds are
not) as a function of air temperature, composition and other factors remains one of the biggest
problems in climate forecasting.
Greenhouse warming also operates on our planetary neighbours, Mars and Venus,
and the same greenhouse gases (principally carbon dioxide and water vapour) are responsible.
The existence of these planets offers an important opportunity to see how the greenhouse effect
works in situations other than the one we observe on the Earth.
Venus is a particularly interesting subject, because there an extreme case of the greenhouse
effect raises the surface temperature to around 730 K, which is higher than the melting points
of lead, zinc and tin. This happens in spite of the fact that the net solar input is significantly less
than for the Earth, because the very high albedo of Venus (76% compared with around 30%)
more than offsets its greater proximity to the Sun. Venus is, in fact, an example of how clouds
increase the greenhouse effect. The sulphuric acid of which the Venusian clouds are composed
forms droplets which scatter very conservatively, diffusing a fraction of the incoming sunlight
down to the surface. At the same time, they are also very opaque in the infrared. The cloud
blanket therefore makes a large contribution to the backwarming effect, which outweighs the
albedo effect overall when combined with the contribution of around a million times as much
CO2 as the Earth.
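A minimal sketch of the albedo argument: using the standard equilibrium (effective) temperature T_e = [S(1 − A)/4σ]^1/4 with the albedos quoted above, an assumed solar constant of 1361 W m−2 at the Earth, and Venus taken to orbit at 0.723 AU, Venus absorbs less power per unit area than the Earth despite being closer to the Sun.

```python
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
S_EARTH = 1361.0          # assumed solar constant at 1 AU, W m^-2

def t_equilibrium(solar_constant, albedo):
    """Equilibrium temperature of a planet with no greenhouse effect."""
    return ((solar_constant * (1.0 - albedo)) / (4.0 * SIGMA)) ** 0.25

t_earth = t_equilibrium(S_EARTH, 0.30)               # albedo ~30% (from the text)
t_venus = t_equilibrium(S_EARTH / 0.723**2, 0.76)    # albedo 76%, orbit at 0.723 AU (assumed)

print(f"Earth: T_e ~ {t_earth:.0f} K vs ~288 K observed -> ~33 K greenhouse warming")
print(f"Venus: T_e ~ {t_venus:.0f} K vs ~730 K observed -> ~500 K greenhouse warming")
```

The roughly 500 K gap between the Venusian equilibrium and surface temperatures is the measure of its massive greenhouse, compared with the 33 K which the Earth enjoys.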
The atmosphere of Mars is also warmed by airborne particles, in this case windblown dust
from the surface. The great size of the effect—around 30 ◦ C near the surface, and nearly 100 ◦ C
at the tropopause—emphasizes the potential scale of the aerosol problem on the Earth, and
the importance of dust-producing processes like volcanic eruptions and nuclear explosions,
particularly if these occur on a global scale.
Mars also possesses the solar system’s most dramatic evidence of past climate change.
Today’s frozen rocks and desert bear unmistakable features of rivers, lakes and seas. The origin
of these is highly controversial, but one leading theory is that Mars may in the past have had a
much thicker atmosphere. This would still have consisted primarily of CO2 , but being warmer
would have held much more water vapour, and possibly contained other constituents including
clouds. Together these might have produced a greenhouse effect large enough to raise the
temperature and pressure to values which could support liquid water on the surface. If so, how
could Mars have changed so much since then? The answer may lie in the high eccentricity of
its orbit and the large fluctuations in sunfall which result from resonances between this and
other orbital parameters. Alternatively, the heating could have been primarily due to volcanic
activity which has since subsided. In this case, water vapour, CO2 and other gases and particles
from the interior of Mars could have warmed the planet by greenhouse action and produced
lakes and rivers for as long as the volcanoes were sufficiently active. Yet another possibility
is that Mars lost enough of its early atmosphere in a collision with a large asteroid to cause
temperatures to fall below the freezing point of water. The resulting loss of water vapour
and clouds from the remaining atmosphere would further reduce the surface temperature and
eventually lead to the situation we find today, where the water not lost in the collision mostly
lies frozen beneath the surface and in the polar caps.
The climates of Mars and Venus clearly deserve much closer study. The general message
that Earth-like planets apparently can be subject to extremely large and variable greenhouse
warmings is an important and often under-rated one.
2. Atmospheric composition and structure

In this section, we recap briefly some of the fundamental properties of the major components
of the climate system, beginning with the origin and basic structure of the atmosphere. In
considering the crucial topic of atmospheric composition, it is necessary to take account of
a number of complex processes and cycles, including photochemistry, which modifies the
composition of the atmosphere in important ways, including the production of ozone; the role
of the biosphere, including the anthropogenic production of greenhouse gases; condensation
processes involving water vapour and clouds; and the interaction of the oceans with the
atmosphere. All of these involve feedback loops, are difficult to isolate in observations and
indeed are often coupled together, all of which means that even when they are understood
qualitatively, they may be difficult to incorporate accurately into models.
Figure 3. The mean temperature structure of the atmosphere, showing the names given to various
regions.
The temperature rises through the stratosphere to a local maximum at the stratopause. Above
the stratopause, the temperature declines again, reaching an absolute
minimum of about 190 K at the mesopause near 85 km, where the pressure is only a few
millionths of the standard surface pressure.
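The quoted pressure at the mesopause follows from the barometric law p(z) ≈ p0 exp(−z/H); the short sketch below uses an assumed mean scale height of about 7 km (a typical value, not a figure from the text) and reproduces the 'few millionths' estimate.

```python
import numpy as np

p0 = 1013.25      # standard surface pressure, hPa
H = 7.0           # assumed mean scale height, km (illustrative)
z = 85.0          # mesopause altitude, km (from the text)

p_mesopause = p0 * np.exp(-z / H)
print(f"p(85 km) ~ {p_mesopause:.1e} hPa, "
      f"i.e. {p_mesopause / p0:.1e} of the surface pressure")
```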
Table 1. Composition of the atmosphere (fraction by volume, %).

Gas     Abundance (% by volume)
N2      78.08
O2      20.95
CO2     0.033
Ar      0.934
Ne      1.82 × 10−3
He      5.24 × 10−4
Kr      1.14 × 10−4
Xe      8.7 × 10−6
H2      5.0 × 10−5
CH4     2.0 × 10−4
N2O     5.0 × 10−5
Ozone is destroyed by processes both natural
and anthropogenic, the latter mainly due to catalytic cycles involving reactive nitrogen (NOx),
halogen, especially chlorine (ClOx), and hydroxyl (HOx) compounds, in which small amounts
of pollutant can destroy large amounts of ozone (Brasseur et al 1999).
The oceans hold a much larger reservoir of carbon dioxide than the atmosphere. Heat is transported globally by ocean currents (driven by
winds) and released, with major effects on regional climates. The general circulations of the
atmosphere and oceans are so intimately interconnected that most leading efforts to construct
sophisticated climate models include the dynamics of both systems, and their coupling, in spite
of the extra strain such a step imposes on computing resources.
Figure 4. The global carbon cycle, including estimates of the amounts of carbon fluxes (in Gt/year)
between elements of the climate system (Schimel et al 1995).
Of particular interest are the processes which control the carbon dioxide abundance in the atmosphere, and the role of humans in producing CO2 and the
other greenhouse gases as a by-product of their burgeoning industrial activities. The latter
is, of course, a matter of great popular and political concern at the present time and a major
motivation for research into the relevant mechanisms and their likely consequences.
The global carbon cycle, illustrated schematically in figure 4, includes feedback processes,
some tending to accelerate and others to slow the buildup of atmospheric CO2 . Despite the
publicity which they receive, the principal man-made fluxes of CO2 , which are those due to
fossil fuel burning and deforestation, are small compared with the natural exchange between
plants, the soil and the oceans. Thus, it is necessary to be concerned not only about man-made
production of CO2 itself, but also any other pollution which may affect the natural equilibrium,
for example by reducing the population of biota in the oceans. On the other hand, the take-up
of CO2 by the oceans may increase if warmer water leads to an increase in aquatic plant life,
in spite of a reduction in the solubility of the gas at higher temperatures. More vigorous plant
growth on land, as a result of higher atmospheric CO2 concentrations, may also tend to slow
down the buildup of the gas in the atmosphere.
3. Models of climate change

Studies of greenhouse change, like virtually all problems involving complex coupled systems
and a multitude of different processes, and certainly all of those for which forecasting the
future is a goal, rely on numerical models. Simple versions provide useful insight into the
physics involved, while complex parametrizations are necessary to obtain the greatest possible
accuracy and to attempt to incorporate all of the feedback process. As already noted, it is still
not possible to produce full and accurate representations of the past and present climate from
first principles, and there is still no complete consensus about which aspects of the apparently
chaotic climate system predictions are possible on any particular timescale in the future.
Nevertheless, the current state of the art has advanced to the point where some fairly
general global forecasts can be made with sufficient precision to at least begin to address the
basic requirements of policy makers. For example, from model studies it can be said with a
high probability that mean temperatures will rise, and stratospheric ozone amounts decline, by
amounts of a few degrees and several per cent respectively in the next few decades. Models
are capable in principle of providing regional detail as well, and some already do so, but the
uncertainties are much larger. In this section, we look at the basics of climate models before
considering some of the most recent results.
3.1. Simple models

Figure 5. Simple climate models: (a) unperturbed; (b) increasing the albedo A of the Earth cools
the stratosphere and the surface while the composition, tropopause height and the tropospheric
lapse rate are unchanged; (c) increasing the abundance of CO2 while keeping the albedo and the
tropospheric lapse rate constant raises the tropopause height and warms the surface.
mass m (see, for example, Goody and Yung 1989). The path of absorber in the isothermal
column above the tropopause at pressure p_t is H a p_t, where a is the mixing ratio by volume
of the absorbing gas (assumed constant with height, a reasonable approximation for CO2 ,
which is overwhelmingly the most important infrared absorber in the stratosphere) and H is
the atmospheric scale height (again constant with height, a fair approximation in the lower
stratosphere). The optical depth, and hence the equivalent grey absorption coefficient, is
therefore a function of the product of the pressure and the square root of the mixing ratio, so that
τ = f(p_trop √a) = 1.        (9)
We do not need the form of the function f for present purposes, but see Goody and Yung
(1989) for a discussion.
An albedo of A = 0.3, a tropopause height of 11 km and a pressure p_trop of 227 mb
are considered to be representative means of the present-day atmosphere. Now consider the
effect of changing the absorber amount, including the case where the amounts are doubled, a
standard case for climate modellers (the resulting change in T0 is sometimes called the climate
sensitivity) and a situation which may occur on Earth (for CO2 at least) during the next 50 years
or so. Then, according to (9), p_trop will move to 1/√2 of its present value. Physically, what is
happening is an increase in the depth of the convective layer to offset the greater opacity of the
atmosphere, so that heat is still brought up to the unit optical depth level where it can radiate
to space. (Note that the model does not depend on the relationship τ = 1 at the tropopause;
any constant value of τ would have the same effect. However, it can be shown from simple
radiative transfer theory that most of the heat reaches space from levels near τ = 1.)
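A brief numerical illustration of the tropopause shift implied by equation (9): taking the representative tropopause pressure of 227 mb given above and an assumed lower-stratospheric scale height of about 6.5 km (the scale height is not a figure from the text), doubling the mixing ratio a moves the tropopause to lower pressure and hence greater altitude.

```python
import numpy as np

p_trop = 227.0        # present tropopause pressure, mb (from the text)
H = 6.5               # assumed lower-stratospheric scale height, km
factor = 2.0          # doubling of the absorber mixing ratio a

# Equation (9): tau = f(p_trop * sqrt(a)) = 1, so p_trop scales as 1/sqrt(a).
p_trop_new = p_trop / np.sqrt(factor)
dz = H * np.log(p_trop / p_trop_new)     # rise in tropopause height, km

print(f"New tropopause pressure ~ {p_trop_new:.0f} mb "
      f"(tropopause rises by ~{dz:.1f} km)")
```

Multiplying a rise of this size by a fixed tropospheric lapse rate of a few degrees per kilometre gives a surface warming of order 10 K, the same order of magnitude as the figure quoted in the next paragraph; the exact value depends on the parameters chosen.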
This simple model predicts a surface temperature rise of about 9 ◦ C for the extreme case of
doubling the greenhouse gases in the atmosphere. An increase of CO2 from 330 to 350 ppmv,
roughly corresponding to the observed increase between 1975 and 1990, produces a surface
temperature rise of around 0.75 ◦ C, perhaps twice that observed (figure 6). This difference
can be explained by the lag introduced by the thermal inertia of the oceans, i.e. the time for
the climate system to reach equilibrium after a change in the ‘forcing’, which is orders of
magnitude longer than 15 years.
Figure 6. Observed global temperatures from 1880 to 2000 (US National Climatic Data Center).

Despite the gratifying success of this simple calculation, it is of course necessary to
consider more accurate models that include feedback processes. The two most important are the
increased humidity of the air which would follow an increase in mean surface temperature, and
changes, probably increases, in the amounts of various types of cloud. Although water vapour is
the single most important absorber of those which produce the greenhouse effect, in the simple
model, the effect of increasing water vapour is negligible provided the increase is confined to
the troposphere. The physical reason is that the lower atmosphere is already optically thick,
and the vertical transfer of energy is dominated not by radiation but by convection. This is
represented by the fixed lapse rate in the model.
A second feedback process, this time probably tending to oppose any increase in the
surface temperature, is the increase in cloudiness which may result from increased humidity
and convection in the troposphere. Clouds are generally more reflective than the surface
they conceal, so increasing cloud cover tends to increase the albedo of the planet, which was
one of the parameters used to formulate the simple model. In the simple model, the surface
temperature is reduced by about 0.1 °C for each 1% increase in the albedo of the Earth.
Such a change could easily pass undetected, since albedo depends on quantities such as
cloud height, particle size, geographical location and so forth. Recently, satellite experiments
have been developed which measure the albedo and total thermal emission of the Earth from
space, as will be discussed below (section 5.2). Even these cannot detect changes in net
albedo without collecting many years of data, and the problem of separating trends from slow
fluctuations has to be surmounted.
In summary, very simple models give some insight into the principal processes at work,
and allow us to go through the motions of accounting for recently observed small changes
and of predicting major changes in the future. Obviously, the detailed values obtained
from such a simple and in some ways arbitrary model are not very reliable, but they do
emphasize the important principle that very basic physics implies non-trivial changes in mean
surface temperature and/or global cloud cover in response to changes in the atmospheric CO2
concentration. In the next section we shall consider the extent to which this conclusion can be
refined using sophisticated climate models.
3.2. Complex models

It must be remembered that climate change is a regional affair and that changes in, for example, the mean surface
temperature of the Earth will occur as a result of large increases in some regions with smaller,
or even negative, increases in others. There will also be changes in the patterns of variability,
not only of temperature but also of important quantities such as rainfall which meteorologists
wish to understand and predict. Models with good spatial and temporal resolution, and
comprehensive formulation or parametrization of radiative, dynamical and even chemical
processes are required to address these goals.
It is beyond the scope of this review to describe the workings of complex climate models in
detail. All of them are based on general circulation models of one kind or another, increasingly
atmospheric and oceanic GCMs coupled together. The basic building blocks of any GCM
are the so-called primitive equations (e.g. Gill 1982), which express momentum balance,
hydrostatic balance, conservation of mass and conservation of energy. To these must be added
schemes for radiative transfer, cloud formation, surface interactions and all the processes
described in the preceding sections. These frequently involve the simplification, for example
by linearization, of complex processes, and constantly demand improvement. The resulting
models all give a heavily damped, essentially linear response to changing boundary conditions,
which is fairly unrealistic especially for long-term forecasts, and explains why models cannot
at present reproduce observed interannual variability.
4. Latest model predictions

Predictions obtained using existing climate models have been reviewed by the
Intergovernmental Panel on Climate Change of the World Meteorological Organization (IPCC
2001). This survey involved the work of nearly 1000 scientists and considered the results
from a number of quasi-independent models, and a wide range of ‘scenarios’ for the future
emission of greenhouse gases. It therefore represents the most authoritative set of conclusions
we have for a synthesis of all the available predictions and an assessment of the uncertainty.
The IPCC notes first that the global average temperature has increased over the 20th century
by 0.6 ± 0.2 ◦ C. Model studies suggest that this is mostly of anthropogenic origin, and in
particular that it tracks the increased emission of greenhouse gases and aerosols due to human
activities (figure 7). These and other tests, plus better computers and a more physical basis
have strengthened confidence in the ability of models to predict future climate.
Among other types of evidence, tide gauge data shows that the global average sea level
rose by 10–20 cm during the last century, while satellite and other observations show a decrease
in snow and ice cover as well. On the other hand, some important climate indicators appear
not to have changed, at least in recent decades when good measurements have been available,
most notably the mean temperatures in some parts of the southern hemispheric oceans and
Antarctica.
The range of predictions for the mean global surface temperature corresponds to an
increase of 1.4–5.8 °C in the next century (figure 8). An increase larger than that of the last century and
without precedent in the last 10 000 years, according to palaeoclimate data, is considered very
likely. Sea level is expected to rise by 10–80 cm, due to the thermal expansion of the oceans
and, to a lesser extent, melting of glaciers and the polar icecaps. (The time constant for polar
melting is so long that its effect is unlikely to be significant for 100 years or more.) Models
indicate that extreme weather, deforestation and desertification could also be consequences
of global warming although the uncertainty of specific predictions with this level of detail
remains quite high.
Figure 7. Model simulations of annual global mean surface temperatures (IPCC 2001).
5. Experimental studies
Progress with understanding the greenhouse effect, and planning to ameliorate the social
and environmental effects of climate change, depends critically on our ability to design and
carry out programmes of measurements. Theory and modelling alone cannot improve our
understanding of the details of the physics on which models rely, and only good data can
test and improve specific models and parametrizations. There are few direct experimental
approaches to observing the greenhouse effect, however. We must probe the physical processes
in the atmosphere and oceans at the same time as we monitor the climate system for variations
and secular change. In this section we consider a representative selection of the ways in which
this is being done.
Figure 8. Global mean temperature and sea level changes in the next century, as predicted by
the Intergovernmental Panel on Climate Change, based on all of the available model calculations.
The upper three frames show the emission scenarios for carbon dioxide and sulfur dioxide as
representatives of the assumptions on which the predictions are based (IPCC 2001).
5.1. Ground-based programmes

The most basic measurement of all is that of the surface temperature itself, made at many sites and combined into a global
average. However, even today, this is very non-trivial, since the distribution of measurements
is non-uniform in space and time and aliasing of seasonal and non-secular changes can easily
occur. When long runs of data are required to search for trends, problems of changing
measurement techniques and unreliable intercalibration are encountered. Measurements made
within the urban 'heat island' are generally higher than those made at airports etc, and the last
50 years has seen the gradual move of met stations from one to the other.
5.1.1. Network for the detection of stratospheric change. The last decade has seen the
organization and improvement of measurement systems on a vast scale. As an example,
Taylor (1991) described the setting up of the Network for the Detection of Stratospheric Change
(NDSC), which commenced operation in January 1991. It consists of about 20 primary stations
and a larger number of complementary stations covering the globe. These address the rate of
build-up of greenhouse gases in the atmosphere as a whole by making upper-air measurements,
since those at the surface are confused by local effects such as the concentration of pollutants
near population centres, rainout and so forth. The stratosphere contains the main source for
some greenhouse gases (e.g. ozone), and the main sink for others (e.g. methane). Predicting the
rate of global change depends on understanding processes there, and furthermore most climate
models predict cooling of the stratosphere in response to an increase in the CO2 mixing ratio,
which is greater than the corresponding heating at the surface and therefore easier to detect (as
it has been for about the last 20 years).
The measurements made by the network vary with site but the following are brief
descriptions of some of the most important.
20 F W Taylor
The differential absorption lidar technique uses light at two or more different wavelengths,
one of which is much more strongly absorbed by the species of interest than the other. For the
example of ozone, wavelengths of 0.353 and 0.308 µm are typical. The returned signals are
small but, given cloud-free skies and a long integration time (up to several hours), night-time
profiles from 5 to 50 km with a vertical resolution of 200 m and absolute errors of less than 10%
can be achieved.
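The retrieval underlying the differential absorption technique can be sketched as follows: if the two wavelengths differ mainly in their ozone absorption cross-sections, the ratio of the two return signals depends on the ozone column between the lidar and the scattering altitude, and its vertical derivative gives the local number density. The cross-section difference and the synthetic ozone layer used below are placeholder values for illustration, not NDSC figures.

```python
import numpy as np

def dial_number_density(z_m, p_on, p_off, delta_sigma):
    """Standard DIAL relation: n(z) = (1/(2*delta_sigma)) d/dz ln(P_off / P_on).

    z_m: altitudes (m); p_on/p_off: backscatter signals at the strongly and
    weakly absorbed wavelengths; delta_sigma: cross-section difference (m^2).
    """
    return np.gradient(np.log(p_off / p_on), z_m) / (2.0 * delta_sigma)

# Synthetic demonstration with an assumed Gaussian ozone layer near 22 km.
z = np.linspace(5e3, 50e3, 451)                          # 100 m grid, 5-50 km
n_true = 5e18 * np.exp(-((z - 22e3) / 5e3) ** 2)         # molecules m^-3 (illustrative)
delta_sigma = 1e-23                                      # m^2 (placeholder value)
column = np.cumsum(n_true) * (z[1] - z[0])               # ozone column below each level
p_off = 1.0 / z**2                                       # range-squared falloff only
p_on = p_off * np.exp(-2.0 * delta_sigma * column)       # extra two-way ozone absorption

n_retrieved = dial_number_density(z, p_on, p_off, delta_sigma)
print(f"peak retrieved / true density: {n_retrieved.max() / n_true.max():.3f}")
```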
Temperature measurements by lidar rely on Rayleigh scattering from air molecules,
typically at the wavelength of Nd lasers (0.532 µm). The signal strength is proportional to the
air density and hence can be related to the temperature. The technique is valuable from about
30–80 km altitude, yielding temperature to an accuracy of better than 2 K with 200 m vertical
resolution. Below 30 km, backscatter from atmospheric aerosols confuses the temperature
signal and below about 26 km this contribution is so dominant that profiles of the aerosols
themselves can be derived. Water vapour lidars observing the Raman backscatter from water
vapour at 408 nm obtain vertical profiles from near the surface to the lower stratosphere.
Infrared spectroscopy is the most versatile tool of the experimental atmospheric scientist.
All of the key greenhouse gases, their source and sink species and their reaction partners
have vibration–rotation bands in the 2–15 µm spectral region, all of which is accessible
simultaneously to a Fourier-type spectrometer such as a Michelson interferometer. The
instruments used by NDSC achieve spectral resolutions in the region of 0.002 cm−1 , which
provide limited vertical profile information and will separate the stratospheric from the (often
much larger) tropospheric contribution to the spectral lines of interest. The Sun is used as the
source, and is observed over a range of solar zenith angles. The range of air masses thus obtained
makes it easier to observe weakly and more strongly absorbing species with the same set-up.
Ultraviolet and visible spectrometers, measuring in absorption using the Sun or the Moon
as a source, can obtain column abundance measurements which complement or are superior
to infrared data for certain species, including ozone, nitrogen dioxide, nitrogen trioxide, and
chlorine and bromine oxides. This is an update of the classic Dobson technique, first developed
for stratospheric ozone monitoring in the 1920s (Dobson 1930). Grating instruments with a
spectral resolution on the order of 1 nm and integration times of 20 s–20 min are used, and
precisions of around 1% (for O3 and NO2 ) and around 10% for the other species, are claimed.
Thermal emission from optically thin stratospheric lines is monitored
using microwave receivers operating at frequencies around 22.2 GHz (for water vapour) and
from 110 GHz (for ozone) to 279 GHz (for chlorine oxide). High spectral resolution defines the
shapes of the pressure-broadened lines and allows vertical profiles to be retrieved, but with
much less vertical resolution than is obtainable with active techniques such as lidar. Profiles
with a resolution of around one scale height (6–8 km) are obtained for around 30 min integration
times for water vapour (20–85 km), ozone (20–75 km) and chlorine oxide (25–45 km).
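The reason pressure-broadened line shapes carry height information can be illustrated with a simple Lorentzian width estimate: the collisional half-width scales essentially linearly with pressure, so emission from different altitudes contributes at different distances from line centre. The broadening coefficient below (about 2 MHz per hPa, typical of millimetre-wave lines) and the scale height are assumed values for illustration.

```python
import numpy as np

GAMMA0 = 2.0      # assumed pressure-broadening coefficient, MHz per hPa
H = 7.0           # assumed scale height, km
P0 = 1013.25      # standard surface pressure, hPa

for z in (20.0, 40.0, 60.0):                  # altitudes in km
    p = P0 * np.exp(-z / H)                   # barometric pressure estimate
    half_width = GAMMA0 * p                   # Lorentzian half-width, MHz
    print(f"z = {z:4.0f} km: p ~ {p:8.2f} hPa, half-width ~ {half_width:8.2f} MHz")
```

Once the collisional width falls to the level of the Doppler width (of order 0.1 MHz or less for these lines), the height information is lost, which is broadly why the retrieved profiles quoted above do not extend much higher.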
5.1.2. Cloud studies using aircraft. The key problem involving clouds is to relate their for-
mation processes, which can be parametrized in models, to their radiative effects. Much of
the current work is being done using instrumented aircraft, which sample the cloud directly in
order to measure its physical thickness, water content, temperature and pressure characteris-
tics. At nearly the same time, radiometers on the aircraft can obtain the upward and downward
fluxes of radiation above and below the cloud, in bandpasses corresponding to the solar and
planetary fluxes.
Going beyond basic cloud characteristics to an understanding of the physical processes—
principally scattering, emission and absorption—which occur in clouds, requires more sophis-
ticated measurements. It is possible, for instance, to measure the size (and to some extent the
shape if the particles are ice) of cloud droplets using optical methods whereby the drops are
illuminated so that their shadows fall onto a detector array. Such data, if combined with
multispectral measurements of the visible and near- and far-infrared flux in selected spectral
intervals, allow the radiative properties of real clouds to be calculated and compared with the
measured fluxes at their boundaries. Parametrizations useful in climate models can then be
developed.

Figure 9. The estimated global carbon budget and its main components as a function of time
(Woods Hole Research Center).
5.1.3. Carbon dioxide exchange with the oceans. Estimates of the global carbon budget, and
hence of the carbon dioxide amounts available to drive greenhouse warming, include a large
unidentified component (figure 9). This may be due in part to difficulties in measuring the
exchange of CO2 between the atmosphere and the oceans.
Shipborne measurements remain the only way to obtain data below the surface layer of the
oceans, and global and seasonal coverage is obviously a problem. International programmes,
such as the Joint Global Ocean Flux Study, coordinate the collection of vast numbers of data
and their analysis. Samples are collected from many depths using towed, floating and moored
sediment traps, pumps and coring drills. Water samples are analysed for their dissolved gas
and material content. The analysis aims to establish how much of the CO2 absorbed annually
by the oceans is subsequently re-released, and how, where and when. For instance, it has been
found that the exchange varies by a factor of 4 depending on the occurrence of the well known
El Niño phenomenon, because of differences in the temperature structure and the degree of
upwelling in the ocean, and the weakening of the winds in the eastern half of the Pacific during
El Niño events.
Some of the CO2 is retained permanently, after undergoing chemical and biological
changes and sedimentation. A large amount is in solution, perhaps to be more freely released
when global warming occurs (since the solubility of CO2 in water decreases with increasing
temperature) to provide an additional positive feedback mechanism. Minute plant life (phyto-
plankton) in the top 200 m of water consumes CO2 through photosynthesis, but the variability
of this process, its temperature dependence and the ultimate fate of the CO2 are obscure.
5.2. Satellite measurements

In satellite programmes, the limiting factor is not a limited data gathering capability, but rather the need
to develop sensors and techniques which are capable of obtaining the required parameters with
sufficient resolution and precision. Again, selected examples will illustrate the current state
of the art.
5.2.1. Earth Radiation Budget measurements. Since the 1960s satellites have measured the
outgoing reflected solar radiation, i.e. the albedo, and the outgoing thermally emitted radiation,
from Earth orbit. This is a less simple task than it may at first appear; the absolute calibration of
accurate radiometers is notoriously difficult, particularly in space where the hostile environment
can lead to deterioration of reference targets which cannot be checked. Furthermore, no real
instrument can obtain a uniform spectral response over the wide wavelength ranges (roughly
0.4–4.0 µm for solar and 4.0–100 µm for thermal fluxes) required to measure local energy
balance. Finally, albedo, although a simple concept, in practice is difficult to derive from
data because integration over a 2π solid angle is required of a field which may have strong
directional components (the thermal flux is easier because in this case it is reasonable to expect
that cylindrical symmetry applies).
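The flux-from-radiance problem mentioned here amounts to evaluating F = ∫ L(θ) cos θ dΩ over the upward hemisphere from radiances sampled in only a few directions. The sketch below integrates a purely illustrative anisotropic reflectance pattern and compares it with the isotropic result F = πL, showing why an angular model is needed to convert a single measured radiance into a flux.

```python
import numpy as np

def hemispheric_flux(radiance_of_theta, n=2000):
    """F = 2*pi * integral_0^{pi/2} L(theta) cos(theta) sin(theta) dtheta
    (azimuthal symmetry assumed)."""
    theta = np.linspace(0.0, np.pi / 2.0, n)
    integrand = radiance_of_theta(theta) * np.cos(theta) * np.sin(theta)
    return 2.0 * np.pi * np.trapz(integrand, theta)

L0 = 100.0                                                           # W m^-2 sr^-1, arbitrary
f_iso = hemispheric_flux(lambda t: L0 * np.ones_like(t))             # isotropic field
f_aniso = hemispheric_flux(lambda t: L0 * (1.0 + 0.5 * np.cos(t)))   # crude forward-peaked pattern
nadir = 1.5 * L0                                                     # nadir radiance of the anisotropic field

print(f"isotropic field:   F = {f_iso:.0f} W m^-2 (pi*L0 = {np.pi * L0:.0f})")
print(f"anisotropic field: true F = {f_aniso:.0f} W m^-2, "
      f"pi * nadir radiance = {np.pi * nadir:.0f} W m^-2")
```

In practice the empirical angular distribution models mentioned below play the role of the assumed L(θ) here.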
The instrumentation on NASA’s Earth Radiation Budget Experiment (ERBE) satellites
addressed these difficulties by using redundant standard radiance targets, by breaking the
wavelength range into segments which are measured and calibrated separately and by using
angular scans which later can be integrated into hemispherical fluxes. The last of these is
problematical because the angular coverage over a given region from a single satellite obviously
can never be complete; it is necessary to fit the data to empirical models of the reflectance
properties of different types of surface and integrate the model, with a corresponding addition
to the error budget.
There is still the difficulty of covering the globe with a small number of satellites, given
that the quantities of interest are highly variable in space and time, including a powerful diurnal
component. ERBE sensors flew on two sun-synchronous polar orbiting satellites (NOAA-9
and NOAA-10) and on the Earth Radiation Budget Satellite (ERBS), which was launched
into a 57◦ inclination orbit which precesses with a 36 day cycle. The first of an improved
version known as CERES (Clouds and the Earth’s Radiant Energy System) was launched into
a 35◦ inclination orbit in 1998 and the second in 2000 on Terra, part of NASA’s new Earth
Observing System series of satellites. In Europe, plans are well advanced to include radiation
budget sensors on a geostationary platform, using a field of view which covers the whole Earth
so as to obtain integrated global coverage.
Satellite data have already provided the first reliable information on such basic climate
variables as the mean cloud cover of the Earth (more than 60%, or 10% more than earlier
estimates). Oceans are cloudier than continents, at 67%, and clouds over the ocean are lower
and about 10% less reflective than those over land. The tropics and the temperate zones are
generally cloudier than higher latitudes, but the latter are typically almost twice as reflective.
5.2.2. European Environmental Satellite (Envisat). We turn now from radiation budget
satellites, which are essentially data gathering systems, to examples of scientific satellites,
where the instruments are experiments investigating atmospheric processes and the goal, so
far as forecasting is concerned, is less to look at the boundary conditions on the climate
system and more to improve the physics in the models. At the time of writing, the European
Space Agency is preparing to launch Envisat, a large (2.673 t) polar-orbiting satellite which will
provide measurements of the atmosphere, ocean, land and ice over a 5 year period (ESA 1998).
Figure 10. The Envisat spacecraft showing the sensors. See text for key to acronyms.
5.2.3. The Earth Observing System (EOS). The Earth Observing System is NASA's latest
series of giant satellites dedicated to climate studies. The first in the series, called Terra, was
launched in 2000; Aqua and Aura are scheduled to follow in 2001 and 2003.
A key feature of the advanced instruments on these satellites is that they address much
more comprehensively than before the various difficulties of measuring the atmosphere near
the surface from space. These include fluctuations in the spectral properties of the surface,
cloud contamination of radiances and the fact that the limb-viewing technique cannot be used
because the troposphere is too opaque when viewed tangentially. This last point means that the
vertical resolution which can be obtained is limited, and the abundance retrievals are subject to
errors introduced by the lack of knowledge of the shape of the vertical profile. Nevertheless,
tropospheric studies clearly are an essential part of a long-term programme to understand
climatic change and must be tackled.
MOPITT (for Measurements Of Pollution in the Troposphere) on Terra is a gas correlation
instrument viewing vertically downwards and obtaining a small footprint a few kilometres
square on the surface (Drummond et al 1996). It uses the pressure-modulated gas correlation
technique to separate the spectral lines of its target species, in this case carbon monoxide and
methane, from the forest of lines due to water vapour and other more abundant molecules.
With high-spatial-resolution maps, the problem of finding cloud-free regions is reduced and
MOPITT offers the possibility of studying the production and subsequent evolution of
concentrations of these two important greenhouse gases. CO has a relatively short lifetime
of a few months and shows substantial variability across the globe, for example a factor of
three difference between the mean values in the northern and southern hemispheres. MOPITT
measurements aim to identify the surface sources and sinks, natural and anthropogenic, and
the transport processes from the surface to the upper atmosphere. Early results show CO
concentrated over areas of industrial activity and of biomass burning in South America.
Methane has a long lifetime (7 years) and as a result is well mixed in the troposphere, with
only a few per cent inter-hemispheric difference and seasonal variation. It remains to be shown
how well these subtle features can be extracted from MOPITT data.
Even more advanced is the Tropospheric Emission Spectrometer instrument on Aura; TES
is an immensely sophisticated Fourier transform spectrometer with mechanically cooled optics
and detectors, which has the potential to measure thermal emission from tropospheric minor
constituents with high spectral and high spatial resolution, over a wide spectral range (Beer
et al 2001). The main focus is on tropospheric ozone, a key constituent in pollution studies, but
global measurements of a range of species (NOy , CO, SO2 and H2 O as well as ozone), from the
surface up through the middle atmosphere, will also advance the atmospheric chemistry part
of the greenhouse puzzle rapidly once it is available. Finally, alongside TES on Aura will be
the High-Resolution Dynamics Limb Sounder HIRDLS, which aims to elucidate the complex
coupling between atmospheric chemistry and dynamics. A prominent example of this is the
way the budget of important trace constituents in both the troposphere and the stratosphere
depends on transport in both directions across the tropopause. To first order, such transport is
inhibited by the temperature structure and the ways in which it actually occurs are very poorly
understood. However, key issues like quantifying the sources of tropospheric ozone, and the
rates at which ozone-depleting halogen compounds are introduced into the stratosphere, and
eventually removed, depend crucially on new progress being made.
6. Conclusion
In 1990, the position according to Taylor (1991) was that ‘it will probably be another
decade at least before definite experimental evidence for or against global warming is in
hand’. Although not everyone would agree, this has probably turned out to be an accurate
prediction. It is also the case that advances in measurements and modelling and increased
understanding of many key atmospheric processes have placed forecasts of future warming
on a much firmer footing (although not unassailable—see, for example, Lindzen 1995). The
best of these forecasts (IPCC 2001) suggests continued global warming which could have
serious consequences for living things on the Earth on timescales of less than a century
(see also Hansen et al 1998, 2000). The largest advances to be anticipated in the next 10 years
will probably come in aspects of more regional climate forecasting, leading to a better
understanding of the expected impacts on different parts of the globe, and on the frequency of
extreme events such as hurricanes.
References
Beer R, Glavich T A and Rider D M 2001 Tropospheric emission spectrometer for the Earth Observing System AURA
satellite Appl. Opt. 40 2356–67
Brasseur G P, Orlando J and Tyndall G 1999 Atmospheric Chemistry and Global Change (New York: Oxford University
Press)
Dobson G M B 1930 Proc. R. Soc. A 129 411–17
Drummond J R and Mand G S 1996 The measurements of pollution in the troposphere (MOPITT) instrument:
overall performance and calibration requirements J. Atmos. Ocean. Technol. 13 314
ESA 1998 ENVISAT-1 Mission and System Summary (European Space Agency)
Gill A E 1982 Atmosphere and Ocean Dynamics (New York: Academic)
Goody R M and Yung Y 1989 Atmospheric Radiation (Oxford: Oxford University Press)
Hansen J E et al 1998 Climate forcings in the industrial era Proc. Natl Acad. Sci. USA 95 12 753–8
Hansen J, Sato M, Ruedy R, Lacis A and Oinas V 2000 Global warming in the twenty-first century: an alternative
scenario Proc. Natl Acad. Sci. USA 97 9875–80
Hansen J E and Travis L D 1974 Space Sci. Rev. 16 527–610
IPCC 2001 (Houghton J T, Ding Y, Griggs D J, Noguer M, van der Linden P J and Xiaosu D (ed)) Climate Change 2001:
The Scientific Basis. Contribution of Working Group 1 to the Third Assessment Report of the Intergovernmental
Panel on Climate Change (IPCC) (Cambridge: Cambridge University Press) p 944
Lindzen R S 1995 How cold would we get under CO2-less sky? Phys. Today 48 78–80
Lunine J I 1998 Earth: Evolution of a Habitable World (Cambridge: Cambridge University Press) p 344
Schimel et al 1995 CO2 and the carbon cycle Climate Change 1994 (Cambridge: Cambridge University Press)
Taylor F W 1991 The greenhouse effect and climate change Rep. Prog. Phys. 54 881–918