Remote Sensing Course Outline
GEO 2404
DONNEX CHILONGA
CELL: 0999371020/0882756490
Email: donnexc@[Link] / chilonga.d@[Link]
COURSE OUTLINE
Aim of Study
• To help students acquire basic understanding of remote
sensing and skills in image interpretation
5. Sensing Systems:
definition of sensor systems (passive systems and active systems);
short wave radiation sensors;
thermal infrared sensors;
passive microwave sensors;
radar sensing system;
orbiting earth satellites
Recommended Texts
Fotheringham, A. S., and P. A. Rogerson, editors. Spatial Analysis and GIS. London:
Taylor and Francis, 1994.
Keates, J. S. Cartographic Design and Production, 2nd edition. Harlow, Essex:
Longman, 1989.
MacDougall, E. B. Computer Programming for Spatial Problems.
London: Edward Arnold, 1976.
Ritchie, W. Surveying and Mapping, new edition. Harlow, Essex: Longman, 1988.
Topic 1: The Art and Science of Remote Sensing
• All of us in this class have been involved, in one way or another, in collecting data,
processing it, analyzing it, or using it for decision making. The acquired data may be
used to yield information for management purposes, e.g., disaster management,
natural resource management, land management, etc.
For class discussion: What is data? What is information?
• This course therefore focuses on the methods used to collect this particular geospatial data
or georeferenced data.
For class discussion: What is geospatial/georeferenced data?
• The spatial data would be important to different users in different ways. For example;
a) To an agronomist who would be interested in forecasting overall agriculture production.
b) To an urban planner who would be interested in identifying illegally built structures.
c) To a mining engineer who would be interested in producing a map of surface mineralogy.
d) To a climatologist who would be interested in understanding the cause of Cyclone Anna.
• All the above examples deal with spatial phenomena. Acquiring information for all of
these examples requires a wide variety of methods.
• For purposes of this course, we need to distinguish two main categories of spatial data
acquisition, namely, ground-based and remote sensing methods.
Remote sensing methods involve the use of image data acquired by a sensor such as
an aerial camera, a scanner or a radar.
Definition of remote sensing
“The simplest way of understanding the term remote sensing can be related to the reading of
this sentence itself. Our eyes sense the written sentences on the page; our brain then
processes this information and interprets the logical meaning. This is how remote sensing
technology works”.
Remote sensing is the science and art of obtaining information about an object, area or
phenomenon through the analysis of data acquired by a device that is not in physical
(intimate) contact with the object, area, or phenomenon under investigation.
Remote sensing is the science of acquiring, processing and interpreting images that
record the interaction between electromagnetic energy and matter.
Remote sensing is the instrumentation, techniques and methods used to observe the Earth’s
surface at a distance and to interpret the images or numerical values obtained in order to
acquire meaningful information about particular objects on Earth.
Common to all definitions is the element that characteristics of the Earth’s surface are
acquired with a device that is not in contact with the object being measured.
Historical Overview of Remote Sensing
• Use of photography to study earth features started in 1838, when Wheatstone first
demonstrated the use of the reflecting mirror stereoscope to view trees on photographs. However,
Niepce, Talbot and Daguerre were the first to make an official disclosure of the existence of
photography in 1839.
• Airborne remote sensing was probably born in 1858 in France when the first known aerial
photograph was taken from a balloon by Tournachon. However, the earliest aerial
photograph still existing today was taken over Boston in the USA by Black in 1860.
• In 1903 the airplane was invented and in 1908 the first aerial photograph from an airplane
was taken in France. Systematic strip photography was introduced during WW I in 1917 and
by 1920 professional aerial photo interpretation was taking place.
• As early as 1922 vertical aerial photographs were being taken for forest surveys in Burma.
The aerial photographs were taken with an amount of overlap to allow 3-dimensional
viewing (stereovision). Until the early 1960s, the aerial photograph remained the single
standard tool for depicting the earth’s surface from a vertical or oblique perspective.
Historical Overview of Remote Sensing…continued…
• Satellite remote sensing started with the launch of the Russian satellite, Sputnik-1 in 1957.
In 1959 the first photo from outer space was taken by the American satellite Explorer-6.
From 1960 onwards there was systematic observation of the earth’s surface by the US
satellite TIROS-1 (Television Infrared Observation Satellite-1). The first US space station,
SkyLab, went into orbit in 1973. This was followed in the 1980s by the reusable US Space
Shuttle.
• In 1972 the US ERTS-1 (Earth Resource Technological Satellite (renamed Landsat-1)) was
launched. This was the first of a series of Landsat satellites (up to Landsat-7), which have
become probably the most important earth resource satellites.
• Today, remote sensing is carried out using airborne methods as well as satellite technology.
Remote sensing not only uses film photography, but also digital cameras, scanners and
video, as well as radar and thermal sensors. In the past, remote sensing was limited to
what could be seen in the visual part of the electromagnetic spectrum. Today, the parts of
the spectrum which cannot be seen with the naked human eye can now be utilised through
special filters, photographic films and other types of sensors.
How is remote sensing useful (Advantages)?
Limitations
• The main limitation is that models are not obtained with as high an
accuracy as in-situ observations or aerial photography.
• Therefore, it is best practice to integrate remote sensing
methods with complementary technologies such as in-situ
observations.
Utilization of light energy in remote sensing
• Every remote sensing process involves an interaction between
incident radiation and the target of interest: the radiation falling
on the target is altered according to the physical properties of
the target, and the portion reflected back is recorded by the
sensor.
• This is illustrated by the use of imaging systems (referred to as
optical remote sensing), in which the following seven elements of
remote sensing are involved.
• It should also be noted that remote sensing may involve the sensing
of emitted energy and the use of non-imaging sensors (referred to as
thermal remote sensing).
Elements/Components of a Remote Sensing System
i. Source of Illumination (A) - First requirement for any RS process is an energy source to provide EM energy
to the target of interest.
ii. Radiation and the Atmosphere (B) – as the energy propagates from its source to the target, it interacts
with the atmosphere as it passes through. This interaction may take place a second time as the energy
travels from the target and back to the sensor.
iii. Interaction with the Target (C) - once the energy makes its way to the target through the atmosphere, it
interacts with the target depending on the characteristics of both the target and the radiation.
iv. Recording of Energy by the Sensor (D) - after the energy has been scattered by, or emitted from, the target, a
sensor is required to collect and record the electromagnetic radiation.
v. Transmission, Reception, and Processing (E) - the energy recorded by the sensor has to be transmitted,
often in electronic form, to a receiving and processing station where the data are processed into an image
(hardcopy and/or digital).
vi. Interpretation and Analysis (F) - the processed image is interpreted, visually or digitally or electronically,
to extract information about the target which was illuminated.
vii. End users and application (G) - the last element of RS process is achieved when the useful information is
extracted from the imagery to reveal some new information, or assist in solving a particular problem.
Topic 2: The Electromagnetic Energy/Radiation
and The Electromagnetic Spectrum
• The electromagnetic spectrum ranges from the shorter wavelengths (including gamma and
X rays) to the longer wavelengths (including microwaves and radio waves).
• All these forms of energy radiate in accordance with basic wave theory, c = ν λ. This
theory describes EM energy as traveling in a harmonic, sinusoidal fashion at the velocity
of light c measured in ms-1.
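The relation c = νλ can be checked with a short computation; this is a sketch using an approximate value of c, and the frequency chosen is illustrative, not from the text:

```python
C = 3.0e8  # approximate speed of light in a vacuum, m/s

def wavelength_m(frequency_hz):
    """Invert c = nu * lambda to recover wavelength from frequency."""
    return C / frequency_hz

# Green light at roughly 5.45e14 Hz corresponds to a wavelength near 0.55 micrometres.
lam = wavelength_m(5.45e14)
```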
The Electromagnetic spectrum
Regions of electromagnetic spectrum and wavelengths as used for Remote Sensing
Sensors on board earth resources satellites operate in visible, infrared and microwave
regions of the spectrum.
i. Ultraviolet: The UV region has short wavelengths (below about 0.4 μm) and high frequency. UV wavelengths are
largely unavailable for RS, as almost 80 to 90% of UV is absorbed by ozone (O3); instead they are used in
geologic and atmospheric science applications. Geologic applications include mineral exploration, since many rocks
and minerals fluoresce, emitting visible light in the presence of UV radiation.
ii. Visible light: This is the portion of EMR to which our eyes are sensitive and is perceived as colors. The visible light
covers a very small portion of the spectrum, ranging from approximately 0.4 to 0.7 μm (micrometer).
iii. Infrared: The thermal IR differs from visible and reflected IR in that this energy is radiated or emitted from
the earth’s surface or objects in the form of heat. It covers wavelengths from
approximately 3.0 μm to 100 μm. These are mainly used for monitoring the temperature variations of land,
water and ice.
iv. Microwave: This portion is of recent interest to remote sensing; the wavelength ranges approximately
from 1 mm to 1 m. The shorter wavelengths have properties similar to the thermal infrared region, while
the longer wavelengths approach those used for radio broadcasts. Atmospheric disturbances are minimal and
microwaves penetrate clouds, making this region well suited to active sensors. Microwave remote sensing is used in
studies of meteorology, hydrology, oceans, geology, agriculture, forestry and soil moisture.
Radiant intensity (Ie)
• This is the radiant flux radiated from a point source per unit solid
angle in a radial direction and is expressed in units of W sr-1
(where sr is the steradian, the unit of solid angle).
Irradiance (Ee)
• This is the radiant flux incident upon a surface per unit area
and expressed in a unit of Wm-2.
Radiant emittance (Radiant exitance ‘Me’)
• This is the radiant flux radiated from a surface per unit area
and expressed in a unit of Wm-2.
Radiance (Le)
• This is the radiant intensity per unit projected area in a radial
direction and is expressed in units of W sr-1 m-2. Radiance can also be
described as the radiation field as it depends on the angle of
view.
Electromagnetic Radiation Laws
• The electromagnetic energy follows certain physical laws as it
moves away from the source.
• Isaac Newton, in his corpuscular theory, analyzed the dual nature of
light: energy exhibiting both discrete and continuous phenomena,
associated with a stream of minuscule (minute, tiny) particles
travelling in a straight line.
• This notion is consistent with the modern theories of Max Planck
(1858 – 1947) and Albert Einstein (1879 – 1955).
The concept of a Black Body
• All objects with temperature above absolute zero emit electromagnetic energy.
• The amount of energy and the associated wavelengths depend upon the temperature
of the object. As the temperature of an object increases, the quantum of energy
emitted also increases, and the corresponding wavelength of the maximum emission
becomes shorter.
• The above hypothesis can be expressed by using the concept of blackbody. A blackbody
is a hypothetical source of energy that behaves in an idealized manner such that it
absorbs all or 100% of the radiation incident upon it and emits back (or radiates) the
energy as a function of temperature.
• Planck’s, Kirchhoff’s, Stefan-Boltzmann’s and Wien’s displacement laws explain the
relationships between temperature, wavelength, frequency and intensity of energy.
Planck’s Law
• This law provides the spectral radiance of a black body as a function of temperature.
• Planck ascertained that electromagnetic energy is absorbed and emitted in discrete units
called ‘photons’. The size of each unit is directly proportional to the frequency of the
radiation.
• Planck’s theory therefore proposed that electromagnetic energy can be quantified by its
wavelength and frequency; its energy is denoted ‘Q’ and is measured in joules.
• The energy released by a radiating body in the form of a photon travelling at the
speed of light can be quantified by relating the energy’s wavelength to its frequency.
Planck’s Law…continued…
• Planck defined a constant ‘h’ (Planck’s constant, 6.626 × 10-34 J s) to relate frequency (ν)
to radiant energy ‘Q’, expressed as follows:
Q = hν
Planck’s Law…continued…
• Since ν = c/λ, the above equation reveals that photons of longer
wavelengths carry low energy, while photons of shorter wavelengths
carry high energy.
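This wavelength-energy relationship can be sketched numerically; the two wavelengths below are illustrative choices (a visible-blue and a thermal-IR photon), not values from the text:

```python
H = 6.626e-34  # Planck's constant, J s
C = 3.0e8      # approximate speed of light, m/s

def photon_energy_joules(wavelength_m):
    """Photon energy Q = h * nu = h * c / wavelength."""
    return H * C / wavelength_m

# A blue-light photon (0.4 um) carries 25 times the energy of a 10 um thermal-IR photon.
q_blue = photon_energy_joules(0.4e-6)
q_thermal = photon_energy_joules(10e-6)
```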
• The emissivity of a true blackbody is 1, and that of a perfect reflector (a white body) would be
zero. All real objects have emissivities between these two extremes. Objects that
absorb high proportions of incident radiation and re-radiate this energy have high emissivities,
whereas those which absorb less radiation have low emissivities, i.e., they reflect more of the
energy that reaches them.
Stefan-Boltzmann Law
• The Stefan-Boltzmann law defines the relationship between the total emitted
radiation (W) (expressed in watts m-2) and temperature (T) (absolute
temperature, K):
W = σT⁴
• The total radiation emitted from a blackbody is proportional to the fourth
power of its absolute temperature. The constant (σ) is the Stefan-Boltzmann
constant (5.6697 × 10-8 watts m-2 K-4).
• In short, Stefan-Boltzmann law states that hot blackbodies emit more energy
than cool blackbodies.
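The fourth-power dependence can be verified with a short sketch using the constant quoted above:

```python
SIGMA = 5.6697e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def total_emittance(T_kelvin):
    """Total radiation emitted per unit area by a blackbody: W = sigma * T^4."""
    return SIGMA * T_kelvin ** 4

# Doubling the absolute temperature increases the emitted energy 2^4 = 16 times.
ratio = total_emittance(600.0) / total_emittance(300.0)
```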
Wien’s Displacement Law
• This law specifies the relationship between the wavelength of emitted radiation and the
temperature of the object:
λmax = 2898/T
• Where λmax is the wavelength (in μm) at which the radiance is at a maximum and T is the
absolute temperature in kelvin (K). As objects become hotter, the wavelength of maximum
emittance shifts to shorter wavelengths.
• This law is useful for determining the optimum wavelength for observing an object at
temperature T kelvin.
• Together, the Wien and Stefan-Boltzmann laws are powerful tools. With the help of these
laws, temperature and radiant energy can be determined from an object’s emitted
radiation. For example, the temperature distribution of large water bodies can be mapped by
measuring the emitted radiation; similarly, discrete temperatures over a forest canopy can
be detected to plan and manage forest fires.
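The two laws can be applied together in a few lines. The temperatures below (an approximate solar surface and an ambient Earth surface) are illustrative assumptions, not figures from the text:

```python
SIGMA = 5.6697e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def peak_wavelength_um(T):
    """Wien's displacement law: wavelength of maximum emission in micrometres."""
    return 2898.0 / T

def total_emittance(T):
    """Stefan-Boltzmann law: total emitted radiation in W m^-2."""
    return SIGMA * T ** 4

# The Sun (~6000 K) peaks near 0.48 um (visible); a ~300 K surface peaks near 9.7 um,
# inside the thermal-IR region used for temperature mapping.
sun_peak = peak_wavelength_um(6000.0)
earth_peak = peak_wavelength_um(300.0)
```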
• An example illustrating the radiation laws;
Self-assessment exercise
1. Calculate the wavelength of the maximum energy emission for Mars.
• The amount of scattering depends on several factors including the wavelength of the
radiation, the abundance and size of the particles or gases, and the distance the radiation
travels through the atmosphere.
Types of scattering
• Generally, there are three types of scattering which take place in the
earth’s atmosphere, namely;
a) Rayleigh scattering,
b) Mie scattering, and
c) nonselective scattering.
Rayleigh scattering:
• Rayleigh scattering takes place when the suspended particles (mainly gas molecules or
tiny dust particles) are very small compared to the wavelength of the radiation. This type
of scattering causes shorter wavelengths of energy to be scattered much more than longer
wavelengths; the amount of scattering is governed by the reciprocal of the fourth power of
the wavelength, expressed as:
scattering ∝ 1/λ⁴
• This type of scattering is dominant in the upper layers of the atmosphere, where tiny dust
particles and gas molecules predominate. Rayleigh scattering is responsible for the blue
color of the sky, since blue light, having the shortest wavelength in the visible range, is
scattered the most. The same reasoning applies to the orange color of the sky at dusk:
when the sun is low on the horizon, the incoming radiation travels a longer path through
the atmosphere, so most of the shorter (blue) wavelengths are scattered out and the
transmitted light appears orange-red.
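The 1/λ⁴ dependence explains quantitatively why blue dominates the scattered skylight; the band-centre wavelengths below are the usual approximate values:

```python
def rayleigh_relative_scatter(wavelength_um):
    """Relative Rayleigh scattering strength, proportional to 1 / wavelength^4."""
    return wavelength_um ** -4

# Blue light (~0.4 um) is scattered about 9.4 times more strongly than red (~0.7 um).
ratio = rayleigh_relative_scatter(0.4) / rayleigh_relative_scatter(0.7)
```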
Mie scattering:
• This type of scattering occurs when the particles are just about the same size as the
wavelength of the radiation. Dust, pollen, smoke and water vapour are common
causes of Mie scattering, which affects longer wavelengths than those affected by
Rayleigh scattering. Mie scattering is more dominant in the lower layers of the
atmosphere (i.e., within 0 to 8 km), where larger particles are abundant and
influence a broad range of wavelengths in and near the visible spectrum.
Nonselective scattering:
• This scattering phenomenon occurs when the particles are much larger than the
wavelength of the incoming radiation thereby leading to approximately equal
scattering of all wavelengths (i.e., blue + green + red light = white light). Water
droplets and large dust particles are mostly responsible for causing this type of
scattering. Due to this scattering, clouds appear white in color and so as a blurry
white foggy appearance to the suspended water droplets during winter seasons.
Absorption
• Gases, namely ozone (O3), carbon dioxide (CO2) and water vapor (H2O), are responsible for
most of the absorption of electromagnetic radiation.
• Ozone forms when high-energy ultraviolet radiation interacts with oxygen molecules (O2)
present at an altitude of 20 to 30 km in the stratosphere. The ozone layer forms a
protective layer in the atmosphere by absorbing the harmful UV radiation that would
otherwise cause skin burns or other severe skin diseases on exposure to sunlight.
• Lastly, water vapour present in the lower atmosphere (its concentration normally varies
from 0 to 3% by volume) is more effective at absorbing radiation than the other
atmospheric gases. Two important regions of the spectrum, from 5.5 to 7.0 µm and
above 27 µm, are absorbed by as much as 75% to 80%.
The Atmospheric Windows
• These are regions or bands of the electromagnetic spectrum which are not severely
influenced by atmospheric absorption and thus are partially or completely transmitted
through onto the surface of the Earth.
• In other words, gas molecules present in the atmosphere selectively transmit radiation
of certain wavelengths, and those wavelengths that are relatively easily transmitted
through the atmosphere are referred to as atmospheric windows.
• Around 90 to 95% of visible light passes through the atmosphere; otherwise there
would never be bright sunny days on earth.
• The atmosphere is almost 100% transparent at certain wavelengths of the mid and near
infrared spectrum, which makes remote sensing analysis of satellite images in these
regions possible with minimum distortion. The thermal infrared window from 10 to 12
µm is used in measuring surface temperatures of the ground, water and clouds. Ozone
blocks ultraviolet radiation almost completely, and almost all radiation in the range of 9.5
to 10 µm is absorbed.
The Atmospheric Windows
Self-assessment exercise
1. Explain why most remote sensing sensors avoid detecting and recording wavelengths
in the ultraviolet portion of the spectrum.
2. What do you think would be some of the best atmospheric conditions for remote
sensing in the visible portion of the spectrum?
Interaction of EMR with the Earth’s Surface
• Electromagnetic radiation that is not completely absorbed or scattered in the atmosphere
travels through the entire depth of the atmosphere before finally reaching the earth’s
surface.
• Three types of interaction take place when radiation is incident (I) upon the surface:
reflection, absorption and transmission.
• The total energy of radiation incident upon the surface follows the law of conservation of
energy, or the energy balance, written as:
Ei = Er + Ea + Et
• Where, Ei is the incident energy, Er is the reflected energy, Ea is the absorbed energy and Et
is the transmitted energy.
• The type and degree of interaction of the radiation varies according to the size
and surface roughness of different objects, as well as with wavelength.
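The energy balance can be rearranged to recover any one component from the other three; the figures below are hypothetical:

```python
def reflected_energy(e_incident, e_absorbed, e_transmitted):
    """Solve the energy balance Ei = Er + Ea + Et for the reflected component."""
    return e_incident - e_absorbed - e_transmitted

# Hypothetical figures: of 100 units incident, 30 absorbed and 20 transmitted
# leaves 50 units reflected.
e_r = reflected_energy(100.0, 30.0, 20.0)
```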
Reflection:
• Reflection occurs when radiation bounces off a target (mostly an opaque surface) and is re-
directed. The degree of re-direction depends upon the magnitude of the surface roughness
compared to the wavelength of the incident radiation.
Types of reflection
• Two types depending upon the characteristics of the surface or feature of interest.
i. Specular reflection takes place when the surface is smooth and all (or almost all)
of the energy is directed away from the surface in a single direction. The angle of
reflection is equal to the angle of incidence (e.g., a mirror, a still water body or a
smooth metal surface). This type of reflection results in very bright spots (hot spots) on the image.
ii. Diffuse reflection occurs over rough surfaces, where incident radiation is reflected
almost uniformly in all directions. If the wavelengths are much smaller than the surface
roughness variations, diffuse reflection will dominate (e.g., loam soil would appear fairly
smooth to long-wavelength microwaves but rough at visible wavelengths).
Transmission:
• Transmission of radiation takes place when radiation passes through an object or
feature without any considerable fading or attenuation. For a known thickness of
the object, the ability of the medium to transmit energy is measured as its
transmittance (t):
t = Et / Ei (transmitted energy divided by incident energy)
Absorption:
• This occurs when radiation is absorbed by the target. The portion of the EM
energy which is absorbed by the Earth’s surface is available for emission, largely
in the form of heat.
• Furthermore, the spectral response of a feature is wavelength dependent meaning that the
proportion of reflected, absorbed, and transmitted energy will vary at different
wavelengths. Thus, two features may be indistinguishable in one spectral range and be very
different in another wavelength band. E.g., water and vegetation reflect nearly equally in
visible wavelengths, yet these are separable in near-IR wavelengths.
• Within the visible portion of the spectrum, these spectral variations result in the visual
effect called colour. E.g., vegetation reflects more highly in the green portion of the
spectrum and therefore appears green. About 95% of blue and red light are absorbed by
chlorophyll for photosynthesis while most of the green component is reflected.
Spectral reflectance (Rλ)
• This is the reflectance characteristics of earth surface features expressed as the ratio of
energy reflected by the surface to the energy incident on the surface, measured as a
function of wavelength. It is also known as albedo of the surface. It may vary from 0-100%.
• Behaviour of the major surface types: water bodies, bare soil and vegetation.
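Spectral reflectance is a simple ratio of reflected to incident energy; the example values below are hypothetical, not measured figures:

```python
def spectral_reflectance_percent(e_reflected, e_incident):
    """Reflectance (albedo): percentage of incident energy reflected by the surface."""
    return 100.0 * e_reflected / e_incident

# E.g., a surface reflecting 45 of 100 incident units has a reflectance of 45%.
r = spectral_reflectance_percent(45.0, 100.0)
```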
Topic 5:
Remote Sensing Systems
• In Chapter 4, we looked at the underlying principles of
remote sensing. In this chapter, we shall look at how sensors
record and measure the electromagnetic energy reflected
or emitted by an object.
• Please refer to the diagram of components of RS as shown in
chapter 1.
What is a sensor?
• A sensor is a device that detects, measures and records
electromagnetic energy.
Types of sensors
There are two types of sensors:
a) passive and
b) active sensors.
Passive sensors
• These are sensors that depend on an external source of energy, usually the Sun.
• They can only be used to detect energy when the naturally occurring energy is
available (during the day).
• Examples include a camera without a flash; image-plane scanning sensors,
such as TV cameras; along-track scanning systems (e.g., SPOT HRV);
across-track scanning sensors, such as the optical-mechanical multispectral
scanners (Landsat MSS, TM); and scanning microwave radiometers.
Active sensors
• Active sensors rely on their own sources of energy for illuminating
objects.
• For example, radar systems, which emit their own microwave energy and
record the signal returned from the target.
What is a platform?
• A platform is the vehicle or carrier on which a sensor is mounted. For
example, consider our eyes as natural sensors and the head as a
natural platform.
• Sensor platforms may be situated:
a) on the ground,
b) on an aircraft or balloon, or
c) on a satellite outside of the Earth's atmosphere.
Airborne Remote Sensing – Aerial Photography
• The term "photography" is derived from two Greek words meaning "light"
(phos) and "writing" (graphien).
• An aerial photograph, in broad terms, is any photograph taken from the air,
usually from aircraft flying at altitudes ranging from one to nine
kilometers, using a highly accurate camera.
Types of photography
Two types of photography
• Film: most air photo missions are flown using black and
white film, however colour, infrared, and false-colour
infrared film are sometimes used for special projects.
• The most straightforward method for determining photo scale is to measure the
corresponding photo and ground distances between any two points.
• The scale S is then computed as the ratio of the photo distance d to the ground distance
D.
S=d/D
Example:
• Calculate the scale of an AP if a 1km stretch of highway covers 4cm on an air photo.
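A minimal sketch of the S = d/D computation, applied to the worked example above (4 cm on the photo covering 1 km on the ground):

```python
def photo_scale_denominator(photo_distance_m, ground_distance_m):
    """Return N for a representative-fraction scale of 1:N, from S = d / D."""
    return ground_distance_m / photo_distance_m

# 4 cm (0.04 m) on the photo covers 1 km (1000 m) on the ground -> a scale of 1:25,000.
n = photo_scale_denominator(0.04, 1000.0)
```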
Scale as a function of focal length & terrain elevation
• For a vertical photograph taken over flat terrain, scale is a function of the
focal length f of the camera and the flying height above the ground H’.
S = f / H’
• If the flying height above sea level H and the elevation of the terrain h are known,
S = f / (H − h)
Use of average scale
• The equations above illustrate that photo scale is a function of terrain elevation; where
elevation varies across a photo, an average scale can be computed from the average
terrain elevation havg:
Savg = f / (H − havg)
• The result of photo-scale variation is geometric distortion and relief displacement.
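The scale equations above can be sketched as one function; the focal length and flying height below are hypothetical example values:

```python
def photo_scale(f_m, H_m, h_m=0.0):
    """S = f / (H - h): focal length f, flying height above datum H, terrain elevation h."""
    return f_m / (H_m - h_m)

# Hypothetical camera: f = 152 mm, flying 3040 m above flat terrain at the datum -> 1:20,000.
s = photo_scale(0.152, 3040.0)
```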
Distortion and Displacement
Distortion
• Shift in the location of an object in a photo that changes the apparent
perspective characteristics of the photo.
Displacement
• Shift in the location of an object in a photo that does not change the
perspective characteristics of the photo (the fiducial distance between an
object's image and its true plan position, caused by a change in
elevation).
Types of distortions
4. Lens distortion.
Types of displacement
2. Tilt; and,
NOTE:
• Vertical APs are taken along flight lines or flight strips, which are often
parallel. GPS is used to control the flight line direction.
• When the aim of the photography is to cover the whole target area
completely, the ground areas of adjacent flight lines overlap by 15-40%.
This is called sidelap.
Discussion questions
3. Why are most aerial photographs taken from a vertical position as opposed to a
tilted angle?
4. What two aspects of aerial photography are used to define the scale of a
photograph, and how are they calculated?
Space (Satellite) Remote Sensing
• This became feasible with the launch of USSR satellite Sputnik-1 in 1957.
• Today, there is a wide choice of satellites data sources but for this course, we shall
concentrate on two.
1. The Earth Resource Satellites (Earth Observation Satellites).
2. The Environmental Satellites.
Advantages of Satellite Images over Aerial Photographs
i. Satellite data coverage has no limitation (space or political boundaries)
ii. The data are homogeneous
iii. Spatially continuous
iv. Data already in digital format
v. Data collection frequency is high with no limitation
vi. Lower cost than aerial photography
An orbit
• There are two orbits used: near polar sun synchronous and geostationary.
• Near polar & sun synchronous orbit is where the orbital plane is inclined at a small
angle with respect to the earth's rotation axis, and the satellite travels northwards on
one side of the Earth and then toward the southern pole on the second half of its orbit.
• Geostationary orbit is where a satellite follows an orbit parallel to the equator, in the
same direction as the earth's rotation, and views the same portion of the Earth’s surface
at all times. The satellite is stationary with respect to the earth's surface.
Characteristics of sensors on satellites
Spectral resolution: describes the ability of a sensor to define fine wavelength intervals.
The finer the spectral resolution, the narrower the wavelength range for a particular
channel or band (panchromatic, multi-spectral and hyper-spectral sensors).
Spatial resolution: describes the area of the earth that each pixel represents (the size of
the smallest possible feature that can be detected).
Temporal resolution: refers to the repetitivity of observation over an area and is equal to
the time interval between successive observations.
Using the characteristics above, we can group the most common land
observation satellites as follows.
In terms of spectral resolution, satellite imaging systems can be classified into:
Multi-spectral imaging system: The sensor is a multi-channel detector with a few spectral
bands. Each channel is sensitive to radiation within a narrow wavelength band. Examples
are Landsat TM, MSS, SPOT HRV-XS, Ikonos MS, QuickBird MS.
• The intensity of each pixel corresponds to the average brightness, or radiance, measured
electronically over the ground area. Each pixel has a digital number (DN) corresponding to the
radiance of each pixel.
• Typically, the DNs are recorded over such numerical ranges as 0 to 255, 0 to 511, 0 to 1023, or
higher. These ranges represent the sets of integers that can be recorded using 8-bit, 9-bit, or
10-bit binary computer coding scales respectively (N.B. 2^8 = 256, 2^9 = 512, 2^10 = 1024).
Note that "bright" pixels have high number values (e.g., 200 to 255), while "dark" pixels have
low number values (e.g., 50-100).
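The bit-depth arithmetic in the note above can be checked directly:

```python
def dn_levels(bits):
    """Number of distinct digital numbers at a given bit depth (DNs run 0 .. 2**bits - 1)."""
    return 2 ** bits

# 8-bit data gives 256 levels (0-255); 10-bit data gives 1024 (0-1023).
levels_8 = dn_levels(8)
levels_10 = dn_levels(10)
```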
Interpretation & analysis of remote sensing data
• This involves the identification and measurement of various targets in an image in order to
extract useful information about them.
Two main methods can be used:
1. visual interpretation: performed by a human interpreter, and is based on
feature tone (color), pattern, shape, texture, shadow and association.
2. digital processing and analysis: performed using a computer (without
manual intervention by a human interpreter).
• Digital processing and analysis is not carried out as a replacement for manual interpretation;
it is done to supplement and assist the human analyst. Moreover, computer analysis
techniques are more limited in their ability to evaluate spatial patterns than visual
methods. Therefore, visual and numerical techniques are complementary in nature.
Digital Image Processing and Analysis
• This is a collection of techniques for the manipulation of digital images by
computers.
Four most common image processing functions:
1. Preprocessing
2. Image Enhancement
3. Image Transformation
4. Image Classification and Analysis
• Pre-processing processes refer to those operations that are preliminary to the main
analysis.
• After feature (band) selection is complete, the analyst can work with only the
desired channels or bands, i.e., those carrying the most information for the
task at hand. Such pre-processing increases the speed and reduces the
cost of analysis.
b. Radiometric Corrections
• This aims to reconstruct physically calibrated values by correcting the
spectral errors and distortions caused by sensors, sun angle, topography
and the atmosphere.
• When images are recorded by the sensor, they contain errors in the
measured brightness values of the pixels. Radiometric processing
influences the brightness values of an image to correct for sensor
malfunctions or to adjust the values to compensate for atmospheric
degradation.
c. Geometric Corrections
• Raw digital images often contain serious geometrical distortions that arise from
earth’s curvature, platform motion, relief displacement, and non-linearities in
scanning motion.
a) Rectification is the process of projecting image data onto a plane and making it
conform to a map projection system.
• Involves rearrangement of the input pixels onto a new grid which conforms to the
desired map projection and coordinate system.
Two steps involved in geometric registration process
identifying the image coordinates (i.e., row, column) of several clearly discernible points,
called ground control points (GCPs), in the distorted image and matching them to their
true positions in ground coordinates (e.g., latitude and longitude measured from a map).
resampling: this process is used to determine the digital values to place in the new pixel
locations of the corrected output image.
a) nearest neighbour,
b) bilinear interpolation,
c) cubic convolution.
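A minimal sketch of nearest-neighbour resampling (the source image and output size are hypothetical): each output pixel simply takes the DN of the closest input pixel, so no new DN values are invented.

```python
import numpy as np

def resample_nearest(img, out_shape):
    """Nearest-neighbour resampling: pick, for each output pixel,
    the DN of the nearest input pixel."""
    rows = np.round(np.linspace(0, img.shape[0] - 1, out_shape[0])).astype(int)
    cols = np.round(np.linspace(0, img.shape[1] - 1, out_shape[1])).astype(int)
    return img[np.ix_(rows, cols)]

src = np.array([[10, 20], [30, 40]], dtype=np.uint8)
out = resample_nearest(src, (4, 4))
print(out.shape)  # (4, 4); every output DN is one of 10, 20, 30, 40
```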
• Correcting the image data for the degrading effects of the atmosphere
entails modeling the scattering and absorption processes that take place.
2. Image Enhancement Techniques
• Performed to make satellite imagery more informative. Involves the alteration of the
appearance of an image in such a way that the information contained in that image is
more readily interpreted visually in terms of a particular need.
Examples of enhancement functions include
contrast stretching to increase the tonal distinction between various features in a
scene. E.g., a linear contrast stretch, a linear contrast stretch with saturation, a
histogram-equalized stretch.
filtering is commonly used to restore imagery by suppressing noise, to enhance the
imagery for better interpretation, and to extract features such as edges and
lineaments. The most common types of filters are mean, median, low-pass, high-pass
and edge-detection filters.
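The two enhancement families above can be sketched together; the DN values, the 0 percent saturation setting, and the 3x3 window size are illustrative assumptions:

```python
import numpy as np

def linear_stretch(band, saturate=0.0):
    """Linear contrast stretch to the full 0-255 range.

    With saturate > 0, that percentage of the darkest and brightest
    pixels is clipped first (a 'stretch with saturation')."""
    lo = np.percentile(band, saturate)
    hi = np.percentile(band, 100 - saturate)
    scaled = (band.astype(float) - lo) / (hi - lo) * 255.0
    return scaled.clip(0, 255).astype(np.uint8)

def filter3x3(img, func):
    """Apply func (e.g. np.mean or np.median) over each 3x3 window."""
    padded = np.pad(img, 1, mode='edge')
    out = np.empty_like(img, dtype=float)
    for r in range(img.shape[0]):
        for c in range(img.shape[1]):
            out[r, c] = func(padded[r:r + 3, c:c + 3])
    return out

# Contrast stretch: DNs 50..150 are spread over the full 0..255 range
band = np.array([[50, 100], [75, 150]], dtype=np.uint8)
print(linear_stretch(band))       # [[0, 127], [63, 255]]

# Median filter: removes an isolated 'salt' noise spike
img = np.zeros((5, 5))
img[2, 2] = 255
smoothed = filter3x3(img, np.median)
print(smoothed[2, 2])             # 0.0 - the spike is gone
```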
3. Image Transformation
• Involves combined processing of data from multiple spectral bands. Arithmetic operations
(i.e., subtraction, addition, multiplication, division) are performed to combine and
transform the original bands into "new" images which better display or highlight certain
features in the scene.
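A classic example of such band arithmetic is the normalized difference vegetation index (NDVI), which combines a red and a near-infrared band into a new image that highlights vegetation; the reflectance values below are synthetic:

```python
import numpy as np

# Synthetic red and near-infrared reflectance bands
red = np.array([[0.10, 0.40]])
nir = np.array([[0.50, 0.45]])

# Band subtraction and division combined: NDVI = (NIR - Red) / (NIR + Red)
ndvi = (nir - red) / (nir + red)
print(ndvi.round(2))  # healthy vegetation gives values approaching +1
```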
4. Image Classification and Analysis
• Is the process of sorting all the pixels in an image into a finite number of
individual classes.
• Is the process of categorizing all pixels in an image or raw remotely sensed
satellite data to obtain a given set of labels or land cover themes.
a) Supervised Classification
• Supervised classification gives the user more control over the classification process
because the user manually selects the training data and assigns them to the correct classes.
• Each pixel is categorized into landcover class to which it closely resembles. If the
pixel is not similar to the training data, then it is labeled as unknown.
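One way to sketch this assignment rule is a minimum-distance classifier with an "unknown" threshold; the class means, pixel values and threshold here are all hypothetical:

```python
import numpy as np

def min_distance_classify(pixels, class_means, threshold):
    """Assign each pixel to the class with the nearest spectral mean;
    pixels farther than `threshold` from every mean become 'unknown' (-1)."""
    dists = np.linalg.norm(pixels[:, None, :] - class_means[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    labels[dists.min(axis=1) > threshold] = -1
    return labels

# Hypothetical 2-band training means for two land cover classes
means = np.array([[20.0, 10.0],    # class 0: e.g. water
                  [60.0, 90.0]])   # class 1: e.g. vegetation
pixels = np.array([[22.0, 12.0], [58.0, 88.0], [200.0, 200.0]])
print(min_distance_classify(pixels, means, threshold=30.0))  # [0, 1, -1]
```

The last pixel resembles neither training class, so it is labeled unknown.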
b) Unsupervised Classification
• This classifier involves algorithms that examine the unknown pixels in the
image and aggregate them into a number of classes based on the natural
groupings or clusters present in the image.
• This method is usually used when there is less information about the data
before classification. There are several mathematical strategies applied in
unsupervised classification:
Strategies applied in unsupervised classification
a) Sequential Clustering: In this method the pixels are analysed one at a time, pixel by pixel
and line by line. The spectral distance between each analysed pixel and the previously defined
cluster means is calculated. If the distance is greater than some threshold value, the
pixel begins a new cluster.
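The sequential rule can be sketched as follows; this simplified version keeps each cluster mean fixed at its seed pixel, and the pixel values and threshold are invented:

```python
import numpy as np

def sequential_cluster(pixels, threshold):
    """Scan pixels one at a time; a pixel farther than `threshold`
    from every existing cluster mean starts a new cluster.
    Simplification: each cluster mean stays at its seed pixel."""
    means, labels = [], []
    for p in pixels:
        if means:
            d = [np.linalg.norm(p - m) for m in means]
            best = int(np.argmin(d))
            if d[best] <= threshold:
                labels.append(best)
                continue
        means.append(np.asarray(p, dtype=float))  # start a new cluster
        labels.append(len(means) - 1)
    return labels

pixels = [np.array([10.0, 10.0]), np.array([12.0, 11.0]), np.array([90.0, 95.0])]
print(sequential_cluster(pixels, threshold=20.0))  # [0, 0, 1]
```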
b) Statistical Clustering: The algorithm uses 3x3 windows in which all pixels have similar
vectors in spectral space. The windows are moved one at a time through the image, avoiding
overlap. The mean and standard deviation are calculated for each band of the window.
The smaller the standard deviation for a given band, the greater the homogeneity of the
window. If the window passes the homogeneity test, it forms a cluster.
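The per-band homogeneity test can be sketched like this; the window values and the standard-deviation limit are illustrative assumptions:

```python
import numpy as np

def window_is_homogeneous(window, max_std):
    """Homogeneity test for a window shaped (bands, 3, 3): the smaller
    the standard deviation in each band, the more uniform the window."""
    stds = window.std(axis=(1, 2))      # one standard deviation per band
    return bool((stds <= max_std).all())

# Nearly uniform two-band 3x3 window (synthetic values)
window = np.full((2, 3, 3), 100.0)
window[0, 0, 0] = 102.0
print(window_is_homogeneous(window, max_std=5.0))  # True: passes the test
```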
c) ISODATA Clustering (Iterative Self-Organizing Data Analysis Technique): This process
repeatedly performs an entire classification and recalculates the statistics. The procedure
begins with a set of arbitrarily defined cluster means, usually located evenly through the
spectral space. After each iteration new means are calculated and the process is repeated
until there is little change between iterations. This method produces good results for
data that are not normally distributed and is also not biased by any section of the
image.
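The iterate-and-recompute loop can be sketched as below; this is a simplified version (full ISODATA also splits and merges clusters), and the pixel values, cluster count and iteration limit are assumptions:

```python
import numpy as np

def isodata_like(pixels, k, iters=10):
    """Simplified ISODATA-style loop: start from means spread evenly
    through the data range, then reclassify and recompute the means."""
    lo, hi = pixels.min(axis=0), pixels.max(axis=0)
    means = np.linspace(lo, hi, k)              # evenly spaced initial means
    for _ in range(iters):
        d = np.linalg.norm(pixels[:, None] - means[None, :], axis=2)
        labels = d.argmin(axis=1)               # reclassify every pixel
        for i in range(k):                      # recalculate cluster means
            if (labels == i).any():
                means[i] = pixels[labels == i].mean(axis=0)
    return labels, means

pixels = np.array([[0.0, 0.0], [1.0, 1.0], [10.0, 10.0], [11.0, 11.0]])
labels, means = isodata_like(pixels, k=2)
print(labels)  # the two natural groups are separated: [0 0 1 1]
```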
d) RGB Clustering: This is a quick method for three-band data. The algorithm plots all pixels in
spectral space and then divides this space into 32 x 32 x 32 clusters. A cluster is required
to have a minimum number of pixels to become a class.
Topic 7:
Application of Remote Sensing
• Remote sensing can be applied in almost any field. Some of the important
applications of remote sensing technology include:
a) Environmental monitoring and assessment (global warming etc.).
b) Land use and land cover global change detection and monitoring.
c) Prediction of agricultural yield and crop health monitoring.
d) Sustainable resource exploration and management.
e) Ocean and wetland studies.
f) Weather forecasting.
g) Defence and military surveillance.
h) Broadcasting and telecommunication.