Remote Sensing Course Outline

The document outlines a course on Remote Sensing (GEO 2404) aimed at providing students with a foundational understanding and skills in image interpretation and remote sensing techniques. It covers various topics including the electromagnetic spectrum, sensing systems, and applications of remote sensing in fields like forestry and agriculture. Additionally, it discusses the historical development of remote sensing technology and its advantages and limitations.

REMOTE SENSING

GEO 2404

DONNEX CHILONGA
CELL: 0999371020/0882756490
Email:
donnexc@[Link]/chilonga.d@[Link]
COURSE OUTLINE
Aim of Study
• To help students acquire a basic understanding of remote
sensing and skills in image interpretation

Objectives of the Course:


By the end of the course, students should be able to:
a) interpret aerial photo/satellite imagery
b) apply remote sensing techniques
Topics of Study
1. The Art and Science of Remote Sensing:
Definition of Remote Sensing;
Utilization of Light Energy in Remote Sensing

2. The Electromagnetic Spectrum (Solar Radiation):
Definition of Electromagnetic waves;
Characteristics of the electromagnetic radiation spectrum;
Regions of the Electromagnetic Spectrum and their wavelengths

3. Radiation and Temperature:
Features of Radiation from the Earth;
Features of Reflected Energy from the Earth;
The Earth as a Blackbody;
Temperature of the Earth and the Reflected Energy Band;
Energy Windows
Topics of Study…continued…
4. Interaction of Electromagnetic Radiation with Earth’s Surface:
Characteristics of incident radiation (absorption, reflection and scattering);
Behaviour of major surface types (water body, bare soil and vegetation)

5. Sensing Systems:
definition of sensor systems (passive systems and active systems);
short wave radiation sensors;
thermal infrared sensors;
passive microwave sensors;
radar sensing system;
orbiting earth satellites

6. Application of Remote Sensing:
Forestry;
Agriculture;
Urban planning;
Natural resources management
Prescribed Texts
Avery, T. E. and G. L. Berlin. Fundamentals of Remote Sensing and
Airphoto Interpretation, 5th edition. New Jersey: Prentice-Hall, 1985.
Campbell, J. Introduction to Remote Sensing, 3rd edition. New York:
The Guilford Press, 2002.

Recommended Texts
Fotheringham, Flint David, editor. Spatial Analysis and GIS. London:
Taylor and Francis, 1994.
Keates, J. S. Cartographic Design and Production, 2nd edition. Essex:
Harlow, 1989.
MacDougall, E. B. Computer Programming for Spatial Problems.
London: Edward Arnold, 1976.
Ritchie, W. Surveying and Mapping, new edition. Essex: Harlow, 1988.
Topic 1: The Art and Science of Remote Sensing
• All of us in this class have been involved, in one way or another, in collecting data,
processing it, analysing it, or using it for decision making. The acquired data may be
used to yield information for management purposes, e.g., disaster management,
natural resource management, land management, etc.
For class discussion: What is data? What is information?

• This course therefore focuses on the methods used to collect this particular geospatial data
or georeferenced data.
For class discussion: What is geospatial/georeferenced data?

• The spatial data would be important to different users in different ways. For example;
a) To an agronomist who would be interested in forecasting overall agriculture production.
b) To an urban planner who would be interested in identifying illegally built structures.
c) To a mining engineer who would be interested in providing map for surface minerology.
d) To a climatologist who would be interested to understand the cause of Cyclone Anna.
• All the above examples deal with spatial phenomena. Acquiring information for all
these examples requires a wide variety of methods.

• For the purposes of this course, we need to distinguish two main categories of spatial data
acquisition: ground-based methods and remote sensing methods.

Ground-based methods involve operating in the real-world environment: making field
observations, taking in-situ measurements, and performing land surveying.

Remote sensing methods involve the use of image data acquired by a sensor such as
an aerial camera, a scanner or a radar.
Definition of remote sensing
“The simplest way of understanding the term remote sensing can be related to the reading of
this sentence itself. Our eyes sense the written sentences on the page; our brain then
processes this information and interprets the logical meaning. This is how remote sensing
technology works”.
 Remote sensing is the science and art of obtaining information about an object, area or
phenomenon through the analysis of data acquired by a device that is not in physical
(intimate) contact with the object, area, or phenomenon under investigation.
 Remote sensing is the science of acquiring, processing and interpreting images that
record the interaction between electromagnetic energy and matter.
 Remote sensing is the instrumentation, techniques and methods used to observe the Earth’s
surface at a distance and to interpret the images or numerical values obtained in order to
acquire meaningful information about particular objects on Earth.
 Common to all definitions is that characteristics of the Earth’s surface are
acquired with a device that is not in contact with the object being measured.
Historical Overview of Remote Sensing
• Use of photography to study earth features started in 1838, when Wheatstone first
demonstrated the use of a reflecting mirror stereoscope to view trees on photographs. However,
Niepce, Talbot and Daguerre were the first to make an official disclosure of the existence of
photography in 1839.
• Airborne remote sensing was probably born in 1858 in France when the first known aerial
photograph was taken from a balloon by Tournachon. However, the earliest aerial
photograph still existing today was taken over Boston in the USA by Black in 1860.
• In 1903 the airplane was invented and in 1908 the first aerial photograph from an airplane
was taken in France. Systematic strip photography was introduced during WW I in 1917 and
by 1920 professional aerial photo interpretation was taking place.
• As early as 1922 vertical aerial photographs were being taken for forest surveys in Burma.
The aerial photographs were taken with an amount of overlap to allow 3-dimensional
viewing (stereovision). Until the early 1960s, the aerial photograph remained the single
standard tool for depicting the earth’s surface from a vertical or oblique perspective.
Historical Overview of Remote Sensing…continued…
• Satellite remote sensing started with the launch of the Russian satellite, Sputnik-1 in 1957.
In 1959 the first photo from outer space was taken by the American satellite Explorer-6.
From 1960 onwards there was systematic observation of the earth’s surface by the US
satellite TIROS-1 (Television Infrared Observation Satellite-1). The first US space station,
SkyLab, went into orbit in 1973. This was followed in the 1980s by the reusable US Space
Shuttle.
• In 1972 the US ERTS-1 (Earth Resources Technology Satellite, renamed Landsat-1) was
launched. This was the first of a series of Landsat satellites (up to Landsat-7), which have
become probably the most important earth resource satellites.
• Today, remote sensing is carried out using airborne methods as well as satellite technology.
Remote sensing not only uses film photography, but also digital cameras, scanners and
video, as well as radar and thermal sensors. In the past, remote sensing was limited to
what could be seen in the visual part of the electromagnetic spectrum. Today, the parts of
the spectrum which cannot be seen with the naked human eye can now be utilised through
special filters, photographic films and other types of sensors.
How is remote sensing useful (Advantages)?

i. It provides a unique perspective from which to observe large regions.
ii. Sensors can measure energy at wavelengths which are beyond the range of
human vision (ultraviolet, infrared, microwave).
iii. Monitoring is possible from nearly any site on earth.
iv. It allows digital processing of the received data.
v. It provides rapid and easily updated data collection methods.
vi. Areas of land or sea that are otherwise inaccessible can be monitored.
Limitation of Remote Sensing

• The main limitation is that measurements and derived models are
obtained with lower accuracy than in-situ observations or aerial
photography.
• Therefore, it is best practice to integrate remote sensing
methods with complementary technologies such as in-situ
observations.
Utilization of light energy in remote sensing
• Every remote sensing process involves an interaction between incident
radiation and the target of interest: the radiation incident on the target is
altered according to the physical properties of the target, and the
reflected radiation is recorded by the sensor.
• This is illustrated by the use of imaging systems (referred to as
optical remote sensing), where the following seven elements of
remote sensing are involved.
• It should also be noted that remote sensing involves the sensing
of emitted energy and the use of non-imaging sensors (referred to as
thermal remote sensing).
Elements/Components of a Remote Sensing System
i. Source of Illumination (A) - First requirement for any RS process is an energy source to provide EM energy
to the target of interest.
ii. Radiation and the Atmosphere (B) – as the energy propagates from its source to the target, it interacts
with the atmosphere as it passes through. This interaction may take place a second time as the energy
travels from the target and back to the sensor.
iii. Interaction with the Target (C) - once the energy makes its way to the target through the atmosphere, it
interacts with the target depending on the characteristics of both the target and the radiation.
iv. Recording of Energy by the Sensor (D) - after the energy has been scattered or emitted from the target, a
sensor is required to collect and record the electromagnetic radiation.
v. Transmission, Reception, and Processing (E) - the energy recorded by the sensor has to be transmitted,
often in electronic form, to a receiving and processing station where the data are processed into an image
(hardcopy and/or digital).
vi. Interpretation and Analysis (F) - the processed image is interpreted, visually or digitally or electronically,
to extract information about the target which was illuminated.
vii. End users and application (G) - the last element of RS process is achieved when the useful information is
extracted from the imagery to reveal some new information, or assist in solving a particular problem.
Topic 2: The Electromagnetic Energy/Radiation
and The Electromagnetic Spectrum

• Remote sensing relies on measurements of
electromagnetic (EM) energy, whose most
important source is the sun.
• Many sensors used in remote sensing measure:
a) reflected sunlight, while
b) some detect emitted energy and
c) some produce their own energy.
What is Electromagnetic radiation (EMR), (aka electromagnetic energy)
• EMR refers to all energy that moves with the velocity of light in the form of waves. The source of EMR is
the subatomic vibration of photons and is measured in terms of wavelength. Sun is the main source of EMR
that travels through space in the form of waves.
• Wavelength is the distance between successive wave crests and is represented by the Greek letter lambda
(λ). Its unit of measurement is the metre (m) or some fraction of a metre such as the nanometre (nm, 10^-9
m), micrometre (μm, 10^-6 m) or centimetre (cm, 10^-2 m).
• Frequency refers to the number of cycles of a wave passing a fixed point per unit of time and is normally
measured in hertz (Hz).
• Wavelength and frequency are inversely related to one another. As one increases, the other decreases. This
relationship is expressed through the general (basic) wave theory equation as:
c = νλ (therefore ν = c/λ and λ = c/ν)
Where: c = speed of light (3 × 10^8 m s^-1), λ is the wavelength (in m, cm, μm or nm) and ν is the
frequency in hertz (cycles/second).
• It can be inferred from the equation that radiation with a short wavelength has a high frequency and
hence carries high energy, whereas radiation with a long wavelength has a low frequency and hence carries
low energy.
[Figure: The wave nature of EM energy, showing propagation of electromagnetic waves at the speed of light]
What is Electromagnetic spectrum
• It is the distribution of electromagnetic radiation according to frequency and wavelength
(or energy) occurring as a continuum. It covers the entire range of photon energies
arranged in the increasing order of wavelengths on a logarithmic scale.

• The electromagnetic spectrum ranges from the shorter wavelengths (including gamma and
X rays) to the longer wavelengths (including microwaves and radio waves).

• All these forms of energy radiate in accordance with basic wave theory, c = νλ. This
theory describes EM energy as travelling in a harmonic, sinusoidal fashion at the velocity
of light c, measured in m s^-1.
The Electromagnetic spectrum
Regions of electromagnetic spectrum and wavelengths as used for Remote Sensing

Sensors on board earth resources satellites operate in visible, infrared and microwave
regions of the spectrum.

i. Ultraviolet: The UV region has short wavelengths (approximately 0.3 to 0.4 μm) and high frequency. UV
wavelengths are of limited use in remote sensing, as almost 80 to 90% of UV radiation is absorbed by ozone (O3);
they are instead used in geologic and atmospheric science applications. Geologic applications include mineral
exploration, since many rocks and minerals fluoresce, emitting visible light in the presence of UV radiation.

ii. Visible light: This is the portion of EMR to which our eyes are sensitive and is perceived as colors. The visible light
covers a very small portion of the spectrum, ranging from approximately 0.4 to 0.7 μm (micrometer).

 Violet: 0.4 to 0.446 μm
 Blue: 0.446 to 0.500 μm
 Green: 0.500 to 0.578 μm
 Yellow: 0.578 to 0.592 μm
 Orange: 0.592 to 0.620 μm
 Red: 0.620 to 0.700 μm
Regions of electromagnetic spectrum and wavelengths as used for Remote Sensing…cntd…
iii. Infrared: Covers wavelength range from approximately 0.7 μm to 100 μm.
The IR region is generally divided into two categories based upon their radiation characteristics i.e.
a. reflected IR and
b. the emitted or thermal IR.
• The reflected IR region is used in ways similar to radiation in the visible portion. It covers wavelengths
from approximately 0.7 μm to 3.0 μm. It is mainly employed for monitoring vegetation health, as well as
for distinguishing among vegetation, soil and rocks.

• The thermal IR differs from visible and reflected IR in that this energy is radiated or emitted from
the earth surface or objects and is sensed in the form of heat. It covers wavelengths from
approximately 3.0 μm to 100 μm. These are mainly used for monitoring the temperature variations of land,
water and ice.

iv. Microwave: This portion is of recent interest to remote sensing; its wavelengths range approximately
from 1 mm to 1 m. The shorter wavelengths have properties similar to the thermal infrared region, while
the longer wavelengths are used for radio broadcasts. Atmospheric disturbances are minimal and microwaves
penetrate clouds, making this region good for active sensors. Microwave remote sensing is used in
studies of meteorology, hydrology, oceans, geology, agriculture, forestry and soil moisture.
Regions of electromagnetic spectrum and wavelengths as used for Remote Sensing

Name                        Wavelength Range   Radiation Source     Surface Property of Interest
Visible (V)                 0.4 – 0.7 μm       Solar                Reflectance
Near InfraRed (NIR)         0.7 – 1.1 μm       Solar                Reflectance
ShortWave InfraRed (SWIR)   1.1 – 3 μm         Solar                Reflectance
MidWave InfraRed (MWIR)     3 – 5 μm           Solar, Thermal       Reflectance, temperature
Thermal InfraRed (TIR)      8 – 14 μm          Thermal              Temperature
Microwave, RADAR            1 mm – 1 m         Passive: Thermal     Temperature (passive)
                                               Active: Artificial   Roughness (active)
Topic 3: Radiation and Temperature

• For us to understand better the relationship between radiation and


temperature, we need to understand some radiation terminologies and
the associated electromagnetic laws.

• The concept of electromagnetic radiation involves various terminologies


that form the basis of many important radiation laws widely used in
remote sensing-based analysis and modelling.
Important Radiation terminologies
Radiant energy
• This is the energy carried by electromagnetic radiation and is
expressed in unit of Joules.
Radiant flux
• This is the radiant energy transmitted per unit time and is
expressed in watts (W). In other words, the rate at which
photons strike a surface is called the radiant flux.
Radiant intensity (Ie)
• This is the radiant flux radiated from a point source per unit solid
angle in a given direction and is expressed in units of W sr^-1
(where sr is the steradian, the unit of solid angle).
Irradiance (Ee)
• This is the radiant flux incident upon a surface per unit area
and expressed in a unit of Wm-2.
Radiant emittance (Radiant exitance ‘Me’)
• This is the radiant flux radiated from a surface per unit area
and expressed in a unit of Wm-2.
Radiance (Le)
• This is the radiant intensity per unit projected area in a given
direction and is expressed in units of W sr^-1 m^-2. Radiance also
describes the radiation field as dependent on the angle of view.
Electromagnetic Radiation Laws
• The electromagnetic energy follows certain physical laws as it
moves away from the source.
• Isaac Newton, in his theory, analyzed the dual nature of light
energy, exhibiting both discrete and continuous phenomena,
associated with a stream of minuscule (minute, tiny) particles
travelling in a straight line.
• This notion is consistent with the modern theories of Max Planck
(1858 – 1947) and Albert Einstein (1879 – 1955).
The concept of a Black Body
• All objects with temperature above absolute zero emit electromagnetic energy.
• The amount of energy and the associated wavelengths depend upon the temperature
of the object. As the temperature of an object increases, the quantum of energy
emitted also increases, and the corresponding wavelength of the maximum emission
becomes shorter.
• The above hypothesis can be expressed by using the concept of blackbody. A blackbody
is a hypothetical source of energy that behaves in an idealized manner such that it
absorbs all or 100% of the radiation incident upon it and emits back (or radiates) the
energy as a function of temperature.
• The Planck, Kirchhoff, Stefan-Boltzmann and Wien displacement laws explain the
relationship between temperature, wavelength, frequency and intensity of energy.
Planck’s Law
• This law gives the spectral radiance of a black body as a function of temperature.

• Any object with T > 0 K (above absolute zero) radiates energy.

• Planck ascertained that electromagnetic energy is absorbed and emitted in discrete units
called ‘photons’. The size of each unit is directly proportional to the frequency of the
radiation.

• Therefore, Planck’s theory proposed that electromagnetic energy can be quantified by its
wavelength and frequency; its intensity is expressed by ‘Q’ and is measured in joules.

• The energy released by a radiating body in the form of a vibrating photon travelling at the
speed of light can be quantified by relating the energy’s wavelength to its frequency.
Planck’s Law…continued…
• Planck defined a constant ‘h’ to relate frequency (ν) to radiant energy ‘Q’, expressed as:
Q = hν = hc/λ
where h is Planck’s constant (6.626 × 10^-34 J s).
Planck’s Law…continued…
• The above equation reveals that longer wavelengths have photons of
low energy, while for short wavelengths the photon energy is high.

• For instance, blue light, at the short-wavelength end of the visible
spectrum (0.446 to 0.500 μm), has higher-energy radiation, in contrast
to red light (0.620 to 0.700 μm) at the far end of the visible spectrum,
which has lower-energy radiation.
Kirchhoff’s Law
• Kirchhoff’s law states that the ratio of emitted radiation to absorbed radiation flux is the
same for all black bodies at the same temperature. It forms the basis of the term emissivity
(ε), which is defined as the ratio between the emittance of a given object (M) and that of a
blackbody at the same temperature (Mb):
ε = M / Mb

• The emissivity of a true blackbody is 1, and that of a perfect reflector (a white body) would be
zero. This implies that all real objects have emissivities between these two extremes. Objects that
absorb high proportions of incident radiation and re-radiate this energy have high emissivities,
whereas those which absorb less radiation have low emissivities, i.e., they reflect more of the
energy that reaches them.
Stefan-Boltzmann Law
• The Stefan-Boltzmann law defines the relationship between the total emitted
radiation (W) (expressed in watts m^-2) and temperature (T) (absolute
temperature, K):
W = σT^4

• The total radiation emitted from a black body is proportional to the fourth
power of its absolute temperature. The constant σ is the Stefan-Boltzmann
constant (5.6697 × 10^-8 watts m^-2 K^-4).
• In short, the Stefan-Boltzmann law states that hot blackbodies emit more energy
than cool blackbodies.
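A minimal numerical sketch of the Stefan-Boltzmann law. The optional emissivity factor generalises it to non-blackbodies via Kirchhoff's law; the temperatures used are arbitrary illustrative values.

```python
SIGMA = 5.6697e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def total_emittance(T_kelvin, emissivity=1.0):
    """W = eps * sigma * T^4: total emitted radiation per unit area."""
    return emissivity * SIGMA * T_kelvin ** 4

# Fourth-power dependence: doubling the absolute temperature
# increases the emitted radiation sixteen-fold.
ratio = total_emittance(600) / total_emittance(300)
```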
Wien’s Displacement Law
• This law specifies the relationship between the wavelength of emitted radiation and the
temperature of the object:

λmax = 2898 / T
• Where λmax is the wavelength (in μm) at which the radiance is at a maximum and T is the
absolute temperature in kelvin (K). As objects become hotter, the wavelength of maximum
emittance shifts to shorter wavelengths.
• This law is useful for determining the optimum wavelength for observing an object at
temperature T kelvin.
• Together, the Wien and Stefan-Boltzmann laws are powerful tools. With the help of these
laws, temperature and radiant energy can be determined from an object’s emitted
radiation. For example, the temperature distribution of large water bodies can be mapped by
measuring the emitted radiation; similarly, discrete temperatures over a forest canopy can
be detected to plan for and manage forest fires.
• An example illustrating the radiation laws:
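One such example can be sketched with Wien's displacement law. The 6000 K and 300 K figures are common textbook approximations for the sun and the earth, assumed here for illustration rather than taken from this course.

```python
def wien_peak_um(T_kelvin):
    """Wien's displacement law: wavelength of maximum emittance, in um."""
    return 2898.0 / T_kelvin

sun_peak = wien_peak_um(6000)    # ~0.48 um, in the visible region
earth_peak = wien_peak_um(300)   # ~9.7 um, in the thermal infrared
```

This illustrates why reflected solar energy is sensed in the visible and near-infrared, while the earth's own emission is sensed in the thermal infrared.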
Self-assessment exercise

1. Calculate the wavelength of the maximum energy emission for Mars, which has a
surface temperature of approximately 150 K, and for lava erupting from a volcano
at 900 K.

2. Which of the following wavelengths would you use to measure the brightness
temperature of sea surfaces, and why?

a) visible, b) short wave infrared, or c) thermal infrared?


Topic 4: Interaction of Electromagnetic Radiation
with the Atmosphere
• The atmosphere is a mixture of gases at different levels. The first 80 kilometres
contain more than 99% of the total mass of the Earth's atmosphere.
• The earth’s atmosphere transmits the electromagnetic radiation used for a variety of
remote sensing applications. However, the radiation is affected as it passes through the
different layers of the atmosphere. This is evident from the fact that the quality of an image
captured by an aircraft-borne sensor is less affected by the atmosphere than that of an
image captured by a satellite-borne sensor, whose radiation passes through the entire
depth of the earth’s atmosphere. Hence a fundamental knowledge of the interaction of
electromagnetic energy with the atmosphere forms an integral part of remote
sensing-based analysis of spatial data.
• The incoming electromagnetic radiation is affected by particles of various sizes and
gases (e.g., dust, smoke, haze and other atmospheric impurities) present in the
atmosphere in suspended form, causing significant changes in an acquired image
in terms of brightness and colour.
• The electromagnetic energy passing through the earth’s atmospheric layers is subjected to
alterations by two important physical processes, namely: (i) scattering, and (ii) absorption.
Scattering
• Scattering of electromagnetic radiation takes place when gas molecules and particles
present in the atmosphere interact with it and redirect it from its original path.

• The amount of scattering depends on several factors including the wavelength of the
radiation, the abundance and size of the particles or gases, and the distance the radiation
travels through the atmosphere.
Types of scattering

• Generally, there are three types of scattering which take place through the
earth’s atmosphere, namely;

a) Rayleigh scattering,

b) Mie scattering and

c) nonselective scattering.
Rayleigh scattering:
• Rayleigh scattering takes place when the suspended particles (mainly gas molecules or tiny
dust particles) are very small compared to the wavelength of the radiation. This type of
scattering causes shorter wavelengths of energy to be scattered much more than longer
wavelengths. The scattering is governed by the principle of the reciprocal of the fourth power
of the wavelength, expressed as:

Rayleigh scattering ∝ 1/λ^4, where λ is the wavelength in metres.

• This type of scattering is dominant in the upper layers of the atmosphere, where tiny dust
particles and gas molecules predominate. Rayleigh scattering is responsible for the blue colour
of the sky, since the gas molecules are much smaller than the wavelength of visible light and
blue light is scattered the most. Similar reasoning applies to the orange colour of the sky at
dusk: when the sun is low on the horizon, the incoming radiation traverses a longer path
through the atmosphere, so most of the blue light is scattered out of the beam and
predominantly the red light reaches the observer.
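The 1/λ^4 dependence can be illustrated numerically with round wavelength values for blue and red light (assumed figures for illustration, not measurements):

```python
def rayleigh_relative(wavelength_um):
    """Relative Rayleigh scattering intensity, proportional to 1/lambda^4."""
    return 1.0 / wavelength_um ** 4

# Blue (~0.45 um) is scattered roughly six times more strongly
# than red (~0.70 um), which is why the clear sky looks blue.
ratio = rayleigh_relative(0.45) / rayleigh_relative(0.70)
```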
Mie scattering:

• This type of scattering occurs when the particles are just the same size as the
wavelength of the radiation. Dust, pollen, smoke and water vapour are common
causes of Mie scattering wherein the longer wavelengths are scattered the
most. Mie scattering is more dominant in the lower layers of atmosphere (i.e.,
within 0 to 8 km). In the lower layers of atmosphere larger particles are in
abundance and influence a broad range of wavelengths in and near the visible
spectrum.
Nonselective scattering:

• This scattering phenomenon occurs when the particles are much larger than the
wavelength of the incoming radiation, leading to approximately equal scattering
of all wavelengths (i.e., blue + green + red light = white light). Water droplets
and large dust particles are mostly responsible for this type of scattering. It is
why clouds appear white, and why suspended water droplets give a blurry, white,
foggy appearance during winter seasons.
Absorption
• The gases ozone (O3), carbon dioxide (CO2) and water vapour (H2O) are responsible for
most of the absorption of electromagnetic radiation.

• Ozone forms when high-energy ultraviolet radiation interacts with oxygen molecules
(O2) present at an altitude of 20 to 30 km in the stratosphere. The ozone layer forms a
protective layer in the atmosphere by absorbing harmful UV radiation that might
otherwise cause skin burns or other severe skin diseases on exposure to sunlight.

• CO2 occurs in low concentrations (approximately 0.035% by volume of a dry atmosphere),
mainly in the lower layers of the atmosphere. CO2 effectively absorbs radiation in the mid and
far infrared regions (mostly in the range 13 to 17.5 μm) of the electromagnetic spectrum.

• Lastly, water vapour present in the lower atmosphere (concentration normally varies from
0 to 3% by volume) is more effective at absorbing radiation than the other
atmospheric gases. Two important regions of the spectrum, from 5.5 to 7.0 μm and
above 27 μm, are significantly absorbed, up to 75% to 80%.
The Atmospheric Windows
• These are regions or bands of the electromagnetic spectrum which are not severely
influenced by atmospheric absorption and thus are partially or completely transmitted
through the atmosphere onto the surface of the Earth.
• In other words, gas molecules present in the atmosphere selectively transmit radiation
of certain wavelengths, and those wavelengths that are relatively easily transmitted
through the atmosphere are referred to as atmospheric windows.
• Around 90 to 95% of the visible light passes through the atmosphere otherwise there
would never be bright sunny days on earth.
• The atmosphere is almost 100% transparent for certain wavelengths of the mid and near
infrared spectrum, which makes remote sensing analysis of satellite images in these
regions possible with minimal distortion. The thermal infrared range from 10 to 12
μm is used in measuring surface temperatures of the ground, water and clouds. Ozone
blocks ultraviolet radiation almost completely, and almost all radiation in the range of 9.5
to 10 μm is absorbed.
Self-assessment exercise

1. Explain why most remote sensing sensors avoid detecting and recording wavelengths
in the ultraviolet portion of the spectrum.

2. What do you think would be some of the best atmospheric conditions for remote
sensing in the visible portion of the spectrum?
Interaction of EMR with the Earth’s Surface
• Electromagnetic radiation that is not completely absorbed or scattered in the atmosphere
travels through the entire depth of the atmosphere before finally reaching the earth’s
surface.

• Three types of interaction take place when the radiations are incident (I) upon the surface;

a) reflection (R), b) absorption (A); and c) transmission (T).

• The total energy of radiation incident upon the surface follows the law of conservation of
energy, or the energy balance, written as:

Ei = Er + Ea + Et

• Where Ei is the incident energy, Er is the reflected energy, Ea is the absorbed energy and Et
is the transmitted energy.

• The type and degree of interaction of the radiation varies according to the size
and surface roughness of different objects, as well as with wavelength.
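The energy balance above can be sketched as a simple bookkeeping function. The numbers are arbitrary illustrative values, not measurements of any real surface.

```python
def energy_balance_fractions(E_r, E_a, E_t):
    """Return (reflectance, absorptance, transmittance) as fractions of
    the incident energy E_i = E_r + E_a + E_t."""
    E_i = E_r + E_a + E_t
    return E_r / E_i, E_a / E_i, E_t / E_i

# Illustrative split of incident energy over a surface:
r, a, t = energy_balance_fractions(30.0, 50.0, 20.0)
# By conservation of energy, the three fractions always sum to 1.
```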
Reflection:
• Reflection occurs when radiation bounces off a target (mostly opaque surfaces) and is
redirected. The degree of redirection depends upon the magnitude of the surface roughness
compared to the wavelength of the incident radiation.
Types of reflection

• Two types depending upon the characteristics of the surface or feature of interest.

i. Specular reflection takes place when the surface is smooth and all (or almost all)
of the energy is directed away from the surface in a single direction. The angle of
reflection is equal to the incident angle (e.g., a mirror, a still water body or a smooth
metal surface). This type of reflection results in very bright spots (hot spots) on the image.

ii. Diffuse reflection occurs over rough surfaces, where incident radiation is reflected almost uniformly in all directions. If the wavelengths are much smaller than the surface roughness variations, diffuse reflection will dominate (e.g., a loam soil appears fairly smooth to long-wavelength microwaves but rough at visible wavelengths).
Transmission:
• Transmission of radiation takes place when radiation passes through an object or feature without considerable attenuation. For a known thickness of the object, the ability of a medium to transmit energy is measured as its transmittance (t).

• Depending on the characteristics of the medium, the velocity and wavelength of the radiation change during transmission, whereas the frequency remains the same. The transmitted energy may be further scattered and/or absorbed in the medium.
Absorption

• This occurs when radiation is absorbed by the target. The portion of the EM energy absorbed by the Earth's surface becomes available for emission as thermal radiation at longer wavelengths. (Therefore, absorbance in the energy balance equation can sometimes be represented as emissivity.)


Spectral Response Curves and Spectral Signatures
• The proportions of energy reflected, absorbed, and transmitted (called spectral response)
will vary for different earth features, depending on their material type and condition.
Objects with similar physical properties will have a similar spectral response to EM energy
while those with different properties will have different responses. These differences permit
us to distinguish different features on an image.

• Furthermore, the spectral response of a feature is wavelength dependent, meaning that the proportions of reflected, absorbed, and transmitted energy vary at different wavelengths. Thus, two features may be indistinguishable in one spectral range and be very different in another wavelength band. E.g., water and vegetation reflect nearly equally in visible wavelengths, yet they are separable in near-IR wavelengths.

• Within the visible portion of the spectrum, these spectral variations result in the visual
effect called colour. E.g., vegetation reflects more highly in the green portion of the
spectrum and therefore appears green. About 95% of blue and red light are absorbed by
chlorophyll for photosynthesis while most of the green component is reflected.
Spectral reflectance (Rλ)
• This is the reflectance characteristics of earth surface features expressed as the ratio of
energy reflected by the surface to the energy incident on the surface, measured as a
function of wavelength. It is also known as albedo of the surface. It may vary from 0-100%.

• Spectral reflectance is an important parameter in determining spectral signature of features.

• Spectral Reflectance/radiance of various Earth surface features is as follows:


Spectral response curve
• This is a graphical representation of spectral response (reflectance
characteristics) of an object over different wavelengths of the electromagnetic
spectrum. These curves give an insight into the spectral characteristics of
different objects, hence used in the selection of a particular wavelength band
for remote sensing data acquisition. The graph plots the various wavelengths (μm) of the EM spectrum on the x-axis against the amount of reflectance (%) recorded by the R.S. system on the y-axis.

• A spectral reflectance curve shows a "peak-and-valley" configuration. High reflectance of a particular wavelength by a feature produces a peak in the graph, while low reflectance produces a dip or valley in the curve. The peaks indicate strong reflection of incident energy and the valleys indicate predominant absorption of energy in the corresponding wavelength bands.
Spectral signatures
• This is the unique spectral response or reflectance characteristic of a feature or object, which depends on the properties of that object.

• Behaviour of major surface types: water body, bare soil and vegetation.
Topic 5:
Remote Sensing Systems
• In Chapter 4, we looked at the underlying principles of
remote sensing. In this chapter, we shall look at how sensors
record and measure the electromagnetic energy reflected or
emitted by an object.
• Please refer to the diagram of components of RS as shown in
chapter 1.
What is a sensor?
• A sensor is a device that detects, measures and records
electromagnetic energy.
Types of sensors
There are two types of sensors:
a) passive and
b) active sensors.
Passive sensors
• These are sensors that depend on an external source of energy, usually the Sun.

• They can only be used to detect energy when the naturally occurring energy is
available (during the day).

• Examples include a camera without a flash; image-plane scanning sensors such as TV cameras; along-track scanning systems (e.g., SPOT HRV); across-track optical-mechanical multispectral scanners (e.g., LANDSAT MSS, TM); and scanning microwave radiometers.
Active sensors
• Active sensors rely on their own sources of energy for illuminating
objects.

• For example:

a. a camera with a flash light;

b. LIDAR or RADAR, such as synthetic aperture radar (SAR), which can produce high-resolution imagery, day or night, even under cloud cover.
A sensor platform
• This is a stand on which sensors are mounted.

• For example, consider our eyes as natural sensors and the head being
a natural platform.

• Platforms for remote sensors may be situated

a) on the ground,
b) on an aircraft or balloon within the atmosphere, and
c) on a spacecraft or satellite outside the Earth's atmosphere.
Airborne Remote Sensing – Aerial Photography

• The term "photography" is derived from two Greek words meaning "light"
(phos) and "writing" (graphien).

• An aerial photograph, in broad terms, is any photograph taken from the air, usually from an aircraft flying at an altitude ranging from one to nine kilometres, using a highly accurate camera.

• Most air photo missions are flown using black and white film. However
colour, infrared, and false-colour infrared film are sometimes used for
special projects.
Types of photography
Two types of photography

1. Vertical: produced with the camera aligned perpendicular to the ground. This is the most desired type of photograph.

2. Oblique: produced with the camera tilted at an angle.
HIGH OBLIQUE vs. LOW OBLIQUE

See the horizon. Any difference? (The horizon is visible in a high oblique photograph but not in a low oblique one.)


Basic Concepts and Terminologies of Aerial
Photography
Film

• Film: most air photo missions are flown using black and
white film, however colour, infrared, and false-colour
infrared film are sometimes used for special projects.

• The film, usually black and white, is the light-sensitive medium used to produce the images.
Focal Length
• Is the distance from the
middle of the camera lens to
the focal plane (i.e. the film).

• As focal length increases, image distortion decreases. The focal length is precisely measured when the camera is calibrated.
Flight lines or Flight strips (Swath)
• Flight Line: The flight path of
the airplane carrying the camera.

• Flight Strip: The strip of


photographs produced from a
single flight.

• Flight Plan: The aerial


photography operational
procedure in which the flight
objectives and the performance
criteria are specified.
Forward overlap and Sidelap
• Overlap: is the amount by which area
of one photograph includes the area
already captured in another
photograph.

• Forward overlap (end lap): the overlapping of successive photos along a flight strip.

• Sidelap: is the overlap of adjacent


flight strips.
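The forward overlap fixes how far apart successive exposures must be. A minimal sketch, where the ground coverage per photo and the overlap fraction are assumed example values:

```python
# Spacing between successive exposures (the air base) implied by the
# forward overlap. Both input values here are hypothetical examples.
ground_coverage_m = 2300.0    # ground distance covered by one photo side (assumed)
forward_overlap = 0.60        # 60% end lap, typical for stereoscopic coverage

# Each new photo only needs to add the (1 - overlap) fraction of new ground.
air_base_m = ground_coverage_m * (1 - forward_overlap)
print(round(air_base_m, 1))   # 920.0 m between successive photo centres
```

The larger the required overlap, the more closely spaced the exposures, and hence the more photographs needed to cover a flight line.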
Stereoscopic coverage
• Is the three-dimensional view
which results when two
overlapping photos (called a stereo
pair), are viewed using a
stereoscope.

• Each photograph of the stereo pair


provides a slightly different view of
the same area, which the brain
combines and interprets as a 3-D
view.
Roll and Photo Numbers
• Each aerial photo is assigned a unique

index number according to the photo's

roll and frame. E.g., photo A23822-35 is

the 35th annotated photo on roll A23822.

• This identifying number allows you to find

the photo in archive, along with metadata

information such as the date it was taken,

the plane's altitude, the focal length of

the camera, and the weather conditions.


Fiducial marks

• These are four small


special registration
marks found in the
corners or in the middle
of the edges of each AP.

• These V-shaped notches


are used to locate the X
and Y axes of the photo.
Principal point

• Is the point where the


perpendicular projected
through the centre of the
lens intersects the photo
image.

• The intersection of the X


and Y axes.

• It is the centre of the photo.


Nadir point
• The point where an imaginary vertical line from the camera lens towards the Earth's centre passes through the photograph. This line meets the ground directly beneath the camera, and the trace of such points connects the image centres of successive vertical photographs.

• It is the point vertically beneath


the camera centre at the time of
exposure.
Isocentre

• This is the point equidistant from the nadir point and the principal point; it falls halfway along the line between them.

• If the camera is exactly vertical, the principal point, nadir point, and isocentre coincide at a single point.

• In oblique and high oblique APs, the principal point, nadir point, and isocentre are widely separated due to tilt displacement.
Scale
• Scale of an AP expresses the mathematical relationship between distance measured on
the photo and the corresponding horizontal distance measured on the ground.

• The most straightforward method for determining photo scale is to measure the
corresponding photo and ground distances between any two points.

• The scale S is then computed as the ratio of the photo distance d to the ground distance
D.
S=d/D
Example:
• Calculate the scale of an AP if a 1km stretch of highway covers 4cm on an air photo.
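A worked solution to the example above, using the numbers given (4 cm on the photo, 1 km on the ground):

```python
# Photo scale S = d / D, with both distances in the same units.
photo_distance_cm = 4.0
ground_distance_cm = 1 * 100_000   # 1 km = 100,000 cm

scale = photo_distance_cm / ground_distance_cm
print(f"S = 1:{round(1 / scale):,}")   # S = 1:25,000
```

The key step is converting both distances to the same unit before dividing.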
Scale as a function of focal length & terrain elevation
• For a vertical photograph taken over flat terrain, scale is a function of the
focal length f of the camera and the flying height above the ground H’.

S = f / H’
• If flying height above sea level H and elevation of the terrain h are known,

S = f / (H – h)
Use of average scale
• The equation above illustrates that photo scale is a function of terrain elevation.

• Photographs taken over terrain of varying elevation will exhibit a continuous range of scales. The same applies to tilted and oblique photographs.

• It is therefore convenient to compute an average scale for an entire


photograph. This scale is calculated using the average terrain elevation.

Savg = f / (H – havg)

• The result of photo scale variation is geometric distortions and relief displacements.
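The scale formulas above can be evaluated directly. A minimal sketch; the focal length and heights below are assumed illustration values, not from the text:

```python
# Average photo scale from camera focal length and flying height,
# using S_avg = f / (H - h_avg). All input values are hypothetical.
focal_length_m = 0.152           # 152 mm, a common mapping-camera focal length
flying_height_asl_m = 3200.0     # H: flying height above sea level (assumed)
avg_terrain_elev_m = 200.0       # h_avg: average terrain elevation (assumed)

avg_scale = focal_length_m / (flying_height_asl_m - avg_terrain_elev_m)
print(f"S_avg = 1:{round(1 / avg_scale):,}")   # S_avg = 1:19,737
```

Note that raising the terrain (or lowering the aircraft) shrinks the denominator and therefore enlarges the photo scale, which is exactly why varying relief produces a range of scales within one photograph.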
Distortion and Displacement
Distortion

• A shift in the location of an object that changes the perspective characteristics of the photo.

Displacement

• A shift in the location of an object in a photo that does not change the perspective characteristics of the photo (the fiducial distance between an object's image and its true plan position, caused by changes in elevation).
Types of distortions

Four types of distortions

1. Film and Print Shrinkage;

2. Atmospheric refraction of light rays;

3. Image motion; and,

4. Lens distortion.
Types of displacement

Three types of displacement

1. Curvature of the Earth;

2. Tilt; and,

3. Topographic or relief (including object height).

NOTE:

Both distortion and displacement cause changes in the apparent location


of objects in photos.
Taking Vertical Aerial Photographs
• It is very difficult to get aerial photographs that are exactly vertical due to the unavoidable tilt of the aircraft taking the APs.

• Vertical APs are taken along flight lines or flight strips, which are often parallel. GPS is used to control the flight line direction.

• Successive photographs are generally taken with an overlap of about 50% to 60% in the direction of the flight line (forward overlap). This ensures total stereoscopic coverage along a flight line.

• When the aim of the photography is to cover the whole target area
completely, the ground areas of adjacent flight lines overlap by 15-40%.
This is called sidelap.
Discussion questions

1. How is the Principal Point determined from aerial photographs?

2. What aspects of photographic geometry cause differences between


Nadir and the Principal Point?

3. Why are most aerial photographs taken at a slightly tilted angle rather than an exactly vertical position?

4. What two aspects of aerial photography are used to define the scale of a
photograph, and how are they calculated?
Space (Satellite) Remote Sensing
• This became feasible with the launch of USSR satellite Sputnik-1 in 1957.

• Today, there is a wide choice of satellites data sources but for this course, we shall
concentrate on two.
1. The Earth Resource Satellites (Earth Observation Satellites).
2. The Environmental Satellites.
Advantages of Satellite Images over Aerial Photographs
i. Satellite data coverage has no limitation (space or political boundaries)
ii. The data are homogeneous
iii. Spatially continuous
iv. Data already in digital format
v. Data collection frequency is high with no limitation
vi. Lower cost than aerial photography
An orbit

• An orbit is the path followed by a satellite as it moves around the earth.

• There are two orbits used: near polar sun synchronous and geostationary.

• Near polar & sun synchronous orbit is where the orbital plane is inclined at a small
angle with respect to the earth's rotation axis, and the satellite travels northwards on
one side of the Earth and then toward the southern pole on the second half of its orbit.

• Geostationary orbit is where a satellite follows an orbit parallel to the equator, in the same direction as the earth's rotation, and views the same portion of the Earth's surface at all times. The satellite is stationary with respect to the earth's surface.
Characteristics of sensors on satellites
 Spectral resolution: describes the ability of a sensor to define fine wavelength intervals.
The finer the spectral resolution, the narrower the wavelength range for a particular
channel or band (panchromatic, multi-spectral and hyper-spectral sensors).

 Radiometric resolution: describes the sensor's ability to discriminate very slight differences in energy. The finer the radiometric resolution of a sensor, the more sensitive it is to detecting small differences in reflected or emitted energy.

 Spatial resolution: describes the area of the earth that each pixel represents (the size of
the smallest possible feature that can be detected).

 Temporal resolution: refers to the frequency of repeated observation over an area and is equal to the time interval between successive observations of the same area.
Using the characteristics above, the most common land observation
satellites can be grouped as follows.
In terms of the spatial resolution, the satellite imaging systems can be classified into:

 Low resolution systems (approx. 1 km or more)

 Medium resolution systems (approx. 100 m to 1 km)

 High resolution systems (approx. 5 m to 100 m)

 Very high-resolution systems (approx. 5 m or less)


In terms of the spectral regions used in data acquisition, the satellite
imaging systems can be classified into:
 Optical imaging systems (include visible, near infrared, and
shortwave infrared systems)
 Thermal imaging systems
 Synthetic aperture radar (SAR) imaging systems
Optical/thermal systems can be classified according to the number of spectral bands used:

 Panchromatic imaging systems: the sensor is a single-channel detector sensitive to radiation within a broad wavelength range. The physical quantity being measured is the apparent brightness of the targets. Examples are IKONOS Pan, QuickBird Pan, SPOT Pan, and LANDSAT ETM+ Pan.

 Multi-spectral imaging systems: the sensor is a multi-channel detector with a few spectral bands. Each channel is sensitive to radiation within a narrow wavelength band. Examples are Landsat TM, MSS, SPOT HRV-XS, IKONOS MS, and QuickBird MS.

 Super-spectral/Hyper-spectral imaging systems: these acquire images in many more spectral channels than a multi-spectral sensor (from tens to hundreds of spectral bands). Examples of such systems are MODIS and MERIS.
Summary of most important satellite systems/programs

We will briefly look at the following satellite programmes


1. Landsat Satellite Programme
2. SPOT (Systeme Pour l’Observation de la Terre) Satellite Program
3. Meteorological Satellites
Landsat Satellite Programme

Satellite   Launch         Decommissioned       RBV   MSS   TM     Repeat Cycle
Landsat-1   July 23, 1972  January 6, 1978      √     √     X      18 days
Landsat-2   Jan 22, 1975   February 25, 1982    √     √     X      18 days
Landsat-3   Mar 5, 1978    March 31, 1983       √     √     X      18 days
Landsat-4   July 16, 1982                       X     √     √      16 days
Landsat-5   Mar 1, 1984                         X     √     √      16 days
Landsat-6   Oct 5, 1993    Failure on launch    X     X     ETM    16 days
Landsat-7   Apr 15, 1999                        X     X     ETM+   16 days
Sensors aboard the Landsat Systems

(a) The Return Beam Vidicon (RBV) Camera

(b) The Multi-spectral Scanner (MSS)

(c) The Thematic Mapper (TM)

(d) The Enhanced Thematic Mapper (ETM)

(e) The Enhanced Thematic Mapper Plus (ETM+)


Landsat 1,2,3 Landsat 4,5
SPOT (Systeme Pour l’Observation de la Terre) Program

Satellite   Launch         Decommissioned       HRV   HRVIR   Veg   Repeat Cycle
SPOT-1      Feb 21, 1986   December 31, 1990    √     X       X     26 days
SPOT-2      Jan 21, 1990                        √     X       X     26 days
SPOT-3      Sep 25, 1993                        √     X       X     26 days
SPOT-4      Mar 23, 1998                        X     √       √     26 days

Sensors aboard the SPOT Systems

1. High Resolution Visible (HRV) Imaging Systems.

2. High Resolution Visible and Infrared (HRVIR).


SPOT Configuration
Meteorological Satellites
• Designed specifically for weather
prediction and monitoring.
• Their sensors have very low spatial
resolution but they offer the advantage of
a high temporal resolution.
Examples
1. NOAA Satellites (National Oceanic and
Atmospheric Administration). They
contain Advanced Very High-Resolution
Radiometer (AVHRR).
2. GOES Satellites (The Geostationary
Operational Environmental Satellites).
GOES image of the Western Hemisphere
Topic 6:
Digital Image Processing and Analysis
Data Acquisition and Interpretation

• The detection of EM energy can be performed either photographically or electronically.

• Photographic systems are relatively simple and cheap and provide a high degree of spatial detail.

• Electronic sensors offer the advantages of a broader spectral range of sensitivity, improved calibration potential, and the ability to electronically store and transmit data.
Picture elements/Pixels
• The basic character of a digital image is its two-dimensional array of discrete picture elements, or pixels.

• The intensity of each pixel corresponds to the average brightness, or radiance, measured electronically over the ground area corresponding to that pixel. Each pixel has a digital number (DN) representing its radiance.

• Typically, the DNs are recorded over such numerical ranges as 0 to 255, 0 to 511, 0 to 1023, or higher. These ranges represent the sets of integers that can be recorded using 8-bit, 9-bit, or 10-bit binary computer coding scales, respectively (N.B. 2^8 = 256, 2^9 = 512, 2^10 = 1024). Note that the "bright" pixels have high number values (e.g., 200 to 255), while the "dark" pixels have low number values (e.g., 50 to 100).
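The quoted DN ranges follow directly from the sensor's bit depth: an n-bit encoding can represent 2^n integer levels. A one-line check:

```python
# DN ranges implied by sensor bit depth: n bits give 2**n integer levels,
# numbered 0 through 2**n - 1.
for bits in (8, 9, 10):
    levels = 2 ** bits
    print(f"{bits}-bit: DNs 0 to {levels - 1} ({levels} levels)")
# 8-bit: DNs 0 to 255 (256 levels)
# 9-bit: DNs 0 to 511 (512 levels)
# 10-bit: DNs 0 to 1023 (1024 levels)
```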
Interpretation & analysis of remote sensing data
• This involves the identification and measurement of various targets in an image in order to
extract useful information about them.
Two main methods can be used:
1. visual interpretation: performed by a human interpreter, and is based on
feature tone (color), pattern, shape, texture, shadow and association.
2. digital processing and analysis: performed using a computer (without
manual intervention by a human interpreter).
• Digital processing and analysis is not carried out as a replacement for manual interpretation; rather, it is done to supplement and assist the human analyst. Computer analysis techniques are more limited in their ability to evaluate spatial patterns than visual methods. Therefore, visual and numerical techniques are complementary in nature.
Digital Image Processing and Analysis
• This is a collection of techniques for the manipulation of digital images by
computers.
Four most common image processing functions:

1. Preprocessing

2. Image Enhancement

3. Image Transformation

4. Image Classification and Analysis


1. Pre-Processing of the Remotely Sensed Images

• Pre-processing processes refer to those operations that are preliminary to the main
analysis.

• These techniques involve removal of unwanted and distracting elements in an attempt to "restore" the digital data to their correct or original condition, thereby presenting a faithful representation of the earth's surface. (Of course, we never know what the correct values were, and must always remember that attempts to correct data may actually introduce errors.)

• Preprocessing processes include: feature extraction, radiometric corrections, geometric


corrections, and atmospheric corrections.
a. Feature Extraction
• This does not mean extracting the geographical features visible on the image, but rather the "statistical" characteristics of the image data: individual bands or combinations of band values that carry information concerning systematic variation within the scene.

• After feature extraction is complete, the analyst can work with only the desired channels or bands, which concentrate the useful information. Such pre-processing increases the speed and reduces the cost of analysis.
b. Radiometric Corrections
• This aims to reconstruct physically calibrated values by correcting the
spectral errors and distortions caused by sensors, sun angle, topography
and the atmosphere.

• When images are recorded by the sensor, they contain errors in the
measured brightness values of the pixels. Radiometric processing
influences the brightness values of an image to correct for sensor
malfunctions or to adjust the values to compensate for atmospheric
degradation.
c. Geometric Corrections
• Raw digital images often contain serious geometrical distortions that arise from
earth’s curvature, platform motion, relief displacement, and non-linearities in
scanning motion.

• This can be done through rectification and registration.

a) Rectification is the process of projecting image data onto a plane and making it
conform to a map projection system.

b) Registration is the process of making image data conform to another image.

• Both involve rearrangement of the input pixels onto a new grid which conforms to the desired map projection and coordinate system.
Two steps involved in geometric registration process
 identifying the image coordinates (i.e. row, column) of several clearly discernible points, called ground control points (GCPs), in the distorted image and matching them to their true positions in ground coordinates (e.g., latitude and longitude measured from a map).

 resampling: this process is used to determine the digital values to place in the new pixel
locations of the corrected output image.

Three common methods for resampling

a) nearest neighbour,

b) bilinear interpolation, and

c) cubic convolution (Lillesand et al., 2008)
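Of the three methods, nearest neighbour is the simplest: each output pixel takes the DN of the closest input pixel. A minimal sketch on a made-up 2x2 grid, using a simple integer index mapping:

```python
import numpy as np

# Nearest-neighbour resampling sketch: each output pixel takes the value
# of the nearest input pixel. The input 'image' is a made-up 2x2 grid.
def resample_nearest(image, out_rows, out_cols):
    in_rows, in_cols = image.shape
    # Map each output row/column index back to the nearest input index.
    r_idx = np.arange(out_rows) * in_rows // out_rows
    c_idx = np.arange(out_cols) * in_cols // out_cols
    return image[np.ix_(r_idx, c_idx)]

img = np.array([[10, 20],
                [30, 40]])
out = resample_nearest(img, 4, 4)
print(out)   # each input DN is simply repeated in a 2x2 block
```

Because input DNs are copied unchanged, nearest neighbour preserves the original radiometry, which is why it is often preferred before classification; bilinear interpolation and cubic convolution instead compute weighted averages of neighbouring DNs.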


d. Atmospheric Corrections
• The output from the instrument on a satellite depends on the intensity and spectral distribution of the energy received at the satellite.

• This energy has travelled some distance through the atmosphere and has accordingly suffered both attenuation and augmentation in the course of the journey, thereby degrading the output.

• Rectifying the image data for the degrading effects of the atmosphere
entails modeling the scattering and absorption processes that take place.
2. Image Enhancement Techniques
• Performed to make satellite imageries more informative. Involves the alteration of the
appearance of an image in such a way that the information contained in that image is
more readily interpreted visually in terms of a particular need.
Examples of enhancement functions include
 contrast stretching to increase the tonal distinction between various features in a
scene. E.g., a linear contrast stretch, a linear contrast stretch with saturation, a
histogram-equalized stretch.
 filtering is commonly used to restore imagery by removing noise, to enhance the imagery for better interpretation, and to extract features such as edges and lineaments. The most common types of filters are: mean, median, low-pass, high-pass, and edge detection filters.
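A minimal sketch of the linear contrast stretch mentioned above: the observed DN range is mapped linearly onto the full 8-bit display range. The input band is a made-up low-contrast example:

```python
import numpy as np

# Linear contrast stretch: map the observed DN range [min, max] onto the
# full 8-bit display range [0, 255]. The input band is a toy example.
def linear_stretch(band):
    dn_min, dn_max = band.min(), band.max()
    stretched = (band - dn_min) / (dn_max - dn_min) * 255
    return np.round(stretched).astype(np.uint8)

band = np.array([[60, 80],
                 [100, 120]])   # a low-contrast toy image (DNs span only 60-120)
out = linear_stretch(band)
print(out)                      # DNs 60..120 spread out to 0..255
```

The narrow input range 60 to 120 is spread across the whole 0 to 255 range, increasing the tonal distinction between features without changing their relative ordering.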
3. Image Transformation
• Involves combined processing of data from multiple spectral bands. Arithmetic operations
(i.e., subtraction, addition, multiplication, division) are performed to combine and
transform the original bands into "new" images which better display or highlight certain
features in the scene.

• Some of the most common transforms applied to image data are:

 image ratioing: this method involves dividing (or differencing combinations of) two or more bands, aimed at enhancing target features, or
 principal components analysis (PCA): Aims to reduce the dimensionality (i.e., the
number of bands) in the data, and compress original bands into fewer bands.
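As an illustration of band ratioing, the widely used Normalized Difference Vegetation Index (NDVI) combines the near-IR and red bands; the reflectance values below are made up:

```python
import numpy as np

# Band-ratio sketch: NDVI = (NIR - Red) / (NIR + Red). Healthy vegetation
# reflects strongly in near-IR and weakly in red, so its NDVI is high.
nir = np.array([[0.45, 0.50],
                [0.10, 0.08]])   # near-infrared reflectance (made-up values)
red = np.array([[0.08, 0.06],
                [0.09, 0.07]])   # red reflectance (made-up values)

ndvi = (nir - red) / (nir + red)
print(ndvi.round(2))   # high values suggest vegetation; near zero, soil/water
```

This is exactly the kind of "new" image a transform produces: the two original bands are combined into a single layer that highlights one target feature (here, vegetation).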
4. Image Classification

• Is the process of sorting all the pixels in an image into a finite number of
individual classes.
• Is the process of categorizing all pixels in an image or raw remotely sensed satellite data to obtain a given set of labels or land cover themes.

• Refers to the task of assigning classes (defined in a land cover and land use classification system, known as the schema) to all the pixels in a remotely sensed image.
Two types of digital image classification
a) supervised and
b) unsupervised.
a. Supervised Classification
• In this system, the analyst supervises the categorization of the pixels by specifying to the computer algorithm numerical descriptors of the various class types.

• Supervised classification gives the user more control over the classification process because the analyst manually selects the training data and assigns them to the correct classes.

• It is mainly a human-guided classification

• Two basic steps (stages) involved in typical supervised classification.

(a) Training Stage

(b) Classification Stage


a. Training Stage

• The analyst identifies the training areas and develops a numerical description of the spectral attributes of each class or land cover type.
b. Classification Stage

• Each pixel is categorized into the land cover class which it most closely resembles. If the pixel is not similar to any of the training data, it is labeled as unknown.

Categories of classification stage:

(i) Measurements on Scatter Diagram

(ii) Minimum Distance to Means Classifier

(iii) Parallelepiped Classifier

(iv) Gaussian Maximum Likelihood Classifier


(i) Measurements on Scatter Diagram: Each pixel value is plotted on a scatter diagram, and its position in spectral space indicates the category of the class.
(ii) Minimum Distance to Means Classifier: The mean vector for each category is determined from the average DN in each band for each class. An unknown pixel can then be classified by computing the distance from its spectral position to each of the means and assigning it to the class with the closest mean.

(iii) Parallelepiped Classifier: For each class, an estimate of the maximum and minimum DN in each band is determined. Parallelepipeds are then constructed so as to enclose the scatter in each theme. Each pixel is then tested to see if it falls inside any of the parallelepipeds.
(iv) Gaussian Maximum Likelihood Classifier: This method determines the variance and covariance of each theme, providing a probability function. This is then used to classify an unknown pixel by calculating, for each class, the probability that the pixel lies in that class. The pixel is then assigned to the most likely class or, if its probability value fails to reach a defined threshold in any of the classes, it is labeled as unclassified.
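The minimum-distance-to-means classifier described in (ii) can be sketched in a few lines. The class means below are hypothetical training statistics for a three-band image, not real values:

```python
import numpy as np

# Minimum-distance-to-means classifier sketch: assign a pixel to the class
# whose mean spectral vector is closest in spectral space. The class means
# are hypothetical training statistics for three bands.
class_means = {
    "water":      np.array([30.0, 20.0, 10.0]),
    "vegetation": np.array([40.0, 80.0, 150.0]),
    "bare soil":  np.array([90.0, 100.0, 110.0]),
}

def classify(pixel):
    # Euclidean distance from the pixel's spectral position to each class mean.
    distances = {name: float(np.linalg.norm(pixel - mean))
                 for name, mean in class_means.items()}
    return min(distances, key=distances.get)

print(classify(np.array([35.0, 75.0, 140.0])))   # vegetation
```

A threshold on the minimum distance could be added so that pixels far from every mean are labeled unknown, mirroring the "unknown" rule in the classification stage above.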
b. Unsupervised Classification
• This system of classification does not utilize training data as the basis of
classification. It is automated and does not require any user input.

• This classifier involves algorithms that examine the unknown pixels in the
image and aggregate them into a number of classes based on the natural
groupings or clusters present in the image.

• This method is usually used when there is less information about the data
before classification. There are several mathematical strategies applied in
unsupervised classification:
Strategies applied in unsupervised classification
a) Sequential Clustering: In this method the pixels are analysed one at a time, pixel by pixel and line by line. The spectral distance between each analysed pixel and previously defined cluster means is calculated. If the distance is greater than some threshold value, the pixel begins a new cluster.

b) Statistical Clustering: The algorithm uses 3x3 windows in which all pixels have similar vectors in spectral space. The windows are moved one at a time through the image, avoiding overlap. The mean and standard deviation are calculated for each band of the window. The smaller the standard deviation for a given band, the greater the homogeneity of the window. If the window passes the homogeneity test, it forms a cluster.
Strategies applied in unsupervised classification
c) ISO Data Clustering (Iterative Self Organizing Data Analysis Techniques): This process
repeatedly performs an entire classification and recalculates the statistics. The procedure
begins with a set of arbitrarily defined cluster means, usually located evenly through the
spectral space. After each iteration new means are calculated and the process is repeated
until there is some difference between iterations. This method produces good results for
the data that are not normally distributed and is also not biased by any section of the
image.

d) RGB Clustering: It is a quick method for 3 band data. The algorithm plots all pixels in
spectral space and then divides this space into 32 x 32 x 32 clusters. A cluster is required
to have a minimum number of pixels to become a class.
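Sequential clustering (strategy a) can be sketched as follows; the pixel vectors and threshold below are made-up values:

```python
import numpy as np

# Sequential clustering sketch: pixels are visited one at a time. A pixel
# farther than `threshold` from every existing cluster mean starts a new
# cluster; otherwise it joins the nearest cluster and that mean is updated.
def sequential_cluster(pixels, threshold):
    clusters = []                       # list of (mean, members) pairs
    for p in pixels:
        if clusters:
            dists = [float(np.linalg.norm(p - mean)) for mean, _ in clusters]
            i = int(np.argmin(dists))
            if dists[i] <= threshold:
                members = clusters[i][1] + [p]
                clusters[i] = (np.mean(members, axis=0), members)
                continue
        clusters.append((p.astype(float), [p]))   # start a new cluster
    return clusters

pixels = [np.array(v) for v in ([10, 12], [11, 13], [80, 85], [82, 84])]
clusters = sequential_cluster(pixels, threshold=20.0)
print(len(clusters))   # 2: one dark cluster, one bright cluster
```

Note that the result depends on the visiting order and the chosen threshold, which is one reason iterative methods such as ISODATA, which repeatedly recompute the means over the whole image, are often preferred.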
Topic 7:
Application of Remote Sensing
• Remote Sensing can be applied in almost any field. Some of the important
applications of remote sensing technology includes:
a) Environmental monitoring and assessment (global warming etc.).
b) Land use and land cover global change detection and monitoring.
c) Prediction of agricultural yield and crop health monitoring.
d) Sustainable resource exploration and management.
e) Ocean and wetland studies.
f) Weather forecasting.
g) Defence and military surveillance.
h) Broadcasting and telecommunication
