DTM Chapter 3

The document provides an overview of Light Detection and Ranging (LiDAR), detailing its history, principles, and applications in various fields such as environmental research and mapping. It explains the working mechanisms of LiDAR, including the Time of Flight (ToF) principle, multiple return methods, and the necessary components for airborne LiDAR systems. Additionally, it discusses the limitations of traditional photogrammetry and highlights the advantages of LiDAR technology in terms of accuracy and efficiency.

Digital Terrain Model

Chapter 3

Mathematical Equations for LiDAR and SAR


Digital Terrain Model

Light Detection and Ranging (LiDAR)


History of Lidar
• 1960s–70s: First laser remote sensing instruments (lunar laser ranging, satellite laser
ranging, oceanographic and atmospheric research)
• 1980s: First laser altimetry systems (NASA Atmospheric and Oceanographic Lidar (AOL)
and Airborne Topographic Mapper (ATM))
• 1995: First commercial airborne lidar systems developed
• 1996: Mars Orbiter Laser Altimeter (NASA MOLA-2)
• 1997: Shuttle Laser Altimeter (NASA SLA)
• 2000s: ICESat-1
• Chandrayaan-1 LLRI (Lunar Laser Ranging Instrument)
Limitations of analytical photogrammetry

 High photographic resolution, but low scanning resolution
 Stereo coverage required for stereo photogrammetry
 Accuracy limited by aerial triangulation, bundle adjustment, etc.
 Camera hardware: large size
 Digital photogrammetry was still in an evolving stage
 Cameras: expensive
 Softcopy algorithms for digital photogrammetry: not yet available
 GCP quality
 Limited capacity of computer hardware
Introduction
What LiDAR is:
• LiDAR is abbreviation for Light Detection And Ranging
• LiDAR uses laser light
• LASER is acronym for Light Amplification by Stimulated Emission of Radiation

Key laser characteristics:
• Wavelength(s)
• Coherence
• Duration of emission (pulse or continuous)
• Beam divergence
• Output power
• Power requirements
Introduction
What LiDAR is:
• LiDAR is an active remote sensing method employed primarily to observe the elevation
of various points on the earth's surface.
• It is a ranging device: 3D coordinates of features on the earth can be estimated from
the position of the LiDAR sensor, the platform altitude, and the laser–feature geometry.
• Lidar is extremely useful in atmospheric and environmental research as well as space
exploration.
• It also has wide applications in industry, defense, and the military.
• It has terrestrial, airborne, and mobile applications.
Introduction
Wavelength
 Monochromatic: light of one color (wavelength)
 Lasers emit light in the infrared, visible, and UV ranges
 A LASER emits light at wavelengths in a very narrow range
 Lidar: mostly near-infrared
Introduction

Wavelength
• Pulse rate: 50,000 (50k) to 200,000 (200k) pulses per second (slower for bathymetry)
• Typical wavelengths:
 infrared (1500 – 2000 nm) for meteorology (Doppler LiDAR)
 near-infrared (1040 – 1060 nm) for terrestrial mapping
 blue-green (500 – 600 nm) for bathymetry
 ultraviolet (250 nm) for meteorology
 1550 nm: eye-safe at relatively high power

Coherence
 Coherent laser: all photons are in one phase
 A monochromatic, coherent laser is better for long-distance ranging
Duration of Emission

Wave (continuous emission)
 Continuous emission of photons over time
 Phase measurement principle

Pulse
 Energy released in a very short duration
 Short pulses and long pulses
 A focused pulse can illuminate a spot with high energy
 For the same average power, a short pulse is more intense
 Time of Flight (TOF) principle
Duration of Emission

[Figure: longitudinal profile of a pulse. The pulse rises from 10% to 90% of the peak
power PT (rise time) and falls back (fall time); the pulse width is measured at 50% of PT.]
Output and Power Requirements

Output power: strength of a LASER beam
 Energy released by the LASER per unit time (watts)
 Output power = (energy released) / (time duration of energy release)
 Output power range: mW to kW
 A laser releases power in a narrow range

Power requirements
 A laser uses electricity and converts it to light
 Efficiency is the ratio of light output to electrical input
 Maximum about 30%, minimum about 0.001%
 Waste energy must be dissipated as heat
Lidar Equation Derivation

Power transmitted by the transmitter in a laser pulse = PT

Power received at the ground or object = PT·M, where M is the one-way atmospheric
transmission factor

Illuminated area on the ground (beam divergence γ, range R):
    AT = π·R²·γ²/4

Irradiance at the target:
    ET = PT·M / AT

Power of the transmitted pulse intercepted by a target filling the footprint = ET·AT = PT·M

Power of the pulse reflected from a target of reflectivity ρ = ρ·PT·M
Lidar Equation Derivation

Power of the pulse reflected from the target = ρ·PT·M

Assuming the target scatters as a Lambertian surface into a solid angle π, the power
reflected per unit solid angle = ρ·PT·M / π

The receiver aperture of area AR at range R subtends a solid angle Ω = AR / R²

Power received at the receiver (after a second one-way transmission loss M):
    PR = (ρ·PT·M / π) · (AR / R²) · M

Irradiance at the receiver: ER = PR / AR

LiDAR Equation:
    PR = PT·M²·ρ·AR / (π·R²)
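The assembled lidar equation can be evaluated directly. This is a minimal sketch; all numeric values below are assumed for illustration only.

```python
import math

# Lidar range equation: P_R = P_T * M**2 * rho * A_R / (pi * R**2)
# P_T: transmitted power (W), M: one-way atmospheric transmission factor,
# rho: target reflectivity, A_R: receiver aperture area (m^2), R: range (m).
def received_power(P_T, M, rho, A_R, R):
    return P_T * M**2 * rho * A_R / (math.pi * R**2)

# Illustrative (assumed) numbers: 2 kW peak power, 90% transmission,
# 30% reflectivity, 0.01 m^2 aperture, 1 km range:
P_R = received_power(P_T=2000.0, M=0.9, rho=0.3, A_R=0.01, R=1000.0)
print(P_R)  # on the order of 1.5e-6 W
```

Note how the received power falls off with 1/R², which is why high peak power and sensitive detectors are needed even at modest ranges.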
Working Principle of LIDAR

Time of Flight (TOF) Principle

    TOF = t2 – t1
    Range = c · TOF / 2

where t1 is the time the pulse is emitted, t2 the time the return pulse is detected
(both timed at 50% of the peak power PT), and c is the speed of light.
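The TOF relation above is a one-liner in code; the timing values used here are assumed for illustration.

```python
C = 299_792_458.0  # speed of light in vacuum (m/s)

def tof_range(t1, t2):
    """Range from the two-way time of flight: R = c * (t2 - t1) / 2."""
    return C * (t2 - t1) / 2.0

# A pulse returning 6.67 microseconds after emission is ~1 km away:
print(tof_range(0.0, 6.67e-6))  # ~999.8 m
```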
Working Principle: Multiple Return LIDAR

[Figure: a single emitted pulse of peak power PT produces several return pulses
(Object 1, Object 2, Object 3), each detected at 50% of its peak power; the first
return arrives at time t1.]
3 Ways to Determine Distance Using LiDAR

1. Triangulation
2. Time-of-Flight (ToF)
3. Phase Shift
3 Ways to Determine Distance Using LiDAR

1. Triangulation
How it works: This method uses a laser beam and a camera positioned at a fixed angle.
The laser paints a spot on the target, and the camera captures its position in the
image. Based on the camera–laser angle and the observed position shift with distance,
trigonometry is used to calculate the distance to the object.
Advantages: High accuracy (up to ten micrometers).
Disadvantages:
- Limited range (often less than 10 meters).
- Measurement error increases with distance.
- Primarily used in short-range, handheld LiDAR systems.

2. Time-of-Flight (ToF)
How it works: This method measures the time it takes for a laser pulse to travel to an
object and reflect back. Knowing the speed of light in the medium, the distance to the
object is calculated. (Think of how sound waves echo and how long it takes to hear them.)
Advantages: Suitable for long-range applications (space, air, automotive LiDAR).
Disadvantages:
- Limited acquisition speed: the sensor needs to receive a signal before sending another pulse.
- Trade-off between laser shot rate (number of points) and maximum distance.
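The triangulation method can be sketched with an idealized geometry; this assumes the laser fires perpendicular to the camera–laser baseline, which is a simplification of the setup described above, and the numbers are illustrative.

```python
import math

# Idealized laser triangulation: laser and camera separated by baseline b;
# the laser fires at 90 degrees to the baseline, and the camera observes the
# spot at angle theta measured from the baseline. Then tan(theta) = d / b.
def triangulation_distance(baseline_m, camera_angle_rad):
    """Perpendicular distance from the baseline to the laser spot."""
    return baseline_m * math.tan(camera_angle_rad)

# 10 cm baseline, spot seen at 60 degrees (assumed values):
print(triangulation_distance(0.1, math.radians(60)))  # ~0.173 m
```

As the target moves further away, the camera angle approaches 90° and small angular errors translate into large distance errors, which matches the "error increases with distance" limitation noted above.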
3. Phase Shift
How it works: This method uses a continuous laser beam with modulated amplitude or
frequency. The beam is split: one part goes directly to a detector, and the other
reflects off the target. By measuring the phase difference between the two received
signals, the distance is calculated. (Imagine waves slightly out of sync due to the
travel time.)

Advantages:
- Faster data acquisition speed.
- Higher resolution and accuracy compared to ToF.
- Less noise.

Disadvantages:
- Limited range compared to ToF (typically used in medium-range terrestrial and indoor
LiDAR scanners).
- Performance depends on the modulation type (aperiodic or periodic).
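The phase-shift method reduces to converting a measured phase difference into a fraction of the modulation wavelength. A minimal sketch; the modulation frequency and phase value are assumed for illustration.

```python
import math

C = 299_792_458.0  # speed of light (m/s)

# For modulation frequency f_mod, a phase difference dphi (radians) between the
# reference and reflected signals corresponds to a two-way path of
# dphi/(2*pi) modulation wavelengths; halve it for the one-way distance.
def phase_shift_distance(dphi_rad, f_mod_hz):
    wavelength = C / f_mod_hz
    return dphi_rad * wavelength / (4.0 * math.pi)

# A pi/2 phase shift at 10 MHz modulation (assumed values):
print(phase_shift_distance(math.pi / 2, 10e6))  # ~3.75 m
```

Because the phase wraps every 2π, a single modulation frequency is ambiguous beyond half a modulation wavelength; practical scanners combine several frequencies to resolve this, which is one reason the method stays range-limited compared to ToF.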
Introduction
How does Light Detection And Ranging (LiDAR) work?
Each time the laser is pulsed:
• The laser generates an optical pulse
• The pulse is reflected off an object and returns to the receiver
• A high-speed counter measures the time of flight from the start pulse to the return pulse
• The time measurement is converted to a distance (the distance to the target and the
position of the airplane are then used to determine the elevation and location)
• Multiple returns can be measured for each pulse
• Up to 200,000+ pulses/sec
• Everything that can be seen from the aircraft is measured
Dependencies
 Reflectivity of the target
 Range
 Average power
 Sequential filtering
 Areas of transmitter, target, and receiver
 Beam divergence (ambiguity in footprint)
Lidar sensor
 Active sensor
 Fires pulses consecutively
 Does not require sunlight, so it works at night
LIDAR Data Acquisition Platforms

Various platforms:

• Terrestrial

• Airborne: Planes,
Helicopters, drones

• Mobile: cars

• Satellite

• Handheld devices
Aircraft LiDAR System
• First developed in 1960 by Hughes Aircraft Inc.
• Measures distance to surfaces by timing a laser pulse and its corresponding return(s)
• Lidar data provide X, Y, Z positions of each return
• Typically used for very accurate mapping of topography
Airborne LIDAR Data Acquisition

Three main components to provide


a georeferenced elevation
measurement:

• Laser & Sensor

• Inertial Measurement Unit (IMU)

• GPS
Aircraft LiDAR System
The basic components of an airborne laser scanner are:
a. Scanner assembly
 The laser system, mounted over a hole in the aircraft's fuselage (the main body of the
aircraft), continuously sends laser pulses towards the terrain as the aircraft flies.
 Depending on aircraft velocity and survey height, current technology allows
measurement densities of up to about 50 points/m².
 Modern scanner assemblies provide roll compensation to compensate for the roll of
the aircraft.
 Roll compensation allows the overlap between flight lines to be planned smaller, and
therefore gives an economic advantage.
Airborne Lidar Data Acquisition
Laser Scanner Characteristics
 Pulse repetition rate (PRR) or pulse repetition frequency (PRF)
 Scanning frequency
 Field of view (FOV)
 Relation between FOV and scanning frequency
 Scanning mechanism
 Zigzag (saw-tooth)
 Parallel
 Sinusoidal (elliptical)
Airborne Lidar Data Acquisition
 LASER scanner: scanning
pattern.
 Scanning pattern: sinusoidal,
saw-tooth (zig-zag), parallel
 PRF, flying height, swath
 Beam divergence and horizontal
accuracy
 Point spacing
 Physical limitations
 Sequential firing
Airborne Lidar Data Acquisition

Airborne field operation


Airborne Lidar Data Acquisition
Aircraft LiDAR System
b. Airborne GPS antenna
 Observes: 3D coordinates of the aircraft trajectory in the WGS84 coordinate system
 The standard is a dual-frequency antenna recording GPS signals at a sampling rate of
1 Hz or 2 Hz.
 The antenna is mounted at an exposed position on top of the aircraft, providing an
undisturbed view of the GPS satellites.
 Requirement: satellite lock should be maintained for accuracy (GDOP)
 The banking angle is restricted
Aircraft LiDAR System
C. Inertial measurement unit (IMU):
• Observes the roll, pitch, and yaw angles of the aircraft direction with respect to
true East, true North, and the gravity direction
 Data update rate: 60 Hz to 250 Hz
 Drifts on long flight lines
 Requirement: the aircraft should not fly in one direction continuously
 Acceleration data can be used to support the interpolation of the platform position
on the GPS trajectory, while rotation rates are used to determine platform orientation.
 The combination of GPS and IMU data allows one to reconstruct the flight path (air
trajectory) to an accuracy of better than 10 cm.
Aircraft LiDAR System
D. Control and data recording unit
 This device is responsible for time synchronisation and control of the whole system.
 It stores the ranging and positioning data gathered by the scanner, IMU, and GPS.
 Modern laser scanners, which generate up to 300,000 laser pulses per second, produce
about 20 GB of ranging data per hour.
 GPS and IMU data sum up to only about 0.1 GB per hour.
E. Operator laptop
 This serves as a means of communication with the control and data recording unit, to
set up mission parameters and to monitor the system's performance during the survey.
F. Flight management system
 This displays the preplanned flight lines to the pilot, supporting completion of the
mission.
Airborne Lidar Data Acquisition
 User requirements
 Project requirements
o Data density
o Data spacing (across and along track)
o Overlap and spacing of flight lines
o Errors
 Mapping requirements
o Uniform spacing of data points
o Minimum overlap and spacing of flight lines
o Resolution of data (derived term)
Airborne Lidar Data Acquisition
 Ideal Condition
 No change in altitude
(H)
 No change in speed
 No change in
direction of flying
Flying Altitude (H)
Airborne Lidar Data Acquisition

[Figure: distribution of numbered laser spots on the ground along the scan pattern; the
surface point spacing is variable, typically 1 – 30 m.]
Airborne Lidar Data Acquisition

Point cloud view of the tree in the accompanying photo. Each point is colored by which
return it was from a particular pulse:
 Red = 1st return
 Yellow = 2nd return
 Green = 3rd return
[Figure: typical airborne lidar survey configuration, with a GPS base station on the
ground and GPS and IMU on board. Typical parameters:
 Wavelength: 500 – 1000 nm
 Altitude: 600 – 1000 m AGL
 Pulse rate: 10s – 100s of kHz
 Swath width: up to 1500 m
 Footprint: 15 – 20 cm
 Accuracy: 5 – 15 cm vertical, 20 – 30 cm horizontal]
DISCRETE PULSE AND FULL WAVEFORM
LIDAR
1. A Discrete Return LiDAR System records individual (discrete) points for the peaks in
the waveform curve. Discrete return LiDAR systems identify peaks and record a point at
each peak location in the waveform curve. These discrete or individual points are called
returns. A discrete system may record 1 to 11+ returns from each laser pulse.
2. A Full Waveform LiDAR System records the full distribution of returned light energy.
Full waveform LiDAR data are thus more complex to process; however, they can often
capture more information than discrete return LiDAR systems. One example research
application for full waveform LiDAR data is mapping or modelling the understory of a
canopy.
Effect of Footprint of LiDAR
• A small-footprint lidar may not capture the tree top as the first return and/or may
not reach the ground surface.
• A larger footprint essentially captures the overall canopy structure.
• Full-waveform lidar has a relatively larger footprint.
Geolocation process
Terrestrial LIDAR

Terrestrial LiDAR
A basic lidar system involves a laser range finder reflected by a rotating mirror (top).
The laser is scanned around the scene being digitized, in one or two dimensions,
gathering distance measurements at specified angle intervals (bottom).
Terrestrial LIDAR
• Panoramic-type scanners carry out distance and angular measurements over a full 360
degrees in the horizontal and 180 degrees in the vertical plane.
• In hybrid scanners, the scanning action is unrestricted around one rotation axis,
usually the horizontal scanning movement.

Using a Terrestrial Laser Scanner
Terrestrial LIDAR

Steps for TLS Data Processing


Terrestrial LIDAR

Applications of TLS
• Slope stability monitoring
• Forestry
• Shipbuilding
• Biometrics
• Construction
• Deformation/deflection
• Architecture
• Archeology
• Virtual reality
Mobile LiDAR System
• A vehicular based imaging and LIDAR
data collection system
• It captures accurate georeferenced
imagery and LIDAR point clouds safely
• It reduces field time and produces
higher quality data
• This equipment collects 360 degree
data (coordinate) of the terrain
features.
Mobile LiDAR System
Applications of Mobile Lidar Systems
• Communication towers
• Infrastructure and asset mapping and monitoring
• Topographic survey
• Dam sites
• Construction survey
• Volume calculations
• Highways and railways
• Tunneling
• Bridges
• Power transmission
Mobile LiDAR System

Mapping Power Infrastructure
Mapping insulators, power lines, and towers to generate complete power infrastructure
assets.
Mobile LiDAR System
Mapping Shield Wires and Vegetation
Identifying the position of danger points, such as vegetation overgrowth, and analysis
of potential danger points.
Transmission line
Locating defects: missing glass insulator bells
Mobile LiDAR System
Mobile LiDAR System
Transmission line
Detail inspection of connections
Mobile LiDAR System
Visualization of trees along the power line corridor according to specific attributes
• Center of each circle represents an individual tree
• Radius of the circle represents the crown size of that tree and the color represents the tree height according to the color ramp
Process of Lidar data

1. Emission of a laser pulse
2. Recording of the backscattered signal
3. Distance measurement (time-of-flight method)
4. Retrieving plane position and attitude
5. Computation of the precise echo (return pulse) position
RADARGRAMMETRY AND SAR INTERFEROMETRY

• In practice, synthetic aperture radar (SAR) is widely used to acquire images. Images
acquired by SAR are very sensitive to terrain variation.
• This is the basis for three types of techniques, namely:
1. Radargrammetry,
2. Interferometry, and
3. Radarclinometry
• Radargrammetry acquires DTM data through the measurement of parallax.
• SAR interferometry acquires DTM data through the determination of phase shifts between
two echoes.
• Radarclinometry acquires DTM data through shape from shading. Radarclinometry makes
use of a single image, and the height information is not accurate enough for DTM.
History of RADAR

• 1880–1885: Reflection of radio waves demonstrated by Heinrich Hertz
• 1904: First patent for using RADAR as a ship detector, by Huelsmeyer
• 1940–1950: The concept of side-looking airborne RADAR (SLAR) came in the 1940s, with
sophisticated techniques following in the 1950s
• 1960: Civilian applications in the geosciences started
• 1970s: Imaging RADAR satellites started
• 1990: Airborne SAR research started
RADARGRAMMETRY AND SAR INTERFEROMETRY

[Figure: the electromagnetic spectrum, from gamma rays and X rays through ultraviolet,
visible (0.4 – 0.7 µm), infrared (near IR, thermal IR, far IR), microwave, and radio
(UHF, VHF, HF) bands. From Seguin & Villeneuve, Astronomie et Astrophysique.]
RADARGRAMMETRY AND SAR INTERFEROMETRY

[Figure: three sensing regimes. Optical domain (VIS 0.4 – 0.7 µm, NIR–MIR): the Sun is
the source and the sensor records reflection from the target. TIR (~5 µm): the sensor
records the target's own emission. 'Active' microwaves (0.75 – 150 cm): the sensor is
both source and receiver and records backscattering.]
RADARGRAMMETRY AND SAR INTERFEROMETRY

RADAR: RAdio Detection And Ranging

Application first started with the detection and ranging of enemy aircraft and ships
during wartime using radio waves.

Imaging RADAR (e.g. PALSAR):
 Emission of electromagnetic waves
 Reception of the backscattered echoes

RADAR can be imaging or non-imaging.
Examples of non-imaging RADAR:
 Doppler RADAR for vehicle speed detection (e.g. road RADAR, © Nepal police)
 Plan Position Indicator (PPI)
RADARGRAMMETRY AND SAR INTERFEROMETRY

Active MW sensors can be either:
 Nadir looking (e.g. altimeter)
 Side looking (e.g. imaging radar)

Radar can operate either in monostatic mode (one antenna transmits and receives) or
bistatic mode (separate transmit and receive antennas).

Imaging radar can be further classified as:
 Real aperture radar (RAR)
 Synthetic aperture radar (SAR)
RADARGRAMMETRY AND SAR INTERFEROMETRY

SAR satellite missions (launch year, band):
• ERS-1 (1991) – C band
• JERS (1992) – L band
• ERS-2 (1995) – C band
• RADARSAT (1995) – C band
• ENVISAT ASAR (2002) – C band
• ALOS PALSAR (2006) – L band
• RADARSAT-2 (2007) – C band
• TerraSAR-X (2007) – X band
• COSMO-SkyMed (2007) – X band
• TanDEM-X (2010) – X band
• Sentinel-1 (2014) – C band
• ALOS-2 (2014) – L band
RADARGRAMMETRY AND SAR INTERFEROMETRY

Frequency – Wavelength

Band    Wavelength λ    Frequency f
X       ~ 3 cm          ~ 10 GHz
C       ~ 6 cm          ~ 5 GHz
L       ~ 25 cm         ~ 1.2 GHz
P       ~ 70 cm         ~ 400 MHz
Summary of SAR Data

Name          Period    Band  Polarization mode        Resolution (m)  Revisit (days)  Scene cover (km)
ERS-1/2       91 – 11   C     VV                       20              35              185 x 185
JERS          92 – 98   L     HH                       20              44              75 x 75
Radarsat      95 – 13   C     HH, 1 or 2 pol.          10 – 100        24              35 x 500
ASAR          01 – 13   C     HH/HV/VV, polarimetric   30 – 1000       few – 35        100 x 500
PALSAR        07 – 11   L     HH/HV/VV, polarimetric   10 – 100        few – 24        100 – 500
Radarsat-2    2007 –    C     HH/HV/VV, 1 or 2 pol.    1 – 15          5 to 10         NA
TerraSAR-X    2007 –    X     HH/HV/VV                 1 – 20          few – 11        5 – 100
Cosmo-SkyMed  2007 –    X     HH/HV/VV, 1 or 2 pol.    1 – 100         12 h            10 – 200
SAOCOM        2015      L     HH/HV/VV, polarimetric   7 – 100         few – 16        60 – 320
Sentinel-1    2015      C     HH/HV/VV, 1 or 2 pol.    5 – 100         few – 12        80 – 400
ALOS-2        2015      L     HH/HV/VV, polarimetric   3 – 100         few – 14        25 – 350
Comparison

                  LiDAR                 Optical Multi-spectral     RADAR
Platform          Airborne/Spaceborne   Airborne/Spaceborne        Airborne/Spaceborne
Radiation         Own radiation         Reflected sunlight         Own radiation
Spectrum          Infrared              Visible/Infrared           Microwave
Frequency         Single frequency      Multi-frequency            Multi-frequency
Polarimetry       Not applicable        Not applicable             Polarimetric phase
Interferometry    Not applicable        Not applicable             Interferometric phase
Acquisition time  Day/Night (active)    Daytime only (passive,     Day/Night (active)
                                        needs sunlight)
Weather           Blocked by clouds     Blocked by clouds          Sees through clouds
Geometry of RADAR

1. Polarization
2. Near range and far range
3. Azimuth Direction
4. Range Direction
5. Depression angle
6. Look angle
7. Incidence angle
Geometry of RADAR
Polarization
Important characteristic of coherent EMW:
 The evolution of the electromagnetic field is predictable
 Most general case: elliptical polarization
Geometry of RADAR
Polarization
Polarization characterisation of a radar acquisition (emission/reception):
 VV – vertical emission, vertical reception (ERS, ASAR, Sentinel-1)
 HH – horizontal emission, horizontal reception (JERS, RADARSAT, PALSAR)
 HV – horizontal emission, vertical reception (ASAR, PALSAR, Sentinel-1)
 VH – vertical emission, horizontal reception (ASAR, PALSAR, Sentinel-1)
Polarization
Brazil
 Flooded forests
 Deforested areas
J.-M. Martinez, 2010
ALOS acquisition (λ = 24 cm) – Polarization
Polarization
[Figure: ASAR acquisition over Gabon, VV and HV polarizations.]
Polarization
[Figure: Tubuai Island, vegetation discrimination, L band, HH, VV and HV polarizations.]
Polarization
[Figure: polarization exists in the visible domain too – vertical vs horizontal filters.]
Radar image interpretation rules
Intensity (or Amplitude) Images

Surface scattering (bare soils):
 VV > HH, HV ~ 0
 Smooth surface: low return; rough surface: high return
 VV polarization is used for bare surfaces (roughness / moisture)

Volume scattering:
 Vegetation with vertical structures (e.g. rice crops)
 HV polarization is used for forest / non-forest discrimination

Double reflection (urban areas, flooded vegetation, dense forest):
 HH > VV
 HH polarization is used for flooded / non-flooded vegetation

Wild areas (urban areas, disorderly rocks):
 HH ~ VV ~ HV, all high
Geometry of RADAR

Depression angle:
• The angle between the horizontal plane and the RADAR line of sight
• Complementary to the look angle

Look angle:
• The angle between the vertical line from the aircraft (towards nadir) and the RADAR
line of sight

Incidence angle:
• The angle between the RADAR line of sight and the local vertical at the terrain point
 For airborne RADAR, the incidence angle is the complement of the depression angle
 For spaceborne RADAR, one needs to consider the spherical geometry of the Earth
Resolution Types

1. Spatial resolution
 Ability of sensor to separate two objects spatially
2. Radiometric resolution
 Ability of sensor to separate two brightness responses from reflected signal
3. Temporal resolution
 Revisit time of a sensor at same spatial location
4. Spectral resolution
 Ability of sensor to separate wavelength range of signal
Resolution Types
1. Spatial resolution types
 Range resolution
 Distinction of two objects along the slant range
 Azimuth resolution
 Distinction of two objects along the flight direction
 Resolution on the ground or terrain
 Define ground range resolution and ground azimuth resolution
 Ground resolution separates two objects on the geoid
 Needs rescaling of the range resolution and azimuth resolution
Resolution Types
Spatial resolution: the smallest distance allowing the separation of two objects

Optical data:
 sensor spatial resolution < image pixel size
 ==> pixel size is treated as the spatial resolution, hence the use of one word for the other

Radar data:
 sensor spatial resolution > image pixel size
 ==> these two notions remain different
Radar Imaging – Acquisition

Spatial resolution
 Range resolution
 Azimuth resolution
Radar Imaging – spatial resolution

Terrain is flat and


horizontal
Radar Imaging – spatial resolution

[Figure: side-looking acquisition geometry. The sensor flies along the ground track at
speed v and height H, emitting pulses at rate PRF; the antenna (length L) illuminates a
swath S at slant range R. The azimuth direction is lengthwise (= image lines), the
range direction transversal. Source: CNES.]

Line spacing along track: p_az = v / PRF

Azimuthal resolution (real aperture): r_az = R · λ / L
Radar Imaging – spatial resolution

Pulse duration: τ_p = 1 / B, where B is the pulse bandwidth

Slant range resolution:
    X_r = c · τ_p / 2

Ground range resolution (incidence angle i, depression angle β):
    X_gr = c · τ_p / (2 sin i) = c / (2 B cos β)
Radar Imaging – spatial resolution

Slant range resolution:
    X_r = c / (2 B)

Ground range resolution (incidence angle i):
    X_gr = c / (2 B sin i)
Radar Imaging – spatial resolution

Computing the range resolution at two different depression angles (40° and 65°) for a
real aperture radar with a pulse length of 0.1 × 10⁻⁶ s. The towers can be resolved in
the far range but not in the near range (after Sabins, 1997).
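The worked example above can be reproduced with the ground range resolution formula X_gr = c·τ_p / (2·cos β):

```python
import math

C = 3e8        # speed of light (m/s), rounded as in such examples
TAU = 0.1e-6   # pulse length (s), from the example above

def ground_range_resolution(depression_deg):
    """X_gr = c * tau / (2 * cos(beta)) for depression angle beta."""
    return C * TAU / (2.0 * math.cos(math.radians(depression_deg)))

print(ground_range_resolution(40))  # ~19.6 m (far range: finer cell)
print(ground_range_resolution(65))  # ~35.5 m (near range: coarser cell)
```

The smaller depression angle of the far range gives the finer ground resolution cell, which is why the towers are separable in the far range but not the near range.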
Radar Imaging – spatial resolution

Summary (antenna length L, bandwidth B, wavelength λ, slant range R, incidence angle i):

Slant range resolution:    X_r = c · τ_p / 2 = c / (2 B)
Ground range resolution:   X_gr = c / (2 B sin i)
Azimuthal resolution:      X_az = R · λ / L
Radar Imaging – spatial resolution

Computing the azimuth resolution at two different slant-range distances (20 and 40 km)
for a real aperture radar with an X-band wavelength of 3 cm and a 500 cm antenna. The
tanks can be resolved in the near range but not in the far range.
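The azimuth-resolution example above follows directly from X_az = R·λ / L:

```python
# Real-aperture azimuth resolution: X_az = R * lambda / L.
def rar_azimuth_resolution(slant_range_m, wavelength_m, antenna_length_m):
    return slant_range_m * wavelength_m / antenna_length_m

# X band (3 cm wavelength), 500 cm antenna, as in the example above:
for R in (20_000.0, 40_000.0):
    print(R, rar_azimuth_resolution(R, 0.03, 5.0))
# 20 km -> 120 m, 40 km -> 240 m: the tanks are separable in the near range only.
```

Note that, unlike range resolution, azimuth resolution of a real aperture radar degrades linearly with slant range.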
Radar Imaging – spatial resolution

Limitations of RAR (Real Aperture Radar)
• Cannot be used on spaceborne satellites because of the large slant range (more than
100 km)
• Scale distortions
• Pixel size keeps varying

To overcome these problems, SAR (Synthetic Aperture Radar) evolved.
Radar Imaging – spatial resolution

Synthetic Aperture Radar (i.e. improvement of the azimuthal resolution):
a small mobile antenna is made equivalent to a fixed wide antenna.
Radar Imaging – spatial resolution

Synthetic Aperture Radar (improvement of the azimuthal resolution):
the small mobile antenna, moving at speed v during the exposure time, synthesizes a
fixed wide antenna of length L_synth = v · (exposure time) (~5 km synthesized from a
~10 m real antenna).

The successive echoes are summed coherently, with adaptive (Doppler) filtering over the
Doppler bandwidth B_D. The gain in azimuthal resolution:

    Real aperture:       X_az = R · λ / L
    Doppler bandwidth:   B_D = 2 V / L
    Synthetic aperture:  X_az = V / B_D = L / 2
Radar Equation

Assignment: -

Derive the radar equation step-by-step, explaining the relationship between transmitted
power, radar cross-section, and received power, including all necessary assumptions and
factors.

Date : - 2025-01-18
Phase of RADAR
 RADAR backscatters are coherent
 Same phase for a sensor for a point on terrain
 Two different sensors have different coherent phases for a point on terrain

    E1 = e^(jφ1) = e^(−j·4πR/λ)
    E2 = e^(jφ2) = e^(−j·2π(2R + δR)/λ)

Phase difference information of two phase images provides a range-difference estimation
for a point.
Interferometry Principle

Phase difference information of two phase images provides a range-difference estimation
for a point:

    E1 = e^(jφ1) = e^(−j·4πR/λ)
    E2 = e^(jφ2) = e^(−j·2π(2R + δR)/λ)

    E1 · E2* = e^(j(φ1 − φ2)) = e^(−j·2π(2R − (2R + δR))/λ) = e^(j·2πδR/λ)

    Δφ = φ1 − φ2 = 2πδR/λ

    Phase Difference = 2π · (Path Difference) / λ
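The conjugate-product step above can be demonstrated numerically. A minimal sketch; the wavelength, range, and path difference values are assumed for illustration.

```python
import numpy as np

wavelength = 0.056  # m (C band, assumed value)
R = 850_000.0       # slant range to the point (m, assumed)
delta_R = 0.01      # extra path of the second echo (1 cm, assumed)

# Two coherent echoes of the same terrain point:
E1 = np.exp(-1j * 4 * np.pi * R / wavelength)
E2 = np.exp(-1j * 2 * np.pi * (2 * R + delta_R) / wavelength)

# The interferogram pixel E1 * conj(E2) keeps only the path-difference phase:
dphi = np.angle(E1 * np.conj(E2))
print(dphi, 2 * np.pi * delta_R / wavelength)  # both ~1.122 rad
```

The huge common phase 4πR/λ cancels in the product, leaving only 2πδR/λ, which is why interferometry is sensitive to centimeter-level path differences.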
Interferometry Principle

 Interferometric SAR: InSAR or IfSAR
 Single pass type
 Two antennas (one transmitter and two receivers) are mounted at the same along-track
position, but one above the other
 The temporal baseline is zero
 Two pass type
 Two antennas are placed in different tracks
 Differential interferometric SAR (three pass type): DInSAR
 Three antennas in three different tracks
 Measures two spatial baselines
Interferometry Principle

1. Spatial baselines
• The target area is imaged from two different SAR tracks simultaneously
• We measure the target elevation from the known platform positions

2. Temporal baselines
• The pure case is that the SAR measurements are acquired from exactly identical tracks
• The temporal baseline is used to measure the radial velocity

3. Mixed baselines
• A combination of both baselines
Single Pass Interferometry

Here we record the phase of the echo from the target in two channels mounted on one
platform.

Phase difference:
    Δφ = φ1 − φ2 = (2π/λ) · ρ · (R2 − R1)

    ρ = 1 if the channels share the transmit antenna
    ρ = 2 if each channel transmits and receives on its own antenna

    φ_m = mod(Δφ, 2π)
    φ_m is the phase difference shown by the interferogram
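The wrapped phase φ_m = mod(Δφ, 2π) is all the interferogram stores; recovering the lost integer number of cycles is the phase-unwrapping problem. A minimal illustration, with assumed phase values:

```python
import math

# The interferogram stores only the wrapped phase phi_m = mod(dphi, 2*pi);
# the integer number of full cycles must be recovered by phase unwrapping.
def wrapped_phase(dphi):
    return dphi % (2 * math.pi)

print(wrapped_phase(7.5))          # ~1.217 rad (one full cycle removed)
print(wrapped_phase(2 * math.pi))  # 0.0
```

Two targets whose true phase differences differ by an exact multiple of 2π are indistinguishable in a single interferogram, which motivates the baseline-selection rules discussed later.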
Single Pass Interferometry
Single pass or simultaneous baseline: two RADARs acquire data from different vantage
points at the same time.

The RADAR interferogram contains only the phase-difference information of the two phase
images for the same points.
Two Pass Interferometry
 Important issues
 A point on terrain is observed by two RADAR sensors in two tracks at different times
 A temporal baseline exists, but no change is assumed at the terrain point
 Unknown spatial baseline (length and orientation)
 Non-parallel tracks lead to difficulty in finding the baseline
 Change in the target area (temporal decorrelation)
 Motion compensation is difficult (aircraft case)
 So estimation of the baseline is difficult
Two Pass Interferometry

 Two Pass Type (repeat pass or repeat track):
Two RADARs acquire data from different vantage points at different times.
Properties of SAR Images

Two types of Properties:-


1. Geometric Properties
• Foreshortening
• Layover
• Shadow
2. Radiometric Properties
• Speckle Effect
Properties of SAR Images

Geometric Properties
1. Foreshortening
• Sensor-facing slope foreshortened
in image
• Foreshortening effects decrease
with increasing look angle
Properties of SAR Images
Geometric Properties
1. Foreshortening
Foreshortening depends on:
• Object height
• Look angle
• Location of objects in the
range direction
Properties of SAR Images
Geometric Properties
2. Layover
• Mountain top overlain on ground ahead of
mountain
• Layover effects decrease with increasing look

angle
Properties of SAR Images

Geometric Properties
3. Shadow
• Area behind mountain cannot be seen
by sensor
• Shadow effects increase with increasing
look angle

Characteristics of Radar
shadows:
• Completely dark
• Occurs only in the range
direction
• Shadows vary in the range
direction
Properties of SAR Images

Radiometric Properties
1. Speckle
• Speckle: noise in RADAR image
• Resolution cell contains a large number of scatterers
• The returned echo from scatterers is coherently summed to obtained the phase and
brightness of the resolution cell.
• Brightness in the resolution cell increases than the actual brightness
• This unexpected brightness is called speckle
Properties of SAR Images

Radiometric Properties
1. Speckle
Properties of SAR Images

Radiometric Properties
1. Speckle
Characteristics
• Unwanted and random noise
• Speckle is both a friend and a foe: it is very helpful in radar image interpretation
• We assume a Gaussian distribution

Speckle filters
• Gamma filter
• Frost filter
• Lee filter
• Multi-look filter

Speckle filters
• Gamma
filter
• Frost filter
• Lee filter
• Multi-look
filter
RADARGRAMMETRY AND SAR INTERFEROMETRY

Selection of SAR images suitable for interferometric use is the first step to be carried
out for any interferometric processing. It is a key step, since the criteria adopted for
the selection of images have a strong impact on the quality of the final results. These
criteria depend upon the:
• View angle (ascending and descending passes)
• Geometrical baseline
• Temporal baseline
• Time of the acquisition
• Coherence
• Meteorological conditions
Introduction
• View angle (ascending and descending passes)
Selecting images for InSAR DEM generation

1. Select Tandem acquisitions to reduce temporal decorrelation.


2. Interferograms with very small perpendicular baseline values (< 30 m), though easy to unwrap, are almost
useless due to their high sensitivity to phase noise and atmospheric effects.
3. Interferograms with normal baseline (values higher than ~450 m) are usually almost impossible to unwrap
if no a priori DEM is available and the topography of the area is not very smooth. Moreover the coherence
is generally small, due to the high geometrical and volume scattering decorrelation [Gatelli94, Zebker92,
Rodriguez92].
4. The optimum perpendicular baseline is in the range between 150 and 300 metres. However, the best result is
achieved by using more than one interferogram: interferograms with small baselines can be exploited to help
unwrap interferograms with high baselines. Moreover, different interferograms can be combined in order to
reduce the atmospheric artefacts.
Selecting images for InSAR DEM generation

5. If no Tandem pair is available, consider using phase A, B and D ERS-1 acquisitions (3-day repeat cycle)
instead of phase C (35-day repeat cycle).
6. When the DEM will be used for differential interferometry applications, use the same track as that used to
estimate possible ground deformations, in order to avoid the necessity of image interpolation.
7. Coherence values are affected by local weather. Avoid acquisitions during rain, snow or strong wind. These
phenomena usually cause loss of phase coherence. Weather information can be often recovered from historical
databases available on the web. Nighttime acquisitions are usually less affected by atmospheric effects
[Hanssen98].
8. Discard images acquired during very hot days: hot air can hold much more water vapour than cold air (a
major cause of atmospheric artefacts in SAR interferograms) [Hanssen98].
9. Usually Tandem pairs acquired on vegetated areas during the dry season show higher coherence values than
those acquired during a wet season.
Steps for DEM Generation

https://sentiwiki.copernicus.eu/web/s1-products
Steps for DEM Generation
Coregistration (subpixel) and slave-to-master resampling.
Cross-correlation is used to set the parameters for both coarse and fine coregistration.

https://sentiwiki.copernicus.eu/web/s1-products
Steps for DEM Generation

https://sentiwiki.copernicus.eu/web/s1-products
Steps for DEM Generation
Comparison of different methods
MAJOR TOPICS COVERED IN CLASS

Prepared by: Netra Bahadur Katuwal, Asst. Prof., IOE, Paschimanchal Campus
