Aerial Photography and Photogrammetry Insights

1. Photogrammetry is the science of making measurements from photographs, especially for recovering the exact positions of surface points.
2. There are two types of photogrammetric measurement: aerial, which uses cameras mounted on aircraft, balloons or drones, and close-range, which uses handheld cameras.
3. Aerial photography provides bird's-eye views of large areas with details about land features, and has higher resolution than satellite imagery. It can also capture dynamic events and collect temporal data over time.


IIT KHARAGPUR (RCGSIDM)

(Weekly Report-2)
Abhishek Kumar
Roll No- 21ID60R19
[Link] (ID)
Subject Code- ID61002

Photogrammetry
It is the combination of three words: Photo (light) + Gramma (geometry) + Metron (measurement). It is the science and technology of measuring and analysing an area or region using photographs. Measurements can be made with cameras mounted in aircraft, balloons and satellites.
There are two modes of measurement: Aerial (aircraft, balloon, aeroplane) and Close Range (hand-held camera, terrestrial).
Aerial Photography
In aerial photography, a metric (precisely calibrated) camera is mounted in the aircraft. Aerial photographs may be film-based or digital. We can also take stereo photographs using two cameras at the same place, which provide depth information.
Satellite Photography
Here, a large area is captured from a higher altitude in the form of radar data, colour images and thermal images. Landsat is an example of a remote sensing satellite.
Drone Photography
Drones are helpful when high resolution of the image or data is needed. However, procuring and flying a drone is difficult, and collecting information over a large area in a short time is also difficult.
Advantages of Aerial Photography
1. When we study a larger area, the vantage point gives a bird's-eye view of the whole area, which helps us to extract spatial features such as water bodies, drainage and building features.
2. When we compare aerial photographs to satellite imagery, aerial photographs have higher resolution.
3. It is also used to measure dynamic phenomena such as floods, forest fires, oil spills, etc.
4. Aerial photography can capture temporal data, photographing a place or area multiple times.
5. Cameras can also record UV, IR and thermal IR bands in addition to visible energy.
6. We can collect ground feature attributes such as latitude, longitude, height, volume, slope, direction, etc.
7. It can detect what human eyes cannot.
History of Photogrammetry
Terrestrial photography started in 1839. In 1840, a topographic survey was done using photographs. In 1858, the first aerial photograph was taken by Gaspard-Félix Tournachon (Nadar) from a balloon, for city-planning purposes. Earlier, kites and pigeons were used to take photographs, then rockets and balloons. During the Second World War, aeroplanes were used to take photographs of enemy countries.
Types of Camera
1. Pin Hole Camera
 Light rays pass through a small pin hole.
 An inverted image is formed.
 This camera has no lens.
2. Simple Camera
 A lens is put in place of the pin hole by enlarging the hole, so more light can pass through it.
3. Modern Camera
 A simple camera to which an adjustable diaphragm and a shutter are added, to control the aperture size and the exposure duration.

Figure: (a) Pinhole Camera; (b) Simple Lens Camera


Basics of Photography
1. Focus
For a lens, the focal length is the distance between the lens and the point where parallel light rays are focused. For an object at distance o and its image at distance i,

1/f = 1/o + 1/i

Where,
f = focal length
o = distance of the object from the lens
i = distance of the photo/image from the lens
Example: Samsung J7 Pro – 3.7 mm, OnePlus Nord CE 5G – 4.73 mm
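As a quick numerical check of the lens equation, the image distance can be solved for directly; the 2 m object distance below is an assumed, hypothetical value:

```python
# Thin-lens relationship: 1/f = 1/o + 1/i, solved for i (all lengths in mm).

def image_distance(f_mm, o_mm):
    """Solve 1/f = 1/o + 1/i for the image distance i."""
    return 1.0 / (1.0 / f_mm - 1.0 / o_mm)

# Hypothetical object 2 m (2000 mm) in front of a 4.73 mm lens:
i = image_distance(4.73, 2000.0)
print(round(i, 3))  # 4.741 — just beyond the focal length
```

For a distant object (o much larger than f) the image distance approaches the focal length, which is why aerial cameras are fixed at infinity focus.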
2. Exposure
It is the radiant energy received per unit area and determines how bright or dark the image appears. Exposure can be calculated as:

E = s·d²·t / (4·f²)

Where,
E : exposure – J/mm²
s : scene brightness – J/mm²/s
d : diameter of the lens opening – mm
t : exposure time – s
f : focal length of the lens – mm
In a normal camera, features such as the ISO setting, aperture setting and shutter speed control the exposure.
Example (OnePlus Nord CE 5G):

                 Photo 1    Photo 2    Photo 3
ISO              125        800        125
Aperture         f/1.7      f/1.7      f/1.7
Exposure time    1/30 s     1/25 s     1/33 s
Focal length     4.73 mm    4.73 mm    4.73 mm
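A small sketch of the exposure equation; the brightness and aperture values below are arbitrary assumptions, used only to show that exposure scales with the square of the lens opening:

```python
# Exposure equation from the notes: E = s*d^2*t / (4*f^2)
# (s in J/mm^2/s, d and f in mm, t in seconds, E in J/mm^2).

def exposure(s, d, t, f):
    return s * d**2 * t / (4.0 * f**2)

# Doubling the lens-opening diameter d (same f, same t) quadruples exposure:
e1 = exposure(s=1.0, d=2.0, t=1/30, f=4.73)
e2 = exposure(s=1.0, d=4.0, t=1/30, f=4.73)
print(e2 / e1)  # 4.0
```

This is the usual photographic rule that opening up by two full stops admits four times the light.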

3. Geometric factors influencing Exposure


In aerial photographs, geometric features depend on the type and condition of the ground objects; these affect the exposure of the camera, and this influence is called an extraneous effect. It has two types: atmospheric effects (cloud, haze, etc.) and geometric effects (exposure fall-off).
4. Filters
A filter allows selected wavelengths to reach the image plane of the camera for image formation and blocks other wavelengths (commonly the shorter ones). It is made of gelatinous transparent material which absorbs or reflects energy before the rest reaches the image plane. Filters are placed along the optical axis, in front of the lens.

In aerial camera filters, chemical (organic) colourants/dyes are embedded in glass or dried gelatine. These are used as absorptive filters, which help differentiate objects.

Figure: (a) High Pass Filter; (b) Band Pass Filter

Multi-band imaging combines different high-pass and band-pass filters, like the multispectral sensors on satellites. Interference filters can absorb unused light. Antivignetting filters enhance the uniformity of exposure in the picture by decreasing density from the centre to the edge of the image. Polarizing filters transmit only light polarized in one specific direction perpendicular to the image plane, which removes a certain amount of noise and also smooths the wave.
Film Photography
Silver halide grains determine the film's sensitivity to light, which depends on the size, shape and number of grains. An anti-halation layer absorbs light that passes through the emulsion, and the transparent base acts as a support and restricts reflection of light back into the emulsion layers. Resolution decreases as grain size increases. The resulting image depends on the emulsion layers, which may be panchromatic, near-infrared or colour films.

Black and White Films


Black and white films are formed with a panchromatic or near-infrared emulsion layer. Photographs can be taken within the wavelength range of about 300–900 nm. Near-infrared film is used to differentiate deciduous and coniferous trees. In camera film, the silver halide crystals are sensitive to blue, green and red light. When a large amount of reflected light energy from the object is focused onto the silver halide crystals, the bonds between silver and halide become weaker and the crystals are exposed. After exposure of the emulsion layer, a latent image is formed; the developed film is called a negative. To develop a positive print, white light is passed through the emulsion layer of the negative, forming a brighter positive print which is the reversal of the negative.

Figure: (a) Panchromatic film; (b) IR film


Colour Films
Colour films are widely used in remote sensing, since human eyes can differentiate many different shades of colour.

Figure: Working of a colour film

The emulsion of a colour film is made of blue-, green- and red-sensitive layers, in that order. The blue layer responds to blue light, the green layer to green and blue light, and the red layer to red and blue light. A blue-blocking filter of yellow colour is introduced just below the blue layer to stop blue light from reaching the lower layers. The spectral sensitivity of the layers to different colours of light can be read from the sensitivity curves. Obtaining a colour negative is similar to black and white film: after exposure of the silver halide crystals, the silver halide is replaced with suitable dyes according to the colour triangle (yellow dye for the blue layer, magenta for the green layer and cyan for the red layer) in inverse proportion to the light intensity. When white light is passed through the dyes, we see blue, green and red in the image in the absence of the yellow, magenta and cyan dyes respectively, and black where all dyes are present.
Colour IR Films
This film has three emulsion layers: green, red and near-infrared. The green and red layers respond to light as in colour film, while the near-infrared layer is activated only in the NIR region; in these notes, near-infrared is assigned the colour blue. Near-infrared (shown as blue), green and red light, along with white light, pass through the dyes. We see blue, green and red in the print in the absence of the yellow, magenta and cyan dyes respectively, and black where all dyes are present.

Figure: Working of a colour IR film

Digital Photography
A photosensitive solid-state device (SSD) is used to obtain the image in a digital camera. It consists of a 2-D sensor array of CCD (charge-coupled device) or CMOS (complementary metal-oxide-semiconductor) elements, with one photodiode sensing the radiation for one pixel at each site. An electric field proportional to the brightness is produced when radiation coming from the object surface is detected; the field is converted to a voltage, giving the digital brightness value of the pixel. For example, the 22.3 mm × 14.9 mm CMOS sensor of the Canon EOS 100D has nearly 18.5 million photosites (18 megapixels).

Description                  CCD                           CMOS

Light sensitivity            High                          Low
Noise                        Less                          High
Filters                      Alternate blue, green         3 layers (blue, green
                             and red filters               and red)
Colour depth                 8–12 bits
Resampling                   Complex and time              Fast
                             consuming
Photodiodes per photosite    Generally 2                   3 layers

Flash drives are used to store the digital data obtained from the CMOS or CCD sensor. The ISO setting controls the sensitivity to light: sensitivity increases with higher ISO, and the required exposure time decreases. For example, Samsung J7 Pro: ISO 100 to ISO 800; OnePlus 7T: ISO 100 to ISO 1600; Canon D100: ISO 100 to ISO 12800; etc.
Small-format digital cameras used in aerial photography are of the single lens reflex type (frame size similar to 35 mm film). The crop factor is the ratio of the dimensions of the full-frame image to the cropped-sensor image; for example, a crop factor of 1.5 means the full-frame image side is 24 mm while the cropped-sensor side is 16 mm. The field of view of the sensor is also affected by the aspect ratio, the width-to-height ratio.
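A sketch of the crop-factor idea; here it is computed from sensor diagonals against a standard 36 mm × 24 mm full frame (a common convention; the notes use the ratio of linear dimensions, which gives the same result for sensors of equal aspect ratio):

```python
# Crop factor = full-frame diagonal / sensor diagonal.
import math

def crop_factor(sensor_w_mm, sensor_h_mm, full_w_mm=36.0, full_h_mm=24.0):
    """Diagonal ratio of a full-frame sensor to the given sensor."""
    return math.hypot(full_w_mm, full_h_mm) / math.hypot(sensor_w_mm, sensor_h_mm)

# The 22.3 mm x 14.9 mm Canon EOS 100D sensor mentioned above:
print(round(crop_factor(22.3, 14.9), 2))  # 1.61
```

A larger crop factor means a narrower field of view for the same focal length, which matters when planning ground coverage per frame.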

Characteristic        (a) Film Photography           (b) Digital Photography

Data capture          Silver halide crystals         Photosensitive SSD (CCD or CMOS)
Data storage          Photographic film or print     Magnetic, optical, SSD
Data manipulation     Chemical and optical           Digital image processing
                      processing
Data transmission     Mail, fax, delivery service    Computer networks, telemetry,
                                                     telephone and mobile networks,
                                                     internet
Soft copy display     Projected slides               Computer monitor, TV, projector
Hard copy display     Silver halide prints           Inkjet, laser, thermal printers

Advantages of Digital Camera


1. Because of their high sensitivity and linearity, digital cameras can easily capture images within a short time interval.
2. The sensors used in digital cameras can sense a wide range of radiation.
3. They can record GPS information and are compatible with other digital technologies.

Aerial Photography
For aerial photography and remote sensing applications, precision-built cameras are used.
They are designed to capture a large number of images quickly and with the highest level
of geometric fidelity and optical quality.
Several types of aerial camera have been developed:
• Single lens frame cameras
The most commonly used camera features a low-distortion lens, a 230 mm × 230 mm film size, a 120 m cartridge capacity (about 500 photos), automatic shutter control, and roll, pitch and yaw (x, y, z) axis control by gyro-stabilisation.
It has a fixed focal length (90 mm, 152 mm or 200 mm); larger focal lengths are used for high-altitude photography. Viewing angles are measured along the diagonal: normal (< 75 degrees), wide (75 to 100 degrees) and super wide (> 100 degrees). An image motion compensation system is used in the camera to prevent blurring. Vertical photographs have fiducial marks, which define the frame of reference for spatial measurement. The focal length, the distances between fiducial marks and the location of the principal point are calibrated.

• Panoramic Film Camera


Through a narrow slit it images a narrow angular field of view, and ground areas are viewed through a rotating camera lens or prism. The scene is scanned side to side, perpendicular to the flight direction. As the lens rotates, the exposure slit moves along the curved film and covers a large area, resulting in panoramic distortion (the ends are compressed) due to the lack of geometric fidelity.

• Small format digital cameras


These are relatively inexpensive and work with drones, UAVs, paragliders, RC aircraft, kites, etc. Examples are the Canon EOS 5D and Nikon D200.

• Medium format digital cameras


They are similar to small-format cameras and produce colour and colour IR photographs by using filters.

• Large format Digital cameras


This camera consists of 5 CCDs: one with 250 MP and a 230 mm focal length, and the other four with 42 MP and 45 mm focal lengths.
Spatial Resolution of Cameras
It is influenced by the film's resolving power, the camera lens, uncompensated motion error,
and the weather. Standard test chart consists of three equally spaced parallel lines, which is
used to assess resolving power, as the inverse of the centre to centre distance (mm) that can
be known when seen. It is measured in lines/mm or line pairs/mm. Film Resolving power
specify at a specific contrast ratio in the middle of line and their backgrounds (1000:1 to 16:1).
Densitometers are used to scan the images in this pattern.

Photogrammetry
It is the art, science and technology of obtaining reliable information about any physical object or the environment through measurements and other data derived from photographs and recorded radiation. This technology is used to measure distance, area, topography, change detection, maps, etc.

Analog Photogrammetry

•From two photos, we can produce a 3-D model.


•Developed in early 20th century and used between 1930-1970.

Analytical Photogrammetry

•Analytical plotters are used.


•Developed after 1960.
•Film based photograph.

Digital Photogrammetry

•Computers are used for analysis of photographs.


•Developed after 1990.
•Digital cameras are widely used for aerial photogrammetry.

Types of Aerial Photographs

Vertical Photographs
- Square or rectangular.
- Axis of the camera kept as vertical as possible.
- Generally taken with a single lens frame camera.
- Tilt not more than 3 degrees.
- A slightly tilted photograph is still considered vertical because the tilt is small.

Oblique Photographs
- Intentional inclination, obtained by rotating the camera or lens.
- Low oblique (no horizon) and high oblique (includes the horizon).

Attribute             Vertical        Low oblique       High oblique
Optical axis tilt     < 3 degrees     < 30 degrees      More than 30 degrees
Characteristics       No horizon      No horizon        Horizon appears
Coverage              Small area      Larger area       Largest area
Area's shape          Square          Trapezoidal       Trapezoidal
Photograph scale      Invariable      Varies            Varies
Difference with map   Least           More              Greatest
Advantages            Mapping         Reconnaissance    Illustrative

Taking Vertical Photograph


With a frame camera we take photographs vertically along the line of flight. The nadir line is traced directly beneath the aircraft and connects the centres of the vertically taken photographs. Sequential photographs overlap each other by 55–65%, which is called end lap. A photograph should overlap the photographs of the adjacent flight line by 25–30%, called side lap.
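The arithmetic behind these overlap figures can be sketched as follows; the ground footprint value is an assumption for illustration, not from the notes:

```python
# Flight-planning sketch: spacing implied by end lap and side lap.

def air_base(ground_coverage_m, end_lap):
    """Ground distance between consecutive exposure stations."""
    return ground_coverage_m * (1.0 - end_lap)

def flight_line_spacing(ground_coverage_m, side_lap):
    """Ground distance between adjacent flight lines."""
    return ground_coverage_m * (1.0 - side_lap)

G = 2300.0  # side of the ground footprint of one photo, in metres (assumed)
print(round(air_base(G, 0.60), 1))             # 920.0 m at 60% end lap
print(round(flight_line_spacing(G, 0.30), 1))  # 1610.0 m at 30% side lap
```

More end lap means exposure stations closer together, so more photos per strip; the trade-off is film or storage consumption against stereo coverage.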

Stereoscopic Viewing
A person with a single eye cannot perceive depth. Stereoscopic viewing lets us nullify shadows and understand hidden objects.
Viewing photos stereoscopically: from stereo models, we can analyse all three dimensions of an object through changing parallactic angles. In the figure, Φ1 and Φ2 are the parallactic angles and b is the eye base (6.3–6.9 cm). A normal human can recognise a change in parallactic angle of about 3 seconds of arc (1 degree = 60 minutes, 1 minute = 60 seconds).

The 3D view of two consecutive overlapping images is called a stereo overlap and can be viewed with a stereoscope. The intervalometer indicates when the shutter has to be opened or closed to take continuous images. The ground distance between two consecutive optical centres is called the air base. The ratio of air base to flying height determines the vertical exaggeration, and a block is a combination of strips. An index mosaic is a rough arrangement of photographs which provides an index to a particular photograph.
Geometric element of photograph

Point L is the exposure station of the lens. The negative image plane is formed behind the exposure station at a distance equal to the focal length f. The positive photo plane, taken to be the same size as the negative, is located at the same focal length in front of the lens. The (x) and (y) lines are the photo axes, with (x) along the flight direction and (y) perpendicular to it. The origin, or principal point (o), on the image plane corresponds to the ground principal point (O). Ground points A, B, C, D, E appear as a', b', c', d', e' on the negative (reversed) and as a, b, c, d, e on the positive image plane.
Photogrammetric Scale
It relates photo distance to ground distance. It is represented in different forms:
Unit equivalent: 1 mm = 25 m; it means 1 mm on the photo equals 25 m on the ground.
Representative fraction: 1/25,000
Ratio: 1:25,000
Large scales (1:7,000, 1:10,000) cover small areas with detailed information, and small scales (1:50,000, 1:100,000) cover large areas.

S = photo scale = photo distance / ground distance = d / D

From similar triangles,

S = f / H′ = f / (H − h)

where f is the focal length and H′ = H − h is the average flying height above the terrain.

Terrains are not uniform, so we take the average terrain elevation h:

S = f / (H − h)

Photo scale varies across the frame, which is a source of geometric distortion. At constant scale there is no relief displacement and the photograph behaves as an orthographic projection; at varying scale relief displacement occurs and the photograph is a perspective projection.
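The scale formula can be verified with a short calculation; the focal length and flying height below are assumed illustrative values:

```python
# Photo scale S = f / (H - h), with all lengths in metres.

def photo_scale(f_m, H_m, h_m):
    """Scale of a vertical photograph over terrain at elevation h."""
    return f_m / (H_m - h_m)

# Assumed: f = 152 mm, flying height H = 3040 m, terrain at datum (h = 0):
S = photo_scale(0.152, 3040.0, 0.0)
print(f"1:{round(1 / S)}")  # 1:20000
```

Raising the terrain elevation h shrinks H − h and makes the scale larger, which is exactly why scale varies from point to point over non-uniform terrain.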
Area Measurement
Accuracy of measurement depends on the device, the scale and relief displacement; area can be measured with simple scales, numerical methods, grids and GIS. With simple scales:

(i) for a rectangular shape, ground area = (photo length × photo width) / S²

(ii) for an irregular shape, the ground area is obtained by numerical methods such as the dot grid

The numerical method is generally used for irregular shapes, by counting grid cells on an overlaid transparent graph sheet. It is also called the dot grid method. For example, we capture an area and overlay a grid on a transparent sheet; the area on the photo is then the number of grid cells × the cell area.
Example: the area covered on the photo contains 129 dots, dot density: 25 dots per cm², photo scale 1:20,000. Ground area covered = ?
Solution:
Ground area = (129/25) × (20,000)² = 2064 × 10⁶ cm² = 20.64 ha
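The dot-grid example above can be reproduced directly:

```python
# Dot-grid area: photo area (cm^2) scaled up by the square of the scale denominator.

def ground_area_ha(dots, dots_per_cm2, scale_denominator):
    photo_area_cm2 = dots / dots_per_cm2              # area measured on the photo
    ground_cm2 = photo_area_cm2 * scale_denominator**2
    return ground_cm2 / 1e8                           # 1 ha = 1e8 cm^2

print(round(ground_area_ha(129, 25, 20000), 2))  # 20.64
```

Note that the scale denominator enters squared, since area scales with the square of linear scale.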
Because manual counting is tedious, digitising tablets make it easier to count the grid cells. We can also digitise directly on a soft copy on the computer, which is called heads-up or on-screen digitising; its zooming capabilities make it easy to rectify errors.

Ground Coordinates from a vertical photograph

In the above figure,


H: flying height of the camera
A, B: ground points
a, b: image points
xa,ya & xb,yb: photographic coordinates
Xa,Ya & Xb,Yb: ground coordinates
From similar triangles Lao and LAA₀,

oa / A₀A = f / (H − hA) = xa / XA

so

XA = (H − hA) · xa / f

Similarly,

YA = (H − hA) · ya / f

XB = (H − hB) · xb / f

YB = (H − hB) · yb / f
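A minimal sketch of these ground-coordinate formulas; the photo coordinates and heights below are assumed values, not from the notes:

```python
# Ground coordinates from a vertical photograph:
# X = (H - h) * x / f,  Y = (H - h) * y / f   (consistent length units).

def ground_coords(x_mm, y_mm, f_mm, H_m, h_m):
    """Convert photo coordinates (mm) to ground coordinates (m)."""
    metres_per_photo_metre = (H_m - h_m) / (f_mm / 1000.0)
    return (x_mm / 1000.0) * metres_per_photo_metre, \
           (y_mm / 1000.0) * metres_per_photo_metre

# Assumed: f = 152 mm, H = 2000 m, h = 500 m, photo point at (50 mm, 40 mm):
X, Y = ground_coords(50.0, 40.0, 152.0, 2000.0, 500.0)
print(round(X, 1), round(Y, 1))  # 493.4 394.7
```

The multiplier (H − h)/f is just the inverse of the photo scale at that point, so points at different elevations use different multipliers.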
Relief Displacement of vertical photographs

The shift of a point's position in the photograph is called relief displacement. It is caused by elevation differences, the viewing angle, the flying height and the location of the point or object relative to the principal point. The greater the object's height, the greater its radial displacement, so tall objects appear as if partly viewed from the side.

Geometric Component of Relief Displacement

From the above figure:


H – flying height above datum
f – focal length
h – object's height
d – object's relief displacement on the image
D – equivalent relief displacement of the object, projected to the datum
r – radial distance from the principal point to the top of the object on the image
R – equivalent radial distance projected onto the ground reference plane
From similar triangles,

D / h = R / H

d / h = r / H

d = r·h / H

Height of object: h = d·H / r
We worked through an example in class:
Q. Flying height of the aircraft: 2500 m; relief displacement: 5 mm; distance from the principal point to the top of the object: 100 mm. Height of the object on the ground?
Solution:
Height = d·H / r = 5 × 2500 / 100 = 125 m
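The relief-displacement relations above, reproducing the class example:

```python
# Relief displacement d = r*h/H and object height h = d*H/r
# (d and r in the same units, e.g. mm; h and H in the same units, e.g. m).

def object_height(d_mm, H_m, r_mm):
    """Object height from its relief displacement on the photo."""
    return d_mm * H_m / r_mm

def relief_displacement(r_mm, h_m, H_m):
    """Relief displacement on the photo for an object of height h."""
    return r_mm * h_m / H_m

h = object_height(d_mm=5.0, H_m=2500.0, r_mm=100.0)
print(h)  # 125.0
```

The two functions are inverses of each other: substituting the computed height back gives the original 5 mm displacement.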

Correction for relief displacement


The ground reference plane is set at the average elevation, because relief displacement changes at every point on the image plane. Points A and B have to be shifted to A', B' on the datum plane and to a', b' on the image plane respectively. C is already on the datum, so there is no need to shift c on the image plane. The terrain can then be corrected on the basis of the radial relief displacement at each image point.
