
imageprocessing

The document provides definitions and explanations of various concepts in image processing, including electromagnetic spectrum, image acquisition, enhancement, segmentation, and recognition. It also discusses the applications of different electromagnetic spectrum bands in medical, astronomical, and industrial fields, as well as the effects of sampling and quantization on image quality. Additionally, it outlines the main applications and sources of digital images, along with the fields and types of processes involved in digital image processing.

Uploaded by

mahdi.fouad.c1

Image Processing Questions

Q-1 Define the following:

1. Electromagnetic Spectrum (EMS): when a beam of sunlight is passed through a glass prism, the emerging beam of light is not white but consists instead of a continuous spectrum of colors ranging from violet at one end to red at the other.

2. Low-Level processes: a type of computerized process that involves primitive operations such as image preprocessing to reduce noise, contrast enhancement, and image sharpening.

3. High-Level processes: processes such as acquiring an image of text, preprocessing it, extracting the individual characters, describing the characters in a form suitable for computer processing, and recognizing those individual characters; all of this is in the scope of digital image processing.

4. Mid-Level processes: a type of computerized process that involves segmentation (partitioning an image into regions), description of objects to reduce them to a form a computer can process, and classification of objects.

5. Relative sensitivity of the average human eye: the sensitivity of the average human eye to electromagnetic waves (EMW) varies with wavelength. The portion of the EM spectrum to which the eye is sensitive is called visible light.

6. Image acquisition: Digital imaging or digital image acquisition is the creation of a representation
of the visual characteristics of an object, such as a physical scene or the interior structure of an
object. The term is often assumed to imply or include the processing, compression, storage, printing,
and display of such images.

7. Image Enhancement: one of the simplest and most appealing areas of digital image processing; it aims to bring out detail that is obscured, or simply to highlight certain features of interest in an image.

8. Image segmentation: one of the most difficult tasks in digital image processing; its procedures partition an image into its constituent parts or objects.

9. Image Recognition: the process that assigns a label (e.g., vehicle) to an object based on its descriptors.

10. Pattern Recognition: the process that assigns a label (e.g., vehicle) to an object based on its descriptors.

11. Color Image Processing: an area that has been gaining in importance because of the significant increase in the use of digital images over the Internet. Color is also used as the basis for extracting features of interest in an image.

12. Digital image: a two-dimensional function f(x, y), where x and y are spatial coordinates. The amplitude of f at any pair of coordinates (x, y) is called the intensity or gray level of the image at that point. When x, y, and the values of f are all finite, discrete quantities, we call the image a digital image.
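The definition above can be made concrete with a small NumPy sketch (the array values are an arbitrary illustration):

```python
import numpy as np

# A digital image: a 2-D function f(x, y) whose coordinates and
# amplitudes are all finite, discrete quantities (here 8-bit values).
f = np.array([
    [  0,  64, 128, 255],
    [ 32,  96, 160, 224],
    [ 64, 128, 192, 255],
    [  0,  32, 128, 192],
], dtype=np.uint8)

print(f.shape)    # (4, 4): M x N spatial samples
print(f[0, 3])    # 255: the gray level (amplitude of f) at (0, 3)
```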

13. Pixel: the unit of image size; each value of a pixel is called a gray level.

14. Gray Level: the discrete intensity value of a pixel. Increasing the number of image points and the number of gray levels leads to a larger image file and longer computation times; 256 gray levels are encoded with 8 bits, and 256 gray levels are what a typical computer screen provides. Today, display devices are pushed to distinguish 1024, 4096, or more levels.

15. Grayscale: a group of shades without any visible color. Each pixel of a grayscale image carries an amount of light, ranging from the weakest amount of light (black) to the strongest amount of light (white). Grayscale contains only brightness information, not color.

16. Grayscale image: each pixel is a shade of grey, normally from 0 (black) to 255 (white). This range means that each pixel can be represented by eight bits, or exactly one byte.

17. A grayscale image is also known as an 8-bit image.

18. Binary image: each pixel is just black or white. Since there are only two possible values for each pixel, we need only one bit per pixel, so such images can be very efficient in terms of storage. Images for which a binary representation may be suitable include text (printed or handwritten), fingerprints, and architectural plans.
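Since one bit per pixel suffices, eight binary pixels fit in a single byte; a minimal NumPy sketch of the storage saving:

```python
import numpy as np

# A tiny 2 x 8 binary image (one bit of information per pixel).
binary = np.array([
    [1, 0, 1, 1, 0, 0, 1, 0],
    [0, 1, 1, 0, 1, 0, 0, 1],
], dtype=np.uint8)

# Stored as uint8 it occupies 16 bytes, but packed it needs only 2 bytes.
packed = np.packbits(binary)
restored = np.unpackbits(packed).reshape(binary.shape)

print(binary.nbytes, packed.nbytes)   # 16 2
```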

19. Gray image: an image in which each pixel is a shade of grey, normally from 0 to 255. This range means that each pixel can be represented by eight bits, or exactly one byte.

20. RGB image: an image in which each pixel has a particular color, described by the amounts of red, green, and blue in it. If each of these components has a range of 0 to 255, this gives a total of 256³ = 16,777,216 possible colors.

21. Radiance: the total amount of energy that flows from the light source; it is usually measured in watts (W).

22. Brightness: a subjective descriptor of light perception that is practically impossible to measure. It embodies the achromatic notion of intensity and is one of the key factors in describing color sensation.

23. Luminance: measured in lumens (lm), gives a measure of the amount of energy an observer perceives from a light source.

24. Image Formation Model: the process in which three-dimensional (3D) scene points are projected
into two-dimensional (2D) image plane locations, both geometrically and optically

25. Image Sampling: the process of digitizing the coordinate values of an image.

26. Image Quantization: the process of digitizing the amplitude values of an image.

27. Spatial Resolution: one of the main parameters of the digitization; it represents the number of samples in the grid.

28. Intensity Resolution: the smallest discernible change in intensity level.

29. Image quality: the quality of an image increases as the resolution and the bits per pixel increase.

30. Image resolution: one of the main parameters of the digitization; it represents the number of samples in the grid.

31-Morphological processing: deals with tools for extracting image components that are useful in the
representation and description of shape.

32. Image element: some value in the image matrix. Also called picture element, pixel, or pel.

33. Number of intensity levels (L): the number of distinct gray levels, typically an integer power of two: L = 2^k, where k is the number of bits per pixel.

34. No. of bits (b): the number of bits required to store a digitized M × N image with k bits per pixel is b = M × N × k.
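The formula b = M × N × k can be checked with a quick computation (the image size here is just an example):

```python
# Bits needed to store a digitized M x N image with k bits per pixel.
def storage_bits(M, N, k):
    return M * N * k

# A 1024 x 1024 image with 256 gray levels needs k = 8 bits per pixel:
b = storage_bits(1024, 1024, 8)
print(b)        # 8388608 bits
print(b // 8)   # 1048576 bytes (1 MiB)
```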

35-Edges of the image: Edges represent the boundaries between two different regions within an
image. More precisely, the edge of an object within a digital image is the set of contiguous pixels
where an abrupt change of intensity occurs.

36. Smooth region in the image: a region that has low frequency content in the image (less noise and less pixelation).

37. False contouring: ridge-like structures that appear in smooth areas (of constant or nearly constant intensity) of a digital image when an insufficient number of intensity levels is used; so called because the ridges resemble topographic contours in a map.

38. Blockiness: the checkerboard (blocking) effect in which individual pixels become visible as blocks when an image is represented with too few spatial samples.

39. Isopreference curve: a curve in the Nk-plane connecting points that correspond to images of equal subjective quality. In experiments where observers were asked to rank a group of images with different values of N and k according to their subjective quality, the isopreference curves tended to shift right and upward. The key point of interest in the present context is that isopreference curves tend to become more vertical as the detail in the image increases.

40-NK-plane: Each point in the Nk-plane represents an image having values of N and k equal to the
coordinates of that point.

42. Low level of detail: an image containing large smooth areas and relatively little detail (e.g., a portrait of a face).

43. Medium level of detail: an image with an intermediate amount of detail (e.g., the cameraman image).

44. High level of detail: an image containing a large amount of detail (e.g., a crowd scene).

45. Magnitude of the Fourier transform (FT), |F(u)|: the magnitude (spectrum, modulus) of the Fourier transform; it is used to measure how strong the change in image intensity is.

46. Power Spectrum of the FT, |F(u)|²: the power spectrum of the Fourier transform, also called the intensity; it shows power as the mean squared amplitude at each frequency but includes no phase information.
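Both quantities are easy to compute for a sampled signal; a NumPy sketch (the cosine test signal is an arbitrary choice):

```python
import numpy as np

# A pure cosine at frequency index 4, sampled at 64 points.
n = np.arange(64)
f = np.cos(2 * np.pi * 4 * n / 64)

F = np.fft.fft(f)
magnitude = np.abs(F)        # |F(u)|: the spectrum (modulus)
power = magnitude ** 2       # |F(u)|^2: power spectrum, phase discarded

# All the energy sits at frequency bins 4 and 64 - 4 = 60.
print(np.argmax(magnitude[:32]))   # 4
```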
49. Rotation Property of FT: rotating f(x, y) by an angle θ rotates its Fourier transform F(u, v) by the same angle, and vice versa.

50. Parseval's formula: the integral of the power spectrum (squared modulus) of a function in the spatial domain equals the integral of the power spectrum (squared modulus) of its transform in the frequency domain.

51. Convolution process: convolution is a mathematical operation on two functions (f and g) that produces a third function expressing how the shape of one is modified by the other. The term convolution refers both to the result function and to the process of computing it. It is defined as the integral of the product of the two functions after one is reversed and shifted.
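"Reversed and shifted" is exactly what discrete convolution does; convolving an impulse with a kernel reproduces the kernel (a minimal NumPy sketch):

```python
import numpy as np

f = np.array([0., 0., 1., 0., 0.])   # unit impulse at the center
g = np.array([1., 2., 3.])           # arbitrary kernel

# np.convolve flips g, slides it across f, and sums the products.
result = np.convolve(f, g, mode='same')
print(result)                         # [0. 1. 2. 3. 0.]
```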

52. Auto-correlation function: Auto-correlation, also known as serial correlation, is the correlation of
a signal with a delayed copy of itself as a function of delay. Informally, it is the similarity between
observations as a function of the time lag between them.
53. Cross-Correlation function: a measure of the similarity between two waveforms as a function of their relative delay.
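A sketch of both functions with NumPy: the auto-correlation of a signal peaks at zero lag, and the cross-correlation with a delayed copy peaks at the delay (the test signal is an arbitrary choice):

```python
import numpy as np

f = np.array([0., 1., 2., 1., 0.])
g = np.roll(f, 1)                    # f delayed by one sample

lags = np.arange(-len(f) + 1, len(f))

# Auto-correlation: correlate f with itself; maximum at zero delay.
auto = np.correlate(f, f, mode='full')
print(lags[np.argmax(auto)])         # 0

# Cross-correlation: the lag of the peak recovers the one-sample delay.
cross = np.correlate(g, f, mode='full')
print(lags[np.argmax(cross)])        # 1
```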

54. Blurring function (impulse function): the impulse function may depend on the position of each element (pixel) in the image.

55. Space Variant PSF (SVPSF): Space Variant Point Spread Function: a PSF that changes its shape with the position of the pixel in the image.

56. Space Invariant Point Spread Function: a PSF that does not change its shape with the position of the pixel in the image.

57. Linear systems: characterized by how they respond to impulses, that is, by their impulse responses. As you should expect, the output image from a linear system is equal to the input image convolved with the system's impulse response.

58. Point Spread Function (PSF): determines the energy distribution in the image plane for a point source located in the object plane.

59. The optical transfer function (OTF): a central concept in Fourier optics. For each component of spatial frequency in the object intensity, it determines the strength and phase of the corresponding component in the image.


Contrast of the image: the difference in luminance or color that makes an object distinguishable.

Histogram of the image: a plot of the number of pixels at each intensity level; it is the basis for numerous spatial domain processing techniques.



Probability density function of the image: the PDF specifies the probability of the random variable (intensity) falling within a particular range of values, rather than taking on any one value.


Q-2 Answer the following:

1. What is the aim of image processing?

Interest in digital image processing methods stems from two principal application areas:

1. Improvement of pictorial information for human interpretation.

2. Processing of image data for storage, transmission, and representation.

2. What are the major bands of the EMS applied for medical applications?

The common applications of DIP in the medical field are:

Gamma-ray imaging

PET scan

X-ray imaging

Medical CT

UV imaging

3. What are the major bands of the EMS applied for astronomical applications?

Gamma-Ray Imaging

X-ray Imaging

Imaging in the Ultraviolet Band

Imaging in the Infrared Band

4. What are the major bands of the EMS applied for industrial applications?

Imaging in the Ultraviolet Band

Imaging in the Infrared Band

Acoustic imaging

Electron imaging

X-ray Imaging

5. What is the main application of acoustic waves in image processing?

Imaging using "sound" finds application in:

1. Geology (mineral and oil exploration)

2. Industry (manufacturing)

3. Medicine

6. What are the basic concepts in sampling and quantization?

The sampling rate determines the spatial resolution of the digitized image, while the quantization level determines the number of grey levels in the digitized image. The magnitude of the sampled image is expressed as a digital value in image processing.
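The quantization side of this can be sketched in a few lines: requantizing a smooth 8-bit ramp to 2 bits leaves only four gray levels, producing the abrupt banding described under false contouring (the helper name is mine):

```python
import numpy as np

# Requantize an 8-bit image to k bits, keeping the sampling grid fixed.
def requantize(img, k):
    step = 256 // (2 ** k)           # width of each quantization bin
    return (img // step) * step      # snap each pixel to its bin's base

ramp = np.arange(256, dtype=np.uint8)   # smooth gradient 0..255
coarse = requantize(ramp, 2)            # only 2^2 = 4 gray levels survive

print(np.unique(coarse))             # [  0  64 128 192]
```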

7. What are the geometric spatial transformations?

A spatial transformation of an image is a geometric transformation of the image coordinate system.

It is often necessary to perform a spatial transformation to:

• Align images that were taken at different times or with different sensors.

• Correct images for lens distortion.

• Correct effects of camera orientation.

• Perform image morphing or other special effects.

8. What is the effect of reducing the number of intensity levels while keeping the number of samples constant?

It results in what is known as false contouring.

9. What is the effect of reducing the number of samples while keeping the number of intensity levels constant?

The image resolution will be lower than the original.
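The sampling side can be sketched the same way: dropping samples lowers spatial resolution while the intensity range stays at 8 bits (the stride of 4 is an arbitrary choice):

```python
import numpy as np

img = np.arange(64, dtype=np.uint8).reshape(8, 8)   # 8 x 8 test image

# Keep every 4th sample in each direction: 8 x 8 -> 2 x 2.
down = img[::4, ::4]

print(img.shape, down.shape)   # (8, 8) (2, 2)
print(down.dtype)              # uint8: intensity levels unchanged
```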

11. Compare the resolving power of the human eye with that of a CCD camera.

In terms of raw resolving power, a Charge-Coupled Device (CCD) imaging chip of medium resolution can pack this number of elements into a receptor array no larger than 5 × 5 mm². The ability of humans to integrate intelligence and experience with vision makes such numerical comparisons somewhat superficial. Keep in mind for future discussions that the basic ability of the eye to resolve detail is certainly comparable to that of current electronic imaging sensors.

Q-3 Write an essay about:

1. Application of each band in electromagnetic spectrum in image processing fields.

2. Effects of: Sampling and Quantization, in the image.

An image function f(x,y) must be digitized both spatially and in amplitude in order to become suitable for digital processing. Typically, a frame grabber or digitizer is used to sample and quantize the analogue video signal. Therefore, in order to create a digital image, we need to convert continuous data into digital form. This conversion from analog to digital involves two processes:

Sampling (digitization of coordinate values).

Quantization (digitization of amplitude values).

3. Image Processing, including: Definition, Purpose, and Application.

Image processing is a method of processing images using mathematical operations, as in signal processing, where the input is an image, a group of images, or a video. This is done in order to extract a part of the image or to obtain some important information from it. The output of image processing may be an image, some part of an image, or a specific set of parameters related to the image.

In this process the system treats the image as a two-dimensional signal while applying preset signal-processing techniques to it. Images can also be processed as three-dimensional signals, with the time axis as the third dimension.

Interest in image processing methods stems from two principal application areas:

1. improvement of pictorial information for human interpretation; and

2. processing of image data for storage, transmission, and representation.

Applications of Digital Image Processing


Image processing has an enormous range of applications; almost every area of science and
technology can make use of image processing methods.

Here is a short list just to give some indication of the range of image processing applications:

Image sharpening and restoration

Medical field

Remote sensing

Transmission and encoding

Machine/Robot vision

Color processing

Pattern recognition

Video processing

Microscopic imaging

and other applications

Q-4 Answer the following:

1. Derive the Image Formation Model.
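Item 1 can be sketched with the standard illumination-reflectance model (the symbols i, r, and the bounds are the conventional textbook ones):

```latex
% An image f(x,y) is formed by illumination falling on a scene and being
% reflected toward the sensor, so f factors into two components:
f(x,y) = i(x,y)\, r(x,y)
% i(x,y): illumination incident on the scene, bounded by
0 < i(x,y) < \infty
% r(x,y): reflectance of the scene objects, bounded by
0 \le r(x,y) \le 1
% where r = 0 means total absorption and r = 1 means total reflectance.
% The gray level \ell at any point then lies in a finite interval:
L_{\min} \le \ell \le L_{\max},
\qquad L_{\min} = i_{\min}\, r_{\min},
\qquad L_{\max} = i_{\max}\, r_{\max}
```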

2. Prove that convolution in the spatial domain is equivalent to simple multiplication in the frequency domain.
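Item 2, for the 1-D continuous case (the 2-D case repeats the argument in each variable), can be sketched as:

```latex
% Definitions: F(u) = \int f(x)\, e^{-j 2\pi u x}\, dx and
% (f * g)(x) = \int f(\tau)\, g(x - \tau)\, d\tau.
\mathcal{F}\{f * g\}(u)
  = \int_{-\infty}^{\infty} \left[ \int_{-\infty}^{\infty}
      f(\tau)\, g(x - \tau)\, d\tau \right] e^{-j 2\pi u x}\, dx
% Swap the order of integration and substitute s = x - \tau (dx = ds),
% so e^{-j 2\pi u x} = e^{-j 2\pi u \tau} e^{-j 2\pi u s}:
  = \int_{-\infty}^{\infty} f(\tau)\, e^{-j 2\pi u \tau}
    \left[ \int_{-\infty}^{\infty} g(s)\, e^{-j 2\pi u s}\, ds \right] d\tau
% The inner integral is G(u); the remaining integral over \tau is F(u):
  = F(u)\, G(u)
```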
Q-5 List only the:

1. Main applications of Digital Image Processing

Image sharpening and restoration


Transmission and encoding
Machine/Robot vision
Color processing
Pattern recognition
Video processing
Microscopic Imaging

2. Sources of Digital images

Camera

Photographs

Photographic film

Printed paper

Scanner or similar device

3. Fields of Digital image processing.

Image sharpening and restoration

Medical field

Remote sensing

Transmission and encoding

Machine/Robot vision

Color processing

Pattern recognition

Video processing

Microscopic Imaging

4. Type of processes in Digital Image processing.

1. Low level processes

2. Mid-level processes
3. High level processes

5. Fundamental steps in Digital Image Processing.

Image acquisition
Image enhancement
Image restoration
Color image processing
Wavelets and multiresolution processing
Compression
Morphological processing
Segmentation
Representation and description
Object recognition

6. Types of Digital images.

1. Binary image

2. Greyscale image

3. True color, or RGB image

7. Types of Neighbors

1. Diagonal neighbors ND(P): these neighbors are 4 pixels.

2. 4-neighbors N4(P): these neighbors are 4 pixels.

3. 8-neighbors N8(P): these neighbors are 8 pixels: N8(P) = ND(P) ∪ N4(P).
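The three neighbor sets can be written down directly (coordinates here are unbounded; a real image would also clip neighbors at its borders):

```python
# Neighbor sets of a pixel p at coordinates (x, y).
def n4(x, y):
    # 4-neighbors: the horizontal and vertical neighbors of p
    return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}

def nd(x, y):
    # Diagonal neighbors: the four corner neighbors of p
    return {(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)}

def n8(x, y):
    # 8-neighbors: the union N8(P) = ND(P) U N4(P)
    return n4(x, y) | nd(x, y)

print(len(n4(5, 5)), len(nd(5, 5)), len(n8(5, 5)))   # 4 4 8
```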

10. Reasons for Blurring

Defects of the lenses

Relative motion between object and optical system

Medium between object and optical system

Exposure time

11. Reasons for Noise

1. Low-light situations impact the signal-to-noise ratio

2. Air quality can affect the signal-to-noise ratio

3. Low lighting

4. A high ISO setting is the most common contributor to image noise


16. List the main parameters of the digitization.

1. Image resolution ("spatial resolution"): the number of samples in the grid.

2. Pixel accuracy ("gray-level resolution"): how many bits per sample are used.

27. What are the basic quantities used to describe the quality of an achromatic light source?

Radiance

Luminance

Brightness

What are the main parameters of the digitization?

1. Image resolution ("Spatial Resolution"): the number of samples in the grid.

2. Pixel accuracy ("Gray-Level Resolution"): how many bits per sample are used.

39. What are the types of image sources?

1- EM Waves

a- γ-rays  b- X-rays  c- UV

d- Visible  e- IR

f- Microwaves

2- Sound Waves

a- Sonic

b- Ultrasonic

3- Electron beam

4- Magnetic Waves

40. What are the types of quantization?

There are two types of quantization:

1- Uniform Quantization

2- Non-Uniform Quantization

41. What are the types of sampling?

There are two types of sampling:

1- Uniform Sampling

2- Non-Uniform Sampling
