UNIT III – SENSORS AND MACHINE VISION

SENSORS
1. Linear / Rotational Displacement
• Linear / rotary variable differential transformer (LVDT / RVDT)
• Optical encoder
• Electrical tachometer
• Gyroscope
2. Proximity
• Inductance
• Eddy current
• Hall effect
• Photoelectric
3. Force, Torque and Pressure
• Strain gauge
• Dynamometer / load cell
• Piezoelectric load cells
• Tactile sensors
• Ultrasonic stress sensors
4. Velocity and Acceleration
• Tacho generators
• Electromagnetic
• Ultrasonic
• Resistive sensors
• Capacitance
• Piezoelectric
• Photoelectric
5. Flow
• Pitot tube
• Orifice plate
• Flow nozzle
• Venturi tubes
• Rotameter
• Ultrasonic flow meter
• Turbine flow meter
• Electromagnetic flow meter
7. Temperature
• Thermistors
8. Light
• Photoresistors
• Photodiodes
• Phototransistors
• Photoconductors
• Charge-coupled device (CCD)
Touch sensors
• Locating the objects
• Recognizing the object type
• Force and torque control needed for task
manipulation
TYPES:
1. Binary sensors:
Detect the presence of the object to be handled (e.g., a limit switch).
2. Analog sensors:
Produce an output signal proportional to the force exerted.
Binary sensors
Analog sensors
TACTILE SENSOR
Displacement, position and
proximity sensors
Eddy current proximity sensors
• When an alternating current is passed through this coil, an alternating magnetic field is generated.
• If a metal object comes into close proximity of the coil, eddy currents are induced in the object by the magnetic field.
• These eddy currents create their own magnetic field, which distorts the magnetic field responsible for their generation.
• As a result, the impedance of the coil changes, and so does the amplitude of the alternating current. This can be used to trigger a switch at some pre-determined level of change in current.
Inductive proximity switch
• An inductive proximity sensor has four components: a coil, an oscillator, a detection circuit and an output circuit.
• An alternating current is supplied to the coil, which generates a magnetic field. When a metal object comes close to the end of the coil, the inductance of the coil changes.
• This is continuously monitored by a circuit that triggers a switch when a preset change in inductance occurs, as sketched below.
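Both the eddy-current and inductive proximity sensors reduce to the same signal-processing step: monitor a measured quantity (current amplitude or inductance) and trip the switch when it deviates from its baseline by more than a preset amount. A minimal sketch of that trigger logic in Python; the baseline, threshold and sample readings below are illustrative assumptions, not values from any particular sensor:

```python
def proximity_triggered(samples, baseline, threshold):
    """Return True as soon as the monitored quantity (e.g. coil current
    amplitude or inductance) deviates from its baseline by more than the
    preset threshold -- the condition that trips the proximity switch."""
    for value in samples:
        if abs(value - baseline) > threshold:
            return True
    return False

# Hypothetical current amplitudes sampled as a metal object approaches the coil
readings = [1.00, 0.99, 0.97, 0.82, 0.70]
print(proximity_triggered(readings, baseline=1.00, threshold=0.15))   # True
```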
Optical encoders
• Three light sensors are employed to detect the light passing through the holes. These sensors produce electric pulses which give the angular displacement of the mechanical element.
• The inner track has just one hole, which is used to locate the 'home' position of the disc. The holes on the middle track are offset from the holes of the outer track by one-half of the width of a hole (see the decoding sketch below).
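Because the two tracks are offset by half a hole width, their pulse trains are in quadrature: which channel leads tells the direction of rotation, and counting transitions gives the displacement. A minimal decoding sketch under that assumption; the channel names A and B and the sample pulse sequence are illustrative:

```python
def decode_quadrature(states):
    """Count net encoder steps from a sequence of (A, B) logic levels read
    from the two offset tracks; the sign of each step gives the direction."""
    order = [(0, 0), (1, 0), (1, 1), (0, 1)]   # one full quadrature cycle
    position = 0
    prev = states[0]
    for curr in states[1:]:
        if curr == prev:
            continue                            # no transition on this sample
        step = (order.index(curr) - order.index(prev)) % 4
        if step == 1:
            position += 1                       # A leads B: forward
        elif step == 3:
            position -= 1                       # B leads A: reverse
        prev = curr
    return position

# Hypothetical samples covering one full forward quadrature cycle
seq = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]
print(decode_quadrature(seq))   # 4 steps forward
```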
Pneumatic Sensors
Proximity Switches
LED based proximity sensors
Robotic Vision System
• Robot vision may be defined as the process of
extracting, characterizing and interpreting information
from images of a three-dimensional world
• The process can be divided into the following principal
areas
I. Sensing
II. Preprocessing
III. Segmentation
IV. Description
V. Recognition
VI. Interpretation
Vision System and Identification of
Objects
• Vision system is concerned with the sensing of
vision data and its interpretation by a computer
• The typical vision system consists of the camera
and digitizing hardware, a digital computer and
hardware & software necessary to interface them
• The operation of the vision system consists of the
following functions:
• (a) Sensing and digitizing image data;
• (b) Image processing and Analysis;
• (c) Application
Vision system
Two-dimensional or three-dimensional model of the scene
• Images are classified according to their gray levels:
1. Binary Image
2. Gray Image
3. Color Image
Vision System – Stages
• Analog to digital conversion
• Remove noise
• Find regions (objects) in space
• Take measurements (relationships)
• Match the description with similar descriptions of known objects
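A compact sketch of these stages applied to one grayscale frame, assuming NumPy and SciPy are available; the threshold value is an illustrative placeholder, and matching against known objects is left to a later step:

```python
import numpy as np
from scipy import ndimage

def process_frame(frame, threshold=128):
    """Digitized frame -> remove noise -> find regions -> take measurements."""
    smoothed = ndimage.median_filter(frame, size=3)    # remove noise
    binary = smoothed > threshold                      # separate objects from background
    labels, n_regions = ndimage.label(binary)          # find regions (objects) in space
    areas = ndimage.sum(binary, labels, range(1, n_regions + 1))   # measurements
    return labels, areas   # descriptions to be matched with known objects
```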
Block Diagram Of Vision System
(Diagram: lighting, frame grabber, and interface (I/F) to the processing computer)
Function 1: Sensing and digitizing image data
Typical techniques & applications:
• Signal conversion: sampling, quantization, encoding
• Image storage / frame grabber
• Lighting: structured light, front/back lighting, beam splitter, retro-reflectors, specular illumination, other techniques

Function 2: Image processing and analysis
Typical techniques & applications:
• Data reduction: windowing, digital conversion
• Segmentation: thresholding, region growing, edge detection
• Feature extraction: descriptors
• Object recognition: template matching, other algorithms

Function 3: Applications
• Inspection
• Identification
• Visual servoing and navigation
The Image and Conversion
• The image presented to a vision system's camera is light, nothing more (varying in intensity and wavelength).
• The designer must ensure that the pattern of light presented to the camera is one that can be interpreted easily.
• The designer must ensure that the image the camera sees has minimum clutter.
• The designer must ensure that extraneous light (sunlight, etc.) that might affect the image is blocked.
• Conversion
• Light energy is converted to electrical energy.
• The image is divided into discrete pixels.
• Note: a color camera can be considered as three separate cameras, one for each basic color.
• The best portion of the image is produced by light passing through the lens along the lens axis.
The image sensor collects light from the scene through a lens and, using a photosensitive target, converts it into an electronic signal.
Processing of Image
• An analog-to-digital converter is used to convert the analog voltage of each pixel into a digital value.
• In a binary system, the voltage level for each pixel is given as either 0 or 1, depending on a threshold value.
• A grey-scale system, on the other hand, assigns up to 256 different values to each pixel depending on its intensity.
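A minimal sketch of the two digitization schemes just described, assuming an 8-bit grey-scale range of 0–255; the sensor voltages and the threshold of 128 are illustrative choices:

```python
import numpy as np

def quantize_grayscale(voltages, v_max=1.0):
    """Map each pixel's analog voltage (0..v_max) to one of 256 grey levels."""
    return np.clip(np.round(voltages / v_max * 255), 0, 255).astype(np.uint8)

def binarize(gray, threshold=128):
    """Binary system: each pixel becomes 1 or 0 depending on the threshold."""
    return (gray >= threshold).astype(np.uint8)

# Hypothetical 2x2 block of pixel voltages from the sensor
v = np.array([[0.10, 0.45],
              [0.60, 0.95]])
g = quantize_grayscale(v)   # grey levels, e.g. [[ 26 115] [153 242]]
b = binarize(g)             # binary image    [[0 0] [1 1]]
```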
Image digitization
Robot vision
Object ranging
Ranging
• Humans have stereo vision capabilities.
• Depth information can be obtained from
binocular vision.
• Stereo vision has become an important part of computer vision.
• Stereo vision can be used by a robot to detect the distance between itself and the target (object).
• Parallel computing has been used over the past two decades for quickly processing images.
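For a calibrated parallel-axis stereo pair, depth follows directly from the disparity of a matched point between the left and right images. A small sketch of that relation; the focal length, baseline and disparity values below are illustrative assumptions:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth Z = f * B / d for a parallel-axis stereo pair:
    f - focal length in pixels, B - baseline between the cameras in metres,
    d - disparity (horizontal shift of the matched point) in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_m / disparity_px

# Hypothetical values: 700 px focal length, 0.12 m baseline, 35 px disparity
print(depth_from_disparity(700, 0.12, 35))   # 2.4 metres to the object
```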
Applications of Vision-Controlled Robotic Systems
Stages of Machine Vision
Feature extraction
Feature extraction involves finding features
of the segmented image.
1. Area.
2. Perimeter.
3. Center of mass.
4. Compactness.
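A sketch of computing these four features from a binary (segmented) object mask. It assumes object pixels are marked 1, counts boundary pixels for the perimeter, and uses compactness = perimeter² / area, one common convention among several:

```python
import numpy as np

def region_features(mask):
    """Area, perimeter, centre of mass and compactness of a binary region."""
    mask = mask.astype(bool)
    area = int(mask.sum())                        # 1. area: number of object pixels
    ys, xs = np.nonzero(mask)
    center_of_mass = (ys.mean(), xs.mean())       # 3. centroid as (row, column)

    # 2. perimeter: object pixels with at least one 4-connected background neighbour
    padded = np.pad(mask, 1)
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    perimeter = int((mask & ~interior).sum())

    compactness = perimeter ** 2 / area           # 4. smaller = more compact/circular
    return area, perimeter, center_of_mass, compactness

# Hypothetical 5x5 mask containing a 3x3 square object
m = np.zeros((5, 5), dtype=int)
m[1:4, 1:4] = 1
print(region_features(m))   # (9, 8, (2.0, 2.0), 7.11...)
```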
Digital Image Processing
& Analysis
Definitions
• Image Processing
• Image Analysis (Image Understanding)
• Computer Vision
• Image Acquisition
• Preprocessing
• Segmentation
• Representation and Description
• Recognition and Interpretation
• Knowledge base
Important Stages
in Image Processing
Image Acquisition
– Video camera
– Digital camera
– Conventional camera & analog-to-digital
converter
Preprocessing
– Output (representation):
• Raw pixel data, depicting either boundaries or whole regions
(corners vs. texture for example)
• Needs conversion to a form suitable for computer processing (description)
Representation & Description
• Knowledge database
– Guides the operation of each processing
module and controls the interaction between
modules
Comments