Unit III Sensors and Machine Vision

Sensors are devices that detect changes in physical quantities and convert them into signals that can be read by instruments or controllers. The document discusses different types of sensors that can measure variables such as light, sound, heat, chemicals, position, force, velocity, flow, level and temperature. It also covers the characteristics, devices and classification of sensors. The key types of sensors covered are those for linear/rotational displacement, proximity, force/torque/pressure, velocity/acceleration, flow, level, temperature and light.


Sensors and Control Systems

UNIT III
SENSORS

• Sensor: reads variables in robot motion for use in control
• Analogous to human sensory organs
– Eyes, ears, nose, tongue, skin
• Sensors help the robot know its surroundings better
• They improve its actions and decision-making ability
• They provide feedback control
Sensors - What Can Be Sensed?
• Light
– Presence, color, intensity, content (mod), direction
• Sound
– Presence, frequency, intensity, content (mod), direction
• Heat
– Temperature, wavelength, magnitude, direction
• Chemicals
– Presence, concentration, identity, etc.
• Object Proximity
– Presence/absence, distance, bearing, color, etc.
• Physical orientation/attitude/position
– Magnitude, pitch, roll, yaw, coordinates, etc.
Sensors - What Can Be Sensed?
• Magnetic & Electric Fields
– Presence, magnitude, orientation, content (mod)
• Resistance (electrical, indirectly via V/I)
– Presence, magnitude, etc.
• Capacitance (via excitation/oscillation)
– Presence, magnitude, etc.
• Inductance (via excitation/oscillation)
– Presence, magnitude, etc.
Characteristics of Sensors

• Resolution – minimum step size = full range / 2ⁿ, for an n-bit digital output
• Sensitivity = change in output response / change in input
• Linearity – how closely the output variation follows the input variation
• Interfacing – with a microcontroller or microprocessor
• Size, weight and volume
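The resolution and sensitivity definitions above can be checked with a short sketch. The 10-bit converter, 0–100 mm range and 5 V output swing are illustrative assumptions, not values from the notes:

```python
# Sketch: computing resolution and sensitivity for a hypothetical
# 10-bit sensor measuring a 0-100 mm displacement range.
# The formula "resolution = full range / 2^n" comes from the notes above.

def resolution(full_range, n_bits):
    """Smallest detectable step for an n-bit digitized sensor."""
    return full_range / (2 ** n_bits)

def sensitivity(delta_output, delta_input):
    """Change in output response per unit change in input."""
    return delta_output / delta_input

step = resolution(100.0, 10)    # 100 mm range, 10-bit converter
gain = sensitivity(5.0, 100.0)  # assumed 5 V output swing over 100 mm

print(step)   # 0.09765625 mm per step
print(gain)   # 0.05 V/mm
```

A finer step requires either a smaller range or more converter bits, which is why the resolution bullet ties the two together.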
Sensor Devices
Classification of Sensors
• Power requirement
– Passive – output is provided by the sensed physical phenomenon itself, e.g. thermocouple, thermometer
– Active – requires an external power source, e.g. strain gauge
• Output signal – analog and digital
• Measurement – primary and secondary
• Types based on the quantity being measured (listed below)
Quantity to be Measured and Types of Sensors

1. Linear / Rotational Displacement
• Linear / rotary variable differential transformer (LVDT/RVDT)
• Optical encoder
• Electrical tachometer
• Hall effect sensor
• Capacitive transducer
• Strain gauge elements
• Interferometer
• Gyroscope

2. Proximity
• Inductance
• Eddy current
• Hall effect
• Photoelectric

3. Force, Torque and Pressure
• Strain gauge
• Dynamometer / load cell
• Piezoelectric load cells
• Tactile sensors
• Ultrasonic stress sensors

4. Velocity and Acceleration
• Electromagnetic
• Ultrasonic
• Tacho generators
• Resistive sensors
• Capacitance
• Piezoelectric
• Photoelectric

5. Flow
• Pitot tube
• Orifice plate
• Flow nozzle
• Venturi tubes
• Rotameter
• Ultrasonic flow meter
• Turbine flow meter
• Electromagnetic flow meter

6. Level
• Float level sensor
• Variable capacitance sensor
• Piezoelectric sensor
• Photoelectric sensors

7. Temperature
• Thermocouple
• Thermistors
• Thermodynamic
• Resistance temperature detector (RTD)
• Infrared thermography

8. Light
• Photoresistors
• Photodiodes
• Phototransistors
• Photoconductors
• Charge-coupled device (CCD)
Touch sensors
• Locating objects
• Recognizing the object type
• Force and torque control needed for task manipulation
TYPES:
1. Binary sensors:
detect the existence of the object to be handled (e.g. a limit switch)
2. Analog sensors:
produce an output signal proportional to the force exerted
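A minimal sketch contrasting the two touch-sensor types described above; the trip point and the linear scale factor are illustrative assumptions:

```python
# Sketch of the two touch-sensor types: binary (limit-switch style)
# versus analog (output proportional to the force exerted).

def binary_touch(raw):
    """Limit-switch style: reports only presence/absence of contact."""
    return raw > 0.5          # assumed trip point

def analog_touch(raw, scale=10.0):
    """Analog style: output proportional to the force exerted."""
    return raw * scale        # assumed linear scaling, N per volt

print(binary_touch(0.75))     # True  (contact detected)
print(analog_touch(0.75))     # 7.5   (proportional force, N)
```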
Binary sensors
Analog sensors
TACTILE SENSOR
Displacement, position and
proximity sensors
Eddy current proximity sensors
• When an alternating current is passed through this coil, an alternating magnetic field is generated.
• If a metal object comes into close proximity of the coil, eddy currents are induced in the object by this magnetic field.
• These eddy currents create their own magnetic field, which distorts the magnetic field responsible for their generation.
• As a result, the impedance of the coil changes, and so does the amplitude of the alternating current. This can be used to trigger a switch at some pre-determined level of change in current.
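The switching rule described above (trigger at a pre-determined change in current) can be sketched as follows; the 5 % trip level and the current values are illustrative assumptions:

```python
# Sketch: trigger a switch when the coil current amplitude changes by
# more than a preset fraction of its no-target baseline, as described
# for the eddy current proximity sensor above.

def eddy_switch(baseline_amps, measured_amps, trip_fraction=0.05):
    """True when the relative amplitude change exceeds the preset level."""
    change = abs(baseline_amps - measured_amps) / baseline_amps
    return change >= trip_fraction

print(eddy_switch(1.00, 0.93))   # True: 7 % drop, target detected
print(eddy_switch(1.00, 0.99))   # False: 1 % drop, no target
```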
Inductive proximity switch
• An inductive proximity sensor has four components: the coil, oscillator, detection circuit and output circuit.
• An alternating current is supplied to the coil, which generates a magnetic field. When a metal object comes close to the end of the coil, the inductance of the coil changes.
• This is continuously monitored by a circuit which triggers a switch when a preset change in inductance occurs.
Optical encoders
• Three light sensors are employed to detect the light passing through the holes. These sensors produce electric pulses which give the angular displacement of the mechanical element.
• The inner track has just one hole, which is used to locate the ‘home’ position of the disc. The holes on the middle track are offset from the holes of the outer track by one-half of the hole width.
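The half-hole offset between tracks is what gives direction information: the two tracks form a quadrature pair, and the order of their pulse transitions reveals the direction of rotation. The transition table below is the standard quadrature rule, sketched here under the assumption of ideal, bounce-free sensor readings:

```python
# Sketch: decoding rotation direction from two encoder tracks offset by
# half a hole width (a quadrature pair). State values are (outer, middle)
# sensor readings; FORWARD maps each state to its successor when the
# disc turns in the forward direction.

FORWARD = {(0, 0): (0, 1), (0, 1): (1, 1), (1, 1): (1, 0), (1, 0): (0, 0)}

def direction(prev, curr):
    """+1 for a forward step, -1 for reverse, 0 for no change."""
    if curr == prev:
        return 0
    return 1 if FORWARD.get(prev) == curr else -1

print(direction((0, 0), (0, 1)))   # 1  (forward step)
print(direction((0, 1), (0, 0)))   # -1 (reverse step)
```

Counting these +1/-1 steps over time gives the angular displacement the slide mentions.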
Pneumatic Sensors
Proximity Switches
LED based proximity sensors
1 . Linear / Rotational Displacement
• Linear / rotary variable differential transformer (LVDT/RVDT)

• Optical Encoder
1 . Linear / Rotational Displacement
• Electrical Tachometer

• Hall effect sensor


1 . Linear / Rotational Displacement
• Capacitive transducer

• Strain gauge elements


1 . Linear / Rotational Displacement
• Interferometer

• Gyroscope
2 . Proximity
• Inductance

• Eddy current
2 . Proximity
• Photo electric
3. Force, Torque and Pressure
• Dynamometer/ load cell

• Piezoelectric load cells
3. Force, Torque and Pressure
• Tactile sensors

• Ultrasonic stress sensors


4 . Velocity and Acceleration
• Electromagnetic

• Tacho generators
4 . Velocity and Acceleration
• Resistive sensors
5 . Flow
• Pitot tube

• Orifice plate
5 . Flow
• Flow nozzle

• Venturi Tubes
5 . Flow
• Rotameter

• Ultrasonic flow meter


5 . Flow
• Turbine Flow meter

• Electromagnetic flow meter


6 . Level
• Float level sensor

• Variable capacitance sensor


7 . Temperature
• Thermo couple

• Thermistors
7 . Temperature

• Resistance temperature detector


7 . Temperature
• Infrared thermography
8 . Light

[Diagram: light energy falling on the sensing element is converted to a voltage measurement / electrical flow]
8 . Light
• Photoresistors

• Photodiodes
8 . Light
• Photo transistors

• Photo conductors
8 . Light
• Charge-coupled device (CCD)
Robotic Vision System
• Robot vision may be defined as the process of
extracting, characterizing and interpreting information
from images of a three dimensional world
• The process can be divided into the following principal
areas
I. Sensing
II. Preprocessing
III. Segmentation
IV. Description
V. Recognition
VI. Interpretation
Vision System and Identification of
Objects
• Vision system is concerned with the sensing of
vision data and its interpretation by a computer
• The typical vision system consists of the camera
and digitizing hardware, a digital computer and
hardware & software necessary to interface them
• The operation of the vision system consists of the
following functions:
• (a) Sensing and digitizing image data;
• (b) Image processing and Analysis;
• (c) Application
Vision system
Two-dimensional (or) three-dimensional model of the scene
• According to the gray levels:
1. Binary image
2. Gray image
3. Color image
Vision System – Stages
• Analog to digital conversion
• Remove noise
• Find regions (objects) in space
• Take relationships(measurements)
• Match the description with similar description
of known objects
Block Diagram Of Vision System

[Block diagram of the vision system: lighting and scene, camera, frame grabber and interface (I/F) to the computer]
Function 1: Sensing and digitizing image data
• Signal conversion – sampling, quantization, encoding
• Image storage / frame grabber
• Lighting – structured light, front/back lighting, beam splitter, retro-reflectors, specular illumination, other techniques

Function 2: Image processing and analysis
• Data reduction – windowing, digital conversion
• Segmentation – thresholding, region growing, edge detection
• Feature extraction – descriptors
• Object recognition – template matching, other algorithms

Function 3: Applications
• Inspection
• Identification
• Visual servoing and navigation
The Image and Conversion
• The image presented to a vision system’s camera is light, nothing more (varying in intensity and wavelength)
• The designer ensures the pattern of light presented to the camera is one that can be interpreted easily
• The designer ensures that the image the camera sees has minimum clutter
• The designer ensures that extraneous light (sunlight, etc.) that might affect the image is blocked
• Conversion
– Light energy is converted to electrical energy
– The image is divided into discrete pixels
• Note: a color camera can be considered as three separate cameras, one for each basic color
• The best portion of the image is produced by light passing through the lens along the lens’s axis

A pixel is generally thought of as the smallest single component of a digital image


The camera
• Common Imaging device used for robot vision
system:
– Vidicon camera (B/W Camera)
– Solid state cameras
• Charge-coupled device (CCD)
• Charge-injection device (CID)
• Silicon bi-polar sensor cameras
– Pinhole camera
Frame Grabber
• A hardware electronic device used to capture and store digital images
• It captures individual digital still frames from an analog video signal or a digital video stream
• Frame grabbers were the predominant way to interface cameras to PCs
• Analog frame grabbers accept and process analog video signals
• Digital frame grabbers accept and process digital video streams
• Circuitry common to both analog and digital frame grabbers:
– A bus interface through which a processor can control the acquisition and access the data
– Memory for storing the acquired image
Functions of Machine vision system
• Image formation
• Processing of Image
• Analyzing the Image
• Interpretation of Image
Image formation
• There are two parts to the image formation process:
– The geometry of image formation, which determines where in the image plane the projection of a point in the scene will be located.
– The physics of light, which determines the brightness of a point in the image plane as a function of illumination and surface properties.

The image sensor collects light from the scene through a lens and, using a photosensitive target, converts it into an electronic signal.
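The geometry part can be sketched with the standard pinhole (perspective) projection model; the focal length used here is an illustrative assumption:

```python
# Sketch: pinhole-camera projection mapping a 3-D scene point (X, Y, Z)
# onto the image plane, the standard model for the geometry of image
# formation: x = f*X/Z, y = f*Y/Z.

def project(X, Y, Z, f=0.05):
    """Project a scene point (metres) onto the image plane."""
    if Z <= 0:
        raise ValueError("point must be in front of the camera")
    return (f * X / Z, f * Y / Z)

x, y = project(0.4, 0.2, 2.0)
print(x, y)   # approximately (0.01, 0.005) metres on the image plane
```

Doubling the distance Z halves the image coordinates, which is the familiar perspective foreshortening.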
Processing of Image
• An analog-to-digital converter is used to convert the analog voltage of each pixel into a digital value
• In a binary system, each pixel is assigned 0 or 1 depending on a threshold value
• A grey-scale system, on the other hand, assigns up to 256 different values to each pixel depending on intensity
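Both digitizing schemes can be sketched in a few lines; the pixel voltages, the 5 V full scale and the threshold are illustrative assumptions:

```python
# Sketch of the two digitizing schemes above: a binary image (each pixel
# 0 or 1 by threshold) and a grey-scale image (up to 256 levels).

def to_binary(pixels, threshold):
    """1 where the pixel voltage meets the threshold, else 0."""
    return [1 if p >= threshold else 0 for p in pixels]

def to_grey(pixels, v_max=5.0, levels=256):
    """Quantize each pixel voltage into one of `levels` integer values."""
    return [min(levels - 1, int(p / v_max * levels)) for p in pixels]

row = [0.2, 2.6, 4.9, 1.1]
print(to_binary(row, threshold=2.5))   # [0, 1, 1, 0]
print(to_grey(row))                    # [10, 133, 250, 56]
```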
Image digitization

• Sampling means measuring the value of an image at a finite number of points.
• Quantization is the representation of the measured value at the sampled point by an integer.
Image digitization
Analysis of Image
• Image analysis is the extraction of meaningful information from images prepared by image processing techniques, in order to identify objects or facts about them or their environment
• This analysis takes place in the central processing unit of the system
• Three important tasks performed here:
– Measuring the distance of an object – 1-dimensional
– Determining object orientation – 2-dimensional
– Defining object position
Interpretation of Image
• The most common image interpretation technique is template matching
• In a binary system, the image is segmented on the basis of white and black pixels
• More complex images can be interpreted by grey-scale techniques and algorithms
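Template matching on a binary image can be sketched as sliding the template across the image and scoring pixel agreement; the 1-D row and template below are illustrative examples:

```python
# Sketch of template matching, the interpretation technique named above:
# slide the template over the image and count matching pixels; the
# best-scoring offset is taken as the match.

def match_score(window, template):
    """Number of pixels where window and template agree."""
    return sum(1 for w, t in zip(window, template) if w == t)

def best_match(image_row, template):
    """Return (offset, score) of the best template position in a 1-D row."""
    n = len(template)
    scores = [(i, match_score(image_row[i:i + n], template))
              for i in range(len(image_row) - n + 1)]
    return max(scores, key=lambda s: s[1])

row = [0, 0, 1, 1, 0, 1, 0, 0]
print(best_match(row, [1, 1, 0]))   # (2, 3): exact match at offset 2
```

A 2-D version scans the template over both axes; the idea is identical.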
Image Understanding
• A computer needs to locate the edges of an object in order to construct drawings of the object within a scene, which lead to shapes, which lead to image understanding.
• The final task of robot vision is to interpret the information (such as object edges, regions, boundaries, colour and texture) obtained during the image analysis process.
• This is called image understanding or machine perception.
• A robot vision system must interpret what the image represents in terms of information about its environment. A threshold decides which elements of the differentiated picture matrix should be considered as edge candidates.
Possible Sensors for Identification of Objects

• Interfacing a robot system to a vision system provides an excellent opportunity to produce better-quality output
Use of a sensing array to determine
the Orientation of Object moving on a
conveyor belt
Sensing array to identify the presence
of an object moving on a conveyor belt
and to measure the width of the object
Robot Welding System with Vision
• A teach box is used to position the end-effector at various points
• The terminal is used for communicating with the robot and also for indicating system conditions and for editing and executing the robot work program
• The welding path is traversed by the robot manipulator and can be programmed using programming languages such as VAL, RAIL, etc.
• The various welding parameters, such as feed rate, voltage, current, etc., can be incorporated in the program

Robot Welding System with Vision
• The data is processed by a set of algorithms, and the relevant information is analyzed by the computer and compared with the programmed path for welding
• Any deviation from the programmed path can be taken care of by the system itself, giving welds of uniform and consistent quality
Machine Vision
Machine vision (MV):

• Machine vision (MV) is the application of computer vision to industry and manufacturing.

• Machine vision uses digital input/output devices and computer networks to control other manufacturing equipment, such as robotic arms and equipment that ejects defective products, in industries such as semiconductors, automobiles, food and pharmaceuticals.

• Just as human inspectors working on assembly lines visually inspect parts to judge the quality of workmanship, machine vision systems use digital cameras, smart cameras and image processing software to perform similar inspections.

Robot vision

• Robot vision is defined as the process of acquiring and extracting information from images of the 3D world.
• Vision sensors are the most powerful sensors, equipping the robot with a large variety of sensory information.
Object ranging

Ranging
• Humans have stereo vision capabilities.
• Depth information can be obtained from binocular vision.
• Stereo vision has become an important part of computer vision.
• Stereo vision can be used by robots to detect the distance between themselves and the target (object).
• Parallel computing has been used over the past two decades for quickly processing images.
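For parallel stereo cameras, depth follows from disparity as Z = f·B/d, where f is the focal length, B the baseline between the two cameras and d the disparity (shift of the same point between the two images). A sketch, with illustrative focal length, baseline and disparity values:

```python
# Sketch of depth from binocular (stereo) vision as described above:
# with two parallel cameras, Z = f * B / d.

def depth_from_disparity(f_px, baseline_m, disparity_px):
    """Distance to the target from stereo disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return f_px * baseline_m / disparity_px

z = depth_from_disparity(f_px=700.0, baseline_m=0.12, disparity_px=35.0)
print(z)   # 2.4 metres; closer objects give larger disparity
```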
Applications of Vision-Controlled Robotic Systems

• Finding the presence or absence of a part at a specific location
• Determining the position and orientation of an object with precision
• Picking a part from a specific location and placing it at a desired location
• Distinguishing objects captured in an image from each other
• Sorting ‘good’ and ‘bad’ parts in a quality control operation, based on parameters from an image
Stages of Machine Vision

• Computer vision system
• Image enhancement
• Image analysis
• Pattern classification
Feature extraction
• Feature extraction involves finding features of the segmented image.
• It is usually performed on a binary image produced by a thresholding operation.
• Common features include:
1. Area
2. Perimeter
3. Center of mass
4. Compactness
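The four features listed can be computed directly from a binary image. A sketch; the perimeter definition used here (object pixels with a 4-connected background neighbour) is one common choice among several:

```python
# Sketch computing area, perimeter, center of mass and compactness on a
# small binary image (list of rows; 1 = object pixel).

def features(img):
    h, w = len(img), len(img[0])
    area, perim, sx, sy = 0, 0, 0, 0
    for y in range(h):
        for x in range(w):
            if not img[y][x]:
                continue
            area += 1
            sx, sy = sx + x, sy + y
            # border pixel: at least one 4-connected background neighbour
            nbrs = [(x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)]
            if any(not (0 <= i < w and 0 <= j < h and img[j][i])
                   for i, j in nbrs):
                perim += 1
    cx, cy = sx / area, sy / area
    compact = perim ** 2 / area          # compactness = P^2 / A
    return area, perim, (cx, cy), compact

square = [[0, 0, 0, 0],
          [0, 1, 1, 0],
          [0, 1, 1, 0],
          [0, 0, 0, 0]]
print(features(square))   # (4, 4, (1.5, 1.5), 4.0)
```

Compactness is low for blob-like shapes and grows for elongated or ragged ones, which makes it useful for telling part shapes apart.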

Digital Image Processing
& Analysis
Definitions
• Image Processing
• Image Analysis (Image Understanding)
• Computer Vision

• Low-level processes: contrast manipulation
• Mid-level processes: segmentation, recognition
• High-level processes: understanding groups of objects
Initial Examples of Imagery
Improvement
Digital Image Processing
Important Stages
in Image Processing

• Image Acquisition
• Preprocessing
• Segmentation
• Representation and Description
• Recognition and Interpretation
• Knowledge base
Important Stages
in Image Processing
Image Acquisition

• An imaging sensor and the capability to digitize the signal collected by the sensor:
– Video camera
– Digital camera
– Conventional camera & analog-to-digital converter
Preprocessing

• To improve the image to ensure the success of further processes
• e.g. enhancing contrast, removing noise, identifying information-rich areas
Segmentation

• To partition the image into its constituent parts (objects)
– Autonomous segmentation is very difficult
– It can facilitate or disturb subsequent processes
• Output (representation):
– Raw pixel data, depicting either boundaries or whole regions (corners vs. texture, for example)
– Needs conversion to a form suitable for computer processing (description)
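Region growing, one segmentation approach listed earlier alongside thresholding and edge detection, can be sketched as follows; the image values and the tolerance are illustrative assumptions:

```python
# Sketch of region growing: starting from a seed pixel, absorb
# 4-connected neighbours whose grey value is within a tolerance of
# the seed, yielding one segmented region.

def grow_region(img, seed, tol=10):
    h, w = len(img), len(img[0])
    sy, sx = seed
    target = img[sy][sx]
    region, stack = set(), [seed]
    while stack:
        y, x = stack.pop()
        if (y, x) in region or not (0 <= y < h and 0 <= x < w):
            continue
        if abs(img[y][x] - target) > tol:
            continue
        region.add((y, x))
        stack += [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
    return region

img = [[100, 102,  30],
       [ 99,  30,  31],
       [ 98,  97,  33]]
print(sorted(grow_region(img, (0, 0))))
# [(0, 0), (0, 1), (1, 0), (2, 0), (2, 1)]  -> the bright L-shaped region
```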
Representation & Description

• Feature selection (description) deals with extracting:
– features that result in quantitative information of interest, or
– features that are important for differentiating one class of objects from another
Recognition & Interpretation

• To assign a label to an object based on information provided by the descriptors
• To assign meaning to a group of recognized objects
Knowledge Base

• Knowledge database
– Guides the operation of each processing
module and controls the interaction between
modules
Comments

• Image enhancement for human visual interpretation usually stops at preprocessing
• Recognition and interpretation are associated with image analysis applications where the objective is automation (automated extraction of information from images)
Components
