Pattern Classification
by R. O. Duda, P. E. Hart and D. G. Stork, John Wiley & Sons, 2000
Chapter 1: Introduction to Pattern Recognition (Sections 1.1-1.6)
Machine Perception
An Example
Pattern Recognition Systems
The Design Cycle
Learning and Adaptation
Conclusion
Machine Perception
Build a machine that can recognize patterns:
Speech recognition
Fingerprint identification
OCR (Optical Character Recognition)
DNA sequence identification
Pattern Classification, Chapter 1
An Example
"Sorting incoming fish on a conveyor according to species using optical sensing"
Species: sea bass or salmon
Problem Analysis
Set up a camera and take some sample images to extract
features
Length
Lightness
Width
Number and shape of fins
Position of the mouth, etc.
This is the set of all suggested features to explore for use in our
classifier!
Preprocessing
Use a segmentation operation to isolate individual fish from one
another and from the background
Information from a single fish is sent to a feature
extractor whose purpose is to reduce the data by
measuring certain features
The features are passed to a classifier
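The segmentation → feature extraction → classification stages above can be sketched as a minimal pipeline. The thresholds, feature names, and image values below are illustrative assumptions, not from the text:

```python
import numpy as np

def segment(image, background_level=0.1):
    # Isolate fish pixels from the background by simple thresholding
    # (a stand-in for a real segmentation operation).
    return image[image > background_level]

def extract_features(fish_pixels):
    # Reduce the raw pixel data to a few measured features.
    return {
        "lightness": float(np.mean(fish_pixels)),
        "size": int(fish_pixels.size),
    }

def classify(features, lightness_threshold=0.5):
    # Assign the object to a category from its feature vector
    # (assuming, for illustration, sea bass are the lighter species).
    return "salmon" if features["lightness"] < lightness_threshold else "sea bass"

image = np.array([[0.0, 0.8, 0.9],
                  [0.0, 0.7, 0.0]])
label = classify(extract_features(segment(image)))
```

Each stage only passes a reduced summary of the data forward, which is the point of the feature extractor.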
Classification
Select the length of the fish as a possible feature for
discrimination
The length is a poor feature alone!
Select the lightness as a possible feature.
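A single-feature rule is just a threshold x* on lightness: call a fish sea bass when its lightness exceeds x*, salmon otherwise. A sketch with invented sample values (and the assumption that sea bass tend to be lighter):

```python
def classify_by_lightness(lightness, x_star=5.0):
    # Threshold rule on the single lightness feature; x_star is the
    # decision boundary (an illustrative value, not from the text).
    return "sea bass" if lightness > x_star else "salmon"

# (lightness, true species) pairs, invented for illustration
samples = [(6.2, "sea bass"), (4.1, "salmon"),
           (5.5, "sea bass"), (4.8, "salmon")]
errors = sum(1 for x, label in samples
             if classify_by_lightness(x) != label)
```

Counting errors over labeled samples is how one would compare candidate thresholds.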
Threshold decision boundary and cost relationship
Move our decision boundary toward smaller values of
lightness in order to minimize the cost (i.e., reduce the
number of sea bass classified as salmon!)
This is the task of decision theory
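When the two kinds of error have different costs, the best threshold minimizes expected cost rather than raw error count. A sketch in which misclassifying a sea bass as salmon is assumed to cost twice the reverse (all numbers invented):

```python
def total_cost(threshold, samples,
               cost_bass_as_salmon=2.0, cost_salmon_as_bass=1.0):
    # Sum the misclassification costs for a given lightness threshold.
    cost = 0.0
    for lightness, species in samples:
        predicted = "sea bass" if lightness > threshold else "salmon"
        if predicted == "salmon" and species == "sea bass":
            cost += cost_bass_as_salmon
        elif predicted == "sea bass" and species == "salmon":
            cost += cost_salmon_as_bass
    return cost

# Invented data, including one unusually dark sea bass at 4.6
samples = [(4.0, "salmon"), (4.9, "salmon"),
           (5.1, "sea bass"), (5.8, "sea bass"), (4.6, "sea bass")]
best = min([4.5, 5.0, 5.5], key=lambda t: total_cost(t, samples))
```

With these costs the minimum shifts to the smaller threshold 4.5: accepting one cheap salmon-as-bass error avoids the expensive bass-as-salmon ones, exactly the move described above.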
Adopt the lightness and add the width of the fish
Fish feature vector: x = [x1, x2] = [lightness, width]
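With two features, the decision rule becomes a boundary in the (lightness, width) plane; the simplest is a linear one. A sketch with illustrative weights (not fitted to any real data):

```python
def classify_2d(lightness, width, w=(1.0, -0.8), b=-2.0):
    # Linear decision boundary: sea bass when
    # w1*lightness + w2*width + b > 0. Weights are invented
    # for illustration, not learned from data.
    score = w[0] * lightness + w[1] * width + b
    return "sea bass" if score > 0 else "salmon"
```

In practice the weights would be chosen (trained) to separate the two species in the feature plane.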
We might add other features that are not correlated
with the ones we already have. Care should be taken
not to reduce the performance by adding such
"noisy features"
Ideally, the best decision boundary should be the one
which provides an optimal performance such as in the
following figure:
However, our satisfaction is premature because
the central aim of designing a classifier is to
correctly classify novel input
Issue of generalization!
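Generalization is checked by fitting the classifier on training data and measuring its error on held-out samples it has never seen. A minimal sketch for the one-feature threshold rule (all data invented):

```python
def errors(t, data):
    # Count misclassifications of the threshold rule at threshold t.
    return sum(("sea bass" if x > t else "salmon") != y for x, y in data)

def fit_threshold(train):
    # Pick the candidate threshold with the fewest TRAINING errors.
    candidates = sorted(x for x, _ in train)
    return min(candidates, key=lambda t: errors(t, train))

train = [(4.0, "salmon"), (4.5, "salmon"),
         (5.5, "sea bass"), (6.0, "sea bass")]
held_out = [(4.2, "salmon"), (5.8, "sea bass")]

t_star = fit_threshold(train)
test_error = errors(t_star, held_out)
```

A rule that drives training error to zero but does badly on the held-out set has overfit; the held-out error is the honest estimate of performance on novel input.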
Pattern Recognition Systems
Sensing
Use of a transducer (camera or microphone)
The PR system depends on the bandwidth, resolution,
sensitivity, and distortion of the transducer
Segmentation and grouping
Patterns should be well separated and should not overlap
Feature extraction
Discriminative features
Invariant features with respect to translation, rotation and
scale.
Classification
Use a feature vector provided by a feature extractor to
assign the object to a category
Post Processing
Exploit context (input-dependent information other than
the target pattern itself) to improve performance
The Design Cycle
Data collection
Feature choice
Model choice
Training
Evaluation
Computational complexity
Data Collection
How do we know when we have collected an adequately
large and representative set of examples for training and
testing the system?
Feature Choice
Depends on the characteristics of the problem domain.
Features should be simple to extract, invariant to irrelevant
transformations, and insensitive to noise.
Model Choice
When unsatisfied with the performance of our fish classifier,
we may want to jump to another class of model
Training
Use data to determine the classifier. Many different
procedures for training classifiers and choosing models
Evaluation
Measure the error rate (or performance) and switch from
one set of features to another
Computational Complexity
What is the trade-off between computational ease and
performance?
(How does an algorithm scale as a function of the number of
features, patterns, or categories?)
Learning and Adaptation
Supervised learning
A teacher provides a category label or cost for each
pattern in the training set
Unsupervised learning
The system forms clusters or "natural groupings" of the
input patterns
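Unsupervised "natural grouping" can be sketched with a tiny 1-D k-means: the unlabeled lightness values (invented here) are clustered into two groups without any species labels:

```python
def kmeans_1d(values, centers=(0.0, 10.0), iterations=10):
    # Alternate assigning points to the nearer center and recomputing
    # each center as its group's mean (1-D k-means with k=2).
    c0, c1 = centers
    for _ in range(iterations):
        g0 = [v for v in values if abs(v - c0) <= abs(v - c1)]
        g1 = [v for v in values if abs(v - c0) > abs(v - c1)]
        if g0:
            c0 = sum(g0) / len(g0)
        if g1:
            c1 = sum(g1) / len(g1)
    return sorted((c0, c1))

# Unlabeled lightness measurements (illustrative values)
centers = kmeans_1d([4.0, 4.2, 4.4, 6.0, 6.2, 6.4])
```

No teacher supplies category labels; the two groupings emerge from the data alone, in contrast to the supervised case above.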
Conclusion
The reader may feel overwhelmed by the number,
complexity, and magnitude of the sub-problems of
pattern recognition
Many of these sub-problems can indeed be solved
Many fascinating unsolved problems still remain