Autonomous Mobile Robots

[Figure: general control scheme for mobile robot localization. Perception builds a Local Map of the Real World Environment; matching it against the Global Map (Environment Model) yields the robot's "Position" (Localization); Cognition plans a Path, which is executed by Motion Control.]

Probabilistic Map-Based Localization: Kalman Filter Localization
Zürich Autonomous Systems Lab

5 - Localization and Mapping

5.3 Kalman filter localization

© R. Siegwart, D. Scaramuzza ETH Zurich - ASL


5.4 Markov versus Kalman Filter Localization

 Markov localization
  localization starting from any unknown position
  recovers from ambiguous situations
  However, updating the probability of all positions within the whole state space at any time requires a discrete representation of the space (a grid). The required memory and computational power can therefore become very large if a fine grid is used.

 Kalman filter localization
  tracks the robot and is inherently very precise and efficient
  However, if the uncertainty of the robot becomes too large (e.g. after a collision with an object), the Kalman filter will fail and the position is definitively lost.



5.8 The Five Steps for Map-Based Localization

[Figure: flow chart of map-based localization. A position prediction from the encoder (odometry) and the predicted features from the map database are matched against the observations (raw sensor data or extracted features) from the on-board sensors; the matched predictions are fused into the position estimate.]

1. Prediction based on previous estimate and odometry
2. Observation with on-board sensors
3. Measurement prediction based on prediction and map
4. Matching of observation and map
5. Estimation -> position update (a posteriori position)
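These five steps can be sketched end to end on a deliberately simple 1-D toy: a robot driving along a corridor, measuring its distance to landmarks at known positions. The map, the noise constants, and the helper name `localize_step` are all invented for this sketch; the real system works on 2-D line features.

```python
import numpy as np

# Toy 1-D illustration of the five steps: the "map" is a list of
# landmark positions, and z is a measured distance to some landmark.
LANDMARKS = np.array([2.0, 5.0, 9.0])

def localize_step(x, P, u, z, q=0.01, r=0.01, gate=9.0):
    # 1. prediction based on previous estimate and odometry u
    x_pred = x + u
    P_pred = P + q
    # 2. observation: z is the measured distance to a landmark (given)
    # 3. measurement prediction: expected distance to every map landmark
    z_pred = LANDMARKS - x_pred
    # 4. matching: pick the landmark whose predicted measurement best
    #    explains z, validated with a (scalar) Mahalanobis gate
    v = z - z_pred                       # innovations
    S = P_pred + r                       # innovation variance (h'(x) = -1)
    j = int(np.argmin(v**2 / S))
    if v[j]**2 / S > gate:
        return x_pred, P_pred            # no valid match -> keep prediction
    # 5. estimation: scalar Kalman update (a posteriori position)
    K = -P_pred / S                      # gain for h(x) = m_j - x
    x_new = x_pred + K * v[j]
    P_new = (1.0 + K) * P_pred           # (1 - K h') P with h' = -1
    return x_new, P_new
```

The uncertainty grows in step 1, and the matched measurement in step 5 pulls both the position error and the covariance back down, which is the whole point of the loop.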
5.10 Kalman Filter Localization: Prediction Update

Position prediction from odometry:

\hat{x}_t = f(x_{t-1}, u_t) = \begin{pmatrix} x_{t-1} \\ y_{t-1} \\ \theta_{t-1} \end{pmatrix} + \begin{pmatrix} \frac{\Delta s_r + \Delta s_l}{2} \cos\left(\theta_{t-1} + \frac{\Delta s_r - \Delta s_l}{2b}\right) \\ \frac{\Delta s_r + \Delta s_l}{2} \sin\left(\theta_{t-1} + \frac{\Delta s_r - \Delta s_l}{2b}\right) \\ \frac{\Delta s_r - \Delta s_l}{b} \end{pmatrix}

with the odometry noise covariance

Q_t = \begin{pmatrix} k_r |\Delta s_r| & 0 \\ 0 & k_l |\Delta s_l| \end{pmatrix}

and the covariance prediction by error propagation:

\hat{P}_t = F_{x_{t-1}} P_{t-1} F_{x_{t-1}}^T + F_{u_t} Q_t F_{u_t}^T
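A hedged NumPy sketch of this prediction step for a differential-drive robot; the function name `predict` is illustrative, and the Jacobians F_x and F_u written out below follow from differentiating f with respect to the state and the wheel displacements:

```python
import numpy as np

def predict(x, P, ds_r, ds_l, b, k_r, k_l):
    """One odometry prediction step (differential drive).

    x : (3,) pose [x, y, theta];  P : (3,3) pose covariance
    ds_r, ds_l : right/left wheel travel;  b : wheelbase
    k_r, k_l : error-growth constants of the motion model
    """
    ds = (ds_r + ds_l) / 2.0           # translation of the robot centre
    dtheta = (ds_r - ds_l) / b         # rotation
    th = x[2] + dtheta / 2.0           # heading used inside f

    # f(x_{t-1}, u_t): predicted pose
    x_new = x + np.array([ds * np.cos(th), ds * np.sin(th), dtheta])

    # Jacobian of f w.r.t. the state
    F_x = np.array([[1.0, 0.0, -ds * np.sin(th)],
                    [0.0, 1.0,  ds * np.cos(th)],
                    [0.0, 0.0,  1.0]])
    # Jacobian of f w.r.t. the control (ds_r, ds_l)
    c, s = np.cos(th), np.sin(th)
    F_u = np.array([
        [0.5 * c - ds / (2 * b) * s, 0.5 * c + ds / (2 * b) * s],
        [0.5 * s + ds / (2 * b) * c, 0.5 * s - ds / (2 * b) * c],
        [1.0 / b, -1.0 / b]])

    # Q_t: diagonal wheel-noise covariance
    Q = np.diag([k_r * abs(ds_r), k_l * abs(ds_l)])

    # error propagation: P_hat = F_x P F_x^T + F_u Q F_u^T
    P_new = F_x @ P @ F_x.T + F_u @ Q @ F_u.T
    return x_new, P_new
```

Note that the covariance can only grow in this step; it shrinks again only in the estimation step.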



5.11 Kalman Filter for Mobile Robot Localization: Observation

 The second step is to obtain the observation Z (the set of measurements) from the robot's sensors at the new location.
 The observation usually consists of a set of n_0 single observations z_j extracted from the different sensor signals. They represent features such as lines, doors, or any other kind of landmarks.
 The parameters of the targets are usually observed in the sensor frame {S}.
 Therefore the features in the map (given in the world frame {W}) have to be transformed into the sensor frame {S}.
 This transformation is specified by the function h_j (see later).



5.12 Observation

[Figure: raw data of the laser scanner, the lines extracted from it, and the extracted lines in model space.]

An extracted line j is described in the sensor (robot) frame {R} by its angle and distance:

z_j = \begin{pmatrix} \alpha_j \\ r_j \end{pmatrix}

with the measurement covariance

R_{R,j} = \begin{pmatrix} \sigma_{\alpha\alpha} & \sigma_{\alpha r} \\ \sigma_{r\alpha} & \sigma_{rr} \end{pmatrix}
5.13 Measurement Prediction

 In the next step we use the predicted robot position \hat{x}_t and the map to generate multiple predicted observations \hat{z}_j.
 They have to be transformed into the sensor frame:

\hat{z}_j = h_j(\hat{x}_t)

 We can now define the measurement prediction as the set containing all n_i predicted observations:

\hat{Z} = \{ \hat{z}_j \mid 1 \le j \le n_i \}

 The function h_j is mainly the coordinate transformation between the world frame and the sensor frame.



5.14 Measurement Prediction

 For the prediction, only the walls that are in the field of view of the robot are selected.
 This is done by linking the individual lines to the nodes of the path.



5.15 Kalman Filter for Mobile Robot Localization: Measurement Prediction: Example
 The generated measurement predictions have to be transformed into the robot frame {R}.
 According to the figure on the previous slide, the transformation of a map line ({}^{W}\alpha_j, {}^{W}r_j) into the robot frame is given by

\hat{z}_j = h_j(\hat{x}_t) = \begin{pmatrix} {}^{W}\alpha_j - \hat{\theta}_t \\ {}^{W}r_j - \left(\hat{x}_t \cos {}^{W}\alpha_j + \hat{y}_t \sin {}^{W}\alpha_j\right) \end{pmatrix}

and its Jacobian with respect to the robot pose by

\nabla h_j = \begin{pmatrix} 0 & 0 & -1 \\ -\cos {}^{W}\alpha_j & -\sin {}^{W}\alpha_j & 0 \end{pmatrix}
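This world-to-robot transformation of (alpha, r) line parameters can be sketched as follows; `predict_line` is a hypothetical helper name, and the formula is the standard line-feature transform under the parameterization used above:

```python
import numpy as np

def predict_line(x_hat, alpha_w, r_w):
    """Transform a map line (alpha_w, r_w), given in the world frame {W},
    into the robot frame at the predicted pose x_hat = (x, y, theta),
    and return the Jacobian of the transform w.r.t. the pose."""
    x, y, theta = x_hat
    # predicted observation z_hat = h(x_hat): rotate the line angle into
    # the robot frame and reduce its distance by the projection of the
    # robot position onto the line normal
    z_hat = np.array([alpha_w - theta,
                      r_w - (x * np.cos(alpha_w) + y * np.sin(alpha_w))])
    # Jacobian: alpha depends only on theta, r only on (x, y)
    H = np.array([[0.0, 0.0, -1.0],
                  [-np.cos(alpha_w), -np.sin(alpha_w), 0.0]])
    return z_hat, H
```

For example, a wall at r = 3 straight ahead of a robot that has advanced 1 m along the line normal is predicted at distance 2 in the robot frame.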



5.16 Kalman Filter for Mobile Robot Localization: Matching

 Assignment of the observations z_j(k+1) (gained by the sensors) to the targets z_t (stored in the map).
 For each measurement prediction for which a corresponding observation is found we calculate the innovation

v_{ij} = z_j - \hat{z}_i

and its innovation covariance, found by applying the error propagation law:

\Sigma_{IN,ij} = \nabla h_i \, \hat{P}_t \, \nabla h_i^T + R_j

 The validity of the correspondence between measurement and prediction can e.g. be evaluated through the Mahalanobis distance:

v_{ij}^T \, \Sigma_{IN,ij}^{-1} \, v_{ij} \le g^2
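A minimal sketch of this validation gating for line features; `match_features` and the default gate value are illustrative (9.21 is the 99% chi-square quantile for 2 degrees of freedom):

```python
import numpy as np

def match_features(z_obs, R_obs, z_pred, H_pred, P, g2=9.21):
    """Gate observed features against predicted ones.

    z_obs  : observed feature vectors, R_obs their covariances
    z_pred : predicted feature vectors, H_pred their Jacobians
    P      : predicted pose covariance
    g2     : Mahalanobis gate threshold
    Returns accepted (i, j, v, S) tuples: prediction i matched to
    observation j with innovation v and innovation covariance S.
    """
    pairs = []
    for i, (zp, H) in enumerate(zip(z_pred, H_pred)):
        for j, (z, R) in enumerate(zip(z_obs, R_obs)):
            v = z - zp                              # innovation v_ij
            S = H @ P @ H.T + R                     # innovation covariance
            d2 = float(v @ np.linalg.solve(S, v))   # Mahalanobis distance^2
            if d2 <= g2:
                pairs.append((i, j, v, S))
    return pairs
```

Observations that pass no gate are simply discarded; they may be spurious readings or features not in the map.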



5.17 Matching



5.21 Estimation

 Kalman filter estimation of the new robot position x_t:
 By fusing the prediction of the robot position (magenta) with the innovation gained by the measurements (green), we get the updated estimate of the robot position (red).
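In the usual EKF notation, consistent with the innovation v_ij and its covariance \Sigma_{IN,ij}, this fusion is the standard Kalman update:

x_t = \hat{x}_t + K_t \, v_t

K_t = \hat{P}_t \, \nabla h^T \, \Sigma_{IN}^{-1}

P_t = \hat{P}_t - K_t \, \Sigma_{IN} \, K_t^T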



5.22 Autonomous Indoor Navigation (Pygmalion EPFL)
 very robust on-the-fly localization
 one of the first systems with probabilistic sensor fusion
 47 steps, 78 m length, realistic office environment
 conducted 16 times, > 1 km overall distance
 partially difficult surfaces (laser), partially few vertical edges (vision)



5.23 Markov versus Kalman: summary
 Remember: both methods apply the convolution (for the action update) and the Bayes rule (for the perception update), but:



5.24 Available student projects

[Link]

[Link]



5.25 Available student projects

Vision-Based Dense Real-Time 3D Mapping
Category: Robot Localization and Mapping
Possibility of extension and employment at Dacuda

KINECT-aided Visual Dense Real-Time 3D Mapping
Category: Robot Localization and Mapping
Type: SA/BA/MA/SoM
Possibility of extension and employment at Dacuda

Air-Ground Matching for Flying and Ground Robots
Category: Flying Robots
Type: SA/BA/MA/SoM



5.26 Available student projects

Characterizing the performance of vision-based navigation for aggressive maneuvers with quadcopters
Category: Robot Localization and Mapping
With IDSC, Prof. Raff D'Andrea

Vision-Controlled Nano Quadcopter
Type: SA/BA/MA/SoM
Possibility of working in a team of up to 3-4 students
