
Pose Tracking I!

Gordon Wetzstein!
Stanford University!
!

EE 267 Virtual Reality!


Lecture 11!
stanford.edu/class/ee267/!
!
Overview!

•  overview of positional tracking!


•  camera-based tracking!
•  HTC’s Lighthouse!
•  VRduino – an Arduino for VR, specifically designed for EE 267
by Keenan Molner!
•  pose tracking with VRduino using homographies!
What are we tracking?!

•  Goal: track pose of headset, controller, …!

•  What is a pose? !
•  3D position of the tracked object !
•  3D orientation of the tracked object, e.g. using
quaternions or Euler angles!

•  Why? So we can map the movement of our head to the motion of the
camera in a virtual environment – motion parallax!!
Overview of Positional Tracking!

“inside-out tracking”: camera or sensor is located on HMD, no


need for other external devices to do tracking !
•  simultaneous localization and mapping (SLAM) – classic
computer & robotic vision problem (beyond this class)!

!
“outside-in tracking”: external sensors, cameras, or markers are
required (i.e. tracking constrained to specific area)!
•  used by most VR headsets right now, but everyone is
feverishly working on inside-out tracking!
Marker-based Tracking!

•  seminal papers by Rekimoto 1998 and Kato & Billinghurst 1999!


•  widely adopted after introduced by ARToolKit!

Kato, Billinghurst - ARToolKit Rekimoto - Matrix


Marker-based Tracking!

ARToolKit OpenCV marker tracking


Google’s Project Tango!
Inside-out Tracking!
Google’s Project Tango!

also used but not shown: IMU!


problem: SLAM via sensor fusion!
Inside-out Tracking!

•  marker-less inside-out tracking used by Microsoft HoloLens,


Intel’s Project Alloy, …!
•  eventually required by all untethered VR/AR systems!
•  if you need it for your own HMD, consider using Intel’s
RealSense (small & has SDK)!
•  if you want to learn more about SLAM, take a 3D computer
vision or robotic vision class, e.g. Stanford CS231A!
“Outside-in Tracking”!

•  mechanical tracking!
•  ultra-sonic tracking!
•  magnetic tracking!
•  optical tracking!
•  GPS!
•  WIFI positioning!
•  marker tracking!
•  ...!
Positional Tracking - Mechanical!

some mechanical
linkage, e.g.!
•  fakespace BOOM!
•  microscribe!
Positional Tracking - Mechanical!

pros:!
•  super low latency!
•  very accurate!
!
cons:!
•  cumbersome!
•  “wired” by design!
Positional Tracking – Ultra-sonic!
•  1 transmitter, 3 receivers → triangulation!
Ivan Sutherland’s “Ultimate Display”!

Logitech 6DOF !
Positional Tracking – Ultra-sonic!

pros:!
•  can be light, small, inexpensive!
!
cons:!
•  line-of-sight constraints!
•  susceptible to acoustic interference!
•  low update rates!
Positional Tracking - Magnetic!

•  reasonably good accuracy!
•  position and orientation!
•  3-axis magnetometer in sensors!
•  need magnetic field generator (AC, DC, …), e.g. Helmholtz coil!
•  magnetic field has to oscillate and be sync’ed with magnetometers!

(figure: 3-axis Helmholtz coil, www.directvacuum.com)
Positional Tracking - Magnetic!

pros:!
•  small, low cost, low latency sensors!
•  no line-of-sight constraints!

cons:!
•  somewhat small working volume!
•  susceptible to distortions of magnetic field!
•  not sure how easy it is to do this untethered (need to sync)!

(figure: 3-axis Helmholtz coil, www.directvacuum.com)
Positional Tracking - Magnetic!
Magic Leap One controller tracking:!
•  magnetic field generator in controller!
•  magnetometer in headset!

https://www.ifixit.com/Teardown/Magic+Leap+One+Teardown/112245!
Positional Tracking - Optical!

•  track active (near IR) LEDs with cameras!
OR!
•  track passive retro-reflectors with IR illumination around camera!
•  both Oculus Rift and HTC Vive come with optical tracking!

(figures: Oculus Rift, https://www.ifixit.com/Teardown/Oculus+Rift+CV1+Teardown/60612;
http://steam3.com/make-magic-2015/)
Understanding Pose Estimation - Triangulation!

•  for tracking individual 3D points, multi-camera setups usually use triangulation!
•  this does not give us the pose (rotation & translation) of camera or object yet!

(figure: a 3D point and its 2D projections in two cameras)
Understanding Pose Estimation!

•  for pose estimation, need to track multiple points with known relative 3D coordinates!

3D points in object coordinates: (x_i, y_i, z_i)^T
measured 2D projections: (x_i^n, y_i^n)^T
Understanding Pose Estimation!

•  when object is closer, projection is bigger!

(figure: 2D projections)
Understanding Pose Estimation!

•  when object is farther, projection is smaller!
... and so on ...!

Estimating 6-DoF pose from 2D projections is known as the Perspective-n-Point (PnP) problem!
Understanding Pose Estimation!

1.  how to get projected 2D coordinates?!
2.  image formation!
3.  estimate pose with linear homography method!
4.  estimate pose with nonlinear Levenberg-Marquardt method (next class)!

(figure: object points (x_i, y_i, z_i)^T and their 2D projections (x_i^n, y_i^n)^T)
Understanding Pose Estimation!

1.  how to get projected 2D coordinates? → HTC Lighthouse, VRduino!
2.  image formation!
3.  estimate pose with linear homography method!
4.  estimate pose with nonlinear Levenberg-Marquardt method (next class)!
HTC Lighthouse!

https://www.youtube.com/watch?v=J54dotTt7k0!
HTC Lighthouse!

https://www.youtube.com/watch?v=J54dotTt7k0!
HTC Lighthouse – Base Station!

http://gizmodo.com/this-is-how-valve-s-amazing-lighthouse-tracking-technol-1705356768!
HTC Lighthouse – Base Station!

important specs:!
•  runs at 60 Hz !
•  i.e. horizontal & vertical update combined 60 Hz !
•  broadband sync pulses in between each laser sweep
(i.e. at 120 Hz)!
•  each laser rotates at 60 Hz, but offset in time!
•  usable field of view: 120 degrees!
HTC Lighthouse – Base Station!
•  can use up to 2 base stations simultaneously via time-division
multiplexing (TDM) !
•  base station modes:!
A: TDM slave with cable sync!
B: TDM master!
C: TDM slave with optical sync!
HTC!
HTC Lighthouse – Base Station!
•  sync pulse periodically emitted (120 times per second)!
•  each sync pulse indicates beginning of new sweep !
•  length of pulse also encodes 3 additional bits of information:!

•  axis: horizontal or vertical sweep to follow!


•  skip: if 1, then laser is off for following sweep!
•  data: data bits of consecutive pulses yield OOTX
frame!

https://github.com/nairol/LighthouseRedox/blob/master/docs/Light%20Emissions.md#sync-pulse!
VRduino!

•  in this class, we use the HTC Lighthouse base stations but


implement positional tracking (i.e. pose estimation) on the
VRduino!

•  VRduino is a shield (hardware add-on) for the Arduino Teensy


3.2; custom-designed for EE 267 by Keenan Molner!
!

!
VRduino!
VRduino!

IMU!
Teensy 3.2!
VRduino!

Lighthouse!
Select!
VRduino!

photodiode positions (on-board coordinates):!
•  Photodiode 0: x=-42mm, y=25mm!
•  Photodiode 1: x=42mm, y=25mm!
•  Photodiode 2: x=-42mm, y=-25mm!
•  Photodiode 3: x=42mm, y=-25mm!


VRduino!

pins 9,10!

pins 20,21!
pins 5,6!
pins 22,23!
VRduino!

SDA: pin 18!


SCL: pin 19!
VRduino!

power (3.3V)!
ground!
VRduino!

Pin!
Breakout!
VRduino!

3.3V power, 200mA MAX!


digital R/W, Serial, cap. sense! SPI, Serial, digital R/W!
digital R/W, Serial, cap. sense! digital R/W!
digital R/W! SPI, analog read, digital R/W!
digital R/W, PWM, CAN! SPI, analog read, digital R/W!
digital R/W, PWM, CAN! I2C, analog read, digital R/W!
digital R/W, Serial, SPI! I2C, analog read, digital R/W!
5V power, 500mA MAX!

For more details, see Lab Writeup!


Pose Estimation with the VRduino!

•  timing of photodiodes reported in Teensy “clock ticks” relative


to last sync pulse!

•  Teensy usually runs at 48 MHz, so 48,000,000 clock ticks per


second!
How to Get the 2D Coordinates?!

•  at time of respective sync pulse, laser is at 90° horizontally and -90° vertically!
•  each laser rotates 360° in 1/60 sec!

(side-view figure: laser sweep direction; current laser position at −90°)
How to Get the 2D Coordinates?!

•  at time of respective sync pulse, laser is at 90° horizontally and -90° vertically!
•  each laser rotates 360° in 1/60 sec!

(top-view figure: optical axis (principal direction), laser sweep direction, current laser
position at 90°; side-view figure: laser sweep direction, current laser position at −90°)
How to Get the 2D Coordinates?!

•  at time of respective sync pulse, laser is at 90° horizontally and -90° vertically!
•  each laser rotates 360° in 1/60 sec!
•  convert from ticks to angle first and then to relative position on plane at unit distance!

(top-view figures: laser sweep angle α when it hits the photodiode; 3D point p_i and its
2D projection p^2D_{i,x/y}; starting laser position at 90°)
How to Get the 2D Coordinates?!

•  convert from ticks to angle first and then to relative position on plane at unit distance!
•  raw number of ticks from photodiode:!

    Δt [sec] = #ticks / 48,000,000 [ticks/sec]     (48 MHz CPU speed)
How to Get the 2D Coordinates?!

•  convert from ticks to angle first and then to relative position on plane at unit distance!

    Δt [sec] = #ticks / 48,000,000 [ticks/sec]

    α [°] = − Δt [sec] / ( (1/60) [sec] / 360 [°] ) + 90°

(1/60 sec is the time per revolution, so (1/60)/360 is the time per degree; the +90° is
the offset from the sync pulse)
How to Get the 2D Coordinates?!

•  convert from ticks to angle first and then to relative position on plane at unit distance!

    α [°] = − Δt [sec] / ( (1/60) [sec] / 360 [°] ) + 90°

    p^2D_{i,x/y} = tan( α / 360 [°] · 2π )
How to Get the 2D Coordinates?!

Horizontal Sweep:!

    α [°] = − Δt [sec] / ( (1/60) [sec] / 360 [°] ) + 90°

Vertical Sweep:!

    α [°] = − Δt [sec] / ( (1/60) [sec] / 360 [°] ) − 90°
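The conversion above can be sketched as follows — a minimal Python stand-in for the Teensy-side C++ (the function name `ticks_to_coord` is illustrative, not from the lab code):

```python
import math

def ticks_to_coord(ticks, horizontal):
    # seconds since the sync pulse; the Teensy 3.2 counts at 48 MHz
    dt = ticks / 48_000_000.0
    # laser rotates 360 deg in 1/60 s -> 21,600 deg/s
    sweep_deg_per_sec = 360.0 * 60.0
    # sweep angle when the laser hits the photodiode:
    # +90 deg offset for the horizontal sweep, -90 deg for the vertical one
    alpha = -dt * sweep_deg_per_sec + (90.0 if horizontal else -90.0)
    # relative position on the plane at unit distance
    return math.tan(math.radians(alpha))
```

For example, a horizontal hit exactly on the optical axis (α = 0°) corresponds to 1/240 s after the sync pulse, i.e. 200,000 ticks.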
Understanding Pose Estimation!

1.  how to get projected 2D coordinates?!
2.  image formation! → how 3D points project into 2D coordinates in a camera (or a
    Lighthouse base station); very similar to the graphics pipeline!
3.  estimate pose with linear homography method!
4.  estimate pose with nonlinear Levenberg-Marquardt method (next class)!
Image Formation!

•  image formation is a model for mapping 3D points in the local “object” coordinate
system to 2D points in “window” coordinates!

3D reference point arrangement: reference points on the device, (x_i, y_i, z_i)^T in the
local device frame!
planar 2D arrangement: (x_i, y_i, z_i = 0)^T!
measured 2D projections: (x_i^n, y_i^n)^T in the normalized camera frame!
Image Formation – 3D Arrangement!

1.  transform 3D point into view space:!

    [x_i^c]   [1 0  0]   [r11 r12 r13 tx]   [x_i]
    [y_i^c] = [0 1  0] · [r21 r22 r23 ty] · [y_i]
    [z_i^c]   [0 0 -1]   [r31 r32 r33 tz]   [z_i]
                                            [ 1 ]

    “projection matrix” · “modelview matrix” (3x3 rotation matrix and 3x1 translation vector)!
    z_i^c is the homogeneous coordinate, which we could also call w!

2.  perspective divide:!

    (x_i^n, y_i^n)^T = (x_i^c / z_i^c, y_i^c / z_i^c)^T
Image Formation – 2D Arrangement!

1.  transform 3D point into view space (now z_i = 0):!

    [x_i^c]   [1 0  0]   [r11 r12 r13 tx]   [x_i]   [1 0  0]   [r11 r12 tx]   [x_i]
    [y_i^c] = [0 1  0] · [r21 r22 r23 ty] · [y_i] = [0 1  0] · [r21 r22 ty] · [y_i]
    [z_i^c]   [0 0 -1]   [r31 r32 r33 tz]   [ 0 ]   [0 0 -1]   [r31 r32 tz]   [ 1 ]
                                            [ 1 ]

2.  perspective divide:!

    (x_i^n, y_i^n)^T = (x_i^c / z_i^c, y_i^c / z_i^c)^T
Image Formation – 2D Arrangement!

•  all rotation matrices are orthonormal, i.e. r11² + r21² + r31² = 1 and r12² + r22² + r32² = 1!

    rotation R and translation T:!

    [1 0  0]   [r11 r12 tx]
    [0 1  0] · [r21 r22 ty]
    [0 0 -1]   [r31 r32 tz]
The Homography Matrix!

•  all rotation matrices are orthonormal, i.e. r11² + r21² + r31² = 1 and r12² + r22² + r32² = 1!

    [1 0  0]   [r11 r12 tx]   [h1 h2 h3]
    [0 1  0] · [r21 r22 ty] = [h4 h5 h6]   ← let’s call this the “homography matrix” H!
    [0 0 -1]   [r31 r32 tz]   [h7 h8 h9]
Understanding Pose Estimation!

1.  how to get projected 2D coordinates?!
2.  image formation!
3.  estimate pose with linear homography method! → how to compute the homography
    matrix, and how to get position and rotation from that matrix!
4.  estimate pose with nonlinear Levenberg-Marquardt method (next class)!
The Homography Matrix!

Turns out that any homography matrix has only 8 degrees of freedom – we can scale the
matrix by s and get the same 3D-to-2D mapping!

•  image formation with scaled homography matrix sH:!

    x_i^n = x_i^c / z_i^c = ( s h1 x_i + s h2 y_i + s h3 ) / ( s h7 x_i + s h8 y_i + s h9 )
                          = ( h1 x_i + h2 y_i + h3 ) / ( h7 x_i + h8 y_i + h9 )
    y_i^n = y_i^c / z_i^c = ( s h4 x_i + s h5 y_i + s h6 ) / ( s h7 x_i + s h8 y_i + s h9 )
                          = ( h4 x_i + h5 y_i + h6 ) / ( h7 x_i + h8 y_i + h9 )

    → the scale factor s cancels!
The Homography Matrix!

•  common approach: estimate a scaled version of the homography matrix, where h9 = 1!
•  we will see later how we can get the scale factor s!

    [1 0  0]   [r11 r12 tx]       [h1 h2 h3]
    [0 1  0] · [r21 r22 ty] = s · [h4 h5 h6]   ← estimate these 8 homography matrix elements!
    [0 0 -1]   [r31 r32 tz]       [h7 h8  1]
Pose Estimation via Homography!

•  image formation changes to!

    [x_i^c]       [h1 h2 h3]   [x_i]
    [y_i^c] = s · [h4 h5 h6] · [y_i]
    [z_i^c]       [h7 h8  1]   [ 1 ]

    homography matrix with 8 unknowns!
Pose Estimation via Homography!

•  after the perspective divide:!

    x_i^n = x_i^c / z_i^c = ( h1 x_i + h2 y_i + h3 ) / ( h7 x_i + h8 y_i + 1 )
    y_i^n = y_i^c / z_i^c = ( h4 x_i + h5 y_i + h6 ) / ( h7 x_i + h8 y_i + 1 )
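As a sanity check, the scaled-homography projection can be written directly from these two equations (a Python sketch; `project` is an illustrative name, not from the lab code):

```python
def project(h, x, y):
    # h = [h1, ..., h8], with h9 fixed to 1
    w = h[6] * x + h[7] * y + 1.0              # z_i^c, the homogeneous coordinate
    xn = (h[0] * x + h[1] * y + h[2]) / w      # x_i^n
    yn = (h[3] * x + h[4] * y + h[5]) / w      # y_i^n
    return xn, yn
```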
Pose Estimation via Homography!

•  multiply by the denominator!

    x_i^n ( h7 x_i + h8 y_i + 1 ) = h1 x_i + h2 y_i + h3
    y_i^n ( h7 x_i + h8 y_i + 1 ) = h4 x_i + h5 y_i + h6
Pose Estimation via Homography!

•  reorder equations!

    h1 x_i + h2 y_i + h3 − h7 x_i x_i^n − h8 y_i x_i^n = x_i^n
    h4 x_i + h5 y_i + h6 − h7 x_i y_i^n − h8 y_i y_i^n = y_i^n
Pose Estimation via Homography!

•  8 unknowns (h1, …, h8) but only 2 measurements (x_i^n, y_i^n) per 3D-to-2D point
correspondence!

    h1 x_i + h2 y_i + h3 − h7 x_i x_i^n − h8 y_i x_i^n = x_i^n
    h4 x_i + h5 y_i + h6 − h7 x_i y_i^n − h8 y_i y_i^n = y_i^n

•  need at least 4 point correspondences to get an invertible system with 8 equations
& 8 unknowns!
•  VRduino has 4 photodiodes → need all 4 to compute the pose!
Pose Estimation via Homography!

•  solve Ah = b on the Arduino using the Matrix Math Library via its MatrixInversion
function (details in lab)!

    [x1 y1 1  0  0  0  −x1·x1^n  −y1·x1^n]   [h1]   [x1^n]
    [0  0  0  x1 y1 1  −x1·y1^n  −y1·y1^n]   [h2]   [y1^n]
    [x2 y2 1  0  0  0  −x2·x2^n  −y2·x2^n]   [h3]   [x2^n]
    [0  0  0  x2 y2 1  −x2·y2^n  −y2·y2^n] · [h4] = [y2^n]
    [x3 y3 1  0  0  0  −x3·x3^n  −y3·x3^n]   [h5]   [x3^n]
    [0  0  0  x3 y3 1  −x3·y3^n  −y3·y3^n]   [h6]   [y3^n]
    [x4 y4 1  0  0  0  −x4·x4^n  −y4·x4^n]   [h7]   [x4^n]
    [0  0  0  x4 y4 1  −x4·y4^n  −y4·y4^n]   [h8]   [y4^n]

                    A                          h      b
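The same 8×8 system can be set up and solved in a few lines of Python — a sketch standing in for the Matrix Math Library call on the Teensy (`solve_homography` and the plain Gaussian elimination are illustrative choices, not the lab's solver):

```python
def solve_homography(pts2d, pts3d):
    # pts3d: the four photodiode (x, y) positions in the z = 0 plane
    # pts2d: the four measured projections (x^n, y^n) on the unit-distance plane
    A, b = [], []
    for (x, y), (xn, yn) in zip(pts3d, pts2d):
        A.append([x, y, 1.0, 0.0, 0.0, 0.0, -x * xn, -y * xn]); b.append(xn)
        A.append([0.0, 0.0, 0.0, x, y, 1.0, -x * yn, -y * yn]); b.append(yn)
    # Gaussian elimination with partial pivoting on the augmented matrix [A | b]
    n = 8
    M = [row + [rhs] for row, rhs in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    # back substitution
    h = [0.0] * n
    for r in range(n - 1, -1, -1):
        h[r] = (M[r][n] - sum(M[r][c] * h[c] for c in range(r + 1, n))) / M[r][r]
    return h  # [h1, ..., h8]; h9 is fixed to 1
```

With the four VRduino photodiode positions (±42 mm, ±25 mm) as the 3D points, projecting through a known homography and solving recovers it.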
Get Position from Homography Matrix!

•  still need the scale factor s to get the position! (we just computed the right-hand side)!

    [1 0  0]   [r11 r12 tx]       [h1 h2 h3]
    [0 1  0] · [r21 r22 ty] = s · [h4 h5 h6]
    [0 0 -1]   [r31 r32 tz]       [h7 h8  1]
Get Position from Homography Matrix!

•  normalize the homography to have approximately unit-length columns for the rotation
part, such that r11² + r21² + r31² ≈ 1 and r12² + r22² + r32² ≈ 1:!

    s = 2 / ( √(h1² + h4² + h7²) + √(h2² + h5² + h8²) )

    [1 0  0]   [r11 r12 tx]       [h1 h2 h3]
    [0 1  0] · [r21 r22 ty] = s · [h4 h5 h6]
    [0 0 -1]   [r31 r32 tz]       [h7 h8  1]
Get Position from Homography Matrix!

•  this gives us the position as!

    t_x = s h3 ,  t_y = s h6 ,  t_z = −s

    [1 0  0]   [r11 r12 tx]       [h1 h2 h3]
    [0 1  0] · [r21 r22 ty] = s · [h4 h5 h6]
    [0 0 -1]   [r31 r32 tz]       [h7 h8  1]
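Put together as a Python sketch (illustrative function name; the lab code does this in C++):

```python
import math

def position_from_homography(h):
    # h = [h1, ..., h8] with h9 = 1; s normalizes the two rotation columns
    s = 2.0 / (math.sqrt(h[0]**2 + h[3]**2 + h[6]**2) +
               math.sqrt(h[1]**2 + h[4]**2 + h[7]**2))
    return s * h[2], s * h[5], -s      # (t_x, t_y, t_z)
```

For example, a pure translation (R = I) with t = (0.5, 0.3, −2) has the scaled homography h = [0.5, 0, 0.25, 0, 0.5, 0.15, 0, 0], and the sketch recovers t.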
Get Rotation from Homography Matrix!

1.  get normalized 1st column of the 3x3 rotation matrix!
2.  get normalized 2nd column via orthogonalization!
3.  get missing 3rd column with a cross product!

    [1 0  0]   [r11 r12 tx]       [h1 h2 h3]
    [0 1  0] · [r21 r22 ty] = s · [h4 h5 h6]
    [0 0 -1]   [r31 r32 tz]       [h7 h8  1]
Get Rotation from Homography Matrix!

1.  get normalized 1st column of the 3x3 rotation matrix!

    r̃1 = (h1, h4, −h7)^T ,  r1 = r̃1 / ‖r̃1‖₂
Get Rotation from Homography Matrix!

2.  get normalized 2nd column via orthogonalization!

    r̃2 = (h2, h5, −h8)^T − ( r1 · (h2, h5, −h8)^T ) r1 ,  r2 = r̃2 / ‖r̃2‖₂
Get Rotation from Homography Matrix!

3.  get missing 3rd column with a cross product:  r3 = r1 × r2!

•  r3 is guaranteed to be orthogonal to the other two columns!
Get Rotation from Homography Matrix!

•  make the 3x3 rotation matrix!

    R = ( r1  r2  r3 ) = [r11 r12 r13]
                         [r21 r22 r23]
                         [r31 r32 r33]

•  convert to quaternion or Euler angles!
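Steps 1–3 can be sketched in Python (illustrative; the lab code does this in C++ on the Teensy):

```python
import math

def rotation_from_homography(h):
    # 1. normalized 1st column: (h1, h4, -h7) / its length
    c1 = (h[0], h[3], -h[6])
    n1 = math.sqrt(sum(v * v for v in c1))
    r1 = tuple(v / n1 for v in c1)
    # 2. 2nd column via Gram-Schmidt orthogonalization against r1
    c2 = (h[1], h[4], -h[7])
    d = sum(a * b for a, b in zip(r1, c2))
    t2 = tuple(a - d * b for a, b in zip(c2, r1))
    n2 = math.sqrt(sum(v * v for v in t2))
    r2 = tuple(v / n2 for v in t2)
    # 3. 3rd column as the cross product r3 = r1 x r2
    r3 = (r1[1] * r2[2] - r1[2] * r2[1],
          r1[2] * r2[0] - r1[0] * r2[2],
          r1[0] * r2[1] - r1[1] * r2[0])
    # assemble R with r1, r2, r3 as its columns
    return [[r1[i], r2[i], r3[i]] for i in range(3)]
```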


Get Rotation from Homography Matrix!

•  remember Euler angles (with yaw-pitch-roll order)!
•  get the angles from the 3x3 rotation matrix!
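The deck gives the angle-extraction formulas as figures; as an assumption about the convention, here is one standard yaw-pitch-roll extraction for R = Ry(yaw)·Rx(pitch)·Rz(roll) — verify against the course notes before relying on it:

```python
import math

def rot_yxz(yaw, pitch, roll):
    # compose R = Ry(yaw) * Rx(pitch) * Rz(roll); used only to check the extraction
    cy, sy = math.cos(yaw), math.sin(yaw)
    cx, sx = math.cos(pitch), math.sin(pitch)
    cz, sz = math.cos(roll), math.sin(roll)
    return [[cy*cz + sy*sx*sz, -cy*sz + sy*sx*cz, sy*cx],
            [cx*sz,             cx*cz,            -sx  ],
            [-sy*cz + cy*sx*sz, sy*sz + cy*sx*cz, cy*cx]]

def euler_from_rotation(R):
    # valid away from the gimbal-lock case |pitch| = 90 deg
    pitch = math.asin(-R[1][2])
    yaw = math.atan2(R[0][2], R[2][2])
    roll = math.atan2(R[1][0], R[1][1])
    return yaw, pitch, roll
```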


Temporal Filter to Smooth Noise!

•  pose estimation is very sensitive to noise in the measured 2D coordinates!
→  estimated position and especially rotation may be noisy!

•  apply a simple temporal filter with weight α to smooth the pose at time step k:!

    (θx, θy, θz, tx, ty, tz)^(k)_filtered = α · (θx, θy, θz, tx, ty, tz)^(k−1)_filtered
                                          + (1 − α) · (θx, θy, θz, tx, ty, tz)^(k)_unfiltered

•  smaller α → less filtering, larger α → more smoothing!
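The filter update is a one-liner in any language; a Python sketch (names illustrative):

```python
def filter_pose(prev_filtered, new_pose, alpha):
    # filtered_k = alpha * filtered_{k-1} + (1 - alpha) * unfiltered_k,
    # applied element-wise to (theta_x, theta_y, theta_z, t_x, t_y, t_z)
    return tuple(alpha * p + (1.0 - alpha) * n
                 for p, n in zip(prev_filtered, new_pose))
```

With α = 0 the new pose passes through unfiltered; with α close to 1 the pose barely moves.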


Pose Estimation via Homographies – Step-by-Step!

in each loop() call on the VRduino:!

1.  get timings from all 4 photodiodes in “ticks”!
2.  convert “ticks” to degrees and then to 2D coordinates on the plane at unit
distance (i.e. get x_i^n, y_i^n)!
3.  populate matrix A using the 2D and 3D point coordinates!
4.  estimate the homography as h = A⁻¹b!
5.  get position tx, ty, tz and rotation, e.g. in Euler angles, from the estimated
homography!
6.  apply the temporal filter to smooth out noise!

Must read: course notes on tracking!
Understanding Pose Estimation!

1.  how to get projected 2D coordinates?!
2.  image formation!
3.  estimate pose with linear homography method!
4.  estimate pose with nonlinear Levenberg-Marquardt method (next class)! → advanced
topic; all details are derived in the course notes!
