To gain R and T uniquely in 3-D space, at least three correspondences are necessary. Firstly, the rotation will be calculated. The sets of points are translated by their centroids µ_x and µ_y to the origin, and the covariance matrix Σ_xy of the translated sets of points is computed. n denotes the number of corresponding pairs of points:

    µ_x = (1/n) · Σ_{i=1..n} X_i        µ_y = (1/n) · Σ_{i=1..n} Y_i        (6)

    Σ_xy = (1/n) · Σ_{i=1..n} (Y_i − µ_y)(X_i − µ_x)^T        (7)
The best solution for R in the sense of minimizing the squared errors is now achieved by computing the singular value decomposition Σ_xy = U S V^T and setting the singular values equal to 1. If rank(Σ_xy) = 2, which is always the case for only three corresponding pairs of points or for more than three points that are coplanar, the transformation matrix itself is exactly determined, but it remains ambiguous whether it is a rotation or a reflection. This can be checked by calculating det(R). Then, if necessary, S has to be corrected:

    R = U S̃ V^T        (8)

    with S̃ = diag(1, 1, 1)  for det(R) = 1,
         S̃ = diag(1, 1, −1) for det(R) = −1        (9)
Having calculated the rotation matrix, the translation vector can now be gained by simply rewriting (5) and using the mean vectors µ_x and µ_y of the two sets of points:

    T = µ_y − R µ_x        (10)
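Written out, (6)-(10) amount to only a few lines of linear algebra. The sketch below is a minimal Python/NumPy illustration, not the implementation used here; the function name and the convention of storing one 3-D point per row are assumptions.

```python
import numpy as np

def estimate_rigid_transform(X, Y):
    """Estimate R, T such that Y_i ≈ R X_i + T in the least-squares sense.

    X, Y: (n, 3) arrays of corresponding 3-D points, n >= 3.
    Illustrates eqs. (6)-(10).
    """
    mu_x = X.mean(axis=0)                    # centroids, eq. (6)
    mu_y = Y.mean(axis=0)
    Xc, Yc = X - mu_x, Y - mu_y
    Sigma_xy = (Yc.T @ Xc) / X.shape[0]      # covariance matrix, eq. (7)

    U, _, Vt = np.linalg.svd(Sigma_xy)
    # Replace the singular values by 1, flipping the last sign if the
    # result would be a reflection instead of a rotation, eqs. (8), (9).
    S_tilde = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = U @ S_tilde @ Vt

    T = mu_y - R @ mu_x                      # translation, eq. (10)
    return R, T
```

For three exactly coplanar points, Σ_xy has rank 2 and the sign correction decides between a rotation and a reflection, which is precisely the ambiguity addressed next.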
Although the distinction between rotation and reflection was made by calculating the determinant of R, this decision caused problems in practice. For example, the roll angle calculated from the rotation matrix (see below) jumped between 0° and −180° in certain cases. To avoid this effect, an additional helper point, lying outside the plane of the three markers, is constructed for each set of points (Fig. 6):

1) Calculate two vectors l_1 and l_2 lying in the plane spanned by the three points.
2) Calculate the vector normal to the plane:

    n = l_1 × l_2        (13)

3) Calculate the mean vector µ as in (6).
4) Calculate the helper point:

    X_helper = µ + n        (14)

Fig. 6: Construction of the helper points

The covariance matrix Σ_xy calculated from X_i, X_helper, Y_i and Y_helper by (6), (7) then has full rank, and R will always be a valid rotation matrix. T can still be computed as in (10).
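A minimal Python/NumPy sketch of this construction is given below. The choice of l_1 and l_2 as difference vectors between the first point and the other two is an assumption; any pair of linearly independent in-plane vectors would serve.

```python
import numpy as np

def add_helper_point(P):
    """Append an out-of-plane helper point to three 3-D points (one per row).

    Applied to both point sets, this makes the covariance matrix of
    eq. (7) full rank, following eqs. (13) and (14).
    """
    l1 = P[1] - P[0]          # two vectors spanning the plane (assumed choice)
    l2 = P[2] - P[0]
    n = np.cross(l1, l2)      # plane normal, eq. (13)
    mu = P.mean(axis=0)       # mean vector, as in eq. (6)
    return np.vstack([P, mu + n])   # helper point X_helper = µ + n, eq. (14)
```

Because a rotation preserves cross products of corresponding difference vectors, the helper points of the two sets correspond to each other and can be passed to the registration step together with the three measured points.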
Thus far, the rotation and translation of the quadcopter are still computed w.r.t. the left camera's frame. In most cases, the camera rig will be placed "somewhere" in a room, so the pose of the camera would not be a desirable reference frame for further applications such as position control. Thus, another transformation from camera to world coordinates is required. The idea is to define the new reference frame as the frame spanned by the quadcopter lying, e.g., at its start position. The desired rotation/translation of the quadcopter w.r.t. its start position is given by:

    R_0 = R_QC,start^T · R_QC,camera        (15)

    T_0 = R_QC,start^T · T_QC,camera − R_QC,start^T · T_QC,start        (16)

R_QC,camera denotes the rotation w.r.t. the camera frame, R_QC,start the rotation at the start position (i.e. in the new reference frame), and R_0 the desired rotation w.r.t. the start position; T_QC,camera, T_QC,start and T_0 denote the corresponding translations. When the camera rig's position changes, only the pose stored at the quadcopter's start position, derived by the algorithms above, needs to be stored anew.
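Assuming the start pose (R_QC,start, T_QC,start) has been stored once, applying (15) and (16) to every new measurement is a two-line operation. The following Python/NumPy sketch uses a hypothetical function name:

```python
import numpy as np

def pose_in_start_frame(R_cam, T_cam, R_start, T_start):
    """Re-express a pose measured in the left-camera frame relative to the
    quadcopter's stored start pose, following eqs. (15) and (16)."""
    R0 = R_start.T @ R_cam                          # eq. (15)
    T0 = R_start.T @ T_cam - R_start.T @ T_start    # eq. (16)
    return R0, T0
```

Here R_cam and T_cam are the current outputs of the registration step, and R_start, T_start the values stored at take-off.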
IV. PERFORMANCE OF THE SYSTEM
After the implementation of the tracking of the quadcopter,
its performance was evaluated before attempting position
control.
A. Processing times and delays
To measure the total delay between a change of the quadcopter's position and a control reaction, the processing times of the individual algorithms were logged. In addition, the time between a change and the moment the image is delivered by the camera driver was measured with a digital oscilloscope. As can be seen in Table I, the largest part of the delay was caused by the cameras. This delay includes an interframe delay of 33 ms on average, or 66 ms at maximum, assuming 15 FPS. Compared to the communication and camera delays, the other processing times were negligible. In the worst case it takes 121 ms + 96/2 ms = 169 ms, and on average 71 ms, from a change in position to a response at the quadcopter.
TABLE I: Processing times and delays

    step                           t [ms]   std(t) [ms]   t_max [ms]
    load image from driver           0.5        0.4           1.9
    LED recognition, all LEDs        0.7        0.2           2.1
    LED recognition, three LEDs      5.8        0.6           7.3
    LED recognition, no LED         20.1        1.2          28.4
    3D reconstruction              < 0.1         -             -
    position and orientation       < 0.1         -             -
    communication, round trip       55.1       18.8          96
    camera delay                    86.9       20.6         121
The software was executed on a laptop with a 2.4 GHz Core2Duo processor and 3 GB RAM. CPU load with only three recognized LEDs at 15 FPS was about 25-30 %, including 15 % needed by the camera drivers to capture images. Memory usage was about 50 MB.
B. Pose accuracy

Absolute as well as relative accuracy was measured by mounting the quadcopter on a programmable 2D positioning table with a traverse path of 600 mm in the x- and 1200 mm in the y-direction. The quadcopter could also be rotated around its z-axis. Trajectories were programmed and taken as reference. The experiments were repeated with the cameras' optical axes aligned parallel as well as converging, at certain distances D to the positioning table. To measure the absolute accuracy, a rectangular trajectory that used the whole traverse paths was programmed. Fig. 7 shows the values measured by the tracking system (blue) compared to the desired trajectory (red). The mean absolute deviations ∆ from the trajectory were computed as a quality criterion; they are listed in Table II.

Fig. 7: measured values (blue) vs. desired trajectory (red)

TABLE II: Evaluation of absolute position accuracy

    alignment              ∆x [mm]   ∆y [mm]   ∆yaw [°]
    parallel, D = 2 m         43       102        1.1
    parallel, D = 4 m         98       137        2.8
    converging, D = 2 m       18        87        1.1
    converging, D = 4 m       42       120        2.0

To measure how accurately small changes could be recognized by the system, the position was increased along only one axis in steps of 50 mm. This was repeated for the x- and the y-axis and for each of the camera alignments listed above. This time, the mean value δ of the absolute deviations from the desired step size of 50 mm was calculated over all steps; the results are shown in Table III.

TABLE III: Evaluation of relative position accuracy

    alignment              δx [mm]   δx [%]   δy [mm]   δy [%]
    parallel, D = 2 m        4.3       8.7      1.8       3.5
    parallel, D = 4 m       26.0      52.1      3.7       7.3
    converging, D = 2 m      4.2       8.3      0.9       1.7
    converging, D = 4 m      6.6      13.1      1.5       3.1

The converging camera alignment is clearly preferable, especially regarding the relative accuracy at larger distances. Overall, the position resolution is sufficient to stabilize the quadcopter on a desired trajectory. A reasonable flight volume ranges from 1.5 m to 4 m distance from the cameras and up to 1.5 m in height.
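Both quality criteria are plain mean absolute deviations. The sketch below shows how values like ∆ (Table II) and δ (Table III) could be computed from logged samples; the variable names and data layout are assumptions, not the evaluation code used for the experiments.

```python
import numpy as np

def mean_abs_deviation(measured, reference):
    """∆: mean absolute deviation of tracked positions from the programmed
    reference trajectory (Table II), in the same unit as the input."""
    return float(np.mean(np.abs(np.asarray(measured) - np.asarray(reference))))

def mean_step_deviation(positions, step_mm=50.0):
    """δ: mean absolute deviation of measured step sizes from the commanded
    50 mm steps along one axis (Table III)."""
    steps = np.diff(np.asarray(positions))
    return float(np.mean(np.abs(steps - step_mm)))
```

Dividing δ by the 50 mm step size corresponds to the percentage columns of Table III.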
V. POSITION AND HEADING CONTROLLER

We show an exemplary design of a position controller to demonstrate the capabilities of the overall system. The controllers are implemented on the high-level processor of the IMU (II-A) using a Matlab/Simulink SDK available for it.
Fig. 9: stationary hovering (x, y, z over time [s])
Fig. 10: disturbances of x- and y-position

VI. CONCLUSIONS
We presented an effective and reliable method for tracking a quadcopter or general objects with active markers. The tracking system is highly transportable, easy to set up, and can be executed on a laptop. We also showed that the negative influence of the time delays on the performance of the controllers could be compensated by evaluating the onboard acceleration sensors of the quadcopter. In the future, we plan to apply more than two cameras to gain a larger flight volume. Position control and sensor data fusion could be further enhanced by using techniques like the family of Kalman filters.

Video: http://www.lsr.ei.tum.de/research/videos/robotics/qc-tracking-high

ACKNOWLEDGMENT

This work was supported in part within the DFG excellence initiative research cluster Cognition for Technical Systems (CoTeSys, see also www.cotesys.org) and the Bernstein Center for Computational Neuroscience Munich (see also www.bccn-munich.de). The authors also gratefully acknowledge Daniel Gurdan, Jan Stumpf, Michael Achtelik and Klaus-Michael Doth from Ascending Technologies GmbH for their technical support with the quadcopter.