Object Motion Tracking Based On Color Detection For Android Devices
International Scholarly and Scientific Research & Innovation 9(4) 2015 999 scholar.waset.org/1307-6892/10001450
World Academy of Science, Engineering and Technology
International Journal of Computer and Information Engineering
Vol:9, No:4, 2015
Backward, Left, Right, and can stop the motion with the character "S". In the proposed work, optical encoders or Hall-effect sensors for wheel synchronization are not used, as this was beyond the scope of the project. Future work would incorporate encoders and a PID algorithm for precise movement of the robot car.
After color filtering, the object that has the pre-defined color appears as a group of white pixels, Fig. 3 (b). However, there are many white pixels dispersed across the frame, which are pixel noise. Usually, the removal of pixel noise is performed by erosion and dilation morphological operations with a structuring
element that may be of suitable size in order to eliminate the
noise but to leave the object intact, Fig. 3 (c). The resulting
object blob, which is a group of white pixels that form a
coherent space, is analyzed for determining the largest
contour, which is assumed to be the desired object. The largest
contour is shown with a white rectangle that bounds the
desired object whereas other objects of the same color are
ignored. This is a simplistic and rough approach but with low
computational complexity. The alternative and more precise [...]
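The pipeline described above (color filtering, morphological opening to remove pixel noise, then selection of the largest blob and its bounding rectangle) can be sketched as follows. The paper's implementation uses OpenCV on Android; this plain-NumPy version, including the synthetic hue channel and all function names, is an illustrative assumption, not the authors' code.

```python
import numpy as np
from collections import deque

def color_mask(hue, lo, hi):
    """Binary mask: 1 where the hue channel lies inside [lo, hi]."""
    return ((hue >= lo) & (hue <= hi)).astype(np.uint8)

def erode(mask, k=3):
    """Morphological erosion with a k x k square structuring element."""
    pad = k // 2
    p = np.pad(mask, pad, constant_values=0)
    out = np.zeros_like(mask)
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            out[i, j] = p[i:i + k, j:j + k].min()
    return out

def dilate(mask, k=3):
    """Morphological dilation with a k x k square structuring element."""
    pad = k // 2
    p = np.pad(mask, pad, constant_values=0)
    out = np.zeros_like(mask)
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            out[i, j] = p[i:i + k, j:j + k].max()
    return out

def opening(mask, k=3):
    """Erosion followed by dilation: removes isolated noise pixels
    while leaving blobs larger than the structuring element intact."""
    return dilate(erode(mask, k), k)

def largest_blob_bbox(mask):
    """Bounding rectangle (x, y, w, h) of the largest 4-connected
    white blob, or None if the mask is empty."""
    h, w = mask.shape
    seen = np.zeros((h, w), dtype=bool)
    best, best_size = None, 0
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not seen[i, j]:
                q = deque([(i, j)])
                seen[i, j] = True
                size, y0, y1, x0, x1 = 0, i, i, j, j
                while q:
                    y, x = q.popleft()
                    size += 1
                    y0, y1 = min(y0, y), max(y1, y)
                    x0, x1 = min(x0, x), max(x1, x)
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                if size > best_size:
                    best_size = size
                    best = (x0, y0, x1 - x0 + 1, y1 - y0 + 1)
    return best

# Synthetic hue channel: one large target, one smaller object of the
# same color, and scattered single-pixel noise.
hue = np.zeros((30, 30), dtype=np.uint8)
hue[5:13, 10:18] = 100          # 8x8 target object
hue[20:23, 2:5] = 100           # smaller same-colored object (ignored)
hue[0, 0] = hue[15, 25] = 100   # pixel noise

mask = color_mask(hue, 90, 110)      # Fig. 3 (b): color filtering
clean = opening(mask, 3)             # Fig. 3 (c): noise removal
print(largest_blob_bbox(clean))      # Fig. 3 (d): (10, 5, 8, 8)
```

As in the paper, the smaller blob of the same color survives the opening but is ignored because only the largest blob's bounding rectangle is reported.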
When the object moves to the left or right, the device sends a message to the robot car to turn left or right, respectively. The area and the dimensions of the bounding rectangle may be evaluated with OpenCV functions, and therefore the relative position of the object with respect to the smartphone's view may be determined.
Fig. 5 shows the smartphone's view with the bounding rectangle of the tracked object and the estimated quantities. The width and height of the view are denoted by W and H, the width and height of the object by w and h, and the coordinates of the top-left corner of the object by x and y, respectively.
The view is divided into three equal zones, as shown in Fig. 5, by the two imaginary lines P1 and P2, and the direction of the robot car's motion is defined by the object's relative position. This position is estimated by the following logical statements, where x + w/2 is the horizontal middle of the object's boundaries. The width of the middle zone may be trimmed so that the motion of the robot car is more consistent with the object's movement. It should not be too small, because the car will then turn left or right too easily, and it should not be too large, because the car will mainly move forward, losing the
object.
The approaching speed of the car toward the object may be constant or may vary in inverse proportion to the area of the white bounding rectangle, and the car may stop when the area exceeds a threshold value.

Fig. 3 Image processing steps: (a) Frame of the captured image; (b) Binary image after color filtering; (c) Binary image after pixel noise removal; (d) Object tracking after largest contour evaluation
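One possible mapping of the bounding-rectangle area to a speed command is sketched below. The inverse-proportional scaling constant, the PWM range [0, 255], and the function name are hypothetical; the paper only states that speed may be constant or inversely proportional to the area, with a stop threshold.

```python
def approach_speed(area, stop_area, v_max=255):
    """Speed command inversely proportional to the area of the white
    bounding rectangle; 0 means the stop command 'S' is sent.
    The scaling factor of 8 and the range [0, v_max] are assumptions."""
    if area <= 0:
        return v_max                 # object lost or very far: full speed
    if area >= stop_area:
        return 0                     # area exceeds threshold: stop
    return min(v_max, int(v_max * stop_area / (8 * area)))

print(approach_speed(1000, 8000))    # far away  -> 255
print(approach_speed(4000, 8000))    # closer    -> 63
print(approach_speed(9000, 8000))    # too close -> 0 (stop)
```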
x + w/2 < W/3  =>  turn Left   (1)
W/3 <= x + w/2 <= 2W/3  =>  move Forward   (2)
x + w/2 > 2W/3  =>  turn Right   (3)
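The zone test above translates directly into a small decision function. The command characters and the `trim` parameter (one way the middle zone's width could be narrowed, as the text suggests) are illustrative assumptions.

```python
def steer_command(x, w, W, trim=0.0):
    """Direction command from the object's bounding rectangle.

    x, w : left edge and width of the bounding rectangle
    W    : width of the smartphone view
    trim : fraction of W by which each side zone is widened,
           narrowing the middle zone (hypothetical parameter)
    """
    xm = x + w / 2.0                # middle of the object's boundaries
    p1 = W / 3.0 + trim * W         # imaginary line P1
    p2 = 2.0 * W / 3.0 - trim * W   # imaginary line P2
    if xm < p1:
        return 'L'                  # statement (1): turn left
    if xm > p2:
        return 'R'                  # statement (3): turn right
    return 'F'                      # statement (2): move forward

print(steer_command(10, 20, 300))   # object far left  -> 'L'
print(steer_command(140, 20, 300))  # middle zone      -> 'F'
print(steer_command(260, 20, 300))  # object far right -> 'R'
```

With a positive `trim`, an object just inside the default middle zone already triggers a turn, making the car follow the object's sideways movement more closely.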
ACKNOWLEDGMENT
This paper is part of a Master's degree thesis project in the Department of Electronic Computer Systems Engineering, funded by the Research Committee of the Piraeus University of Applied Sciences.