ROS Robot Programming
Address #1505, 145, Gasan Digital 1-ro, GeumCheon-gu, Seoul, Republic of Korea
E-mail [email protected]
Website www.robotis.com
ISBN 979-11-962307-1-5
Reproduction and modification of this book in any form or by any means is strictly prohibited
without the prior written consent or permission of the publisher.
ROS
Robot Programming
YoonSeok Pyo, HanCheol Cho, RyuWoon Jung, TaeHoon Lim
Preface
This book is a ROS robot programming guide based on the experiences we had
accumulated from ROS projects. We tried to make this a comprehensive guide
that covers all aspects necessary for a beginner in ROS. Topics such as
embedded systems, mobile robots, and robot arms programmed with ROS are
included. For those who are new to ROS, there are footnotes throughout the
book providing more information on the web. Through this book, I hope that
more people will be aware of and participate in bringing forward the ever-
accelerating collective knowledge of Robotics Engineering.
Lastly, I would like to thank everybody who helped in publishing this book. I
am also grateful to Morgan, Brian, Tully, and the entire ROS development team, maintainers,
and contributors. Sincere gratitude to the ROS experts Jihoon Lee, Byeongkyu Ahn, Keunman
Jung, Changhyun Sung, and Seongyong Koo, who always share new knowledge with me. I look
forward to continuing to do great things with you all. A special thanks to Changhoon Han,
Inho Lee, Will Son, and Jason and Kayla Kim, who were pivotal in making the book easy for
non-experts to understand. Thanks to the entire ROBOTIS team. This book is
here thanks to the great team, who started this endeavor with the question of
“What is a robot?” I would like to thank the members of the Open Source Team (OST),
which strives to help more people ponder upon and develop robots. I also thank
Jinwook Kim, a pillar of the open source ecosystem and community. Much thanks to
the ROS Avengers Hancheol Cho, Ryuwoon Jung,
and Taehoon Lim, who are all co-authors of this book. A special thanks to my
academic advisors from Kyushu University, Professor Ryo Kurazume and
Professor Tsutomu Hasegawa. You have allowed me to walk the path of a
researcher, and I continue to learn much from you. Thank you for the never-
ending teachings. I would also like to thank Hyungil Park and the entire
administrative team of OROCA who gave me endless support in making this
book. Thank you to all the members of OROCA and to the staff of the OROCA Open
Projects, who are so passionate about open robotics platform development. I look
forward to more discussions and projects on many topics
regarding robotics. Thanks to the administrators of the Facebook group, the
Korea Open Society for Robotics, and to all my fellow colleagues who deeply
care for and ponder on robotics. Thanks to the robot game team, RO:BIT,
with whom I have shared my youth. Thanks to the robot research club, ROLAB.
I would also like to thank the CEO, Bill (Byoungsoo) Kim, and the CTO, Inyong Ha,
of ROBOTIS, who supported all my activities so that I could write this book.
Last but not least, I would like to thank my loving family. To my parents: I love,
admire, and always thank you. I would like to extend my love and gratitude to
my parents-in-law, who always support me by my side. To my loving wife
Keunmi Park, who always takes care of me: I love you, always thank you, and
wish to live in much happiness with you! To my son, Jian, and daughter, Jiwoo,
who I cherish most in this world: I will always try to be a father that makes this
world brighter and happier!
July 2017,
Yoon Seok Pyo
Preface
I would like to thank Hyung Joon Pyo, Hyung il Park, and Byung Hoon Park for
our joint efforts in creating OpenCR. I will cherish memories of you helping
me to overcome my shortcomings. I am also grateful to Open Source Team
(OST) members who always make me cheerful and happy. I would like to
thank In wook Kim for giving me generous advice and encouragement during
difficult times since the beginning of my career. I would also like to thank
Byoung Soo Kim, the CEO of ROBOTIS for giving me the opportunity for a new
challenge in my life. When I was young, I read his writings in the Hitel online
community, which taught me a lot and eventually led me to making robots, and
ultimately I was able to join his company to build them.
I promise to be a good father to my loving son, Yu Chan, with whom I have not
been able to play much on the excuse of being busy. I would like to express my
love and gratitude to my wife Kyoung Soon, who always gives me strength
when I am in need and returns my immature behaviors with love and care.
July 2017,
Han Cheol Cho
Preface
Now, make robots as we imagined! There was a time when I used to make
robots using the robot kits enclosed in books. Even when I would succeed in
making simple movements, I was so pleased and content thinking “This is a
robot!” However, in recent years, many concepts of robots have been redefined
through the enhancement of computer performance, decreasing cost of
equipment, and the rapid and convenient prototyping of materials. Hobbyists
began to dive into making robots, growing the mass of information. Even cars,
planes, and submarines can now be called robot platforms as makers began to
automate their own products. As people in various fields started to incorporate
technology that encompasses a wide range of knowledge, robots have finally
begun taking the form of what has long been dreamed of. At first glance, we may
say that robotics is in a great age, but on the other side of this progress there
may be those who have dropped out of the fast pace of today's robots. This could
make robots accessible only to those with specialized knowledge or people inside
the industry.
ROS can be the solution to this problem. It makes it easy to learn and use the skills
required in the field without being an expert, saving the time and money it would
have taken to acquire the skills that used to be necessary. A system is developing
that allows people to ask developers about an issue and receive direct feedback,
enhancing the development environment. Companies such as BMW are currently
implementing ROS, and it is becoming possible to use ROS in business or for
collaboration. The introduction of ROS can be considered a competitive advantage
in the corresponding field.
I hope that I will be able to meet the readers of this book again in the world of
ROS. I would like to express my sincere thanks and appreciation to the
members of Open Source Team (OST), especially Dr. Yoon Seok Pyo, who gave
me the opportunity to participate in writing this book. I would also like to
thank Han Cheol Cho and Tae Hoon Lim, who went through this process with
me amidst various ongoing projects. In addition, I would like to thank
Hyunjong Song and Hyun Suk Kim, who gave me generous advice and help in
the robot society, Jinwook Kim, who helped me so that I could continue
learning about robots, Ki Je Sung, who joined me in hosting the autonomous
driving tournament, and the members of the Oroca AuTURBO project, with whom I
have spent valuable time. I would also like to express my appreciation to my parents
for their generous support and care. I want them to know that I only wish to be able
to repay their love somehow. I give my deepest gratitude to my brother, whose
company has enriched my life, and to Ha Kim, who will always be by my side. First
and foremost, I give all the glory to my Creator, God.
July 2017,
Ryu Woon Jung
Preface
Today we can find many videos and articles about how our society, economy,
and culture will change in the future based on state-of-the-art robot technology
and artificial intelligence. Although there is optimism that our lives will improve
thanks to rapidly developing technology, a pessimistic outlook that the labor
market will suffer is making people more insecure. As such,
research and development on robots and artificial intelligence that is currently
taking place around us will have a profound impact on us in the near future.
Therefore, we have to be more interested in robot technology than we are now
and try to understand and be prepared for the future.
I was in charge of the manipulator part of this book and tried to organize the
ROS, Gazebo, and MoveIt! Wiki contents to be easier to understand. I also tried
filling in gaps by including topics that were not explained in the Wiki which
took me some time to understand. I hope to be a person who can share useful
knowledge with others.
I would first like to thank Dr. Yoon Seok Pyo, who has given me many lessons
as my senior in school and as a supervisor at work. You gave me the courage
and opportunity throughout the entire process of writing this book. I would
also like to thank Dr. Chang Hyun Seong for reviewing my writing in spite of
your busy schedule, and for kindly answering all my questions. Special thanks
to my Open Source Team colleagues, whom I spend time with from morning to
evening, and to all the ROBOTIS company members, who always greet me with
smiles. I personally want to thank Professor Jong ho Lee, who was my
professor at my graduate school. Under his guidance, I was able to develop not
only engineering knowledge and research but also integrity, patience and
responsibility. Thank you once again.
Lastly, I would like to express my love and gratitude to my loving father who is
always by my side with a warm heart, my mother, who has such curiosity and
creativity and is always open to learning from everything, and my only brother,
with whom I always feel most comfortable. I would like to thank Go Eun Kim,
who has stood by my side for the past seven years with understanding and
enduring love. You make my heart beat each day.
July 2017,
Tae Hoon Lim
About the Authors
YoonSeok Pyo
HanCheol Cho
About the Authors
RyuWoon Jung
TaeHoon Lim
Open Source Contents
■■ https://github.com/ROBOTIS-GIT/robotis_tools → Chapter 3
■■ https://github.com/ROBOTIS-GIT/turtlebot3_deliver → Chapter 12
■■ https://github.com/ROBOTIS-GIT/open_manipulator → Chapter 13
■■ OpenCR (Chapter 9)
• Board: https://github.com/ROBOTIS-GIT/OpenCR-Hardware
• Burger: http://www.robotis.com/service/download.php?no=676
• Waffle: http://www.robotis.com/service/download.php?no=677
• Chain: http://www.robotis.com/service/download.php?no=690
• SCARA: http://www.robotis.com/service/download.php?no=691
• Link: http://www.robotis.com/service/download.php?no=692
➊ Direct Download
To use the “git” command to download directly in Linux, you will need to install git. Open a
terminal window and install git as follows:
You can download the source code of the repository with the following command.
(e.g.: ros_tutorials Package)
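For example, on Ubuntu the installation and download can be done as follows. This is a minimal sketch; the ros_tutorials repository URL is an assumption based on the ROBOTIS-GIT GitHub organization listed above.

```shell
# Install git on Ubuntu/Debian
sudo apt-get update
sudo apt-get install -y git

# Download the source code of a repository
# (the ROBOTIS-GIT/ros_tutorials URL is assumed from the organization listed above)
git clone https://github.com/ROBOTIS-GIT/ros_tutorials.git
```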
In addition, the contents related to the OpenCR controller for building ROS embedded systems
covered in this book and OpenManipulator for learning manipulation are also available.
Information about Dynamixel, which is used as an actuator for TurtleBot3 and OpenManipulator,
and its accompanying software, the Dynamixel SDK and Dynamixel Workbench, can also be found
at the links below.
Lastly, there are materials that can be used as ROS study references. They contain chapter-by-chapter
summaries as well as case examples that are very useful together with this book for college
courses, group studies, and seminars.
ROS Discourse is for news and general interest discussions. ROS Answers provides a forum which
can be filtered by tags to make sure the relevant people can find and/or answer the question, and
not overload everyone with hundreds of posts. Robot Source Community is a robotics technology
sharing community for robot developers.
Disclosure
■■ The open source code used in this book is governed by its respective designated license, and the
copyright owner or contributor is not responsible or liable for any direct or indirect damages,
incidental or consequential damages, special or general damages, or illegal or negligent
infringements arising out of the use of the software.
■■ The open source code used in this book may differ from the actual code depending on the version
used by the reader.
■■ Company names and product names appearing in this book are generally registered trademarks of the
respective companies, and related signs such as TM, ©, ® are omitted in the text.
■■ If you have any questions regarding the contents of this book, please contact the publisher or use the
community mentioned above.
Contents
4.3. Message 60
4.3.1. msg File 62
4.3.2. srv File 62
4.3.3. action File 63
4.4. Name 64
7.3. Creating and Running Service Servers and Client Nodes 162
7.3.1. Creating a Package 162
7.3.2. Modifying the Package Configuration File (package.xml) 163
7.4. Writing and Running the Action Server and Client Node 172
7.4.1. Creating a Package 173
7.4.2. Modifying the Package Configuration File (package.xml) 173
7.4.3. Modifying the Build Configuration File (CMakeLists.txt) 174
7.4.4. Writing the Action File 175
7.4.5. Writing the Action Server Node 176
7.4.6. Writing the Action Client Node 179
7.4.7. Building a Node 180
7.4.8. Running the Action Server 180
7.4.9. Running the Action Client 182
Chapter 13 Manipulator
Index 456
Chapter 1
Robot
Software Platform
Within the IT industry, Hardware, Operating System, Application, and User are said to be the
four main ecosystem components of a platform, as shown in Figure 1-2. When all of these
components exist, and when there is an unseen division and collaboration of work among
them, a platform can successfully become popular and personalized.
The previously mentioned PC and PP did not have all four of these components from the
beginning. At first, they had only on-board software that operated a specific hardware device
using dedicated firmware developed by a single company, and could only use services provided
by the manufacturer. If this concept is hard to understand, consider feature phones as an
example: they were produced by innumerable manufacturers before the advent of Apple's
iPhone. One can say that the common factor behind the success of the PC and PP was the
appearance of operating systems (Windows, Linux, Android,
iOS, etc.). The appearance of operating systems unified hardware and software, which led to
the modularization of hardware. Mass production reduced costs, specialized development
brought high performance, and ultimately it became possible for computers and mobile phones
to be personalized.
Among these software platforms, the major ones are the Robot Operating System (ROS)1, the
Japanese Open Robotics Technology Middleware (OpenRTM)2, the European real-time-control-
centered OROCOS3, the Korean OPRoS4, etc. Although their names differ, the fundamental
reason for the advent of robot software platforms is that there are too many different kinds of
robot software, and their incompatibilities cause many problems. Therefore, robot researchers
from around the world are collaborating to find a solution. The most popular robot software
platform is ROS, the Robot Operating System that will be covered in this book.
For instance, when implementing a function that helps a robot recognize its surroundings, the
diversity of hardware and the fact that it must work in real life can be a burden. Some tasks
may be easy for humans, but it is too difficult for researchers in a college laboratory or
company to make robots perform the many required functions, such as sensing, recognition,
mapping, and motion planning, all on their own. However, it would be a different story if
professionals from around the world shared their specialized software to be used by others. For
example, the robotics company Robotbase5, which drew attention on the crowdfunding site
Kickstarter and at CES 2015, recently developed the Robotbase Personal Robot and successfully
launched it through crowdfunding. Robotbase focused on their core technologies of face
recognition and object recognition; for the mobile base they used one from Yujin Robot6, which
supports ROS, for the actuators they used ROBOTIS Dynamixels7, and for obstacle recognition,
navigation, motor driving, etc. they used public ROS packages. Another example can be found
in the ROS Industrial Consortium
(ROS-I)8. Many of the companies leading the industrial robot field participate in this consortium
and are solving, one by one, newly emerging and difficult problems in areas such as automation,
sensing, and collaborative robots. Using a common platform, especially a software platform,
has proved to promote collaboration on problems that were previously difficult to tackle and to
increase efficiency.
1 http://www.ros.org/
2 http://openrtm.org
3 http://www.orocos.org/
4 http://ropros.org/
5 https://www.kickstarter.com/projects/403524037/personal-robot
6 http://www.yujinrobot.com/
7 http://www.robotis.com
8 http://rosindustrial.org/
Why should we learn ROS, a new kind of robot software platform? This is a frequently asked
question at offline ROS seminars. The short answer is that it can reduce development time.
People often say they do not want to spend time learning a new concept and would rather stick
to their current methods to avoid changing an already-built system or existing programs.
However, ROS does not require one to redevelop the existing system and programs from
scratch; rather, a non-ROS system can easily be turned into a ROS system by inserting a small
amount of standardized code. In addition, ROS provides various commonly used tools and
software, and it allows users to focus on the features that they are interested in or would like
to contribute to, which ultimately reduces development and maintenance time. Let us look at
the five main characteristics of ROS.
First is the reusability of programs. A user can focus on the feature that they would like to
develop and download corresponding packages for the remaining functions. At the same time,
they can share the programs they have developed so that others can reuse them. As an
example, to control the robot Robonaut29 used on the International Space Station, NASA is
said to have used not only programs developed in-house but also ROS, which provides various
drivers for multiple platforms, and OROCOS, which supports real-time control and reliable
message communication, in order to accomplish their mission in outer space. The Robotbase
case above is another example of thoroughly reused programs.
Third is the support of development tools. ROS provides debugging tools and a 2D visualization
tool (rqt), a software framework of ROS that implements various GUI tools in the form of plugins.
9 https://robonaut.jsc.nasa.gov/R2/
Fourth is the active community. The robot academic world and industry, which have been
relatively closed until now, are changing in the direction of emphasizing collaboration as a
result of the previously mentioned features. Regardless of differences in individual objectives,
collaboration through these software platforms is actually occurring. At the center of this
change is the community around the open source software platform. In the case of ROS, over
5,000 packages have been voluntarily developed and shared as of 2017, and the Wiki pages
that explain these packages exceed 18,000 pages thanks to the contributions of individual
users. Moreover, the Q&A, another critical part of the community, has exceeded 36,000 posts,
creating a collaboratively growing community. The community goes beyond discussing
instructions into identifying the necessary components of robotics software and creating
regulations for them. Furthermore, this is progressing to a state where users come together,
consider what robot software should entail for the advancement of robotics, and collaborate
to fill the missing pieces of the puzzle.
Fifth is the formation of an ecosystem. The previously mentioned smartphone platform
revolution is said to have occurred because of the ecosystem created by software platforms
such as Android and iOS. A similar progression is underway in the robotics field. In the
beginning, all kinds of hardware technology were overflowing, but there was no operating
system to integrate them. Various software platforms have since developed, and the most
esteemed among them, ROS, is now shaping its ecosystem: one that everyone ― hardware
developers from the robotics field such as robot and sensor companies, the ROS development
and operations team, application software developers, and users ― can be happy with.
Although the beginning may yet be marginal, when looking at the increasing number of users
and robot-related companies and the surge of related tools and libraries, we can anticipate a
lively ecosystem in the near future.
■■ OpenRTM National Institute of Advanced Industrial Science and Technology (AIST) - Japan
■■ OROCOS Europe
Aside from these, there are also Player, YARP, MARIE, URBI, CARMEN, Orca and MOOS.
As seen above, various robot software platforms are appearing, but it is hard to conclude
which one is better, because each provides unique and convenient functions such as component
extension, communication features, visualization, simulators, real-time support, and much
more. However, much like the operating systems of personal computers today, the robot
software platforms selected by users will become more popular while others diminish. Since
we are not developing the actual software platform itself, we will focus on developing
application programs that can run on general-purpose robot software platforms.
10 https://www.microsoft.com/robotics/
11 https://en.wikipedia.org/wiki/Evolution_Robotics
12 https://www.openrobotics.org/
13 http://doc.aldebaran.com/2-1/index_dev_guide.html
We can compare this to the case of Android. We neither developed Android nor have the
power to control its ecosystem, yet it has come to dominate the hardware and software
markets and is greatly contributing to the growth of the economy. In the same way, we can
become great players in the robot software platform market.
Then which of the currently existing robot software platforms would be good for us to
become familiar with? My best answer would be ROS, which is developed and maintained by
Open Robotics. Not only because of its highly active community, but also considering its
various libraries, expandability, and convenience of development, there is no other platform
like ROS. For your information, the Open Source Robotics Foundation (OSRF) changed its
name to Open Robotics in May 2017.
It should also be noted that the global ROS community is more active than that of any other
robot software platform. It is easier to find information when you encounter a problem,
because ROS is developed not solely by Open Robotics but also by academic researchers, field
developers, and even hobbyists, and all of these people actively use the community when they
have questions, making valuable information readily available to other users. In addition, not
only robot specialists but also a great number of network specialists and people in the
computer science and computer vision fields are making ROS an even more promising robot
software platform.
By using a robot software platform, even if a robot is composed of various hardware, you can
create an application program without thoroughly understanding the hardware, as long as the
basic functions are ready. This is much the same as how we can develop mobile apps without
knowing the hardware composition or details of the latest smartphone.
Unlike past work processes, when robot developers had to do everything from hardware
design to software design, more software engineers can now participate in the development
of actual robot applications. In other words, the software platform has allowed many engineers
to contribute efficiently to robot development, while hardware engineers can focus on
designing hardware that supports the interface required by the software platform. This change
provides the opportunity for the robot industry to advance rapidly.
Robot Operating
System ROS
http://www.ros.org/
The ROS Wiki defines ROS as above. In other words, ROS includes a hardware abstraction
layer similar to that of an operating system. However, unlike conventional operating systems,
it can be used with numerous combinations of hardware. Furthermore, it is a robot software
platform that provides various development environments specialized for developing robot
application programs.
ROS is the abbreviation for Robot Operating System, so it would be natural to assume that it
is an operating system. In particular, those who are new to ROS might think that it is an
operating system similar to the aforementioned ones. When I first encountered ROS, I also
thought that it was a new operating system for robots.
1 http://wiki.ros.org/ROS/Introduction
As shown in Figure 2-2, ROS data communication is supported not only on one operating
system but across multiple operating systems, hardware, and programs, making it highly
suitable for robot development, where various kinds of hardware are combined. This will be
discussed in detail in the following section.
2 http://wiki.ros.org/Client%20Libraries
3 http://wiki.ros.org/APIs
For example, smartphone manufacturers produce devices that support the hardware
interfaces of the operating system, and operating system companies create generic libraries
to operate devices from various manufacturers. Therefore, software developers can use
numerous devices to develop applications without understanding the hardware. The
ecosystem also includes the distribution of applications to end users.
The ecosystem is not a new concept in the market. There was also a variety of hardware
manufacturers in the personal computer market, and this hardware was bound together mainly
by Microsoft's Windows operating system and the open source Linux. The formation of a
technology ecosystem seems to be as natural as a natural ecosystem.
Robotics is also forming its ecosystem. Various hardware technologies were overflowing in
the beginning, but there was no operating system to integrate them. Several software platforms
appeared and ROS drew enough attention to build an ecosystem. Although its effect may yet be
marginal, when looking at the increasing number of users, robot-related companies, and related
tools and libraries, we can anticipate a fully functional ecosystem in the near future.
Dr. Morgan Quigley is one of the founders and a software development manager of Open
Robotics (formerly the Open Source Robotics Foundation, OSRF), which is responsible for the
development and management of ROS. Switchyard, a program created for the development of
the artificial intelligence robots used in the Stanford AI Lab's projects at the time, is the
predecessor of ROS. In addition, Dr. Brian Gerkey (http://brian.gerkey.org/), the developer of
the Player/Stage Project (the Player network server and 2D Stage simulator, which later
influenced the development of the 3D simulator Gazebo), which had been developed since
2000 and had a major impact on ROS's networking programs, is the CEO and co-founder of
Open Robotics. Thus ROS was influenced by Player/Stage from 2000 and Switchyard from
2007 before Willow Garage adopted the name ROS in 2007.
In November 2007, the U.S. robot company Willow Garage took over the development of ROS.
Willow Garage is well known in the field of personal robots and service robots, and is also
famous for developing and supporting the Point Cloud Library (PCL), widely used with 3D
devices such as Kinect, and the open source image processing library OpenCV.
Willow Garage started to develop ROS in November 2007, and on January 22, 2010, ROS 1.0
came into the world. The first official version known to us was released on March 2, 2010 as
ROS Box Turtle. Afterwards, C Turtle, Diamondback, and many other versions were released
in alphabetical order, as with Ubuntu and Android.
4 http://roscon.ros.org/2017/
5 http://wiki.ros.org/Metrics
6 http://wiki.ros.org/
7 http://www.willowgarage.com/pages/software/ros-platform
8 https://www.osrfoundation.org/team/morgan-quigley/
9 http://stair.stanford.edu/
10 https://opensource.org/licenses/BSD-3-Clause
11 https://www.apache.org/licenses/LICENSE-2.0
12 http://roscon.ros.org
13 http://wiki.ros.org/Events
14 http://www.willowgarage.com/pages/pr2/overview
15 http://www.turtlebot.com/
■■ Nov 1, 2007 - Willow Garage starts development under the name ‘ROS’
■■ May 1, 2007 - Switchyard Project, Morgan Quigley, Stanford AI LAB, Stanford University
Apart from version 1.0, ROS versions have been named in alphabetical order, as with Ubuntu
and Android. For instance, Kinetic Kame starts with ‘K’, the 11th letter of the alphabet, and is
the 10th official release.
In addition, there is one more rule. Each version has an illustration in the form of a poster
and a turtle icon, as shown in Figure 2-8. These turtle icons are also used in the official
simulation tutorial called ‘turtlesim’. The turtle symbol of ROS stems from the educational
programming language Logo16, developed at MIT's AI Lab17 in the 1960s. More than 50 years
ago, in 1969, a turtle robot was built using Logo that could actually move on the floor and
draw pictures according to instructions given by the computer. Based on this robot, the
program ‘turtlesim’ was developed, and the actual robot was later also called TurtleBot18.
16 https://en.wikipedia.org/wiki/Logo_(programming_language)
17 http://el.media.mit.edu/logo-foundation/what_is_logo/index.html
18 http://spectrum.ieee.org/automaton/robotics/diy/interview-turtlebot-inventors-tell-us-everything-about-the-robot
The support period of ROS differs for each version, but generally two years of support are
available after a release. Every two years, Ubuntu releases a Long Term Support20 (LTS)
version, and the corresponding ROS version is supported for the next five years. For instance,
the Kinetic Kame version, which supports Ubuntu 16.04 LTS, will be supported until April
2021. Versions other than those matching an LTS release are generally intended for minor
upgrades and maintenance, as they support the latest Linux kernel. Therefore, many ROS
users use the versions released alongside an LTS every two years. The ROS versions released21
since 2014 are shown in Figure 2-9.
19 https://www.worldturtleday.org/
20 https://wiki.ubuntu.com/LTS
21 http://wiki.ros.org/Distributions
Ubuntu ported package information for the kinetic version of ROS can be found at the
corresponding information page22. In this page, you can see whether the kinetic version has
been completed or is in the process of migrating packages (source) for each Linux version.
The Linux release names may not be familiar to you. As you can see from the following
list of Ubuntu versions, 14.04 Trusty is the Ubuntu ‘T’ version, and releases are named in alphabetical
order. You should be able to find a stable version in the list. For the latest version of ROS,
many packages are still in progress, but if this is not critical to your application, you can
use either the latest version or a currently available version. If a package you were using has not
been converted for the latest ROS version, you might have to wait a bit.
22 http://repositories.ros.org/status_page/ros_kinetic_default.html
■■ Operating System: Ubuntu 16.04 Xenial Xerus23 (LTS) or Linux Mint 18.x or Debian Jessie
23 http://releases.ubuntu.com/16.04/
24 http://wiki.ros.org/kinetic
Configuring the ROS Development Environment
If you have a different version of Ubuntu installed on your computer, please check the official
site, and if your operating system is OS X1 or Windows2 you can check the corresponding Wiki3
page for the installation methods. If you are using a single board computer (SBC) with an
ARM CPU instead of an Intel or AMD CPU, we do not provide separate instructions for
installing ROS; however, if the SBC runs Ubuntu or Linux Mint, you can follow the instructions
below.
1 http://wiki.ros.org/kinetic/Installation/OSX/Homebrew/Source
2 http://wiki.ros.org/hydro/Installation/Windows
3 http://wiki.ros.org/kinetic/Installation
If you are using Linux Mint version 18.x, use the following command. In this command,
$(lsb_release -sc) returns the code name of the Linux distribution version. Linux Mint 18.x
is based on the Xenial code base, so it is possible to add the same source list as Ubuntu.
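The repository entry might look like the following. This is a sketch based on the standard ROS Kinetic installation procedure; verify the exact command against the official installation wiki page before use.

```shell
# Sketch of adding the ROS package repository (standard Kinetic setup).
# On Ubuntu, $(lsb_release -sc) expands to the release code name (e.g. "xenial"):
sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list'
# On Linux Mint 18.x, write the Ubuntu base code name "xenial" explicitly:
sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu xenial main" > /etc/apt/sources.list.d/ros-latest.list'
```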
4 https://en.wikipedia.org/wiki/Network_Time_Protocol
The command above installs the basic rqt package, but we can additionally install all of
the packages related to rqt. Installing all rqt-related packages with the following
command is convenient in many respects and allows the use of various rqt plugins.
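As a sketch, the two installation commands would be the following; the package names assume ROS Kinetic on Ubuntu.

```shell
# Basic rqt package only
sudo apt-get install ros-kinetic-rqt
# All rqt-related packages, including the common plugins
sudo apt-get install ros-kinetic-rqt ros-kinetic-rqt-common-plugins
```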
5 http://wiki.ros.org/kinetic/Installation/Ubuntu
To install the ROS packages, you can use the following apt-cache command to search for all the
packages that begin with ‘ros-kinetic’. By running this command, you can see approximately 1,600
packages.
If you wish to install a package individually, you can use the following command.
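A sketch of the search and individual installation commands follows; the bracketed package name is a placeholder, not a real package.

```shell
# List the roughly 1,600 binary packages whose names begin with "ros-kinetic"
apt-cache search ros-kinetic
# Install a single package by name
sudo apt-get install ros-kinetic-[package name]
```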
Another way would be to use the GUI tool Synaptic Package Manager.
The ‘apt’ in apt-get, apt-key, and apt-cache stands for Advanced Packaging Tool, a package
management system widely used in Debian-based Linux distributions such as Ubuntu and Linux Mint.
http://en.wikipedia.org/wiki/Advanced_Packaging_Tool
Deleting a previous version of ROS and switching between different versions of ROS
The command ‘sudo apt-get purge ros-indigo-*’ deletes a previous version (here, Indigo) along with
its configuration files. When using a previous version alongside the current one, select which version
to load with the command that sources the ROS configuration file added to ‘~/.bashrc’:
$ source /opt/ros/kinetic/setup.bash
Initializing rosdep
Be sure to initialize rosdep before using ROS. rosdep is a tool that enhances user
convenience by easily installing the dependent packages needed when using or compiling core
components of ROS.
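The standard initialization commands are as follows; ‘rosdep init’ needs to be run only once, with root privileges.

```shell
# Create the rosdep sources list (run once, as root)
sudo rosdep init
# Update the local rosdep database (run as the normal user)
rosdep update
```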
6 https://en.wikipedia.org/wiki/Advanced_Packaging_Tool
$ source /opt/ros/kinetic/setup.bash
$ mkdir -p ~/catkin_ws/src
$ cd ~/catkin_ws/src
$ catkin_init_workspace
Now that we have created the catkin workspace folder, let’s build it. Currently, the catkin
workspace contains only the ‘src’ folder and the ‘CMakeLists.txt’ file inside it; as a test, build it
with the ‘catkin_make’ command.
$ cd ~/catkin_ws/
$ catkin_make
Once you have finished building without errors, run the ‘ls’ command. In addition to the ‘src’
folder created by the user, ‘build’ and ‘devel’ folders have been created. The build related files for
the catkin build system are saved in the ‘build’ folder, and the execution related files are saved in
the ‘devel’ folder.
$ ls
build
devel
src
$ source ~/catkin_ws/devel/setup.bash
Testing
The installation for ROS has been completed. The following command will verify whether the
installation was successful or not. Close all terminal windows and open a new terminal window.
Now enter the following command to run roscore.
$ roscore
If it runs like the following without errors, the installation is completed. Terminate the
process with [Ctrl+c].
SUMMARY
========
PARAMETERS
* /rosdistro: kinetic
* /rosversion: 1.12.7
NODES
$ wget https://raw.githubusercontent.com/ROBOTIS-GIT/robotis_tools/master/install_ros_kinetic.sh
$ chmod 755 ./install_ros_kinetic.sh
$ bash ./install_ros_kinetic.sh
The install_ros_kinetic.sh shell script downloaded with the ‘wget’ command above
contains the general installation procedure covered in Section 3.1.1 and the ROS
environment settings that will be covered in Section 3.2.1.
$ source /opt/ros/kinetic/setup.bash
$ source ~/catkin_ws/devel/setup.bash
To avoid this repetitive task, we can configure the settings file to be loaded automatically
every time a new terminal window is opened. Additionally, we can configure the ROS network
and create quick commands for frequently used operations.
First, we will use the text editor ‘gedit’ program to open the ‘.bashrc’ file. This book uses gedit
as a default text editor, but you can also use atom, sublime text, vim, emacs, nano, visual studio
code, etc.
$ gedit ~/.bashrc
When you import the ‘.bashrc’ file, there are many settings that have already been configured.
Without modifying any of these settings, scroll down to the very bottom of the ‘bashrc’ file and
append the following lines (replace xxx.xxx.xxx.xxx with your IP address. Please refer to the
section on ‘ifconfig’ in footnote 7 for configuring the IP address). Once you have set everything,
save your changes and close gedit.
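As a reference, the appended portion of ‘~/.bashrc’ might look like this, keeping the placeholder IP address from the text:

```shell
# Load the ROS and catkin workspace environments
source /opt/ros/kinetic/setup.bash
source ~/catkin_ws/devel/setup.bash

# ROS network configuration: replace xxx.xxx.xxx.xxx with your IP address
export ROS_MASTER_URI=http://xxx.xxx.xxx.xxx:11311
export ROS_HOSTNAME=xxx.xxx.xxx.xxx
```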
Now we will enter the following command to apply the new settings of the ‘.bashrc’ file. You
can also close and open the terminal window to apply the configurations of the ‘.bashrc’ file.
From now on, whenever you open a terminal window, ‘.bashrc’ settings will be applied to the
terminal window.
$ source ~/.bashrc
Now let’s take a closer look at what we have previously set up.
If you are running all packages on one PC, there is no need to assign a specific IP, but instead
assign ‘localhost’ for both fields.
ifconfig7
In order to check the IP address on Linux, use the ifconfig command. By running the ifconfig command
in the terminal window as shown in the following example, IP address will be shown next to the ‘inet
addr’ in ‘enp3s0’ for wired LAN and in ‘wlp2s0’ for wireless LAN. The following example shows both
wireless and wired LAN. The IP for wired LAN connection is 192.168.1.100 whereas the wireless LAN
is 192.168.11.19.
$ ifconfig
enp3s0 Link encap:Ethernet HWaddr d8:cb:8a:f1:74:2b
inet addr:192.168.1.100 Bcast:192.168.1.255 Mask:255.255.255.0
inet6 addr: fe80::60fc:7e2b:b877:f82b/64 Scope:Link
UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1
RX packets:52 errors:0 dropped:0 overruns:0 frame:0
TX packets:81 errors:0 dropped:0 overruns:0 carrier:0
collisions:0 txqueuelen:1000
RX bytes:10172 (10.1 KB) TX bytes:8917 (8.9 KB)
Interrupt:19
7 https://en.wikipedia.org/wiki/Ifconfig
Quick Command
Let’s set up the quick commands that are frequently used in ROS development. The following cw,
cs, and cm are custom commands defined as aliases.
■■ cw: Move to the catkin workspace directory ‘~/catkin_ws’
■■ cs: Move to the directory ‘~/catkin_ws/src’ in the catkin workspace that contains the source files
■■ cm: Move to the catkin workspace directory ‘~/catkin_ws’, and build ROS packages with the
‘catkin_make’ command
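The alias definitions behind these quick commands, as added to ‘~/.bashrc’, are:

```shell
# Quick commands for the catkin workspace
alias cw='cd ~/catkin_ws'
alias cs='cd ~/catkin_ws/src'
alias cm='cd ~/catkin_ws && catkin_make'
```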
ROS supports many IDEs8. The most commonly used IDEs are Eclipse, CodeBlocks, Emacs,
Vim, NetBeans, and QtCreator9. In my case, I used to work with Eclipse, but it has become quite
heavy in recent versions and was inconvenient to use with ROS’s catkin build
system. Having looked into other IDEs, the most suitable tool for simple
tasks seems to be Visual Studio Code, and for GUI interface development, QtCreator.
In particular, rqt and RViz, the ROS tools for development, debugging, and visualization, are
developed with Qt, and the fact that users can develop plug-ins for these ROS tools using Qt
plug-ins makes QtCreator very useful.
Furthermore, even if you do not use Qt, QtCreator is perfectly adequate as a general-purpose
editor, and it can import a project directly through ‘CMakeLists.txt’, which makes it very
convenient when using ‘catkin_make’.
8 http://wiki.ros.org/IDEs
9 https://www.qt.io/ide/
Launching QtCreator
QtCreator can also be launched from its icon. However, if we want settings such as
the ROS paths configured in ‘~/.bashrc’ to be applied to QtCreator, we must open a new
terminal window and enter the following command. This way, all of
the settings configured in ‘~/.bashrc’ will be applied when QtCreator is launched.
$ qtcreator
QtCreator has been opened by the command above as shown in Figure 3-2.
The shortcut for build is [Ctrl+b] and ‘catkin_make’ will be executed when compiling source
code. Built files will be saved in the ‘build’ folder in the same directory of the package. For
example, when you compile the package ‘tms_rp_action’, built files are placed in the
‘build-tms_rp_action-Desktop-Default’ folder. The files that would originally be stored in ‘~/catkin_ws/build’
and ‘~/catkin_ws/devel’ are compiled separately and placed in a new location, so in order to run
it you will later need to run ‘catkin_make’ again in the terminal window. It is not necessary to
repeat this process every time, so during development, we can develop and debug in QtCreator,
and once the development is finished then we can use ‘catkin_make’. For your information,
there is a Qt Creator Plugin for ROS (https://github.com/ros-industrial/ros_qtc_plugin/wiki)
which optimizes QtCreator for the ROS development environment.
In addition to QtCreator, we also recommend Visual Studio Code and Eclipse. Visual Studio
Code is a very lightweight editor, like ‘Atom’, ‘Sublime Text’, and ‘CLion’, so it is very fast, and its ROS
extension makes it easy to use with ROS. Eclipse is a general-purpose IDE used by a very large number
of people, including many ROS users. For more information, see the following wiki.
Starting from this section, many ROS-specific terms such as node, package, and roscore will
appear, which will be explained in detail in Chapter 4. ROS Terminology. In this section, we will
verify that ROS has been installed without any problems.
Running roscore
Open a new terminal window (Ctrl + Alt + t) and run the following command. This will run
roscore, which will have control over the whole ROS system.
$ roscore
In Linux we often enter commands in the terminal window. There are many users who are unfamiliar
with the command line interface, but as one becomes proficient with it, it becomes a very fast and
convenient method. However, even experienced users do not memorize all the commands and
frequently use the [Tab] key instead. In the Linux terminal window, the [Tab] key supports
auto-completion. This feature eliminates the need to memorize all the commands, and lets you enter
commands quickly and accurately without typos. As an example, take a look at the rosrun command
used earlier. As shown below, after typing turtlesim we can use the [Tab] key to find the various nodes
that we can use in the turtlesim package.
Additionally, if we type turtle_teleop and press the [Tab] key, it will auto-complete the rest of
the command. This applies not only to ROS but to all Linux commands, so we suggest
you make use of this feature.
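For reference, the turtlesim test discussed above uses the following commands; the node names are those provided by the turtlesim package.

```shell
# Run the simulation node and the keyboard teleoperation node,
# each in a separate terminal (roscore must already be running)
rosrun turtlesim turtlesim_node
rosrun turtlesim turtle_teleop_key
```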
$ rqt_graph
The square box ‘/turtle1/cmd_vel’, a sub-topic under the turtle1 namespace, located between the
two arrows, is the name of the topic connecting the two nodes; the graph visualizes that the velocity
command entered with the keyboard in the teleop_turtle node is being sent to the turtlesim node as
a message on that topic.
That is to say, using the above two nodes, keyboard commands were transferred to the robot
simulation. More detailed information will be explained in the following sections, and if you
have followed along well so far, you have completed the ROS operation test.
Important Concepts of ROS
ROS
ROS provides standard operating system services such as hardware abstraction, device drivers,
implementation of commonly used features including sensing, recognizing, mapping, motion
planning, message passing between processes, package management, visualizers and libraries
for development as well as debugging tools.
Master
The master2 acts as a name server for node-to-node connections and message communication.
The command roscore is used to run the master; once the master is running, each node can register
its name and obtain information about other nodes when needed. Connections between nodes and
message communication such as topics and services are impossible without the master.
The master communicates with slaves using XMLRPC (XML-Remote Procedure Call)3, which
is an HTTP-based protocol that does not maintain connectivity. In other words, the slave nodes
can access only when they need to register their own information or request information of
other nodes. The connection status of each other is not checked regularly. Due to this feature,
ROS can be used in very large and complex environments. XMLRPC is very lightweight and
supports a variety of programming languages, making it well suited for ROS, which supports a
variety of hardware and programming languages.
When you execute ROS, the master will run with the URI address and port
configured in ROS_MASTER_URI. Unless otherwise modified, the URI address uses the IP address
of the local PC by default, with port number 11311.
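For example, on a single-PC setup the default can be spelled out explicitly; this is a minimal sketch using localhost in place of the local IP address.

```shell
# Master URI: the local PC on the default port 11311
export ROS_MASTER_URI=http://localhost:11311
echo "$ROS_MASTER_URI"
```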
1 http://wiki.ros.org/ROS/Concepts
2 http://wiki.ros.org/Master
3 https://en.wikipedia.org/wiki/XML-RPC
Node
Upon startup, a node registers information such as name, message type, URI address and
port number of the node. The registered node can act as a publisher, subscriber, service server
or service client based on the registered information, and nodes can exchange messages using
topics and services.
The node uses XMLRPC for communicating with the master and uses XMLRPC or TCPROS5
of the TCP/IP protocols when communicating between nodes. Connection request and response
between nodes use XMLRPC, and message communication uses TCPROS because it is a direct
communication between nodes independent from the master. As for the URI address and port
number, a variable called ROS_HOSTNAME, which is stored on the computer where the node is
running, is used as the URI address, and the port is set to an arbitrary unique value.
Package
A package6 is the basic unit of ROS. The ROS application is developed on a package basis, and the
package contains either a configuration file to launch other packages or nodes. The package also
contains all the files necessary for running the package, including ROS dependency libraries for
running various processes, datasets, and configuration files. The number of official packages is
about 2,500 for ROS Indigo as of July 2017 (http://repositories.ros.org/status_page/ros_indigo_default.html)
and about 1,600 packages for ROS Kinetic (http://repositories.ros.org/status_page/ros_kinetic_default.html).
In addition, although there could be some redundancies, there are
about 4,600 packages developed and released by users (http://rosindex.github.io/stats/).
Metapackage
A metapackage7 is a set of packages that have a common purpose. For example, the Navigation
metapackage consists of 10 packages including AMCL, DWA, EKF, and map_server.
4 http://wiki.ros.org/Nodes
5 http://wiki.ros.org/ROS/TCPROS
6 http://wiki.ros.org/Packages
7 http://wiki.ros.org/Metapackages
Message
Nodes send and receive data between one another in the form of messages8. The TCPROS and
UDPROS communication protocols are used for message delivery. A topic is used for unidirectional
message delivery, while a service is used for bidirectional message delivery in which a request and
a response are involved.
Topic
The topic9 is literally like a topic in a conversation. The publisher node first registers its topic
with the master and then starts publishing messages on a topic. Subscriber nodes that want to
receive the topic request information of the publisher node corresponding to the name of the
topic registered in the master. Based on this information, the subscriber node directly connects
to the publisher node to exchange messages as a topic.
8 http://wiki.ros.org/Messages
9 http://wiki.ros.org/Topics
Service
The service10 is synchronous bidirectional communication between the service client that
requests a service regarding a particular task and the service server that is responsible for
responding to requests.
Service Server
The ‘service server’ is a server in the service message communication that receives a request as
an input and transmits a response as an output. Both request and response are in the form of
messages. Upon the service request, the server performs the designated service and delivers the
result to the service client as a response. The service server is implemented in the node that
receives and executes a given request.
Service Client
The ‘service client’ is a client in the service message communication that requests a service from
the server and receives a response as an input. Both request and response are in the form of messages.
The client sends a request to the service server and receives the response. The service client is
implemented in the node that requests a specified command and receives the result.
Action
The action11 is another message communication method, used for asynchronous bidirectional
communication. An action is used where it takes a longer time to respond after receiving a request
and intermediate responses are required until the result is returned. The structure of an action file
is also similar to that of a service, but a feedback data section for intermediate responses is
added along with the goal and result data sections, which correspond to the request and response in
a service, respectively. There is an action client that sets the goal of the action and an action server
that performs the action specified by the goal and returns feedback and a result to the action client.
Action Server
The ‘action server’ is in charge of receiving a goal from the client and responding with feedback
and a result. Once the server receives a goal from the client, it performs the predefined process.
10 http://wiki.ros.org/Services
11 http://wiki.ros.org/actionlib
Parameter
The parameter12 in ROS refers to parameters used in the node. Think of them as the *.ini configuration
files in a Windows program. Default values are set in the parameter and can be read or written if
necessary. In particular, it is very useful when configured values can be modified in real-time.
For example, you can specify settings such as USB port number, camera calibration parameters,
maximum and minimum values of the motor speed.
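For illustration, parameters can be read and written from the command line with the standard ‘rosparam’ tool; the parameter name below is hypothetical, and roscore must be running.

```shell
# Write a hypothetical USB port parameter, then read it back
rosparam set /my_camera/usb_port /dev/ttyUSB0
rosparam get /my_camera/usb_port
# List all parameters currently registered on the parameter server
rosparam list
```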
Parameter Server
When parameters are called in the package, they are registered with the parameter server13
which is loaded in the master.
Catkin
The catkin14 refers to the build system of ROS. The build system basically uses CMake (Cross
Platform Make), and the build environment is described in the ‘CMakeLists.txt’ file in the
package folder. CMake was modified in ROS to create a ROS-specific build system. Catkin started
the alpha test from ROS Fuerte and the core packages began to switch to Catkin in the ROS
Groovy version. Catkin has been applied to most packages in the ROS Hydro version. The Catkin
build system makes it easy to use ROS-related builds, package management, and dependencies
among packages. If you are going to use ROS at this point, you should use Catkin instead of ROS
build (rosbuild).
ROS Build
The ROS build (rosbuild)15 is the build system that was used before the Catkin build system.
Although some users still use it, it is kept only for compatibility with older ROS packages, and its
use is officially discouraged. If an old package that only supports rosbuild must be used, we
recommend converting it from rosbuild to catkin first.
12 http://wiki.ros.org/Parameter%20Server#Parameters
13 http://wiki.ros.org/Parameter%20Server
14 http://wiki.ros.org/catkin
15 http://wiki.ros.org/rosbuild
rosrun
rosrun17 is the basic execution command of ROS. It is used to run a single node in the package.
The node uses the ROS_HOSTNAME environment variable stored in the computer on which the
node is running as the URI address, and the port is set to an arbitrary unique value.
roslaunch
While rosrun is a command to execute a single node, roslaunch18 in contrast executes multiple
nodes. It is a ROS command specialized in node execution with additional functions such as
changing package parameters or node names, configuring namespaces of nodes, setting
ROS_ROOT and ROS_PACKAGE_PATH, and changing environment variables19 when executing nodes.
roslaunch uses the ‘*.launch’ file to define which nodes to be executed. The file is based on
XML (Extensible Markup Language) and offers a variety of options in the form of XML tags.
bag
The data from the ROS messages can be recorded. The file format used is called bag20, and ‘*.bag’
is used as the file extension. In ROS, the bag can be used to record messages and play them back
later to reproduce the environment at the time the messages were recorded. For example, when
performing a robot experiment using a sensor, the sensor values can be stored as messages in
a bag. The recorded messages can then be replayed from the saved bag file without repeating the
same test. The record and play functions of rosbag are especially useful when
developing an algorithm with frequent program modifications.
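As a sketch, recording and replaying messages with the ‘rosbag’ tool looks like the following; the topic name is borrowed from the turtlesim example, and roscore must be running.

```shell
# Record the velocity command topic into a timestamped *.bag file
rosbag record /turtle1/cmd_vel
# Later, replay the recorded messages to reproduce the experiment
rosbag play [bag file name].bag
```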
16 http://wiki.ros.org/roscore
17 http://wiki.ros.org/rosbash#rosrun
18 http://wiki.ros.org/roslaunch
19 http://wiki.ros.org/ROS/EnvironmentVariables
20 http://wiki.ros.org/Bags
Repository
An open-source package specifies its repository on the Wiki page. The repository is a URL address
on the web where the package is stored. The repository manages issues, development, downloads, and
other features using version control systems such as svn, hg, and git. Many of the currently available
ROS packages are using GitHub21 as repositories for source code. In order to view the contents of
the source code for each package, check the corresponding repository.
Graph
The relationship between nodes, topics, publishers, and subscribers introduced above can be
visualized as a graph. The graphical representation of message communication does not include
the service as it only happens one time. The graph can be displayed by running the ‘rqt_graph’
node in the ‘rqt_graph’ package. There are two execution commands, ‘rqt_graph’ and ‘rosrun
rqt_graph rqt_graph’.
Name
Nodes, parameters, topics, and services all have names22. These names are registered on the
master and searched by the name to transfer messages when using the parameters, topics, and
services of each node. Names are flexible because they can be changed when being executed,
and different names can be assigned when executing identical nodes, parameters, topics, and
services multiple times. Use of names makes ROS suitable for large-scale projects and complex
systems.
Client Library
ROS provides development environments for various languages by using client library23 in order
to reduce the dependency on the language used. The main client libraries are C++, Python, Lisp,
and other languages such as Java, Lua, .NET, EusLisp, and R are also supported. For this purpose,
client libraries such as roscpp, rospy, roslisp, rosjava, roslua, roscs, roseus, PhaROS, and rosR
have been developed.
21 http://www.github.com/
22 http://wiki.ros.org/Names
23 http://wiki.ros.org/Client%20Libraries
MD5
MD5 (Message-Digest algorithm 5)24 is a 128-bit cryptographic hash function. It is used primarily
to verify data integrity, such as checking whether programs or files are in their unmodified original
form. The integrity of message transmission and reception in ROS is verified with MD5.
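On Linux, the same hash function is available as the ‘md5sum’ command; for instance, the MD5 hash of empty input is a well-known constant.

```shell
# MD5 of empty input
printf '' | md5sum
# → d41d8cd98f00b204e9800998ecf8427e  -
```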
RPC
RPC (Remote Procedure Call)25 refers to calling a sub-procedure on a remote
computer from another computer in the network. RPC uses protocols such as TCP/IP and IPX,
and allows the execution of functions or procedures without requiring the developer to write a
separate program for remote control.
XML
XML (Extensible Markup Language) is a broad and versatile markup language that W3C
recommends for creating other special purpose markup languages. XML utilizes tags in order to
describe the structure of data. In ROS, it is used in various components such as *.launch, *.urdf,
and package.xml.
XMLRPC
XMLRPC (XML-Remote Procedure Call) is a type of RPC protocol that uses XML as the encoding
format and uses the request and response method of the HTTP protocol which does not maintain
nor check the connection. XMLRPC is a very simple protocol, used only to define small data
types or commands. As a result, XMLRPC is very lightweight and supports a variety of
programming languages, making it well suited for ROS, which supports a variety of hardware
and languages.
TCP/IP
TCP stands for Transmission Control Protocol, and the pair is often called TCP/IP. TCP is layered
on top of the IP (Internet Protocol) layer of the Internet protocol suite and guarantees data
transmission; in particular, it guarantees the sequential transmission and reception of data.
24 https://en.wikipedia.org/wiki/Md5sum
25 http://wiki.ros.org/ROS/Technical%20Overview
CMakeLists.txt
Catkin, which is the build system of ROS, uses CMake by default. The build environment is
specified in the ‘CMakeLists.txt’26 file in each package folder.
package.xml
The ‘package.xml’ file27 is an XML file containing package information that describes the package
name, author, license, and dependent packages.
As described in Chapter 2, ROS is developed in units of nodes, where a node is the minimum unit of
executable program broken down for maximum reusability. A node exchanges
data with other nodes through messages, forming one large program as a whole. The key concept
here is the message communication method among nodes. There are three different methods
of exchanging messages: a topic, which provides unidirectional message transmission and
reception; a service, which provides a bidirectional message request and response; and an action,
which provides a bidirectional message goal, result, and feedback. In addition, the parameters used
in a node can be modified from outside the node, which can also be considered a type of
message communication in the larger context. Message communication is illustrated in Figure
4-1 and the differences are summarized in Table 4-1. It is important to use each of topic, service,
action, and parameter according to its correct purpose when programming in ROS.
26 http://wiki.ros.org/catkin/CMakeLists.txt
27 http://wiki.ros.org/catkin/package.xml
■■ Topic: Asynchronous, unidirectional. Used when exchanging data continuously.
■■ Service: Synchronous, bidirectional. Used when processing a request and responding with the
current state.
■■ Action: Asynchronous, bidirectional. Used when it is difficult to use a service due to
long response times after the request, or when an intermediate feedback value is needed.
4.2.1. Topic
Communication on topic uses the same type of message for both publisher and subscriber as
shown in Figure 4-2. The subscriber node receives the information of publisher node
corresponding to the identical topic name registered in the master. Based on this information,
the subscriber node directly connects to the publisher node to receive messages. For example, if
the current position of the robot is generated in the form of odometry28 information by
calculating the encoder values of both wheels of the mobile robot, the asynchronous odometry
information can be continuously transmitted in a unidirectional flow using a topic message (x, y, θ).
28 http://wiki.ros.org/navigation/Tutorials/RobotSetup/Odom
4.2.2. Service
Communication on service is a bidirectional synchronous communication between the service
client requesting a service and the service server responding to the request as shown in Figure
4-3. The aforementioned ‘publish’ and ‘subscribe’ of the topic is an asynchronous method, which
is advantageous for periodic data transmission. On the other hand, there is also a need for
synchronous communication based on a request and a response. Accordingly, ROS provides a
synchronous message communication method called ‘service’.
A service consists of a service server that responds only when there is a request and a service
client that can send requests as well as receive responses. Unlike the topic, the service is a one-time
message communication: when the request and response of the service are
completed, the connection between the two nodes is disconnected. A service is often used to
command a robot to perform a specific action, or to have nodes perform certain events under a
specific condition. Since a service does not maintain the connection, it is useful for reducing
network load when it replaces a topic. For example, if the client requests the server for the current time as
shown in Figure 4-3, the server will check the time and respond to the client, and the connection
is terminated.
4.2.3. Action
Communication on action29 is used when a requested goal takes a long time to be completed,
therefore progress feedback is necessary. This is very similar to the service where ‘goals’ and
‘results’ correspond to ‘requests’ and ‘responses’ respectively. In addition, ‘feedback’ is added
to report progress to the client periodically when intermediate values are needed. The message
transmission method is the same as the asynchronous topic. The feedback transmits an
asynchronous bidirectional message between the action client which sets the goal of the action
and an action server that performs the action and sends the feedback to the action client. For
example, as shown in Figure 4-4, if the client sets home-cleaning tasks as a goal to the server, the
server informs the user of the progress of the dishwashing, laundry, cleaning, etc. in the form of
feedback, and finally sends the final message to the client as a result. Unlike the service, the
action is often used to command complex robot tasks such as canceling transmitted goal while
the operation is in progress.
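The goal → feedback → result flow can be sketched in plain Python. This is not the actionlib API; ‘clean_house_action’ and the task names are illustrative, and the feedback callback stands in for the periodic feedback messages an action server would publish.

```python
def clean_house_action(goal_tasks, feedback_cb):
    # Action server sketch: report feedback after each sub-task,
    # then return a final result once the whole goal is done.
    for i, task in enumerate(goal_tasks, start=1):
        feedback_cb(task, 100.0 * i / len(goal_tasks))
    return {"succeeded": True, "tasks_done": len(goal_tasks)}

progress = []  # action client sketch: collect the feedback messages
result = clean_house_action(
    ["dishwashing", "laundry", "cleaning"],
    lambda task, percent: progress.append((task, percent)),
)
```

In real actionlib the client can also cancel the goal mid-way, which this simplified sketch omits.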
29 http://wiki.ros.org/actionlib
The publisher, subscriber, service server, service client, action server, and action client can
each be implemented in separate nodes. In order to exchange messages among these nodes, a
connection has to be established first with the help of a master. The master acts like a name
server: it keeps the names of nodes, topics, services, and actions, as well as their URI addresses,
port numbers, and parameters. In other words, nodes register their own information with the
master upon launch, and acquire information about other nodes from the master. Each node
then connects directly to the others to perform message communication. This is shown in
Figure 4-5.
Although parameters are not strictly a message communication method, I think that they
belong to the scope of message communication in that they use messages. For example, you can
change parameters to set the USB port to connect to, get the camera color correction value, and
configure the maximum and minimum values of the speed and commands.
$ roscore
TCPROS Connection
The subscriber node creates a client for the publisher node using TCPROS, and connects to the
publisher node. At this point, the communication between nodes uses TCP/IP based protocol
called TCPROS.
■■ Service Server: Receive a service, execute the specified task, and return a response
The connection between the service server and the client is the same as the TCPROS
connection for the publisher and subscriber described above. Unlike the topic, the service
terminates connection after successful request and response. If additional request is necessary,
the connection procedure must be carried out again.
We previously tested ROS with ‘turtlesim’. In this test, the master and two nodes were used,
and the ‘/turtle1/cmd_vel’ topic was used between the two nodes to pass the translational and
rotational messages to the virtual TurtleBot. Putting this in perspective with the ROS concept
described above, it can be represented as in Figure 4-16.
fieldtype1 fieldname1
fieldtype2 fieldname2
fieldtype3 fieldname3
The ROS data type shown in Table 4-2 can be used in the field type. The following example is
the simplest form of message, and you can use an array for the field type as shown in Table 4-3.
Embedded messages in the message are also commonly used.
int32 x
int32 y
30 http://wiki.ros.org/msg
Table 4-2 Basic data types for messages in ROS, serialization methods, corresponding C++ and Python data types
Table 4-3 How to use ROS message data types as an array, corresponding C++ and Python data types
The header (std_msgs/Header), which is commonly used in ROS, can also be used within a
message. The Header.msg file in std_msgs31 contains the sequence ID, timestamp, and frame
ID, which can be used to probe messages or measure time.
std_msgs/Header.msg
# Sequence ID: Messages are sequentially incremented by 1.
uint32 seq
# Timestamp: Has two child attributes, stamp.sec for seconds and stamp.nsec for nanoseconds.
time stamp
# Stores the Frame ID
string frame_id
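A rough plain-Python model of these header fields, as a publisher might fill them (in ROS the stamp is split into sec/nsec and the sequence ID is typically incremented per published message; the frame name here is only illustrative):

```python
import time
from dataclasses import dataclass

@dataclass
class Header:
    seq: int        # sequence ID, incremented by 1 per message
    stamp: float    # timestamp in seconds (ROS stores sec and nsec separately)
    frame_id: str   # coordinate frame the message is associated with

# Publisher-side sketch: fill the header for three consecutive messages.
messages = []
for seq in range(3):
    messages.append(Header(seq=seq, stamp=time.time(), frame_id="base_link"))
```

Comparing `seq` gaps or `stamp` differences between received messages is how the header is used to detect drops or measure latency.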
The following shows how to actually use a message in a ROS program. For example, in the
case of the ‘teleop_turtle_key’ node of the turtlesim package, which we tested in Chapter 3, the
translational speed (meter/sec) and rotational speed (radian/sec) are sent as a message to the
turtlesim node according to the arrow keys (←, →, ↑, ↓) entered on the keyboard. The
TurtleBot then moves on the screen using the received speed values. The message used here is
the ‘Twist’32 message in ‘geometry_msgs’.
31 http://wiki.ros.org/std_msgs
32 http://docs.ros.org/api/geometry_msgs/html/msg/Twist.html
In the message structure above, the ‘linear’ and ‘angular’ values are declared as the Vector3 type.
This is a form of nested message, as Vector3 is itself a message type in ‘geometry_
msgs’33. The Vector334 contains the following data.
float64 x
float64 y
float64 z
In other words, the message published from the ‘teleop_turtle_key’ node carries six values:
linear.x, linear.y, linear.z, angular.x, angular.y, and angular.z. All of these are of the float64 type,
which is one of the basic data types in ROS. With these values, the arrow keys of the keyboard
are converted into a translational speed (meter/sec) and rotational speed (radian/sec) message,
so that the TurtleBot can be controlled.
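The key-to-message conversion can be sketched as follows. The speed values are illustrative assumptions, not the actual scales used by ‘teleop_turtle_key’, and the tuples stand in for the six float64 fields of the Twist message.

```python
# Hypothetical key bindings: (linear.x, linear.y, linear.z) and
# (angular.x, angular.y, angular.z) for each arrow key.
KEY_TO_TWIST = {
    "up":    {"linear": ( 2.0, 0.0, 0.0), "angular": (0.0, 0.0,  0.0)},
    "down":  {"linear": (-2.0, 0.0, 0.0), "angular": (0.0, 0.0,  0.0)},
    "left":  {"linear": ( 0.0, 0.0, 0.0), "angular": (0.0, 0.0,  2.0)},
    "right": {"linear": ( 0.0, 0.0, 0.0), "angular": (0.0, 0.0, -2.0)},
}

def key_to_twist(key):
    # Return the six Twist values (linear x/y/z, angular x/y/z) for a key.
    twist = KEY_TO_TWIST[key]
    return twist["linear"] + twist["angular"]
```

Only linear.x and angular.z are non-zero here, which matches a planar robot that drives forward/backward and rotates in place.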
The topic, service, and action described in the previous section use messages. Although they
are similar in the form and the concept, they are divided into three types according to their
usage. This will be discussed in more detail in the following section.
geometry_msgs/Twist.msg
Vector3 linear
Vector3 angular
33 http://docs.ros.org/api/geometry_msgs/html/index-msg.html
34 http://docs.ros.org/api/geometry_msgs/html/msg/Vector3.html
35 http://docs.ros.org/api/geometry_msgs/html/msg/Twist.html
36 http://docs.ros.org/api/sensor_msgs/html/srv/SetCameraInfo.html
sensor_msgs/SetCameraInfo.srv
sensor_msgs/CameraInfo camera_info
---
bool success
string status_message
geometry_msgs/PoseStamped start_pose
geometry_msgs/PoseStamped goal_pose
---
geometry_msgs/PoseStamped result_pose
---
float32 percent_complete
37 http://wiki.ros.org/actionlib_msgs
38 http://wiki.ros.org/actionlib
The topic is usually declared as shown in the following code. This will be covered in more
detail in Chapter 7. Here, let’s modify the topic’s name in order to understand how to use names.
In the above example, the name of the node is ‘/node1’. If the publisher is declared as ‘bar’
without any symbols, the topic will be resolved to the name ‘/bar’. Even if it is declared as global
with the slash(/) character, the topic name will still be ‘/bar’.
However, if the name is declared as private using the tilde(~) character, the topic name
becomes ‘/node1/bar’.
The declaration of the name can vary as shown in Table 4-4. The ‘/wg’ means a change of the
namespace. This is discussed in more detail in the next section.
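These resolution rules for relative, global, and private names can be sketched in plain Python. ‘resolve_name’ is a hypothetical helper written for illustration, not a ROS API; it assumes the node's namespace is everything before the last ‘/’ of its fully-qualified name.

```python
def resolve_name(name, node_name):
    """Resolve a ROS graph name relative to a node.

    name      -- 'bar' (relative), '/bar' (global), or '~bar' (private)
    node_name -- fully-qualified node name, e.g. '/node1' or '/wg/node2'
    """
    if name.startswith("/"):          # global: already fully qualified
        return name
    ns = node_name.rsplit("/", 1)[0] or "/"   # node's namespace
    if name.startswith("~"):          # private: resolved under the node name
        base, rest = node_name, name[1:]
    else:                             # relative: resolved under the namespace
        base, rest = ns, name
    return ("/" + rest) if base == "/" else (base + "/" + rest)
```

So ‘bar’ in node ‘/node1’ becomes ‘/bar’, while ‘~bar’ becomes ‘/node1/bar’, and under a ‘/wg’ namespace ‘bar’ becomes ‘/wg/bar’.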
39 http://wiki.ros.org/ROS/Concepts
40 http://wiki.ros.org/Names
How can two cameras be run? Simply executing the related node twice will terminate the
previously executed node, because every node in ROS must have a unique name. However,
running two cameras does not require you to run a separate program or change the source
code. You can simply change the name of the node when running it, using either namespaces or
remapping.
To help you understand, suppose there is a virtual ‘camera_package’ whose camera node runs
when its ‘camera_node’ is executed. It can be run as follows.
$ rosrun camera_package camera_node
If ‘camera_node’ transmits the image data of the camera via the image topic, this image
topic can be received with ‘rqt_image_view’.
Now let’s modify the topic names of these nodes by remapping. The following command
changes the topic name to ‘/front/image’. Here, ‘image’ is the topic name of ‘camera_node’, and
the example shows how to change the topic name by setting options on the execution command.
$ rosrun camera_package camera_node image:=/front/image
For example, suppose there are three cameras such as front, left, and right. If multiple nodes
are executed under the same name, the names conflict and the previously executed node is
terminated. Nodes with the same name can instead be executed in the following way, where the
name option is prefixed with a double underscore(__).
$ rosrun camera_package camera_node __name:=front
$ rosrun camera_package camera_node __name:=left
$ rosrun camera_package camera_node __name:=right
Options such as ‘__ns’, ‘__name’, ‘__log’, ‘__ip’, ‘__hostname’, and ‘__master’ are special options
used when running a node. A single underscore(_) is placed in front of a name to use it as a
private name.
The following example binds the nodes and topics into a single namespace. This ensures
that all nodes and topics are grouped into the given namespace and all names are changed
accordingly.
$ rosrun camera_package camera_node __ns:=back
We have now seen various uses of names. Names make it possible to seamlessly connect the
ROS system as a whole. In this section, we learned how to change name values when running a
node with the ‘rosrun’ command. Similarly, by using ‘roslaunch’, it is possible to apply these
options all at once. This will be discussed in more detail in Chapter 7 with examples.
41 http://wiki.ros.org/geometry/CoordinateFrameConventions
42 http://wiki.ros.org/tf
In ROS, the coordinate transformation TF is one of the most useful concepts for describing
robot parts as well as obstacles and objects. A pose can be described as a combination of a
position and an orientation. Here, the position is expressed by three values (x, y, z), and the
orientation by four values (x, y, z, w) called a quaternion. The quaternion is not intuitive,
because it does not directly describe rotation about the three axes (x, y, z), such as the roll, pitch,
and yaw angles that are often used. However, the quaternion form is free from the gimbal lock
and speed issues that are present in the Euler method of roll, pitch, and yaw. Therefore, the
quaternion type is preferred in robotics, and ROS also uses quaternions for this reason. Of course,
functions to convert Euler values to quaternions are provided for convenience.
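As a sketch, the Euler-to-quaternion conversion mentioned above can be written as follows, using one common ZYX (roll-pitch-yaw) convention. This mirrors what ROS helper functions (such as those in the tf package) provide; the function name here is illustrative.

```python
import math

def euler_to_quaternion(roll, pitch, yaw):
    """Convert roll/pitch/yaw angles (radians) to a quaternion (x, y, z, w)."""
    cr, sr = math.cos(roll / 2.0), math.sin(roll / 2.0)
    cp, sp = math.cos(pitch / 2.0), math.sin(pitch / 2.0)
    cy, sy = math.cos(yaw / 2.0), math.sin(yaw / 2.0)
    x = sr * cp * cy - cr * sp * sy
    y = cr * sp * cy + sr * cp * sy
    z = cr * cp * sy - sr * sp * cy
    w = cr * cp * cy + sr * sp * sy
    return (x, y, z, w)
```

Note that a zero rotation maps to the identity quaternion (0, 0, 0, 1), and the result is always a unit quaternion, which is what the rotation fields of a TF transform expect.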
TF uses the message structure43 shown below. The Header is used to record the time of the
transformation, and the ‘child_frame_id’ field specifies the child coordinate frame. The relative
position and orientation are described in the following fields: transform.translation.x /
transform.translation.y / transform.translation.z / transform.rotation.x / transform.rotation.y /
transform.rotation.z / transform.rotation.w
43 http://docs.ros.org/api/geometry_msgs/html/msg/TransformStamped.html
A brief description of TF was given. More detailed TF examples will be explained in the
modeling part of mobile robots (Chapter 10, 11) and manipulators (Chapter 13).
This book focuses on ‘roscpp’ for the C++ language. Even when different languages are used,
the concepts are the same; only the programming syntax differs. Developers can therefore use
the language most suitable for their purpose by referring to the wiki of each client library.
The user’s workspace can be created wherever the user wants, but here let’s create it in the Linux
user folder ‘~/catkin_ws/’ (‘~/’ refers to the ‘/home/username/’ folder in Linux). Next, let’s learn
about the ROS installation folder and workspace.
There are two methods to install a ROS package. The first is to install the package provided in
binary form, which can be executed immediately without a build process. The second is for the user
to download the source code of the package and build it before installation. These methods serve
different purposes: if you would like to modify the package or examine its source code, use the latter
method. The following uses the TurtleBot3 package as an example and describes the differences
between the two installation methods.
1. Binary Installation
$ sudo apt-get install ros-kinetic-turtlebot3
2. Source Installation
$ cd ~/catkin_ws/src
$ git clone https://github.com/ROBOTIS-GIT/turtlebot3.git
$ cd ~/catkin_ws/
$ catkin_make
File Configuration
As shown in Figure 4-20, there is a folder called ‘catkin_ws’ under the ‘/home/username/’ folder,
and it consists of build, devel, and src folders. Note that the build and devel folders are created
after catkin_make.
User Package
The ‘~/catkin_ws/src’ folder is the space for the user source code. In this folder, you can save and
build your own ROS packages or packages developed by other developers. The ROS build system
will be described in detail in the next section. Figure 4-21 below shows the state after completing
the ‘ros_tutorials_topic’ package. It describes folders and files that are commonly used, although
the configuration can vary depending on the purpose of the package.
The reason for using CMake on ROS is to allow the ROS package to be built on multiple
platforms. Unlike Make, which relies only on Unix-based systems, CMake supports Windows, as
well as Unix-based systems of Linux, BSD, and OS X. It also supports Microsoft Visual Studio and
can be easily applied to Qt development. Furthermore, the Catkin build system makes it easy to
use ROS-related builds, package management, and dependencies between packages.
The ‘catkin_create_pkg’ command creates a package folder that contains the ‘CMakeLists.txt’ and
‘package.xml’ files necessary for the Catkin build system. Let’s create a simple package to help you
understand. First, open a new terminal window (Ctrl + Alt + t) and run the following command
to move to the workspace folder.
$ cd ~/catkin_ws/src
The package name to be created is ‘my_first_ros_pkg’. Package names in ROS should all be
lowercase and must not contain spaces. The naming guideline also uses an underscore(_)
between words instead of a dash(-) or a space. See the relevant pages for the coding style guide44
and naming conventions45 in ROS. Now, let’s create a package named ‘my_first_ros_pkg’ with
the following command:
$ catkin_create_pkg my_first_ros_pkg std_msgs roscpp
In the previous command, ‘std_msgs’ and ‘roscpp’ were added as optional dependent packages.
This means that ‘std_msgs’, the standard message package of ROS, and ‘roscpp’, the client
library necessary to use C/C++ in ROS, must be installed before building ‘my_first_ros_pkg’.
44 http://wiki.ros.org/CppStyleGuide
45 http://wiki.ros.org/PyStyleGuide
Once the package is created, the ‘my_first_ros_pkg’ folder will be created in the ‘~/
catkin_ws/src’ folder, along with the default internal folders that a ROS package should have,
and the ‘CMakeLists.txt’ and ‘package.xml’ files. The contents can be checked with the ‘ls’
command as below, and the inside of the package can be browsed with the GUI-based tool
Nautilus, which acts like Windows Explorer.
$ cd my_first_ros_pkg
$ ls
include → Include Folder
src → Source Code Folder
CMakeLists.txt → Build Configuration File
package.xml → Package Configuration File
Figure 4-22 Automatically Created Files and Folders when Creating a New Package
package.xml
<?xml version="1.0"?>
<package>
<name>my_first_ros_pkg</name>
<version>0.0.0</version>
<description>The my_first_ros_pkg package</description>
<!-- One license tag required, multiple allowed, one license per tag -->
<!-- Commonly used license strings: -->
<!-- BSD, MIT, Boost Software License, GPLv2, GPLv3, LGPLv2.1, LGPLv3 -->
<license>TODO</license>
<!-- Url tags are optional, but multiple are allowed, one per tag -->
<!-- Optional attribute type can be: website, bugtracker, or repository -->
<!-- Example: -->
<!-- <url type="website">http://wiki.ros.org/my_first_ros_pkg</url> -->
<!-- Author tags are optional, multiple are allowed, one per tag -->
<!-- Authors do not have to be maintainers, but could be -->
<!-- Example: -->
<!-- <author email="[email protected]">Jane Doe</author> -->
</export>
</package>
■■ <?xml> This tag indicates that the contents in the document abide by the
XML Version 1.0.
■■ <package> This tag is paired with the </package> tag to enclose the ROS
package configuration.
■■ <name> This tag indicates the package name. The package name entered
when creating the package is used. The name of the package can be
changed by the developer.
■■ <version> This tag indicates the package version. The developer can assign the
version of the package.
■■ <description> A short description of the package. Usually 2-3 sentences.
■■ <license> This tag indicates the license, such as BSD, MIT, Apache, GPLv3,
LGPLv3.
■■ <url> This tag indicates address of the webpage describing the package, or
bug management, repository, etc. Depending on the type, you can
assign it as a website, bugtracker, or repository.
■■ <author> The name and email address of the developer who participated in the
package development. If multiple developers were involved, append
multiple <author> tags to the following lines.
■■ <buildtool_depend> Describes the dependencies of the build system. As we are using the
Catkin build system, write ‘catkin’.
■■ <build_depend> Dependent package name when building the package.
■■ <export> It is used when using a tag name that is not specified in ROS. The
most widely used case is for metapackages. In this case, use <export>
<metapackage/></export> to notify that the package is a metapackage.
■■ <metapackage> The official tag used within the export tag that declares the current
package as a metapackage.
package.xml
<?xml version="1.0"?>
<package>
<name>my_first_ros_pkg</name>
<version>0.0.1</version>
<description>The my_first_ros_pkg package</description>
<license>Apache License 2.0</license>
<author email="[email protected]">Yoonseok Pyo</author>
<maintainer email="[email protected]">Yoonseok Pyo</maintainer>
<url type="bugtracker">https://github.com/ROBOTIS-GIT/ros_tutorials/issues</url>
<url type="repository">https://github.com/ROBOTIS-GIT/ros_tutorials.git</url>
<url type="website">http://www.robotis.com</url>
<buildtool_depend>catkin</buildtool_depend>
<build_depend>std_msgs</build_depend>
<build_depend>roscpp</build_depend>
<run_depend>std_msgs</run_depend>
<run_depend>roscpp</run_depend>
<export></export>
</package>
CMakeLists.txt
cmake_minimum_required(VERSION 2.8.3)
project(my_first_ros_pkg)
################################################
## Declare ROS messages, services and actions ##
################################################
## Generate added messages and services with any dependencies listed here
# generate_messages(
# DEPENDENCIES
# std_msgs
# )
################################################
## Declare ROS dynamic reconfigure parameters ##
################################################
###################################
## catkin specific configuration ##
###################################
## The catkin_package macro generates cmake config files for your package
## Declare things to be passed to dependent projects
## INCLUDE_DIRS: uncomment this if you package contains header files
## LIBRARIES: libraries you create in this project that dependent projects also need
## CATKIN_DEPENDS: catkin_packages dependent projects also need
## DEPENDS: system dependencies of this project that dependent projects also need
catkin_package(
# INCLUDE_DIRS include
# LIBRARIES my_first_ros_pkg
# CATKIN_DEPENDS roscpp std_msgs
# DEPENDS system_lib
)
###########
## Build ##
###########
#############
## Install ##
#############
## Mark other files for installation (e.g. launch and bag files, etc.)
# install(FILES
# # myfile1
# # myfile2
# DESTINATION ${CATKIN_PACKAGE_SHARE_DESTINATION}
# )
#############
## Testing ##
#############
Options in the build configuration file (CMakeLists.txt) are as follows. The line below specifies
the minimum version of CMake required on the operating system. Since it is specified as
version 2.8.3, if a lower version of CMake is installed, it must be updated to meet the minimum
requirement.
cmake_minimum_required(VERSION 2.8.3)
The project entry specifies the name of the package. Use the package name entered in ‘package.
xml’. Note that if the project name differs from the package name described in the <name>
tag of ‘package.xml’, an error will occur when building the package.
project(my_first_ros_pkg)
The following is a method used when using packages other than ROS. For example, when
using Boost, the ‘system’ package must be installed beforehand. This feature is an option that
allows you to install dependent packages.
The ‘catkin_python_setup()’ is an option when using Python with ‘rospy’. It invokes the
Python installation process ‘setup.py’.
catkin_python_setup()
‘add_message_files’ is an option to add a message file. The ‘FILES’ option will automatically
generate a header file (*.h) by referring to the ‘.msg’ files in the ‘msg’ folder of the current
package. In this example, message files Message1.msg and Message2.msg are used.
add_message_files(
FILES
Message1.msg
Message2.msg
)
‘add_service_files’ is an option to add a service file to use. The ‘FILES’ option will refer to ‘.srv’
files in the ‘srv’ folder in the package. In this example, you have the option to use the service files
Service1.srv and Service2.srv.
add_service_files(
FILES
Service1.srv
Service2.srv
)
generate_messages(
DEPENDENCIES
std_msgs
)
generate_dynamic_reconfigure_options(
cfg/DynReconf1.cfg
cfg/DynReconf2.cfg
)
The following are the options when performing a build on Catkin. ‘INCLUDE_DIRS’ is a
setting that specifies to use the header file in the ‘include’ folder, which is the internal folder of
the package. ‘LIBRARIES’ is a setting used to specify the package library in the following
configuration. ‘CATKIN_DEPENDS’ specifies dependent packages and in this example, the
dependent packages are set to ‘roscpp’ and ‘std_msgs’. ‘DEPENDS’ is a setting that describes
system-dependent packages.
catkin_package(
INCLUDE_DIRS include
LIBRARIES my_first_ros_pkg
CATKIN_DEPENDS roscpp std_msgs
DEPENDS system_lib
)
include_directories(
${catkin_INCLUDE_DIRS}
)
‘add_library’ declares the library to be created after the build. The following option will
create the ‘my_first_ros_pkg’ library from the ‘my_first_ros_pkg.cpp’ file in the ‘src’ folder.
add_library(my_first_ros_pkg src/my_first_ros_pkg.cpp)
‘add_dependencies’ is a command that performs certain tasks prior to the build process, such
as generating dependent messages or dynamic reconfigure headers. The following describes
the generation of dependent messages and dynamic reconfigurations as dependencies of the
‘my_first_ros_pkg’ library.
add_dependencies(my_first_ros_pkg ${${PROJECT_NAME}_EXPORTED_TARGETS} ${catkin_EXPORTED_TARGETS})
‘add_executable’ specifies the executable to be created after the build. The option specifies
the system to refer to the ‘src/my_first_ros_pkg_node.cpp’ file to generate the ‘my_first_ros_pkg_
node’ executable file. If there are multiple ‘*.cpp’ files to be referenced, append them after ‘my_
first_ros_pkg_node.cpp’. If there are two or more executable files to be created, add an additional
‘add_executable’ entry.
add_executable(my_first_ros_pkg_node src/my_first_ros_pkg_node.cpp)
‘target_link_libraries’ specifies the libraries that an executable (or another library) must be
linked against before the executable file is created.
target_link_libraries(my_first_ros_pkg_node
${catkin_LIBRARIES}
)
In addition, an Install option used when creating an official ROS distribution package and a
Testing option used for package tests are provided.
CMakeLists.txt
cmake_minimum_required(VERSION 2.8.3)
project(my_first_ros_pkg)
find_package(catkin REQUIRED COMPONENTS roscpp std_msgs)
catkin_package(CATKIN_DEPENDS roscpp std_msgs)
include_directories(${catkin_INCLUDE_DIRS})
add_executable(hello_world_node src/hello_world_node.cpp)
target_link_libraries(hello_world_node ${catkin_LIBRARIES})
add_executable(hello_world_node src/hello_world_node.cpp)
This is the setting to create the executable ‘hello_world_node’ by referring to the ‘hello_
world_node.cpp’ source code in the ‘src’ folder of the package. As the ‘hello_world_node.cpp’
source code has to be created and written by the developer, let’s write a simple example.
First, move to the source code folder (src) in your package folder by using ‘cd’ command and
create the ‘hello_world_node.cpp’ file as shown below. This example uses the gedit editor, but
you can use your preferred editor, such as vi, gedit, qtcreator, vim, or emacs.
$ cd ~/catkin_ws/src/my_first_ros_pkg/src/
$ gedit hello_world_node.cpp
hello_world_node.cpp
#include <ros/ros.h>
#include <std_msgs/String.h>
#include <sstream>
int main(int argc, char **argv)
{
  ros::init(argc, argv, "hello_world_node");
  ros::NodeHandle nh;
  ros::Publisher chatter_pub = nh.advertise<std_msgs::String>("say_hello_world", 1000);
  ros::Rate loop_rate(10);
  int count = 0;

  while (ros::ok())
  {
    std_msgs::String msg;
    std::stringstream ss;
    ss << "hello world!" << count;
    msg.data = ss.str();
    ROS_INFO("%s", msg.data.c_str());
    chatter_pub.publish(msg);
    ros::spinOnce();
    loop_rate.sleep();
    ++count;
  }
  return 0;
}
Running ‘rospack profile’ refreshes the package index so that the newly created package is
recognized.
$ rospack profile
The following is the Catkin build. Go to the Catkin workspace and build the package.
$ cd ~/catkin_ws
$ catkin_make
As mentioned in Section 3.2 Setting Up the ROS Development Environment, if you set alias cm=’cd
~/catkin_ws && catkin_make’ in the ‘.bashrc’ file, you can replace the above command with ‘cm’
command in the terminal window. As it is very useful, make sure to set it by referring to the ROS
Development Environment setup section.
The next step is to run the node. Open a terminal window (Ctrl + Alt + t) and run roscore
before running the node. Note that roscore must be running in order to execute ROS nodes, and
roscore only needs to be run once unless it stops.
$ roscore
Finally, open a new terminal window (Ctrl + Alt + t) and run the node with the command
below. This command runs the ‘hello_world_node’ node in the ‘my_first_ros_pkg’ package.
$ rosrun my_first_ros_pkg hello_world_node
When the node is running, strings such as ‘hello world!0’, ‘hello world!1’, ‘hello world!2’, and
so on can be seen in the terminal window. This is not an actual message transfer in ROS, but
rather the result of the build system example discussed in this section. Since this section is
intended to describe the build system of ROS, the source code for messages and nodes will be
discussed in more detail in the following chapters.
ROS Commands
ROS Wiki
The ROS commands are explained in detail on the Wiki page ‘http://wiki.ros.org/ROS/CommandLineTools’.
In addition, the GitHub repository ‘https://github.com/ros/cheatsheet/releases’ summarizes the
important commands described in this chapter. It is a helpful reference to use along with the
descriptions in this chapter.
When using ROS, we enter commands in a shell environment to perform tasks such as navigating
the file system, editing, building and debugging source code, managing packages, etc. In order to
use ROS properly, we need to familiarize ourselves not only with the basic Linux commands but
also with the ROS-specific commands.
In order to become proficient with the various ROS commands, we will give a brief
description of the function of each command and introduce them one by one with examples.
Each command is ranked with up to three stars based on its frequency of use and importance.
It may take some time to get used to the commands, but the more you use them, the more
quickly and easily you will be able to use each kind of ROS function.
From these, we will look into the commands ‘roscd’, ‘rosls’, and ‘rosed’ which are used
frequently.
1 http://wiki.ros.org/rosbash
In order to use ROS shell commands, ‘rosbash’ must be installed using the following command,
and can only be used in a terminal window that has configured ‘source /opt/ros/<ros distribution>/
setup.bash’. This does not have to be installed separately, and if you have finished building the ROS
development environment from Chapter 3 then you will be able to use it.
$ sudo apt-get install ros-<ros distribution>-rosbash
roscd [PACKAGE_NAME]
This is a command to move to the directory where a package is saved. The basic usage is to
type the ‘roscd’ command followed by the package name as a parameter. In the following
example, the turtlesim package is in the folder where ROS is installed, so we get the following
result. If you instead pass the name of a package that you created (for example, my_first_ros_pkg
created in Chapter 4), it moves to the folder of that package. This command is frequently used
when working with ROS from the command line.
$ roscd turtlesim
/opt/ros/kinetic/share/turtlesim $
$ roscd my_first_ros_pkg
~/catkin_ws/src/my_first_ros_pkg $
The ros-kinetic-turtlesim package must be installed in order to get the identical result shown
above. If it is not installed yet, you can install it with the following command.
$ sudo apt-get install ros-kinetic-turtlesim
If the package is already installed, you will see a message saying that the package is already
installed, as shown below.
rosls [PACKAGE_NAME]
This is a command to check the file list of the specific ROS package. We can use the ‘roscd’
command to move to the corresponding package folder and then use the ‘ls’ command to
perform the same function, but this command is used occasionally when we need to check
without moving to the package directory.
$ rosls turtlesim
cmake images msg srv package.xml
rosed [PACKAGE_NAME] [FILE_NAME]
This is a command used to edit a specific file in a package. If you run this command, it
opens the corresponding file with the editor that the user has set up. It is often used to
quickly make a simple modification. The user can set which editor is used by adding a line
such as export EDITOR='emacs -nw' to the ‘~/.bashrc’ file. As previously mentioned, this
command is used for simple tasks that modify a file directly from the command window; it is
not recommended for complex tasks and is not a command that is used often.
roscore [OPTION]
Roscore is the master that manages the connection information for communication among
nodes, and is an essential element that must be the first to be launched to use ROS. The ROS
master is launched by the ‘roscore’ command, and runs as an XMLRPC server. The master
registers node information such as names, topics and service names, message types, URI
addresses and ports, and when there is a request this information is passed to other nodes. Upon
the launch of ‘roscore’, ‘rosout’ is executed as well, which is used to record ROS standard output
logs such as DEBUG, INFO, WARN, ERROR, FATAL, etc. Also the parameter server which
manages the parameters is executed.
When roscore is running, the URI configured in the ROS_MASTER_URI is set as the master
URI to run the master. ROS_MASTER_URI can be configured by the user in ‘~/.bashrc’ file as
mentioned in the ROS Configuration section in Chapter 3.
$ roscore
... logging to /home/pyo/.ros/log/c2d0b528-6536-11e7-935b-08d40c80c500/roslaunch-pyo-20002.log
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.
SUMMARY
========
PARAMETERS
* /rosdistro: kinetic
* /rosversion: 1.12.7
NODES
auto-starting new master
In the terminal screen, we can see that the logs are saved in the directory ‘/home/xxx/.ros/
log/’. The displayed messages also show that roscore can be closed with [Ctrl+c], the information
of the roslaunch server and ROS_MASTER_URI, that the parameters ‘/rosdistro’ and ‘/rosversion’
are set on the parameter server, and that the /rosout node is running.
In the above example, it shows that the path where the logs are saved is ‘/home/xxx/.ros/log/’, but
in reality it is saved in the location where the ROS_HOME environment variable is configured. If the
ROS_HOME environment variable has not been configured, then the default value is ‘~/.ros/log/’.
rosrun [PACKAGE_NAME] [NODE_NAME]
Rosrun is a command that runs a single node in the specified package. The following
example runs the ‘turtlesim_node’ node in the turtlesim package. For your
information, the turtle icon that appears on the screen is selected randomly.
$ rosrun turtlesim turtlesim_node
roslaunch [PACKAGE_NAME] [LAUNCH_FILE_NAME]
Roslaunch is a command that executes more than one node in the specified package, or sets
execution options. As shown in the following example, simply launching the ‘openni_launch’
package will run more than 20 nodes and more than 10 parameter servers, such as ‘camera_
nodelet_manager’, ‘depth_metric’, ‘depth_metric_rect’, ‘depth_points’, etc. As we can see, using
a launch file is quite useful for running multiple nodes at the same time, and it is a frequently
used execution method in ROS. More information about creating the ‘*.launch’ files used in
this example will be provided in Section 7.6 Using roslaunch.
$ roslaunch openni_launch openni.launch
Note that in order to run this example and get the same result, the relevant package
‘ros-kinetic-openni-launch’ must be installed. If it is not installed yet, you can install it with the
following command.
$ sudo apt-get install ros-kinetic-openni-launch
rosclean [OPTION]
This is a command to check or delete the ROS log file. As ‘roscore’ is launched, the history of
all nodes is recorded in the log file and the data accumulates over time, so it needs to be
periodically deleted using the ‘rosclean’ command.
$ rosclean check
320K ROS node logs → This means the total usage for the ROS node is 320KB
When running ‘roscore’, if the following WARNING message appears, it means that the log
file exceeds 1GB. If the system is running out of space for the log, clean up the space with
‘rosclean’ command.
The following is an example of deleting logs in the ROS log repository (it is ‘/home/pyo/.ros/log’
in this example). If you wish to delete them, press the ‘y’ key to proceed.
$ rosclean purge
Purging ROS node logs.
PLEASE BE CAREFUL TO VERIFY THE COMMAND BELOW!
Okay to perform:
rm -rf /home/pyo/.ros/log
(y/n)?
Run roscore
Close all the terminals to avoid any conflicts with other processes. Then open a new terminal
and run the following command.
$ roscore
In order to run ‘turtlesim_node’ in the ‘turtlesim’ package, open a new terminal and run
the following command. A blue screen with a randomly chosen turtle image will appear.
$ rosrun turtlesim turtlesim_node
Command Description
rosnode machine [PC_NAME OR IP] Check the list of nodes running on the corresponding PC
rosnode cleanup Delete the registered information of ghost nodes whose connection information cannot be verified
$ rosnode list
/rosout
/teleop_turtle
/turtlesim
In the previous example, ‘turtlesim_node’ and ‘turtle_teleop_key’ were executed, yet ‘rosnode list’
shows ‘/teleop_turtle’ and ‘/turtlesim’. This is because the name of the executable differs from the
node name registered at runtime. For example, the ‘turtle_teleop_key’ node sets its name with
“ros::init(argc, argv, “teleop_turtle”);” in the source file. We recommend keeping the executable
name and the actual node name identical.
If there is a problem running the corresponding node or the communication has been
interrupted, the following error message will appear.
rosnode machine [PC_NAME OR IP]: Check the list of nodes running on the
corresponding PC
By using this command, we can see the list of nodes that are running on a specific device (PC
or mobile device).
rosnode kill [NODE_NAME]: Kill the specified node
If we close a node with this command, a warning message will appear in the terminal
window where the corresponding node is running, as shown below, and the node will be closed.
rosnode cleanup: Delete the registered information of the ghost nodes with
unverified connection information
This command deletes unverified connection information of ghost nodes. When a node
shuts down abnormally due to an unexpected error, this command deletes the corresponding
node from the list even though its connection information has been lost. Although this command
is not frequently used, it is useful because you do not need to restart ‘roscore’
to remove the ghost node.
$ rosnode cleanup
Command Description
rostopic echo [TOPIC_NAME] Show the content of a message in real time for a specific topic
rostopic find [TYPE_NAME] Show the topics that use a specific message type
rostopic hz [TOPIC_NAME] Show the message publishing rate of a specific topic
rostopic pub [TOPIC_NAME] [MESSAGE_TYPE] [PARAMETER] Publish a message with the specified topic name
Close all the nodes before running the example regarding ROS topic. Then run ‘roscore’,
‘turtlesim_node’ and ‘turtle_teleop_key’ in three different terminal windows by running the
following commands.
$ roscore
$ rosrun turtlesim turtlesim_node
$ rosrun turtlesim turtle_teleop_key
$ rostopic list
/rosout
/rosout_agg
/turtle1/cmd_vel
/turtle1/color_sensor
/turtle1/pose
If you add the ‘-v’ option to the ‘rostopic list’ command, it separates the published topics and
the subscribed topics, and shows the message type for each topic as well.
$ rostopic list -v
Published topics:
* /turtle1/color_sensor [turtlesim/Color] 1 publisher
* /turtle1/cmd_vel [geometry_msgs/Twist] 1 publisher
* /rosout [rosgraph_msgs/Log] 2 publishers
* /rosout_agg [rosgraph_msgs/Log] 1 publisher
* /turtle1/pose [turtlesim/Pose] 1 publisher
Subscribed topics:
* /turtle1/cmd_vel [geometry_msgs/Twist] 1 subscriber
* /rosout [rosgraph_msgs/Log] 1 subscriber
rostopic bw [TOPIC_NAME]: Show the bandwidth used by a specific topic
$ rostopic bw /turtle1/pose
subscribed to [/turtle1/pose]
average: 1.27KB/s
mean: 0.02KB min: 0.02KB max: 0.02KB window: 62 ...
~ omitted ~
$ rostopic pub -1 /turtle1/cmd_vel geometry_msgs/Twist -- '[2.0, 0.0, 0.0]' '[0.0, 0.0, 1.8]'
publishing and latching message for 3.0 seconds
■■ -1: Publish the message only once (it runs only once, but the message is latched for 3
seconds as shown above).
■■ /turtle1/cmd_vel: the topic name to publish to
■■ -- ‘[2.0, 0.0, 0.0]’ ‘[0.0, 0.0, 1.8]’: move along the x-axis at a speed of 2.0 m per
second, while rotating about the z-axis at 1.8 rad per second
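As a quick numeric sanity check on that Twist command (this sketch is ours, not part of the ROS tools): a constant linear velocity v combined with a constant angular velocity ω drives the turtle along a circle of radius v/ω.

```python
import math

# Velocities from the rostopic pub example above
v = 2.0  # linear velocity along x, in m/s
w = 1.8  # angular velocity about z, in rad/s

# A constant (v, w) pair traces a circle of radius v / w
radius = v / w
print(round(radius, 3))  # turning radius in meters -> 1.111

# After t seconds of motion, the turtle has traveled v * t meters
# along the arc and its heading has changed by w * t radians.
t = 1.0
print(round(v * t, 3), round(math.degrees(w * t), 1))  # -> 2.0 103.1
```

So the example command makes the turtle sweep out an arc of roughly 1.11 m radius while the message is latched.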
Command Description
rosservice call [SERVICE_NAME] [PARAMETER] Request service with the input parameter
Close all nodes before running the example regarding ROS service. Then run ‘roscore’,
‘turtlesim_node’ and ‘turtle_teleop_key’ in different terminal windows by running the following
commands.
$ roscore
$ rosrun turtlesim turtlesim_node
$ rosrun turtlesim turtle_teleop_key
$ rosservice list
/clear
/kill
/reset
/rosout/get_loggers
/rosout/set_logger_level
/spawn
/teleop_turtle/get_loggers
/teleop_turtle/set_logger_level
/turtle1/set_pen
/turtle1/teleport_absolute
/turtle1/teleport_relative
/turtlesim/get_loggers
/turtlesim/set_logger_level
$ rosservice call /turtle1/set_pen 255 0 0 5 0
Using the command above, we requested a service that changes the properties of the pen
used in turtlesim. After sending movement commands through ‘turtle_teleop_key’, we can see
that the pen color, previously white, is now displayed in red.
Let us close all the nodes before running the example regarding ROS parameter. Then run
‘roscore’, ‘turtlesim_node’ and ‘turtle_teleop_key’ in different terminal windows by running the
following commands.
$ roscore
$ rosrun turtlesim turtlesim_node
$ rosrun turtlesim turtle_teleop_key
$ rosparam list
/background_b
/background_g
/background_r
/rosdistro
/roslaunch/uris/host_192_168_1_100__39536
/rosversion
/run_id
If you want to check the values of all the other parameters apart from a specific parameter,
you can use ‘/’ as an option which will show the values of all the parameters as below.
$ rosparam get /
background_b: 255
background_g: 86
background_r: 69
rosdistro: 'kinetic'
roslaunch:
uris: {host_192_168_1_100__43517: 'http://192.168.1.100:43517/'}
rosversion: '1.12.7'
run_id: c2d0b528-6536-11e7-935b-08d40c80c500
RGB is changed from ‘255, 86, 69’ to ‘0, 86, 69’, so the color becomes a dark green as shown in
the picture on the right in Image 5-4. Note that the ‘turtlesim’ node does not read and apply
parameters immediately: we first modify the parameter with the command ‘rosparam set
/background_b 0’ and then refresh the screen with the command ‘rosservice call /clear’. How and
when a parameter is applied depends on the node.
Let us close all the nodes before running the example regarding ROS message information.
Then run ‘roscore’, ‘turtlesim_node’ and ‘turtle_teleop_key’ in different terminal windows by
running the following commands.
$ roscore
$ rosrun turtlesim turtlesim_node
$ rosrun turtlesim turtle_teleop_key
$ rosmsg list
actionlib/TestAction
actionlib/TestActionFeedback
actionlib/TestActionGoal
actionlib/TestActionResult
$ rosmsg packages
actionlib
actionlib_msgs
actionlib_tutorials
base_local_planner
bond
control_msgs
costmap_2d
~omitted~
Let us close all the nodes before running the example regarding ROS service information.
Then run ‘roscore’, ‘turtlesim_node’ and ‘turtle_teleop_key’ in different terminal windows by
running the following commands.
$ roscore
$ rosrun turtlesim turtlesim_node
$ rosrun turtlesim turtle_teleop_key
$ rossrv packages
control_msgs
diagnostic_msgs
dynamic_reconfigure
gazebo_msgs
map_msgs
nav_msgs
navfn
nodelet
oroca_ros_tutorials
roscpp
sensor_msgs
std_srvs
tf
tf2_msgs
turtlesim
~omitted~
rosbag record [OPTION] [TOPIC_NAME] Record the messages of a specific topic to a bag file
rosbag filter [INPUT_FILE] [OUTPUT_FILE] [OPTION] Create a new bag file with the specified content removed
rosbag check [FILE_NAME] Check whether the specified bag file can be played in the current system
rosbag fix [INPUT_FILE] [OUTPUT_FILE] [OPTION] Fix a bag file that was saved in an incompatible version
Let us close all the nodes before running the example regarding ROS log information. Then
run ‘roscore’, ‘turtlesim_node’ and ‘turtle_teleop_key’ in different terminal windows by running
the following commands.
$ roscore
$ rosrun turtlesim turtlesim_node
$ rosrun turtlesim turtle_teleop_key
$ rostopic list
/rosout
/rosout_agg
/turtle1/cmd_vel
/turtle1/color_sensor
/turtle1/pose
If you wish to record all the topics at the same time, then add the ‘-a’ option.
$ rosbag record -a
[WARN] [1499664121.243116836]: --max-splits is ignored without --split
[INFO] [1499664121.248582681]: Recording to 2017-07-10-14-22-01.bag.
[INFO] [1499664121.248879947]: Subscribing to /turtle1/color_sensor
[INFO] [1499664121.252689657]: Subscribing to /rosout
[INFO] [1499664121.257219911]: Subscribing to /rosout_agg
[INFO] [1499664121.260671283]: Subscribing to /turtle1/pose
As in the following figure, we can see that the original data and the data during playback are
the same.
After compressing the bag file from the example above with ‘rosbag compress’, it is reduced to
roughly a quarter of its original size, as shown below. The original file before compression is
saved separately with an ‘orig’ tag added to its name.
2017-07-10-14-16-28.bag 12.7kB
2017-07-10-14-16-28.orig.bag 45.5kB
catkin_make [OPTION]
$ cd ~/catkin_ws
$ catkin_make
To build only some of the packages rather than all of them, run the command with the
‘--pkg [PACKAGE_NAME]’ option as shown below.
$ cd ~/catkin_ws
$ catkin_make --pkg [PACKAGE_NAME]
The ‘catkin_eclipse’ command re-generates the build information so that packages built with
the catkin build system can be imported into the Eclipse IDE.
$ cd ~/catkin_ws
$ catkin_eclipse
The ‘catkin_init_workspace’ command initializes a catkin workspace by creating a link to the
top-level ‘CMakeLists.txt’ in the source folder. It only needs to be run once, when the workspace
is first created.
$ cd ~/catkin_ws/src
$ catkin_init_workspace
catkin_find [PACKAGE_NAME]
By using the ‘catkin_find’ command, we can list all the workspace folders currently in use.
Additionally, if we run ‘catkin_find [PACKAGE_NAME]’, it will show only the folders relevant
to the package specified in the option, as shown below.
$ catkin_find
/home/pyo/catkin_ws/devel/include
/home/pyo/catkin_ws/devel/lib
/home/pyo/catkin_ws/devel/share
/opt/ros/kinetic/bin
/opt/ros/kinetic/etc
/opt/ros/kinetic/include
/opt/ros/kinetic/lib
/opt/ros/kinetic/share
The ‘rospack’ command shows information about a ROS package, such as its saved location,
its dependencies, and the entire package list, and we can use options such as ‘find’,
‘list’, ‘depends-on’, ‘depends’, ‘profile’, etc. As shown in the example below, if we specify a package
name after the ‘rospack find’ command, the saved location of the package will be shown.
$ rospack find turtlesim
The ‘rospack list’ command shows all the packages on the PC. By combining the ‘rospack list’
command with the Linux search command ‘grep’, we can easily find a package. For instance,
running ‘rospack list | grep turtle’ will only display the packages related to turtle.
$ rospack list | grep turtle
If we specify a package name after the ‘rospack depends-on’ command, it will show only the
packages that depend on the specified package, as in the following example.
$ rospack depends-on turtlesim
If we specify the package name after the ‘rospack depends’ command, it will show the
dependency packages needed to run the specified package, as in the following example.
$ rospack depends turtlesim
The ‘rospack profile’ command re-indexes the package by checking the package information
and working folders such as ‘/opt/ros/kinetic/share’ or ‘~/catkin_ws/src’ where packages are
saved. You can use this command when a newly added package is not listed by the ‘roscd’
command.
$ rospack profile
Full tree crawl took 0.021790 seconds.
Directories marked with (*) contain no manifest. You may
want to delete these directories.
To get just a list of directories without manifests,
re-run the profile with --zombie-only
-------------------------------------------------------------
0.020444 /opt/ros/kinetic/share
0.000676 /home/pyo/catkin_ws/src
0.000606 /home/pyo/catkin_ws/src/ros_tutorials
0.000240 * /opt/ros/kinetic/share/OpenCV-3.2.0-dev
0.000054 * /opt/ros/kinetic/share/OpenCV-3.2.0-dev/haarcascades
0.000035 * /opt/ros/kinetic/share/doc
0.000020 * /opt/ros/kinetic/share/OpenCV-3.2.0-dev/lbpcascades
0.000008 * /opt/ros/kinetic/share/doc/liborocos-kdl
rosdep [OPTION]
The ‘roslocate’ command shows information such as the ROS version used for the
package, SCM type, repository location, and so on. Available options are ‘info’, ‘vcs’, ‘type’, ‘uri’,
‘repo’, etc. Here we will look at ‘info’, which shows all of this information at once.
$ roslocate info turtlesim
ROS Tools
■■ rqt_graph: A tool that visualizes the correlation between nodes and messages
as a graph (a type of rqt)
■■ rqt_plot: 2D data plot tool (a type of rqt)
1 http://wiki.ros.org/rviz
Figure 6-2 RViz example 1: Navigation using TurtleBot3 and LDS sensor
2 http://wiki.ros.org/rviz/Tutorials/Interactive%20Markers%3A%20Getting%20Started
3 http://wiki.ros.org/urdf
The execution command of RViz is as follows. However, just as for any other ROS tool,
roscore must be running. For your reference, you can also run it with the node running
command ‘rosrun rviz rviz’.
$ rviz
Figure 6-6 Composition of the RViz screen
➊ 3D View: This black area is located in the middle of the screen. It is the main screen which
allows us to see various data in 3D. Options such as background color of the 3D view, fixed
frame, and grid can be configured in the Global Options and Grid settings on the left column
of the screen.
➋ Displays: The Displays panel on the left column is for selecting the data that we want to
display from the various topics. If we click [Add] button on the lower left corner of the panel,
the display4 selection screen will appear as shown in Figure 6-7. Currently, there are about 30
different types of displays we can choose from, which we will explore more in the following
section.
➌ Menu: The Menu bar is located on the top of the screen. We can select commands to save or
load the current display settings, and also can select various panels.
➍ Tools: Tools are located below the menu bar, where we can select buttons for various functions
such as interact, camera movement, selection, camera focus change, distance measurement,
2D position estimation, 2D navigation target-point, publish point.
4 http://wiki.ros.org/rviz/DisplayTypes
➏ Time: Time shows the current time (wall time), ROS Time, and the elapsed time for each of
them. This is mainly used in simulations, and if there is a need to restart it, simply click the
[Reset] button at the very bottom.
Effort: Displays the force applied to each rotary joint of the robot.
5 http://wiki.ros.org/rviz/DisplayTypes
PointCloud / PointCloud2: Displays point cloud data. This is used to display sensor data
from depth cameras such as RealSense, Kinect, Xtion, etc. Since
PointCloud2 is compatible with the latest Point Cloud Library (PCL),
we can generally use PointCloud2.
Not only that, but as the name suggests, rqt was developed based on Qt, which is a cross-
platform framework widely used for GUI programming, making it very convenient for users to
freely develop and add plugins. In this section we will learn about the ‘rqt’ plugins
‘rqt_image_view’, ‘rqt_graph’, ‘rqt_plot’ and ‘rqt_bag’.
The command to run rqt is as follows. You can simply type in ‘rqt’ on the terminal. For your
reference, we can also run it with the node execution command ‘rosrun rqt_gui rqt_gui’.
$ rqt
If we run ‘rqt’ then the GUI screen of rqt will appear as shown in Figure 6-8. If it is the first
time being launched, it will display only the menu without any content below. This is because
the plugin, which is the program that is directly run by ‘rqt’, has not been specified.
6 http://wiki.ros.org/rqt
■■ File: The File menu only contains the sub-menu to close ‘rqt’.
■■ Running: The currently running plugins are shown, and they can be stopped when
they are not needed.
■■ Perspectives: This menu saves the operating plugins as a set so that the
same plugins can be run again later.
7 http://wiki.ros.org/rqt/Plugins
8 http://wiki.ros.org/rqt_common_plugins
Configuration
■■ Dynamic Reconfigure: This is a plugin to modify the parameter value of a node.
■■ Launch: This is a GUI plugin for roslaunch, which is useful when we cannot remember the
name or composition of a roslaunch file.
Introspection
■■ Node Graph: This is a plugin for the graphical view that allows us to check the relationship
diagram of the currently running nodes and message flows.
■■ Package Graph: This is a plugin for the graphical view that displays the dependencies of the
packages.
■■ Process Monitor: We can check the PID (process ID), CPU usage, memory usage, and
number of threads of the currently running nodes.
Logging
■■ Bag: This is a plugin regarding the ROS data logging.
■■ Console: This is a plugin to check the warning and error messages occurring in the nodes
in one screen.
■■ Logger Level: This is a tool to select a logger, which is responsible for publishing the logs,
and to set the logger level9 so that a specific log level such as ‘Debug’, ‘Info’, ‘Warn’, ‘Error’,
or ‘Fatal’ is published. Selecting ‘Debug’ is very convenient during the debugging process.
Miscellaneous Tools
■■ Python Console: This is a plugin for the Python console screen.
9 http://wiki.ros.org/roscpp/Overview/Logging
■■ Moveit! Monitor: This is a plugin to check the MoveIt! data that is used for motion planning.
■■ Robot Steering: This is a GUI tool for manual robot control, which is useful for
controlling a robot remotely.
■■ Runtime Monitor: This is a plugin to check the warnings or errors of the nodes in real-time.
Services
■■ Service Caller: This is a GUI plugin that connects to a running service server and requests a
service. This is useful for testing services.
■■ Service Type Browser: This is a plugin to check the data structure of a service type.
Topics
■■ Easy Message Publisher: This is a plugin that can publish a topic in a GUI environment.
■■ Topic Publisher: This is a GUI plugin that can publish a topic. This is useful for topic testing.
■■ Topic Type Browser: This is a plugin that can check the data structure of a topic. This is
useful for checking the topic type.
■■ Topic Monitor: This is a plugin that lists the currently used topics, and checks the
information of the selected topic from the list.
Visualization
■■ Image View: This is a plugin that can check the image data from a camera. This is useful for
simple camera data testing.
■■ Navigation Viewer: This is a plugin to check the position or goal point of the robot during
navigation.
■■ Plot: This is a GUI plugin for plotting 2D data. This is useful for schematizing 2D data.
■■ Pose View: This is a plugin for displaying the pose (position + orientation) of a robot model
or TF.
■■ RViz: This is the RViz plugin which is a tool for 3D visualization.
■■ TF Tree: This is a plugin of a graphical view type that shows the relationship of each
coordinate frame acquired from the TF in a tree structure.
Since it is difficult to introduce all of the plugins, in this chapter we will learn about the ones
that are most frequently used, which are ‘rqt_image_view’, ‘rqt_bag’, ‘rqt_graph’ and ‘rqt_plot’.
6.2.3. rqt_image_view
This is a plugin10 to display the image data of a camera. Although it does not perform image
processing itself, it is still quite useful for quickly checking an image. A USB camera generally
supports UVC, so we can use the ‘uvc_camera’ package of ROS. First, install the ‘uvc_camera’
package using the following command.
$ sudo apt-get install ros-kinetic-uvc-camera
Connect the USB camera to the USB port of the PC, and launch the ‘uvc_camera_node’ in the
‘uvc_camera’ package using the following command.
$ rosrun uvc_camera uvc_camera_node
10 http://wiki.ros.org/rqt_image_view
$ rqt
Apart from selecting the plugin from the rqt menu, we can also use the dedicated execution
command shown below.
$ rqt_image_view
Figure 6-10 Checking the image data of the USB camera in the image view
After executing all nodes, launch ‘rqt’ with the ‘rqt’ command, and go to the menu to select
[Plugins] → [Node Graph]. For your information, we can also run it with ‘rqt_graph’ without
having to manually select the plugin from the menu.
The correlation among nodes and topics when ‘rqt_graph’ is running is shown as Figure 6-11.
$ rqt
11 http://wiki.ros.org/rqt_graph
We can also verify that the ‘uvc_camera’ node in the ‘uvc_camera’ package is publishing the
‘/image_raw’ topic message and that the ‘image_view_xxx’ node is subscribing to it. Unlike this
simple example, actual ROS programs often consist of dozens of nodes exchanging various topic
messages. In such situations, ‘rqt_graph’ becomes very useful for checking the correlation of
nodes on the ROS network.
6.2.5. rqt_plot
This time let’s run ‘rqt_plot’ with the following command instead of selecting the plugin from
the rqt menu. For your reference, we can also run it with the node execution command
‘rosrun rqt_plot rqt_plot’.
$ rqt_plot
Once ‘rqt_plot’ is up and running, click the gear shaped icon on the top right corner of the
program. We can select the option as shown in Figure 6-12, where the default setting is ‘MatPlot’.
Apart from MatPlot we can also use PyQtGraph and QwtPlot, so refer to the corresponding
installation method and use the graph library of your choice.
For example, in order to use PyQtGraph as the default plot instead of MatPlot, download and
install the latest ‘python-pyqtgraph_0.9.xx-x_all.deb’ file from the download address below. If
installation is completed, the PyQtGraph item will be enabled and you will be able to use
PyQtGraph.
■■ http://www.pyqtgraph.org/downloads/
The ‘rqt_plot’12 is a tool for plotting 2D data. The plot tool receives ROS messages and displays
them on 2D coordinates. As an example, let us plot the x and y coordinates of the ‘turtlesim’
node’s pose message. First we need to launch ‘turtlesim_node’ of the turtlesim package.
$ rosrun turtlesim turtlesim_node
Enter ‘/turtle1/pose/’ in the Topic field at the top of the ‘rqt_plot’ tool, and it will draw the
‘/turtle1/pose/’ topic on a 2D plane (x-axis: time, y-axis: data value). Alternatively, we can run it
with the following command, specifying the topic to be plotted.
$ rqt_plot /turtle1/pose/
Then launch ‘turtle_teleop_key’ in the ‘turtlesim’ package so that we can move the turtle
around the screen.
$ rosrun turtlesim turtle_teleop_key
12 http://wiki.ros.org/rqt_plot
6.2.6. rqt_bag
The ‘rqt_bag’ is a GUI tool for visualizing messages. The ‘rosbag’ tool covered in ‘Section
5.4.8 rosbag: ROS Log Information’ is text-based, whereas ‘rqt_bag’ adds a visualization function
that allows us to see camera images right away, making it very useful for managing image data
messages. Before we begin, run all of the ‘turtlesim’ and ‘uvc_camera’ related nodes covered in
the ‘rqt_image_view’ and ‘rqt_graph’ sections. Then create a bag file with the ‘/image_raw’
message of the camera and the ‘/turtle1/cmd_vel’ message of ‘turtlesim’ using the following
command.
$ rosbag record /image_raw /turtle1/cmd_vel
In Section 5.4 we used the ‘rosbag’ program to record, play, and compress various ROS topic
messages as a bag file. The ‘rqt_bag’ is a GUI version of the previously introduced ‘rosbag’, and
just like rosbag it can record, play, and compress topic messages. In addition, since it is a GUI
program, all commands are provided as buttons that are easy to use, and we can also step
through the camera images over time as in a video editor.
As in the following example, in order to take advantage of the feature of ‘rqt_bag’, let us save
the USB camera image as a bag file and then play it with ‘rqt_bag’.
We have now covered the installation and use of the rqt tools. As we could not explain all of
the plugins in this section, we encourage you to try these tools for yourself, referring to the
examples we have seen so far. Although these tools are not directly involved with robots or
sensors in the way ROS nodes are, they are helpful supplementary tools for saving, preserving,
modifying, and analyzing data.
Basic
ROS
Programming
REP is a proposal that is used when suggesting rules, new functions, and management methods within
the ROS community. It is used to democratically create ROS rules or negotiate contents necessary for
development, operation and management of ROS. Once a proposal is received, many ROS users can
review and refer to it as a standard document that is created through mutual collaboration. An REP
document can be found at http://www.ros.org/reps/rep-0000.html.
1 http://www.ros.org/reps/rep-0103.html
You can use the right-hand-rule3 for the rotation direction of the robot. The direction that
your right hand curls is the positive rotation direction. For example, if the robot rotates from 12
to 9 o’clock direction, using the radian for the rotation angle, the robot rotates by +1.5708 radians
on the z-axis.
These coordinate representations are used frequently in ROS programming and must be
programmed in the form of x: forward, y: left, z: up.
2 http://www.ros.org/reps/rep-0103.html#coordinate-frame-conventions
3 http://en.wikipedia.org/wiki/Right-hand_rule
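The convention above can be checked numerically. The following sketch is our own illustration (it uses nothing beyond standard 2D rotation math): rotating the forward direction (+x) by +1.5708 rad about the z-axis should point to the left (+y), which is exactly what x: forward, y: left, z: up with a right-handed positive rotation implies.

```python
import math

def rotate_z(x, y, theta):
    """Rotate a point about the z-axis by theta radians (right-hand rule:
    positive theta is counterclockwise when viewed from +z)."""
    return (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))

# Forward direction (+x) rotated by +1.5708 rad (+90 degrees)...
fx, fy = rotate_z(1.0, 0.0, math.pi / 2)

# ...now points along +y, i.e. to the robot's left under the REP-103 axes.
print(round(fx, 6), round(fy, 6))  # -> 0.0 1.0
```

In other words, a robot facing 12 o'clock that rotates by +1.5708 rad about z ends up facing 9 o'clock, matching the example in the text.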
However, message, service, and action file names placed in the /msg, /srv, and /action folders
follow CamelCase rules. This is because the *.msg, *.srv, and *.action files are converted to
header files and then used as structures or types (e.g. TransformStamped.msg, SetSpeed.srv).
4 http://wiki.ros.org/CppStyleGuide
5 http://wiki.ros.org/PyStyleGuide
6 http://wiki.ros.org/ROS/Patterns/Conventions#Naming_ROS_Resources
$ cd ~/catkin_ws/src
$ catkin_create_pkg ros_tutorials_topic message_generation std_msgs roscpp
When the package is created, the ‘ros_tutorials_topic’ package folder is created in the ‘~/
catkin_ws/src’ folder. In this package folder, the ‘CMakeLists.txt’ and ‘package.xml’ files are
created along with default folders. You can inspect it with the ‘ls’ command as below, or check
the inside of the package using the GUI-based Nautilus, which is similar to Windows File
Explorer.
$ cd ros_tutorials_topic
$ ls
include → Header File Folder
src → Source Code Folder
CMakeLists.txt → Build Configuration File
package.xml → Package Configuration File
$ gedit package.xml
The following code shows how to modify the ‘package.xml’ file to match the package you
created. Personal information will be included in the content, so you can modify it as you wish.
For a detailed description of each option, see Section 4.9.
$ gedit CMakeLists.txt
The following is the modified code of CMakeLists.txt for the package we created. See Section
4.9 for a detailed description of each option.
ros_tutorials_topic/CMakeLists.txt
cmake_minimum_required(VERSION 2.8.3)
project(ros_tutorials_topic)
## A Catkin package option that describes the library, the Catkin build dependencies,
## and the system dependent packages.
catkin_package(
LIBRARIES ros_tutorials_topic
CATKIN_DEPENDS std_msgs roscpp
)
add_message_files(FILES MsgTutorial.msg)
The content of the message file is quite simple. The message contains a ‘stamp’ variable of the
‘time’ type and a ‘data’ variable of the ‘int32’ type. Besides these two types, the following are also
available: basic message types7 such as ‘bool’, ‘int8’, ‘int16’, ‘float32’, ‘string’, ‘time’, and ‘duration’,
as well as ‘common_msgs’8, a collection of messages frequently used in ROS. In this simple
example, we use time and int32.
ros_tutorials_topic/msg/MsgTutorial.msg
time stamp
int32 data
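For readers more comfortable reading structures than .msg syntax: the build converts this file into a type with one field per line. The Python dataclass below is only an analogy we wrote for illustration (the real build artifact is the generated C++ header ‘ros_tutorials_topic/MsgTutorial.h’ and a corresponding Python module), but it has the same shape.

```python
from dataclasses import dataclass

@dataclass
class MsgTutorial:
    """Analogy of the type generated from MsgTutorial.msg:
    'time stamp' becomes a timestamp field, 'int32 data' an integer field.
    (ROS 'time' is really a secs/nsecs pair; a float stands in here.)"""
    stamp: float
    data: int

# A publisher fills in both fields before sending the message.
msg = MsgTutorial(stamp=1499664121.25, data=42)
print(msg.data)  # -> 42
```

The generated C++ type is used the same way: assign to `msg.stamp` and `msg.data`, then publish.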
It is generally recommended to create a separate package for message files (‘msg’) and service
files (‘srv’) rather than including them in the package of the executable node. This is because
when the subscriber node and the publisher node run on different computers, both depend on
the identical message definition; if the message file lives inside an executable node’s package,
that whole package must be installed unnecessarily. If the message is created as an independent
package, the message package can simply be added as a dependency, eliminating unnecessary
dependencies between packages. However, we have included the message file in the executable
node’s package in this book to simplify the code.
add_executable(topic_publisher src/topic_publisher.cpp)
7 http://wiki.ros.org/std_msgs
8 http://wiki.ros.org/common_msgs
$ roscd ros_tutorials_topic/src → Move to the 'src' folder, which is the source folder of
the package
$ gedit topic_publisher.cpp → Create or modify new source file
ros_tutorials_topic/src/topic_publisher.cpp
// ROS Default Header File
#include "ros/ros.h"
// MsgTutorial Message File Header
// The header file is automatically created when building the package.
#include "ros_tutorials_topic/MsgTutorial.h"
// Set the loop period. '10' refers to 10 Hz and the main loop repeats at 0.1 second intervals
ros::Rate loop_rate(10);
while (ros::ok())
{
msg.stamp = ros::Time::now(); // Save current time in the stamp of 'msg'
msg.data = count; // Save the 'count' value in the data of 'msg'
return 0;
}
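Stripped of ROS itself, the publisher's main loop is just a rate-limited counter: ros::Rate(10) sleeps for whatever remains of each 0.1-second slot, and every iteration stamps the outgoing message with the current time and an incrementing count. The following plain-Python sketch of that control flow is our own illustration, not generated code:

```python
import time

def publisher_loop(rate_hz, iterations, publish):
    """Mimic the ros::Rate-driven loop: stamp + count each message,
    then sleep the remainder of the period (as ros::Rate::sleep() does)."""
    period = 1.0 / rate_hz  # 10 Hz -> 0.1 s per iteration
    count = 0
    for _ in range(iterations):
        start = time.monotonic()
        # Analogous to msg.stamp = ros::Time::now(); msg.data = count;
        msg = {"stamp": time.time(), "data": count}
        publish(msg)
        count += 1
        elapsed = time.monotonic() - start
        if elapsed < period:
            time.sleep(period - elapsed)

sent = []
publisher_loop(rate_hz=10, iterations=3, publish=sent.append)
print([m["data"] for m in sent])  # -> [0, 1, 2]
```

In the real node, `publish` corresponds to calling `publish()` on the advertised topic, and the loop also calls `ros::spinOnce()` each iteration.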
add_executable(topic_subscriber src/topic_subscriber.cpp)
This means that the ‘topic_subscriber.cpp’ file is built to create the ‘topic_subscriber’ executable
file. Let’s write code that performs the subscriber node functions in the following order:
$ roscd ros_tutorials_topic/src → Move to the 'src' folder, which is the source folder of
the package
$ gedit topic_subscriber.cpp → Create or modify new source file
ros_tutorials_topic/src/topic_subscriber.cpp
// ROS Default Header File
#include "ros/ros.h"
// MsgTutorial Message File Header
// The header file is automatically created when building the package.
#include "ros_tutorials_topic/MsgTutorial.h"
ros::NodeHandle nh; // Node handle declaration for communication with ROS system
return 0;
}
The output files of the built package are located in the ‘/build’ and ‘/devel’ folders under
‘~/catkin_ws’. The configuration used by the Catkin build is stored in the ‘/build’ folder,
executable files are stored in ‘/devel/lib/ros_tutorials_topic’, and the message header file
automatically generated from the message file is stored in ‘/devel/include/ros_tutorials_topic’.
Check the files in each folder to verify the created output.
$ roscore
$ rosrun ros_tutorials_topic topic_publisher
When you run the publisher, you will see the output shown in Figure 7-2. Note that the
string displayed on the screen is printed by the publisher itself using the ROS_INFO() function,
which is similar to the printf() function used in common programming languages. In order to
actually receive the message published on the topic, we must use something that acts as a
subscriber, such as a subscriber node or the ‘rostopic’ command.
Let’s use the ‘rostopic’ command to receive the topic published by ‘topic_publisher’. First, list
up the topics currently running on the ROS. Use ‘rostopic list’ command to verify that the ‘ros_
tutorial_msg’ topic is running.
$ rostopic list
/ros_tutorial_msg
/rosout
/rosout_agg
When the subscriber is executed, the output screen appears as shown in Figure 7-4. The message
published on the ‘ros_tutorial_msg’ topic is received, and its value is displayed on the screen.
$ rqt_graph or $ rqt
In this section, we have created the publisher and subscriber nodes used in topic
communication, and executed them to learn how nodes communicate with each other. The
example source can be found at the following GitHub address:
■■ https://github.com/ROBOTIS-GIT/ros_tutorials/tree/master/ros_tutorials_topic
If you want to run it right away, you can clone the source code with the following command
in the ‘~/catkin_ws/src’ folder and build the source. Then run the ‘topic_publisher’ and ‘topic_
subscriber’ nodes.
$ cd ~/catkin_ws/src
$ git clone https://github.com/ROBOTIS-GIT/ros_tutorials.git
$ cd ~/catkin_ws
$ catkin_make
Services are often used to request a robot to perform a specific action, or by nodes that must
trigger specific events under certain conditions. Since a service is a one-time communication
method, it is a very useful alternative to a topic when continuous updates are unnecessary, as it
consumes less network bandwidth.
In this section, we will create a simple service file and run a service server node and a service
client node.
$ cd ~/catkin_ws/src
$ catkin_create_pkg ros_tutorials_service message_generation std_msgs roscpp
When the package is created, the ‘ros_tutorials_service’ package folder is created in the ‘~/
catkin_ws/src’ folder. In this package folder, the ‘CMakeLists.txt’ and ‘package.xml’ files are
created along with default folders. You can inspect them with the ‘ls’ command as shown below.
$ cd ros_tutorials_service
$ ls
include → Header File Folder
src → Source Code Folder
CMakeLists.txt → Build Configuration File
package.xml → Package Configuration File
$ gedit package.xml
The following code shows how to modify the ‘package.xml’ file to suit the package you are
creating. Since personal information is included in the content, modify it as needed. For a
detailed description of each option, see Section 4.9.
ros_tutorials_service/package.xml
<?xml version="1.0"?>
<package>
<name>ros_tutorials_service</name>
<version>0.1.0</version>
<description>ROS tutorial package to learn the service</description>
<license>Apache License 2.0</license>
$ gedit CMakeLists.txt
ros_tutorials_service/CMakeLists.txt
cmake_minimum_required(VERSION 2.8.3)
project(ros_tutorials_service)
## A Catkin package option that describes the library, the Catkin build
## dependencies, and the system dependent packages.
catkin_package(
LIBRARIES ros_tutorials_service
CATKIN_DEPENDS std_msgs roscpp
)
add_service_files(FILES SrvTutorial.srv)
This option will include the ‘SrvTutorial.srv’ when building the package, which will be used
in this node. Let’s create the ‘SrvTutorial.srv’ file in the following order:
Let’s create a service file with ‘int64’ type ‘a’ and ‘b’ values for the service request, and an
‘int64’ type ‘result’ value for the service response, as follows. The ‘---’ is a delimiter that
separates the request from the response. The structure is similar to the message of the topic
described above, except for this ‘---’ delimiter between the request and response messages.
ros_tutorials_service/srv/SrvTutorial.srv
int64 a
int64 b
---
int64 result
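The service callback in ‘service_server.cpp’ simply adds the two request values and stores the sum in the response. Stripped of ROS, the arithmetic looks like the following sketch (the struct and function names below are our own simplified stand-ins for the types generated from ‘SrvTutorial.srv’, not the real generated headers):

```cpp
#include <cstdint>

// Simplified stand-ins for the request/response types that catkin generates
// from SrvTutorial.srv (the real ones live in ros_tutorials_service/SrvTutorial.h).
struct SrvRequest  { int64_t a; int64_t b; };
struct SrvResponse { int64_t result; };

// The service callback's arithmetic: store the sum of the two request
// values in the response, and return true to report the call as handled.
bool calculation(const SrvRequest& req, SrvResponse& res)
{
  res.result = req.a + req.b;
  return true;
}
```
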
add_executable(service_server src/service_server.cpp)
This means that the ‘service_server.cpp’ file is built to create the ‘service_server’ executable
file. Let’s create the code that performs the service server node function in the following order:
$ roscd ros_tutorials_service/src → Move to the 'src' folder, which is the source folder of
the package
$ gedit service_server.cpp → Create or modify the source file
ros_tutorials_service/src/service_server.cpp
// ROS Default Header File
#include "ros/ros.h"
// SrvTutorial Service File Header (Automatically created after build)
#include "ros_tutorials_service/SrvTutorial.h"
return true;
}
return 0;
}
add_executable(service_client src/service_client.cpp)
When the ‘service_client.cpp’ file is built, the ‘service_client’ executable file will be generated.
Let’s create a code that performs the service client node function in the following order:
ros_tutorials_service/src/service_client.cpp
#include "ros/ros.h" // ROS Default Header File
// SrvTutorial Service File Header (Automatically created after build)
#include "ros_tutorials_service/SrvTutorial.h"
#include <cstdlib> // Library for using the "atoll" function
ros::NodeHandle nh; // Node handle declaration for communication with ROS system
// Declares the 'srv' service that uses the 'SrvTutorial' service file
ros_tutorials_service::SrvTutorial srv;
// Parameters entered when the node is executed as a service request value are stored at 'a' and 'b'
srv.request.a = atoll(argv[1]);
srv.request.b = atoll(argv[2]);
// Request the service. If the request is accepted, display the response value
if (ros_tutorials_service_client.call(srv))
$ cd ~/catkin_ws && catkin_make → Go to the catkin folder and run the catkin build
The output of the build is saved in the ‘~/catkin_ws/build’ and ‘~/catkin_ws/devel’ folders. The
executable files are stored in ‘~/catkin_ws/devel/lib/ros_tutorials_service’ and the catkin build
configuration is stored in ‘~/catkin_ws/build’. The service header file that is automatically
generated from the message file is stored in ‘~/catkin_ws/devel/include/ros_tutorials_service’.
Check the files in each path above to verify the created output.
$ roscore
$ rosrun ros_tutorials_service service_server
[INFO] [1495726541.268629564]: ready srv server!
The parameters 2 and 3 entered with the execution command are transmitted as the service
request values: ‘a’ is requested as 2 and ‘b’ as 3, and the sum of these two values is transmitted
as the response value. In this case, execution parameters are used as the service request, but in
practice the request could be replaced with a command, a value to be calculated, or a variable
used as a trigger.
Note that the service cannot be seen in ‘rqt_graph’ because it is a one-time communication,
whereas topic publishers and subscribers maintain their connection as shown in Figure 7-6.
Write the corresponding service name, such as ‘/ros_tutorial_srv’, after the rosservice call
command as shown in the command below. This is followed by the required parameters for the
service request.
In the previous example, we set the ‘int64’ type variables ‘a’ and ‘b’ as the request, as shown
in the service file below, so we entered ‘10’ and ‘2’ as parameters. The ‘int64’ value ‘12’ is
returned as the ‘result’ of the service response.
int64 a
int64 b
---
int64 result
$ rqt
Next, select [Plugins] → [Services] → [Service Caller] from the menu of the ‘rqt’ program and
the below screen will appear.
The rosservice call described above has the advantage of running directly in the terminal,
but for those who are unfamiliar with Linux or ROS commands, we recommend using rqt’s
‘Service Caller’.
In this section, we have created the service server and the service client, and executed them
to learn how to communicate between nodes with service. Source codes for the example can be
found in the following GitHub address:
■■ https://github.com/ROBOTIS-GIT/ros_tutorials/tree/master/ros_tutorials_service
If you want to run the example right away, you can clone the source code with the following
command in the ‘~/catkin_ws/src’ folder and build it. Then run the ‘service_server’ and ‘service_
client’ nodes.
$ cd ~/catkin_ws/src
$ git clone https://github.com/ROBOTIS-GIT/ros_tutorials.git
$ cd ~/catkin_ws
$ catkin_make
7.4. Writing and Running the Action Server and Client Node
In this section, we will create and run action server and action client nodes, and we will look at
the action9, the third message communication method we discussed in Section 4.2. Unlike
topics and services, actions are very useful for asynchronous, bidirectional, and more complex
programming where an extended response time is expected after processing a request and
intermediate feedback is needed. Here we will use the ‘actionlib’ example10 introduced in the
ROS Wiki.
9 http://wiki.ros.org/actionlib
10 http://wiki.ros.org/actionlib_tutorials/Tutorials
$ cd ~/catkin_ws/src
$ catkin_create_pkg ros_tutorials_action message_generation std_msgs actionlib_msgs actionlib roscpp
$ roscd ros_tutorials_action
$ gedit package.xml
ros_tutorials_action/package.xml
<?xml version="1.0"?>
<package>
<name>ros_tutorials_action</name>
<version>0.1.0</version>
<description>ROS tutorial package to learn the action</description>
<license>BSD</license>
<author>Melonee Wise</author>
<maintainer email="[email protected]">pyo</maintainer>
<buildtool_depend>catkin</buildtool_depend>
<build_depend>roscpp</build_depend>
<build_depend>actionlib</build_depend>
<build_depend>message_generation</build_depend>
<build_depend>std_msgs</build_depend>
<build_depend>actionlib_msgs</build_depend>
<run_depend>roscpp</run_depend>
<run_depend>actionlib</run_depend>
<run_depend>std_msgs</run_depend>
<run_depend>actionlib_msgs</run_depend>
<run_depend>message_runtime</run_depend>
$ gedit CMakeLists.txt
ros_tutorials_action/CMakeLists.txt
cmake_minimum_required(VERSION 2.8.3)
project(ros_tutorials_action)
add_action_files(FILES Fibonacci.action)
generate_messages(DEPENDENCIES actionlib_msgs std_msgs)
catkin_package(
LIBRARIES ros_tutorials_action
CATKIN_DEPENDS std_msgs actionlib_msgs actionlib roscpp
DEPENDS Boost
)
include_directories(${catkin_INCLUDE_DIRS} ${Boost_INCLUDE_DIRS})
add_executable(action_client src/action_client.cpp)
add_dependencies(action_client ${${PROJECT_NAME}_EXPORTED_TARGETS} ${catkin_EXPORTED_TARGETS})
target_link_libraries(action_client ${catkin_LIBRARIES})
add_action_files(FILES Fibonacci.action)
The above option indicates that the action file ‘Fibonacci.action’, which will be used in this
node, should be included when building the package. As the ‘Fibonacci.action’ file has not been
created yet, let’s create it in the following order:
In the action file, three consecutive hyphens (---) are used as delimiters in two places. The
first section is the ‘goal’ message, the second is the ‘result’ message, and the third is the ‘feedback’
message. The relationship between the ‘goal’ and ‘result’ messages is the same as between the
request and response in the ‘srv’ file described above. The difference is the ‘feedback’ message,
which is used to transmit intermediate feedback while the specified process is being performed.
Fibonacci.action
# goal definition
int32 order
---
# result definition
int32[] sequence
---
# feedback definition
int32[] sequence
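The computation behind this action is a plain Fibonacci sequence: starting from the seeds 0 and 1 stored in the feedback message, each loop iteration appends the sum of the last two elements, the intermediate state is published as feedback, and the final state becomes the result. A ROS-free sketch of that loop (the function name and signature are our own illustration):

```cpp
#include <cstdint>
#include <vector>

// Build the Fibonacci sequence the action server computes. In the real
// server, each intermediate state of 'sequence' is published as the
// 'feedback' message, and the final vector is sent as the 'result'.
std::vector<int32_t> fibonacci_sequence(int32_t order)
{
  std::vector<int32_t> sequence{0, 1};  // seeds, as stored in the feedback message
  for (int32_t i = 1; i <= order; i++)
  {
    // In the action server this push is followed by publishFeedback().
    sequence.push_back(sequence[i] + sequence[i - 1]);
  }
  return sequence;  // final state becomes the 'result' message
}
```
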
In addition to the Goal, Result, and Feedback messages defined in an action file, the action uses
two additional messages: Cancel and Status. The Cancel message uses ‘actionlib_msgs/GoalID’ to
cancel the action execution from the action client or from a separate node while the action is
being processed. The Status message reports the status of the current action according to state
transitions11 such as PENDING, ACTIVE, PREEMPTED, and SUCCEEDED12.
add_executable(action_server src/action_server.cpp)
That is, the ‘action_server.cpp’ file is built to create the ‘action_server’ executable file. Let’s
write the code that performs as the action server node in the following order:
$ roscd ros_tutorials_action/src → Move to the 'src' folder, which is the source folder of
the package
$ gedit action_server.cpp → Create or modify new source file
ros_tutorials_action/src/action_server.cpp
#include <ros/ros.h> // ROS Default Header File
#include <actionlib/server/simple_action_server.h> // action Library Header File
#include <ros_tutorials_action/FibonacciAction.h> // FibonacciAction Action File Header
class FibonacciAction
{
protected:
11 http://wiki.ros.org/actionlib/DetailedDescription
12 http://docs.ros.org/kinetic/api/actionlib_msgs/html/msg/GoalStatus.html
public:
// Initialize action server (Node handle, action name, action callback function)
FibonacciAction(std::string name) : as_(nh_, name, boost::bind(&FibonacciAction::executeCB,
this, _1), false), action_name_(name)
{
as_.start();
}
~FibonacciAction(void)
{
}
// Notify the user of action name, goal, initial two values of Fibonacci sequence
ROS_INFO("%s: Executing, creating fibonacci sequence of order %i with seeds %i, %i",
action_name_.c_str(), goal->order, feedback_.sequence[0], feedback_.sequence[1]);
// Action content
// Store the sum of current Fibonacci number and the previous number in the feedback
// while there is no action cancellation or the action target value is reached.
feedback_.sequence.push_back(feedback_.sequence[i] + feedback_.sequence[i-1]);
as_.publishFeedback(feedback_); // Publish feedback
r.sleep(); // sleep according to the defined loop rate.
}
add_executable(action_client src/action_client.cpp)
This means that the ‘action_client.cpp’ file is built to generate the ‘action_client’ executable
file. Let’s write the code that performs the action client node function in the following order:
ros_tutorials_action/src/action_client.cpp
#include <ros/ros.h> // ROS Default Header File
#include <actionlib/client/simple_action_client.h> // action Library Header File
#include <actionlib/client/terminal_state.h> // Action Goal Status Header File
#include <ros_tutorials_action/FibonacciAction.h> // FibonacciAction Action File Header
//exit
return 0;
}
$ cd ~/catkin_ws && catkin_make → Go to the catkin folder and run the catkin build
$ roscore
$ rosrun ros_tutorials_action action_server
Action is similar to Service, as described in Section 4.3, in that there are an action ‘goal’ and a
‘result’ corresponding to the ‘request’ and ‘response’. However, actions additionally have
‘feedback’ messages for intermediate feedback during the process. Although this might make
actions look like services, their actual message communication method is much closer to that
of topics. The use of the current action messages can be verified through the ‘rqt_graph’ and
‘rostopic list’ commands as shown below.
For more information on each message, append the ‘-v’ option to rostopic list. This will list
the published and subscribed topics separately, as follows:
$ rostopic list -v
Published topics:
* /ros_tutorial_action/feedback [ros_tutorials_action/FibonacciActionFeedback] 1 publisher
* /ros_tutorial_action/status [actionlib_msgs/GoalStatusArray] 1 publisher
* /rosout [rosgraph_msgs/Log] 1 publisher
* /ros_tutorial_action/result [ros_tutorials_action/FibonacciActionResult] 1 publisher
* /rosout_agg [rosgraph_msgs/Log] 1 publisher
Subscribed topics:
* /ros_tutorial_action/goal [ros_tutorials_action/FibonacciActionGoal] 1 subscriber
* /rosout [rosgraph_msgs/Log] 1 subscriber
* /ros_tutorial_action/cancel [actionlib_msgs/GoalID] 1 subscriber
To visually verify the information, use the ‘rqt_graph’ command shown below. Figure 7-8
shows the relationship between the action server and client as well as the action messages,
which are transmitted and received bidirectionally. Here, the action messages are represented
by the name ‘ros_tutorial_action/action_topics’. When ‘Actions’ is deselected in the group menu,
all five messages used in the action can be seen, as shown in Figure 7-9. Here we can see that
the action basically consists of five topics and the nodes that publish and subscribe to them.
$ rqt_graph
In this section, we have created action server and action client nodes, and executed them to
learn how to communicate between nodes. Related sources can be found in the following
GitHub address:
■■ https://github.com/ROBOTIS-GIT/ros_tutorials/tree/master/ros_tutorials_action
$ cd ~/catkin_ws/src
$ git clone https://github.com/ROBOTIS-GIT/ros_tutorials.git
$ cd ~/catkin_ws
$ catkin_make
ros_tutorials_service/src/service_server.cpp
#include "ros/ros.h" // ROS Default Header File
#include "ros_tutorials_service/SrvTutorial.h" // SrvTutorial Service File Header
// Displays the values of 'a' and 'b' used in the service request, and the 'result' value
// corresponding to the service response.
ROS_INFO("request: x=%ld, y=%ld", (long int)req.a, (long int)req.b);
ROS_INFO("sending back response: [%ld]", (long int)res.result);
return true;
}
As most of the contents are similar to the previous examples, let’s just take a look at the
additional parts needed to use parameters. In particular, the ‘setParam’ and ‘getParam’ methods
in bold font are the most important parts when using parameters. As they are very simple
methods, they can be easily understood just by reading their usage.
nh.setParam("calculation_method", PLUS);
Note that parameters can be set to integers, floats, booleans, strings, dictionaries, lists, and so
on. For example, ‘1’ is an integer, ‘1.0’ is a float, ‘internetofthings’ is a string, ‘true’ is a boolean,
‘[1,2,3]’ is a list of integers, and ‘a: b, c: d’ is a dictionary.
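These are the same types used in the YAML syntax that ‘rosparam load’ and ‘rosparam dump’ work with. A hypothetical parameter file illustrating each type (the parameter names here are made up for this example) might look like:

```yaml
calculation_method: 1            # integer
update_period: 1.0               # float
robot_name: internetofthings     # string
use_feedback: true               # boolean
joint_ids: [1, 2, 3]             # list of integers
pairs: {a: b, c: d}              # dictionary
```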
nh.getParam("calculation_method", g_operator);
When the build is done, run the ‘service_server’ node of the ‘ros_tutorials_service’ package
with the following command.
$ roscore
$ rosrun ros_tutorials_service service_server
[INFO] [1495767130.149512649]: ready srv server!
$ rosparam list
/calculation_method
/rosdistro
/rosversion
/run_id
The ‘calculation_method’ parameter can be changed with the ‘rosparam set’ command. With
the changed parameter, you can see different result values for the same input of ‘rosservice
call /ros_tutorial_srv 10 5’. As shown in the example above, parameters in ROS can control the
flow, settings, and processing of a node from outside the node. This is a very useful feature, so
familiarize yourself with it even if you do not need it right away.
In this section, we have modified a service server and learned how to use parameters. The
corresponding source code has been renamed as ‘ros_tutorials_parameter’ package to distinguish
it from the service source code that was previously created, and can be found in the following
GitHub address.
■■ https://github.com/ROBOTIS-GIT/ros_tutorials/tree/master/ros_tutorials_parameter
If you want to run the example right away, you can clone the source code with the following
command in the ‘catkin_ws/src’ folder and build the package. Then run the ‘service_server’ and
‘service_client’ nodes.
$ cd ~/catkin_ws/src
$ git clone https://github.com/ROBOTIS-GIT/ros_tutorials.git
$ cd ~/catkin_ws
$ catkin_make
The ‘roslaunch’ uses the ‘*.launch’ file to select executable nodes, which is XML-based and
provides tag-specific options. The execution command is ‘roslaunch [package name] [roslaunch
file]’.
First, write a ‘*.launch’ file. The file used for roslaunch has a ‘*.launch’ extension file name,
and you have to create a ‘launch’ folder in the package folder and place the launch file in that
folder. Create a folder with the following command and create a new file called ‘union.launch’.
$ roscd ros_tutorials_topic
$ mkdir launch
$ cd launch
$ gedit union.launch
union.launch
<launch>
<node pkg="ros_tutorials_topic" type="topic_publisher" name="topic_publisher1"/>
<node pkg="ros_tutorials_topic" type="topic_subscriber" name="topic_subscriber1"/>
<node pkg="ros_tutorials_topic" type="topic_publisher" name="topic_publisher2"/>
<node pkg="ros_tutorials_topic" type="topic_subscriber" name="topic_subscriber2"/>
</launch>
The tags required to run the node with the ‘roslaunch’ command are described within the
<launch> tag. The <node> tag describes the node to be executed by ‘roslaunch’. Options include
‘pkg’, ‘type’, and ‘name’.
■■ name The name (executable name) to be used when the node corresponding to the
‘type’ above is executed. The name is generally set to be the same as the type,
but it can be set to a different name when executed.
Once the ‘roslaunch’ file is created, run ‘union.launch’ as follows. Note that when the
‘roslaunch’ command runs several nodes, the output (info, error, etc.) of the executed nodes is
not displayed on the terminal screen, making it difficult to debug. If you add the ‘--screen’ option,
the output of all nodes running on that terminal will be displayed on the terminal screen.
What would the screen look like if we run it? First, let’s take a look at the nodes currently
running with the following command.
$ rosnode list
/rosout
/topic_publisher1
/topic_publisher2
/topic_subscriber1
/topic_subscriber2
The problem is that, unlike the initial intention to “run two publisher nodes and two
subscriber nodes and make them communicate with their corresponding pairs”, we can see
through ‘rqt_graph’ (Figure 7-10) that each subscriber is receiving the topic from both publishers.
This is because we simply changed the names of the nodes to be executed without changing the
name of the message to be used. Let’s fix this problem with the namespace tag in ‘roslaunch’.
$ roscd ros_tutorials_topic/launch
$ gedit union.launch
ros_tutorials_topic/launch/union.launch
<launch>
<group ns="ns1">
<node pkg="ros_tutorials_topic" type="topic_publisher" name="topic_publisher"/>
<node pkg="ros_tutorials_topic" type="topic_subscriber" name="topic_subscriber"/>
</group>
<group ns="ns2">
<node pkg="ros_tutorials_topic" type="topic_publisher" name="topic_publisher"/>
<node pkg="ros_tutorials_topic" type="topic_subscriber" name="topic_subscriber"/>
</group>
</launch>
The <group> tag binds specific nodes together. The ‘ns’ option specifies the name of the group
as a namespace, and the names and messages of the nodes belonging to the group are all placed
under the name specified by ‘ns’.
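The reason this works is ROS name resolution: a relative name (one that does not begin with ‘/’) is prefixed by the namespace of its group, so the ‘ros_tutorial_msg’ topic becomes ‘/ns1/ros_tutorial_msg’ in one group and ‘/ns2/ros_tutorial_msg’ in the other, and the publisher–subscriber pairs no longer collide. A minimal sketch of that prefixing rule (this helper is our own illustration, not a roscpp API):

```cpp
#include <string>

// Sketch of ROS name resolution inside a <group ns="..."> block: a relative
// name is prefixed by the group's namespace, while a global name (leading '/')
// is left untouched.
std::string resolve_name(const std::string& ns, const std::string& name)
{
  if (!name.empty() && name[0] == '/')
    return name;                  // global names ignore the namespace
  return "/" + ns + "/" + name;  // relative names are prefixed by the namespace
}
```
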
13 http://wiki.ros.org/roslaunch/XML
■■ <rosparam> Check and modify parameter information such as load, dump, and delete like
the ‘rosparam’ command.
■■ <group> Group executable nodes.
■■ <test> Used to test nodes. Similar to <node>, but with options available for testing
purposes.
■■ <arg> Define a variable in the launch file so that the parameter is changed when
executed as shown below.
Internal parameters can be changed from the outside when executing a launch file by
combining the <param> tag with the <arg> tag, which defines a variable in the launch file.
Familiarize yourself with this method as it is very useful and widely used.
<launch>
<arg name="update_period" default="10" />
<param name="timing" value="$(arg update_period)"/>
</launch>
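Assuming the snippet above is saved as a launch file in some package, the ‘update_period’ default can then be overridden from the command line with roslaunch’s ‘name:=value’ syntax, for example:

```
$ roslaunch [package name] [launch file] update_period:=30
```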
Robot. Sensor. Motor.
ROS can be classified as application software, and depending on the specialized application,
its packages are classified into robot packages1, sensor packages2, and motor packages3. These
packages are provided by robot companies such as Willow Garage, ROBOTIS, Yujin Robot, and
Fetch Robotics, as well as by Open Robotics (formerly the Open Source Robotics Foundation,
OSRF) and university robotics labs. Individual developers also develop and distribute packages
related to robots, sensors, and motors for ROS.
The masterpieces among robot packages are without a doubt PR2 and TurtleBot, shown in
Figure 8-1. PR2 is a mobile-base humanoid robot developed for research by Willow Garage, the
company responsible for ROS development. To this day, the core packages of many other robots
are derived from PR2, the representative robot package of ROS.
Although PR2 is general purpose and its performance is superior, its price was not
competitive enough to vitalize ROS in the market, so TurtleBot was developed to widen the
reach of ROS. The first TurtleBot was based on the Create, iRobot’s cleaning robot platform. For
TurtleBot2, KOBUKI, an improved version of Yujin Robot’s iCLEBO from Korea, was adopted as
the mobile platform. Now there is TurtleBot3, a Dynamixel-based mobile robot developed in
collaboration with ROBOTIS and Open Robotics, which will be covered extensively in this book.
Details on TurtleBot and the usage of the robot packages are covered in Chapter 10.
Figure 8-1 PR2 (Left), TurtleBot2 (2nd from the left), TurtleBot3 (3 models on the right)
1 http://robots.ros.org/, http://wiki.ros.org/Robots
2 http://wiki.ros.org/Sensors
3 http://wiki.ros.org/Motor%20Controller%20Drivers
The following are different types of robots used in almost every field; the publicly released
robot packages can be found at http://robots.ros.org/.
■■ Manipulator
■■ Mobile robot
■■ Autonomous car
■■ Humanoid
The installation procedure for a robot package should be very simple if it is an official ROS
package. First, check if the robot package that you are about to use is listed at
http://robots.ros.org/, or use the command below to search the entire ROS package list.
It is recommended to use the latest source code instead of the binary installation, since
TurtleBot3 is constantly being updated. Details about this method are covered in Chapter 10,
‘Mobile Robot’.
Even if the robot package you are about to use is not an official package, installation
information can be found in the Wiki of the robot package. For example, the package for the
Pioneer, a robot widely known as a mobile platform, can be downloaded from its repository to
the source folder of the catkin workspace as shown below.
As such, a robot package can be obtained either from an open source repository according to
the installation method shown in the Wiki, or as a released official ROS package.
Please follow the description of the corresponding robot package for the usage of each node
included in the package. A robot package basically includes a node that drives the robot, a
node for acquiring and utilizing the mounted sensor data, and a remote control node. If the
robot is an articulated robot, it includes an inverse kinematics node, whereas a navigation node
is included for a mobile robot.
The problem is that there are a lot of sensors to use, as shown in Figure 8-3, and there is a
limit to how much sensor data a microprocessor can receive. For example, an LDS, a 3D sensor,
or a camera transmits a large amount of data and requires high processing power, which is too
much for a microprocessor. In order to use a PC for high-performance data processing, drivers
for the devices are necessary, as well as libraries for processes such as point cloud processing
with OpenNI and image processing with OpenCV.
ROS provides a development environment in which the drivers and libraries of the
aforementioned sensors can be used. Not all sensors are supported by a ROS package, but the
number of sensor-related packages is steadily increasing. Sensors using the same communication
protocol, such as I2C and UART, are adopting a unified communication method. Sensor makers
are actively supporting ROS sensor packages, which will accelerate ROS support for future
sensor products.
For more information on sensor packages, see the ROS sensor Wiki page mentioned above.
In particular, the following packages are considered important packages in this book.
■■ 1D Range Finders: Infrared distance sensors that can be used to make low-cost robots.
■■ 3D Sensors: Sensors such as Intel’s RealSense, Microsoft’s Kinect, and ASUS’s Xtion that are
necessary for 3D measurements.
■■ Audio/Speech Recognition: Currently, there are very few packages related to speech
recognition, but they seem to be added continuously.
■■ Cameras: Camera drivers and various application packages that are widely used for object
recognition, face recognition, and character recognition are listed.
■■ Sensor Interfaces: Very few sensors support USB or Web protocols; still, much sensor data
can be easily obtained through a microprocessor. Such sensors can be connected to ROS via
the UART of a microprocessor or a mini PC. These sensor interfaces are introduced.
There are various sensor packages available, so find the sensors best suitable for your project
and apply them. The most commonly used cameras, depth cameras and laser distance sensors
(LDS) will be discussed in detail in the following sections.
8.3. Camera
The camera corresponds to the eye of the robot. The images obtained from the camera are very
useful for recognizing the environment around the robot. For example: object recognition from
a camera image, facial recognition, distance values obtained from the difference between two
images of a stereo camera, visual SLAM with a mono camera, and color recognition and object
tracking using information obtained from an image.
There are many kinds of cameras that can be used for image processing, but here we will
cover the USB camera, i.e., a camera that supports a USB connection. Another name for it is
UVC (USB Video Class)4. As of July 2017, the latest version of UVC is 1.5. UVC 1.5 supports the
latest USB 3.0 and is available on almost all operating systems including Linux, Windows, and
OS X. It is easy to use, widespread, and cheaper than other cameras. In this section, we will
practice how to run a USB camera and check its data.
Camera Interface
USB is not the only available interface for a camera. Some cameras have network capabilities that
can be connected via LAN or WiFi to stream videos to the web. Such cameras are called webcams.
Also, some cameras use FireWire (the IEEE 1394 protocol) for high-speed transmission; they are
mainly used for research purposes where high-speed image transmission is required. The FireWire
standard is not readily available on common boards; it was developed by Apple and is mostly used
in Apple products.
■■ libuvc_camera: This is an interface package for operating cameras with the UVC standard.
(Developer: Ken Tossell)
■■ uvc_camera: This is a very convenient package with relatively detailed camera settings.
Moreover, if you are considering a stereo camera configuration, this package would be
ideal for the purpose.
■■ usb_cam: This is a very simple camera driver from Bosch. (Developer: Benjamin Pitzer)
■■ prosilica_camera: This is used for AVT’s Prosilica camera, which is widely used for research
purposes.
4 https://en.wikipedia.org/wiki/USB_video_device_class
5 http://www.usb.org/developers/docs/devclass_docs/
■■ USB Camera: Connect the USB camera to the USB port of your computer.
■■ Camera Connection Information: Open a new terminal window and check that the
connection is correctly made with the ‘lsusb’ command as shown below. If you have a generic
UVC camera, you can check that the camera is connected like the underlined message below.
$ lsusb
Bus 004 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
Bus 003 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 002 Device 002: ID 2109:0812 VIA Labs, Inc. VL812 Hub
Bus 002 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
Bus 001 Device 005: ID 046d:c52b Logitech, Inc. Unifying Receiver
Bus 001 Device 006: ID 05e3:0608 Genesys Logic, Inc. Hub
Bus 001 Device 013: ID 046d:08ce Logitech, Inc. QuickCam Pro 5000
Bus 001 Device 012: ID 0c45:7603 Microdia
Bus 001 Device 002: ID 2109:2812 VIA Labs, Inc. VL812 Hub
Bus 001 Device 007: ID 8087:0a2a Intel Corp.
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
6 http://wiki.ros.org/uvc_camera
$ roscore
$ rosrun uvc_camera uvc_camera_node
$ rostopic list
/camera_info
/image_raw
/image_raw/compressed
/image_raw/compressed/parameter_descriptions
/image_raw/compressed/parameter_updates
/image_raw/compressedDepth
/image_raw/compressedDepth/parameter_descriptions
/image_raw/compressedDepth/parameter_updates
/image_raw/theora
/image_raw/theora/parameter_descriptions
/image_raw/theora/parameter_updates
/rosout
/rosout_agg
$ rqt_image_view image:=/image_raw
$ rviz
Change the Displays option when RViz is executed. Click [Add] at the bottom left of RViz and
select [Image] in the [By display type] tab as shown in Figure 8-6 to bring up the image display.
However, since the robot is moving, the operator cannot follow it around to watch the image
from the camera attached to the robot. In this section, I will explain how to view the image from
the camera mounted on the robot on another computer in a remote location.
Be sure to read and follow the instructions carefully.
There will be many settings when you open the bashrc file. Leave the previous settings as
they are, and go down to the bottom of the bashrc file and modify the ROS_MASTER_URI and
ROS_HOSTNAME variables as below. Note that the IP address (192.168.1.100) in the following
example must be the IP address of the computer to which the camera is connected. If necessary,
modify the IP address. To check your IP address, you can use the ‘ifconfig’ command, which is
described in Section 3.2.
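For example, the lines added at the bottom of the bashrc file on the camera-side (master) computer would look like the following sketch, which assumes the default ROS master port 11311:

```shell
export ROS_MASTER_URI=http://192.168.1.100:11311
export ROS_HOSTNAME=192.168.1.100
```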
Then run roscore and run ‘uvc_camera_node’ node in another terminal window.
$ roscore
Remote Computer
Like the master PC, the remote computer needs to be configured. Open the bashrc file and
modify the ROS_MASTER_URI and ROS_HOSTNAME variables. Set ROS_MASTER_URI to the IP
address of the computer to which the camera is connected (the master PC), and set
ROS_HOSTNAME to the IP address of the remote computer (192.168.1.120 in this example).
Check the IP address of the remote PC with ifconfig and configure the settings accordingly.
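Concretely, the bashrc of the remote computer would contain lines like these (a sketch using the example addresses above and the default master port):

```shell
export ROS_MASTER_URI=http://192.168.1.100:11311
export ROS_HOSTNAME=192.168.1.120
```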
In this section, we covered how to view the image acquired from the camera mounted on the
robot from a remote computer. Since the robot can be controlled remotely while recognizing its
environment, this setup can serve as a remote sensing robot, a video conferencing robot, or a
webcam-based surveillance system: a digital camera capable of transmitting images over the
network in real time.
However, camera calibration is necessary if you are measuring distance from images
acquired with a stereo camera or processing images for object recognition.
In order to obtain accurate distance information from the camera image, information such as
the lens characteristics, the gap between the lens and the image sensor, and the tilt angle of the
image sensor is required for each camera. This is because the camera image is a projection of
three-dimensional space onto a two-dimensional plane, and this projection process is affected
by the characteristics of each camera.
For example, each lens and image sensor differ from one another; the gap between the lens and
the image sensor varies with the hardware structure of the camera; the lens and the image
sensor may not be perfectly aligned during the production process, causing the image center
and the principal point to be misaligned; and the image sensor itself may be slightly twisted or
tilted.
ROS offers a calibration package using OpenCV’s camera calibration. Calibrate your camera
as described in the following sections.
Camera Calibration
Install the camera calibration package and run the ‘uvc_camera_node’ node as follows:
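The commands might look like the following sketch. The apt package name and the calibrator options are assumptions (Kinetic-style naming; for --square, measure the side length of one printed square in meters); run each command in its own terminal:

```shell
sudo apt-get install ros-kinetic-camera-calibration
rosrun uvc_camera uvc_camera_node
rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.024 image:=/image_raw
```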
Since there is no information about camera calibration yet, default values will be displayed.
Prepare a Chessboard
Calibration is based on a chessboard consisting of black and white squares, as shown in Figure
8-8. You can download the 8x6 chessboard from the following address. Print the chessboard and
fix it on a flat surface; A4 or letter size paper should be fine. For reference, the board has 9
squares horizontally, forming 8 inner corners, and 7 squares vertically, forming 6 inner corners,
which is why it is called an 8x6 chessboard.
■■ http://wiki.ros.org/camera_calibration/Tutorials/MonocularCalibration?action=AttachFile&do=view&target=check-108.pdf
Once the calibration node is running, the GUI will appear as shown in Figure 8-9. If you
point the camera at the chessboard, the calibration will start immediately. On the right side of
the GUI you will see colored horizontal bars labeled X, Y, Size and Skew. These indicate the
conditions required for a correct calibration, and you have to move the chessboard in various
directions with respect to the camera. As the conditions improve, the X, Y, Size and Skew bars
will get longer and turn green.
The ‘CALIBRATE’ button is activated, as shown in Figure 8-10, when the images necessary for
calibration have been collected. The calibration calculation takes approximately 1 to 5 minutes.
When the calculation is completed, click the SAVE button to save the calibration information of
the camera. The save location is displayed in the terminal window where the calibration was
performed; the data is stored in the /tmp folder as ‘/tmp/calibrationdata.tar.gz’.
$ cd /tmp
$ tar -xvzf calibrationdata.tar.gz
Rename the ‘ost.txt’ to ‘ost.ini’, and create a camera parameter file (camera.yaml) using the
convert node of the ‘camera_calibration_parsers’ package. After creating the parameter file,
save it in ‘~/.ros/camera_info/’ folder as the following example, and camera related packages
used in ROS will refer to this information.
$ mv ost.txt ost.ini
$ rosrun camera_calibration_parsers convert ost.ini camera.yaml
$ mkdir ~/.ros/camera_info
$ mv camera.yaml ~/.ros/camera_info/
The ‘camera.yaml’ file contains parameters as shown in the following example, and the user can
customize the ‘camera_name’ value. Camera-related packages generally use ‘camera’ as a
default value, so I changed the ‘camera_name’ value from ‘narrow_stereo’ to ‘camera’.
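The overall layout of the file is sketched below; the numeric data is produced by the calibration and is omitted here (shown as empty lists), so treat this only as a structural illustration:

```yaml
image_width: 640
image_height: 480
camera_name: camera
camera_matrix:
  rows: 3
  cols: 3
  data: []   # 9 values from calibration
distortion_model: plumb_bob
distortion_coefficients:
  rows: 1
  cols: 5
  data: []   # 5 values from calibration
rectification_matrix:
  rows: 3
  cols: 3
  data: []   # 9 values from calibration
projection_matrix:
  rows: 3
  cols: 4
  data: []   # 12 values from calibration
```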
In addition, if you look at the ‘/camera_info’ topic, you can see that the D, K, R, and P
parameters are populated, as shown in the following example.
In this section we will call it Depth Camera so that the terms are not confused.
7 https://en.wikipedia.org/wiki/Time-of-flight_camera
8 https://en.wikipedia.org/wiki/Structured-light_3D_scanner
9 https://en.wikipedia.org/wiki/Range_imaging
Structured Light
Representative structured-light based products include Microsoft’s Kinect and ASUS’s Xtion,
which use a coherent radiation pattern (described in patent US20100225746). These sensors use
PrimeSense’s PrimeSense System on a Chip (SoC).
The Depth Camera using PrimeSense’s SoC is a sensor consisting of one infrared projector and
one infrared camera, which uses a coherent radiation pattern not present in the existing ToF
method. This technology started getting attention after solving its high cost and external
interference issues, and then Carmine and Capri, equipped with the PrimeSense SoC, were
released. In addition, Microsoft’s Kinect with the same SoC became popular as a controller for
the Xbox. After that, ASUS’s Xtion, designed with general computer use in mind, was released.
These are all sensors equipped with the PrimeSense SoC.
PrimeSense’s Carmine and Capri are no longer available for purchase, Microsoft’s Kinect was
discontinued, and ASUS’s Xtion was later discontinued as well. Occipital’s Structure Sensor,
the last product with the PrimeSense SoC, is still sold today as an accessory for Apple devices,
but its future is uncertain. These popular low-cost products have now become history.
Stereo
A stereo camera (see Figure 8-13), which is one of the Depth Camera types, has been researched
for much longer than the previous two types, and it calculates distance using binocular parallax,
like the left and right eyes of a person. As the name suggests, the stereo camera is equipped with
two image sensors at a specific distance apart and calculates depth using the difference between
the two images captured by these sensors.
Representative products include Point Gray’s Bumblebee camera and WithRobot’s OjOcamStereo.
There are various types of stereo cameras. One distinctive configuration adds an infrared
projector that emits infrared rays in a coherent pattern and two infrared image sensors that
receive the reflected rays, obtaining distance by triangulation. The former, with only the dual
image sensors, is called a passive stereo camera, and the latter, with the infrared projector, is
called an active stereo camera. One representative active stereo camera is Intel’s RealSense: the
R200 model costs about $100, making it the cheapest among the Depth cameras so far, while
being small and similar in performance to the Xtion described above. The D400 series is a new
generation of RealSense, which has been widely used in the robotics field because of its small
size, wide viewing angle, outdoor usability, and improved sensing distance.
$ roscore
If you followed the above instructions but run into package installation issues or operation
issues, you may need to use a specific configuration for different Linux kernels. Refer to the
following Wiki address for more information.
■■ http://wiki.ros.org/librealsense
➋ Click the [Add] button at the bottom left of RViz, then select [PointCloud2] to add it.
Set topic to ‘camera/depth/points’ and select the size and color.
➌ Once all the settings are completed, you can see the PCD data as shown in Figure 8-14.
Since the color reference is set to the X axis, the farther a point is located along the X axis,
the closer its color gets to purple.
If you are using other Depth cameras, check the following Wiki address to learn how to
operate the camera and how to use the package.
■■ http://wiki.ros.org/Sensors#A3D_Sensors_.28range_finders_.26_RGB-D_cameras.29
10 http://pointclouds.org/
Similar libraries include Microsoft’s Kinect for Windows SDK and Libfreenect, which became
known as the first to open up the Kinect by hacking the device. In addition to the basic
driver for managing Point Cloud Data, OpenNI also includes middleware such as NITE, which
handles the human body skeleton. After Apple took over PrimeSense, OpenNI was put on the
brink of disposal, but Occipital is now offering OpenNI12 in its GitHub repository13.
Typical products include Hokuyo’s URG series, widely used indoors as shown in Figure 8-15,
SICK’s sensors, widely used outdoors, and Velodyne’s HDL series, which is equipped with
several laser sensors. The biggest disadvantage of these sensors is price. Prices vary from
product to product, but they usually cost thousands of dollars, and Velodyne’s HDL series costs
much more. Chinese products (such as RPLIDAR), which compensate for this shortcoming, hit
the market at a low price of about $400 USD. Recently, the LDS (HLS-LFCD2)14 product is
expected to hit the market for as low as $170 USD.
11 http://en.wikipedia.org/wiki/OpenNI
12 https://structure.io/openni
13 https://github.com/occipital/openni2
14 http://wiki.ros.org/hls_lfcd_lds_driver
The left image of Figure 8-16 shows the LDS with a laser inside and a mirror tilted at an angle.
The motor rotates the mirror, and the sensor measures the return time of the laser (calculated
from the difference in the wave). In this way, the sensor scans objects in a horizontal plane
around the LDS, as shown in the center image. However, the accuracy drops as the distance
increases, as shown in the right image.
First, since the laser is used as a light source, a strong laser beam can damage the eye.
Products are classified based on the laser source, so it is important to note this when purchasing
a product. In general, the laser is classified from Class 1 to Class 4, and the higher the number,
the more dangerous it is. Class 1 is a safe product with no problem with direct eye contact. Class
2 increases the risk of prolonged exposure. The LDS described above correspond to class 1.
Secondly, because it measures the returning laser light, the sensor is useless if nothing is
reflected back. In other words, transparent glass, plastic bottles, and glass cups tend to refract
or scatter the laser in many directions, and mirrors reflect the light away, resulting in
inaccurate measurements.
Lastly, because a horizontal plane is scanned, only objects on that plane are detected by the
sensor. In other words, keep in mind that the output is 2D data (in some cases, the LDS is
rotated to measure 3D space by collecting multiple 2D scans).
15 http://wiki.ros.org/hls_lfcd_lds_driver
In the previous command, read and write permissions for ttyUSB0 are not granted. Let’s set
and check the permission using the commands below.
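For example, assuming the sensor enumerates as /dev/ttyUSB0:

```shell
sudo chmod a+rw /dev/ttyUSB0
ls -l /dev/ttyUSB0
```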
You can see from the result that the permission has been granted with the chmod command.
$ rviz
➊ Set the ‘Type’ in Views on the top right of the RViz to ‘TopDownOrtho’ so that the display
is changed to top view that draws distance information on XY plane.
➋ From the left column, go to [Global Options] and set the [Fixed Frame] to ‘laser’.
➌ Click the [Add] button in the bottom left corner of RViz, then select [Axes] in the display.
Change the detail settings for Length and Radius as shown in Figure 8-17.
➍ Click the [Add] button on the bottom left of RViz and select [LaserScan] from the display.
Change the detail settings for Topic, Color Transformer and Color as shown in Figure
8-17.
The RViz configuration can be saved as a file. Let’s check the Laser sensor data in RViz with
the following command.
As another example of using LDS, the robot is able to detect various objects in the
surroundings and react based on the current environment as shown in Fig. 8-19. Practical
applications of LDS will be covered in more detail in Chapters 10 and 11 by using the LDS on the
robot.
16 https://en.wikipedia.org/wiki/Simultaneous_localization_and_mapping
8.6.1. Dynamixel
The Dynamixel is an integrated module composed of a reduction gear, a controller, a motor,
and a communication circuit. The Dynamixel series provides feedback on position, speed,
temperature, load, voltage, and current, and uses a daisy-chain connection that enables simple
wiring between devices. In addition to basic position control, speed control and torque control
(for specific models) are also available, which are commonly used in robotics.
17 http://wiki.ros.org/Motor%20Controller%20Drivers
18 http://wiki.ros.org/dynamixel_sdk
19 http://wiki.ros.org/dynamixel_workbench
■■ http://www.ros.org/browse/list.php
These are the packages released for the ROS Kinetic version; there are about 1,600 of them.
Indigo, the previous LTS version, had more than 2,900 released packages. Most packages are
continuously supported over multiple ROS versions, but some are not. Even if a specific package
is not supported in your ROS version, ROS maintains some compatibility between versions, so a
few modifications will usually allow you to use the package. The next section explains how to
use packages.
The relevant package to the search term is displayed, as shown in Figure 8-22. There are a
number of related packages, but here we will use the ‘find_object_2d’ package from the second
‘find_object_2d - ROS Wiki’ above. Clicking ‘find_object_2d - ROS Wiki’ will open the Wiki page
of the ‘find_object_2d’ package as shown in Figure 8-23.
On this page, you can see whether the build system is catkin or rosbuild, who created the
package, and what kind of open source license it has. Let’s first look at the Kinetic version
information by selecting the kinetic button at the top. On the Kinetic version page, you can
check the list of dependent packages by clicking the ‘Dependencies’ link, as well as the link to
the project’s external website, the repository address of the package, and instructions on how to
use it. In particular, be sure to check the package dependencies.
■■ catkin
■■ cv_bridge
■■ genmsg
■■ image_transport
■■ message_filters
■■ pcl_ros
■■ roscpp
■■ rospy
■■ sensor_msgs
■■ std_msgs
■■ std_srvs
■■ tf
Use the ‘rospack list’ or ‘rospack find’ command to verify that the necessary packages are
installed.
$ rospack list
actionlib /opt/ros/kinetic/share/actionlib
actionlib_msgs /opt/ros/kinetic/share/actionlib_msgs
actionlib_tutorials /opt/ros/kinetic/share/actionlib_tutorials
20 http://wiki.ros.org/find_object_2d
If the dependent packages are not installed, check the installation method on each Wiki
page and install all dependency packages.
Also, the ‘find_object_2d’ package in ‘2. Quick start’ on the Wiki page (http://wiki.ros.org/
find_object_2d) is described to require the ‘uvc_camera’ package (http://wiki.ros.org/uvc_
camera), so install the ‘uvc_camera’ package.
Binary Installation
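The binary package can presumably be installed with apt using the usual ros-&lt;distro&gt;-&lt;package&gt; naming; the exact name below is an assumption for Kinetic:

```shell
sudo apt-get install ros-kinetic-find-object-2d
```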
Source Installation
$ cd ~/catkin_ws/src
$ git clone https://github.com/introlab/find-object.git
$ cd ~/catkin_ws/
$ catkin_make
The following packages are not directly related to ROS, but the OpenCV and Qt libraries are
used in the ‘find_object_2d’ package, so you need to install them.
$ roscore
Then open another terminal window and run the ‘find_object_2d’ node like the following
command.
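A plausible invocation, remapping the image topic to the one published by ‘uvc_camera_node’ earlier in this chapter (check the package Wiki for the exact arguments):

```shell
rosrun find_object_2d find_object_2d image:=/image_raw
```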
Save the image from the USB camera as a common image file format such as PNG or JPEG,
and drag and drop the file to the executed GUI program. Here, we used two images for detection
as shown in Figure 8-24.
Let’s try object detection now. Prepare an image that includes both registered and
unregistered objects, and place the image in front of the camera as shown in Figure 8-25. As a
result, you can see that two objects are surrounded by rectangles and are properly detected.
You can also use the ‘rostopic echo’ command in the terminal window to verify the ‘/object’
topic, or run the ‘print_objects_detected’ node to see the information of the detected objects.
If you subscribe to the coordinates of the detected objects as a topic, you can build another
application package on top of this one.
The number of packages released for ROS is increasing rapidly as ROS becomes widely used.
As explained in this section, if you know how to find and use packages when you need them,
the hard work of the package developers allows you to go one step further and spend more time
on what you really need to concentrate on. This is the basic idea of ROS: accumulated knowledge
leads us to a higher level of robotics development.
This chapter explained how to use packages in the ROS. Please refer to the ROS Wiki for
details on how to use each package.
Embedded System
An embedded system is an electronic computer system that exists within a device and performs
specific functions to control a machine or another system that requires control. In other words,
an embedded system can be defined as a special-purpose computer system that is part of a larger
device and serves as the brain of the system to be controlled.
Many embedded devices are used to implement the functions of robots as shown in Figure
9-1.
Figure 9-2 shows various systems from 8-bit microcontrollers to high-performance PCs; you
need to configure an embedded system with the right performance to match your needs.
ROS requires an operating system such as Linux, which runs on a PC or on high-performance
ARM Cortex-A series CPUs.
1 https://en.wikipedia.org/wiki/Embedded_system
2 https://roscon.ros.org/2015/presentations/ros2_on_small_embedded_systems.pdf
Operating systems such as Linux do not guarantee real-time operation, and microcontrollers
suitable for real-time control are required to control actuators and sensors.
In the case of the TurtleBot3 Burger and Waffle Pi, an ARM Cortex-M7 series microcontroller is
used for actuator and sensor control, and the Raspberry Pi 3 board, which runs Linux and ROS,
is connected via USB and configured as shown in Figure 9-3.
The STM32F746 from ST is used as the main MCU. With its built-in ARM Cortex-M7 core and
hardware support for floating-point calculation, it is suitable for implementing functions that
require high performance.
3 http://emanual.robotis.com/docs/en/platform/turtlebot3/appendix_opencr1_0/
4 https://github.com/ROBOTIS-GIT/OpenCR-Hardware
5 https://github.com/ROBOTIS-GIT/OpenCR
6 http://www.st.com/en/microcontrollers/stm32f746ng.html
9.1.1. Characteristics
High Performance
ST’s STM32F746 chip used in OpenCR is a high-performance microcontroller running at up to
216MHz with the Cortex-M7 core, the top performer among ARM microcontrollers. It can also
be used for processing large amounts of data with algorithms and various peripheral devices
that require high-speed operation.
Arduino Support
For those who are unfamiliar with embedded development environments, OpenCR can be
developed easily using the Arduino IDE7. The OpenCR provides an Arduino UNO8 compatible
interface, so various libraries, source code, and shield modules made for the Arduino
development environment can be used. Since the OpenCR board is added and managed by the
Boards Manager in the Arduino IDE, it is easy to update the firmware.
7 https://www.arduino.cc/en/Main/Software
8 https://store.arduino.cc/usa/arduino-uno-rev3
IMU Sensor
OpenCR includes the MPU925010 chip, which integrates a triple-axis gyroscope, triple-axis
accelerometer, and triple-axis magnetometer in a single chip, so various applications using an
IMU sensor can be built without adding a separate sensor. High-speed reading and writing is
available because the sensor data is transferred over I2C or SPI communication.
Power Output
When OpenCR receives a 7V ~ 24V input power source, it supplies 12V (1A), 5V (4A) and 3.3V
(800mA) outputs. Since it supports up to 5V / 4A, it can power an SBC such as a Raspberry Pi or
a sensor such as a USB camera.
9 https://en.wikipedia.org/wiki/JTAG
10 https://www.invensense.com/products/motion-tracking/9-axis/mpu-9250/
Open Source
The materials required for building the OpenCR board are open. The bootloader, firmware, and
PCB Gerber files necessary for manufacturing the hardware are available on GitHub. Users can
therefore modify and produce the board as needed.
■■ https://github.com/ROBOTIS-GIT/OpenCR
■■ https://github.com/ROBOTIS-GIT/OpenCR-Hardware
Hardware Specification
Hardware specifications of OpenCR are shown in Table 9-1.
■■ Communication: UART x 2, CAN, SPI
■■ User Switch x 2
■■ Power output: 12V@1A, 5V@4A, 3.3V@800mA
■■ Weight: 60g
$ wget https://raw.githubusercontent.com/ROBOTIS-GIT/OpenCR/master/99-opencr-cdc.rules
$ sudo cp ./99-opencr-cdc.rules /etc/udev/rules.d/
$ sudo udevadm control --reload-rules
$ sudo udevadm trigger
The ‘99-opencr-cdc.rules’ file contains options to change the access permission of the USB
port and to prevent the OpenCR from being recognized as a modem.
In Linux, when the serial device is connected, the system transmits a command to identify
whether the device is a modem or not, and this command could cause an issue when OpenCR is
connected to the Linux system.
Compiler Preferences
The GCC toolchain used for OpenCR is distributed as 32-bit executables, so if you have a 64-bit
OS installed, you need to add 32-bit compatibility libraries to the system.
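On 64-bit Ubuntu this typically means enabling the i386 architecture and installing the 32-bit libraries the toolchain links against; the package below is only an example, so check the OpenCR documentation for the exact list:

```shell
sudo dpkg --add-architecture i386
sudo apt-get update
sudo apt-get install libncurses5:i386
```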
Download the latest version and extract it to the ‘~/tools’ folder and proceed with the
installation. If you do not have the tools folder, create a new one with the command ‘cd ~/ &&
mkdir tools’.
$ cd ~/tools/arduino-1.8.2
$ ./install.sh
Add the Arduino IDE path to your shell script file so you can run it from any location. The
shell script file can be edited using gedit or other text editing programs like vim, emacs, nano,
sublime text and visual studio code.
$ gedit ~/.bashrc
Add the location of the extracted folder to your PATH as shown below, then apply the change
with the source command.
export PATH=$PATH:$HOME/tools/arduino-1.8.2
$ source ~/.bashrc
$ arduino
OpenCR Settings
Once the Arduino IDE installation is completed, you need to add the board so that you can build
and download the firmware to OpenCR. From the menu of the Arduino IDE, select File →
Preferences, enter the following address to the board configuration file in the Additional Boards
Manager URLs field in Figure 9-12 and click ‘OK’.
■■ https://raw.githubusercontent.com/ROBOTIS-GIT/OpenCR/master/arduino/opencr_release/package_opencr_index.json
After entering the board configuration file URL, select Tools → Board → Boards Manager
from the Arduino IDE menu as shown in Figure 9-13.
Once the source code has been compiled, the Arduino IDE calls the OpenCR downloader and
starts downloading the firmware. At the bottom of the message window, the following message
will be displayed and the downloaded firmware will be executed.
Update Bootloader
When the OpenCR bootloader needs to be updated, the DFU function of the bootloader built
into the STM32F746 can be used. DFU mode allows users to update the bootloader without
additional equipment such as JTAG. For reference, the bootloader is pre-loaded at the time of
board production, so there are not many situations in which users have to update it. While the
OpenCR is connected to the PC via USB, press and hold the BOOT0 pin and press RESET to
activate the bootloader built into the STM32F746 and enter DFU mode.
$ lsusb
In order to activate the DFU mode, select Tools → Programmer → DFU_UTIL from the
Arduino IDE menu as shown in Figure 9-22.
LED
OpenCR has 4 additional user-controllable LEDs, shown in Figure 9-25. Let’s display
information using these LEDs.
blink_led
int led_pin = 13;
int led_pin_user[4] = { BDPIN_LED_USER_1, BDPIN_LED_USER_2, BDPIN_LED_USER_3, BDPIN_LED_USER_4 };

void setup() {
  pinMode(led_pin, OUTPUT);
  pinMode(led_pin_user[0], OUTPUT);
  pinMode(led_pin_user[1], OUTPUT);
  pinMode(led_pin_user[2], OUTPUT);
  pinMode(led_pin_user[3], OUTPUT);
}

void loop() {
  int i;

  // Blink the status LED
  digitalWrite(led_pin, HIGH);
  delay(100);
  digitalWrite(led_pin, LOW);
  delay(100);

  // Toggle the four user LEDs in turn
  for (i = 0; i < 4; i++) {
    digitalWrite(led_pin_user[i], !digitalRead(led_pin_user[i]));
  }
}
Buzzer
The OpenCR has a built-in Buzzer and one of the basic functions in Arduino can be used to
generate the sound. The Buzzer is connected to the pin that is defined as BDPIN_BUZZER.
Parameters for the tone() function are pin number, frequency (Hz) and duration (ms).
buzzer
void setup()
{
}
void loop() {
tone(BDPIN_BUZZER, 1000, 100);
delay(200);
}
read_voltage
void setup() {
Serial.begin(115200);
}
void loop() {
float voltage;
voltage = getPowerInVoltage();
Serial.print("Voltage : ");
Serial.println(voltage);
}
IMU Sensor
The accelerometer and gyroscope sensor values are converted to the Roll, Pitch and Yaw value
of the board by sensor fusion. When the IMU class is created as an object and the update()
function is called, the acceleration and gyro values are periodically read from the sensor. The
default calculation cycle is 200Hz, but can be changed.
read_roll_pitch_yaw
#include <IMU.h>
cIMU IMU;
void setup()
{
Serial.begin(115200);
IMU.begin();
}
void loop()
{
IMU.update();
Serial.print(IMU.rpy[0]);
Serial.print(" ");
Serial.print(IMU.rpy[1]);
Serial.print(" ");
Serial.println(IMU.rpy[2]);
}
Dynamixel SDK
In order to control Dynamixel actuators from ROBOTIS, OpenCR uses the DynamixelSDK C++
library modified for Arduino. The usage of the DynamixelSDK is the same, although existing
source code may need to be partially modified.
DynamixelSDK can be downloaded at the following address, and the modified version is
already downloaded to your OpenCR by default.
■■ https://github.com/ROBOTIS-GIT/DynamixelSDK
Some of the DynamixelSDK examples are included in the OpenCR library and support both
protocols 1.0 and 2.0.
9.2. rosserial
The rosserial11 is a package that converts ROS messages, topics, and services so that they can be
used over serial communication. Generally, microcontrollers use serial communication such as
UART rather than TCP/IP, which is the default communication method in ROS. Therefore, to
exchange ROS messages between a microcontroller and a computer, rosserial must translate the
messages for each device.
Figure 9-28 rosserial server (for PC) and client (for embedded system)
In Figure 9-28, the PC running ROS is the rosserial server12 and the microcontroller connected
to the PC becomes the rosserial client13. Both server and client send and receive data using the
rosserial protocol, so any hardware capable of sending and receiving data can be used. This
makes it possible to use ROS messages over UART, which is commonly used in microcontrollers.
11 http://wiki.ros.org/rosserial
12 http://wiki.ros.org/rosserial_server
13 http://wiki.ros.org/rosserial_client
On a general-purpose computer, including an SBC, rosserial cannot guarantee real-time control.
However, using a microcontroller as an auxiliary hardware controller enables real-time control.
rosserial_python
This package is implemented in Python and is the most commonly used rosserial server implementation.
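A typical way to start the server side (the device path and baud rate here are examples; adjust them to your board):

```shell
rosrun rosserial_python serial_node.py _port:=/dev/ttyACM0 _baud:=115200
```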
rosserial_server
Although the performance has been improved with the use of C++ language, there are some
functional limitations compared to rosserial_python.
rosserial_java
The rosserial_java library is used when a Java-based module is required, or when it is used with
the Android SDK.
14 http://wiki.ros.org/rosserial/Overview/Protocol
rosserial_embeddedlinux
This is a library that can be used on embedded Linux.
rosserial_windows
This library supports Windows operating system and communicates with Windows applications.
rosserial_mbed
This library supports mbed platform, which is an embedded development environment, and
enables the use of mbed boards.
rosserial_tivac
This is a library for use on the Launchpad board manufactured by TI.
Packet configuration
The rosserial packet includes the header field to send and receive the ROS standard message and
the checksum field to verify the validity of the data.
Sync Flag
This flag byte is always 0xFF and indicates the start of the packet.
Protocol Version
This field indicates the protocol version of ROS, where Groovy is 0xFF and Hydro, Indigo, Jade,
and Kinetic are 0xFE.
Message Length
This 2 bytes field indicates the data length of the message transmitted through the packet. The
Low byte comes first, followed by the High byte.
The checksum verifies the validity of the message length and is calculated as follows.
255 - ((Message Length Low Byte + Message Length High Byte) % 256)
Topic ID
The ID field consists of 2 bytes and is used as an identifier to distinguish the message type. Topic
IDs from 0 to 100 are reserved for system functions. The main topic IDs used by the system are
shown below and they can be displayed from ‘rosserial_msgs/TopicInfo’.
uint16 ID_PUBLISHER=0
uint16 ID_SUBSCRIBER=1
uint16 ID_SERVICE_SERVER=2
uint16 ID_SERVICE_CLIENT=4
uint16 ID_PARAMETER_REQUEST=6
uint16 ID_LOG=7
uint16 ID_TIME=10
uint16 ID_TX_STOP=11
This checksum validates the Topic ID and the message data, and is calculated as follows.
255 - ((Topic ID Low Byte + Topic ID High Byte + Sum of all Data Bytes) % 256)
Query Packet
When the rosserial server starts, it requests information such as the topic name and type from
the client. The query packet is used for this request: its Topic ID is 0 and its data size is 0.
When the client receives the query packet, it sends a message to the server with the following
data, and the server sends and receives messages based on this information.
uint16 topic_id
string topic_name
string message_type
string md5sum
int32 buffer_size
Memory Constraints
The microcontrollers used in embedded systems have considerably smaller memory compared to standard PCs. Therefore, the memory capacity has to be considered in advance when defining the number of publishers and subscribers and the sizes of the transmit and receive buffers. Messages exceeding the size of the transmission or reception buffer cannot be handled, so beware of the message size. The OpenCR provides 1MB of Flash memory and 320KB of SRAM, leaving room for fairly large programs. In addition, rosserial can be used rather freely on the OpenCR, as the buffers for serialization and deserialization are set to 1,024 bytes and up to 25 publishers and 25 subscribers are configured.
Float64
When using rosserial_arduino as a rosserial client, the microcontroller on the Arduino board does not support 64-bit floating-point operations, so when the library is built, 64-bit data types are automatically converted to 32-bit types. If the embedded board supports 64-bit floating point, the 64-bit data type can be used as it is.
Strings
Because of the memory restriction of the microcontroller, the string data is not stored in the
String message, but only the pointer to the defined string is stored in the message. The following
example shows how to use a String message.
std_msgs::String str_msg;
char hello[13] = "hello world!";
str_msg.data = hello;
Arrays
Similar to strings, an array message holds only a pointer to the actual data, so the end of the array cannot be determined from the message itself. Therefore, the length of the array must be included when transmitting and receiving messages.
Baudrate
UART communication at 115200 bps may become too slow to handle messages as the number of messages increases. OpenCR therefore uses virtual USB serial communication in order to provide high-speed communication.
Installing Package
Install the rosserial and Arduino support packages with the following command.
$ sudo apt-get install ros-kinetic-rosserial ros-kinetic-rosserial-arduino
There are also ‘ros-kinetic-rosserial-windows’, ‘ros-kinetic-rosserial-xbee’ and ‘ros-kinetic-rosserial-embeddedlinux’ packages; install them additionally if necessary.
$ cd ~/Arduino/libraries/
$ rm -rf ros_lib
$ rosrun rosserial_arduino make_libraries.py .
ros.h
#include "ros/node_handle.h"
#include "ArduinoHardware.h"
namespace ros
{
/* Publishers, Subscribers, Buffer Sizes */
typedef NodeHandle_<ArduinoHardware, 25, 25, 1024, 1024> NodeHandle;
}
ArduinoHardware.h
#define SERIAL_CLASS USBSerial
__contents omitted__
class ArduinoHardware
{
iostream = &Serial;
You must run roscore first before trying the following examples.
LED
Four LEDs are controlled by the ‘led_out’ subscriber using ‘std_msgs/Byte’, which is a ROS standard data type. When the callback function of the subscriber is called, the LEDs corresponding to bits 0 ~ 3 of the received message turn on or off depending on whether each bit is 1 or 0, respectively.
a_LED.ino
#include <ros.h>
#include <std_msgs/String.h>
#include <std_msgs/Byte.h>
ros::NodeHandle nh;
// The callback and subscriber below are reconstructed to complete the
// sketch; led_pin_user[] is defined by the OpenCR board support files.
void ledCallback(const std_msgs::Byte& msg)
{
  for (int i = 0; i < 4; i++)
  {
    digitalWrite(led_pin_user[i], (msg.data >> i) & 0x01);
  }
}
ros::Subscriber<std_msgs::Byte> sub("led_out", &ledCallback);
void setup() {
pinMode(led_pin_user[0], OUTPUT);
pinMode(led_pin_user[1], OUTPUT);
pinMode(led_pin_user[2], OUTPUT);
pinMode(led_pin_user[3], OUTPUT);
nh.initNode();
nh.subscribe(sub);
}
void loop() {
nh.spinOnce();
}
Download the firmware with the Arduino IDE and run the rosserial server using rosserial_python as follows. If the OpenCR is connected to a USB port other than ‘ttyACM0’, change the ‘_port:=/dev/ttyACM0’ argument to the correct serial port. Once the rosserial server is running, the rosserial server and the OpenCR client will send and receive packets through the USB serial port.
$ rosrun rosserial_python serial_node.py _port:=/dev/ttyACM0
$ rostopic list
/diagnostics
/led_out
/rosout
/rosout_agg
The following ‘rostopic pub’ command writes a value into the message and publishes it to the ‘led_out’ topic; for example, publishing the value 1 turns on the first LED (bit 0).
$ rostopic pub -1 /led_out std_msgs/Byte "data: 1"
Button
This example also uses the ‘std_msgs/Byte’ data type like the LED example, but declares ‘button’ as the topic name of the ‘pub_button’ publisher to report the button status to the server. The following example code publishes the pressed or released status of the SW1 and SW2 buttons on the OpenCR board as a Byte value at a regular interval of 50 ms.
b_Button.ino
#include <ros.h>
#include <std_msgs/Byte.h>
ros::NodeHandle nh;
std_msgs::Byte button_msg;
ros::Publisher pub_button("button", &button_msg);
void setup()
{
nh.initNode();
nh.advertise(pub_button);
pinMode(BDPIN_PUSH_SW_1, INPUT);
pinMode(BDPIN_PUSH_SW_2, INPUT);
}
void loop()
{
uint8_t reading = 0;
static uint32_t pre_time;
if (digitalRead(BDPIN_PUSH_SW_1) == HIGH)
{
reading |= 0x01;
}
if (digitalRead(BDPIN_PUSH_SW_2) == HIGH)
{
reading |= 0x02;
}
// Publish the button status every 50 ms (this interval check is
// reconstructed to match the description above).
if (millis() - pre_time >= 50)
{
pre_time = millis();
button_msg.data = reading;
pub_button.publish(&button_msg);
}
nh.spinOnce();
}
Download the Button example and run rosserial_python on the PC as before; a message reporting the button publisher setup will be displayed.
Make sure the button topic exists with the ‘rostopic list’ command.
Open a new terminal window and enter the ‘rostopic echo /button’ command to display the button status published every 50 ms.
In order to represent the exact voltage, the ‘std_msgs/Float32’ data type is used in the voltage measuring example. Declare the publisher ‘pub_voltage’ with the ‘voltage’ topic, and the measured input voltage will be published every 50 ms.
ros::NodeHandle nh;
std_msgs::Float32 voltage_msg;
ros::Publisher pub_voltage("voltage", &voltage_msg);
void setup()
{
nh.initNode();
nh.advertise(pub_voltage);
}
void loop()
{
static uint32_t pre_time;
// Publish the measured input voltage every 50 ms. The interval check and
// getPowerInVoltage(), an OpenCR board function, are reconstructed here
// to make the sketch complete.
if (millis() - pre_time >= 50)
{
pre_time = millis();
voltage_msg.data = getPowerInVoltage();
pub_voltage.publish(&voltage_msg);
}
nh.spinOnce();
}
IMU
In order to pass the IMU sensor values with a ROS standard message type, the ‘sensor_msgs/Imu’ message is used. The following example converts the attitude calculated by the IMU sensor library into the ‘sensor_msgs/Imu’ message type and publishes it as a topic. A tf from ‘base_link’ to the IMU sensor frame is also broadcast, so that the change of attitude can be monitored relative to ‘base_link’.
d_IMU.ino
#include <ros.h>
#include <sensor_msgs/Imu.h>
#include <tf/tf.h>
#include <tf/transform_broadcaster.h>
#include <IMU.h>
ros::NodeHandle nh;
sensor_msgs::Imu imu_msg;
ros::Publisher imu_pub("imu", &imu_msg);
geometry_msgs::TransformStamped tfs_msg;
tf::TransformBroadcaster tfbroadcaster;
cIMU imu;
// setup() is partially omitted in the text; the reconstruction below
// initializes the node, the publisher, the tf broadcaster, and the IMU.
void setup()
{
  nh.initNode();
  nh.advertise(imu_pub);
  tfbroadcaster.init(nh);
  imu.begin();
}
void loop()
{
static uint32_t pre_time;
imu.update();
imu_msg.header.stamp = nh.now();
imu_msg.header.frame_id = "imu_link";
imu_msg.angular_velocity.x = imu.gyroData[0];
imu_msg.angular_velocity.y = imu.gyroData[1];
imu_msg.angular_velocity.z = imu.gyroData[2];
imu_msg.angular_velocity_covariance[0] = 0.02;
imu_msg.angular_velocity_covariance[1] = 0;
imu_msg.angular_velocity_covariance[2] = 0;
imu_msg.angular_velocity_covariance[3] = 0;
imu_msg.angular_velocity_covariance[4] = 0.02;
imu_msg.angular_velocity_covariance[5] = 0;
imu_msg.angular_velocity_covariance[6] = 0;
imu_msg.angular_velocity_covariance[7] = 0;
imu_msg.angular_velocity_covariance[8] = 0.02;
imu_msg.linear_acceleration.x = imu.accData[0];
imu_msg.linear_acceleration.y = imu.accData[1];
imu_msg.linear_acceleration.z = imu.accData[2];
imu_msg.orientation.w = imu.quat[0];
imu_msg.orientation.x = imu.quat[1];
imu_msg.orientation.y = imu.quat[2];
imu_msg.orientation.z = imu.quat[3];
imu_msg.orientation_covariance[0] = 0.0025;
imu_msg.orientation_covariance[1] = 0;
imu_msg.orientation_covariance[2] = 0;
imu_msg.orientation_covariance[3] = 0;
imu_msg.orientation_covariance[4] = 0.0025;
imu_msg.orientation_covariance[5] = 0;
imu_msg.orientation_covariance[6] = 0;
imu_msg.orientation_covariance[7] = 0;
imu_msg.orientation_covariance[8] = 0.0025;
imu_pub.publish(&imu_msg);
tfs_msg.header.stamp = nh.now();
tfs_msg.header.frame_id = "base_link";
tfs_msg.child_frame_id = "imu_link";
tfs_msg.transform.rotation.w = imu.quat[0];
tfs_msg.transform.rotation.x = imu.quat[1];
tfs_msg.transform.rotation.y = imu.quat[2];
tfs_msg.transform.rotation.z = imu.quat[3];
tfs_msg.transform.translation.x = 0.0;
tfs_msg.transform.translation.y = 0.0;
tfs_msg.transform.translation.z = 0.0;
tfbroadcaster.sendTransform(tfs_msg); // broadcast the base_link -> imu_link tf
nh.spinOnce();
}
Download the example and run the rosserial server with the following command to create
the ‘imu’ and ‘tf’ publishers.
Open a new terminal window and verify the ‘imu’ topic message data as follows.
In order to visualize the IMU data, use RViz which can graphically represent the IMU data in
3D space. Enter the command as follows to run the RViz.
$ rviz
Go to Global Options → Fixed Frame and select ‘base_link’ in the RViz Displays option panel
as shown in Figure 9-32. Then click Add at the bottom of the Displays option panel to add ‘Axes’
to the option item and select ‘imu_link’ for the Reference Frame to see the change of axis on the
screen according to the attitude of OpenCR.
For the TurtleBot3 Burger and Waffle (Waffle Pi), the mounting positions and turning radii of the Dynamixel actuators differ. Model-specific setting values are therefore used in ‘turtlebot3_core_config.h’. If the mounting position of the Dynamixel actuator is changed, the parameter values must also be changed.
turtlebot3_core_config.h (TurtleBot3 Burger)
#define WHEEL_RADIUS 0.033 // meter
#define WHEEL_SEPARATION 0.160 // meter
#define TURNING_RADIUS 0.080 // meter
#define ROBOT_RADIUS 0.105 // meter
To use this information on the TurtleBot3 Burger, run the ‘rosserial_python’ node with the following command. When this is done, the ‘turtlebot3_core’ node is created; it subscribes to the move command on the ‘/cmd_vel’ topic as shown in Figure 9-34, while the odometry information (/odom), IMU information (/imu), and sensor information (/sensor_state) are published. The same topics are used for the TurtleBot3 Waffle and Waffle Pi. Refer to Chapter 10 for more details.
turtlebot3_core_config.h (TurtleBot3 Waffle / Waffle Pi)
#define WHEEL_RADIUS 0.033 // meter
#define WHEEL_SEPARATION 0.287 // meter
#define TURNING_RADIUS 0.1435 // meter
#define ROBOT_RADIUS 0.220 // meter
Click the Upload button on the Arduino IDE to download the firmware, and once the download is completed, click the serial monitor icon in the upper right corner of the application as shown in Figure 9-37. Connect the Dynamixel to the OpenCR. Note that this firmware works with only one Dynamixel, so connect only one Dynamixel at a time.
To prevent input mistakes, a confirmation menu is displayed once again. To proceed with the
changes, enter ‘Y’.
If you enter ‘Y’, the setup tool starts to search for the connected Dynamixel across different baudrates and IDs. If a Dynamixel is found, it is reconfigured for the TurtleBot3. When the setup is completed, an ‘OK’ message is printed.
Dynamixel Test
After completing the setup procedure, verify that the changes have been properly applied. If you select one of the motor test menus, a correctly configured Dynamixel will rotate alternately clockwise and counterclockwise. To end the test, press the Enter key again. To test the left Dynamixel, enter ‘3’ as shown in Figure 9-41, and enter ‘4’ for the right Dynamixel.
We have discussed how to integrate ROS in an embedded system. The embedded system is
closely related to robots that require real-time control. Having learned to integrate ROS will help
with your future robot developments. Chapters 10, 11, 12, and 13 will cover practical examples of
mobile robots using the embedded system described in this chapter.
Mobile
Robots
There are three versions of the TurtleBot series3 (see Figure 10-2). TurtleBot1 was developed by Tully (Platform Manager at Open Robotics) and Melonee (CEO of Fetch Robotics) at Willow Garage on top of iRobot’s Roomba-based research robot, Create, for ROS deployment. It was developed in 20104 and has been on sale since 2011. In 2012, TurtleBot2 was developed by Yujin Robot based on the research robot iClebo Kobuki. In 2017, TurtleBot3 was developed with features that supplement the lacking functions of its predecessors and meet the demands of users. The TurtleBot3 adopts the ROBOTIS smart actuator ‘Dynamixel’ for driving.
1 http://el.media.mit.edu/logo-foundation/index.html
2 http://el.media.mit.edu/logo-foundation/what_is_logo/logo_primer.html
3 http://www.turtlebot.com/about
4 http://spectrum.ieee.org/automaton/robotics/diy/interview-turtlebot-inventors-tell-us-everything-about-the-robot
5 http://turtlebot3.robotis.com
The 3D CAD design files of the TurtleBot3 are available through the cloud-based 3D CAD tool ‘Onshape’, which allows all design team members to access the shared design files from smartphones and tablets as well as PCs, regardless of operating system. Not only can you check each component of the TurtleBot3 in a web browser, but you can also download parts to your own repository and create your own parts by modifying the design. After downloading the STL files, parts can be printed with a 3D printer. The files for each model can be found in the Open Source item provided as an appendix on the TurtleBot3 official Wiki6.
6 http://turtlebot3.robotis.com
The aforementioned hardware details of TurtleBot3 and basic contents described in this chapter can
also be found in the official TurtleBot3 wiki in following link. To learn ROS using TurtleBot3, refer to the
link below.
http://turtlebot3.robotis.com
The hardware design files and software of the TurtleBot3 are open to the public. If you need the hardware files for each TurtleBot3 model or for the OpenCR used as the sub-controller of the TurtleBot3, refer to the list and links below. Unless otherwise stated, all hardware is open source and complies with the Hardware Statement of Principles and Definition v1.0 license.
OpenCR: https://github.com/ROBOTIS-GIT/OpenCR-Hardware
The software of TurtleBot3 is disclosed to public as an open source. The OpenCR bootloader used
as a sub-controller of TurtleBot3, the firmware for developing with Arduino IDE, and the firmware
for controlling TurtleBot3, as well as the ROS packages (turtlebot3, turtlebot3_msgs, turtlebot3_
simulations, turtlebot3_applications) are also available as an open source. Licenses for open source
software vary for each source but basically comply with Apache license 2.0, and some software uses
3-Clause BSD License and GPLv3.
https://github.com/ROBOTIS-GIT/OpenCR
https://github.com/ROBOTIS-GIT/turtlebot3
https://github.com/ROBOTIS-GIT/turtlebot3_msgs
https://github.com/ROBOTIS-GIT/turtlebot3_simulations
https://github.com/ROBOTIS-GIT/turtlebot3_applications
http://turtlebot3.robotis.com
If you have Linux and ROS installed, you can install the software associated with the TurtleBot3. All of these installation methods are described in the above-mentioned TurtleBot3 wiki, but here we briefly summarize the installation. Let’s install the dependent packages and TurtleBot3 packages on the PC that controls the TurtleBot3 (referred to hereafter as the Remote PC). Note that we have excluded the ‘turtlebot3_applications’ package, which contains various examples not covered in this book.
Next, install TurtleBot3 packages and dependent packages such as sensor package in the SBC
of TurtleBot3.
Once all software is installed, it is important to configure the network environment as shown in Figure 10-5. For details on how to change the settings of ROS_HOSTNAME and ROS_MASTER_URI, refer to Section 3.2 and Section 8.3. The TurtleBot3 uses a desktop PC or laptop as the Remote PC, which acts as the master running roscore and handles computation-intensive processes such as SLAM and navigation. The SBC on the TurtleBot3 is responsible for operating the robot components and collecting sensor data. The following is an example of remote control settings when the ROS Master is running on the Remote PC.
Remote PC (ROS Master, IP 192.168.7.100):
export ROS_HOSTNAME=192.168.7.100
export ROS_MASTER_URI=http://${ROS_HOSTNAME}:11311
TurtleBot3 SBC (IP 192.168.7.200):
export ROS_HOSTNAME=192.168.7.200
export ROS_MASTER_URI=http://192.168.7.100:11311
Now we have completed the development environment for TurtleBot3. In the next section,
let’s control TurtleBot3 with various ROS packages starting with remote control.
$ roscore
Roslaunch can execute multiple nodes at the same time, but messages from each node are not displayed by default. If necessary, you can use the ‘--screen’ option to see all of the hidden messages during operation. When using roslaunch, it is recommended to append this option.
When you run this launch file, the ‘turtlebot3_teleop_keyboard’ node is executed and the
following message appears in the terminal window. This node receives ‘w’, ‘a’, ‘d’, ‘x’ key inputs
and transmits the translational and rotational speed to the robot in m/sec and rad/sec
respectively. The ‘spacebar’ and ‘s’ key will reset the translational and rotational speed to ‘0’ to
stop the movement of TurtleBot3.
CTRL-C to quit
If you want to use a PS3 joystick instead of the keyboard to teleoperate the TurtleBot3, install the dependent joystick package on the Remote PC as follows. Then run the ‘teleop.launch’ file in the ‘teleop_twist_joy’ package to control the robot with the PS3 joystick. The PS3 joystick must be connected to the Remote PC via Bluetooth.
$ export TURTLEBOT3_MODEL=burger
$ roslaunch turtlebot3_bringup turtlebot3_remote.launch
$ rosrun rviz rviz -d `rospack find turtlebot3_description`/rviz/model.rviz
When RViz is executed, the 3D model of TurtleBot3 Burger will be displayed at the origin of
RGB coordinates along with the tf of each joint of the robot as shown in Figure 10-6. Also, the
distance data from 360 degree LDS sensor can be displayed around the robot as red dots.
In order to operate the TurtleBot3, some work has to be done on the local TurtleBot3 SBC, which is a nuisance if the user must work back and forth between two computers. To solve this problem, remotely accessing the TurtleBot3 SBC from the Remote PC using SSH is recommended. This allows you to execute commands on the TurtleBot3 SBC from the Remote PC. Here is an example of how to remotely connect to the TurtleBot3 SBC from the Remote PC. For more information, see the notes on SSH.
$ ssh [email protected]
SSH refers to the application or the protocol that allows you to log in to another computer on the network, run commands on the remote system, and copy files to another system. It is commonly used to connect to a remote computer and send commands from a terminal window in Linux. To do this, the ssh application should be installed as follows.
To connect to a remote computer (in this case TurtleBot3), use the following command to connect in
the terminal window. Once connection is established to the PC, commands can be entered just like
using a local computer.
In the case of the Raspberry Pi (TurtleBot3 Burger and Waffle Pi), the SSH server of Ubuntu MATE 16.04.x and Raspbian is disabled by default. If you want to enable SSH, refer to the documents below.
https://www.raspberrypi.org/documentation/remote-access/ssh/
https://ubuntu-mate.org/raspberry-pi/
For example, the following ‘rostopic list’ command can verify that various topics are being
published or subscribed:
$ rostopic list
/cmd_vel
/cmd_vel_rc100
/diagnostics
/imu
/joint_states
To get more details on node and topic, run ‘rqt_graph’ as shown in the following example.
You can then check out the topics published from and subscribed by the TurtleBot3 as shown in
Figure 10-7.
$ rqt_graph
Next, let’s control the velocity of the TurtleBot3. The x and y used here are translational speeds in the ROS standard unit of m/s, and z is the rotational speed in rad/s. When the value of x is 0.02 as in the following example, the TurtleBot3 advances at 0.02 m/s in the positive x-axis direction.
$ rostopic pub -1 /cmd_vel geometry_msgs/Twist -- '[0.02, 0.0, 0.0]' '[0.0, 0.0, 0.0]'
When the z value is set to 1.0 as in the following example, the TurtleBot3 rotates counterclockwise at 1.0 rad/s.
$ rostopic pub -1 /cmd_vel geometry_msgs/Twist -- '[0.0, 0.0, 0.0]' '[0.0, 0.0, 1.0]'
You do not need to know all the published topics, but it is good practice to learn how to use them. In particular, ‘odom’ for odometry information, ‘tf’ for transformation information, ‘joint_states’ for joint information, and the sensor-related topics are essential when using the TurtleBot3 hereafter.
scan (sensor_msgs/LaserScan): topic carrying the scan values of the LiDAR mounted on the TurtleBot3
The ‘sensor_state’ topic mainly deals with the analog sensors connected to the OpenCR embedded board. You can get information such as bumper, cliff, button, left_encoder, and right_encoder as in the following example.
The ‘odom’ topic can be used to obtain the odometry information, which records the driving history. In the TurtleBot3, this essential odometry information is obtained from the gyro and the wheel encoders. Odometry is necessary for navigation.
You can also use the ‘tf_tree’ plugin of ‘rqt’ as shown in Figure 10-8 to visualize the tf information in a GUI environment. In Figure 10-8, the robot model information is missing, so the coordinate frames are not connected; when the coordinate transformation is performed with the robot model information, the connection of each joint can be seen as shown in Figure 10-14.
This completes the topic section. ROS uses topics, services, and actions to communicate among nodes, which are the processes; among these, the topic is the most widely used message communication method.
10.8.1. Simulation
TurtleBot3 supports simulation development environments in which a virtual robot can be programmed and tested. There are two such environments: one uses the 3D visualization tool ‘RViz’, and the other uses the 3D robot simulator ‘Gazebo’.
In this section, we will first look at how to use RViz. RViz is very useful for controlling the TurtleBot3 and testing SLAM and navigation with the ‘turtlebot3_simulations’ metapackage. To use virtual simulation with this metapackage, the ‘turtlebot3_fake’ package should be installed first, as mentioned in Section ‘10.5 TurtleBot3 Development Environment’. If you have already installed the package, move on to the next section.
$ export TURTLEBOT3_MODEL=burger
$ roslaunch turtlebot3_fake turtlebot3_fake.launch
The above command loads the 3D modeling file from the turtlebot3_description package and executes the ‘turtlebot3_fake_node’, which publishes fake topics just as the actual robot would, and the ‘robot_state_publisher’ node, which publishes the transformation of each wheel to the ‘tf’ topic. However, since sensor information cannot be simulated in RViz, a 3D simulator based on a physics engine, such as ‘Gazebo’, should be used for that. As Gazebo will be explained in the next section, here we will look at the Odometry and TF that can be verified during a simple movement.
In order to visualize TurtleBot3 on RViz, run RViz and go to [Global Options] → [fixed frame]
and select ‘/odom’. Then click the ‘Add’ button in the bottom left corner of the window to add
‘RobotModel’ and display the 3D model file loaded from ‘turtlebot3_fake.launch’ in the center of
the screen as shown in Figure 10-9.
Next, let’s run a virtual robot. Run the ‘turtlebot3_teleop_key.launch’ file from the ‘turtlebot3_
teleop’ package, which lets you operate the robot remotely with the keyboard.
Figure 10-11 Adding the odometry display to verify the odom topic
Now, let’s move the virtual TurtleBot3 around using the ‘turtlebot3_teleop_keyboard’ node. As shown in Figure 10-12, red arrows are displayed along the robot’s trajectory. This odometry is very basic information indicating where the robot is currently located. In the above practice, we checked that the odometry information is displayed correctly.
The tf topic containing the relative coordinates of TurtleBot3 components can be verified
with the rostopic command as before, but let’s visualize it with RViz like odom and visualize the
hierarchy with ‘rqt_tf_tree’.
Click the Add button at the bottom left of RViz and select ‘TF’. This will display ‘odom’, ‘base_
footprint’, ‘imu_link’, ‘wheel_left_link’, ‘wheel_right_link’, etc., as shown in Figure 10-13. Let’s
move the virtual TurtleBot3 again using the ‘turtlebot3_teleop_keyboard’ node. As TurtleBot3
moves, you can see the ‘wheel_left_link’ and ‘wheel_right_link’ rotate.
Now, run ‘rqt_tf_tree’ with the following command. We can see that the TurtleBot3
components are relatively transformed as shown in Figure 10-14. The position of sensors that
can be mounted on the robot can be expressed likewise. This will be covered in detail in the next
chapter.
7 http://www.darpa.mil/program/darpa-robotics-challenge
8 http://gazebosim.org/
The latest version of Gazebo is 8.0; just five years ago, it was 1.9. Gazebo is adopted as the default simulator of the ROS Kinetic Kame used in this book. If ROS was installed as instructed in Section ‘3.1 ROS Installation’, Gazebo can be used without additional installation.
Now let’s run Gazebo. If there are no problems, you can see that Gazebo is running as shown
in Figure 10-15. For now, Gazebo can be seen as an independent simulator as it is not related to
ROS.
$ gazebo
Below is a command to set the 3D model to Burger, Waffle, or Waffle Pi. The example commands are based on the Waffle, which can provide camera information. Set the TURTLEBOT3_MODEL variable to ‘waffle’ with the following command. If this command is appended to the ‘~/.bashrc’ file, you do not need to set the model every time a new terminal window is opened.
$ export TURTLEBOT3_MODEL=waffle
Now run the launch file as shown in the following example. The ‘gazebo’, ‘gazebo_gui’,
‘mobile_base_nodelet_manager’, ‘robot_state_publisher’, and ‘spawn_mobile_base’ nodes are
executed together and TurtleBot3 Waffle appears in the Gazebo screen, as shown in Figure 10-16.
Gazebo is a 3D simulator that uses a lot of CPU, GPU and RAM resources due to the use of physics
engine and graphic effects. Depending on the hardware specifications of PC, it may take a
considerable amount of time to load the application.
As shown in the following figure, you can see that only the robot is displayed in an empty
plane.
In the above example, only the robot is loaded in Gazebo. In order to perform an actual simulation, the user can build an environment or load an environment model provided by Gazebo. An environment model can be added by clicking ‘Insert’ at the top of the screen and selecting a file. Various robot models and objects as well as environment models are available; add them as necessary.
In this section, we will use a provided environment. Close the currently active Gazebo screen by clicking the ‘X’ button in the upper left corner of the screen or by entering [Ctrl + c] in the terminal window where Gazebo was launched.
So far, Gazebo may look similar to the RViz simulation of the previous section. Gazebo, however, not only displays a virtual robot, but can also simulate collisions, calculate positions, and emulate the IMU sensor and camera. Below is an example using such a launch file. When the file is executed, the virtual TurtleBot3 moves randomly in the loaded environment, avoiding obstacles before running into walls, as shown in Figure 10-18. This is a great example for learning Gazebo.
$ export TURTLEBOT3_MODEL=waffle
$ roslaunch turtlebot3_gazebo turtlebot3_simulation.launch
In addition to this, let’s run RViz with the following command. As shown in Figure 10-19,
RViz can visualize the position of robot operating in Gazebo, distance sensor data and camera
image. This simulation output is almost identical to operating an actual robot in the environment
model designed just like the one in Gazebo.
$ export TURTLEBOT3_MODEL=waffle
$ roslaunch turtlebot3_gazebo turtlebot3_gazebo_rviz.launch
Launch Gazebo
$ export TURTLEBOT3_MODEL=waffle
$ roslaunch turtlebot3_gazebo turtlebot3_world.launch
Launch SLAM
$ export TURTLEBOT3_MODEL=waffle
$ roslaunch turtlebot3_slam turtlebot3_slam.launch
Execute RViz
$ export TURTLEBOT3_MODEL=waffle
$ rosrun rviz rviz -d `rospack find turtlebot3_slam`/rviz/turtlebot3_slam.rviz
Execute Gazebo
$ export TURTLEBOT3_MODEL=waffle
$ roslaunch turtlebot3_gazebo turtlebot3_world.launch
Execute Navigation
$ export TURTLEBOT3_MODEL=waffle
$ roslaunch turtlebot3_navigation turtlebot3_navigation.launch map_file:=$HOME/map.yaml
Execute RViz
$ export TURTLEBOT3_MODEL=waffle
$ rosrun rviz rviz -d `rospack find turtlebot3_navigation`/rviz/turtlebot3_nav.rviz
TurtleBot Simulation
TurtleBot supports three types of simulation: Stage, STDR, and Gazebo. Refer to the wikis below to perform various simulations with a virtual robot.
http://wiki.ros.org/TurtleBot_stdr
http://wiki.ros.org/TurtleBot_gazebo
http://wiki.ros.org/TurtleBot_stage
SLAM
and
Navigation
The navigation system has a relatively short history. In 1981, the Japanese car maker Honda first proposed an analog system based on a three-axis gyroscope and a film map, called the ‘Electro Gyrocator1’. Afterwards the Etak Navigator2, an electronic navigation system operating with an electronic compass and sensors attached to the wheels, was introduced by the U.S. automotive supply company Etak. However, mounting the sensors and electronic compass on a car added a heavy burden to the automobile’s price, and the navigation system had reliability problems. Since the 1970s, the United States had been developing satellite positioning systems for military purposes; in the 2000s, the 24 GPS3 (Global Positioning System) satellites became available for general use, and triangulation-based navigation systems using these satellites began to spread.
1 https://en.wikipedia.org/wiki/Electro_Gyrocator
2 https://en.wikipedia.org/wiki/Etak
3 https://en.wikipedia.org/wiki/Global_Positioning_System
➊ Map
➋ Pose of Robot
➌ Sensing
11.1.2. Map
The first essential feature for navigation is the map. A car navigation system is equipped with a very accurate map from the time of purchase, and updated maps can be downloaded periodically so that the driver can be guided to the destination based on the map. But will a map be available for the room where a service robot is to be placed? Like a navigation system, a robot needs a map, so either we must create a map and give it to the robot, or the robot must be able to create a map by itself.
SLAM4 (Simultaneous Localization And Mapping) was developed to let the robot create a map with or without the help of a human being. It is a method in which the robot explores unknown space, detects its surroundings, estimates its current location, and creates a map at the same time.
4 https://en.wikipedia.org/wiki/Simultaneous_localization_and_mapping
5 https://en.wikipedia.org/wiki/Differential_GPS
6 https://en.wikipedia.org/wiki/Dead_reckoning
7 http://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/16311/www/s07/labs/NXTLabs/Lab%203.html
ROS defines pose as the combination of the robot’s position (x, y, z) and orientation (x, y, z, w). As described in Section 4.5 TF, the orientation is described by x, y, z, and w in quaternion form, whereas the position is described by a three-element vector (x, y, z). For details on the ‘pose’ message, refer to the following address. There are other technical terms that implicitly include orientation information, so do not confuse them with pose.
http://docs.ros.org/api/geometry_msgs/html/msg/Pose.html
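Since only the yaw component matters for a planar robot, the quaternion in the ‘pose’ message can be derived from a single heading angle. The following is a minimal pure-Python sketch of that conversion (no ROS packages required); the function names are illustrative, not part of any ROS API.

```python
import math

def yaw_to_quaternion(yaw):
    """Convert a planar heading (yaw, in radians) to a quaternion (x, y, z, w).

    For a rotation about the z-axis only, x and y are zero, while
    z = sin(yaw/2) and w = cos(yaw/2)."""
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

def quaternion_to_yaw(x, y, z, w):
    """Recover the yaw angle from a quaternion (general z-yaw formula)."""
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))

q = yaw_to_quaternion(math.pi / 2)          # robot facing the +y direction
assert abs(quaternion_to_yaw(*q) - math.pi / 2) < 1e-9
```

In real ROS code the same conversion is usually done with the tf library, but the arithmetic is exactly this.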
Figure 11-2 Information required for dead reckoning (center(x, y), wheel-to-wheel distance D and wheel
radius r)
ωl = (Elc − Elp) / Te × π/180 (radian/sec) (Equation 11-1)
ωr = (Erc − Erp) / Te × π/180 (radian/sec) (Equation 11-2)
Vl = ωl × r (meter/sec) (Equation 11-3)
Vr = ωr × r (meter/sec) (Equation 11-4)
Here Elc and Erc are the current encoder values of the left and right wheels, Elp and Erp are the
previous encoder values (in degrees), and Te is the time elapsed between the two readings.
Equations 11-3 and 11-4 calculate the velocities of the left and right wheels (Vl, Vr). From
the left and right wheel velocities, the linear velocity (vk) and the angular velocity (ωk) of the robot
can be obtained as shown in Equations 11-5 and 11-6.
vk = (Vr + Vl) / 2 (meter/sec) (Equation 11-5)
ωk = (Vr − Vl) / D (radian/sec) (Equation 11-6)
Finally, using these values, we can obtain the position (x(k+1), y(k+1)) and the orientation (θ(k+1))
of the robot from Equations 11-7 to 11-10, where Δs and Δθ denote the distance traveled and the
change of heading during the interval Te.
x(k+1) = xk + Δs cos(θk + Δθ/2) (Equation 11-8)
y(k+1) = yk + Δs sin(θk + Δθ/2) (Equation 11-9)
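The dead-reckoning equations above can be collected into one odometry update step. The sketch below assumes, as the π/180 factor suggests, that the encoder values are reported in degrees; the function and variable names are illustrative, not from any ROS package.

```python
import math

def dead_reckoning(x, y, theta, e_lc, e_lp, e_rc, e_rp, t_e, r, d):
    """One dead-reckoning step, following Equations 11-1 to 11-9.

    e_lc/e_lp, e_rc/e_rp: current/previous encoder values (degrees)
    t_e: sampling interval (sec); r: wheel radius (m);
    d: wheel-to-wheel distance (m)."""
    w_l = (e_lc - e_lp) / t_e * math.pi / 180.0   # Eq. 11-1: left wheel (rad/s)
    w_r = (e_rc - e_rp) / t_e * math.pi / 180.0   # Eq. 11-2: right wheel (rad/s)
    v_l, v_r = w_l * r, w_r * r                   # Eq. 11-3, 11-4: wheel speeds (m/s)
    v_k = (v_r + v_l) / 2.0                       # Eq. 11-5: linear velocity
    w_k = (v_r - v_l) / d                         # Eq. 11-6: angular velocity
    ds, dtheta = v_k * t_e, w_k * t_e             # distance and heading change
    x += ds * math.cos(theta + dtheta / 2.0)      # Eq. 11-8
    y += ds * math.sin(theta + dtheta / 2.0)      # Eq. 11-9
    return x, y, theta + dtheta

# Straight-line check: both wheels turn 90 degrees during a 1-second interval.
x, y, th = dead_reckoning(0, 0, 0, 90, 0, 90, 0, 1.0, r=0.033, d=0.16)
```

With equal wheel rotations the heading stays constant and the robot simply moves forward by ωrTe along the x-axis.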
In this section, we have briefly summarized SLAM and the components of navigation, but the
field is still vast and not easy to fully grasp. The robot pose measurement and estimation were
explained in the previous section, and the measurement of obstacles such as walls and objects is
described in Chapter 8 Robots, Sensors and Motors. Now, let’s look at SLAM for creating a map,
and at navigation with the generated map.
8 https://en.wikipedia.org/wiki/A*_search_algorithm
9 http://www.cs.cmu.edu/~./motionplanning/lecture/Chap4-Potential-Field_howie.pdf
10 https://en.wikipedia.org/wiki/Particle_filter
11 https://en.wikipedia.org/wiki/Rapidly-exploring_random_tree
12 http://wiki.ros.org/gmapping
13 http://wiki.ros.org/cartographer
14 http://wiki.ros.org/rtabmap
Odometry
The odometry information should be obtainable: the robot should be able to calculate and
estimate its current pose, for example by dead reckoning from the measured travel of the wheels,
by compensating that pose with inertial data, or by estimating the translational and angular
speed with an IMU sensor.
Shape of Robot
Robots with a regular polygonal or circular shape are covered in this section. Robots that are
disproportionately long on one axis, robots too big to pass through a door, bipedal humanoid
robots, multi-joint mobile robots, and flying robots are not considered for SLAM here. In this
chapter, we will use TurtleBot3, the official ROS platform discussed in Chapter 10. The TurtleBot3
in Figure 11-4 satisfies all of the SLAM constraints mentioned above.
In the example in this book, the environment is set up as a labyrinth with a grid structure
whose dimensions can be measured, as shown in Figure 11-5.
roscore
Run ‘roscore’ on the [Remote PC]
$ roscore
Launch Robot
In [TurtleBot], run the ‘turtlebot3_robot.launch’ file to execute the ‘turtlebot3_core’ and
‘turtlebot3_lds’ nodes.
$ roslaunch turtlebot3_bringup turtlebot3_robot.launch
Run SLAM Node
On the [Remote PC], launch the SLAM node.
$ export TURTLEBOT3_MODEL=waffle
$ roslaunch turtlebot3_slam turtlebot3_slam.launch
Run RViz
Let’s run the visualization tool RViz so that you can visually check the results during SLAM.
When you run RViz with the following options, the display plugins are added from the beginning.
$ export TURTLEBOT3_MODEL=waffle
$ rosrun rviz rviz -d `rospack find turtlebot3_slam`/rviz/turtlebot3_slam.rviz
Robot Control
The following command allows the user to control the robot manually to perform the SLAM
operation. It is important to avoid vigorous movements such as changing the speed too quickly
or rotating too fast. When building a map, the robot should scan every corner of the environment
to be measured. It takes some experience to build a clean map, so let’s practice SLAM multiple
times to build up know-how.
Create Map
Now that you have all the preparation done, let’s run the ‘map_saver’ node to create a map file.
The map is drawn based on the robot’s odometry, tf information, and the scan information of the
sensor as the robot moves. These data can be seen in RViz in the previous example. The created
map is saved in the directory in which ‘map_saver’ is running. Unless you specify a file name, it
is stored as a ‘map.pgm’ file and a ‘map.yaml’ file which contains the map information.
$ rosrun map_server map_saver -f ~/map
The ‘-f’ option specifies the folder and file name for the saved map. If ‘~/map’ is used as the
option, ‘map.pgm’ and ‘map.yaml’ will be saved in the user’s home folder (~/).
Use the process above to create your map. The nodes and topics required for mapping can be
obtained by using ‘rqt_graph’ as shown in Figure 11-6. The mapping process is shown in Figure
11-7, and the completed map is shown in Figure 11-8. We can confirm that the above-mentioned
experimental environment is properly drawn in the map.
$ wget https://raw.githubusercontent.com/ROBOTIS-GIT/turtlebot3/master/turtlebot3_slam/bag/
TB3_WAFFLE_SLAM.bag
The rest of the procedure is similar to the SLAM instructions above; however, instead of
recording data with ‘rosbag record’, we replay the saved data with ‘rosbag play’. It will then
behave in the same manner as the actual experiment.
$ roscore
$ export TURTLEBOT3_MODEL=waffle
$ roslaunch turtlebot3_slam turtlebot3_slam.launch
$ export TURTLEBOT3_MODEL=waffle
$ rosrun rviz rviz -d `rospack find turtlebot3_slam`/rviz/turtlebot3_slam.rviz
$ roscd turtlebot3_slam/bag
$ rosbag play ./TB3_WAFFLE_SLAM.bag
The following section describes the source code of the packages you have run in the previous
example and how to configure them.
This section is based on the TurtleBot3 robot platform and its LDS sensor, but SLAM can be
implemented on a customized robot and is not limited to a specific robot platform or sensor. If
you want to create your own robot platform, or build a customized robot based on TurtleBot3,
this section will be helpful.
11.3.1. Map
First of all, there is a lot more to be said about maps, since the map is the output we are looking
for in this section. If we gave a paper map to a robot, do you think the robot could understand it?
Probably not. The robot needs the map digitalized in order to interpret and compute the
information. The definition of a map for robot navigation has been discussed for a long time and
is still being discussed. In particular, recent maps include not only two-dimensional but also
three-dimensional information, and sometimes even segmentation of objects, which is unrelated
to the navigation information itself.
In this section, we will use the two-dimensional Occupancy Grid Map (OGM), which is
commonly used in the ROS community. In the map obtained in the previous section, shown in
Figure 11-9, white is the free area in which the robot can move, black is the occupied area in
which the robot cannot move, and gray is the unknown area.
Each cell in the map is represented by a grayscale value ranging from ‘0’ to ‘255’. This value
is obtained through the posterior probability of Bayes’ theorem, which yields the occupancy
probability of the cell. The occupancy probability ‘occ’ is expressed as ‘occ = (255 - color_avg) /
255.0’, where, for a 24-bit image, ‘color_avg = (grayscale value of one cell / 0xFFFFFF) x 255’. The
closer ‘occ’ is to ‘1’, the higher the probability that the cell is occupied, and the closer it is to ‘0’,
the more likely the cell is free.
In ROS, the actual map is stored in the ‘*.pgm’ (portable graymap) file format, and the ‘*.yaml’
file contains the map information. For example, if we check the map information (map.yaml) we
saved in Section 11.2, the image parameter defines the map file name and the resolution
parameter defines the map resolution in meters/pixel.
image: map.pgm
resolution: 0.050000
origin: [-10.000000, -10.000000, 0.000000]
negate: 0
occupied_thresh: 0.65
free_thresh: 0.196
That is to say, each pixel corresponds to 5 cm. ‘origin’ is the origin of the map, and its values
represent x, y and yaw respectively; the lower left corner of the map is at x = -10 m, y = -10 m.
‘negate’ inverts black and white. The color of each pixel is determined by the occupancy
probability: if the occupancy probability exceeds the occupied threshold (occupied_thresh), the
pixel is drawn as an occupied area in black, and if it is below the free threshold (free_thresh), the
pixel is drawn as a free area in white.
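To make the role of ‘occupied_thresh’ and ‘free_thresh’ concrete, here is a small sketch of how one 8-bit PGM pixel can be classified following the rules above; it is an illustration, not the actual map_server code.

```python
def cell_state(pixel, negate=0, occupied_thresh=0.65, free_thresh=0.196):
    """Classify one 8-bit PGM pixel using the map.yaml thresholds above.

    occ = (255 - pixel) / 255.0, so black (0) gives occ = 1.0 (occupied)
    and white (255) gives occ = 0.0 (free); values in between are unknown."""
    p = pixel if negate else 255 - pixel        # 'negate' swaps black and white
    occ = p / 255.0
    if occ > occupied_thresh:
        return "occupied"
    if occ < free_thresh:
        return "free"
    return "unknown"

assert cell_state(0) == "occupied"     # black pixel: wall
assert cell_state(254) == "free"       # near-white pixel: floor
assert cell_state(205) == "unknown"    # typical gray of unexplored cells
```

Note how the gray value 205 that map_saver writes for unexplored cells falls exactly between the two thresholds.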
Second is the pose value, which represents the pose of the sensor attached to the robot. The
pose value of the sensor therefore depends on the odometry of the robot, and the odometry
information must be provided in order to calculate it.
In Figure 11-11 below, the distance measured with the LDS is called ‘scan’ in ROS, and the
pose (position + orientation) information is expressed relative to a coordinate frame, so it is
called ‘tf’ (transform). As shown in Figure 11-11, we run SLAM based on these two pieces of
information, ‘scan’ and ‘tf’, and create the map we want.
turtlebot3_teleop(Example: turtlebot3_teleop_keyboard)
The ‘turtlebot3_teleop_keyboard’ node receives keyboard input and controls the robot. It sends
the translational and rotational speed commands to the ‘turtlebot3_core’ node.
turtlebot3_core
The ‘turtlebot3_core’ node receives the translational and rotational speed commands and moves
the robot. The node publishes ‘odom’, the measured and estimated pose of the robot, and also
publishes the corresponding relative coordinate transforms in ‘tf’ form, in the order odom →
base_footprint → base_link → base_scan.
turtlebot3_slam_gmapping
The ‘turtlebot3_slam_gmapping’ node creates a map based on the scan information from the
distance measuring sensor and the tf information, which is the pose value of the sensor.
map_saver
The ‘map_saver’ node in the ‘map_server’ package creates a ‘map.pgm’ file and a ‘map.yaml’ file
that is an information file for the map.
If you execute the above command, you can check the relative coordinate transformations (tf)
of the robot and sensor with the ‘tf tree viewer’, as shown in Figure 11-13.
Figure 11-13 Relative Coordinate Transformation Status of Map and Robot Parts
turtlebot3_slam/launch/turtlebot3_slam.launch
<launch>
  <!-- Turtlebot3 -->
  <include file="$(find turtlebot3_bringup)/launch/turtlebot3_remote.launch" />
  <!-- the slam_gmapping node and its parameters are omitted here -->
</launch>
First, let’s look at the ‘turtlebot3_remote.launch’ file. This file loads the user-specified robot
model and executes the ‘robot_state_publisher’ node, which publishes the state of both wheels
and each joint of the robot as TF information.
All of the contents necessary for mapping have now been explained. The following section
deals with the theory of SLAM.
11.4.1. SLAM
SLAM (Simultaneous Localization And Mapping) means to explore and map the unknown
environment while estimating the pose of the robot itself by using the mounted sensors on the
robot. This is the key technology for navigation such as autonomous driving.
Encoders and inertial measurement units (IMU) are typically used for pose estimation. The
encoder calculates the approximate pose of the robot with dead reckoning, which measures the
amount of rotation of the driving wheels. This process comes with a considerable amount of
estimation error, and the inertial information measured by the inertial sensor is used to
compensate for the error of the calculated pose. Depending on the purpose, the pose can even be
estimated without the encoder, using only the inertial sensor.
This estimated pose can be corrected once again with the surrounding environmental
information obtained through the distance sensor or the camera used when creating the map.
This pose estimation methodology includes Kalman filter, Markov localization, Monte Carlo
localization using particle filter, and so on.
Also, a method of recognizing the environment by attaching markers has been proposed. For
example, this method has markers on the ceiling for the camera to distinguish them. Recently,
depth cameras (RealSense, Kinect, Xtion, etc.) have been widely used to extract the distance
information which is as accurate as distance sensors.
Kalman filter
The Kalman filter, which was used in NASA’s Apollo project, was developed by Dr. Rudolf E.
Kalman, who has since become famous for the algorithm. It is a recursive filter that tracks the
state of a linear system in the presence of noise. The filter is based on Bayesian probability: it
assumes a model and uses it to predict the current state from the previous state, and then the
error between the predicted value and the value actually measured by the instrument is used in
an update step that produces a more accurate state estimate. The filter repeats this predict-update
cycle, increasing the accuracy over time. The process is shown in simplified form in Figure 11-14.
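The predict-update cycle can be illustrated with a scalar (one-dimensional) Kalman filter. This is a generic textbook sketch, not code from any ROS package, and the parameter values are arbitrary.

```python
def kalman_step(x, p, z, a=1.0, q=0.01, h=1.0, r_meas=0.5):
    """One predict/update cycle of a scalar (1-D) Kalman filter.

    x, p: previous state estimate and its variance
    z: new measurement; a: state transition; q: process noise variance;
    h: measurement model; r_meas: measurement noise variance."""
    # Prediction: propagate the state and its uncertainty through the model.
    x_pred = a * x
    p_pred = a * p * a + q
    # Update: blend prediction and measurement, weighted by the Kalman gain.
    k = p_pred * h / (h * p_pred * h + r_meas)
    x_new = x_pred + k * (z - h * x_pred)
    p_new = (1.0 - k * h) * p_pred
    return x_new, p_new

# Repeated noisy measurements of a constant value pull the estimate toward it
# while the estimate variance p shrinks.
x, p = 0.0, 1.0
for z in [1.1, 0.9, 1.05, 0.95, 1.0]:
    x, p = kalman_step(x, p, z)
```

For robot pose estimation the state, gain, and covariances become vectors and matrices, but the two-phase structure is identical.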
However, the Kalman filter only applies to linear systems. Most real robots and sensors are
nonlinear, so the EKF (Extended Kalman Filter), a modification of the Kalman filter, is widely
used. In addition, there are many KF variants, such as the UKF (Unscented Kalman Filter), which
improves the accuracy of the EKF, and the Fast Kalman filter, which improves speed; these are
still being researched today. The Kalman filter is also often combined with other algorithms, as
in the Rao-Blackwellized Particle Filter (RBPF), which uses it together with a particle filter.
Particle filter
The Particle Filter is one of the most popular algorithms in object tracking; a typical example is
Monte Carlo localization. The previously described Kalman filter guarantees accuracy only for
linear systems with Gaussian noise, while most problems in the real world are nonlinear.
Because robots and sensors are also nonlinear, particle filters are often used for pose
estimation. Where the Kalman filter is an analytical method that assumes the system is linear
and estimates its parameters, the particle filter is a technique that predicts through simulation,
in a trial-and-error manner. The particle filter gained its name because the estimated values
generated from the probability distribution of the system are represented as particles. This
approach is also called the Sequential Monte Carlo (SMC) method or the Monte Carlo method.
The particle filter, like other pose estimation algorithms, estimates the pose of the object
assuming that the error is included in the incoming information. When using SLAM, the robot’s
odometry value and the measurement values using the distance sensor are used to estimate the
robot’s current pose.
The particle filter goes through the following five steps. Except for the initialization in
step 1, steps 2~5 are performed repeatedly to estimate the robot’s pose. In other words, the
robot’s pose is estimated by updating, based on the measured sensor values, the distribution of
particles that represents the probability of the robot’s pose on the X, Y coordinate plane.
➊ Initialization
Since the robot’s initial pose (position, orientation) is unknown, N particles are randomly
placed within the range where the pose may lie. Each initial particle has a weight of 1/N, so the
weights of all particles sum to 1. N is determined empirically, usually in the hundreds. If the
initial position is known, the particles are placed near the robot.
➋ Prediction
Based on the system model describing the motion of the robot, each particle is moved by the
amount of movement observed from the odometry information, with noise added.
➌ Update
Based on the measured sensor information, the likelihood of each particle is calculated, and the
weight of each particle is updated according to the calculated probability.
➍ Pose estimation
The positions, orientations, and weights of all particles are used to estimate the pose of the
robot, for example as the weighted mean, the median, or the pose of the maximum-weight particle.
➎ Resampling
In this step new particles are generated: particles with low weights are removed, and new
particles are created that inherit the pose information of the highly weighted particles. The
number of particles N must be maintained.
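The five steps above can be sketched in one dimension. This toy example, in which a robot measures its distance to a single beacon at an assumed position, is only meant to show the predict/update/estimate/resample loop, not a real localization node; all names and values are illustrative.

```python
import random, math

def particle_filter_step(particles, weights, control, measurement,
                         sense, motion_noise=0.05, meas_noise=0.2):
    """Steps 2-5 of the particle filter described above, in one dimension.

    particles/weights: current particle positions and normalized weights
    control: commanded displacement; measurement: observed range
    sense(x): predicts the measurement a robot at position x would make."""
    n = len(particles)
    # (2) Prediction: move every particle by the control input plus noise.
    particles = [x + control + random.gauss(0.0, motion_noise) for x in particles]
    # (3) Update: reweight each particle by its measurement likelihood.
    weights = [math.exp(-((measurement - sense(x)) ** 2) / (2 * meas_noise ** 2))
               for x in particles]
    total = sum(weights) or 1e-300
    weights = [w / total for w in weights]
    # (4) Pose estimation: here, the weighted mean of the particles.
    estimate = sum(w * x for w, x in zip(weights, particles))
    # (5) Resampling: draw n new particles in proportion to their weights,
    #     keeping the particle count N constant.
    particles = random.choices(particles, weights=weights, k=n)
    weights = [1.0 / n] * n
    return particles, weights, estimate

random.seed(0)
n = 500
# (1) Initialization: unknown start, so scatter particles uniformly.
particles = [random.uniform(0.0, 10.0) for _ in range(n)]
weights = [1.0 / n] * n
true_pos, beacon = 2.0, 10.0      # hypothetical robot and beacon positions
for _ in range(10):
    true_pos += 0.5               # the robot actually moves 0.5 per step
    z = beacon - true_pos         # ideal range reading to the beacon
    particles, weights, est = particle_filter_step(
        particles, weights, 0.5, z, sense=lambda x: beacon - x)
```

After a few iterations the particle cloud collapses around the true position, which is exactly the behavior of the green arrow cloud seen later in RViz.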
In addition, if the number of samples is sufficient, the particle filter can be more accurate
than pose estimation with the EKF or UKF variants of the Kalman filter; if the number is not
sufficient, however, it may be inaccurate. SLAM based on the Rao-Blackwellized Particle Filter
(RBPF) is the approach used by gmapping, introduced in the reference below.
Particle Filter
To learn more about particle filters, refer to the book ‘Probabilistic Robotics’ by Sebastian Thrun
(Professor at Stanford, Google Fellow, Udacity founder), which is used as a textbook in robotics. I
strongly recommend this book to anyone who wants to study robotics.
http://www.probabilistic-robotics.org/
https://www.udacity.com/course/cs373
This concludes the explanation of SLAM. Instead of describing gmapping itself, we have
described the particle filter on which it is based; please refer to the papers mentioned in the
following reference for details. The next section deals with navigation.
As explained, SLAM is an extensively researched field in robotics. Such information can be
found in the latest academic journals and conference presentations, and many of these studies are
open source, compiled by the OpenSLAM group at OpenSLAM.org, a site well worth visiting.
The gmapping package we used in Section 11.4 is also introduced there, and the ROS community
uses it extensively for SLAM. Two papers have been published about gmapping: one at ICRA 2005
and another in the journal IEEE Transactions on Robotics in 2007.
These papers describe how to reduce the number of particles in order to cut the computation
volume and achieve real-time operation. The main approach is the Rao-Blackwellized particle filter
described above. Refer to the papers for details; a rough description can be found in the discussion
of particle filters in Section 11.4.2.
[1] Grisetti, Giorgio, Cyrill Stachniss, and Wolfram Burgard, Improving grid-based SLAM with Rao-
Blackwellized particle filters by adaptive proposals and selective resampling, Proceedings of the
2005 IEEE International Conference on Robotics and Automation, pp. 2432-2437, 2005.
[2] Grisetti, Giorgio, Cyrill Stachniss, and Wolfram Burgard, Improved techniques for grid mapping with
Rao-Blackwellized particle filters, IEEE Transactions on Robotics, Vol. 23, No. 1, pp. 34-46, 2007.
roscore
Run roscore on the [Remote PC].
$ roscore
Launch Robot
From [TurtleBot], run the ‘turtlebot3_robot.launch’ file for executing the ‘turtlebot3_core’ and
‘turtlebot3_lds’ nodes.
$ roslaunch turtlebot3_bringup turtlebot3_robot.launch
Run Navigation Node
On the [Remote PC], launch the navigation node with the map created in the previous section.
$ export TURTLEBOT3_MODEL=waffle
$ roslaunch turtlebot3_navigation turtlebot3_navigation.launch map_file:=$HOME/map.yaml
Execute RViz
Let’s run RViz, the ROS visualization tool, which lets us designate the goal pose and visually
confirm the results of navigation. When you run RViz with the following options, the required
display plugins are added from the start, which is very convenient.
When you run the above command, you will see the screen shown in Figure 11-15. On the
map shown on the right there are many green arrows: these are the particles of the particle filter
described in the SLAM theory section. As will be explained later, navigation also uses a particle
filter. The robot is located in the middle of the green arrows.
Figure 11-15 Particles visible in RViz (green arrows around the robot)
Figure 11-16 Destination settings (big arrow) and how the robot is moving
The following sections describe the source details and how to set up the packages you have
run previously, divided into practice, application, and theory just as in the SLAM example.
In this section, we explain navigation based on the TurtleBot3 robot platform and the LDS
sensor used with it. By understanding these instructions, you can perform navigation with your
own robot, which is not limited to a specific robot platform or specific sensors. If you want to
create your own robot platform or build a customized robot based on TurtleBot3, this tutorial
will be helpful.
Navigation enables a robot to move from the current pose to the designated goal pose on
the map by using the map, the robot’s encoder, the inertial sensor, and the distance sensor. The
procedure for performing this task is as follows.
Sensing
On the map, the robot updates its odometry information with the encoder and the inertial sensor
(IMU sensor), and measures the distance from the pose of the sensor to the obstacle (wall,
object, furniture, etc.).
Motion planning
Motion planning, also called path planning, creates a trajectory from the current pose to the
target pose specified on the map. The created plan includes global path planning over the whole
map and local path planning for the smaller area around the robot. We will use the ‘move_base’
and ‘nav_core’ path planning packages in ROS, which are based on the Dynamic Window
Approach (DWA) obstacle avoidance algorithm.
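The idea behind DWA can be sketched as follows: sample velocity pairs inside the window reachable under the acceleration limits, forward-simulate each pair, and keep the best-scoring one. This toy version scores only the distance to the goal; the real ‘dwa_local_planner’ also scores obstacle clearance and path alignment. The limit values merely echo typical TurtleBot3 settings, and the function name is illustrative.

```python
import math

def dwa_choose_velocity(v, w, goal, pose, dt=0.5,
                        max_vel=0.18, max_rot=1.8, acc_v=2.0, acc_w=2.0):
    """Pick the best (v, w) pair from the dynamic window (illustrative sketch).

    v, w: current linear/angular velocity; goal: (x, y) target;
    pose: (x, y, theta) current robot pose."""
    x, y, theta = pose
    best, best_cost = (0.0, 0.0), float("inf")
    # Dynamic window: velocities reachable from (v, w) within one cycle dt.
    v_lo, v_hi = max(0.0, v - acc_v * dt), min(max_vel, v + acc_v * dt)
    w_lo, w_hi = max(-max_rot, w - acc_w * dt), min(max_rot, w + acc_w * dt)
    for i in range(11):
        for j in range(11):
            vs = v_lo + (v_hi - v_lo) * i / 10.0
            ws = w_lo + (w_hi - w_lo) * j / 10.0
            # Forward-simulate a constant-velocity arc for dt seconds.
            nx = x + vs * dt * math.cos(theta + ws * dt / 2.0)
            ny = y + vs * dt * math.sin(theta + ws * dt / 2.0)
            # Score the predicted pose; a real planner adds obstacle and
            # heading terms here.
            cost = math.hypot(goal[0] - nx, goal[1] - ny)
            if cost < best_cost:
                best, best_cost = (vs, ws), cost
    return best

# With the goal straight ahead, the planner drives forward at full speed.
v, w = dwa_choose_velocity(0.0, 0.0, goal=(1.0, 0.0), pose=(0.0, 0.0, 0.0))
```

Running this once per control cycle, with the window re-centered on the latest velocity, is the essence of the DWA loop.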
Figure 11-17 Relationship between essential nodes and topics on the navigation packages configuration
$ export TURTLEBOT3_MODEL=waffle
$ roslaunch turtlebot3_navigation turtlebot3_navigation.launch map_file:=$HOME/map.yaml
When you execute the ‘rqt_graph’, which visualizes the node and topic information running
on the ROS, the information can be visually shown as Figure 11-18. As you can see in the diagram,
the information required for the navigation described above is published and subscribed as the
topic names ‘/odom’, ‘/tf’, ‘/scan’, ‘/map’ and ‘/cmd_vel’. The ‘move_base_simple/goal’ is published
when the destination coordinate is specified from the RViz.
/launch/turtlebot3_navigation.launch
/launch/amcl.launch.xml
The ‘amcl.launch.xml’ file contains various parameter settings of Adaptive Monte Carlo
Localization (AMCL) and is used with ‘turtlebot3_navigation.launch’ file.
This configuration file configures the parameter of ‘move_base’ that supervises motion planning.
/param/costmap_common_params_burger.yaml
/param/costmap_common_params_waffle.yaml
/param/global_costmap_params.yaml
/param/local_costmap_params.yaml
Navigation uses the occupancy grid map described in section 11.3.1. Based on this occupancy
grid map, each pixel is calculated as an obstacle, a non-movable area, and a movable area using
the robot’s pose and surrounding information obtained from the sensor. In this calculation, the
concept of costmap is applied. The above files are configuring parameters of the costmap. The
‘costmap_common_params.yaml’ has common parameters where ‘global_costmap_params.
yaml’ file is required for the global area motion planning while ‘local_costmap_params.yaml’ file
is required for the local area motion planning. The ‘costmap_common_params.yaml’ comes
with a suffix of burger or waffle according to the model of the robot and the file contains different
information for each model. However, the TurtleBot3 Waffle Pi model is the same as the
TurtleBot3 Waffle model except the camera. So the TurtleBot3 Waffle Pi model uses the ‘waffle’
suffix to use the TurtleBot3 Waffle model settings.
/param/dwa_local_planner_params.yaml
‘dwa_local_planner’ is the package that ultimately transmits the speed command to the robot,
and this file sets the parameters for it.
base_local_planner_params.yaml
This file contains configuration values for ‘base_local_planner’; however, it is not used because
TurtleBot3 uses ‘dwa_local_planner’ instead. The corresponding parameter of the ‘move_base’
node has been set in advance as follows:
/maps/map.pgm
/maps/map.yaml
Save and use the previously created occupancy grid map in the ‘/maps’ folder.
This file contains the setting information of RViz. This file is used to load Grid, RobotModel, TF,
LaserScan, Map, Global Map, Local Map and AMCL Particles from the RViz display plug-in.
The following ‘turtlebot3_navigation.launch’ file contains details of the robot model, ‘robot_
state_publisher’, map server, AMCL, and ‘move_base’ execution and configuration.
/launch/turtlebot3_navigation.launch
<launch>
<arg name="model" default="$(env TURTLEBOT3_MODEL)" doc="model type [burger, waffle,
waffle_pi]"/>
turtlebot3_bringup/launch/turtlebot3_remote.launch
<launch>
<arg name="model" default="$(env TURTLEBOT3_MODEL)" doc="model type [burger, waffle,
waffle_pi]"/>
Map server
The map_server node loads the map information (map.yaml) and the map (map.pgm) stored in
the ‘turtlebot3_navigation/maps/’ folder. The map is published in the form of a topic by the
‘map_server’ node.
move_base
Set costmap-related parameters necessary for motion planning, and set parameters for ‘dwa_
local_planner’ that passes the moving speed command to the robot, and set parameters for
‘move_base’ that supervises the motion planning. Detailed explanations are available in section
11.6.6.
turtlebot3_navigation/launch/amcl.launch.xml
<launch>
<!-- if true, AMCL receives map topic instead of service call. -->
<arg name="use_map_topic" default="false"/>
<!-- topic name for the sensor values from the distance sensor. -->
<arg name="scan_topic" default="scan"/>
<!-- used as the initial x-coordinate value of the Gaussian distribution in initial pose
estimation.-->
<arg name="initial_pose_x" default="0.0"/>
<!-- used as the initial y-coordinate value of the Gaussian distribution in the initial pose
estimation.-->
<arg name="initial_pose_y" default="0.0"/>
<!-- used as the initial yaw coordinate value of the Gaussian distribution in the initial pose
estimation. -->
<arg name="initial_pose_a" default="0.0"/>
<!-- execute the amcl node by referring to the parameter settings below. -->
<node pkg="amcl" type="amcl" name="amcl">
move_base
This is a parameter setting file of ‘move_base’ that supervises the motion planning.
turtlebot3_navigation/param/move_base_params.yaml
# choosing whether to stop the costmap node when move_base is inactive
shutdown_costmaps: false
# cycle of control iteration (in Hz) that orders the speed command to the robot base
controller_frequency: 3.0
# maximum time (in seconds) that the controller will listen for control information before the
space-clearing operation is performed
controller_patience: 1.0
# repetition cycle of global plan (in Hz)
planner_frequency: 2.0
# maximum amount of time (in seconds) to wait for an available plan before the space-clearing
operation is performed
costmap
Navigation uses an occupancy grid map. Based on this occupancy grid map, each pixel is
calculated as an obstacle, a non-movable area, and a movable area based on the robot’s pose and
surrounding information obtained from the sensor. In this calculation, the concept of costmap
is applied. The parameter of costmap configuration consists of ‘costmap_common_params.
yaml’ file, ‘global_costmap_params.yaml’ file for the global area motion planning, and ‘local_
costmap_params.yaml’ file for the local area motion planning. The ‘costmap_common_params_
burger.yaml’ file is used for burger and ‘costmap_common_params_waffle.yaml’ file is used for
waffle. However, the TurtleBot3 Waffle Pi model is the same as the TurtleBot3 Waffle model
except the camera. So the TurtleBot3 Waffle Pi model uses the ‘waffle’ suffix to use the TurtleBot3
Waffle model settings.
turtlebot3_navigation/param/costmap_common_params_burger.yaml
# Indicate the object as an obstacle when the distance between the robot and obstacle is within
this range.
obstacle_range: 2.5
# sensor value that exceeds this range will be indicated as free space
raytrace_range: 3.5
# external dimension of the robot is provided as polygons in several points
footprint: [[-0.110, -0.090], [-0.110, 0.090], [0.041, 0.090], [0.041, -0.090]]
# radius of the robot. Use the above footprint setting instead of robot_radius.
# robot_radius: 0.105
# radius of the inflation area to prevent collision with obstacles
inflation_radius: 0.15
# scaling variable used in costmap calculation. Calculation formula is as follows.
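The truncated comment above refers to the inflation cost calculation. As documented for the ROS costmap_2d package, the cost of a cell decays exponentially with its distance from the nearest obstacle. The sketch below reproduces that documented formula; the inscribed radius value used here is illustrative.

```python
import math

def inflation_cost(distance, inscribed_radius=0.105, cost_scaling_factor=0.5):
    """Inflation-layer cost for a cell at `distance` meters from an obstacle,
    following the exponential decay documented for ROS costmap_2d:
    cost = (253 - 1) * exp(-cost_scaling_factor * (distance - inscribed_radius)).

    Cells on the obstacle get 254 (lethal); cells within the robot's
    inscribed radius get 253 (certain collision); beyond that the cost
    decays, so a larger cost_scaling_factor shrinks the buffer zone."""
    if distance <= 0.0:
        return 254                      # lethal obstacle cell
    if distance <= inscribed_radius:
        return 253                      # inscribed (collision guaranteed)
    return int((253 - 1) * math.exp(-cost_scaling_factor
                                    * (distance - inscribed_radius)))

assert inflation_cost(0.0) == 254
assert inflation_cost(0.05) == 253
assert inflation_cost(0.5) < 253        # cost decays away from the obstacle
```

This is why lowering ‘cost_scaling_factor’ makes the robot keep a wider margin from walls: the cost decays more slowly with distance.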
Below are the parameter settings for the TurtleBot3 Waffle. Unlike the TurtleBot3 Burger, the
Waffle’s ‘footprint’ and ‘inflation_radius’, which prevent collisions with obstacles, are different.
All other values are the same; for parameter descriptions, refer to the Burger section.
turtlebot3_navigation/param/costmap_common_params_waffle.yaml
obstacle_range: 2.5
raytrace_range: 3.5
footprint: [[-0.205, -0.145], [-0.205, 0.145], [0.077, 0.145], [0.077, -0.145]]
inflation_radius: 0.20
cost_scaling_factor: 0.5
map_type: costmap
transform_tolerance: 0.2
observation_sources: scan
scan: {data_type: LaserScan, topic: scan, marking: true, clearing: true}
turtlebot3_navigation/param/global_costmap_params.yaml
global_costmap:
global_frame: /map # set map frame
robot_base_frame: /base_footprint # set robot's base frame
update_frequency: 2.0 # update frequency
publish_frequency: 0.1 # publish frequency
static_map: true # setting whether or not to use given map
transform_tolerance: 1.0 # transform tolerance time
dwa_local_planner
The package ‘dwa_local_planner’ ultimately publishes the speed command to the robot and this
file sets parameters for it.
turtlebot3_navigation/param/dwa_local_planner_params.yaml
DWAPlannerROS:
# robot parameters
max_vel_x: 0.18 # max velocity for x axis (meter/sec)
min_vel_x: -0.18 # min velocity for x axis (meter/sec)
max_vel_y: 0.0 # Not used. applies to omni directional robots only
min_vel_y: 0.0 # Not used. applies to omni directional robots only
max_trans_vel: 0.18 # max translational velocity(meter/sec)
min_trans_vel: 0.05 # min translational velocity (meter/sec), negative value for reverse
# trans_stopped_vel: 0.01 # translation stop velocity(meter/sec)
max_rot_vel: 1.8 # max rotational velocity(radian/sec)
min_rot_vel: 0.7 # min rotational velocity (radian/sec)
# rot_stopped_vel: 0.01 # rotation stop velocity (radian/sec)
acc_lim_x: 2.0 # limit for x axis acceleration(meter/sec^2)
acc_lim_y: 0.0 # limit for y axis acceleration (meter/sec^2)
acc_lim_theta: 2.0 # theta axis angular acceleration limit (radian/sec^2)
# Debugging
publish_traj_pc: true # debugging setting for the movement trajectory
publish_cost_grid_pc: true # debugging setting for costmap
global_frame_id: odom # ID setting for global frame
map
The previously created occupancy grid map is saved at the ‘/maps’ folder. There are no other
configuration parameters.
/maps/map.pgm
/maps/map.yaml
Details on how to use the navigation package have now been explained. In the next section, we
will discuss the theory of costmaps, Adaptive Monte Carlo Localization (AMCL), and the Dynamic
Window Approach (DWA).
11.7.1. Costmap
The pose of the robot is estimated based on the odometry obtained from the encoder and
inertial sensor (IMU), and the distance between the robot and obstacles is obtained by the
distance sensor mounted on the robot. In addition, the occupancy grid map obtained as a result
of SLAM is loaded as a static map, providing the occupied, free, and unknown areas used for
navigation.
In navigation, costmap calculates obstacle area, possible collision area, and a robot movable
area based on the aforementioned four factors. Depending on the type of navigation, costmap
can be divided into two. One is the ‘global_costmap’, which sets up a path plan for navigating in
the global area of the fixed map. The other is ‘local_costmap’ which is used for path planning
and obstacle avoidance in the limited area around the robot. Although their purposes are
different, both costmaps are represented in the same way.
The costmap is expressed as a value between ‘0’ and ‘255’. The meaning of the value is shown
in Figure 11-19, and to briefly summarize, the value is used to identify whether the robot is
movable or colliding with an obstacle. The calculation of each area depends on the costmap
configuration parameters specified in Section 11.6.
For example, an actual costmap is shown in Figure 11-20. In the middle is the robot model,
and the rectangular box around it corresponds to the outer boundary of the robot: when this
outline touches the wall, the robot itself bumps into the wall. Green represents obstacles
detected by the laser distance sensor. In the grayscale costmap, the darker a cell is, the more
likely it is to be a collision area; the same applies to the color representation, where the pink area
is the actual obstacle and the light blue area marks the cells where a collision occurs if the center
of the robot enters them, with the border drawn in thick red pixels. The colors themselves carry
no special meaning, since the user can modify them in RViz.
11.7.2. AMCL
As mentioned in the particle filter part of section 11.4 SLAM Theory, the Monte Carlo
Localization (MCL) algorithm is widely used in the field of pose estimation. AMCL (Adaptive
Monte Carlo Localization) can be regarded as an improved version of MCL that achieves better
real-time performance by reducing the execution time with a smaller number of samples. So,
let’s first look at basic Monte Carlo localization (MCL).
The ultimate goal of Monte Carlo pose estimation (MCL) is to determine where the robot is
located in a given environment. That is, we must get x, y, and θ of the robot on the map. For this
purpose, MCL calculates the probability that the robot can be located. First, the position and
orientation (x, y, θ) of the robot at time t are denoted by xt, and the distance information obtained
from the distance sensor up to time t is denoted by z0...t = {z0, z1, ..., zt}, and the movement
information obtained from the encoder up to time t is u0...t = {u0, u1, ..., ut}. Then we can calculate
belief (posterior probability using Bayesian update formula) with the following equation.
In the prediction step, the predicted belief bel'( xt ) of the robot at time t is calculated from the
movement model p( xt | xt-1, ut-1 ) of the robot, the belief bel( xt-1 ) at the previous position,
and the movement information ut-1 received from the encoder:

bel'( xt ) = ∫ p( xt | xt-1, ut-1 ) · bel( xt-1 ) dxt-1

The following is the update step. This time, the sensor model p( zt | xt ), the predicted belief
bel'( xt ), and the normalization constant η are used to obtain the more accurate posterior belief
bel( xt ) based on the sensor information:

bel( xt ) = η · p( zt | xt ) · bel'( xt )
Next, the current position is estimated by generating N particles with the particle filter, using
the calculated belief bel( xt ). Please refer to the particle filter description in section 11.4 SLAM
Theory. In MCL, the term “sample” is used instead of particle, and the filter repeats the SIR
(Sampling, Importance weighting, Re-sampling) process.

The first is the sampling process. A new sample set x't(i) is drawn from the previous belief
bel( xt-1 ) using the robot movement model p( xt | xt-1, ut-1 ).

The second is the importance weighting process. For each sample “i”, the weight wt(i) is
obtained from the sensor model, the distance information zt, and the normalization constant η:

wt(i) = η · p( zt | x't(i) )

Finally, in the resampling process, a new set Xt of N samples (particles) is drawn, with
replacement, from the samples x't(i) in proportion to their weights wt(i).
By repeating this SIR process while the particles move, the accuracy of the estimated robot
position increases. For example, as shown in Figure 11-21, we can see the estimate converging
from ‘t1’ to ‘t4’. This description follows ‘Probabilistic Robotics’ by Professor Sebastian Thrun,
which is regarded as the textbook of probability-related fields in robotics. If time permits, I
recommend taking a look at this book.
11.7.3. Dynamic Window Approach (DWA)
The Dynamic Window Approach considers the robot not in x and y-axis coordinates, but in a
velocity search space whose axes are the translational velocity v and the rotational velocity ω,
as shown in Figure 11-22. In this space, the robot has a maximum allowable velocity due to
hardware limitations, and the reachable velocity region is called the Dynamic Window.
Within this dynamic window, an objective function G(v, ω) that considers the heading, velocity,
and possible collisions of the robot is evaluated, and the translational velocity v and rotational
velocity ω that maximize it are selected. If we plot the objective function, we can find the optimal
velocity toward the destination among the various (v, ω) candidates, as shown in Figure 11-23.
This concludes the exercises, applications, and theory of SLAM and navigation. Although
everything was described using the TurtleBot3 mobile robot platform, the same can be applied to
other robots. Feel free to apply it to another robot or develop your own.
Service Robot
Service Core
The service core is a kind of database that manages the order status of customers and the service
execution status of the robots. Upon receiving an order from a customer, it plays the key role of
handling the order and scheduling the robot’s service process. Since each order and its
processing affect the orders of other customers, there must be exactly one service core in the
system.
Service Master
A service master receives the customer’s orders and transfers the order details to the service
core. It also displays the list of items that can be ordered and notifies the customer of the service
status. For the service master to do this, it must be synchronized with the service core’s database.
For example, as in Figure 12-2, I used the following devices to implement a delivery service
robot system.
■■ Service Slave: 3 Intel Joule 570x of TurtleBot3 Carrier (TurtleBot3 Waffle customized for a
delivery robot)
Figure 12-2 Actual image of the delivery service robots and its operating systems.
Figure 12-3 Example of the messages in each area being transmitted by the delivery service robot
Although one pad can supervise multiple robots, to simplify the system structure, one pad was
paired with each robot to carry out the service. The pad used to place an order sends its pad ID
and the selected item to the service core. Based on the pad ID, the service core determines which
service robot to assign the work to and transmits the target site corresponding to the ordered
item so the robot can perform the service.
Based on the path planning algorithm, the service robot moves to the received target site. While
carrying out this mission, the robot transmits its service status, including collisions with
obstacles, path planning failures, arrival at the target, and whether or not the ordered item has
been picked up. The service core processes this information, such as duplicate orders for the
ordered items and the status received from the robot while it is in service, and presents it to the
customer through the pad. All information sent and received during this process uses newly
defined msg or srv files.
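For instance, the custom ServiceStatus message could carry the order-status arrays that the pad code reads. The field names below match the ones used in the package’s source code, but the field types are our assumption, shown only as a sketch of what such a msg file looks like.

```text
# turtlebot3_carrier/msg/ServiceStatus.msg (illustrative sketch; field
# names taken from the source code, field types assumed)
int8[] item_num_chosen_by_pad
int8[] is_item_available
int8[] robot_service_sequence
```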
1 http://wiki.ros.org/roslaunch/XML/group
The source code necessary to build the delivery service robot described here is available at the
address below. Let’s move on to the configuration and source code of each node.
■■ https://github.com/ROBOTIS-GIT/turtlebot3_deliver
pubServiceStatusPadTb3p = nh_.advertise<turtlebot3_carrier::ServiceStatus>("/tb3p/service_status", 1);
pubServiceStatusPadTb3g = nh_.advertise<turtlebot3_carrier::ServiceStatus>("/tb3g/service_status", 1);
pubServiceStatusPadTb3r = nh_.advertise<turtlebot3_carrier::ServiceStatus>("/tb3r/service_status", 1);
subArrivalStatusTb3p = nh_.subscribe("/tb3p/move_base/result", 1, &ServiceCore::cbCheckArrivalStatusTB3P, this);
subArrivalStatusTb3g = nh_.subscribe("/tb3g/move_base/result", 1, &ServiceCore::cbCheckArrivalStatusTB3G, this);
subArrivalStatusTb3r = nh_.subscribe("/tb3r/move_base/result", 1, &ServiceCore::cbCheckArrivalStatusTB3R, this);
ros::Rate loop_rate(5);
while (ros::ok())
{
fnPubServiceStatus();
fnPubPose();
ros::spinOnce();
loop_rate.sleep();
}
}
The fnInitParam() function obtains the target pose (position + orientation) data of the robot
on the map from the given parameter file. In this example, the robot should be able to move to
three positions where customers can place an order and also be able to move to the three
locations where ordered items can be acquired, therefore, it is necessary to store the information
of the six coordinates of the locations on the map. The structure of the fnInitParam() function is
as follows.
nh_.getParam("table_pose_tb3p/position", target_pose_position);
nh_.getParam("table_pose_tb3p/orientation", target_pose_orientation);
poseStampedTable[0].header.frame_id = "map";
poseStampedTable[0].header.stamp = ros::Time::now();
poseStampedTable[0].pose.position.x = target_pose_position[0];
poseStampedTable[0].pose.position.y = target_pose_position[1];
poseStampedTable[0].pose.position.z = target_pose_position[2];
poseStampedTable[0].pose.orientation.x = target_pose_orientation[0];
poseStampedTable[0].pose.orientation.y = target_pose_orientation[1];
poseStampedTable[0].pose.orientation.z = target_pose_orientation[2];
poseStampedTable[0].pose.orientation.w = target_pose_orientation[3];
nh_.getParam("table_pose_tb3g/position", target_pose_position);
nh_.getParam("table_pose_tb3g/orientation", target_pose_orientation);
poseStampedTable[1].header.frame_id = "map";
poseStampedTable[1].header.stamp = ros::Time::now();
poseStampedTable[1].pose.position.x = target_pose_position[0];
poseStampedTable[1].pose.position.y = target_pose_position[1];
poseStampedTable[1].pose.position.z = target_pose_position[2];
poseStampedTable[1].pose.orientation.x = target_pose_orientation[0];
poseStampedTable[1].pose.orientation.y = target_pose_orientation[1];
poseStampedTable[1].pose.orientation.z = target_pose_orientation[2];
poseStampedTable[1].pose.orientation.w = target_pose_orientation[3];
nh_.getParam("table_pose_tb3r/position", target_pose_position);
nh_.getParam("table_pose_tb3r/orientation", target_pose_orientation);
poseStampedTable[2].header.frame_id = "map";
poseStampedTable[2].header.stamp = ros::Time::now();
poseStampedTable[2].pose.position.x = target_pose_position[0];
poseStampedTable[2].pose.position.y = target_pose_position[1];
poseStampedTable[2].pose.position.z = target_pose_position[2];
poseStampedTable[2].pose.orientation.x = target_pose_orientation[0];
poseStampedTable[2].pose.orientation.y = target_pose_orientation[1];
poseStampedTable[2].pose.orientation.z = target_pose_orientation[2];
poseStampedTable[2].pose.orientation.w = target_pose_orientation[3];
nh_.getParam("counter_pose_bread/position", target_pose_position);
nh_.getParam("counter_pose_bread/orientation", target_pose_orientation);
poseStampedCounter[0].header.frame_id = "map";
poseStampedCounter[0].header.stamp = ros::Time::now();
poseStampedCounter[0].pose.position.x = target_pose_position[0];
poseStampedCounter[0].pose.position.y = target_pose_position[1];
poseStampedCounter[0].pose.position.z = target_pose_position[2];
poseStampedCounter[0].pose.orientation.x = target_pose_orientation[0];
poseStampedCounter[0].pose.orientation.y = target_pose_orientation[1];
poseStampedCounter[0].pose.orientation.z = target_pose_orientation[2];
poseStampedCounter[0].pose.orientation.w = target_pose_orientation[3];
nh_.getParam("counter_pose_drink/position", target_pose_position);
nh_.getParam("counter_pose_drink/orientation", target_pose_orientation);
poseStampedCounter[1].header.frame_id = "map";
poseStampedCounter[1].header.stamp = ros::Time::now();
poseStampedCounter[1].pose.position.x = target_pose_position[0];
poseStampedCounter[1].pose.position.y = target_pose_position[1];
poseStampedCounter[1].pose.position.z = target_pose_position[2];
poseStampedCounter[1].pose.orientation.x = target_pose_orientation[0];
poseStampedCounter[1].pose.orientation.y = target_pose_orientation[1];
poseStampedCounter[1].pose.orientation.z = target_pose_orientation[2];
poseStampedCounter[1].pose.orientation.w = target_pose_orientation[3];
nh_.getParam("counter_pose_snack/position", target_pose_position);
nh_.getParam("counter_pose_snack/orientation", target_pose_orientation);
poseStampedCounter[2].header.frame_id = "map";
poseStampedCounter[2].header.stamp = ros::Time::now();
poseStampedCounter[2].pose.position.x = target_pose_position[0];
poseStampedCounter[2].pose.position.y = target_pose_position[1];
poseStampedCounter[2].pose.position.z = target_pose_position[2];
poseStampedCounter[2].pose.orientation.x = target_pose_orientation[0];
poseStampedCounter[2].pose.orientation.y = target_pose_orientation[1];
poseStampedCounter[2].pose.orientation.z = target_pose_orientation[2];
poseStampedCounter[2].pose.orientation.w = target_pose_orientation[3];
}
The parameter values in the ‘target_pose.yaml’ file are the coordinates on the map that the
robot needs in order to perform its services. There are various ways to obtain these coordinates;
the simplest is to use the ‘rostopic echo’ command to read the pose value during navigation.
However, these coordinates change every time a new map is created through SLAM, so avoid
rebuilding the map while performing the actual navigation, and keep the positions of objects
and obstacles unchanged so that you can continue using the same map.
/turtlebot3_carrier/param/target_pose.yaml
table_pose_tb3p:
position: [-0.338746577501, -0.85418510437, 0.0]
orientation: [0.0, 0.0, -0.0663151963596, 0.997798724559]
table_pose_tb3g:
position: [-0.168751597404, -0.19147400558, 0.0]
orientation: [0.0, 0.0, -0.0466624033917, 0.998910716786]
table_pose_tb3r:
position: [-0.251043587923, 0.421476781368, 0.0]
counter_pose_bread:
position: [-3.60783815384, -0.750428497791, 0.0]
orientation: [0.0, 0.0, 0.999335763287, -0.0364421763375]
counter_pose_drink:
position: [-3.48697376251, -0.173366710544, 0.0]
orientation: [0.0, 0.0, 0.998398746904, -0.0565680314445]
counter_pose_snack:
position: [-3.62247490883, 0.39046728611, 0.0]
orientation: [0.0, 0.0, 0.998908838216, -0.0467026009308]
Next, whenever the robot reaches its target position, the fnPubPose() function sets the next
target position according to the step of the service procedure the robot is currently in. When
the robot completes the service, all parameters are reset.
robot_service_sequence[ROBOT_NUMBER] = 2;
}
else if (robot_service_sequence[ROBOT_NUMBER] == 2)
{
pubPoseStampedTb3p.publish(poseStampedCounter[item_num_chosen_by_pad[ROBOT_NUMBER]]);
is_robot_reached_target[ROBOT_NUMBER] = false;
robot_service_sequence[ROBOT_NUMBER] = 3;
}
else if (robot_service_sequence[ROBOT_NUMBER] == 3)
{
fnPublishVoiceFilePath(ROBOT_NUMBER, "~/voice/voice1-3.mp3");
robot_service_sequence[ROBOT_NUMBER] = 4;
}
else if (robot_service_sequence[ROBOT_NUMBER] == 4)
{
pubPoseStampedTb3p.publish(poseStampedTable[ROBOT_NUMBER]);
is_robot_reached_target[ROBOT_NUMBER] = false;
robot_service_sequence[ROBOT_NUMBER] = 5;
}
else if (robot_service_sequence[ROBOT_NUMBER] == 5)
{
fnPublishVoiceFilePath(ROBOT_NUMBER, "~/voice/voice1-4.mp3");
robot_service_sequence[ROBOT_NUMBER] = 0;
item_num_chosen_by_pad[ROBOT_NUMBER] = -1;
}
}
}
… omitted …
The cbReceivePadOrder() function receives the number of the pad used at the time of
ordering and the item number of the ordered item to determine whether the service is available.
If the service is available, the ‘robot_service_sequence’ is set to ‘1’ to start the service. The source
code of this function is as follows.
if (is_item_available[item_number] != 1)
{
ROS_INFO("Chosen item is currently unavailable");
return;
}
if (robot_service_sequence[pad_number] != 0)
{
ROS_INFO("Your TurtleBot is currently on servicing");
return;
}
if (item_num_chosen_by_pad[pad_number] == -1) // accept the order only if no item is chosen yet
{
item_num_chosen_by_pad[pad_number] = item_number;
is_item_available[item_number] = 0;
}
The cbCheckArrivalStatus() function shown below checks the movement state of the subscribed
robot. The ‘else’ block of the code handles the case where the robot encounters an obstacle
during navigation or the algorithm fails to find a path.
str.data = file_path;
if (robot_num == ROBOT_NUMBER_TB3P)
{
pubPlaySoundTb3p.publish(str);
}
else if (robot_num == ROBOT_NUMBER_TB3G)
{
pubPlaySoundTb3g.publish(str);
}
else if (robot_num == ROBOT_NUMBER_TB3R)
{
pubPlaySoundTb3r.publish(str);
}
}
turtlebot3_carrier_pad/ServicePad.java
package org.ros.android.android_tutorial_pubsub;
import org.ros.concurrent.CancellableLoop;
2 http://wiki.ros.org/rosjava
import javax.security.auth.SubjectDomainCombiner;
public ServicePad() {
this.pub_pad_order_topic_name = "/tb3g/pad_order";
this.sub_service_status_topic_name = "/tb3g/service_status";
this.pub_pad_status_topic_name = "/tb3g/pad_status";
}
subscriber.addMessageListener(new MessageListener<turtlebot3_carrier.ServiceStatus>() {
@Override
public void onNewMessage(turtlebot3_carrier.ServiceStatus serviceStatus)
{
item_num_chosen_by_pad = serviceStatus.item_num_chosen_by_pad;
is_item_available = serviceStatus.is_item_available;
robot_service_sequence = serviceStatus.robot_service_sequence;
}
});
connectedNode.executeCancellableLoop(new CancellableLoop() {
protected void setup() {}
if (button_pressed[0])
{
selected_item_num = 0;
str += "Burger was selected";
button_pressed[0] = false;
}
else if (button_pressed[1])
{
selected_item_num = 1;
str += "Coffee was selected";
button_pressed[1] = false;
}
else if (button_pressed[2])
{
selected_item_num = 2;
str += "Waffle was selected";
button_pressed[2] = false;
}
else
{
selected_item_num = -1;
str += "Sorry, selected item is now unavailable. Please choose another item.";
}
if (is_item_available[selected_item_num] != 1)
{
str += ", but chosen item is currently unavailable.";
jump = true;
}
else if (robot_service_sequence[robot_num] != 0)
{
str += ", but your TurtleBot is currently on servicing";
jump = true;
}
else if (item_num_chosen_by_pad[robot_num] != -1)
{
str += ", but your TurtleBot is currently on servicing";
jump = true;
}
padStatus.setData(str);
pub_pad_status.publish(padStatus);
if(!jump)
{
padOrder.pad_number = robot_num;
padOrder.item_number = selected_item_num;
pub_pad_order.publish(padOrder);
}
Thread.sleep(1000L);
});
}
}
In the code, the MainActivity class receives and uses an instance of the ServicePad class. Other
than that, it is much the same as a general MainActivity class, so we will omit a detailed
explanation and only look at the parts related to the delivery service.
In the following source code, the number of the robot controlled by the tablet PC running this
example is designated as robot_num, and the number of the item selected on the tablet PC is
initialized to ‘-1’.
In order to avoid duplicate orders, the tablet PC must be synchronized with the order status
stored in the service core before receiving the customer’s order. The following source code is
declared in the service core in the same format as the array that records the order status.
When you create an instance of the ServicePad class in the MainActivity class, you will
specify the topic name as follows. In this example, each pad and robot pair is specified as a
group namespace, so it is necessary to write the name used in the namespace in front of each
topic name.
public ServicePad() {
this.pub_pad_order_topic_name = "/tb3g/pad_order";
this.sub_service_status_topic_name = "/tb3g/service_status";
this.pub_pad_status_topic_name = "/tb3g/pad_status";
}
From the service core, the user receives service conditions such as the item number of
ordered item, the availability of item selection, and the service status of the robot.
subscriber.addMessageListener(new MessageListener<turtlebot3_carrier.ServiceStatus>() {
@Override
public void onNewMessage(turtlebot3_carrier.ServiceStatus serviceStatus)
{
item_num_chosen_by_pad = serviceStatus.item_num_chosen_by_pad;
is_item_available = serviceStatus.is_item_available;
robot_service_sequence = serviceStatus.robot_service_sequence;
}
});
This portion processes the customer’s orders based on the overall service situation synchronized
with the service core. Likewise, it checks whether or not the order is duplicated and, if the
service is available, publishes the order. Figures 12-8 and 12-9 show examples of a successful
and an unsuccessful order, respectively, when a customer selects an item shown on the pad.
if (button_pressed[0])
{
selected_item_num = 0;
str += "Burger was selected";
button_pressed[0] = false;
}
else if (button_pressed[1])
{
selected_item_num = 1;
str += "Coffee was selected";
button_pressed[1] = false;
}
else if (button_pressed[2])
{
selected_item_num = 2;
str += "Waffle was selected";
button_pressed[2] = false;
}
else
{
selected_item_num = -1;
str += "Sorry, selected item is now unavailable. Please choose another item.";
}
if (is_item_available[selected_item_num] != 1)
{
str += ", but chosen item is currently unavailable.";
jump = true;
}
else if (robot_service_sequence[robot_num] != 0)
{
str += ", but your TurtleBot is currently on servicing";
jump = true;
}
else if (item_num_chosen_by_pad[robot_num] != -1)
{
str += ", but your TurtleBot is currently on servicing";
jump = true;
}
if(!jump)
{
padOrder.pad_number = robot_num;
padOrder.item_number = selected_item_num;
pub_pad_order.publish(padOrder);
}
}
Thread.sleep(1000L);
}
Figure 12-8 Example 1 of menu displayed on a pad (When the order is successfully received)
Figure 12-9 Example 2 of menu displayed on a pad (When unavailable item is selected)
The following describes the sources modified to create the delivery service robot. The poll()
function processes the distance measurements of nearby objects using the LDS (HLS-LFCD2)
laser sensor. The TurtleBot3 Carrier has pillars around the LDS because trays are stacked up for
delivery services. Since the LDS detects these pillars during operation, they would affect the
results of SLAM or navigation. Therefore, when an object is detected at a distance less than a
certain value, the TurtleBot3 Carrier sets the detected value to ‘0’. Later, in the navigation
algorithm, a distance value of ‘0’ is treated as ‘no object’, so the presence of the pillars does not
affect SLAM or navigation.
hld_lfcd_lds_driver/src/hlds_laser_publisher.cpp
void LFCDLaser::poll(sensor_msgs::LaserScan::Ptr scan)
{
…omitted…
3 http://emanual.robotis.com/docs/en/platform/turtlebot3/friends/#turtlebot3-friends-carrier
scan->time_increment = motor_speed/good_sets/1e8;
}
else
{
start_count = 0;
}
}
}
}
The ‘turtlebot3_core’ is the firmware installed on OpenCR, the control board used by TurtleBot3,
and ‘turtlebot3_motor_driver.cpp’ is the source that directly controls the actuators used in
TurtleBot3. The actual service robot moves with objects loaded, so proper control is required for
safe movement. Therefore, we added the profile control of the Dynamixels, which is not included
in the original ‘turtlebot3_motor_driver.cpp’. Here, the value of ADDR_X_PROFILE_ACCELERATION
is 108. For more information about the actuator, please refer to the Dynamixel manual
(http://emanual.robotis.com/).
… omitted …
return true;
}
The file ‘turtlebot3_navigation.launch’ launches the nodes that are executed when using
navigation on the TurtleBot3. As described previously, nodes and ROS topics must be grouped
into a group namespace in order to use the identical ROS package on several robots at the same
time. In the launch source code below, we grouped the nodes under the ‘tb3g’ namespace and
used the remap function to receive messages from other nodes that are not grouped in the same
namespace.
turtlebot3/turtlebot3_navigation/turtlebot3_navigation.launch
<launch>
<group ns="tb3g">
<remap from="/tf" to="/tb3g/tf"/>
<remap from="/tf_static" to="/tb3g/tf_static"/>
… omitted …
</node>
</group>
</launch>
If everything has gone smoothly up to this point, building the system shown in Figure 12-11
will be successful.
The following describes how to install the Android Studio IDE and the packages necessary for
the ROS Java environment. ROS Java refers to the ROS client library running in the Java
language. Let’s first set up the environment needed for using Java. The necessary package is the
Java SE Development Kit (JDK), and you need to specify the location where it is installed. This
book describes how to download JDK 8, but you should adjust this when the JDK version is
updated.
4 https://developer.android.com/studio/index.html
$ mkdir -p ~/android_core
$ wstool init -j4 ~/android_core/src https://raw.github.com/rosjava/rosjava/kinetic/android_core.rosinstall
$ source /opt/ros/kinetic/setup.bash
$ cd ~/android_core
$ catkin_make
Here’s how to install the Android Studio IDE. To avoid confusion, I will use the same names and
locations for the folders and files as described in the ROS Android wiki. The following packages
are add-on packages for installing mksdcard, which allows the Virtual Device in your Android
Studio IDE to use an SD card. If you do not install these packages, the installation of the
mksdcard function fails.
This book installs the Android Studio IDE and SDK in ‘/opt’ folder, which is the installation
folder recommended by ROS Android5. To install to the ‘/opt’ folder, you need to change the user
permissions of ‘/opt’ to be writable.
The Android Studio IDE installation file can be found here: https://developer.android.com/studio/index.html#download
5 http://wiki.ros.org/android
When the extraction is completed, enter the following command to start the IDE installation.
$ /opt/android-studio/bin/studio.sh
When the window shown in Figure 12-13 appears, click ‘Run without import’ to proceed to
‘custom install’.
After that, you should see the window for installing Android SDK like Figure 12-14. Please
note that you need to set the installation location to ‘/opt/android-sdk’. If you do not have the
‘android-sdk’ folder, create it in the location and proceed.
Now, proceed to update the Android SDK through Configure → SDK Manager.
When the source download on the device is successfully completed, the IP of the ROS master
(the computer on which roscore is running) is entered on the terminal as shown in Figure 12-21.
Enter the appropriate IP and click Connect.
We have now looked at a service robot that applies the SLAM and navigation covered in the
previous chapter. As described in this chapter, it is not difficult to build a service robot. I close
this chapter with the hope that this book helps you build a great robot.
Manipulator
As research on human-robot interaction (HRI)3 became active, manipulators started being used
in various fields (media arts4, VR5, etc.) as well as factories, providing a new experience to the
general public. By combining digital actuators and 3D printing technology, manipulators have
become more accessible to the general public, and they are growing into a major player in the
maker and education industries.6 7 8 On the other hand, many worry and fear that the
combination of manipulators and artificial intelligence will take away their jobs.9 10
Manipulators have long been a tool for enriching society and are still helping people in many
different areas.11 12 In the future, if the development of the manipulator pervades our lives
without departing from this essence, the manipulator is expected to become a part of our lives
just like robot vacuum cleaners.
We will introduce the structure of the manipulator and the libraries that ROS provides for
manipulators. OpenManipulator by ROBOTIS is one of the manipulators that support ROS, and
it has the advantage that it can be manufactured easily at a low cost by combining Dynamixel
actuators with 3D printed parts. Using this manipulator, I will explain how to use the Gazebo
3D simulator that works with ROS and MoveIt!, an integrated library for manipulators. Finally,
I will talk about combining the OpenManipulator with the TurtleBot3 Waffle and Waffle Pi, and
how to configure and control the actual platform.
1 https://www.automationworld.com/inside-human-robot-collaboration-trend
2 https://www.kuka.com/en-us/technologies/human-robot-collaboration
3 https://en.wikipedia.org/wiki/Human%E2%80%93robot_interaction
4 https://youtu.be/lX6JcybgDFo
5 http://www.asiae.co.kr/news/view.htm?idxno=2016100416325879220
6 http://www.littlearmrobot.com/
7 https://niryo.com/products/
8 http://www.ufactory.cc/#/en/
9 http://time.com/4742543/robots-jobs-machines-work/
10 http://adage.com/article/digitalnext/5-jobs-robots/308094/
11 https://www.bostonglobe.com/magazine/2015/09/24/this-robot-going-take-your-job/paj3zwznSXMSvQiQ8pdBjK/story.html
12 https://www.automationworld.com/article/abb-unveils-future-human-robot-collaboration-yumi
A manipulator generally has one side fixed, and the fixed part is called the base. The base is the
most rigid part of the manipulator, as the force applied to it is proportional to the length and
speed of the end effector. The base can also be mounted on something that moves, such as a
mobile robot, which complements the degrees of freedom of the manipulator. Starting from the
base, a manipulator is made up of a cascade of links and joints. A link usually has one joint, but
can have more than one. Joints represent axes of rotation and are mostly driven by electric
motors; through the rotation of the motor, the joint produces the movement of the link.
According to their motion, joints can be classified as Revolute, Prismatic, Screw, Cylindrical,
Universal, and Spherical joints. In recent years, joints using hydraulics rather than electric
motors have been introduced to the public, and research to find new types of joints that can
replace electric motors is actively underway.
There is a cascade of links and joints on the base, and at the end there is an end effector. The end
effector is often a gripper, since manipulators are typically intended to pick up and move objects.
As shown in Figure 13-3, grippers differ in size and shape according to their purpose, and many
attempts have been made to develop grippers capable of picking up objects of various shapes.
The method of controlling the manipulator can be classified into Joint Space Control and
Task Space Control.
Joint space control is a method of calculating the coordinates of the end of the manipulator
from the rotation angle of each joint, as shown in Figure 13-4. The coordinates of the end of the
manipulator (X, Y, Z, φ, θ, and ψ) can be obtained through Forward Kinematics.
As shown in Figure 13-5, task space control takes the coordinates of the end of the manipulator
as input and outputs the rotation of each joint, the opposite of joint space control. The pose of an
object in the task space includes its position and orientation. Since we live in a three-dimensional
world, the position of an object can be expressed as X, Y, and Z, and its orientation as φ (roll),
θ (pitch), and ψ (yaw). Let’s use a cup on a table as an example. Even if the position of a cup on
the desk is fixed (assuming that the origin of the cup is at its center), its pose can still be changed
by laying it down or rotating it. In other words, mathematically there are 6 unknowns, so if there
are 6 equations, a unique solution can be found. However, not all manipulators have six degrees
of freedom; it is more efficient to design the degrees of freedom according to the purpose and
environment in which the manipulator is used. The rotation of each joint corresponding to a
desired pose of the end of the manipulator can be obtained using Inverse Kinematics.
Figure 13-6 Various manipulators with ROS support (from left : ABB, ROBOTIS, Kinova)
13 https://www.roscomponents.com/en/
14 http://robots.ros.org/
15 http://wiki.ros.org/abb/
16 http://rosindustrial.org/
17 http://wiki.ros.org/Robots/JACO/
18 http://wiki.ros.org/ROBOTIS-MANIPULATOR-H/
The first is to create a Unified Robot Description Format (URDF) 19 file, written in Extensible Markup
Language (XML), to model the robot. You can describe a simple manipulator or custom-designed
parts in the URDF format and view the model in the ROS visualization tool RViz.
The second is the 3D simulator Gazebo20, which can simulate the actual operating environment.
A Gazebo simulation environment, like a URDF, can easily be created using Simulation
Description Format (SDF) 21 files written in XML. Gazebo also supports ROS-Control22 and plugins23
to control various sensors and robots.
The third is MoveIt! 24, an integrated library and powerful tool for manipulators. It
provides open libraries such as the Kinematics and Dynamics Library (KDL) 25 and the Open
Motion Planning Library (OMPL) 26, which support various manipulator functions such as
collision checking, motion planning, and Pick and Place demos.
Let’s look at how to use the three tools mentioned above and implement them with sample
codes.
13.2.1. OpenManipulator
OpenManipulator is an open source software and open source hardware based manipulator
developed by ROBOTIS. OpenManipulator supports the Dynamixel X series27, and you can make
robots by choosing the actuators of the specifications you require. Also, since it is composed of
the basic frame and the 3D printed frame, it is possible to produce a new type of manipulator
according to your environment or purpose. With these characteristics, we will provide
manipulators with various shapes and functions such as SCARA, Planar, and Delta in addition to
the four-joint manipulator. OpenManipulator supports ROS, OpenCR28, Arduino IDE29 and
Processing30.
19 http://wiki.ros.org/urdf
20 http://gazebosim.org/
21 http://sdformat.org/
22 http://wiki.ros.org/ros_control
23 http://gazebosim.org/tutorials?tut=ros_gzplugins
24 http://moveit.ros.org/
25 http://www.orocos.org/kdl
26 http://ompl.kavrakilab.org/
27 http://en.robotis.com/index/product.php?cate_code=101210
28 http://emanual.robotis.com/docs/en/parts/controller/opencr10/
29 https://www.arduino.cc/en/main/software
30 https://processing.org/
We will look at OpenManipulator Chain source code, URDF, Gazebo and MoveIt!. The
following are the ROS packages required to use the above three tools. Let’s install these packages.
First, create the ‘testbot_description’ package as follows, and then create the urdf folder.
Then use an editor to create the testbot.urdf file and enter the following URDF example.
$ cd ~/catkin_ws/src
$ catkin_create_pkg testbot_description urdf
$ cd testbot_description
$ mkdir urdf
$ cd urdf
$ gedit testbot.urdf
31 https://goo.gl/NsqJMu
32 https://github.com/ROBOTIS-GIT/open_manipulator
testbot_description/urdf/testbot.urdf
<?xml version="1.0"?>
<robot name="testbot">
<material name="black">
<color rgba="0.0 0.0 0.0 1.0"/>
</material>
<material name="orange">
<color rgba="1.0 0.4 0.0 1.0"/>
</material>
<link name="base"/>
<joint name="fixed" type="fixed">
<parent link="base"/>
<child link="link1"/>
</joint>
<link name="link1">
<collision>
<origin xyz="0 0 0.25" rpy="0 0 0"/>
<geometry>
<box size="0.1 0.1 0.5"/>
</geometry>
</collision>
<visual>
<origin xyz="0 0 0.25" rpy="0 0 0"/>
<geometry>
<box size="0.1 0.1 0.5"/>
</geometry>
<material name="black"/>
</visual>
<inertial>
<origin xyz="0 0 0.25" rpy="0 0 0"/>
<mass value="1"/>
<inertia ixx="1.0" ixy="0.0" ixz="0.0" iyy="1.0" iyz="0.0" izz="1.0"/>
</inertial>
</link>
<link name="link2">
<collision>
<origin xyz="0 0 0.25" rpy="0 0 0"/>
<geometry>
<box size="0.1 0.1 0.5"/>
</geometry>
</collision>
<visual>
<origin xyz="0 0 0.25" rpy="0 0 0"/>
<geometry>
<box size="0.1 0.1 0.5"/>
</geometry>
<material name="orange"/>
</visual>
<inertial>
<origin xyz="0 0 0.25" rpy="0 0 0"/>
<mass value="1"/>
<inertia ixx="1.0" ixy="0.0" ixz="0.0" iyy="1.0" iyz="0.0" izz="1.0"/>
</inertial>
</link>
<link name="link3">
<collision>
<origin xyz="0 0 0.5" rpy="0 0 0"/>
<geometry>
<box size="0.1 0.1 1.0"/>
</geometry>
</collision>
<visual>
<origin xyz="0 0 0.5" rpy="0 0 0"/>
<geometry>
<box size="0.1 0.1 1.0"/>
</geometry>
<material name="black"/>
</visual>
<inertial>
<origin xyz="0 0 0.5" rpy="0 0 0"/>
<mass value="1"/>
<inertia ixx="1.0" ixy="0.0" ixz="0.0" iyy="1.0" iyz="0.0" izz="1.0"/>
</inertial>
</link>
<link name="link4">
<collision>
<origin xyz="0 0 0.25" rpy="0 0 0"/>
<geometry>
<box size="0.1 0.1 0.5"/>
</geometry>
</collision>
<visual>
<origin xyz="0 0 0.25" rpy="0 0 0"/>
<geometry>
<box size="0.1 0.1 0.5"/>
</geometry>
<material name="orange"/>
</visual>
<inertial>
<origin xyz="0 0 0.25" rpy="0 0 0"/>
<mass value="1"/>
<inertia ixx="1.0" ixy="0.0" ixz="0.0" iyy="1.0" iyz="0.0" izz="1.0"/>
</inertial>
</link>
</robot>
URDF describes each component of the robot using XML tags. In the URDF format, first
describe the name of the robot, the name and type of the base (URDF assumes that the base is a
fixed link), and the link connected to the base, and then describe each joint
and link. A link describes its name, size, weight, and inertia. A joint describes its
name, type, and the links it connects. The dynamic parameters of the robot, visualization,
and the collision model can also be easily set. A URDF begins with the <robot> tag, and in
general the <link> and <joint> tags appear alternately to define the links
and joints that are the components of the robot. The <transmission> tag is also often included for
interfacing with ROS-Control to establish the relationship between a joint and an actuator.
Let’s take a closer look at the testbot.urdf we created.
The material tag describes information such as the color and texture of a link. In the
following example, we define two materials, black and orange, to distinguish each link.
The color is set with a color tag, whose rgba option takes four numbers between
0.0 and 1.0 corresponding to red, green, and blue. The last number stands for the opacity
(alpha), also 0.0 to 1.0; a value of 1.0 means that the part is fully opaque.
<material name="black">
<color rgba="0.0 0.0 0.0 1.0"/>
</material>
<material name="orange">
<color rgba="1.0 0.4 0.0 1.0"/>
</material>
The first component of the manipulator is the base, which can be represented as a link in
URDF. The base is connected to the first link by a joint; this joint does not move and is
located at the origin (0, 0, 0). Let’s look at the first link for a more detailed description of the
<link> tag.
<link name="link1">
<collision>
<origin xyz="0 0 0.25" rpy="0 0 0"/>
<geometry>
<box size="0.1 0.1 0.5"/>
</geometry>
</collision>
<visual>
<origin xyz="0 0 0.25" rpy="0 0 0"/>
<geometry>
<box size="0.1 0.1 0.5"/>
</geometry>
<material name="black"/>
</visual>
<inertial>
<origin xyz="0 0 0.25" rpy="0 0 0"/>
<mass value="1"/>
<inertia ixx="1.0" ixy="0.0" ixz="0.0" iyy="1.0" iyz="0.0" izz="1.0"/>
</inertial>
</link>
The URDF <link> tag consists of collision, visual, and inertial tags (see Figure 13-7), as in the
link1 example above. The collision tag specifies the geometric information that defines
the interference range of the link. Its origin element specifies the center coordinates of the
interference range, and its geometry element specifies the shape and size of the interference
range centered on that origin. For example, the interference range of a box is given by its width,
depth, and height. In addition to the box type, there are cylinder and sphere types, and each
shape takes different parameters. The visual tag describes the shape that is actually displayed;
its origin and geometry elements work the same way as in the collision tag. You can also reference
CAD files such as STL and DAE here. A CAD model can be used in the collision tag as well, but only
with some physics engines such as ODE or Bullet; DART and Simbody do not support CAD files for
collision. The inertial tag specifies the mass of the link (kg) and its moment of inertia (kg·m²).
This inertia information can be obtained from design software or from actual measurement
and calculation, and is used for dynamics simulation.
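Note that the ixx = iyy = izz = 1.0 values in the listing above are simple placeholders rather than physical values. For a solid box, the diagonal inertia terms have a closed form; the helper below is an illustration I am adding (not part of the book's code) for filling in the <inertia> tag:

```cpp
#include <cassert>
#include <cmath>

// Diagonal inertia terms of a solid box about its center of mass, as
// entered in the URDF <inertia> tag. Illustrative helper, not book code.
// sx, sy, sz: box dimensions in meters; mass in kg; results in kg*m^2.
struct BoxInertia { double ixx, iyy, izz; };

BoxInertia boxInertia(double mass, double sx, double sy, double sz) {
  BoxInertia inertia;
  inertia.ixx = mass / 12.0 * (sy * sy + sz * sz);
  inertia.iyy = mass / 12.0 * (sx * sx + sz * sz);
  inertia.izz = mass / 12.0 * (sx * sx + sy * sy);
  return inertia;  // off-diagonal terms (ixy, ixz, iyz) are zero for a box
}
```

For link1 (a 1 kg box of 0.1 × 0.1 × 0.5 m), this gives ixx = iyy ≈ 0.0217 and izz ≈ 0.00167 kg·m².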
Relative coordinate transformations in URDF can be quite difficult to grasp at first. Each
configuration value is easier to understand once you see its effect, and it helps to observe how
the relative coordinate transformation of each axis is displayed in RViz, as shown in
Figure 13-12. I recommend trying it.
<origin>: Sets the translation and rotation relative to the link’s coordinate system
<geometry>: Specifies the shape of the model: box, cylinder, or sphere.
COLLADA (.dae) and STL (.stl) format design files can also be imported. In the <collision>
tag, you can reduce computation time by specifying a simpler shape
33 http://wiki.ros.org/urdf/XML/link
34 https://en.wikipedia.org/wiki/Moment_of_inertia
Next, let’s look at the joint tag, which connects links to each other. The joint tag describes the
characteristics of the joint as shown in Figure 13-8: specifically, the name of the joint and its type,
which can be revolute, prismatic, continuous, fixed, floating, or planar. The joint tag also describes
the names of the two links that are connected, the location of the joint, and the limits of
the axis motion in terms of rotation and translation. The connected links are designated as the
parent link and the child link; the parent link is usually the link closest to the base link.
The following example shows the joint setting for joint2.
<origin>: Describes the transformation from the parent link coordinate system to the child link coordinate system
<limit>: Sets the joint limits such as velocity, effort, and range of motion (applies only to revolute or prismatic joints)
35 http://wiki.ros.org/urdf/XML/joint
When you finish creating the model, let’s examine each link and joint to see if they are
logically correct. In ROS, the ‘check_urdf’ command checks the URDF for syntax errors and
shows the connection relationships of each link, as in the following example. If the file is
syntactically and logically correct, we can confirm that links 1, 2, 3, and 4 are connected as
follows.
$ check_urdf testbot.urdf
robot name is: testbot
---------- Successfully Parsed XML ---------------
root Link: base has 1 child(ren)
    child(1):  link1
        child(1):  link2
            child(1):  link3
                child(1):  link4
$ urdf_to_graphiz testbot.urdf
Created file testbot.gv
Created file testbot.pdf
Using ‘check_urdf’ and ‘urdf_to_graphiz’ is the fastest way to check the model’s link
relationships. Finally, let’s check the robot model using RViz. To do this, go to the ‘testbot_description’
package folder and create a ‘testbot.launch’ file as shown in the following example.
testbot_description/launch/testbot.launch
<launch>
<arg name="model" default="$(find testbot_description)/urdf/testbot.urdf" />
<arg name="gui" default="True" />
<param name="robot_description" textfile="$(arg model)" />
<param name="use_gui" value="$(arg gui)"/>
<node pkg="joint_state_publisher" type="joint_state_publisher" name="joint_state_publisher"/>
<node pkg="robot_state_publisher" type="state_publisher" name="robot_state_publisher"/>
</launch>
The launch file consists of parameters containing the URDF, a ‘joint_state_publisher’36 node, and
a ‘robot_state_publisher’37 node. The ‘joint_state_publisher’ node publishes the joint states of the
robot described by the URDF as a ‘sensor_msgs/JointState’ message and provides a GUI tool for
commanding the joints. The ‘robot_state_publisher’ node publishes the result of the
forward kinematics, computed from the robot information set in the URDF and the ‘sensor_msgs/JointState’
topic, as a tf38 message (see Figure 13-10).
Figure 13-10 Topics in the joint_state_publisher node and the robot_state_publisher node
36 http://wiki.ros.org/urdf/XML/joint
37 http://wiki.ros.org/robot_state_publisher
38 http://wiki.ros.org/tf
When the launch file is executed, the GUI of the ‘joint_state_publisher’ node is executed as
shown in Figure 13-11. Here you can control the joint values of joints 1, 2 and 3. If you run RViz,
select ‘base’ on Fixed Frame options and click the [Add] button at the lower left to add the
‘RobotModel’ to see the shape of each joint and link in RViz as shown in Figure 13-12. If you add
‘TF’ display and modify the ‘Alpha’ value of robot model to about 0.3, you can check the shape of
each link and the relationship between joints as shown in the lower figure of Figure 13-12.
If you adjust the GUI bar of the ‘joint_state_publisher’ node, you can see that the virtual robot
on RViz is controlled as shown in Figure 13-13. The relevant source code can be found in the
GitHub repository:
■■ https://github.com/ROBOTIS-GIT/ros_tutorials/tree/master/testbot_description
We have created the 3-axis manipulator in the URDF format and confirmed it in
RViz. Based on this, let’s look at the URDF of the OpenManipulator Chain, which consists of four
joints and a linear gripper. First, download the source code for OpenManipulator and
TurtleBot3 from GitHub.
$ cd ~/catkin_ws/src
$ git clone https://github.com/ROBOTIS-GIT/open_manipulator.git
$ cd ~/catkin_ws && catkin_make
$ cd ~/catkin_ws/src/
$ git clone https://github.com/ROBOTIS-GIT/turtlebot3.git
$ git clone https://github.com/ROBOTIS-GIT/turtlebot3_msgs.git
$ git clone https://github.com/ROBOTIS-GIT/turtlebot3_simulations.git
$ cd ~/catkin_ws && catkin_make
$ roscd open_manipulator_description/urdf
$ ls
materials.xacro → Material info
open_manipulator_chain.xacro → Manipulator modeling
open_manipulator_chain.gazebo.xacro → Manipulator Gazebo modeling
$ roscd open_manipulator_description/launch
$ ls
open_manipulator.rviz → RViz configuration file
open_manipulator_chain_ctrl.launch → File to execute manipulator state info Publisher node
open_manipulator_chain_rviz.launch → File to execute manipulator modeling info visualization node
$ roscd open_manipulator_description/urdf
$ gedit materials.xacro
open_manipulator_description/urdf/materials.xacro
<?xml version="1.0"?>
<robot>
<material name="white">
<color rgba="1.0 1.0 1.0 1.0"/>
</material>
<material name="red">
<color rgba="0.8 0.0 0.0 1.0"/>
</material>
<material name="blue">
<color rgba="0.0 0.0 0.8 1.0"/>
</material>
<material name="green">
<color rgba="0.0 0.8 0.0 1.0"/>
</material>
<material name="grey">
<color rgba="0.5 0.5 0.5 1.0"/>
</material>
<material name="orange">
<color rgba="${255/255} ${108/255} ${10/255} 1.0"/>
</material>
<material name="brown">
<color rgba="${222/255} ${207/255} ${195/255} 1.0"/>
</material>
</robot>
XML Macro39, abbreviated as xacro, is a macro language that allows you to reuse XML code.
I recommend creating macros for repeatedly used code. The ‘materials.xacro’ file
specifies the colors required for visualizing the manipulator.
39 http://wiki.ros.org/xacro
$ roscd open_manipulator_description/urdf
$ gedit open_manipulator_chain.xacro
open_manipulator_description/urdf/open_manipulator_chain.xacro
<!-- some parameters -->
<xacro:property name="pi" value="3.141592654" />
URDF contains many repetitive passages in order to represent the connected structure of links
and joints, which makes modification time-consuming. Using xacro greatly reduces this work.
For example, you can manage the code efficiently by defining a property for the value of π
as above, or by keeping the material information and the Gazebo configuration in separate
files, as done earlier, and including them in the file where they are actually used.
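As an illustration (a sketch I am adding, not part of the OpenManipulator sources), a repeatedly used box link can be wrapped in a xacro macro and then instantiated with one line per link:

```xml
<?xml version="1.0"?>
<robot name="macro_example" xmlns:xacro="http://www.ros.org/wiki/xacro">
  <xacro:property name="pi" value="3.141592654"/>

  <!-- hypothetical macro: a 0.1 x 0.1 x ${length} box link -->
  <xacro:macro name="box_link" params="name length material">
    <link name="${name}">
      <visual>
        <origin xyz="0 0 ${length/2}" rpy="0 0 0"/>
        <geometry><box size="0.1 0.1 ${length}"/></geometry>
        <material name="${material}"/>
      </visual>
    </link>
  </xacro:macro>

  <xacro:box_link name="link1" length="0.5" material="grey"/>
  <xacro:box_link name="link2" length="0.5" material="orange"/>
</robot>
```

Running this file through the xacro processor expands each xacro:box_link line into a full link definition.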
We previously covered the <link> and <joint> tags when creating the URDF for the
3-axis manipulator. The OpenManipulator Chain additionally uses the <transmission> tag to work with
ROS-Control. Let’s take a look at the transmission tag.
open_manipulator_description/urdf/open_manipulator_chain.xacro
<!-- Transmission 1 -->
<transmission name="tran1">
<type>transmission_interface/SimpleTransmission</type>
<joint name="joint1">
<hardwareInterface>PositionJointInterface</hardwareInterface>
</joint>
<actuator name="motor1">
<hardwareInterface>PositionJointInterface</hardwareInterface>
<mechanicalReduction>1</mechanicalReduction>
</actuator>
</transmission>
<transmission> tag
The OpenManipulator Chain consists of four joints (motors) and five links, and the linear gripper
consists of two links and one joint (motor). Other than the linear gripper being prismatic, the
rest of the description is the same as before, so please check the URDF
file.
Run the launch file to visualize the completed URDF file in RViz and move the joint using the
‘joint_state_publisher’ GUI as shown in Figure 13-14.
The URDF created in the previous section was designed for visualization using RViz. Let’s
add a few tags to use it in the Gazebo simulation environment. The tags for the Gazebo simulation
are stored in the ‘open_manipulator_chain.gazebo.xacro’ file. Let’s take a look.
open_manipulator_description/urdf/open_manipulator_chain.gazebo.xacro
<!-- Link1 -->
<gazebo reference="link1">
<mu1>0.2</mu1>
<mu2>0.2</mu2>
<material>Gazebo/Grey</material>
</gazebo>
Color and inertia information are essential when setting up a link for use in Gazebo. Since
the inertia information is already included in the URDF file created earlier, only the color needs to be
configured. In addition, gravity, damping, and friction can be set for the Open Dynamics
Engine (ODE)40 41, a physics engine supported by Gazebo. In the file above, only the coefficients of
friction are set as an example. Parameters for joint information also exist but are not covered
here.
<gazebo> tag
open_manipulator_description/urdf/open_manipulator_chain.gazebo.xacro
<!-- ros_control plugin -->
<gazebo>
<plugin name="gazebo_ros_control" filename="libgazebo_ros_control.so">
<robotNamespace>/open_manipulator_chain</robotNamespace>
<robotSimType>gazebo_ros_control/DefaultRobotHWSim</robotSimType>
</plugin>
</gazebo>
40 http://gazebosim.org/tutorials?tut=ros_urdf&cat=connect_ros
41 http://www.ode.org/
<gazebo> tag
Settings like the above are repeated for each link, so refer to the full source code shown below.
open_manipulator_description/urdf/open_manipulator_chain.gazebo.xacro
<?xml version="1.0"?>
<robot>
42 http://gazebosim.org/tutorials?tut=ros_gzplugins&cat=connect_ros
</robot>
$ roscd open_manipulator_gazebo/launch
$ ls
open_manipulator_gazebo.launch → File to launch Gazebo
position_controller.launch → File to launch ROS-CONTROL
$ roscd open_manipulator_gazebo/launch
$ gedit open_manipulator_gazebo.launch
open_manipulator_gazebo/launch/open_manipulator_gazebo.launch
<?xml version="1.0" ?>
<launch>
<!-- These are the arguments you can pass this launch file, for example paused:=true -->
<arg name="paused" default="false"/>
<arg name="use_sim_time" default="true"/>
<arg name="gui" default="true"/>
<arg name="headless" default="false"/>
<arg name="debug" default="false"/>
<!-- We resume the logic in empty_world.launch, changing only the name of the world to be
launched -->
<include file="$(find gazebo_ros)/launch/empty_world.launch">
<arg name="world_name" value="$(find open_manipulator_gazebo)/world/empty.world"/>
<arg name="debug" value="$(arg debug)" />
</include>
<!-- Load the URDF into the ROS Parameter Server -->
<param name="robot_description"
command="$(find xacro)/xacro.py '$(find open_manipulator_description)/urdf/open_manipulator_ch
ain.xacro'"/>
<!-- Run a python script to the send a service call to gazebo_ros to spawn a URDF robot -->
<node name="urdf_spawner" pkg="gazebo_ros" type="spawn_model" respawn="false" output="screen"
args="-urdf -model open_manipulator_chain -z 0.0 -param robot_description"/>
The above launch file includes the ‘empty_world.launch’ file, the ‘spawn_model’ node, and
the ‘position_controller.launch’ file. The ‘empty_world.launch’ file contains the nodes that execute
Gazebo, so it is where the simulation environment, GUI, and time settings are made. The Gazebo
simulation environment supports files created in the SDF format. The ‘spawn_model’ node is
responsible for spawning the robot based on the URDF, and ‘position_controller.launch’ is
responsible for setting up and running ROS-Control.
Now, if you run Gazebo by typing the following command into the terminal, you can see the
OpenManipulator Chain in the Gazebo simulation space, as shown in Figure 13-15.
Next, open a new terminal window and check the topic list.
$ rostopic list
/clock
/gazebo/link_states
/gazebo/model_states
/gazebo/parameter_descriptions
/gazebo/parameter_updates
/gazebo/set_link_state
/gazebo/set_model_state
/joint_states
/open_manipulator_chain/grip_joint_position/command
/open_manipulator_chain/grip_joint_sub_position/command
/open_manipulator_chain/joint1_position/command
Looking at the list of topics, there are topics in the ‘/gazebo’ namespace and topics in the
‘/open_manipulator_chain’ namespace. We can use ROS-Control to check and control the state of
the robot in Gazebo using the topics in the ‘/open_manipulator_chain’ namespace. Let’s move the
robot using the following command.
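For example, using the ‘/open_manipulator_chain/joint1_position/command’ topic from the list above, a command of the following form moves the first joint (the 0.5 rad value is arbitrary):

```shell
# Publish a position command (std_msgs/Float64, in radians) to joint1.
# Topic name taken from the rostopic list above; the value is arbitrary.
rostopic pub /open_manipulator_chain/joint1_position/command std_msgs/Float64 "data: 0.5" --once
```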
13.3. MoveIt!
MoveIt!43 is an integrated library for manipulators that provides a variety of functions including
fast inverse kinematics analysis for motion planning, advanced algorithms for manipulation,
robot hand control, dynamics, controllers, and motion planning. It is also easy to use without
advanced knowledge of the manipulator because the GUI is offered to assist with various settings
needed to use MoveIt!. This is a tool that many ROS users love because it allows visual feedback
using RViz. Let’s briefly review the structure of MoveIt!, and then create a MoveIt! package to
control the OpenManipulator chain.
13.3.1. move_group
43 http://moveit.ros.org/
The ‘move_group’ node receives information about the robot from the URDF, the Semantic Robot
Description Format (SRDF) 44, and the MoveIt! configuration. The URDF you have already created
will be used, whereas the SRDF and MoveIt! configuration will be created through the Setup Assistant45
provided by MoveIt!.
The ‘move_group’ node provides the state and control of the robot and its environment
through ROS topics and actions. The joint state uses the ‘sensor_msgs/JointState’ message,
transformations use the tf library, and the controller uses the ‘FollowJointTrajectoryAction’
interface to inform the user about the robot status. In addition, the user is provided with
information on the environment in which the robot is operating and the status of the robot through
the ‘planning scene’.
The ‘move_group’ node provides a plugin mechanism for extensibility, offering the opportunity
to apply various functions (control, path generation, dynamics, etc.) to the user’s robot through
open source libraries. The plugins built into MoveIt! are libraries that have already been
proven by many users, and a number of recently developed open source libraries will become
available as well. The Open Motion Planning Library (OMPL) 46, the Kinematics and Dynamics
Library (KDL)47, and the Flexible Collision Library (FCL) 48 fall into this category.
Type the following command into the terminal and run MoveIt! Setup Assistant.
44 http://wiki.ros.org/srdf
45 http://docs.ros.org/kinetic/api/moveit_tutorials/html/doc/setup_assistant/setup_assistant_tutorial.html
46 http://ompl.kavrakilab.org/
47 http://www.orocos.org/kdl
48 http://gamma.cs.unc.edu/FCL/fcl_docs/webpage/generated/index.html
Figure 13-18 shows the first page you will see when you run the ‘MoveIt! Setup Assistant’. On this
screen, you can see the representative ROS character, the turtle, on the right, and on the left you
can choose whether to create a new package or modify an existing one. Since we
need to create a new package, let’s click on the [Create New MoveIt Configuration Package]
button.
If you have successfully loaded the file, go to the ‘Self-Collision’ page. This page allows you to
define the sampling density needed to build the ‘Self-Collision Matrix’ and, if necessary, the user
can determine the range of collisions between the links that make up the robot, as shown in
Figure 13-20 below. The higher the sampling density, the more computation is required to
prevent collision between links in various poses of the robot. Set the desired sampling density
and click the [Generate Collision Matrix] button. The default value is set to ‘10,000’.
The ‘Virtual Joints’ page provides a virtual joint between the base of the manipulator and the
reference coordinate system. For example, if an ‘OpenManipulator Chain’ is mounted to the
TurtleBot3 Waffle or Waffle Pi, the degree of freedom of it can be provided to the OpenManipulator
Chain through a Virtual Joint. Since the base is fixed, you do not need to set up a virtual joint as
shown in Figure 13-21.
Figure 13-23 Create new group page on the Planning Groups page
In this step, you can specify the group name and select the desired kinematics solver plugin.
Set the group name to ‘arm’ and select the kinematics plugin you want to use. Since the
manipulator will be grouped by joints, click the [Add Joint] button. When the
window changes as shown in Figure 13-24, select joints 1~4 and click the [Save] button to create
the group as shown in Figure 13-25.
Figure 13-26 Arm group and gripper groups created on the Planning Groups page
Figure 13-30 End Effectors settings window on the End Effectors page
Write the ‘End Effector Name’ as shown in Figure 13-30 and select ‘gripper’ in the group
created on the ‘Planning Groups’ page. When you check URDF, the gripper has the fifth link as
its parent link.
The ‘Passive Joints’ page allows you to specify joints that are excluded from motion planning.
In the OpenManipulator Chain, there is no passive joint, so let’s move on without making any
changes, as shown in Figure 13-31.
Once all the settings are completed, you can finish the ‘Configuration Files’ page. Click the
[Browse] button at the top of Figure 13-33 to create the ‘open_manipulator_moveit_example’
folder in the ‘open_manipulator’ folder and click the [Generate Package] button in the lower
right corner to create a config folder and a launch folder containing executable files for MoveIt!
Configuration.
$ cd ~/catkin_ws/src/open_manipulator/open_manipulator_moveit_example
$ ls
Config → YAML and SRDF files for MoveIt! Configuration
launch → Launch file
.setup_assistant → Package information generated by the setup assistant
CMakeLists.txt → CMake build system input file
package.xml → Defining package properties
When you run the demo, you can see the OpenManipulator Chain in the RViz window
as shown in Figure 13-34. On the Context page of the MotionPlanning window located
in the lower left corner, you can select one of the motion planning algorithms49 provided by OMPL.
Select ‘RRTConnectkConfigDefault’ and go to the Planning page.
Originally, the coordinates of the end point of the manipulator are represented by the position
values X, Y, Z and the rotation values φ (Roll), θ (Pitch), and ψ (Yaw). However, since the OpenManipulator
Chain has only four joints, the end point has only the degrees of freedom of the X, Z, and
Pitch axes (the remaining degree of freedom is the Yaw rotation about the first joint).
Keep this in mind as you move the end point to the desired coordinates.
49 http://ompl.kavrakilab.org/planners.html
If you have moved the end point to the desired location, click the [Plan & Execute] button on the
Planning page to check the movement.
In this example, you can specify the target pose of the manipulator by inputting the position and
orientation on RViz. In addition to this, you can also use the method of specifying only the position by
moving the green sphere marker (Interactive marker). To do this, write ‘position_only_ik: true’ at the
bottom of the kinematics.yaml file in the config folder.
$ roscd open_manipulator_moveit/config/
$ gedit kinematics.yaml
arm:
$ roscd open_manipulator_moveit
$ ls
Config → YAML and SRDF files for the move_group setup
include → Trajectory filter header file
launch → Launch file
src → Trajectory filter source code file
.setup_assistant → Package information produced by the setup assistant
CMakeLists.txt → CMake build system input file
package.xml → Defining package properties
planning_request_adapters_plugin_description.xml → Plugin set-up file
You can see more files here than were created with the Setup Assistant above. Rapidly-exploring
Random Tree (RRT) 50, a well-known path planning algorithm, randomly samples the
space to find a feasible path from the current position to the target position, and returns the
result. Each returned set of joint values represents a pose the robot must pass through on the way
to the target. However, in order to move from the n-th pose to the (n+1)-th pose, joint values
sampled at a smaller time step are needed. The ‘industrial_trajectory_filters’51 package supported by
ROS-Industrial is used for this purpose.
$ cd ~/catkin_ws/src/industrial_core/industrial_trajectory_filters/
50 https://en.wikipedia.org/wiki/Rapidly-exploring_random_tree
51 http://wiki.ros.org/industrial_trajectory_filters
$ cd ~/catkin_ws/src/open_manipulator/open_manipulator_moveit_example/config
$ gedit smoothing_filter_params.yaml
smoothing_filter_name: /move_group/smoothing_5_coef
smoothing_5_coef:
- 0.25
- 0.50
- 1.00
- 0.50
- 0.25
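These coefficients define a normalized 5-tap weighted moving average applied across neighboring trajectory points. The sketch below illustrates the idea only; it is not the actual industrial_trajectory_filters implementation:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Apply a normalized 5-tap smoothing window (the coefficients from
// smoothing_filter_params.yaml) to a 1-D joint trajectory. Endpoints are
// kept fixed so the start and goal positions are not altered.
// Illustrative sketch, not the real industrial_trajectory_filters code.
std::vector<double> smooth(const std::vector<double> &traj) {
  const double coef[5] = {0.25, 0.50, 1.00, 0.50, 0.25};
  const double norm = 0.25 + 0.50 + 1.00 + 0.50 + 0.25;  // 2.5
  std::vector<double> out = traj;
  for (std::size_t i = 2; i + 2 < traj.size(); ++i) {
    double acc = 0.0;
    for (int j = -2; j <= 2; ++j) acc += coef[j + 2] * traj[i + j];
    out[i] = acc / norm;
  }
  return out;
}
```

A constant trajectory passes through unchanged, while an isolated spike is pulled toward its neighbors, which is exactly the smoothing effect wanted between sampled waypoints.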
Open ‘ompl_planning_pipeline.launch.xml’ in the launch folder and add the following two
filters to the ‘planning_adapters’ argument.
$ cd ~/catkin_ws/src/open_manipulator/open_manipulator_moveit_example/launch
$ gedit ompl_planning_pipeline.launch.xml
industrial_trajectory_filters/UniformSampleFilter
industrial_trajectory_filters/AddSmoothingFilter
The contents of the launch file after inserting the lines above are shown in the
following example.
<launch>
<!-- OMPL Plugin for MoveIt! -->
<arg name="planning_plugin" value="ompl_interface/OMPLPlanner" />
<!-- The request adapters (plugins) used when planning with OMPL.
ORDER MATTERS -->
<arg name="planning_adapters" value="
industrial_trajectory_filters/UniformSampleFilter
industrial_trajectory_filters/AddSmoothingFilter" />
</launch>
Finally, launch the demo with the following command and see how it differs from the previous demonstration.
In the previous section, we controlled the robot in the Gazebo simulator using message
communication. Let’s look at the ‘open_manipulator_position_ctrl’ package, which contains the source
code for this.
open_manipulator_position_ctrl/src/position_controller.cpp
bool PositionController::initStatePublisher(bool using_gazebo)
{
// ROS Publisher
if (using_gazebo)
{
ROS_WARN("SET Gazebo Simulation Mode");
for (std::map<std::string, uint8_t>::iterator state_iter = joint_id_.begin();
state_iter != joint_id_.end(); state_iter++)
{
std::string joint_name = state_iter->first;
gazebo_goal_joint_position_pub_[joint_id_[joint_name]-1] =
nh_.advertise<std_msgs::Float64>("/" + robot_name_ + "/" + joint_name + "_position/command", 10);
}
If you select the desired motion planning library in the RViz window, define the end
point coordinates, and click the [Plan and Execute] button as shown in Figure 13-35, you can see
the manipulator in the Gazebo simulator environment and the manipulator in the RViz
window move together, as shown in Figure 13-36.
The ‘position_controller’ node is also responsible for controlling the linear gripper through
message communication with ‘move_group’.
open_manipulator_position_ctrl/src/position_controller.cpp
void PositionController::gripperPositionMsgCallback(const std_msgs::String::ConstPtr &msg)
{
if (msg->data == "grip_on")
{
gripOn();
}
else if (msg->data == "grip_off")
{
gripOff();
}
else
{
ROS_ERROR("If you want to grip or release something, publish 'grip_on' or 'grip_off'");
}
}
To move the linear gripper, simply publish the topic as follows. You can see how the gripper
moves in Figure 13-38. To release the gripper, use ‘grip_off’ in the command instead.
52 https://goo.gl/NsqJMu
To use Dynamixel on ROS, you need the ‘dynamixel_sdk’ package53. ‘dynamixel_sdk’ is a ROS
package included in the DYNAMIXEL SDK provided by ROBOTIS, which makes it easier to
control Dynamixels by providing control functions for packet communication.
53 http://wiki.ros.org/dynamixel_sdk
Figure 13-41 dynamixel-workbench, one of the official ROS packages provided by ROBOTIS
Set the baudrate of all Dynamixels to 1 Mbps (1,000,000 bps), and set the operating mode to
position control mode. Assign IDs 1 through 5 to the Dynamixels, and start assembling by
referring to the OpenManipulator Wiki and the published hardware (Onshape) information. Once
the assembly is completed, use U2D2 to communicate with the OpenManipulator Chain: it
converts the TTL or RS-485 communication method to USB, so connect it to the main computer. For
54 http://wiki.ros.org/dynamixel_workbench
U2D2 is the latest version of USB2DYNAMIXEL and has connectors compatible with the Dynamixel X
series. Also, unlike the existing USB2DYNAMIXEL, it uses a micro USB connector and its size has
been significantly reduced. U2D2 supports RS-485, TTL and an additional UART
When the connection is completed, enter the following command in the terminal window.
Here, ‘chmod’ sets the permission to use the device. The following example assumes that
U2D2 is recognized as ttyUSB0.
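As a sketch, the permission command would look like the following; /dev/ttyUSB0 is an assumption here, so check the actual device name first.

```shell
# Grant all users read/write access to the U2D2 serial device.
# /dev/ttyUSB0 is an assumption -- check `ls /dev/ttyUSB*` for the real name.
DEVICE=${DEVICE:-/dev/ttyUSB0}
if [ -e "$DEVICE" ]; then
    sudo chmod a+rw "$DEVICE"
else
    echo "device $DEVICE not found"
fi
```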
When execution is completed, torque is applied to each Dynamixel. Let’s check the topic list.
$ rostopic list
/robotis/dynamixel/goal_states
/robotis/dynamixel/present_states
/rosout
/rosout_agg
As mentioned above, the current position of each Dynamixel can be monitored through topic
message communication, and each Dynamixel can be moved to a desired angle value.
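To make ‘a desired angle value’ concrete: in position control mode, an X-series Dynamixel takes a goal position in ticks, with 4096 ticks per revolution and 2048 at the center. The helper below is a sketch; the resolution figures assume the X series, so check the e-Manual for other models.

```python
def deg_to_dxl_position(deg):
    """Convert a joint angle in degrees to an X-series goal-position value.

    Assumes 4096 ticks per 360 degrees with 2048 as the zero position;
    other Dynamixel models use different resolutions.
    """
    return 2048 + int(round(deg * 4096 / 360.0))

print(deg_to_dxl_position(0))   # centre position: 2048
print(deg_to_dxl_position(90))  # a quarter turn:  3072
```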
The following example uses only the position of the target pose by setting the ‘position_only_ik’
item to ‘true’. See [Reference] for details. ([Reference] ‘Designation of the target pose
(position + orientation) of the manipulator’ covered in ‘13.3.2 MoveIt! Setup Assistant’)
Figure 13-45 OpenManipulator Chain motion planning using MoveIt! (Position IK Only)
In this section, we will add the TurtleBot3 Waffle URDF to the ‘open_manipulator_chain.xacro’
file created above and view the result in RViz.
The URDF files are stored in the urdf folder of the turtlebot3_description package. When we
created the URDF file for the OpenManipulator Chain, we mentioned that it would be easier to
reuse later if saved in the xacro file format. Let’s examine the ‘open_manipulator_with_tb3’
package.
$ roscd open_manipulator_with_tb3/urdf
$ gedit open_manipulator_chain_with_tb3.xacro
open_manipulator_with_tb3/urdf/open_manipulator_chain_with_tb3.xacro
<!-- Include TurtleBot3 Waffle URDF -->
<xacro:include filename="$(find turtlebot3_description)/urdf/turtlebot3_waffle_naked.urdf.xacro" />
If you open the URDF file of the ‘open_manipulator_with_tb3’ package, you can see the code
to include the file ‘turtlebot3_waffle_naked.urdf.xacro’ at the top. With this code, the TurtleBot3
Waffle can be loaded, and the manipulator creates a fixed joint on the link where it should be
positioned.
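The fixed joint mentioned above might look like the following sketch; the link names and mounting offset here are illustrative, not the actual values used in the package.

```xml
<!-- Illustrative only: attach the manipulator base to the TurtleBot3 -->
<joint name="manipulator_mount" type="fixed">
  <parent link="base_link"/>
  <child link="link1"/>
  <origin xyz="0.0 0.0 0.1" rpy="0 0 0"/>
</joint>
```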
The TurtleBot3 Waffle and Waffle Pi are basically equipped with a LIDAR sensor, but for user
convenience, ROBOTIS also offers a URDF without a LIDAR sensor.
The OpenManipulator Chain uses a modular actuator called Dynamixel, which can be
mounted on various robots. It is recommended to take advantage of the reusability of xacro
when building a customized robot.
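As a sketch of that reusability (all names here are illustrative, not taken from the actual package), a xacro macro lets the same arm description be mounted on any parent link:

```xml
<robot xmlns:xacro="http://ros.org/wiki/xacro" name="example">
  <!-- Illustrative macro: mount the arm on a parent link of your choice -->
  <xacro:macro name="open_manipulator" params="parent">
    <joint name="arm_mount" type="fixed">
      <parent link="${parent}"/>
      <child link="link1"/>
    </joint>
    <!-- arm links and joints would follow here -->
  </xacro:macro>
  <xacro:open_manipulator parent="base_link"/>
</robot>
```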
Index

Symbols
~/ 70
~ (tilde) 64
/ (forward slash) 64
_ (underscore) 65
__ (two consecutive underscores) 65
--- (three consecutive hyphens) 63, 166, 175
--screen 288

C
Camera 199
Camera Calibration 207
Catkin 45, 74, 121
catkin_create_pkg 152
catkin_make 122
catkin_python_setup 84
Chessboard 208
Client Library 13, 47, 68
CMake 74

F
feedback 58
Forward Kinematics 401

G
Gazebo 303, 421, 445
Gmapping 336
goal 58
GUI 129, 137, 415

H
History 15

K
Kalman filter 333
Kinetic Kame 18, 25

L
LDS 218
LED 251, 262
link 400, 408
Log 97, 117, 146
LTS 19, 21

M
Manipulator 399

O
Occupancy Grid Map 325
Odometry 299, 318, 341

R
remapping 65
reusability 5
Robot 195
ROS 41
ROS_HOSTNAME 30, 205, 287
rosbag 117, 146
rosbuild 45
roscd 94
rosclean 99
roscore 28, 36, 46, 54, 96
roscpp 47, 68
rosdep 26, 126
rosed 95
rosinstall 27, 126
rosserial 255, 259, 260, 262
RPC 48
rqt 137
rqt_bag 146
rqt_graph 38, 47, 143
rqt_image_view 141
rqt_plot 144
RViz 129, 204, 215, 221, 272, 297, 298
RViz Displays 135

S
SDF 403
SLAM 308, 317, 324, 332

T
TurtleBot3 273, 279, 280, 283, 284, 287, 290, 297, 303
turtlesim 37

U
U2D2 452
Unit 149
URDF 403, 404, 430
urdf_to_graphiz 413
URI 48

V
visual 409
Voltage 253, 266

W
Wiki 47, 91
Workspace 27, 71
ROS Robot Programming
From the basic concept to practical robot application programming
• ROS Kinetic Kame : Basic concept, instructions and tools
• How to use sensor and actuator packages on ROS
• Embedded board for ROS : OpenCR1.0
• SLAM & navigation with TurtleBot3
• How to program a delivery robot using ROS Java
• OpenManipulator simulator using MoveIt! and Gazebo
This Handbook is written for
ISBN 979-11-962307-1-5