To allow the project's modules to be developed in parallel, a simulation environment was used. This environment places the TurtleBot inside a virtual world and behaves like the real hardware. Robotis Inc. recommends the Robot Operating System (ROS) for interfacing with and controlling the TurtleBot, and the Noetic version of ROS was selected for its compatibility with the other modules of the project. The Gazebo simulator was selected for its ROS compatibility, robust physics engine, and existing map environments. The mapping and navigation portion of the project was developed in Gazebo and verified on hardware when necessary, which allowed other modules to be worked on simultaneously since only one TurtleBot was available. All simulation development was done on the Robotis-created map called turtlebot3_house. The portion of the map used for development can be seen in Figure 14.
Figure 14: Simulation Environment turtlebot3_house
In this virtual environment, mapping with the LiDAR sensor was developed. Incoming LiDAR data is processed, and the map is updated every few seconds as the TurtleBot is driven around the environment until a sufficient map of the space is created. Light gray marks explored free space, dark gray unexplored space, black physical obstacles, and green the current readings from the LiDAR sensor. A map of the simulation environment from Figure 14 was created and can be seen in Figure 15, below.
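The free/unknown/occupied coloring described above can be sketched with a simple ray-trace update. This is a minimal illustration, not the project's actual mapping code; the cell values follow the ROS nav_msgs/OccupancyGrid convention (-1 unknown, 0 free, 100 occupied), and the grid size, resolution, and beam values are made-up example numbers.

```python
# Minimal occupancy-grid update for one LiDAR beam (illustrative sketch).
# Cell values: -1 unknown (dark gray), 0 free (light gray), 100 occupied (black).
import math

def update_grid(grid, robot_xy, angle, hit_dist, max_range, resolution=0.05):
    """Trace one beam: cells the beam passes through become free; the
    endpoint becomes occupied if the beam hit something within max_range.
    Assumes the beam stays inside the grid's positive quadrant."""
    x0, y0 = robot_xy
    steps = int(min(hit_dist, max_range) / resolution)
    for i in range(steps):
        d = i * resolution
        cx = round((x0 + d * math.cos(angle)) / resolution)
        cy = round((y0 + d * math.sin(angle)) / resolution)
        grid[cy][cx] = 0          # space the beam passed through is free
    if hit_dist < max_range:      # the beam ended on an obstacle
        cx = round((x0 + hit_dist * math.cos(angle)) / resolution)
        cy = round((y0 + hit_dist * math.sin(angle)) / resolution)
        grid[cy][cx] = 100
    return grid

# Example: one beam pointing along +x from the origin, hitting a wall at 0.5 m.
grid = [[-1] * 20 for _ in range(20)]
update_grid(grid, (0.0, 0.0), 0.0, 0.5, max_range=3.5)
```

Repeating this update for every beam in each scan, as the robot drives, gradually fills in the map the way Figure 15 shows.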
Figure 15: Mapping Simulation of turtlebot3_house
Verification on the actual TurtleBot is a fairly straightforward task, with most of the modules and commands being the same. The main difference is the need to connect to the TurtleBot itself and bring up the ROS system. This is done over SSH by launching bringup, a package that initializes the configuration and launch files that allow the TurtleBot to operate smoothly. Once this is complete, the same series of commands is run to begin mapping the environment. An example of a map generated on actual TurtleBot hardware can be seen in Figure 16. The generated map is not nearly as clean as the simulation results; however, this is expected because orientation tracking on the physical TurtleBot is less precise than what is possible in simulation.
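The connect-and-bringup step can be sketched as building the SSH command that starts the standard turtlebot3_bringup launch file on the robot. The username and IP address below are placeholder assumptions, not values from this report, and the helper function is hypothetical.

```python
# Sketch of the hardware bring-up step: construct the SSH command that
# launches turtlebot3_bringup on the robot. Host and user are placeholders.
def bringup_command(host, user="ubuntu"):
    """Return the argv list for launching bringup on the TurtleBot over SSH."""
    return [
        "ssh", f"{user}@{host}",
        # turtlebot3_robot.launch initializes the robot's sensors and motors
        "roslaunch", "turtlebot3_bringup", "turtlebot3_robot.launch",
    ]

cmd = bringup_command("192.168.0.42")
# e.g. subprocess.run(cmd) would execute it on a configured network
```

Once bringup is running on the robot, the same mapping commands used in simulation are run from the workstation.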
Figure 16: Hardware Verification Map of Armstrong 124
Once a map of the environment is saved, the navigation module can be launched. This module requires an estimate of the TurtleBot's location and orientation so that the system can account for mapped obstacles that the LiDAR sensor cannot currently detect. Once the initial pose estimate is sufficiently close to the real location, a goal location and orientation can be marked on the map. The TurtleBot then plans a path from its current location to the goal while keeping a set distance away from any mapped obstacles. A simulation of the navigation module using the map created in Figure 15 can be seen in Figure 17.
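The plan-while-keeping-clearance behavior can be sketched in two steps: inflate every mapped obstacle by a clearance radius, then search for a path over the remaining free cells. This is a toy stand-in for what the ROS navigation stack's costmap and planner do, not the project's actual planner; the grid, wall, and radius below are invented example values.

```python
# Sketch: obstacle inflation + breadth-first path search on an occupancy grid.
from collections import deque

def inflate(grid, radius):
    """Mark every cell within `radius` cells of an occupied cell as blocked."""
    h, w = len(grid), len(grid[0])
    blocked = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if grid[y][x] == 100:  # occupied cell
                for dy in range(-radius, radius + 1):
                    for dx in range(-radius, radius + 1):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w:
                            blocked[ny][nx] = True
    return blocked

def plan(grid, start, goal, radius=1):
    """Shortest 4-connected path from start to goal avoiding inflated cells."""
    blocked = inflate(grid, radius)
    h, w = len(grid), len(grid[0])
    prev = {start: None}           # doubles as the visited set
    queue = deque([start])
    while queue:
        x, y = queue.popleft()
        if (x, y) == goal:         # reconstruct the path by walking back
            path, node = [], goal
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (x + dx, y + dy)
            if (0 <= nxt[0] < w and 0 <= nxt[1] < h
                    and not blocked[nxt[1]][nxt[0]] and nxt not in prev):
                prev[nxt] = (x, y)
                queue.append(nxt)
    return None  # no path exists with the required clearance

# Example: 6x6 map with a partial wall; the path must stay one cell clear of it.
grid = [[0] * 6 for _ in range(6)]
for y in range(4):
    grid[y][3] = 100  # vertical wall in column 3, rows 0-3
path = plan(grid, (0, 0), (5, 0), radius=1)
```

The inflation step is what keeps the planned route a set distance away from the mapped walls, matching the behavior described above.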
Figure 17: Navigation Simulation in turtlebot3_house