Design

Simultaneous localization and mapping (SLAM) builds a map of a space while estimating the TurtleBot's current location within it; however, because this group is not using LiDAR, no map is generated. Using connection and movement logic similar to that provided in turtlebot3_setup_motor.ino, a motor_tags.ino script was written that communicates with a Python script through PySerial, a library that provides support for serial connections. In this case, PySerial handles the communication between the Nvidia Jetson and the OpenCR board.

Within this script, the motors of the TurtleBot3 Waffle Pi are controlled primarily through the functions moveMotors, moveRightMotor, stopRightMotor, and stopLeftMotor. To keep the left and right motors synchronized when both should run, moveMotors calls moveRightMotor internally; stopping both motors follows the same pattern. Once the user has entered a desired location, both motors drive forward and stop only if the user presses the emergency-stop button attached to the leash, or if the correct AprilTag (the room number) is detected at a specific distance from the stereo camera.

For obstacle avoidance, the logic relies on the guide dog always navigating along, and checking AprilTags on, the right-hand side of a hallway. If an obstacle is detected, the guide dog traverses around it by moving to the left; after the obstacle is successfully avoided, the guide dog returns to the predetermined distance from the right-hand side of the hallway. Figure 1 depicts the guide dog moving forward and passing the AprilTags on its right-hand side, which is how testing will be carried out.
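The Jetson-to-OpenCR serial link described above could be sketched as follows. This is a minimal illustration, not the project's actual protocol: the "L&lt;left&gt; R&lt;right&gt;" command format, port name, and baud rate are all assumptions.

```python
def build_command(left_speed: int, right_speed: int) -> bytes:
    # Hypothetical line protocol: "L<left> R<right>\n", ASCII, newline-terminated.
    # The OpenCR sketch (motor_tags.ino) would parse each line and call
    # moveMotors or the stop functions accordingly.
    return f"L{left_speed} R{right_speed}\n".encode("ascii")

def send_motor_command(port, left: int, right: int) -> None:
    # `port` is expected to be a PySerial serial.Serial instance opened on the
    # OpenCR's USB device, e.g. serial.Serial("/dev/ttyACM0", 115200, timeout=1).
    port.write(build_command(left, right))
```

In use, the Python script on the Jetson would open the port once at startup and then call, for example, `send_motor_command(opencr, 100, 100)` to drive forward and `send_motor_command(opencr, 0, 0)` to stop both motors.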
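The stopping condition (emergency stop pressed, or the correct room-number AprilTag seen within range) can be expressed as a small predicate. The 0.5 m threshold and the TagDetection fields below are illustrative assumptions, not values from the project.

```python
from dataclasses import dataclass
from typing import Optional

STOP_DISTANCE_M = 0.5  # assumed stopping distance from the stereo camera

@dataclass
class TagDetection:
    tag_id: int       # decoded AprilTag ID (room number)
    distance_m: float # estimated distance from the stereo camera

def should_stop(estop_pressed: bool,
                detection: Optional[TagDetection],
                target_tag: int,
                stop_distance_m: float = STOP_DISTANCE_M) -> bool:
    """Stop if the leash emergency-stop button is pressed, or if the correct
    AprilTag is detected at or inside the stopping distance."""
    if estop_pressed:
        return True
    if detection is not None and detection.tag_id == target_tag:
        return detection.distance_m <= stop_distance_m
    return False
```

The main loop would call this each frame and issue a stop command to both motors as soon as it returns True.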
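The right-wall-following obstacle avoidance described above amounts to a three-state behavior: follow the right-hand side, detour left around an obstacle, then return to the predetermined right-hand offset. A sketch of that state machine, with state names and transition inputs chosen here for illustration:

```python
from enum import Enum, auto

class AvoidState(Enum):
    FOLLOW_RIGHT = auto()  # normal travel at the set distance from the right wall
    DETOUR_LEFT = auto()   # moving left around a detected obstacle
    RETURN_RIGHT = auto()  # drifting back toward the right-hand offset

def next_state(state: AvoidState,
               obstacle_ahead: bool,
               at_target_offset: bool) -> AvoidState:
    """Advance the avoidance state machine one step per control-loop tick."""
    if state is AvoidState.FOLLOW_RIGHT and obstacle_ahead:
        return AvoidState.DETOUR_LEFT
    if state is AvoidState.DETOUR_LEFT and not obstacle_ahead:
        return AvoidState.RETURN_RIGHT
    if state is AvoidState.RETURN_RIGHT and at_target_offset:
        return AvoidState.FOLLOW_RIGHT
    return state
```

Each state would map to a motor command (both motors forward, a left-biased turn, or a right-biased turn), with AprilTag checking active only while in FOLLOW_RIGHT.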

Close-up of Hardware Components