Router information is important to Wi-Fi positioning systems; however, continuously collecting Wi-Fi data across an area is difficult to do by hand. Our project team seeks to extend an existing robot so that it collects router information automatically. Before this project commenced, the physical structure of the robot had already been implemented: the robot is able to map its surrounding environment and navigate to a designated location.
The project will utilize TurtleBot, as shown in Figure 1 below, which is a new-generation mobile robot that is modular, compact, and customizable. TurtleBot is a small, affordable, programmable, ROS-based mobile robot for use in education, research, hobbies, and product prototyping [1]. It can be customized in many ways depending on how the mechanical parts are reconstructed and which optional parts, such as the computer and sensors, are used. This project consists of building a TurtleBot programmed to navigate autonomously in a dynamic environment containing moving obstacles.
The current robot setup has problems with obstacle avoidance. The robot detects obstacles using a laser scan derived from a depth camera; the full depth image and video stream are not used because the on-board CPU lacks the processing power. Relying on the camera alone limits the detection range, so the robot can only sense obstacles directly in front of it. Another problem is that the derived laser scan covers a single horizontal 2D plane: any obstacle outside that plane goes undetected. In the TurtleBot setup shown in Figure 1 above, the camera is mounted on a flat platform 1 m above the floor, so the robot detects obstacles at a height of 1 m and ignores anything lower.
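The blind spot described above comes down to simple geometry: a horizontal scan plane only intersects obstacles that reach up to its mounting height. A minimal sketch (the function name and 1 m default are illustrative, matching the setup in Figure 1):

```python
def detectable_by_scan_plane(obstacle_height_m, scan_plane_height_m=1.0):
    """An obstacle intersects the horizontal scan plane only if it is
    at least as tall as the plane's height above the floor; anything
    entirely below the plane is invisible to the scan."""
    return obstacle_height_m >= scan_plane_height_m

# A 0.4 m box slips under a 1 m scan plane, while a standing person is seen.
low_box_seen = detectable_by_scan_plane(0.4)   # False
person_seen = detectable_by_scan_plane(1.7)    # True
```

This is exactly the gap the ultrasonic sensors discussed later are meant to close.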
In this project, the main objectives are optimizing navigation, adding new sensors, and implementing a room-exploration feature. Analyzing router information and designing the path-planning algorithm are out of scope.
With TurtleBot, you’ll be able to build a robot that can drive around your house, see in 3D, and have enough horsepower to create exciting applications.
A Jetson TX2 development board running the Robot Operating System (ROS) collects camera and LIDAR information and produces Simultaneous Localization and Mapping (SLAM) data. The SLAM data consists of the robot's current relative location, the 3D environment, and the obstacle map.
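The three SLAM outputs named above can be pictured as a small data container. This is a hedged sketch, not the project's actual message types: the class names are hypothetical, and the grid cell values follow the common ROS occupancy-grid convention (-1 unknown, 0 free, 100 occupied):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Pose2D:
    x: float      # metres, in the map frame
    y: float
    theta: float  # heading in radians

@dataclass
class SlamState:
    """Bundles the SLAM outputs: the robot's current relative pose plus
    an occupancy grid that doubles as the obstacle map.
    Cell values: -1 = unknown, 0 = free, 100 = occupied."""
    pose: Pose2D
    resolution: float                              # metres per grid cell
    grid: List[List[int]] = field(default_factory=list)

    def is_occupied(self, row: int, col: int) -> bool:
        return self.grid[row][col] == 100

state = SlamState(pose=Pose2D(0.5, 1.2, 0.0), resolution=0.05,
                  grid=[[0, 100], [-1, 0]])
```

With this shape, the navigation stack can query `state.is_occupied(r, c)` before committing to a path through cell `(r, c)`.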
LIDAR is a laser-based ranging system that scans a 2D plane all around the robot, which is useful for navigation and obstacle avoidance.
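A planar scan arrives as one range reading per beam angle (as in a ROS `sensor_msgs/LaserScan`). A minimal, library-free sketch of how such a scan could feed obstacle avoidance — the function and its `max_valid` cutoff are illustrative assumptions, not the project's code:

```python
import math

def nearest_obstacle(ranges, angle_min, angle_increment, max_valid=10.0):
    """Given a planar scan (one range per beam), return the
    (distance, bearing) of the closest valid return, or None if every
    beam is out of range. Bearing i is angle_min + i * angle_increment."""
    best = None
    for i, r in enumerate(ranges):
        if 0.0 < r <= max_valid and (best is None or r < best[0]):
            best = (r, angle_min + i * angle_increment)
    return best

# Four beams at 90-degree spacing; the closest return is 0.8 m behind the robot.
hit = nearest_obstacle([2.0, 1.5, 0.8, 3.0], angle_min=0.0,
                       angle_increment=math.pi / 2)
```

Because the LIDAR covers the full 360°, this check catches obstacles beside and behind the robot that the forward-facing depth camera cannot see.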
The main purpose of the ultrasonic sensors is to detect and avoid objects that are too low to be seen by either the LIDAR or the depth camera.
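The LV-MaxSonar-EZ line referenced in [2] reports range on its analog pin with a scaling of Vcc/512 volts per inch. A sketch of the voltage-to-distance conversion under that assumption (the 5 V supply and the 0.3 m stop margin are illustrative choices, not values from the report):

```python
def sonar_voltage_to_metres(v_out, vcc=5.0):
    """Convert an LV-MaxSonar-EZ analog reading to metres, assuming the
    datasheet scaling of Vcc/512 volts per inch [2]."""
    inches = v_out / (vcc / 512.0)
    return inches * 0.0254

def blocks_robot(v_out, stop_distance_m=0.3):
    """Flag a low obstacle closer than a hypothetical 0.3 m safety margin."""
    return sonar_voltage_to_metres(v_out) < stop_distance_m

reading = sonar_voltage_to_metres(0.2)  # 0.2 V ~ 20.5 in ~ 0.52 m
```

Mounted low on the chassis, a reading below the safety margin would halt the robot before it hits an object under the 1 m scan plane.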
This sensor is used to map the space the robot navigates and can produce a 3D point cloud for object detection. However, producing the point cloud is too computationally expensive, so the sensor is usually used as a 2D scanning device.
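The cheap 2D mode amounts to collapsing each column of the depth image to its nearest valid return, yielding one scan line instead of a full point cloud. A library-free sketch of that reduction (the function name and invalid-pixel convention are assumptions for illustration):

```python
def depth_image_to_scan(depth, invalid=0.0):
    """Collapse a depth image (rows x cols, in metres) into a single 2D
    scan line by taking the nearest valid depth in each column -- the
    cheap alternative to building a full 3D point cloud."""
    cols = len(depth[0])
    scan = []
    for c in range(cols):
        valid = [row[c] for row in depth if row[c] > invalid]
        scan.append(min(valid) if valid else float('inf'))
    return scan

# 3x4 depth image: column 1 holds the closest return, 0.9 m;
# the 0.0 in column 2 is an invalid pixel and is skipped.
image = [[2.0, 1.5, 0.0, 3.0],
         [2.1, 0.9, 2.5, 3.1],
         [2.2, 1.8, 2.4, 3.2]]
scan = depth_image_to_scan(image)  # [2.0, 0.9, 2.4, 3.0]
```

The trade-off is exactly the one described earlier: the scan is cheap to compute but flattens everything into one plane.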
RViz is a 3D tool for visualizing and controlling a ROS-based robot. We use it to visualize our robot's structure and to issue instructions.
Automated navigation allows the robot to explore an office environment on its own; this feature is built on top of the mapping functions. On the external computer that sends commands to the robot, users can define an area on the user interface, and the robot will then explore all unknown space within that user-defined boundary. While exploring the room, the mapping function is called to simultaneously draw a 2D grid map on the user interface.
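Exploration of this kind is typically frontier-based, as in the `frontier_exploration` package [3]: the planner repeatedly drives to free cells that border unknown space until none remain inside the user-defined area. A minimal sketch of the frontier-detection step on an occupancy grid (the cell-value constants follow the usual ROS convention; the function itself is illustrative):

```python
UNKNOWN, FREE, OCCUPIED = -1, 0, 100

def frontier_cells(grid):
    """Return the free cells that border at least one unknown cell --
    the 'frontiers' a frontier-exploration planner drives toward until
    no unknown space remains."""
    rows, cols = len(grid), len(grid[0])
    frontiers = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != FREE:
                continue
            neighbours = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            if any(0 <= nr < rows and 0 <= nc < cols
                   and grid[nr][nc] == UNKNOWN
                   for nr, nc in neighbours):
                frontiers.append((r, c))
    return frontiers

grid = [[FREE, FREE,     UNKNOWN],
        [FREE, OCCUPIED, UNKNOWN],
        [FREE, FREE,     FREE]]
targets = frontier_cells(grid)  # [(0, 1), (2, 2)]
```

When `frontier_cells` returns an empty list, every reachable cell inside the boundary has been observed and exploration is complete.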
[1] Robotis, "TurtleBot3 e-Manual," 2017. [Online]. Available: http://emanual.robotis.com/docs/en/platform/turtlebot3/overview/
[2] MaxBotix, "LV-MaxSonar®-EZ™ Series High Performance Sonar Range Finder," MB1040 datasheet, 2005.
[3] "frontier_exploration - ROS Wiki," Wiki.ros.org, 2017. [Online]. Available: http://wiki.ros.org/frontier_exploration
[4] "Getting Started with NVIDIA Jetson TX2," Medium, 2019. [Online]. Available: https://medium.com/@surmenok/getting-started-with-nvidia-jetson-tx2-5952a2d8d7ae [Accessed 30 Jul. 2019].
[5] "MB1403-000 MaxBotix Inc. | Sensors, Transducers | DigiKey," Digikey.com, 2019. [Online]. Available: https://www.digikey.com/product-detail/en/maxbotix-inc/MB1403-000/1863-1021-ND/7896793 [Accessed 30 Jul. 2019].
[6] "RPLIDAR A2M6 360° Laser Scanner," ROS Components, 2019. [Online]. Available: https://www.roscomponents.com/en/lidar-laser-scanner/236-rplidar-a2m8.html [Accessed 30 Jul. 2019].
[7] "Figure 17: Desired mobile robot trajectory and path around an obstacle," in "An ANN Based NARX GPS/DR System for Mobile Robot Positioning and Obstacle Avoidance," Science and Education Publishing, 2019. [Online]. Available: http://pubs.sciepub.com/automation/1/1/2/figure/17 [Accessed 30 Jul. 2019].
[8] M. Kumar, "Autonomous Robot Based on Robot Operating System (ROS) for Mapping and Navigation," Semanticscholar.org, 2019. [Online]. Available: https://www.semanticscholar.org/paper/Autonomous-Robot-Based-on-Robot-Operating-System-(-Kumar/4c7ebbfc044545cbfd38c3195093cc402f6181c9/figure/7 [Accessed 30 Jul. 2019].