Autonomous navigation in agriculture: a comparative study between 2D and 3D LiDARs in greenhouses

Abstract
Autonomous navigation is no longer a car-only application. In recent years, more and more fields have adopted autonomous navigation to automate tasks and simplify processes. Agriculture is a clear example, with self-driving trucks that harvest or plant crops, self-driving drones that oversee large cultivation fields, and, of course, mobile robots. Integrating robotics with agriculture can reduce human workers' exposure to chemicals or adverse conditions that damage their health, and can cut costs in both workforce and maintenance. This work presents a comparative study of a mobile robot that navigates autonomously in a greenhouse using two different types of LiDAR, paving the way for future developments that use this platform as a base. The Robot Operating System (ROS) runs on a Jackal robot equipped with wheel encoders, GPS, and an IMU; these three sensors are fused for improved odometry. A SLAMTEC RPLiDAR A3 serves as the 2D LiDAR and a Velodyne VLP-16 as the 3D LiDAR. Both simulated and real-world tests are developed to calibrate and compare the LiDARs in terms of computational load, safety, and performance, testing the hypothesis that autonomous navigation in greenhouses differs between 2D and 3D LiDARs. The tests and their analysis revealed that each type of LiDAR performs better in certain scenarios, supporting the initial hypothesis. Some future implementations are also outlined to guide the reader toward the next steps should this work be continued.
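The sensor fusion mentioned in the abstract (wheel encoders, GPS, and IMU combined for improved odometry) is typically done in ROS with an extended Kalman filter, e.g. the robot_localization package. As a minimal sketch of the underlying idea only, and not the paper's actual implementation, the toy below fuses a 1-D wheel-odometry prediction with a noisy GPS position fix using a scalar Kalman filter; all variances and readings are made-up illustration values.

```python
# Toy 1-D Kalman fusion sketch: odometry predicts, GPS corrects.
# (Illustrative only; the real platform fuses encoders, GPS, and IMU
# in ROS, commonly via robot_localization's EKF node.)

def predict(x, p, delta, q):
    """Propagate the position estimate by an odometry step `delta`
    whose noise variance is q; uncertainty grows."""
    return x + delta, p + q

def update(x, p, z, r):
    """Correct the prediction with a GPS measurement `z` of variance r."""
    k = p / (p + r)                      # Kalman gain: weight of the GPS fix
    return x + k * (z - x), (1 - k) * p  # fused estimate, reduced variance

# Start uncertain at x = 0 m; drive 1 m per step; GPS reads near the true pose.
x, p = 0.0, 1.0
for gps in (1.05, 1.98, 3.02):
    x, p = predict(x, p, delta=1.0, q=0.1)  # wheel-odometry prediction
    x, p = update(x, p, z=gps, r=0.25)      # GPS correction

print(round(x, 2))  # fused estimate close to the true 3 m
print(p < 0.25)     # fused variance below the GPS variance alone
```

The same predict/correct structure generalizes to the full 3-D pose state that robot_localization estimates, with the IMU contributing orientation and angular-rate measurements.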
Description
https://orcid.org/0000-0003-3098-7275