Research article

LIDAR-based autonomous navigation method for an agricultural mobile robot in strawberry greenhouse: AgriEco Robot

  • Received: 04 July 2022 Revised: 23 August 2022 Accepted: 09 September 2022 Published: 19 September 2022
  • This paper presents an autonomous navigation method for an agricultural mobile robot, the "AgriEco Robot", with four-wheel drive and embedded perception sensors. The proposed method provides accurate guidance between strawberry crop rows while pesticides are sprayed automatically, and it detects the end of a row and switches to the next one. The main control system was developed using the Robot Operating System (ROS) and is based on a 2D LIDAR sensor. The acquired 2D point-cloud data are processed to estimate the robot's heading and lateral offset relative to the crop rows, and a motion controller uses these estimates to steer the robot. The accuracy of the autonomous navigation was evaluated in real-world conditions within strawberry greenhouses, demonstrating its usefulness for automatic pesticide spraying. (An illustrative sketch of this row-following pipeline is given after the citation below.)

    Citation: Abdelkrim Abanay, Lhoussaine Masmoudi, Mohamed El Ansari, Javier Gonzalez-Jimenez, Francisco-Angel Moreno. LIDAR-based autonomous navigation method for an agricultural mobile robot in strawberry greenhouse: AgriEco Robot[J]. AIMS Electronics and Electrical Engineering, 2022, 6(3): 317-328. doi: 10.3934/electreng.2022019
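
    The paper itself does not include source code, so the following is a minimal, hedged sketch of the row-following pipeline the abstract describes: the 2D LIDAR scan is split into left and right crop-row points, a line is fitted to each row, the robot's heading error and lateral offset with respect to the row centreline are derived from those lines, and a simple proportional controller turns the errors into a velocity command (in the real system this would run as a ROS node publishing geometry_msgs/Twist on /cmd_vel). All names, gains, the 0.75 m row spacing, and the proportional control law are assumptions made for illustration, not the authors' implementation.

```python
# Illustrative sketch only (not the authors' code): row following from a 2D
# LIDAR scan of two parallel crop rows, with an assumed proportional controller.
import numpy as np

ROW_SPACING = 0.75   # assumed distance between adjacent crop rows (m)
LOOKAHEAD   = 2.0    # only keep LIDAR returns closer than this (m)
K_HEADING   = 1.2    # assumed proportional gain on heading error
K_OFFSET    = 2.0    # assumed proportional gain on lateral offset
V_FORWARD   = 0.3    # assumed constant forward speed (m/s)


def scan_to_points(angles, ranges):
    """Convert polar LIDAR returns to Cartesian points (x forward, y left)."""
    a = np.asarray(angles, dtype=float)
    r = np.asarray(ranges, dtype=float)
    ok = np.isfinite(r) & (r > 0.05) & (r < LOOKAHEAD)
    return np.c_[r[ok] * np.cos(a[ok]), r[ok] * np.sin(a[ok])]


def fit_row(points):
    """Least-squares line y = m*x + c through one crop row; None if too sparse."""
    if len(points) < 5:
        return None
    m, c = np.polyfit(points[:, 0], points[:, 1], 1)
    return m, c


def row_errors(points):
    """Heading error (rad) and lateral offset (m) of the robot with respect to
    the centreline between the left and right rows; None when the row ends."""
    ahead = points[points[:, 0] > 0.0]
    left = fit_row(ahead[ahead[:, 1] > 0.0])
    right = fit_row(ahead[ahead[:, 1] <= 0.0])
    if left is None and right is None:
        return None                       # no row points ahead: end of row
    if left is None:                      # only one row visible: keep a fixed
        m, c = right                      # half-spacing distance from it
        c += ROW_SPACING / 2.0
    elif right is None:
        m, c = left
        c -= ROW_SPACING / 2.0
    else:
        m = (left[0] + right[0]) / 2.0    # centreline slope
        c = (left[1] + right[1]) / 2.0    # centreline y-intercept
    heading_error = np.arctan(m)          # row direction relative to robot x-axis
    lateral_offset = c                    # signed distance to centreline at the robot
    return heading_error, lateral_offset


def velocity_command(errors):
    """(linear, angular) command; in ROS this would fill a geometry_msgs/Twist."""
    if errors is None:
        return 0.0, 0.0                   # stop and hand over to row switching
    heading, offset = errors
    return V_FORWARD, K_HEADING * heading + K_OFFSET * offset


if __name__ == "__main__":
    # Synthetic example: robot 0.1 m to the left of the centreline of two
    # straight rows spaced ROW_SPACING apart, perfectly aligned with them.
    xs = np.linspace(0.3, 1.8, 60)
    pts = np.vstack([np.c_[xs, np.full_like(xs, ROW_SPACING / 2 - 0.1)],
                     np.c_[xs, np.full_like(xs, -ROW_SPACING / 2 - 0.1)]])
    print(velocity_command(row_errors(pts)))  # expect a right-turn (negative angular) correction
```

    Returning a zero command when no row points remain ahead stands in for the paper's end-of-row detection and row-switching manoeuvre, which is not reproduced here.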



  • © 2022 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)
