Research article

LIDAR-based autonomous navigation method for an agricultural mobile robot in strawberry greenhouse: AgriEco Robot

  • Received: 04 July 2022; Revised: 23 August 2022; Accepted: 09 September 2022; Published: 19 September 2022
  • This paper presents an autonomous navigation method for an agricultural mobile robot, the "AgriEco Robot", with four-wheel drive and embedded perception sensors. The proposed method provides accurate guidance between strawberry crop rows while automatically spraying pesticides, as well as detection of the end of the rows and switching to the next ones. The main control system was developed using the Robot Operating System (ROS) and is based on a 2D LIDAR sensor. The acquired 2D point cloud data are processed to estimate the robot's heading and lateral offset relative to the crop rows. A motion controller is incorporated to execute the developed autonomous navigation method. The accuracy of the autonomous navigation has been evaluated in real-world conditions within strawberry greenhouses, proving its usefulness for automatic pesticide spraying.

    Citation: Abdelkrim Abanay, Lhoussaine Masmoudi, Mohamed El Ansari, Javier Gonzalez-Jimenez, Francisco-Angel Moreno. LIDAR-based autonomous navigation method for an agricultural mobile robot in strawberry greenhouse: AgriEco Robot[J]. AIMS Electronics and Electrical Engineering, 2022, 6(3): 317-328. doi: 10.3934/electreng.2022019




    Pesticide spraying is one of the most important farming processes, as it plays a crucial role in increasing agricultural productivity. Conventional spraying methods in strawberry greenhouses typically employ hazardous chemicals that are repeatedly applied by an operator with a manual sprayer. Despite the use of pesticide protection equipment (i.e., protective suit, gas mask, etc.), farmers are still exposed to toxic and dangerous chemicals that can cause health problems [1,2,3]. In addition, unsuitable spraying can contaminate both the plants and the soil, potentially harming the final consumer. Because of these drawbacks, agricultural robotics has become a major research interest, aiming to reduce the intervention of farmers in risky situations while protecting the environment and final consumers.

    Nowadays, robots play a significant role in crop inspection and treatment, as well as in the detection of weeds, pests and diseases. Agricultural robots have been improved through the development of mobile platforms equipped with a variety of sensors and algorithms for performing agricultural tasks. Mobile agricultural robots have been used for a variety of purposes, such as harvesting cherry tomatoes based on stereo vision and a fruit collector [4]. An autonomous robot has been developed for intra-row weeding in vineyards by means of a rotary weeder, using a sonar and a feeler to detect the plant trunks [5]. Another example can be found in [6], where a wheeled robot has been designed for precise wheat seeding.

    Robust and efficient autonomous navigation is a critical component for mobile robots that operate autonomously in agricultural environments, and it is becoming increasingly important in current robotics research. The majority of crops are planted in straight rows with almost equal spacing between them, which favors the use of robots to perform agricultural tasks such as pesticide spraying. Accordingly, several agricultural navigation methods have been developed that use plant rows as landmarks for the navigation algorithms [7]. Vision sensors have been widely employed for plant row identification due to their low cost and their ability to provide a large amount of data that can be used to guide robots. Methods for detecting crop rows based on the Hough transform are broadly used in visual robot navigation systems [8,9,10]. Åstrand and Baerveldt [8,9] developed a crop row recognition method based on the Hough transform. Chen et al. [10] successfully achieved automatic detection of a transplanting robot's navigation target by extracting the target row with an improved Hough transform. Montalvo et al. [11] proposed a least-squares method for detecting crop rows in maize fields with high weed pressure.

    On the other hand, the use of laser rangefinder (LIDAR) sensors for navigation has resulted in a great number of research contributions, making them one of the most commonly used sensing systems in robotic platforms. They have the advantages of providing direct distance measurements, being less sensitive to environmental conditions (e.g., they work in the dark) and having a greater range than other sensors. In addition, recent cost reductions have increased the interest in this technology. For these reasons, LIDARs are commonly used for local navigation between crop rows, which entails determining the robot's location relative to the rows and guiding it to avoid collisions [12,13,14]. Barawid et al. [15] used a 2D LIDAR to develop a real-time guidance system for driving an autonomous vehicle in an orchard, again using the Hough transform to extract plant rows in order to guide the vehicle. In [12], Hiremath and colleagues propose an autonomous navigation model for a robot in a maize field using a 2D LIDAR and a particle filtering algorithm, which estimates the robot's state relative to its surroundings, such as its heading and lateral deviation.

    The main contribution of this work is the development of an autonomous navigation method for an agricultural mobile robot called "AgriEco Robot" operating in a strawberry greenhouse. We present a 2D LIDAR-based navigation control approach that steers the robot using its estimated heading and lateral offset relative to the crop rows. This allows the "AgriEco Robot" to autonomously guide itself between the crop rows while automatically spraying pesticides. Then, after successfully navigating between the rows and detecting their ends, the robot autonomously reaches the next crop rows and continues navigating between them while resuming the spraying process. The control software is based on the Robot Operating System (ROS) [16], which runs on a Jetson TX2 high-performance processor.

    The "AgriEco Robot" (Figure 1) has been devised to work specifically within an agricultural greenhouse and navigate between rows of strawberries for automatic pesticide spraying. The robot's dimensions (65 cm long x 55 cm wide) were chosen in accordance with strawberry cultivation standards.

    Figure 1.  The mobile robot platform "AgriEco Robot".

    The mobile robot platform is composed of a steel chassis equipped with a four-wheel drive; a spraying system installed at the rear of the robot, consisting of two motorized arms; and a pesticide tank with a submerged pump. It also carries three on-board perception sensors: a stereo camera, a 2D LIDAR and an omnidirectional camera. In the rest of this section, we describe the robot's main components.

    The robotic platform is composed of a chassis and a four-wheel drive with in-wheel brushless direct current (BLDC) motors linked to their controllers (see Figure 1). The chassis is made of lightweight steel and has been designed and analyzed to support loads of up to 100 kg and to tow up to 80 kg. The four wheels operate separately and each rotates about a single axis [17,18]. Each in-wheel motor has an independent controller which provides, through an Arduino Mega 2560 microcontroller, full and smooth control over the following functions: forward and backward motion, braking, and speed variation.

    The BLDC motor controller constitutes the driver of the motor. It acts as an intermediary between the motor and the battery, receiving a control signal from the microcontroller and converting it into a higher-current signal that can drive the motor. The controller converts the DC voltage from the battery to AC voltage using power electronic switches. The controller is rated for a maximum current of 15 A and a power of 250 W and provides inputs for Hall effect sensors. PWM (Pulse Width Modulation) signals are generated and filtered by a low-pass RC filter to control the motor speed. The "AgriEco Robot" is entirely powered by a 36 V/30 Ah Lithium-ion battery.

    The sensory system of the robot comprises a Hokuyo URG-04LX-UG01 2D LIDAR, a ZED stereo camera, and an omnidirectional camera system (the latter is not used in this work). The 2D LIDAR (Figure 1) has been mounted at the front of the robot, at a height of 12 cm above the ground. According to the manufacturer, this device has a scanning area of 240° and provides an angular resolution of 0.36° at a scan frequency of 10 Hz. The detection range is approximately 20‒5,600 mm, with an accuracy of ±30 mm for distances between 60 and 1,000 mm and ±3% of the measurement up to 4,095 mm. It is connected to the on-board Jetson TX2 via a USB 2.0 interface and is powered by an additional 5 V power source.

    The ZED stereo camera (Figure 1), developed by Stereolabs [19], integrates two cameras with a maximum resolution of 4416 × 1242 px at a frame rate of 15 fps. It has been specially designed for autonomous navigation and 3D analysis applications, and it provides robust visual odometry estimations. The ZED SDK supports ROS integration through the zed-ros-wrapper package. It uses depth perception to accurately estimate the camera's 6-DoF pose (x, y, z, roll, pitch, yaw) at a frequency of up to 100 Hz and, thus, the pose of the system it is mounted on. The positional tracking accuracy of ZED cameras is ±(0.01, 0.1, 0.1) m along the x, y and z axes. The camera has been positioned on the robot at a height of 44 cm from the ground, with a pitch angle of 15° so that it points toward the ground.

    The omnidirectional vision sensor (Figure 1) is composed of a CCD camera and a spherical mirror in a face-to-face configuration. The catadioptric system has been mounted perpendicular to the robot's base. It covers a 360° field of view of the robot's environment, which makes it well suited for autonomous navigation, obstacle detection, and localized pesticide spraying.

    The "Agri-Eco Robot" systems uses a combination of CPU and GPU cores in a real time application, which implies the need for considerable computing power to fulfill the embedded sensors' requirements, like the Zed camera, which recommend NVIDIA TX2 [20] as the optimal embedded processing card. The Nvidia Jetson TX2 is a unit that is equipped with a Quad-core 2.0 Ghz 64-bit ARMv8 A57, a dual-core 2.0 Ghz ARMv8 Denver, a 256 CUDA core 1.3 MHz Nvidia Pascal and 8 GB memory. This embedded card runs the developed systems in 0.25 s.

    The card runs the Ubuntu Linux 18.04 operating system with the open-source ROS Melodic distribution. The general scheme of the complete system is shown in Figure 2.

    Figure 2.  General system architecture.

    Our system is based on the versatile and widely employed ROS framework [21], which is able to execute multiple programs in parallel in the form of nodes that communicate through services and through messages broadcast via topics. ROS includes a set of libraries, a wide range of sensor drivers and a set of tools that ease the implementation of the proposed application. In this work, the "AgriEco Robot" system implements three ROS nodes (a minimal node sketch is given after the list):

    ● The ZED node (zed-ros-wrapper) broadcasts its data in a set of topics providing access to the stereo images, the depth map, the 3D point cloud and an estimate of the camera's 6-DoF pose.

    ● The Hokuyo node produces messages containing the values of the laser scans.

    ● The microcontroller node receives commands from other processes and controls the robot's motion and spraying systems.
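    As an illustration of how such a node can be structured with rospy, the following is a minimal sketch (not the project's actual code); the topic names /scan and /cmd_vel, the message types and the placeholder speeds are assumptions made for the example.

```python
# Minimal sketch of a ROS node that subscribes to the Hokuyo scans and publishes
# velocity commands. Topic names and values are illustrative assumptions.
import rospy
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist

class NavigationNode:
    def __init__(self):
        self.cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
        rospy.Subscriber("/scan", LaserScan, self.on_scan, queue_size=1)

    def on_scan(self, scan):
        # The scan would be processed here to estimate the robot's heading and
        # lateral offset relative to the crop rows (see Eqs. (1)-(3) below),
        # and the resulting correction is published as a velocity command.
        cmd = Twist()
        cmd.linear.x = 0.3   # placeholder forward speed [m/s]
        cmd.angular.z = 0.0  # placeholder steering correction [rad/s]
        self.cmd_pub.publish(cmd)

if __name__ == "__main__":
    rospy.init_node("agrieco_navigation")
    NavigationNode()
    rospy.spin()
```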

    The spraying system (Figure 1 and Figure 3) has been developed with the aim of optimizing automatic pesticide spraying by the "AgriEco Robot". This system has been attached to the rear part of the robot and is composed of a 10 L polyethylene tank (1) with a submerged pump (2) and two motorized arms (5) built by 3D printing in polylactic acid (PLA). The pipe is made of pure silicone and is attached to the pump inside the tank. The subsystem is completed by a T-type connection (3) to distribute the pesticide to both arms. A set of adjustable flow nozzles (6) is fixed at the end of the pipes.

    Figure 3.  Spraying system.
    Figure 4.  The "AgriEco Robot" between strawberry crop rows in the experimental greenhouse.

    The "AgriEco Robot", must be able to navigate through strawberry crop rows within an agricultural greenhouse to perform tasks such as automatic of spraying pesticide. For this purpose, we have developed a navigation method, based on the 2D LIDAR sensor, allowing the robot to autonomously navigate through the crop rows inside a strawberry greenhouse. As it will be explained later, the proposed autonomous navigation method is divided into two major parts: between and outside the crop rows.

    We propose a 2D LIDAR-based navigation method that uses 2D range scans to autonomously drive between the strawberry plants, benefiting from the row-based arrangement of the crops. The robot moves and controls its trajectory between the crop rows based on its estimated heading and lateral offset until the end of the row is detected. For this task, the LIDAR scans its surroundings in a plane parallel to the ground, and the measurements are used to determine the robot's localization relative to the rows. This procedure can detect whether the robot is navigating correctly (i.e., positioned in the center of the space between crop rows and moving forward) or whether it deviates to the left or right with respect to the crop rows, as seen in Figure 5.

    Figure 5.  1- The robot's straight navigation, 2- The robot's right deviation, 3- The robot's left deviation.

    The deviation is characterized by two parameters: the robot's lateral shift from the center line between the rows and its angular deviation from that line. In each situation, the motion controller generates a decision for the mobile robot to correct its trajectory, or to move straight ahead in case the robot is well positioned between the rows.

    Figure 6 shows the scheme of the navigation system where R and L represent the distances to the rows detected by the laser scanner at angles of 0° and 180°, respectively. The width of the robot is represented by W, and C is half the length of the robot. In turn, Cr and Cl are the perpendicular distances between the laser scanner (positioned at the center of the robot) and the right and left rows, respectively. In case the robot deviates to the left, Cl is extracted from the laser scanner data, and Cr is calculated as follows:

    $f = \frac{C_l + C_r}{2} - C_l - C\cos\theta$ (1)

    With:

    $\theta = \arccos\left(\frac{C_l}{R}\right)$ (2)
    $C_r = L\cos\theta$ (3)

    Similarly, if the robot deviates to the right, Cr is extracted from the laser scanner data, and Cl is calculated.

    The robot's angular deviation to the right or left between the crop rows is represented by θ. Finally, f denotes the displacement of the robot's center with respect to the middle point between the rows (see Figure 6b).
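    To make the geometry concrete, the snippet below transcribes Eqs. (1)-(3) for the left-deviation case as reconstructed above; the function and variable names are ours and the code is only an illustrative sketch.

```python
import math

def left_deviation_state(C_l, R, L, C):
    """Heading and lateral offset when the robot deviates to the left.

    C_l : perpendicular distance to the left row, taken from the scan [m]
    R, L: raw range readings at 0 deg and 180 deg [m]
    C   : half of the robot length [m]
    Returns (theta [rad], C_r [m], f [m]).
    """
    theta = math.acos(C_l / R)                         # Eq. (2)
    C_r = L * math.cos(theta)                          # Eq. (3)
    f = (C_l + C_r) / 2.0 - C_l - C * math.cos(theta)  # Eq. (1), as reconstructed
    return theta, C_r, f
```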

    Figure 6.  Scheme of the autonomous navigation.

    Apart from this, other crucial quantities must be estimated during the navigation of the mobile robot, such as the accurate position of the robot's limits with respect to the crop rows, in order to ensure operational safety. As shown in Figure 6a, these quantities are represented by the following parameters: k denotes the distance from the robot's corner to the right row, while m is its analog for the left crop row. The correct estimation of these parameters allows the robot to navigate safely without hitting the crops, and they are computed through the following equations depending on the direction of the robot's deviation:

    Deviation to the left:

    $k = \left(R - \frac{W}{2}\right)\cos\theta$ (4)
    $m = C_r - L\sin\theta$ (5)

    Deviation to the right:

    $k = \left(L - \frac{W}{2}\right)\cos\theta$ (6)
    $m = C_l - L\sin\theta$ (7)
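    Similarly, Eqs. (4)-(7) for both deviation cases can be transcribed as follows (again an illustrative sketch with our own naming):

```python
import math

def safety_margins(deviation, theta, R, L, W, C_l, C_r):
    """Corner-to-row safety distances k (right row) and m (left row).

    deviation : "left" or "right", the current deviation direction
    theta     : deviation angle [rad]; R, L: raw readings at 0 and 180 deg [m]
    W         : robot width [m]; C_l, C_r: perpendicular distances to the rows [m]
    """
    if deviation == "left":
        k = (R - W / 2.0) * math.cos(theta)  # Eq. (4)
        m = C_r - L * math.sin(theta)        # Eq. (5)
    else:
        k = (L - W / 2.0) * math.cos(theta)  # Eq. (6)
        m = C_l - L * math.sin(theta)        # Eq. (7)
    return k, m
```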

    Once the position of the robot between the crop rows has been determined, the navigation motion controller provides the required speed and steering values. The motion strategies were described in detail in [17]. To keep the robot on the proper trajectory between the rows, we follow the scheme shown in Figure 7, which depicts the motion corrections the robot applies when it deviates from the middle of the crops (i.e., the reference position). Thus, the input of the motion controller is composed of the lateral and heading offsets of the robot. The control system then outputs three different control actions: i) an angular correction to the left by actuating the two right motors, ii) an angular correction to the right by actuating the two left motors, and iii) a linear velocity by actuating the four motors at the same speed.
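    A minimal sketch of this three-way decision logic is given below; the tolerance thresholds and the sign convention are illustrative assumptions, not values taken from the controller in [17].

```python
def control_action(f, theta, f_tol=0.03, theta_tol=0.05):
    """Map the lateral offset f [m] and heading deviation theta [rad] to one of
    three control actions. Thresholds and sign convention (positive = drift
    towards the right row) are illustrative assumptions.
    """
    if f > f_tol or theta > theta_tol:
        return "correct_left"    # actuate only the two right motors
    if f < -f_tol or theta < -theta_tol:
        return "correct_right"   # actuate only the two left motors
    return "straight"            # actuate the four motors at the same speed
```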

    Figure 7.  Scheme of the "AgriEco Robot" controller.

    Finally, in order to increase the safety of the robot in case an emergency occurs and the system loses control of the actuators, the robot stops moving if it does not receive a new command within 0.5 s. Farmers use many tools daily in the greenhouse and may leave an object in the robot's trajectory. For this reason, during the navigation process within the greenhouse, the "AgriEco Robot" will be coupled with an obstacle detector based on the 2D LIDAR/visual sensors embedded on the robot [18].
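    Such a command watchdog can be implemented, for instance, with a ROS timer; the sketch below assumes a /cmd_vel command topic and a 10 Hz check, both of which are our own choices rather than details from the paper.

```python
import rospy
from geometry_msgs.msg import Twist

TIMEOUT = 0.5  # seconds without a new command before the robot is stopped

class CommandWatchdog:
    def __init__(self):
        self.last_cmd_time = rospy.Time.now()
        rospy.Subscriber("/cmd_vel", Twist, self.on_cmd, queue_size=1)
        rospy.Timer(rospy.Duration(0.1), self.check)

    def on_cmd(self, msg):
        self.last_cmd_time = rospy.Time.now()
        # ...forward the command to the wheel controllers here...

    def check(self, event):
        if (rospy.Time.now() - self.last_cmd_time).to_sec() > TIMEOUT:
            self.stop_motors()

    def stop_motors(self):
        # Send zero speed to the four in-wheel motor controllers.
        pass

if __name__ == "__main__":
    rospy.init_node("command_watchdog")
    CommandWatchdog()
    rospy.spin()
```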

    Once the robot has safely navigated between the strawberry crop rows and reaches their end, it must handle the transition to the next row, which is performed as follows:

    The first step is to detect the end of the current row by means of the LIDAR at the front of the robot, which is easily carried out by inspecting the laser readings. Then, we use the odometry system provided by the ZED camera to obtain an estimate of the total distance travelled from the beginning of the row and compare it with the actual row length in order to confirm that the robot has reached the end of the row. After that, to enter the next row, the robot performs a sequence consisting of a circular-arc movement, followed by a backward motion and a final circular-arc movement, as shown in Figure 8.
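    A possible way to express this row-end check is sketched below; the front-sector half-angle, the "open" range threshold and the assumption that the LIDAR's zero angle points forward are ours, not specifications from the paper.

```python
import math

def end_of_row_reached(ranges, angle_min, angle_increment,
                       travelled, row_length,
                       front_half_angle=math.radians(10), open_range=3.0):
    """Combine the two cues described above: an 'open' LIDAR front sector and
    the odometry-based travelled distance compared with the known row length.

    ranges          : list of LIDAR range readings [m] (0.0 for invalid returns)
    angle_min       : angle of the first reading [rad]; angle_increment in [rad]
    travelled       : distance accumulated by the ZED visual odometry [m]
    row_length      : known length of the crop rows [m]
    """
    front = [r for i, r in enumerate(ranges)
             if abs(angle_min + i * angle_increment) <= front_half_angle and r > 0.0]
    front_is_open = bool(front) and min(front) > open_range
    return front_is_open and travelled >= row_length
```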

    Figure 8.  The "AgriEco Robot" trajectory between rows.

    Once the robot is placed at the beginning of the new crop row, the odometry system is reset to start computing the distance travelled for the new row, and the robot continues navigating and spraying pesticides until all the rows are completed.

    Finally, the pesticide spraying system operates automatically and simultaneously with the autonomous navigation, with the robot following a stop-and-go strategy. The robot moves a specific distance corresponding to the real distance between successive plants (known beforehand), then stops and sprays the plants located at both sides of the robot. It then resumes its navigation and repeats this process until the end of the row. The 2D LIDAR-based navigation can encounter certain difficulties, such as a LIDAR sensor failure. For this reason, in order to improve the robustness and efficiency of our process, we have incorporated a ZED-based verification system that provides the robot's orientation angle as an auxiliary source during navigation.
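    The stop-and-go behaviour can be summarized by the following sketch, where the robot interface (drive, stop, spray_both_sides, travelled) is hypothetical and only mirrors the steps described above.

```python
def spray_row(robot, inter_plant_distance, row_length):
    """Stop-and-go spraying along one row (illustrative only).

    `robot` is a hypothetical interface assumed to expose drive(distance),
    stop(), spray_both_sides() and travelled().
    """
    while robot.travelled() < row_length:
        robot.drive(inter_plant_distance)  # advance by the known plant spacing
        robot.stop()                       # halt between two facing plants
        robot.spray_both_sides()           # actuate both motorized spraying arms
```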

    In this section we present the experiments performed to evaluate the robustness and performance of the "AgriEco Robot" when operating within a real greenhouse, shown in Figure 4, which has been built at the Faculty of Science in Rabat, Morocco (GPS coordinates: 34.008287475248935, -6.838260257670796) and has dimensions of 5 m × 9 m × 2.5 m (width, length and height, respectively). The greenhouse contains four rows with an inter-row spacing of 70 cm and an aisle length of 500 cm. Each row has a height of 25 cm and includes 11 plants. The experiment consisted of automatically navigating between the two rows on the right of the image while spraying pesticides, detecting the end of the rows, changing to the next aisle and continuing to navigate and spray. The 2D LIDAR was calibrated beforehand using two perpendicular walls to ensure that the sensor was properly positioned in the middle of the robot. In order to assess the performance of the robot's navigation, we computed the lateral error (i.e., the deviation of the followed trajectory from the center of the aisle) as well as the heading error of the robot (see Figure 9 and Figure 10).

    Figure 9.  The estimated robot lateral error.
    Figure 10.  The estimated robot heading angle error.

    As can be seen in the figures, the lateral error remains bounded within a few centimeters during the whole experiment, with a root mean square (RMS) value of 2.99 cm, while the heading RMS error is 3.27°. In this experiment, the robot navigated at a speed of 0.44 m/s (which can be considered a safe speed for indoor operation) along a trajectory of approximately 5 m. Finally, it is important to note that in the actual field the rows were not perfectly aligned, but computing the distance between the rows in real time allowed the robot to reactively adapt its navigation to the actual row alignment.
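    For completeness, the RMS values quoted above follow the usual definition, which can be computed, for example, as:

```python
import numpy as np

def rms(errors):
    """Root mean square of an error sequence (e.g. lateral error in cm)."""
    e = np.asarray(errors, dtype=float)
    return float(np.sqrt(np.mean(e ** 2)))
```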

    This work presented an autonomous navigation method for an agricultural robot based on a 2D LIDAR sensor and a motion controller. Our system automatically estimates the robot's lateral deviation with respect to the center of the crop aisles as well as its heading; control actions are then sent to the robot's motors in order to correct them. The entire system has been implemented in ROS, which has facilitated the integration and communication of the software and hardware of our robot. The autonomous navigation method has been evaluated in a real-world scenario with the robot spraying pesticides within a strawberry greenhouse, proving its usefulness and performance for agricultural applications. As future work, we will focus on improving the spraying procedure by developing an automatic and targeted pesticide spraying system. For this, we intend to use an image processing approach based on the omnidirectional vision sensor embedded on the robot to automatically recognize crops and servo-control the sprayer's motorized arms.

    The authors are thankful to the Ministry of Higher Education, Research and Innovation (MHERI) and the National Center for Scientific and Technical Research of Morocco (CNRST) for financing this project.

    The authors declare that there is no conflict of interest in this paper.



    [1] Tsimbiri PF, Moturi WN, Sawe J, et al. (2015) Health Impact of Pesticides on Residents and Horticultural Workers in the Lake Naivasha Region, Kenya. Occup Dis Environ Med 3: 24-34. https://doi.org/10.4236/odem.2015.32004 doi: 10.4236/odem.2015.32004
    [2] Rincón VJ, Páez FC, Sánchez-Hermosilla J (2018) Potential dermal exposure to operators applying pesticide on greenhouse crops using low-cost equipment. Sci Total Environ 630: 1181-1187. https://doi.org/10.1016/j.scitotenv.2018.02.235 doi: 10.1016/j.scitotenv.2018.02.235
    [3] Collado E, Valdes E, Garcia A, et al. (2021) Design and implementation of a low-cost IoT-based agroclimatic monitoring system for greenhouses. AIMS Electronics and Electrical Engineering 5: 251-283. https://doi.org/10.3934/electreng.2021014 doi: 10.3934/electreng.2021014
    [4] Feng Q, Zou W, Fan P, et al. (2018) Design and test of robotic harvesting system for cherry tomato. Int J Agric Biol Eng 11: 96-100. https://doi.org/10.25165/j.ijabe.20181101.2853 doi: 10.25165/j.ijabe.20181101.2853
    [5] Reiser D, Sehsah ES, Bumann O, et al. (2019) Development of an Autonomous Electric Robot Implement for Intra-Row Weeding in vineyards. Agriculture 9: 18. https://doi.org/10.3390/agriculture9010018 doi: 10.3390/agriculture9010018
    [6] Haibo L, Shuliang D, Zunmin L, et al. (2015) Study and Experiment on a Wheat Precision Seeding Robot. J Robot 2015: 1-9. https://doi.org/10.1155/2015/696301 doi: 10.1155/2015/696301
    [7] Shalal N, Low T, McCarthy C, et al. (2013) A review of autonomous navigation systems in agricultural environments.
    [8] Åstrand B, Baerveldt AJ, (2005) A vision based row-following system for agricultural field machinery. Mechatronics 15: 251-269. https://doi.org/10.1016/j.mechatronics.2004.05.005 doi: 10.1016/j.mechatronics.2004.05.005
    [9] Ericson S, Åstrand B (2009) A vision-guided mobile robot for precision agriculture. Precision agriculture'09: papers presented at the 7th European Conference on Precision Agriculture, Wageningen, the Netherlands, 6-8 June 2009.
    [10] Chen B, Tojo S, Watanabe K (2003) Machine Vision for a Micro Weeding Robot in a Paddy Field. Biosyst Eng 85: 393-404. https://doi.org/10.1016/S1537-5110(03)00078-3 doi: 10.1016/S1537-5110(03)00078-3
    [11] Montalvo M, Pajares G, Guerrero JM, et al. (2012) Automatic detection of crop rows in maize fields with high weeds pressure. Expert Syst Appl 39: 11889-11897. https://doi.org/10.1016/j.eswa.2012.02.117 doi: 10.1016/j.eswa.2012.02.117
    [12] Hiremath SA, van der Heijden GW, van Evert FK, et al. (2014) Laser range finder model for autonomous navigation of a robot in a maize field using a particle filter. Comput Electron Agric 100: 41-50. https://doi.org/10.1016/j.compag.2013.10.005 doi: 10.1016/j.compag.2013.10.005
    [13] Malavazi FB, Guyonneau R, Fasquel JB, et al. (2018) LiDAR-only based navigation algorithm for an autonomous agricultural robot. Comput Electron Agric 154: 71-79. https://doi.org/10.1016/j.compag.2018.08.034 doi: 10.1016/j.compag.2018.08.034
    [14] Higuti VA, Velasquez AE, Magalhaes DV, et al. (2019) Under canopy light detection and ranging‐based autonomous navigation. J Field Robot 36: 547-567. https://doi.org/10.1002/rob.21852 doi: 10.1002/rob.21852
    [15] Barawid OC, Mizushima A, Ishii K, et al. (2007) Development of an Autonomous Navigation System using a Two-dimensional Laser Scanner in an Orchard Application. Biosyst Eng 96: 139-149. https://doi.org/10.1016/j.biosystemseng.2006.10.012 doi: 10.1016/j.biosystemseng.2006.10.012
    [16] Martínez A, Fernández E (2013) Learning ROS for robotics programming : a practical, instructive, and comprehensive guide to introduce yourself to ROS, the top-notch, leading robotics framework. Available from: https://www.semanticscholar.org/paper/Learning-ROS-for-robotics-programming-%3A-a-and-guide-Mart%C3%ADnez-Fern%C3%A1ndez/d5c04230cc426c85c09933e3ab1ec6208a747293
    [17] Abanay A, Masmoudi L, Elharif A, et al. (2017) Design and development of a mobile platform for an agricultural robot prototype. Proceedings of the 2nd International Conference on Computing and Wireless Communication Systems - ICCWCS'17, 1-5. https://doi.org/10.1145/3167486.3167527 doi: 10.1145/3167486.3167527
    [18] Abanay A, Masmoudi L, El Ansari M (2022) A Calibration Method of 2D LIDAR-Visual Sensors Embedded on an Agricultural Robot. Optik 249: 168254. https://doi.org/10.1016/j.ijleo.2021.168254 doi: 10.1016/j.ijleo.2021.168254
    [19] ZED Stereo Camera. Available from: https://www.stereolabs.com/zed/
    [20] Giubilato R, Chiodini S, Pertile M, et al. (2019) An evaluation of ROS-compatible stereo visual SLAM methods on a nVidia Jetson TX2. Measurement 140: 161-170. https://doi.org/10.1016/j.measurement.2019.03.038 doi: 10.1016/j.measurement.2019.03.038
    [21] Min SK, Delgado R, Byoung WC (2018) Comparative Study of ROS on Embedded System for a Mobile Robot. J Autom Mob Robot Intell Syst 12: 61-67. https://doi.org/10.14313/JAMRIS_3-2018/19 doi: 10.14313/JAMRIS_3-2018/19
  • © 2022 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)