
[1] V. Volpert, B. Xu, A. Tchechmedjiev, S. Harispe, A. Aksenov, Q. Mesnildrey, A. Beuter. Characterization of spatiotemporal dynamics in EEG data during picture naming with optical flow patterns. Mathematical Biosciences and Engineering, 2023, 20(6): 11429-11463. doi: 10.3934/mbe.2023507
[2] Sridevi Sriram, Hayder Natiq, Karthikeyan Rajagopal, Ondrej Krejcar, Hamidreza Namazi. Dynamics of a two-layer neuronal network with asymmetry in coupling. Mathematical Biosciences and Engineering, 2023, 20(2): 2908-2919. doi: 10.3934/mbe.2023137
[3] Diego Fasoli, Stefano Panzeri. Mathematical studies of the dynamics of finite-size binary neural networks: A review of recent progress. Mathematical Biosciences and Engineering, 2019, 16(6): 8025-8059. doi: 10.3934/mbe.2019404
[4] Anna Cattani. FitzHugh-Nagumo equations with generalized diffusive coupling. Mathematical Biosciences and Engineering, 2014, 11(2): 203-215. doi: 10.3934/mbe.2014.11.203
[5] Zhenglong Tang, Chao Chen. Spatio-temporal information enhance graph convolutional networks: A deep learning framework for ride-hailing demand prediction. Mathematical Biosciences and Engineering, 2024, 21(2): 2542-2567. doi: 10.3934/mbe.2024112
[6] Xiaowen Jia, Jingxia Chen, Kexin Liu, Qian Wang, Jialing He. Multimodal depression detection based on an attention graph convolution and transformer. Mathematical Biosciences and Engineering, 2025, 22(3): 652-676. doi: 10.3934/mbe.2025024
[7] Mahtab Mehrabbeik, Fatemeh Parastesh, Janarthanan Ramadoss, Karthikeyan Rajagopal, Hamidreza Namazi, Sajad Jafari. Synchronization and chimera states in the network of electrochemically coupled memristive Rulkov neuron maps. Mathematical Biosciences and Engineering, 2021, 18(6): 9394-9409. doi: 10.3934/mbe.2021462
[8] Felicia Maria G. Magpantay, Xingfu Zou. Wave fronts in neuronal fields with nonlocal post-synaptic axonal connections and delayed nonlocal feedback connections. Mathematical Biosciences and Engineering, 2010, 7(2): 421-442. doi: 10.3934/mbe.2010.7.421
[9] P. E. Greenwood, L. M. Ward. Rapidly forming, slowly evolving, spatial patterns from quasi-cycle Mexican Hat coupling. Mathematical Biosciences and Engineering, 2019, 16(6): 6769-6793. doi: 10.3934/mbe.2019338
[10] Xin-You Meng, Tao Zhang. The impact of media on the spatiotemporal pattern dynamics of a reaction-diffusion epidemic model. Mathematical Biosciences and Engineering, 2020, 17(4): 4034-4047. doi: 10.3934/mbe.2020223
As an important branch of nonlinear dynamics, pattern dynamics spans biology, physics, mathematics, chemistry, medicine, astronomy and other disciplines, and research in these interdisciplinary areas has become a hot spot [1]. A pattern is a non-uniform macroscopic structure with some regularity in time or space, and patterns exist widely in nature: patterns formed by self-organization in organic polymers, crystal structures in inorganic chemistry, stripes on animal fur, streak clouds in the sky, etc. [2]. From the thermodynamic point of view, the first two examples exist at thermodynamic equilibrium, while the last two arise away from thermodynamic equilibrium [3]. Pattern dynamics is an important branch of nonlinear science [4]. Its purpose is to explore the basic laws of pattern formation and evolution, which are common to many systems in the objective world and have universal guiding significance [5]. In recent years, complex dynamic networks, viewed as groups of interconnected basic units, have attracted increasing attention in fields such as social science, biology, mathematics and engineering science [6]. One of the most interesting and remarkable phenomena in complex dynamic networks is collective behavior [7]. Collective behavior is an important attribute of complex systems, meaning that the whole is greater than the sum of its parts [8]. Because complex networks are natural spatiotemporal systems, patterns can be observed in them [9]. Spatial patterns such as target patterns, spiral waves and vortex waves occur in complex systems such as reaction-diffusion systems and neural networks [10].
The spiral wave is an important non-equilibrium pattern that reflects the macroscopic structure of a nonlinear system with special regularity in time or space, and it exists widely in excitable, oscillatory and bistable systems [11]. Many studies describe the spiral wave as a traveling wave rotating outward from a spiral center in a two-dimensional excitable medium [12]. Generally speaking, a spiral wave forms either from a free end generated in a propagating wave or from wave collisions in an inhomogeneous medium [13]. Many interesting works have been carried out, most of which have proved effective in removing spiral waves and preventing them from breaking up [14]. Spiral waves in heart tissue are believed to be harmful, so many schemes have been proposed to suppress them and prevent their breakup [15]. A neural network is a complex dynamic network that can exhibit many spatially structured activity states, such as spatiotemporal chaos, stochastic resonance, synchronization, chimera states and spiral waves [16]. Network structure, the neuronal model, and even disturbances and noise, which are inherent properties of neural networks, all affect pattern formation [17]. Studying these phenomena can bring new insights into the function of neurons [18,19]. Among the tools for studying neural networks, the wave propagation mechanism is one of the most effective [20]. The spiral wave is a kind of collective behavior that exists widely in nature [21]. In fact, a spiral wave is a special propagating nonlinear wave [22]. It rotates around a center (called the seed), which determines the wave dynamics [23]. Studying the dynamics of spiral waves is important because they have been observed in the mammalian neocortex and in cardiac arrhythmia [24]. It has been shown that both atrial fibrillation and ventricular fibrillation are caused by spiral waves [25].
The spiral seed in cardiac fibers rotates faster than the natural frequency of the heart, making the heartbeat irregular [26] and thereby causing fibrillation [27]. Therefore, modeling and identifying the structures formed by spiral waves can help in designing methods to eliminate spiral waves and control fibrillation [28].
The combination of chemical, physical, electrical and structural characteristics of neurons makes them highly complex dynamic units [10]. Zhao et al. reviewed recent developments in the dynamic behavior of complex networks (CNs) and complex networks with multi-weights (CNMWs) under various control methods, and formulated several output synchronization criteria for multiple-output coupled complex networks (MOCCNs) using the Lyapunov functional method and inequality techniques [29,30]. One of the best tools for dealing with this complexity is the study of neural spatiotemporal patterns [31]. The electrophysiological and structural characteristics of neurons in neural networks lead to complex behavior [32,33]. These behaviors are the driving force behind our biological behavior as human beings [34]. Technically, the firing rhythm of neurons determines the function of the brain [35]. These rhythms mainly include spikes and bursts, or a combination of the two [36]. In other words, they help us better understand neuronal function, even though detailed information about neurons' chemical, electrical, morphological and structural properties is scarce [37]. In this field, one of the most important patterns is the spiral, which can be observed in many chemical, biological, physical and ecological systems [38]. Spirals are unique because they are self-organizing and self-sustaining [10,39]. They can play a regulatory role in a system and change its dynamics [40]. For example, there is experimental evidence that spiral waves are crucial to some ongoing cortical activities, because they act as rhythmic regulators in neural populations [41]. Sleep disorders, seizures and attention deficit hyperactivity disorder are just a few examples [42]. Spiral waves can also cause arrhythmias; they are considered the main cause of reentrant wave fronts [43].
Reentry is one of the most prominent types of arrhythmia and can lead to sudden death [44]. Experimentally, spiral waves exist widely in nature, for example in the oxidation of carbon monoxide on platinum [45]. Biological experiments have found spiral waves in the cerebral cortex [46], and they have also been observed in human cardiac muscle [47]. Using a simple active-medium model, Kuklik et al. studied the influence of spatially spread inhomogeneity of transverse element coupling on the spiral wave trajectory [48]. They also used the FitzHugh-Nagumo model of an excitable medium to investigate the effect of random perturbation of cell coupling on the stability of a spiral wave in 2010 [49]. They argued that electrical cardioversion can lead to one of three outcomes (immediate termination of arrhythmic activity, delayed termination or unsuccessful termination) and proposed a model of atrial fibrillation as the coexistence of several spiral waves pinned in an active medium with inhomogeneity [50]. In Kumar and Das's work, molecular dynamics simulation was used to demonstrate the excitation of two-dimensional dusty plasma at the particle level [51]. Kwon et al. demonstrated, by computer simulation of a simple mathematical model of spiral wavefront dynamics, a new mechanism that can create a period-2 spiral [52]. Lacitignola et al. tested their findings with numerical approximations of the complete model and found interesting scenarios in which appropriate changes of the system parameters lead to spiral breakup [53]. Researchers studying the firing activity of neurons in the cerebral cortex have also observed patches with spiral waves, which are closely related to information transmission between neurons in the brain's neural network [54].
In experimental studies, researchers have found spiral-wave discharge patterns in the myocardial tissue of cardiac patients; more seriously, spiral wave breakup may cause cardiac fibrillation, resulting in sudden cardiac death [55]. In general, the study of spiral waves has very important practical significance [8].
Little has been reported so far about the spatiotemporal dynamics of neural networks constructed from Izhikevich neurons, which were derived from the modeling of cortical neurons [56]. At the same time, although Izhikevich neurons exhibit many different firing patterns, analyses have mostly treated them simply as excitatory or inhibitory neurons when building neural networks, and have rarely considered the formation and breakup of spiral waves in such networks [57]. According to existing research, the neural network of the cerebral cortex has a 5-layer structure, so the bi-layer network is its most basic component. Neurons inevitably connect with surrounding neurons, and these connections are the channels for information exchange. Under the influence of these complex factors, neural networks show different synchronization properties and spatiotemporal patterns. Moreover, it is unclear whether neural networks constructed from different types of Izhikevich neurons can induce spiral waves [58]. In this paper, matrix neural networks constructed from several types of Izhikevich neurons under random boundary conditions are discussed, and the influence of the coupling strength between neurons on the spatiotemporal dynamics is investigated. A square neural network driven by a random boundary is constructed first, and then multiple connection regions are set on the network to connect it with a second-layer neural network. The first layer generates spiral waves, which are transmitted through the multiple regions to the second layer, where the formation and breakup of spiral waves are observed.
To gain a clearer understanding of cortical neural networks, the synchronization of the second-layer network is also studied by changing the coupling strength between neurons as well as the inter-layer connection strength; the variation of the synchronization factor with the coupling strength between neurons in the second layer shows very interesting results. The paper is organized as follows: in Section 2, the Izhikevich neuronal model is introduced and the matrix network is constructed; in Section 3, the numerical simulation results are analyzed; Section 4 summarizes the main conclusions.
To understand how the brain works, one needs to combine experimental studies of animal and human nervous systems with numerical simulations of large-scale brain models [59]. Eugene M. Izhikevich proposed a neuronal model in 2003 that is computationally simple yet capable of producing the rich firing patterns exhibited by real biological neurons [60]. The Izhikevich neuronal model is biologically as plausible as the Hodgkin-Huxley model and as computationally efficient as the integrate-and-fire model [61]. We choose the Izhikevich model from among the various neuronal models because it was designed to model cortical and thalamic neurons and can reproduce all of their known firing behaviors, while having a simple structure, rich physiological meaning and high computational efficiency. More importantly, it is well suited to network simulation. The Izhikevich neuronal model driven by an external stimulus current can be written as follows:
$$\begin{cases} \dfrac{dv}{dt} = 0.04v^{2} + 5v + 140 - u + I, \\[4pt] \dfrac{du}{dt} = a(bv - u), \end{cases} \tag{1}$$
where v represents the membrane potential of the cortical neuron and u represents the recovery variable; both are dimensionless [62]. The constants a and b control the neuron type, and I is the external stimulus current [63].
When the membrane potential exceeds its peak value, that is, when v > 30 mV, the membrane potential is reset as follows:
$$\begin{cases} v \rightarrow c, \\ u \rightarrow u + d, \end{cases} \tag{2}$$
where c and d are constants.
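As an illustration (not the authors' code), Eqs (1) and (2) can be integrated with a simple explicit Euler scheme, the same method the paper later uses for its network simulations. The sketch below assumes the RS parameter row of Table 1 and the stimulus current I = 10 used in the paper's simulations:

```python
import numpy as np

def simulate_izhikevich(a, b, c, d, I, t_max=500.0, dt=0.02):
    """Integrate Eq (1) with the reset rule (2) by the explicit Euler method.

    Returns the time axis (ms) and the membrane-potential trace (mV).
    """
    n = int(t_max / dt)
    t = np.arange(n) * dt
    v = np.empty(n)
    u = np.empty(n)
    v[0], u[0] = c, b * c          # start from the reset potential
    for k in range(n - 1):
        dv = 0.04 * v[k] ** 2 + 5.0 * v[k] + 140.0 - u[k] + I
        du = a * (b * v[k] - u[k])
        v[k + 1] = v[k] + dt * dv
        u[k + 1] = u[k] + dt * du
        if v[k + 1] >= 30.0:       # spike peak reached: apply reset (2)
            v[k + 1] = c
            u[k + 1] = u[k + 1] + d
    return t, v

# Regular-spiking (RS) parameters from Table 1, with I = 10
t, v = simulate_izhikevich(a=0.02, b=0.2, c=-65.0, d=8.0, I=10.0)
```

Because the reset is applied inside the integration loop, the stored trace never exceeds the 30 mV peak, and each spike appears as a sharp drop back to c.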
Different settings of the constants a, b, c and d yield several typical firing types of the Izhikevich neuron, for example regular spiking (RS), fast spiking (FS), chattering (CH) and intrinsically bursting (IB) [64]. The specific values and the corresponding firing types are given in Table 1.
To study the collective properties and spatiotemporal patterns of the neural network, a matrix neural network containing 200 × 200 nodes is constructed first, with one neuron placed at each node [65]. Each neuron is connected to its four neighbors (upper, lower, left and right) with connection strength D; a schematic diagram of the connections between neurons is plotted in Figure 1. At the boundary of the matrix network, the no-flow boundary condition is used [66]: the value just outside the boundary is taken equal to the value just inside it, so no current flows into the network, hence the name "no-flow boundary". Next, a second layer of the matrix neural network is constructed in the same manner, and the two layers are connected by channels in multiple regions [67]. The Izhikevich model can then be used to write the dynamical equations of the two-layer network, in which each neuron is coupled to its nearest neighbors in a two-dimensional matrix [68]. The collective behavior of the two-layer network is governed by
$$\begin{cases} \dfrac{dv_{1ij}}{dt} = 0.04v_{1ij}^{2} + 5v_{1ij} + 140 - u_{1ij} + I_{1ext} + D_{1}\left(v_{1i-1,j} + v_{1i+1,j} + v_{1i,j-1} + v_{1i,j+1} - 4v_{1i,j}\right), \\[4pt] \dfrac{du_{1ij}}{dt} = a(bv_{1ij} - u_{1ij}), \\[4pt] \dfrac{dv_{2ij}}{dt} = 0.04v_{2ij}^{2} + 5v_{2ij} + 140 - u_{2ij} + I_{2ext} + D_{2}\left(v_{2i-1,j} + v_{2i+1,j} + v_{2i,j-1} + v_{2i,j+1} - 4v_{2i,j}\right) + k(v_{1ij} - v_{2ij})\delta_{i\alpha}\delta_{j\beta}, \\[4pt] \dfrac{du_{2ij}}{dt} = a(bv_{2ij} - u_{2ij}), \end{cases} \tag{3}$$
where the subscripts 1 and 2 denote the first and second layer, and the subscript (i, j) denotes the location of a node within a layer. From the structure of the network, each neuron is connected to four neighbors, so the degree of the network is 4, and the total number of neurons in each layer is 40,000, which defines the scale of the neural network.
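The nearest-neighbor coupling term of Eq (3), together with the no-flow boundary condition, can be sketched in Python as follows (an illustrative implementation, not the authors' code). Edge padding creates ghost cells that copy the boundary values, so no current crosses the boundary:

```python
import numpy as np

def coupling_term(v, D):
    """D * (v_{i-1,j} + v_{i+1,j} + v_{i,j-1} + v_{i,j+1} - 4 v_{i,j})
    with no-flow boundaries: the value outside the grid is taken equal
    to the value just inside it."""
    vp = np.pad(v, 1, mode="edge")   # ghost cells copy the boundary values
    lap = (vp[:-2, 1:-1] + vp[2:, 1:-1]
           + vp[1:-1, :-2] + vp[1:-1, 2:] - 4.0 * v)
    return D * lap

rng = np.random.default_rng(1)
v = rng.random((200, 200))           # one layer of membrane potentials
c = coupling_term(v, D=1.0)
```

Because the boundary is flux-free, the coupling term sums to zero over the whole grid, and a spatially uniform layer receives no coupling current at all.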
In addition, D1 and D2 in Eq (3) represent the coupling strength between nearest-neighbor nodes in each layer of the bi-layer network, with each layer arranged as a two-dimensional matrix, as shown in Figure 2. The strength of the channel between the two layers is denoted by k; δiα = 1 for α = i and δjβ = 1 for β = j, otherwise δiα = 0 and δjβ = 0, where i, j, α and β are integers [69]. For the bi-layer neural network, multiple connection regions are opened at specified locations to allow information exchange between the layers [70]. For example, in Figure 2(a) the connection region is opened between nodes 99 and 102. Figure 2(b) shows the case where the two layers connect in two local areas, (99 ≤ α ≤ 102, 65 ≤ β ≤ 68) and (99 ≤ α ≤ 102, 131 ≤ β ≤ 134). The three local coupling areas (65 ≤ α, β ≤ 68), (65 ≤ α ≤ 68, 131 ≤ β ≤ 134) and (131 ≤ α ≤ 134, 99 ≤ β ≤ 102) are investigated in Figure 2(c), and the four local coupling areas (65 ≤ α, β ≤ 68), (65 ≤ α ≤ 68, 131 ≤ β ≤ 134), (131 ≤ α ≤ 134, 65 ≤ β ≤ 68) and (131 ≤ α, β ≤ 134) are displayed in Figure 2(d).
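The factor δiαδjβ in Eq (3) simply restricts the inter-layer current to the connection windows. A minimal sketch (illustrative only; the window coordinates are the four areas of Figure 2(d) quoted above, converted to 0-based Python slices):

```python
import numpy as np

N = 200
mask = np.zeros((N, N))
# The four local connection areas of Figure 2(d); each (alpha, beta) window
# from the text becomes a 0-based slice.
windows = [((65, 68), (65, 68)), ((65, 68), (131, 134)),
           ((131, 134), (65, 68)), ((131, 134), (131, 134))]
for (a0, a1), (b0, b1) in windows:
    mask[a0 - 1:a1, b0 - 1:b1] = 1.0   # delta_{i,alpha} * delta_{j,beta} = 1 here

def interlayer_current(v1, v2, k):
    """Channel term k (v1_ij - v2_ij) delta_{i,alpha} delta_{j,beta}:
    non-zero only inside the connection windows."""
    return k * (v1 - v2) * mask
```

Each window spans 4 × 4 nodes, so only 64 of the 40,000 node pairs exchange current in this configuration.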
To investigate the statistical features of the collective dynamics of the neuronal network more systematically, the synchronization factor R of the neural network is calculated using mean-field theory [71]. It is defined as follows:
$$F = \frac{1}{N^{2}}\sum_{j=1}^{N}\sum_{i=1}^{N} v_{ij}, \qquad R = \frac{\langle F^{2}\rangle - \langle F\rangle^{2}}{\dfrac{1}{N^{2}}\sum_{j=1}^{N}\sum_{i=1}^{N}\left(\langle v_{ij}^{2}\rangle - \langle v_{ij}\rangle^{2}\right)}, \tag{4}$$
where vij denotes the membrane potential of the neuron at node (i, j) of a layer, which can be calculated from Eq (1); N is the number of nodes along one side of the network, N² is the total number of nodes, and the symbol ⟨·⟩ denotes averaging over time [72]. In particular, if R approaches 1, the firing of all neurons is fully synchronized; if R approaches 0, the neuronal system is in an incompletely synchronized state [73]. Previous studies have shown that smaller values of the synchronization factor support ordered spatial patterns, while larger values develop into homogeneous states [74]. Appropriate values of the synchronization factor can generate graceful spatial waves in the network [75].
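Eq (4) transcribes directly into Python. The sketch below (illustrative, not the authors' code) assumes the membrane-potential history is stored as an array of shape (T, N, N):

```python
import numpy as np

def synchronization_factor(v):
    """Eq (4) for a membrane-potential history v of shape (T, N, N).

    F(t) is the mean field over the N^2 neurons at each time step; R is the
    variance of F divided by the average single-neuron variance, so R -> 1
    for fully synchronized firing and R -> 0 for incoherent activity.
    """
    F = v.mean(axis=(1, 2))              # mean field at each time step
    numerator = F.var()                  # <F^2> - <F>^2
    denominator = v.var(axis=0).mean()   # (1/N^2) sum_ij (<v_ij^2> - <v_ij>^2)
    return float(numerator / denominator)

# Sanity check: identical traces on all nodes give R = 1
t = np.linspace(0.0, 10.0, 500)
v_sync = np.sin(t)[:, None, None] * np.ones((1, 4, 4))
```

For statistically independent traces, the variance of the mean field is suppressed by a factor of about 1/N², so R stays close to 0, matching the interpretation given above.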
The collective dynamics are computed with the Euler algorithm with a time step of 0.02 when exploring the spatiotemporal properties of the neural networks [76]. In each layer, the 200 × 200 neuron nodes (Ni, Nj) are uniformly embedded in a two-dimensional square array, with nearest-neighbor coupling between neurons and no-flow boundary conditions [77]. Unless otherwise specified, a direct current is applied to each neuron in the first layer as the external force, with Iext = 10 for each Izhikevich neuron [78]. On the boundary of the first layer of the matrix neural network, random initial values are generated as v0 = 0.8ξln(i) – 0.2ξln(j) – 3 and u0 = –0.8ξln(i) + 0.2ξln(j) – 5, where ξ represents a random number between 0 and 1 [79,80,81,82]. The initial values of all other neurons are set to (v0, u0) = (0, 0).
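The initialization described above can be sketched as follows. This is an illustrative reading of the text, with two assumptions the paper leaves open: the indices i, j inside the logarithms are 1-based (so that log(0) never occurs), and a single random ξ is drawn per boundary node and shared between the v0 and u0 formulas:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 200
v0 = np.zeros((N, N))   # interior neurons start at (v0, u0) = (0, 0)
u0 = np.zeros((N, N))

# Random values on the boundary nodes only, following the formulas in the text.
# Assumptions (not specified in the text): 1-based i, j inside the logs,
# and one xi in (0, 1) per boundary node shared by both formulas.
for i in range(1, N + 1):
    for j in range(1, N + 1):
        if i in (1, N) or j in (1, N):
            xi = rng.random()
            v0[i - 1, j - 1] = 0.8 * xi * np.log(i) - 0.2 * xi * np.log(j) - 3.0
            u0[i - 1, j - 1] = -0.8 * xi * np.log(i) + 0.2 * xi * np.log(j) - 5.0
```

The constant offsets (–3 and –5) keep every boundary value away from zero, so the boundary is always distinguishable from the quiescent interior.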
The time series of the neuronal membrane potential are calculated by the Euler method for a certain period according to the parameters given in Table 1, and the results are plotted in Figure 3. Figure 3(a) shows that the membrane potential of the Izhikevich neuron in the CH firing mode exhibits a series of periodic burst firings, with each burst showing a similar morphology. For the FS firing pattern, shown in Figure 3(b), the model parameters are derived from inhibitory cortical neurons, and the membrane potential exhibits a periodic firing sequence with an extremely high frequency. The IB firing pattern first shows patterned bursts followed by a series of repeated spikes, as shown in Figure 3(c). Neurons in the RS state are the most typical cortical neurons; their firing frequency is not too fast because their parameters give them spike-frequency adaptation, as shown in Figure 3(d). Unlike FS neurons, neurons in the RS, CH and IB states are all used to imitate excitatory neurons, and previous studies have shown that the ratio of excitatory to inhibitory neurons in cortical neural networks is generally 4:1.
Table 1. Parameter values of the Izhikevich neuron for the different firing types.

Types | a | b | c | d
--- | --- | --- | --- | ---
RS | 0.02 | 0.2 | –65 | 8
FS | 0.1 | 0.2 | –65 | 2
CH | 0.02 | 0.2 | –50 | 2
IB | 0.02 | 0.2 | –55 | 4
To understand the specific properties of the four firing modes of the Izhikevich neuron more accurately, the bifurcation of the membrane potential is drawn in Figure 4. Bifurcation often appears in the mathematical study of dynamical systems and refers to sudden qualitative changes in the system caused by small, continuous changes of the system parameters. Here, the effect of the external stimulus current on the spike interval of the membrane potential is analyzed, and the results are represented as bifurcation diagrams of the ISI (inter-spike interval). Figure 4(a), (b) show that the spike interval of the RS and FS neurons gradually decreases with increasing stimulus current intensity I; overall, the ISI curves decrease monotonically with increasing current. This means that the RS and FS neurons are in a spiking state and that the time interval between spikes keeps shrinking. The same conclusion can be drawn from the membrane-potential time series in Figure 3. However, an obvious difference between the two cases is that the spike interval of the FS neuron is smaller overall than the ISI of the RS neuron. Figure 4(c), (d) show that neurons in the CH and IB states exhibit richer firing properties when the external stimulus current is relatively small. For example, for the CH neuron, the firing mode undergoes a transition from a period-2 bursting state to a bursting state when the external stimulus current exceeds 3.5; below 3.5, the firing modes include a chaotic state, a period-3 bursting state and a period-2 bursting state.
For the IB neuron, as the external stimulus current increases from 0 to 30, the firing mode passes through a period-2 bursting state and chaotic firing, then evolves to a period-2 bursting state and finally to a spiking state.
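The ISI analysis can be reproduced in outline with the following sketch (illustrative, not the authors' code; Euler step 0.02 as in the simulations, RS parameters from Table 1, and an arbitrary set of sample currents):

```python
import numpy as np

def spike_times(a, b, c, d, I, t_max=2000.0, dt=0.02):
    """Euler integration of Eqs (1)-(2); returns the times of the reset events."""
    v, u, times = c, b * c, []
    for k in range(int(t_max / dt)):
        dv = 0.04 * v * v + 5.0 * v + 140.0 - u + I
        du = a * (b * v - u)
        v += dt * dv
        u += dt * du
        if v >= 30.0:            # spike peak: apply reset (2) and record the time
            v, u = c, u + d
            times.append(k * dt)
    return np.asarray(times)

# For the RS neuron the mean ISI shrinks as I grows, consistent with the
# monotonically decreasing ISI curve of Figure 4(a).
for I in (8.0, 12.0, 20.0):
    isi = np.diff(spike_times(0.02, 0.2, -65.0, 8.0, I)[3:])  # drop transient
    print(f"I = {I}: mean ISI = {isi.mean():.2f} ms")
```

Sweeping I on a fine grid and plotting all distinct ISI values per current reproduces the bifurcation-diagram view; for the CH and IB parameter rows the same sweep yields the multiple ISI branches described above.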
In the first layer of the network, the influence of the random boundary values of the matrix network on the spatiotemporal pattern at different times can be observed under the combined effect of the external current and the random boundary, as shown in Figure 5. The first layer generates spiral waves induced by the random boundary values under appropriate coupling strength and external forcing, while the second layer is in a different state. At early times, the advance of traveling waves induced by the random boundary can be observed, and some spiral seeds and broken spiral waves gradually appear. As time increases, single-armed and double-armed spiral seeds can also be observed.
Coupling channels between the two layers are set in multiple areas, and the spiral waves of the first layer affect the second layer through these channels. When only one connection channel is set in the middle of the network, the spatiotemporal patterns of the second layer at different times are shown in Figure 6. An obvious characteristic of Figure 6 is that the images at all times resemble target waves, except for a breakage at 500 time units. The target-like waves tend to move from the boundary toward the center of the matrix network, so different spatiotemporal patterns are seen at different times. Since the change of the spatiotemporal pattern with observation time is not of analytical significance, in the subsequent research we fix the time at 5000 time units and study the impact of the neuronal coupling strength on the spatiotemporal pattern.
As shown in Figure 7, the developed pattern of the second layer is calculated under different coupling strengths D2 at t = 5000 time units when the two layers connect in one local area. For D2 = 0.9, the random boundary values propagate from the boundary of the matrix network to the middle area, and after the traveling waves collide, a spiral seed and a target-like wave begin to form. As the coupling strength between neurons in the second layer increases further, the target-like wave breaks and gradually disappears, and then a target-like wave with high potential appears in the central area of the network. Increasing the coupling strength still further, the spatiotemporal patterns of the neural network undergo a process of convergence → diffusion → re-convergence → re-diffusion. Moreover, when the coupling strength reaches D2 = 3.0, only the neurons in the central area of the network fire while the neurons in the other areas are inhibited, which may be caused by the excessively large coupling strength.
According to Eq (4), the synchronization factor is calculated as the coupling strength between neurons in the second-layer network is varied, and the results are shown in Figure 8. The curve of the synchronization factor has an inverted bell-like shape; in other words, there is an optimal value of D2 that minimizes the synchronization factor R. Further analysis shows that when D2 is between 0.4 and 1.3, the synchronization factor is close to its minimum, meaning that the synchronization of the neural network is low; remarkably, spiral seeds or target-like waves appear in the network in this range. In essence, when the synchronization factor is small, the spatiotemporal patterns of the neural system are orderly arranged, which explains the above observation. When the coupling strength is either small or large, the neural system tends to stay in a homogeneous state, so the synchronization factor is large. These results are consistent with the previous statement.
Next, the influence of the inter-layer connection strength on the spatiotemporal patterns of the second-layer network is studied. As shown in Figure 9, the developed pattern of the second layer is calculated under different inter-layer connection strengths k at t = 5000 time units when the two layers connect in one local area. When the inter-layer connection strength is small (e.g., k = 0.1), almost all neurons are in a resting state, and only a few neurons oscillate below the threshold. As the inter-layer coupling strength increases, several relatively complete target-like wave patterns can be observed, and then these ring patterns begin to break, showing a random pattern of change. This means that the random boundary plays a greater role than the inter-layer coupling strength in generating spiral wave patterns.
The trend of the synchronization factor as the inter-layer coupling strength is varied is shown in Figure 10. The curve is a monotonically decreasing function of the inter-layer coupling strength, which indicates that the synchronization of the neurons in the network rapidly decreases to a value close to 0 as the inter-layer coupling strength increases, and then essentially fluctuates within a small range. Comparing Figures 9 and 10, when the synchronization factor is relatively large, the spatiotemporal pattern shows neither spiral waves nor spiral seeds. As the inter-layer coupling strength increases further, the synchronization factor decreases rapidly and target-like waves appear in the spatiotemporal pattern. When the inter-layer coupling strength is increased still further and the synchronization factor drops close to 0, the target-like waves begin to break. As mentioned earlier, this is also related to the randomness of the boundary values.
In order to understand the impact of the number of connecting channels between layers on the spatiotemporal pattern of the bi-layer neural network, the spatiotemporal pattern in the case of two connection areas is studied as well, as shown in Figure 11. From Figure 11(a)–(c), it can be clearly seen that the two connecting regions differ markedly from their surroundings, which indicates that the spiral waves generated by the first-layer network strongly affect the spatiotemporal pattern of the second layer. When the connection strength D2 between neurons in the second layer increases to 0.4, obvious spiral seeds can be observed, both single-armed and double-armed. If the connection strength between neurons is increased further, the spatiotemporal pattern starts to change into the form of target-like waves, but complete target-like waves can hardly be observed. Another notable feature is that the areas where the two connecting channels are located strongly affect the second-layer network, and this effect persists regardless of the coupling strength.
When the two layers are connected at two local areas, the synchronization factor as a function of the coupling intensity of the second layer is shown in Figure 12, and a curve in the form of anti-resonance is observed again. When D2 ranges from 0.5 to 1.2, the synchronization factor R is close to its minimum, whereas the synchronization factor is relatively large when the coupling strength is small. This anti-resonance curve shows that there is a certain range of coupling strengths that can inhibit the synchronization of neurons in the neural network.
Considering that the inter-layer connection strength also affects the spatiotemporal dynamics of the second-layer neural network, the developed pattern of the second layer under different inter-layer connection strengths is shown in Figure 13. Comparing Figures 13 and 14, one finds the same rule as described above: the synchronization factor is relatively large when the inter-layer coupling strength is small, and no spiral seeds or spiral waves appear in the network. If the inter-layer coupling strength k is set to 0.2, the synchronization factor shows a weak maximum, in the form of a "resonance". If the inter-layer connection strength is increased further, the synchronization factor again decreases rapidly and remains near zero. However, whether or not the synchronization factor is 0, broken spiral seeds and incomplete target-like waves can be observed in the spatiotemporal pattern. Among these broken spiral seeds, the traces of the connection channels in the two areas can also be clearly seen, indicating that the location of the connection channels plays a crucial role in the spatiotemporal pattern of the second-layer network.
In the following research, the bisection and trisection points of the spatial locations in the second-layer matrix network are identified and used as three connection areas linking the two layers. The developed pattern of the second layer is calculated under different coupling intensities D2 at t = 5000 time units when the two layers connect at three local areas, as plotted in Figure 15. It can be observed that the three connection regions have a great impact on the spatiotemporal pattern of the second-layer neural network. The diversity of the spatiotemporal patterns derives from the changes in the signals of the three connection channels: when the signals coming from the three channels collide and interact, rich and varied spatiotemporal patterns are generated in the entire network. As the collisions occur and the signals propagate further, broken spiral waves and double-armed spiral seeds appear. If the coupling strength between neurons is increased further, target-like waves can also be observed, and a large number of neurons in certain regular areas may discharge intensively at the same time.
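The multi-channel inter-layer connection can be encoded with a boolean mask over the lattice. The sketch below assumes a coupling current of the form k(v1 − v2) restricted to the channel areas; the channel positions and radius are hypothetical illustrations, not the paper's exact placement.

```python
import numpy as np

def channel_mask(n, centers, radius):
    """Boolean mask marking the local connection areas (channels) between
    the two layers; `centers` lists (row, col) lattice positions."""
    yy, xx = np.mgrid[0:n, 0:n]
    mask = np.zeros((n, n), dtype=bool)
    for r, c in centers:
        mask |= (yy - r) ** 2 + (xx - c) ** 2 <= radius ** 2
    return mask

def interlayer_drive(v1, v2, mask, k):
    """Coupling current injected into layer 2: k * (v1 - v2) inside the
    connection channels, zero elsewhere (assumed form of the coupling)."""
    return np.where(mask, k * (v1 - v2), 0.0)

n = 200
# Three hypothetical channel centers on the 200 x 200 lattice
mask3 = channel_mask(n, [(50, 50), (100, 100), (150, 150)], radius=5)
v1 = np.full((n, n), 30.0)     # spiking activity in layer 1
v2 = np.full((n, n), -65.0)    # resting layer 2
I = interlayer_drive(v1, v2, mask3, k=0.5)
print(I.max(), I.min())        # drive only inside the channels: 47.5 0.0
```

Adding or removing entries in `centers` reproduces the one-, two-, three- and four-channel configurations compared across Figures 9–22.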
The synchronization factor as a function of the coupling intensity D2 of the second layer is plotted in Figure 16 when the two layers connect at three local areas. As in the previous R-D2 curves, the relationship between the synchronization factor and the coupling strength between neurons in the second layer takes the form of "anti-resonance", indicating that there is a certain range of coupling strengths which makes the synchronization of the neural network lowest. When there are four connection regions, the same conclusion is obtained, as shown in Figure 20. In general, the synchronization factor changes with D2 in the form of anti-resonance when the coupling strength between neurons in the second layer is varied, and this conclusion holds regardless of how many connection channels exist between the two layers of networks.
If the inter-layer connection strength is changed, whether the number of connection channels between the two layers is 3 or 4, the overall trend of the spatiotemporal pattern is basically the same, as shown in Figures 17 and 21. No matter how many connection areas lie between the two layers, the effect of the connection channels on the second-layer network is clearly visible in the spatiotemporal pattern. For these two cases, the synchronization factor as a function of the inter-layer connection strength is plotted in Figures 18 and 22. Although these two figures are broadly consistent with the previous R-k curves, they show resonance-like peaks when the inter-layer coupling strength is small. Under other inter-layer coupling strengths, there is still a random transition from broken spiral seeds to target-like waves.
Figure 19 shows the spatiotemporal pattern when there are four connection regions between the two layers of networks. When the coupling strength of neurons in the second layer is 0.1, the spatiotemporal pattern in the network is basically symmetrical. This symmetry is then destroyed as the coupling strength between neurons increases; the reason may be that increasing the coupling strength also increases the randomness, which destroys the symmetry. In the subsequent images, spiral seeds and broken target-like waves can still be observed. The analysis of these images shows that when the coupling strength between neurons in the second layer is changed, the spatiotemporal patterns in the second-layer neural network may take a variety of forms, and the corresponding synchronization factor may vary with the coupling strength in the form of anti-resonance. On the other hand, when the coupling strength between layers is changed, a curve that is, as a whole, a monotonically decreasing function is observed. Figures 20–22 have been discussed above and are not repeated here.
Based on four firing patterns of the Izhikevich neuronal model, a bi-layer neural network with multi-channel connections is constructed by numerical simulation. Each layer is composed of 200 × 200 Izhikevich neurons forming a two-dimensional matrix network, and random functions are applied at the boundary of the first-layer network. The spiral wave in the first layer is transmitted to the second layer through multiple channels, and the spatiotemporal patterns and network synchronization in the second layer are studied. By means of simulation, the four different firing modes of the neurons are discussed, and the bifurcation of the membrane potential is studied. The formation and breaking mechanisms of spiral waves in the two-layer network are then studied as the coupling strength between neurons in the second layer and the inter-layer connection strength increase. Finally, the synchronization properties of the neural network are explored by varying the coupling strength between neurons in the second layer as well as the inter-layer connection strength.
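As a minimal sketch of the simulation described above: the parameter sets below are the standard Izhikevich values for the RS, FS, IB and CH firing modes, while the explicit Euler discretization, nearest-neighbour diffusive coupling, and periodic boundaries are simplifying assumptions, not the paper's exact scheme (which uses random boundary values on the first layer).

```python
import numpy as np

# Standard Izhikevich parameter sets for the four firing modes in the text
MODES = {
    "RS": dict(a=0.02, b=0.2, c=-65.0, d=8.0),   # regular spiking
    "FS": dict(a=0.10, b=0.2, c=-65.0, d=2.0),   # fast spiking
    "IB": dict(a=0.02, b=0.2, c=-55.0, d=4.0),   # intrinsically bursting
    "CH": dict(a=0.02, b=0.2, c=-50.0, d=2.0),   # chattering
}

def step_layer(v, u, I, p, dt=0.1, D=0.0):
    """One Euler step of an N x N Izhikevich layer with nearest-neighbour
    diffusive coupling of strength D (periodic boundaries for simplicity)."""
    lap = (np.roll(v, 1, 0) + np.roll(v, -1, 0)
           + np.roll(v, 1, 1) + np.roll(v, -1, 1) - 4 * v)
    dv = 0.04 * v * v + 5 * v + 140 - u + I + D * lap
    du = p["a"] * (p["b"] * v - u)
    v = v + dt * dv
    u = u + dt * du
    fired = v >= 30.0                 # spike-and-reset rule
    v[fired] = p["c"]
    u[fired] += p["d"]
    return v, u, fired

# Single RS neuron driven by a constant current: it spikes repeatedly.
p = MODES["RS"]
v = np.full((1, 1), -65.0)
u = p["b"] * v.copy()
spikes = 0
for _ in range(10000):                # 1000 ms at dt = 0.1
    v, u, fired = step_layer(v, u, 10.0, p)
    spikes += int(fired.sum())
print(spikes > 1)                     # True: tonic spiking under I = 10
```

Replacing `p` with the FS, IB or CH parameter set changes the discharge pattern of the same update rule, which is the comparison underlying the firing-mode results summarized below.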
The results indicate that the emergence and disappearance of spiral waves can be observed in the network only when the Izhikevich neurons constituting the matrix neural network fire in the RS mode; the formation of spiral wave seeds is not observed when the network is composed of neurons in the other firing modes (FS, IB and CH). Further research shows that the synchronization factor varies with the coupling strength between adjacent neurons in the second layer along an inverted bell-like curve, in the form of "anti-resonance", while its variation with the inter-layer connection strength is approximately monotonically decreasing. No matter how many connection areas lie between the two layers of neural networks, spiral waves can be observed in the second-layer matrix network, and target-like waves appear under certain conditions. Furthermore, we find that lower synchronicity is helpful for developing spatiotemporal patterns.
The study of how the coupling strength and random boundaries influence the spatiotemporal behavior of neural networks constructed from Izhikevich neurons can be instructive for further exploring signal propagation in cerebral cortical neural networks.
This work is supported by the Science and Technology Project of the Jiangxi Provincial Department of Education under Grant No. GJJ203111, the Science and Technology Project of Yuzhang Normal University under Grant No. YZYB-21-17, the Talent Introduction Project (No. NGRCZX-22-07), and the Special Fund for the Doctor of Science and Technology Program of Nanchang Institute of Science and Technology under Grant No. NGKJ-21-03.
The authors declare there is no conflict of interest.
Guowei Wang: Conceptualization, Methodology, Software, Visualization, Investigation, Writing - Reviewing and Editing; Yan Fu: Data curation, Writing - Original draft preparation, Supervision, Software, Validation.
The authors confirm that the data supporting the findings of this study are available within the article.
[1] |
N. Zhang, M. Wang, N. Wang, Precision agriculture—a worldwide overview, Comput. Electron. Agric., 36 (2002), 113–132. https://doi.org/10.1016/S0168-1699(02)00096-0 doi: 10.1016/S0168-1699(02)00096-0
![]() |
[2] |
K. H. Coble, A. K. Mishra, S. Ferrell, T. Griffin, Big data in agriculture: A challenge for the future, Appl. Econ. Perspect. Policy, 40 (2018), 79–96. https://doi.org/10.1093/aepp/ppx056 doi: 10.1093/aepp/ppx056
![]() |
[3] |
A.-K. Mahlein, Plant disease detection by imaging sensors–parallels and specific demands for precision agriculture and plant phenotyping, Plant Dis., 100 (2016), 241–251. https://doi.org/10.1094/PDIS-03-15-0340-FE doi: 10.1094/PDIS-03-15-0340-FE
![]() |
[4] |
J. C. Koh, M. Hayden, H. Daetwyler, S. Kant, Estimation of crop plant density at early mixed growth stages using UAV imagery, Plant Methods, 15 (2019), 1–9. https://doi.org/10.1186/s13007-019-0449-1 doi: 10.1186/s13007-019-0449-1
![]() |
[5] | D. Sinwar, V. S. Dhaka, M. K. Sharma, G. Rani, AI-based yield prediction and smart irrigation. Internet of Things and Analytics for Agriculture, Volume 2: Springer. (2020), 155–180. https://doi.org/10.1007/978-981-15-0663-5_8 |
[6] |
A. Al-Naji, A. B. Fakhri, S. K. Gharghan, J. Chahl, Soil color analysis based on a RGB camera and an artificial neural network towards smart irrigation: A pilot study, Heliyon, 7 (2021), e06078. https://doi.org/10.1016/j.heliyon.2021.e06078 doi: 10.1016/j.heliyon.2021.e06078
![]() |
[7] |
M. Kerkech, A. Hafiane, R. Canals, Deep leaning approach with colorimetric spaces and vegetation indices for vine diseases detection in UAV images, Comput. Electron. Agric., 155 (2018), 237–243. https://doi.org/10.1016/j.compag.2018.10.006 doi: 10.1016/j.compag.2018.10.006
![]() |
[8] |
C.-J. Chen, Y.-Y. Huang, Y.-S. Li, Y.-C. Chen, C.-Y. Chang, Y.-M. Huang, Identification of fruit tree pests with deep learning on embedded drone to achieve accurate pesticide spraying, IEEE Access, 9 (2021), 21986–21997. https://doi.org/10.1109/ACCESS.2021.3056082 doi: 10.1109/ACCESS.2021.3056082
![]() |
[9] |
R. P. Sishodia, R. L. Ray, S. K. Singh, Applications of remote sensing in precision agriculture: A review, Remote Sens., 12 (2020), 3136. https://doi.org/10.3390/rs12193136 doi: 10.3390/rs12193136
![]() |
[10] |
C. Vallentin, K. Harfenmeister, S. Itzerott, B. Kleinschmit, C. Conrad, D. Spengler, Suitability of satellite remote sensing data for yield estimation in northeast Germany, Precis. Agric., 23 (2022), 52–82. https://doi.org/10.1007/s11119-021-09827-6 doi: 10.1007/s11119-021-09827-6
![]() |
[11] |
A. Khaliq, L. Comba, A. Biglia, D. R. Aimonino, M. Chiaberge, P. Gay, Comparison of satellite and UAV-based multispectral imagery for vineyard variability assessment, Remote Sens., 11 (2019), 436. https://doi.org/10.3390/rs11040436 doi: 10.3390/rs11040436
![]() |
[12] |
P. Radoglou-Grammatikis, P. Sarigiannidis, T. Lagkas, I. Moscholios, A compilation of UAV applications for precision agriculture, Comput. Netw., 172 (2020), 107148. https://doi.org/10.1016/j.comnet.2020.107148 doi: 10.1016/j.comnet.2020.107148
![]() |
[13] |
I. Luna, A. Lobo, Mapping crop planting quality in sugarcane from UAV imagery: A pilot study in Nicaragua, Remote Sens., 8 (2016), 500. https://doi.org/10.3390/rs8060500 doi: 10.3390/rs8060500
![]() |
[14] |
M. D. Bah, A. Hafiane, R. Canals, Deep learning with unsupervised data labeling for weed detection in line crops in UAV images, Remote Sens., 10 (2018), 1690. https://doi.org/10.3390/rs10111690 doi: 10.3390/rs10111690
![]() |
[15] |
B. Mishra, A. Dahal, N. Luintel, T. B. Shahi, S. Panthi, S. Pariyar, et al., Methods in the spatial deep learning: current status and future direction, Spat. Inform. Res., 30 (2022), 215–232. https://doi.org/10.1007/s41324-021-00425-2 doi: 10.1007/s41324-021-00425-2
![]() |
[16] |
J. Geipel, J. Link, W. Claupein, Combined spectral and spatial modeling of corn yield based on aerial images and crop surface models acquired with an unmanned aircraft system, Remote Sens., 6 (2014), 10335–10355. https://doi.org/10.3390/rs61110335 doi: 10.3390/rs61110335
![]() |
[17] |
X. Zhou, H. Zheng, X. Xu, J. He, X. Ge, X. Yao, et al., Predicting grain yield in rice using multi-temporal vegetation indices from UAV-based multispectral and digital imagery, ISPRS J. Photogramm., 130 (2017), 246–255. https://doi.org/10.1016/j.isprsjprs.2017.05.003 doi: 10.1016/j.isprsjprs.2017.05.003
![]() |
[18] |
N. Yu, L. Li, N. Schmitz, L. F. Tian, J. A. Greenberg, B. W. Diers, Development of methods to improve soybean yield estimation and predict plant maturity with an unmanned aerial vehicle based platform, Remote Sens. Environ., 187 (2016), 91–101. https://doi.org/10.1016/j.rse.2016.10.005 doi: 10.1016/j.rse.2016.10.005
![]() |
[19] |
F. Gnädinger, U. Schmidhalter, Digital counts of maize plants by unmanned aerial vehicles (UAVs), Remote sens., 9 (2017), 544. https://doi.org/10.3390/rs9060544 doi: 10.3390/rs9060544
![]() |
[20] |
S. Nebiker, N. Lack, M. Abächerli, S. Läderach, Light-weight multispectral UAV sensors and their capabilities for predicting grain yield and detecting plant diseases, Int. Archives Photogrammetry, Remote Sens. Spatial Inf. Sci., 41 (2016). https://doi.org/10.5194/isprsarchives-XLI-B1-963-2016 doi: 10.5194/isprsarchives-XLI-B1-963-2016
![]() |
[21] |
M. Maimaitijiang, V. Sagan, P. Sidike, S. Hartling, F. Esposito, F. B. Fritschi, Soybean yield prediction from UAV using multimodal data fusion and deep learning, Remote Sens. Environ., 237 (2020), 111599. https://doi.org/10.1016/j.rse.2019.111599 doi: 10.1016/j.rse.2019.111599
![]() |
[22] |
P. Nevavuori, N. Narra, P. Linna, T. Lipping, Crop yield prediction using multitemporal UAV data and spatio-temporal deep learning models, Remote Sens., 12 (2020), 4000. https://doi.org/10.3390/rs12234000 doi: 10.3390/rs12234000
![]() |
[23] |
J. Abdulridha, Y. Ampatzidis, J. Qureshi, P. Roberts, Laboratory and UAV-based identification and classification of tomato yellow leaf curl, bacterial spot, and target spot diseases in tomato utilizing hyperspectral imaging and machine learning, Remote Sens., 12 (2020), 2732. https://doi.org/10.3390/rs12172732 doi: 10.3390/rs12172732
![]() |
[24] |
M. Bhandari, A. M. Ibrahim, Q. Xue, J. Jung, A. Chang, J. C. Rudd, et al., Assessing winter wheat foliage disease severity using aerial imagery acquired from small Unmanned Aerial Vehicle (UAV), Comput. Electron. Agric., 176 (2020), 105665. https://doi.org/10.1016/j.compag.2020.105665 doi: 10.1016/j.compag.2020.105665
![]() |
[25] |
W. H. Maes, K. Steppe, Perspectives for remote sensing with unmanned aerial vehicles in precision agriculture, Trends Plant Sci., 24 (2019), 152–164. https://doi.org/10.1016/j.tplants.2018.11.007 doi: 10.1016/j.tplants.2018.11.007
![]() |
[26] |
A. Chlingaryan, S. Sukkarieh, B. Whelan, Machine learning approaches for crop yield prediction and nitrogen status estimation in precision agriculture: A review, Comput. Electron. Agric., 151 (2018), 61–69. https://doi.org/10.1016/j.compag.2018.05.012 doi: 10.1016/j.compag.2018.05.012
![]() |
[27] |
D. C. Tsouros, S. Bibi, P. G. Sarigiannidis, A review on UAV-based applications for precision agriculture, Information, 10 (2019), 349. https://doi.org/10.3390/info10110349 doi: 10.3390/info10110349
![]() |
[28] |
P. Velusamy, S. Rajendran, R. K. Mahendran, S. Naseer, M. Shafiq, J.-G. Choi, Unmanned Aerial Vehicles (UAV) in Precision Agriculture: Applications and Challenges, Energies, 15 (2021), 217. https://doi.org/10.3390/en15010217 doi: 10.3390/en15010217
![]() |
[29] |
A. Kamilaris, F. X. Prenafeta-Boldú, Deep learning in agriculture: A survey, Comput. Electron. Agric., 147 (2018), 70–90. https://doi.org/10.1016/j.compag.2018.02.016 doi: 10.1016/j.compag.2018.02.016
![]() |
[30] |
D. J. Mulla, Twenty five years of remote sensing in precision agriculture: Key advances and remaining knowledge gaps, Biosyst. Eng., 114 (2013), 358–371. https://doi.org/10.1016/j.biosystemseng.2012.08.009 doi: 10.1016/j.biosystemseng.2012.08.009
![]() |
[31] | A. Mancini, E. Frontoni, P. Zingaretti. Satellite and uav data for precision agriculture applications; 2019. IEEE. pp. 491–497. https://doi.org/10.1109/ICUAS.2019.8797930 |
[32] |
U. S. Panday, N. Shrestha, S. Maharjan, A. K. Pratihast, K. L. Shrestha, J. Aryal, Correlating the Plant Height of Wheat with Above-Ground Biomass and Crop Yield Using Drone Imagery and Crop Surface Model, A Case Study from Nepal, Drones, 4 (2020), 28. https://doi.org/10.3390/drones4030028 doi: 10.3390/drones4030028
![]() |
[33] |
I. H. Beloev, A review on current and emerging application possibilities for unmanned aerial vehicles, Acta Technol. Agric., 19 (2016), 70–76. https://doi.org/10.1515/ata-2016-0015 doi: 10.1515/ata-2016-0015
![]() |
[34] |
R. Gebbers, V. I. Adamchuk, Precision agriculture and food security, Science, 327 (2010), 828–831. https://doi.org/10.1126/science.1183899 doi: 10.1126/science.1183899
![]() |
[35] | J. L. Awange, J. B. Kyalo Kiema, Fundamentals of remote sensing. Environmental Geoinformatis. Springer, Berlin, Heidelberg, 2013,111–118. https://doi.org/10.1007/978-3-642-34085-7_7 |
[36] |
T. Chen, W. Yang, H. Zhang, B. Zhu, R. Zeng, X. Wang, et al., Early detection of bacterial wilt in peanut plants through leaf-level hyperspectral and unmanned aerial vehicle data, Comput. Electron. Agric., 177 (2020), 105708. https://doi.org/10.1016/j.compag.2020.105708 doi: 10.1016/j.compag.2020.105708
![]() |
[37] | C. Albornoz, L. F. Giraldo, Trajectory design for efficient crop irrigation with a UAV, 2017 IEEE 3rd Colombian conference on Automatic Control (CCAC), IEEE, 2017. pp. 1–6. https://doi.org/10.1109/CCAC.2017.8276401 |
[38] |
V. Gonzalez-Dugo, P. Zarco-Tejada, E. Nicolás, P. A. Nortes, J. Alarcón, D. S. Intrigliolo, et al., Using high resolution UAV thermal imagery to assess the variability in the water status of five fruit tree species within a commercial orchard, Precis. Agric., 14 (2013), 660–678. https://doi.org/10.1007/s11119-013-9322-9 doi: 10.1007/s11119-013-9322-9
![]() |
[39] |
Y. Huang, K. N. Reddy, R. S. Fletcher, D. Pennington, UAV low-altitude remote sensing for precision weed management, Weed Technol., 32 (2018), 2–6. https://doi.org/10.1017/wet.2017.89 doi: 10.1017/wet.2017.89
![]() |
[40] |
C. Ballester, J. Brinkhoff, W. C. Quayle, J. Hornbuckle, Monitoring the effects of water stress in cotton using the green red vegetation index and red edge ratio, Remote Sens., 11 (2019), 873. https://doi.org/10.3390/rs11070873 doi: 10.3390/rs11070873
![]() |
[41] |
L. Zhang, H. Zhang, Y. Niu, W. Han, Mapping maize water stress based on UAV multispectral remote sensing, Remote Sens., 11 (2019), 605. https://doi.org/10.3390/rs11060605 doi: 10.3390/rs11060605
![]() |
[42] |
E. R. Hunt, D. A. Horneck, C. B. Spinelli, R. W. Turner, A. E. Bruce, D. J. Gadler, et al., Monitoring nitrogen status of potatoes using small unmanned aerial vehicles, Precis. Agric., 19 (2018), 314–333. https://doi.org/10.1007/s11119-017-9518-5 doi: 10.1007/s11119-017-9518-5
![]() |
[43] |
J. Kim, S. Kim, C. Ju, H. I. Son, Unmanned aerial vehicles in agriculture: A review of perspective of platform, control, and applications, IEEE Access, 7 (2019), 105100–105115. https://doi.org/10.1109/ACCESS.2019.2932119 doi: 10.1109/ACCESS.2019.2932119
![]() |
[44] |
R. Akhter, S. A. Sofi, Precision agriculture using IoT data analytics and machine learning, J. King Saud University-Comput. Inform. Sci., (2021). https://doi.org/10.1016/j.jksuci.2021.05.013 doi: 10.1016/j.jksuci.2021.05.013
![]() |
[45] |
L. Pádua, J. Vanko, J. Hruška, T. Adão, J. J. Sousa, E. Peres, et al., UAS, sensors, and data processing in agroforestry: A review towards practical applications, Int. J. Remote Sens., 38 (2017), 2349–2391. https://doi.org/10.1080/01431161.2017.1297548 doi: 10.1080/01431161.2017.1297548
![]() |
[46] | C. Paucar, L. Morales, K. Pinto, M. Sánchez, R. Rodríguez, M. Gutierrez, et al., Use of drones for surveillance and reconnaissance of military areas; 2018. Springer. pp. 119–132. https://doi.org/10.1007/978-3-319-78605-6_10 |
[47] |
I. Wahab, O. Hall, M. Jirström, Remote sensing of yields: Application of uav imagery-derived ndvi for estimating maize vigor and yields in complex farming systems in sub-saharan africa, Drones, 2 (2018), 28. https://doi.org/10.3390/drones2030028 doi: 10.3390/drones2030028
![]() |
[48] |
M. Zaman-Allah, O. Vergara, J. Araus, A. Tarekegne, C. Magorokosho, P. Zarco-Tejada, et al., Unmanned aerial platform-based multi-spectral imaging for field phenotyping of maize, Plant Methods, 11 (2015), 1–10. https://doi.org/10.1186/s13007-015-0078-2 doi: 10.1186/s13007-015-0078-2
![]() |
[49] |
A. C. Watts, V. G. Ambrosia, E. A. Hinkley, Unmanned aircraft systems in remote sensing and scientific research: Classification and considerations of use, Remote Sens., 4 (2012), 1671–1692. https://doi.org/10.3390/rs4061671 doi: 10.3390/rs4061671
![]() |
[50] |
S. Guan, K. Fukami, H. Matsunaka, M. Okami, R. Tanaka, H. Nakano, et al., Assessing correlation of high-resolution NDVI with fertilizer application level and yield of rice and wheat crops using small UAVs, Remote Sens., 11 (2019), 112. https://doi.org/10.3390/rs11020112 doi: 10.3390/rs11020112
![]() |
[51] |
X. Zhang, L. Han, Y. Dong, Y. Shi, W. Huang, L. Han, et al., A deep learning-based approach for automated yellow rust disease detection from high-resolution hyperspectral UAV images, Remote Sens., 11 (2019), 1554. https://doi.org/10.3390/rs11131554 doi: 10.3390/rs11131554
![]() |
[52] |
G. Oré, M. S. Alcântara, J. A. Góes, L. P. Oliveira, J. Yepes, B. Teruel, et al., Crop growth monitoring with drone-borne DInSAR, Remote Sens., 12 (2020), 615. https://doi.org/10.3390/rs12040615 doi: 10.3390/rs12040615
![]() |
[53] |
A. Matese, R. Baraldi, A. Berton, C. Cesaraccio, S. F. Di Gennaro, P. Duce, et al., Estimation of water stress in grapevines using proximal and remote sensing methods, Remote Sens., 10 (2018), 114. https://doi.org/10.3390/rs10010114 doi: 10.3390/rs10010114
![]() |
[54] |
A. P. M. Ramos, L. P. Osco, D. E. G. Furuya, W. N. Gonçalves, D. C. Santana, L. P. R. Teodoro, et al., A random forest ranking approach to predict yield in maize with uav-based vegetation spectral indices, Comput. Electron. Agric., 178 (2020), 105791. https://doi.org/10.1016/j.compag.2020.105791 doi: 10.1016/j.compag.2020.105791
![]() |
[55] |
L. Wan, H. Cen, J. Zhu, J. Zhang, Y. Zhu, D. Sun, et al., Grain yield prediction of rice using multi-temporal UAV-based RGB and multispectral images and model transfer–a case study of small farmlands in the South of China, Agric. For. Meteorol., 291 (2020), 108096. https://doi.org/10.1016/j.agrformet.2020.108096 doi: 10.1016/j.agrformet.2020.108096
![]() |
[56] |
A. Matese, S. F. Di Gennaro, Beyond the traditional NDVI index as a key factor to mainstream the use of UAV in precision viticulture, Sci. Rep., 11 (2021), 1–13. https://doi.org/10.1038/s41598-021-81652-3 doi: 10.1038/s41598-021-81652-3
![]() |
[57] |
K. Sumesh, S. Ninsawat, J. Som-ard, Integration of RGB-based vegetation index, crop surface model and object-based image analysis approach for sugarcane yield estimation using unmanned aerial vehicle, Comput. Electron. Agric., 180 (2021), 105903. https://doi.org/10.1016/j.compag.2020.105903 doi: 10.1016/j.compag.2020.105903
![]() |
[58] |
C. Stanton, M. J. Starek, N. Elliott, M. Brewer, M. M. Maeda, T. Chu, Unmanned aircraft system-derived crop height and normalized difference vegetation index metrics for sorghum yield and aphid stress assessment, J. Appl. Remote Sens., 11 (2017), 026035. https://doi.org/10.1117/1.JRS.11.026035 doi: 10.1117/1.JRS.11.026035
![]() |
[59] |
P. L. Raeva, J. Šedina, A. Dlesk, Monitoring of crop fields using multispectral and thermal imagery from UAV, Eur. J. Remote Sens., 52 (2019), 192–201. https://doi.org/10.1080/22797254.2018.1527661 doi: 10.1080/22797254.2018.1527661
![]() |
[60] |
L. G. T. Crusiol, M. R. Nanni, R. H. Furlanetto, R. N. R. Sibaldelli, E. Cezar, L. M. Mertz-Henning, et al., UAV-based thermal imaging in the assessment of water status of soybean plants, Int. J. Remote Sens., 41 (2020), 3243–3265. https://doi.org/10.1080/01431161.2019.1673914 doi: 10.1080/01431161.2019.1673914
![]() |
[61] | I. Pölönen, H. Saari, J. Kaivosoja, E. Honkavaara, L. Pesonen, Hyperspectral imaging based biomass and nitrogen content estimations from light-weight UAV, Remote Sensing for Agriculture, Ecosystems, and Hydrology XV. SPIE, 8887 (2013), 141–149. https://doi.org/10.1117/12.2028624 |
[62] |
C. N. Vong, L. S. Conway, J. Zhou, N. R. Kitchen, K. A. Sudduth, Early corn stand count of different cropping systems using UAV-imagery and deep learning, Comput. Electron. Agric., 186 (2021), 106214. https://doi.org/10.1016/j.compag.2021.106214 doi: 10.1016/j.compag.2021.106214
![]() |
[63] |
U. Lussem, A. Bolten, M. Gnyp, J. Jasper, G. Bareth, Evaluation of RGB-based vegetation indices from UAV imagery to estimate forage yield in grassland, Int. Arch. Photogramm Remote Sens. Spatial Inf. Sci., 42 (2018), 1215–1219. https://doi.org/10.5194/isprs-archives-XLII-3-1215-2018 doi: 10.5194/isprs-archives-XLII-3-1215-2018
![]() |
[64] |
R. V. Rossel, R. McGlynn, A. McBratney, Determining the composition of mineral-organic mixes using UV–vis–NIR diffuse reflectance spectroscopy, Geoderma, 137 (2006), 70–82. https://doi.org/10.1016/j.geoderma.2006.07.004 doi: 10.1016/j.geoderma.2006.07.004
![]() |
[65] |
Y. Guo, H. Wang, Z. Wu, S. Wang, H. Sun, J. Senthilnath, et al., Modified Red Blue Vegetation Index for Chlorophyll Estimation and Yield Prediction of Maize from Visible Images Captured by UAV, Sensors, 20 (2020), 5055. https://doi.org/10.3390/s20185055 doi: 10.3390/s20185055
![]() |
[66] |
H. García-Martínez, H. Flores-Magdaleno, R. Ascencio-Hernández, A. Khalil-Gardezi, L. Tijerina-Chávez, O. R. Mancilla-Villa, et al., Corn grain yield estimation from vegetation indices, canopy cover, plant density, and a neural network using multispectral and RGB images acquired with unmanned aerial vehicles, Agriculture, 10 (2020), 277. https://doi.org/10.3390/agriculture10070277 doi: 10.3390/agriculture10070277
![]() |
[67] |
T. Adão, J. Hruška, L. Pádua, J. Bessa, E. Peres, R. Morais, et al., Hyperspectral imaging: A review on UAV-based sensors, data processing and applications for agriculture and forestry, Remote Sens., 9 (2017), 1110. https://doi.org/10.3390/rs9111110 doi: 10.3390/rs9111110
![]() |
[68] |
R. Calderón, J. A. Navas-Cortés, C. Lucena, P. J. Zarco-Tejada, High-resolution airborne hyperspectral and thermal imagery for early detection of Verticillium wilt of olive using fluorescence, temperature and narrow-band spectral indices, Remote Sens. Environ., 139 (2013), 231–245. https://doi.org/10.1016/j.rse.2013.07.031 doi: 10.1016/j.rse.2013.07.031
![]() |
[69] |
J. Su, C. Liu, M. Coombes, X. Hu, C. Wang, X. Xu, et al., Wheat yellow rust monitoring by learning from multispectral UAV aerial imagery, Comput. Electron. Agric., 155 (2018), 157–166. https://doi.org/10.1016/j.compag.2018.10.017 doi: 10.1016/j.compag.2018.10.017
![]() |
[70] | J. Kurihara, T. Ishida, Y. Takahashi, Unmanned Aerial Vehicle (UAV)-based hyperspectral imaging system for precision agriculture and forest management, In Unmanned Aerial Vehicle: Applications in Agriculture and Environment, Springer. (2020), 25–38. https://doi.org/10.1007/978-3-030-27157-2_3 |
[71] |
J. Bian, Z. Zhang, J. Chen, H. Chen, C. Cui, X. Li, et al., Simplified evaluation of cotton water stress using high resolution unmanned aerial vehicle thermal imagery, Remonte Sens., 11 (2019), 267. https://doi.org/10.3390/rs11030267 doi: 10.3390/rs11030267
![]() |
[72] |
J. Martínez, G. Egea, J. Agüera, M. Pérez-Ruiz, A cost-effective canopy temperature measurement system for precision agriculture: A case study on sugar beet, Precis. Agric., 18 (2017), 95–110. https://doi.org/10.1007/s11119-016-9470-9 doi: 10.1007/s11119-016-9470-9
![]() |
[73] |
L. Zhang, Y. Niu, H. Zhang, W. Han, G. Li, J. Tang, et al., Maize canopy temperature extracted from UAV thermal and RGB imagery and its application in water stress monitoring, Front. Plant Sci., (2019), 1270. https://doi.org/10.3389/fpls.2019.01270 doi: 10.3389/fpls.2019.01270
![]() |
[74] |
S. Idso, R. Jackson, P. Pinter Jr, R. Reginato, J. Hatfield, Normalizing the stress-degree-day parameter for environmental variability, Agric. meteorol., 24 (1981), 45–55. https://doi.org/10.1016/0002-1571(81)90032-7 doi: 10.1016/0002-1571(81)90032-7
![]() |
[75] |
L. Zhou, X. Gu, S. Cheng, G. Yang, M. Shu, Q. Sun, Analysis of plant height changes of lodged maize using UAV-LiDAR data, Agric., 10 (2020), 146. https://doi.org/10.3390/agriculture10050146 doi: 10.3390/agriculture10050146
[76] Y. Jia, Z. Su, Q. Zhang, Y. Zhang, Y. Gu, Z. Chen, Research on UAV remote sensing image mosaic method based on SIFT, Int. J. Signal Process. Image Process. Pattern Recognit., 8 (2015), 365–374. https://doi.org/10.14257/ijsip.2015.8.11.33
[77] Y. Jeong, J. Yu, L. Wang, H. Shin, S.-M. Koh, G. Park, Cost-effective reflectance calibration method for small UAV images, Int. J. Remote Sens., 39 (2018), 7225–7250. https://doi.org/10.1080/01431161.2018.1516307
[78] Y. Ji, Z. Chen, Q. Cheng, R. Liu, M. Li, X. Yan, et al., Estimation of plant height and yield based on UAV imagery in faba bean (Vicia faba L.), Plant Methods, 18 (2022), 1–13. https://doi.org/10.1186/s13007-022-00861-7
[79] M. Awais, W. Li, M. Cheema, S. Hussain, A. Shaheen, B. Aslam, et al., Assessment of optimal flying height and timing using high-resolution unmanned aerial vehicle images in precision agriculture, Int. J. Environ. Sci. Technol., 19 (2022), 2703–2720. https://doi.org/10.1007/s13762-021-03195-4
[80] J. Gilliot, J. Michelin, D. Hadjard, S. Houot, An accurate method for predicting spatial variability of maize yield from UAV-based plant height estimation: A tool for monitoring agronomic field experiments, Precis. Agric., 22 (2021), 897–921. https://doi.org/10.1007/s11119-020-09764-w
[81] U. R. Mogili, B. Deepak, Review on application of drone systems in precision agriculture, Procedia Comput. Sci., 133 (2018), 502–509. https://doi.org/10.1016/j.procs.2018.07.063
[82] C. Zerbato, D. L. Rosalen, C. E. A. Furlani, J. Deghaid, M. A. Voltarelli, Agronomic characteristics associated with the normalized difference vegetation index (NDVI) in the peanut crop, Aust. J. Crop. Sci., 10 (2016), 758–764. https://doi.org/10.21475/ajcs.2016.10.05.p7167
[83] A. Ashapure, S. Oh, T. G. Marconi, A. Chang, J. Jung, J. Landivar, et al., Unmanned aerial system based tomato yield estimation using machine learning, in Proc. SPIE, International Society for Optics and Photonics, 2019, 110080O. https://doi.org/10.1117/12.2519129
[84] A. Michez, P. Lejeune, S. Bauwens, A. A. L. Herinaina, Y. Blaise, E. Castro Muñoz, et al., Mapping and monitoring of biomass and grazing in pasture with an unmanned aerial system, Remote Sens., 11 (2019), 473. https://doi.org/10.3390/rs11050473
[85] R. M. Haralick, K. Shanmugam, I. H. Dinstein, Textural features for image classification, IEEE Trans. Syst. Man Cybern., (1973), 610–621. https://doi.org/10.1109/TSMC.1973.4309314
[86] Y. Guo, Y. H. Fu, S. Chen, C. R. Bryant, X. Li, J. Senthilnath, et al., Integrating spectral and textural information for identifying the tasseling date of summer maize using UAV based RGB images, Int. J. Appl. Earth Obs. Geoinf., 102 (2021), 102435. https://doi.org/10.1016/j.jag.2021.102435
[87] T. B. Shahi, A. Shrestha, A. Neupane, W. Guo, Stock price forecasting with deep learning: A comparative study, Mathematics, 8 (2020), 1441. https://doi.org/10.3390/math8091441
[88] C. Sitaula, A. Basnet, A. Mainali, T. B. Shahi, Deep learning-based methods for sentiment analysis on Nepali COVID-19-related tweets, Comput. Intell. Neurosci., 2021 (2021). https://doi.org/10.1155/2021/2158184
[89] T. B. Shahi, C. Sitaula, A. Neupane, W. Guo, Fruit classification using attention-based MobileNetV2 for industrial applications, PLoS One, 17 (2022), e0264586. https://doi.org/10.1371/journal.pone.0264586
[90] S. Subba, N. Paudel, T. B. Shahi, Nepali text document classification using deep neural network, Tribhuvan University J., 33 (2019), 11–22. https://doi.org/10.3126/tuj.v33i1.28677
[91] B. Whelan, J. Taylor, Precision agriculture for grain production systems, CSIRO Publishing, 2013. https://doi.org/10.1071/9780643107489
[92] B. Neupane, T. Horanont, N. D. Hung, Deep learning based banana plant detection and counting using high-resolution red-green-blue (RGB) images collected from unmanned aerial vehicle (UAV), PLoS One, 14 (2019), e0223906. https://doi.org/10.1371/journal.pone.0223906
[93] K. Osorio, A. Puerto, C. Pedraza, D. Jamaica, L. Rodríguez, A deep learning approach for weed detection in lettuce crops using multispectral images, AgriEngineering, 2 (2020), 471–488. https://doi.org/10.3390/agriengineering2030032
[94] T. Kattenborn, J. Eichel, F. E. Fassnacht, Convolutional neural networks enable efficient, accurate and fine-grained segmentation of plant species and communities from high-resolution UAV imagery, Sci. Rep., 9 (2019), 1–9. https://doi.org/10.1038/s41598-018-37186-2
[95] S. Shafiee, L. M. Lied, I. Burud, J. A. Dieseth, M. Alsheikh, M. Lillemo, Sequential forward selection and support vector regression in comparison to LASSO regression for spring wheat yield prediction based on UAV imagery, Comput. Electron. Agric., 183 (2021), 106036. https://doi.org/10.1016/j.compag.2021.106036
[96] W. Xu, P. Chen, Y. Zhan, S. Chen, L. Zhang, Y. Lan, Cotton yield estimation model based on machine learning using time series UAV remote sensing data, Int. J. Appl. Earth Obs. Geoinf., 104 (2021), 102511. https://doi.org/10.1016/j.jag.2021.102511
[97] J. Zhou, J. Zhou, H. Ye, M. L. Ali, P. Chen, H. T. Nguyen, Yield estimation of soybean breeding lines under drought stress using unmanned aerial vehicle-based imagery and convolutional neural network, Biosyst. Eng., 204 (2021), 90–103. https://doi.org/10.1016/j.biosystemseng.2021.01.017
[98] Q. Yang, L. Shi, J. Han, Y. Zha, P. Zhu, Deep convolutional neural networks for rice grain yield estimation at the ripening stage using UAV-based remotely sensed images, Field Crops Res., 235 (2019), 142–153. https://doi.org/10.1016/j.fcr.2019.02.022
[99] H. Escalante, S. Rodríguez-Sánchez, M. Jiménez-Lizárraga, A. Morales-Reyes, J. De La Calleja, R. Vazquez, Barley yield and fertilization analysis from UAV imagery: a deep learning approach, Int. J. Remote Sens., 40 (2019), 2493–2516. https://doi.org/10.1080/01431161.2019.1577571
[100] P. Nevavuori, N. Narra, T. Lipping, Crop yield prediction with deep convolutional neural networks, Comput. Electron. Agric., 163 (2019), 104859. https://doi.org/10.1016/j.compag.2019.104859
[101] N. Suzuki, R. M. Rivero, V. Shulaev, E. Blumwald, R. Mittler, Abiotic and biotic stress combinations, New Phytol., 203 (2014), 32–43. https://doi.org/10.1111/nph.12797
[102] K. James, C. J. Nichol, T. Wade, D. Cowley, S. Gibson Poole, A. Gray, et al., Thermal and multispectral remote sensing for the detection and analysis of archaeologically induced crop stress at a UK site, Drones, 4 (2020), 61. https://doi.org/10.3390/drones4040061
[103] S. Delalieux, P. J. Zarco-Tejada, L. Tits, M. Á. J. Bello, D. S. Intrigliolo, B. Somers, Unmixing-based fusion of hyperspatial and hyperspectral airborne imagery for early detection of vegetation stress, IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens., 7 (2014), 2571–2582.
[104] J. Bellvert, P. J. Zarco-Tejada, J. Girona, E. Fereres, Mapping crop water stress index in a 'Pinot-noir' vineyard: comparing ground measurements with thermal remote sensing imagery from an unmanned aerial vehicle, Precis. Agric., 15 (2014), 361–376. https://doi.org/10.1007/s11119-013-9334-5
[105] R. Sugiura, S. Tsuda, S. Tamiya, A. Itoh, K. Nishiwaki, N. Murakami, et al., Field phenotyping system for the assessment of potato late blight resistance using RGB imagery from an unmanned aerial vehicle, Biosyst. Eng., 148 (2016), 1–10. https://doi.org/10.1016/j.biosystemseng.2016.04.010
[106] A. Patrick, S. Pelham, A. Culbreath, C. C. Holbrook, I. J. De Godoy, C. Li, High throughput phenotyping of tomato spot wilt disease in peanuts using unmanned aerial systems and multispectral imaging, IEEE Instrum. Meas. Mag., 20 (2017), 4–12. https://doi.org/10.1109/MIM.2017.7951684
[107] M. Balota, J. Oakes, UAV remote sensing for phenotyping drought tolerance in peanuts, in Proc. SPIE, 2017, 81–87. https://doi.org/10.1117/12.2262496
[108] D. Gómez-Candón, J. Torres-Sanchez, S. Labbé, A. Jolivot, S. Martinez, J. L. Regnard, Water stress assessment at tree scale: high-resolution thermal UAV imagery acquisition and processing, Acta Hortic., (2017), 159–166. https://doi.org/10.17660/ActaHortic.2017.1150.23
[109] L. N. Lacerda, J. L. Snider, Y. Cohen, V. Liakos, S. Gobbo, G. Vellidis, Using UAV-based thermal imagery to detect crop water status variability in cotton, Smart Agric. Technol., 2 (2022), 100029. https://doi.org/10.1016/j.atech.2021.100029
[110] H. Ma, W. Huang, Y. Dong, L. Liu, A. Guo, Using UAV-based hyperspectral imagery to detect winter wheat Fusarium head blight, Remote Sens., 13 (2021), 3024. https://doi.org/10.3390/rs13153024
[111] D. Bohnenkamp, J. Behmann, A.-K. Mahlein, In-field detection of yellow rust in wheat on the ground canopy and UAV scale, Remote Sens., 11 (2019), 2495. https://doi.org/10.3390/rs11212495
[112] H. Wu, T. Wiesner-Hanks, E. L. Stewart, C. DeChant, N. Kaczmar, M. A. Gore, et al., Autonomous detection of plant disease symptoms directly from aerial imagery, Plant Phenome J., 2 (2019), 1–9. https://doi.org/10.2135/tppj2019.03.0006
[113] D. Freeman, S. Gupta, D. H. Smith, J. M. Maja, J. Robbins, J. S. Owen, et al., Watson on the farm: Using cloud-based artificial intelligence to identify early indicators of water stress, Remote Sens., 11 (2019), 2645. https://doi.org/10.3390/rs11222645
[114] M.-D. Yang, H.-H. Tseng, Y.-C. Hsu, H. P. Tsai, Semantic segmentation using deep learning with vegetation indices for rice lodging identification in multi-date UAV visible images, Remote Sens., 12 (2020), 633. https://doi.org/10.3390/rs12040633
[115] Z. Song, Z. Zhang, S. Yang, D. Ding, J. Ning, Identifying sunflower lodging based on image fusion and deep semantic segmentation with UAV remote sensing imaging, Comput. Electron. Agric., 179 (2020), 105812. https://doi.org/10.1016/j.compag.2020.105812
[116] L. E. C. La Rosa, M. Zortea, B. Gemignani, D. A. B. Oliveira, R. Q. Feitosa, FCRN-based multi-task learning for automatic citrus tree detection from UAV images, in 2020 IEEE Latin American GRSS & ISPRS Remote Sensing Conference (LAGIRS), 2020, 403–408. https://doi.org/10.1109/LAGIRS48042.2020.9165654
[117] M. Fawakherji, C. Potena, D. D. Bloisi, M. Imperoli, A. Pretto, D. Nardi, UAV image based crop and weed distribution estimation on embedded GPU boards, in Int. Conf. on Computer Analysis of Images and Patterns (CAIP), Springer, Cham, 2019, 100–108. https://doi.org/10.1007/978-3-030-29930-9_10
[118] G.-H. Kwak, N.-W. Park, Impact of texture information on crop classification with machine learning and UAV images, Appl. Sci., 9 (2019), 643. https://doi.org/10.3390/app9040643
[119] F. Trujillano, A. Flores, C. Saito, M. Balcazar, D. Racoceanu, Corn classification using deep learning with UAV imagery: an operational proof of concept, in 2018 IEEE 1st Colombian Conference on Applications in Computational Intelligence (ColCACI), IEEE, 2018, 1–4. https://doi.org/10.1109/ColCACI.2018.8484845
[120] B. T. Kitano, C. C. Mendes, A. R. Geus, H. C. Oliveira, J. R. Souza, Corn plant counting using deep learning and UAV images, IEEE Geosci. Remote Sens. Lett., (2019). https://doi.org/10.1109/LGRS.2019.2930549
[121] R. Chew, J. Rineer, R. Beach, M. O'Neil, N. Ujeneza, D. Lapidus, et al., Deep neural networks and transfer learning for food crop identification in UAV images, Drones, 4 (2020), 7. https://doi.org/10.3390/drones4010007
[122] M. Aria, C. Cuccurullo, bibliometrix: An R-tool for comprehensive science mapping analysis, J. Informetr., 11 (2017), 959–975. https://doi.org/10.1016/j.joi.2017.08.007