Review
Deep Learning-Based Weed–Crop Recognition for Smart
Agricultural Equipment: A Review
Hao-Ran Qu and Wen-Hao Su *
Abstract: Weeds and crops engage in a relentless battle for the same resources, leading to potential re-
ductions in crop yields and increased agricultural costs. Traditional methods of weed control, such as
heavy herbicide use, come with the drawback of promoting weed resistance and environmental pollu-
tion. As the demand for pollution-free and organic agricultural products rises, there is a pressing need
for innovative solutions. The emergence of smart agricultural equipment, including intelligent robots,
unmanned aerial vehicles and satellite technology, proves to be pivotal in addressing weed-related
challenges. The effectiveness of smart agricultural equipment, however, hinges on accurate detection,
a task influenced by various factors, like growth stages, environmental conditions and shading. To
achieve precise crop identification, it is essential to employ suitable sensors and optimized algorithms.
Deep learning plays a crucial role in enhancing weed recognition accuracy. This advancement enables
targeted actions such as minimal pesticide spraying or precise laser excision of weeds, effectively
reducing the overall cost of agricultural production. This paper provides a thorough overview of
the application of deep learning for crop and weed recognition in smart agricultural equipment.
Starting with an overview of intelligent agricultural tools, sensors and identification algorithms, the
discussion delves into instructive examples, showcasing the technology’s prowess in distinguishing
between weeds and crops. The narrative highlights recent breakthroughs in automated technologies
for precision plant identification while acknowledging existing challenges and proposing prospects.
By marrying cutting-edge technology with sustainable agricultural practices, the adoption of intelli-
gent equipment presents a promising path toward efficient and eco-friendly weed management in
modern agriculture.
Keywords: deep learning; smart agricultural equipment; weeds and crops; recognition

1. Introduction
Weeds are a major threat in agriculture, as they occur in all parts of the field and compete with crop plants for resources, resulting in reduced crop yields.
Yield losses depend on factors such as weed species, population density and relative time of emergence and distribution, as well as on the soil type, soil moisture levels, pH and fertility [1,2]. For decades, researchers and farmers have struggled to control weeds to overcome the thorny challenges they pose. Weeds in the field compete with crops for water, nutrients and sunlight. If not controlled properly, weeds can adversely affect crop yield and quality. In addition, research has shown that there is a significant link between reduced crop yields and weed competition [2]. For example, the annual cost of weeds in Australia within grain production systems is USD 3.3 billion, comprising USD 2.6 billion in costs for weed control and USD 0.7 billion in lost yield [3].

In today's agricultural sector, accurately identifying crops and weeds is crucial for improving agricultural productivity, reducing production costs and achieving sustainable agricultural development. The fast development of deep learning techniques for wide application in computer vision provides new opportunities for crop and weed recognition. The high automation and learning capabilities of deep learning models enable them to
learn from large datasets and gradually improve their performance, bringing unprece-
dented breakthroughs to precision agriculture. Currently, the main methods of weed control in agricultural fields include hand weeding, mechanical weeding, laser weeding and chemical weeding. Chemical weeding offers the advantage of low cost and is unaffected by terrain factors, so it is widely used all over the world [4]. However, the heavy use of herbicides increases weed resistance and raises the cost of agricultural inputs. Reducing the use of herbicides is also a critical step towards sustainable agriculture. Site-specific weed control can save up to 90% of herbicide expenditures; given that annual sales of pesticides worldwide amount to about USD 100 billion, realizing it would significantly reduce agricultural expenditure [5]. Spraying pesticides over large
areas can also pollute the environment. For example, indiscriminate broadcast spraying
throughout tobacco fields, especially during the early growth phase, can lead to unneces-
sarily spraying bare soil off target between any two contiguous tobacco plants, causing
environmental pollution and pesticide seepage into the ground [6,7]. Pesticide use also
has an impact on human health. The WHO has estimated that 1 million adverse reactions
have been reported when hand-sprayed insecticides are used in crop fields [8]. Owing to a massive increase in over-reliance on herbicides and in herbicide-resistant weeds, the EU's agricultural system has become more fragile and unsustainable. To better control herbicide use, the EU Green Deal has set a goal of cutting the use and risks of chemical pesticides by 50 percent by 2030 [9]. The European Food Safety Authority (EFSA) has announced that 98.9% of food products contain agrochemical residues (of which 1.5% exceed legal limits). In addition, plant resistance to agrochemicals (e.g., herbicides) is becoming a huge threat to crop yields in many countries [10].
Manual weeding not only involves a heavy workload but also cannot easily detect weeds in a timely manner. The only solution to this problem is to increase manpower, but that inevitably raises agricultural costs. Mechanical weed control is especially suitable for organic farmland and can also be useful in traditional farmland. On the other hand, the use of machines may also have the downside of damaging crops and eroding the environment [11]. Currently, weed removal within crop rows still relies on manual removal in many cases, but manual weeding is inefficient. With the development of deep learning algorithms, weed management has achieved successful results. Agricultural robotics research has increased over the past few years due to the potential applications of robots and industry efforts in robot development. The role of robots in many agricultural tasks has been studied, focusing mainly on improving the automation of traditional agricultural machinery and weeding processes [12,13]. An intelligent weeding robot can accurately recognize weeds and deal with them precisely, which greatly reduces herbicide use, avoids environmental pollution and lowers agricultural costs. In smart agriculture, using sensors installed on satellites, unmanned aerial vehicles or ground tractors to distinguish between weeds and crops is becoming an effective method of weed management. Remote sensing technology allows for quickly charting the distribution of weeds and crops over large areas [14]. An SVM-based crop/weed detection system for tractor boom sprayers to spot-spray tobacco crops in the field was constructed, with a classification accuracy of 96% [6]. In the last decade or so, Earth observation satellites have provided higher-resolution free remote sensing data, making the monitoring of agriculture by high-resolution satellites possible. Google Street View images were tested using a convolutional neural network (CNN), with an overall accuracy of 83.3% [15]. Laser weed control also offers a new possibility for weed removal: a YOLOX convolutional neural network-based weeding robot utilizes a blue laser to weed, with a weed recognition rate of 88.94% [16]. Drones are considered to be more efficient than robotic or satellite acquisition because they can rapidly collect field data at very high spatial resolution and at low cost [17–19]. In one typical application, drones were used to capture RGB images, and SVM, KNN, AdaBoost and CNN classifiers were evaluated on a test set, achieving accuracies of 89.75%, 85.58%, 90.25% and 92.41%, respectively, for recognizing rice weeds [20].
This paper reviews the current state of research on applying deep learning to crop and weed recognition for smart agricultural equipment. There are many previous review articles related to this topic. For example, Imran Zualkernan et al. [21] focused on new deep learning models and architectures for research using drone image data since 2018. Jiayou Shi et al. [22] presented a thorough review of the methods and applications related to crop row inspection in agricultural machinery navigation. They paid special attention to the sensors and systems used for crop row detection in order to validate and, thus, improve their sensing and detection capabilities. Ana I. de Castro et al. [23] reviewed the sensor types, configurations and image processing algorithms of UAVs for agriculture and forestry applications. Wen-Hao Su [14] discussed RGB, hyperspectral and point spectroscopy sensors for crop and weed identification. However, these reviews did not provide a comprehensive introduction to intelligent agricultural equipment. We briefly describe the need for intelligent weed management and then present aspects of weed control. Section 2 focuses on the image recognition steps for smart devices, including image collection, image preprocessing and feature extraction. Section 3 describes the application of deep learning models for recognizing weeds in smart agricultural equipment. These mainly utilize convolutional neural networks (CNNs) and their variants, such as Faster R-CNN [24], MTS-CNN [25], FHGSO-based deep CNN [26] and DRCNN [27]. In addition, support vector machines (SVMs) are heavily used, mostly in agricultural equipment such as tractors and drones [4,6,24]. Most notably, the Transformer neural network and its variants, for example, ViT [28], Swin-DeepLab+ [29] and Deformable DETR [30], are also used. The ViT model is relatively new and outperforms some advanced models, such as EfficientNet and ResNet, so it has great potential [28]. With the rapid transformation of agricultural landscapes, driven by technological innovations, this review aims to synthesize the current state of the art in the application of deep learning-based smart agricultural equipment for weed and crop differentiation. By elucidating state-of-the-art technologies, identifying research gaps and suggesting potential directions for future research, this study aims to contribute to the development of intelligent and autonomous systems that empower farmers with the tools to address weed management challenges, leading to sustainable and efficient agricultural management.

2. Weed Detection Using Remote Sensing Technique

The workflow of image recognition of crops and weeds can generally be divided into four steps: image data acquisition, preprocessing, feature extraction and classification of weeds and crops [31]. The specific details are shown in Figure 1.

Figure 1. General workflow of image processing-based weed detection.
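To make this four-step workflow concrete, the following is a minimal Python sketch of such a pipeline. The helper names and the OpenCV/scikit-learn choices are illustrative assumptions rather than an implementation taken from the reviewed studies.

```python
# Minimal sketch of the four-step weed/crop recognition workflow in Figure 1.
# Library choices (OpenCV, NumPy, scikit-learn) are illustrative assumptions.
import cv2
import numpy as np
from sklearn.svm import SVC

def acquire_image(path: str) -> np.ndarray:
    """Step 1: image data acquisition (here, simply loading from disk)."""
    return cv2.imread(path)  # HxWx3 uint8 BGR array

def preprocess(img: np.ndarray) -> np.ndarray:
    """Step 2: preprocessing - resize, denoise, keep green vegetation."""
    img = cv2.resize(img, (224, 224))
    img = cv2.GaussianBlur(img, (3, 3), 0)
    b, g, r = cv2.split(img.astype(np.float32))
    exg = 2 * g - r - b                          # excess-green index
    mask = (exg > exg.mean()).astype(np.uint8)   # crude vegetation mask
    return cv2.bitwise_and(img, img, mask=mask)

def extract_features(img: np.ndarray) -> np.ndarray:
    """Step 3: feature extraction (here, a coarse color histogram)."""
    hist = cv2.calcHist([img], [0, 1, 2], None, (8, 8, 8), [0, 256] * 3)
    return cv2.normalize(hist, hist).flatten()

# Step 4: classification of weeds vs. crops with any classifier, e.g., an SVM
# trained on labeled field images:
# clf = SVC(kernel="rbf").fit(X_train, y_train)
# label = clf.predict([extract_features(preprocess(acquire_image("plot.png")))])
```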
method is necessary [42]. Multispectral imaging, with the advantage of light hardware and
faster calculation speed, is emerging as the successor to hyperspectral technology.
Thermal infrared sensors help to capture the temperature of objects, generate images and display them based on the information collected. Infrared sensors and optical lenses are used in thermal cameras to capture thermal energy [43]. The development
of higher-resolution thermal imaging systems compatible with unmanned aerial vehicles
(UAVs) has facilitated the practical application of thermal imaging in agriculture. The
use of thermal measurement, in conjunction with other sensor measurements, such as
hyperspectral, visible and optical distance, has proven to be more effective in field-scale
crop phenotyping [44]. When combined with deep learning, remote thermal sensing technology is able to recognize crops and weeds and to assess crop stress [45].
LiDAR, which stands for Light Detection and Ranging, is a highly advanced and
dependable sensor that has been widely used in the fields of crop row detection and robotic
navigation. This sensor is famous for its high precision, wide range and strong immunity
to interference [46]. LiDAR works on the principle that the transmitting system emits
visible or near-infrared light waves. These light waves are then reflected off the target
and detected by the receiving system. The data obtained are subsequently processed to
generate parametric information, including distance. LiDAR sensors have been utilized in
crop row detection to provide highly accurate and detailed 3D maps of crop canopies [47].
Additionally, LiDAR sensors have the capability to penetrate vegetation and capture ground
surface data, facilitating the detection of crop rows, even in densely vegetated fields [22].
LiDAR can be used in intensive agricultural scenarios.
2.2. Preprocessing
After acquiring data from various sources, it is essential to prepare the data for the
training, testing and validation of models. Raw data may not always be suitable for
deep learning (DL) models. Approaches for dataset preparation include the application
of various image processing techniques, data labeling, utilization of image enhancement
methods to augment the input data and introduce variations, as well as the generation of
synthetic data for training. The commonly used image processing techniques are removal
of background, resizing of captured images, green component segmentation, removal
of motion blur, denoising, image enhancement, extraction of color vegetation indices
and alteration in color models [58]. Table 2 demonstrates the effect of different image
enhancement techniques on segmentation.
Table 2. Effect of different image enhancement techniques on segmentation.

| Crop | Methods | Enhancement | Input Representation | MIoU | Reference |
|---|---|---|---|---|---|
| Sugar beet | An encoder-decoder deep learning network | HE / PS-AC / DPE | RGB | 92.75% / 94.29% / 93.50% | Aichen Wang et al. [59] |
| Oilseed | An encoder-decoder deep learning network | HE / PS-AC / DPE | RGB | 94.80% / 95.80% / 96.12% | Aichen Wang et al. [59] |
| Soybean | DeepLabv3+ / Swin+DeepLabv3+ / Swin-DeepLab | Random rotation, random flipping, random cropping, adding Gaussian noise and increasing contrast | RGB | 88.59% / 91.10% / 91.53% | Yu, H., et al. [29] |
| Sunflower | Algorithm proposed by Lopez, L.O., et al. [41] / U-Net / FPN | Perspective deformity correction program | Multispectral | 89% / 90% / 89% | Lopez, L.O., et al. [41] |

HE—Histogram Equalization; PS-AC—PS Auto Contrast; DPE—Deep Photo Enhancer; RGB—red, green, blue; Swin-DeepLab—Hierarchical Vision Transformer for Semantic Segmentation.
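As an illustration of how such enhancement and augmentation steps might be applied before training, the following sketch implements histogram equalization together with the random rotation, flipping, Gaussian-noise and contrast operations listed above. It is an assumed OpenCV/NumPy reconstruction, not the code used in the cited works.

```python
# Sketch of common enhancement/augmentation steps before training (histogram
# equalization, random rotation/flip, Gaussian noise, contrast increase).
import cv2
import numpy as np

def histogram_equalization(img: np.ndarray) -> np.ndarray:
    """Equalize the luminance channel so lighting varies less across images."""
    ycrcb = cv2.cvtColor(img, cv2.COLOR_BGR2YCrCb)
    ycrcb[:, :, 0] = cv2.equalizeHist(ycrcb[:, :, 0])
    return cv2.cvtColor(ycrcb, cv2.COLOR_YCrCb2BGR)

def augment(img: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Random rotation, flipping, Gaussian noise and contrast increase."""
    angle = rng.uniform(-30, 30)
    h, w = img.shape[:2]
    m = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)
    img = cv2.warpAffine(img, m, (w, h))
    if rng.random() < 0.5:
        img = cv2.flip(img, 1)                       # horizontal flip
    noise = rng.normal(0, 8, img.shape)              # additive Gaussian noise
    img = np.clip(img.astype(np.float32) + noise, 0, 255).astype(np.uint8)
    return cv2.convertScaleAbs(img, alpha=1.2, beta=0)  # increase contrast

rng = np.random.default_rng(0)
# enhanced = augment(histogram_equalization(cv2.imread("field.png")), rng)
```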
GLCM is a way to define the texture of images using the information of intensity values that co-occur spatially. One technique used texture features derived from a gray-level co-occurrence matrix (GLCM); the next step was the extraction of four texture features from the GLCM. These features include contrast, correlation, energy and homogeneity, and they achieved 73% accuracy using the Radial Basis Function (RBF) kernel in a support vector machine (SVM) [66]. The Gabor wavelet transform enables the analysis of image scenes in both the spatial and frequency domains. It is important to note that the wavelet transform of an image is a well-established multi-resolution filtering technique for extracting texture features. Each derived (preprocessed) image was filtered with a bank of Gabor wavelet filters computed with designated lower (Ul) and higher (Uh) frequencies selected to be 0.1 and 0.5, respectively. Four levels of orientation and ten levels of scale were chosen [67]. Yajun Chen et al. [4] identified six texture features, comprising histogram of oriented gradient features, the rotation-invariant local binary pattern (LBP) feature, the Hu invariant moment feature, the Gabor feature, the gray-level co-occurrence matrix and the gray-level-gradient co-occurrence matrix. These six feature descriptors were combined to create a set of 18 feature combinations. For the problem of image size normalization, they proposed a strategy that kept the shape of the leaves unchanged and padded the blank area of the normalized image with 0-valued pixels. Lei Zhang et al. [68] proposed a weed recognition method based on support vector machines using any combination of three sets of texture features: histogram of oriented gradient features, rotation-invariant local binary pattern (LBP) features and the gray-level co-occurrence matrix (GLCM). The application of six different texture features for weed identification is enumerated in Table 2. For hybrid feature extraction, the accuracy obtained using machine learning is greater than that of single feature extraction, and the accuracy of deep learning is greater than that of machine learning. For the study of deep learning in crop and weed recognition, hybrid texture features can be utilized.
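To make the GLCM pipeline concrete, the sketch below extracts the four texture descriptors named above with scikit-image and trains an RBF-kernel SVM on them, mirroring the approach of [66]. It is a minimal assumed reconstruction, not the cited authors' code.

```python
# Sketch: GLCM texture features (contrast, correlation, energy, homogeneity)
# classified with an RBF-kernel SVM, mirroring the approach described above.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC

def glcm_features(gray: np.ndarray) -> np.ndarray:
    """Compute the four GLCM texture descriptors for one grayscale image."""
    glcm = graycomatrix(gray, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "correlation", "energy", "homogeneity"]
    # Average each property over the four angles -> 4-dimensional vector.
    return np.array([graycoprops(glcm, p).mean() for p in props])

# images: list of uint8 grayscale crops; labels: 0 = crop, 1 = weed.
# X = np.stack([glcm_features(im) for im in images])
# clf = SVC(kernel="rbf", gamma="scale").fit(X, labels)
```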
Figure 3. From left to right: line detection in bean (a) and spinach (b) fields. Detected lines are in blue. In the spinach field, inter-row distance and the crop row orientation are not regular. The detected lines are mainly located in the center of the crop rows [17].
The Hough transform is a widely employed method for identifying linear features in an image. It works by representing a straight line as a spike in parameter space, where the parameters correspond to the characteristics of the line. In addition, the Hough transform can be utilized for detecting or analyzing arbitrary (non-parametric) curves by examining the shape of peaks or their locations in the parameter space [74]. Teplyakov et al. [75] proposed a lightweight Artificial Neural Network for line detection, with several convolutional layers and a fast Hough transform layer that can be trained in an end-to-end manner. They proposed the use of the fast Hough transform (FHT) with O(N² log N) complexity; the FHT approximated the lines with dyadic patterns and utilized an efficient solution for summation. In complex backgrounds, the YOLOv5s-based model was more accurate and faster than detection with Hough-transform variants. To solve the problems of large memory overhead, long time consumption and low recognition accuracy of the offset Hough transform, Islam, N. et al. [76] proposed an efficient circle localization algorithm based on multi-resolution segmentation (a two-step optimized Hough transform). First, the target circle was obtained by adaptive image preprocessing to determine the location of the effective search area. Then, high-quality images were separated by shape quality inspection to be used as accurate data sources. Finally, the location accuracy was improved to the sub-pixel level using least squares circle fitting. The effects of burrs, misalignments, defects and contamination were also reduced. The extraction of spatial features can also be used as an auxiliary recognition criterion when the UAV is flying overhead.
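A minimal sketch of Hough-based crop row detection in the spirit of the methods above is given below; the OpenCV functions are standard, but all thresholds and parameters are illustrative assumptions.

```python
# Sketch: detecting crop row lines with the probabilistic Hough transform.
# Thresholds and parameters below are illustrative, not from the cited works.
import cv2
import numpy as np

def detect_crop_rows(img_bgr: np.ndarray) -> np.ndarray:
    """Return detected line segments (x1, y1, x2, y2) for candidate crop rows."""
    # Segment vegetation with the excess-green index, then take edges.
    b, g, r = cv2.split(img_bgr.astype(np.float32))
    exg = cv2.normalize(2 * g - r - b, None, 0, 255,
                        cv2.NORM_MINMAX).astype(np.uint8)
    _, mask = cv2.threshold(exg, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    edges = cv2.Canny(mask, 50, 150)
    # Accumulate votes in (rho, theta) space; peaks correspond to rows.
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                            minLineLength=100, maxLineGap=20)
    return lines if lines is not None else np.empty((0, 1, 4), dtype=np.int32)

# rows = detect_crop_rows(cv2.imread("field.png"))
```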
2.3.3. Spectral Feature

Spectroscopy is used to acquire spectral information over a wide spectral range, in which specific vibration frequencies can be perceived that match the transition energy of a bond or group. Spectroscopy is also categorized in many ways; the common ones are point spectroscopy, RGB and hyperspectral imaging, fluorescence spectroscopy and multispectral imaging. The theoretical basis for using spectral detection is that weed rivalry leads to changes in plant physiology that alter light-absorbing and crown reflectance properties [37]. Figure 4 shows the regions of interest for corn seedlings and weeds on hyperspectral images and the corresponding average spectral curves. Table 4 shows a sample image of hawkweed flowers based on spatial feature recognition.

Figure 4. Sample images of hawkweed flowers based on spectral feature recognition. (a) Actual multispectral image; (b) Prediction result; (c) Prediction results are overlayed with actual image (EPSG:4326—WGS 84) [40].

Islam et al. [77] employed RGB images captured by RGB cameras mounted on a drone. They extracted the reflectance of the red, green and blue bands and subsequently calculated vegetation indices, including the normalized red band, normalized green band and normalized blue band. The purpose of this normalization was to reduce the effects of different lighting conditions on the color channels. Moreover, in addition to RGB data, Fawakherji et al. [78] took into account near-infrared (NIR) information, generating four-channel multispectral synthetic images. They extracted the plant cover from the entire image cover. The plant cover was a binary image in which the plant pixels to be learned were set to 1 and the other pixels were set to 0. The plant cover was then mapped to a realistic multispectral image, and the resulting image was used for data enhancement. The use of an NIR channel helps to enhance the accuracy of activities for which vegetation inspection is required. Photosynthesis in healthy green plants leads to the absorption of more solar energy in the visible spectrum, resulting in a low reflectance level in the RGB channels. Similarly, the reflectance of the NIR spectrum is affected by the same phenomena with opposite results, with a high reflectance level in the NIR channel, where generally 10% or less of radiation is absorbed [78,79]. Jinya Su et al. [38] found that the triangular greenness index (TGI), composed of green and NIR bands, was the most discriminative SI. Its recognition accuracy was 93.0%. Utilizing thermal measurements in conjunction with other sensor data, such as hyperspectral, visible and optical distance, has demonstrated increased effectiveness in field-scale crop phenotyping [80–82].
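The band normalization described above can be written compactly; the following assumed NumPy sketch computes the normalized red, green and blue bands and the widely used excess-green index.

```python
# Sketch: normalized r/g/b chromatic coordinates and the excess-green index,
# as used to reduce illumination effects before vegetation segmentation.
import numpy as np

def normalized_bands(rgb: np.ndarray):
    """rgb: HxWx3 float array with R, G, B reflectance or intensity values."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    total = r + g + b + 1e-8          # avoid division by zero on dark pixels
    r_n, g_n, b_n = r / total, g / total, b / total
    exg = 2 * g_n - r_n - b_n          # excess green: high over vegetation
    return r_n, g_n, b_n, exg

# veg_mask = normalized_bands(img.astype(np.float32))[3] > 0.05
```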
Figure 5. Pixelated segmentation of green plant leaves [85].
Color features are extracted from the pixels of images, with advantages of stable
features after rotation, scale and translation changes [86]. Weeds and crop seedlings
are the same green color. It is difficult to distinguish them by color alone [30]. The
extraction of color features requires the use of color moments, which provide unique
features for distinguishing objects based on their color. Color moments are founded on
the probability distribution of image intensities, characterized by statistical moments, like
mean, variance and skewness. These three are the central moments of intensity distribution
and can be easily found for all color spaces, such as RGB, HSV and L*a*b [6]. Apart from
these color features, there are other shape descriptors/features proposed by researchers.
Tannouche et al. [87] used a region-based adjacencies descriptor to discriminate between
Dicot and Monocot weeds. The proposed descriptor calculated two numbers of adjacencies
between a given original pixel and their adjacent pixels. The first was the number of
horizontal and vertical adjacencies, and the second one was the number of diagonal
adjacencies. Shape factors generated by transformations typically require information about the boundaries or contours of the segmented region and involve complex calculations, in contrast with region-based shape measurements and indices, which are therefore often referred to as region-based shape descriptors. Hu's
moment invariants (MIs) are popular shape descriptors, which are normalized functions
created based on the information of both shape boundary and interior region [88]. Weed
detection using machine vision relies on features, like plant color, leaf texture, shape
and patterns. Drought stress can impact leaf color and morphological features in plants,
potentially affecting the reliability of machine vision-based weed detection [89]. But they
still lack universal segmentation capabilities for different crop varieties with varying leaf
shapes and canopy structures. Designing a universal 3D segmentation method for different
varieties at multiple growth stages is the current research frontier of plant phenotyping [90].
Biomorphic feature extraction has the advantages of strong interpretability, high stability
and wide versatility in weed recognition, and it is especially suitable for scenarios that
require the identification of different types of plants.
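As a concrete illustration of the color moments and region-based shape descriptors discussed above, the sketch below computes per-channel mean, variance and skewness together with Hu's moment invariants. It is an assumed implementation, not taken from the cited studies.

```python
# Sketch: color moments (mean, variance, skewness per channel) and Hu moment
# invariants as simple color/shape descriptors. An assumed implementation.
import cv2
import numpy as np
from scipy.stats import skew

def color_moments(img: np.ndarray) -> np.ndarray:
    """First three statistical moments of each color channel (9 values)."""
    feats = []
    for c in range(3):
        channel = img[..., c].astype(np.float64).ravel()
        feats += [channel.mean(), channel.var(), skew(channel)]
    return np.array(feats)

def hu_shape_moments(mask: np.ndarray) -> np.ndarray:
    """Hu's seven moment invariants of a binary plant mask (log-scaled)."""
    hu = cv2.HuMoments(cv2.moments(mask)).ravel()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

# color_vec = color_moments(img)         # rotation/scale-stable color cues
# shape_vec = hu_shape_moments(mask)     # region-based shape cues
```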
In deep learning, hybrid feature extraction refers to the simultaneous use of multiple
levels, sources or types of features for model training and recognition. Different levels and
types of features contain different levels of abstraction and semantic information. Hybrid
feature extraction captures this diverse information and enables the model to represent
the input data more richly. Single feature extraction may ignore or lose some critical
information. Using features from multiple sources can make the model more robust and
better able to adapt to variations and noise in the input data.
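A hybrid feature vector can then be formed by simply concatenating the individual descriptors; the sketch below assumes the texture, color and shape helpers from the earlier sketches.

```python
# Sketch: hybrid feature extraction by concatenating texture, color and shape
# descriptors into one vector (helper functions from earlier sketches assumed).
import numpy as np

def hybrid_features(gray, img, mask) -> np.ndarray:
    return np.concatenate([
        glcm_features(gray),        # texture (4 values)
        color_moments(img),         # color (9 values)
        hu_shape_moments(mask),     # shape (7 values)
    ])
```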
In weed and crop recognition tasks, the attention mechanism is not always superior to traditional deep neural network structures; rather, the appropriate model structure should be selected according to the specific application scenario and task requirements.
Taskeen Ashraf et al. [66] sought to classify images based on grass density into three
classes. The first approach utilized texture features extracted from the gray-level co-
occurrence matrix (GLCM) with a Radial Basis Function (RBF) kernel in a support vector
machine (SVM), achieving an accuracy of 73%. Another technique employed scale and
rotation-invariant moments to classify grass density. The second technique outperformed
the first, achieving an accuracy of 86% with a Random Forest classifier. This kind of
quantitative agricultural spraying for different densities of weeds can effectively reduce
the use of pesticides. To improve weed recognition, some scientists have combined ma-
chine learning with deep learning. Tao, T. et al. [99] proposed a deep convolutional neural network with a support vector machine classifier aimed at improving the classification accuracy of winter oilseed rape seedlings and field weeds. They used a VGG network model with true-color images (224 × 224 pixels) of oilseed rape/weeds as input. The proposed VGG-SVM model achieved higher classification accuracy, greater robustness and real-time performance. Borja Espejo-Garcia et al. [54] proposed a novel crop/weed identification system. The method involved fine-tuning pre-trained convolutional networks, such as Xception, Inception-ResNet, VGGNets, MobileNet and DenseNet. These networks were combined with
“traditional” machine learning classifiers, like support vector machines, XGBoost, and Lo-
gistic Regression. These classifiers were trained with features extracted from deep learning
models. The aim of this approach was to prevent overfitting and achieve a robust and
consistent performance. Attention mechanisms have become increasingly popular in recent
years and can greatly increase the rate of recognition. Helong Yu et al. [29] introduced a
soybean field weed recognition model named Swin-DeepLab. This model was built upon
an enhanced DeepLabv3+ model, incorporating a Swin transformer as the feature extraction
backbone. Furthermore, a convolution block attention module (CBAM) was integrated after
each feature fusion to improve the model’s utilization of focused information within the
feature maps. The proposed network can further address the problem of weed recognition
in intensive agricultural scenarios.
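The deep-feature-plus-classical-classifier pattern used in the VGG-SVM and related studies can be sketched as follows. This assumed PyTorch/scikit-learn code extracts 4096-dimensional features from a pre-trained VGG16 and trains an SVM on them; it is illustrative rather than the cited authors' implementation.

```python
# Sketch: VGG-SVM-style hybrid - deep features from a pre-trained CNN feed a
# classical SVM classifier. Assumed PyTorch/scikit-learn reimplementation.
import torch
import torchvision.models as models
from sklearn.svm import SVC

vgg = models.vgg16(weights=models.VGG16_Weights.DEFAULT)
vgg.classifier = vgg.classifier[:-1]   # drop final layer -> 4096-d features
vgg.eval()

@torch.no_grad()
def deep_features(batch: torch.Tensor) -> torch.Tensor:
    """batch: Nx3x224x224 normalized images -> Nx4096 feature matrix."""
    return vgg(batch)

# X = deep_features(train_images).numpy()     # train_images assumed prepared
# clf = SVC(kernel="rbf").fit(X, train_labels)
# preds = clf.predict(deep_features(test_images).numpy())
```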
Figure 6. Unmanned aerial vehicles used in agriculture. (a) Unmanned aerial vehicle (UAV) hyperspectral imaging system [120]. (b) Drone spraying of pesticides.
Hile Narmilan Amarasingam et al. [40] studied the potential of machine learning (ML) algorithms for the detection of mouse-ear hawkweed leaves and flowers from multispectral (MS) images acquired by unmanned aerial vehicles (UAVs) at different spatial resolutions and compared different machine learning algorithms. The highest machine learning recognition was
achieved with 100% accuracy. Jinya Su et al. [38] analyzed and mapped blackgrass in
wheat fields by incorporating unmanned aerial vehicles (UAVs), multispectral imagery
and machine learning techniques. Eighteen widely used spectral indices were produced from five raw spectral bands. Various feature selection algorithms were then used to improve the simplicity and empirical interpretability of the model. The selection of these raw spectral bands and of vegetation indices (VIs) was important for weed identification in multispectral images. Mohd Anul Haq et al. [103] proposed a novel
CNNLVQ model to detect weeds in soybean crop images and distinguish between grassy
weeds and broadleaf weeds. The uniqueness of their study lies in the development of this
innovative CNNLVQ model, meticulous hyperparameter optimization and the utilization
of authentic datasets. Faster R-CNN stands out as a deep learning approach incorporating
a region proposal network (RPN). This network, formed by merging convolutional features
with a classification network, facilitates training and testing through a seamless process.
It results in a fast detection rate and outperforms other conventional object detection
methods. Shahbaz Khan et al. [24] optimized the architecture of the traditional Faster-R-
CNN. Residual Network 101 (ResNet-101) was deployed as a convolutional neural network
instead of the normally used Visual Geometry Group 16 (VGG16). Anchors are classified
using a traditional SoftMax classifier. In addition, Saad Abouzahir et al. [102] used HOG
blocks as key points to generate visual words based on the Bag of Visual Words (BOVW)
method and feature vectors as histograms of these visual words. A backpropagation neural network was then used to detect weeds and classify plants from three different crop fields (sugar beet, carrot and soybean). The algorithm achieved 97.7%, 93% and 96.6% accuracy in weed and crop differentiation.
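A minimal torchvision sketch of a Faster R-CNN detector with a ResNet-101 backbone and region proposal network, in the spirit of the modification described above, is shown below. The two-class-plus-background setup and construction details are illustrative assumptions.

```python
# Sketch: Faster R-CNN with a ResNet-101 backbone and RPN for weed/crop
# detection. Assumed torchvision construction; cited setups may differ.
import torch
from torchvision.models.detection import FasterRCNN
from torchvision.models.detection.backbone_utils import resnet_fpn_backbone

# ResNet-101 + FPN backbone, as in the modified architecture described above.
backbone = resnet_fpn_backbone(backbone_name="resnet101", weights=None)
# Classes: background, crop, weed.
model = FasterRCNN(backbone, num_classes=3)
model.eval()

with torch.no_grad():
    # One dummy 3-channel image; real input would be a field photograph.
    detections = model([torch.rand(3, 512, 512)])
# detections[0] holds "boxes", "labels" and "scores" for the image.
```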
Drones have an important role in identifying weeds in fields and spraying pesticides in
real time. Shahbaz Khan et al. [116] developed a deep learning-based real-time recognition
system for drones. The capability of the system is achieved through a two-step process
where the target recognizer part is based on a CNN model. The developed deep learning
system achieved an average F1 score of 0.955, while the classifier recognition average
computation time was 3.68 ms. This deep learning model can effectively solve the problem
of real-time pesticide spraying by UAVs to recognize weeds. Meanwhile, Gunasekaran
Raja et al. [71] proposed a UAV-assisted weed detection method using a modified multi-
channel gray-scale covariance matrix (GLCM-M) and normalized difference index with red
threshold (NDIRT) index (DA-WDGN) to assist the weed detection process. In DA-WDGN,
the UAV incorporates information and communication techniques to capture far-field data
and accurately detect weeds. The accurate detection of weeds limits the need for pesticides
and helps to protect the environment. Reenul Reedha et al. [28] investigated the Vision Transformer (ViT) and applied it to plant classification in unmanned aerial vehicle (UAV) images. They utilized a transfer learning strategy to increase effectiveness on the test set while reducing the size of the training set. The ViT algorithm is able to efficiently process
large-scale image data, thus better adapting to the large number of images produced by
UAVs in aerial photography. This efficient image processing capability helps to improve
the speed and accuracy of weed identification. Moreover, the ViT algorithm is based on the
self-attention mechanism, which is able to capture global information in the image and not
only limited to local features. This feature gives ViT and UAVs a huge advantage in the
future development of weed recognition.
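A minimal sketch of fine-tuning a Vision Transformer for weed/crop classification is given below. The torchvision model, two-class head and hyperparameters are assumed for illustration and do not reproduce the cited study's configuration.

```python
# Sketch: fine-tuning a Vision Transformer for weed/crop classification.
# Assumed torchvision setup; the cited study's configuration may differ.
import torch
import torch.nn as nn
from torchvision.models import vit_b_16, ViT_B_16_Weights

model = vit_b_16(weights=ViT_B_16_Weights.IMAGENET1K_V1)  # transfer learning
model.heads = nn.Linear(model.hidden_dim, 2)   # two classes: crop vs. weed

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One gradient step on a batch of 224x224 normalized UAV image patches."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```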
3.2.4. Application of Agricultural Robotics for Weed Recognition

Agricultural robots represent an important trend in modern agricultural automation. By combining machines, sensors and autonomous navigation technologies, they are revolutionizing agricultural production. Agricultural robots can include modified tractors, small ground robots and aerial robots [13]. Modern agricultural equipment integrates advanced technologies, such as artificial intelligence, navigation, sensing systems and communication, to increase agricultural productivity and promote smart agriculture [22,122,123].

Figure 7. (a) An autonomous agricultural robot used for weed removal [130]. (b) Precision agricultural sprayer [6]. (c) YOLOX-based blue laser cornfield weeding robot [16]. (d) Components of the modular agrochemical precision sprayer mounted on a push-type frame [107].
4. Discussion

In the context of the development of artificial intelligence, smart agriculture is the development direction for major agricultural countries, and intelligent agriculture is inseparable from intelligent agricultural equipment. In recent years, the booming development of agricultural robots, agricultural drones, satellites and other equipment has provided a new pathway for intelligent agriculture. All three types of smart agricultural equipment represent the mainstream of the future and have great potential for application in the development of smart farms.
Satellites, as part of smart farming equipment, play an important role in delineating farm
boundaries for effective farm management. However, they are somewhat lacking in weed and
crop identification. Waldner, F. et al. proposed a method to facilitate the extraction of site
boundaries from satellite images [113]. The use of satellite technology to segment and
monitor sites in agriculture has a number of benefits that can help farmers to plan land use
more accurately. This includes identifying the most suitable locations for specific crops,
avoiding overuse of land and increasing the sustainable utilization of agricultural land.
It also allows for better allocation of resources, such as water, fertilizers and pesticides, reducing resource waste and environmental pollution. Drones, operating overhead, play an integral role in smart agriculture. For example, high-resolution image acquisition provides datasets for the training of deep learning algorithms, and the detection and identification of crops using sensors allows for the precise application of agrochemicals and irrigation [24,38]. Combining deep learning with drones allows for weed and crop identification and targeted pesticide spraying. Based on RGB camera sensing, CNNs achieve more than 92% accuracy in weed recognition, higher than that of classical machine learning. The results shown by ViT point toward real-time weed recognition and pesticide spraying by drones in the future. Weeds can thus be dealt with more efficiently and
with less wastage of resources. Deep learning-based agricultural robots achieve roughly 95% accuracy in weed recognition. The use of agricultural robots in agriculture is not limited to
data collection and weed identification and processing, as it also allows for precise picking
and harvesting of crops. Overall, the combination of deep learning and smart agricultural
equipment has been widely used in weed/crop identification research. In smart agriculture
scenarios, deep learning has been used to solve the problem of crop and weed identification.
Deep learning has four steps in weed/crop detection: data collection, dataset preparation,
weed detection and weed/crop localization and classification. First of all, for the collection
of datasets, with the help of intelligent agricultural equipment, the collection of images
is no longer a problem. Moreover, a variety of sensors have improved the quality of
image acquisition. Multispectral cameras have advantages over both RGB and hyperspectral cameras: they provide more spectral bands than RGB cameras and are cheaper than hyperspectral cameras, so they can be utilized in smart agriculture to reduce cost and improve the quality of collected images [38,39]. Thermal measurements
from thermal infrared sensors can complement measurements from other sensors, such as
hyperspectral, visible and optical distance, and have also been shown to be more effective
in field crop phenotyping [44]. For training datasets, manual labeling by researchers is
still required, which is a very labor-intensive task. However, semi-supervised learning
algorithms and unsupervised learning algorithms are a worthwhile solution for the future,
as they can perform labeling during iterations, greatly reducing the human workload.
Feature extraction of weeds and crops is an important part of the recognition process, and
the main features are texture features, spectral features, spatial features and biomorphic
features. All four features have a great role in weed recognition by deep learning, but the
current trend in recognition is hybrid feature extraction of spectral features, texture features
and biomorphic features. The similarity between weeds and crops makes using a single
image feature to detect weeds and crops almost impossible. The commonly used image
features can achieve the purpose of weed detection, but the experimental accuracy is low,
and the stability is poor in a nonideal environment due to the complex interference factors
in the actual field. Acquired images need to be preprocessed for better recognition and
classification. Researchers have segmented crops from the background using threshold segmentation and color segmentation and performed noise reduction on the images [34,131].
The performance of different deep learning algorithm models in weed/crop identi-
fication is influenced by a variety of factors. The main factor is the network structure. In
general, lightweight CNN models are less accurate in weed recognition compared to CNN
models. However, lightweight CNN models are usually designed to be more concise, using
fewer parameters and computational resources, and they require relatively less memory
space [104,109]. Some of the lightweighting techniques include network pruning, quantization and depthwise separable convolution, which aim to minimize the size of the model while maximizing the retention of its representational power. Due to performance improvements in the Faster R-CNN architecture, it is possible to perform object detection, image classification and instance segmentation simultaneously in a single neural network. Researchers have improved Mask R-CNN by adding an attention mechanism and depthwise separable convolution. This approach improves the model's ability to represent weed-related features and reduces the number of model parameters, increasing computational
speed [132]. In addition to this, the performance of deep learning algorithms is greatly
influenced by the training strategy used. The training strategy involves the training process
of the model, selection of hyperparameters, data augmentation, etc. For example, batch
normalization of deep learning models by some researchers accelerates training and im-
proves the generalization performance of the model [54]. In addition, the input dataset is
key to training deep learning models as it is the basic source of information. The accuracy
of deep learning is improved by data augmentation of sample images, as stated in Section 2
of this article. Algorithmic models such as Swin transformer and DeepLabv3+ also excel in
weed identification.
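To illustrate the lightweighting idea mentioned above, a depthwise separable convolution factorizes a standard convolution into a per-channel spatial filter followed by a 1 × 1 pointwise convolution, sharply cutting parameters; a minimal PyTorch sketch (an assumed illustration) follows.

```python
# Sketch: a depthwise separable convolution block, the lightweighting
# primitive mentioned above. For C_in=64, C_out=128, k=3 it uses
# 64*9 + 64*128 = 8768 weights versus 64*128*9 = 73728 for a standard conv.
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    def __init__(self, c_in: int, c_out: int, kernel_size: int = 3):
        super().__init__()
        # Depthwise: one spatial filter per input channel (groups=c_in).
        self.depthwise = nn.Conv2d(c_in, c_in, kernel_size,
                                   padding=kernel_size // 2, groups=c_in)
        # Pointwise: 1x1 convolution mixes information across channels.
        self.pointwise = nn.Conv2d(c_in, c_out, kernel_size=1)

    def forward(self, x):
        return self.pointwise(self.depthwise(x))
```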
5. Challenges for Weed Recognition in Smart Farming Equipment and Future Trends
In terms of future development, the combination of sensor and drone technology
can effectively increase the efficiency of identification. Among the recent innovations, un-
manned aerial vehicles (UAVs) or drones have demonstrated their suitability for the timely
tracking and assessment of vegetation status due to several advantages, as follows: (1) They
can operate at low altitudes to provide aerial imagery with ultra-high spatial resolution,
allowing for the detection of fine details of vegetation. (2) The flights can be scheduled
with great flexibility according to critical moments imposed by vegetation progress over
time. (3) They can use diverse sensors and perception systems, acquiring different ranges of
the vegetation spectrum (visible, infrared, thermal). (4) This technology can also generate
digital surface models (DSMs) with three-dimensional (3D) measurements of vegetation by
using highly overlapping images and applying photoreconstruction procedures with the
structure-from-motion (SfM) technique [23,35,44].
The future of agricultural robotics promises more developments in weed removal:
(1) Increased intelligence and autonomy: Future agricultural robots will be more intelligent,
with highly autonomous decision-making capabilities. Combined with artificial intelligence
and deep learning technology, the robot can analyze farmland images and data in real time,
make intelligent weed identification and weeding decisions, without human intervention,
and improve operational efficiency. (2) The integration of multimodal sensing technology:
Agricultural robots will integrate a variety of sensors, including vision, infrared, ultrasonic
and other multimodal sensors, to obtain richer and more accurate information about the
farmland. This will help identify weeds more accurately and adapt to different farmland
environments. (3) Efficient and precise weeding technology: Future agricultural robots
will use more precise and efficient weeding technology. This will require more advanced
weeding systems and automated control technologies. Although laser weeding currently shows clear advantages, there are still issues to consider, such as operational safety and the risk of fire [124,125].
Deep learning also faces several challenges in weed and crop recognition. First, due to
the small visual differences between weeds and crops, there are large similarities between
categories, which leads to models that are prone to confusion. In addition, there are varia-
tions in weeds and crops such as growth stages and environmental differences, and the
models need to have good generalization capabilities to accommodate these variations [73].
In addition, datasets are costly to annotate, especially when collected and labeled in a large-
scale farmland environment. This poses certain difficulties in model training. To overcome
these challenges, future research can be expanded in the following aspects: First, further
improve the robustness and generalization ability of deep learning models, and design
more effective feature extraction methods and classification algorithms for the similarities
between weeds and crops. Second, develop larger-scale datasets containing samples from
different times, locations and farming conditions to enhance the generalization ability of
the model. At the same time, techniques such as augmented learning and transfer learning
are reasonably utilized to achieve better results with fewer data. In addition, combining
sensors and smart agricultural equipment technologies for the real-time identification of
weeds and crops contributes to intelligent and precise decision making in agricultural
production. Proper dosage of plant protection products is one of the key issues in agricul-
tural production. Using advanced sensor technology, crop growth can be monitored more
accurately. This technology allows for the timely dosing of weeds or diseases. Spraying the
right amount of insecticide will neither cause contamination by using too much nor reduce
crop yields by using too little.
6. Conclusions
This review concentrates on the forefront applications of intelligent agricultural equip-
ment, specifically emphasizing crop and weed identification, pivotal components in the
trajectory of smart agriculture. The integration of sensors into smart agricultural equipment
assumes a critical role in data acquisition, capturing extensive sets of high-dimensional
images that serve as foundational training data for deep learning algorithms. Various
preprocessing techniques are employed to refine the algorithmic processes, encompassing
noise reduction, background effect elimination and image resizing. Deep learning algo-
rithms emerge as powerful tools capable of analyzing complex, high-dimensional data with
distinct characteristics compared to the training set, facilitating accurate crop identification.
The adoption of hybrid feature extraction techniques underscores the inherent advantages
of leveraging multiple features in tandem, contributing significantly to the efficacy of weed
and crop identification processes. In the realm of machine learning and deep learning,
the attention mechanism stands out as a particularly valuable and promising learning
algorithm. Renowned for its high accuracy and expedited processing time, the attention
mechanism proves advantageous in the context of crop and weed identification. These
attributes position it as a formidable asset for smart agricultural equipment engaged in
real-time weeding operations within agricultural fields. The emphasis on attention mecha-
nisms reflects a forward-looking perspective, acknowledging their potential to augment
the efficiency and accuracy of smart agricultural practices, particularly in the domain of
weed management.
References
1. Murad, N.Y.; Mahmood, T.; Forkan, A.R.M.; Morshed, A.; Jayaraman, P.P.; Siddiqui, M.S. Weed Detection Using Deep Learning:
A Systematic Literature Review. Sensors 2023, 23, 3670. [CrossRef] [PubMed]
2. Hamuda, E.; Glavin, M.; Jones, E. A survey of image processing techniques for plant extraction and segmentation in the field.
Comput. Electron. Agric. 2016, 125, 184–199. [CrossRef]
3. Llewellyn, R.; Ronning, D.; Clarke, M.; Mayfield, A.; Walker, S.; Ouzman, J. Impact of Weeds in Australian Grain Production; Grains
Research and Development Corporation: Canberra, Australia, 2016.
4. Chen, Y.; Wu, Z.; Zhao, B.; Fan, C.; Shi, S. Weed and Corn Seedling Detection in Field Based on Multi Feature Fusion and Support
Vector Machine. Sensors 2021, 21, 212. [CrossRef]
5. Du, Y.; Zhang, G.; Tsang, D.; Jawed, M.K. Deep-CNN based Robotic Multi-Class Under-Canopy Weed Control in Precision
Farming. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Philadelphia, PA, USA, 23–27
May 2022; pp. 2273–2279.
6. Tufail, M.; Iqbal, J.; Tiwana, M.I.; Alam, M.S.; Khan, Z.A.; Khan, M.T. Identification of Tobacco Crop Based on Machine Learning
for a Precision Agricultural Sprayer. IEEE Access 2021, 9, 23814–23825. [CrossRef]
7. Lease, B.A.; Wong, W.K.; Gopal, L.; Chiong, W.R. Weed Pixel Level Classification Based on Evolving Feature Selection on Local
Binary Pattern with Shallow Network Classifier. In Proceedings of the 2nd International Conference on Materials Technology and
Energy (ICMTE), Curtin Univ Malaysia, Sarawak, Malaysia, 6–8 November 2020.
8. Mogili, U.M.R.; Deepak, B.B.V.L. Review on Application of Drone Systems in Precision Agriculture. In Proceedings of the 1st
International Conference on Robotics and Smart Manufacturing (RoSMa), Chennai, India, 19–21 July 2018; pp. 502–509.
9. Tataridas, A.; Kanatas, P.; Chatzigeorgiou, A.; Zannopoulos, S.; Travlos, I. Sustainable Crop and Weed Management in the Era of
the EU Green Deal: A Survival Guide. Agronomy 2022, 12, 589. [CrossRef]
10. Jeanmart, S.; Edmunds, A.J.F.; Lamberth, C.; Pouliot, M. Synthetic approaches to the 2010-2014 new agrochemicals. Bioorganic
Med. Chem. 2016, 24, 317–341. [CrossRef]
11. Eyre, M.D.; Critchley, C.N.R.; Leifert, C.; Wilcockson, S.J. Crop sequence, crop protection and fertility management effects on
weed cover in an organic/conventional farm management trial. Eur. J. Agron. 2011, 34, 153–162. [CrossRef]
12. Ampatzidis, Y.; De Bellis, L.; Luvisi, A. iPathology: Robotic Applications and Management of Plants and Plant Diseases.
Sustainability 2017, 9, 1010. [CrossRef]
13. Aravind, K.R.; Raja, P.; Perez-Ruiz, M. Task-based agricultural mobile robots in arable farming: A review. Span. J. Agric. Res. 2017,
15, e02R01-01. [CrossRef]
14. Su, W.-H. Advanced Machine Learning in Point Spectroscopy, RGB- and Hyperspectral-Imaging for Automatic Discriminations
of Crops and Weeds: A Review. Smart Cities 2020, 3, 767–792. [CrossRef]
15. Ringland, J.; Bohm, M.; Baek, S.-R. Characterization of food cultivation along roadside transects with Google Street View imagery
and deep learning. Comput. Electron. Agric. 2019, 158, 36–50. [CrossRef]
16. Zhu, H.B.; Zhang, Y.Y.; Mu, D.L.; Bai, L.Z.; Zhuang, H.; Li, H. YOLOX-based blue laser weeding robot in corn field. Front. Plant
Sci. 2022, 13, 1017803. [CrossRef]
17. Bah, M.D.; Hafiane, A.; Canals, R. Deep Learning with Unsupervised Data Labeling for Weed Detection in Line Crops in UAV
Images. Remote Sens. 2018, 10, 1690. [CrossRef]
18. Teimouri, N.; Dyrmann, M.; Nielsen, P.R.; Mathiassen, S.K.; Somerville, G.J.; Jorgensen, R.N. Weed Growth Stage Estimator Using
Deep Convolutional Neural Networks. Sensors 2018, 18, 1580. [CrossRef]
19. Oghaz, M.M.; Razaak, M.; Kerdegari, H.; Argyriou, V.; Remagnino, P. Scene and Environment Monitoring Using Aerial Imagery
and Deep Learning. In Proceedings of the 15th Annual International Conference on Distributed Computing in Sensor Systems
(DCOSS), Santorini, Greece, 29–31 May 2019; pp. 362–369.
20. Zhu, S.; Deng, J.; Zhang, Y.; Yang, C.; Yan, Z.; Xie, Y. Study on distribution map of weeds in rice field based on UAV remote
sensing. J. South China Agric. Univ. 2020, 41, 67–74. [CrossRef]
21. Zualkernan, I.; Abuhani, D.A.; Hussain, M.H.; Khan, J.; ElMohandes, M. Machine Learning for Precision Agriculture Using
Imagery from Unmanned Aerial Vehicles (UAVs): A Survey. Drones 2023, 7, 382. [CrossRef]
22. Shi, J.Y.; Bai, Y.H.; Diao, Z.H.; Zhou, J.; Yao, X.B.; Zhang, B.H. Row Detection Based Navigation and Guidance for Agricultural
Robots and Autonomous Vehicles in Row-Crop Fields: Methods and Applications. Agronomy 2023, 13, 1780. [CrossRef]
23. de Castro, A.I.; Shi, Y.; Maja, J.M.; Pena, J.M. UAVs for Vegetation Monitoring: Overview and Recent Scientific Contributions.
Remote Sens. 2021, 13, 2139. [CrossRef]
24. Khan, S.; Tufail, M.; Khan, M.T.; Khan, Z.A.; Anwar, S. Deep learning-based identification system of weeds and crops in strawberry
and pea fields for a precision agriculture sprayer. Precis. Agric. 2021, 22, 1711–1727. [CrossRef]
25. Kim, Y.H.; Park, K.R. MTS-CNN: Multi-task semantic segmentation-convolutional neural network for detecting crops and weeds.
Comput. Electron. Agric. 2022, 199, 107146. [CrossRef]
26. Deepa, S.N.; Rasi, D. FHGSO: Flower Henry gas solubility optimization integrated deep convolutional neural network for image
classification. Appl. Intell. 2022, 53, 7278–7297. [CrossRef]
27. Babu, V.S.; Ram, N.V. Deep Residual CNN with Contrast Limited Adaptive Histogram Equalization for Weed Detection in
Soybean Crops. Trait. Signal 2022, 39, 717–722. [CrossRef]
28. Reedha, R.; Dericquebourg, E.; Canals, R.; Hafiane, A. Transformer Neural Network for Weed and Crop Classification of High
Resolution UAV Images. Remote Sens. 2022, 14, 592. [CrossRef]
29. Yu, H.; Che, M.; Yu, H.; Zhang, J. Development of Weed Detection Method in Soybean Fields Utilizing Improved DeepLabv3+
Platform. Agronomy 2022, 12, 2889. [CrossRef]
30. Sun, Y.; Chen, Y.; Jin, X.; Yu, J.; Chen, Y. AI differentiation of bok choy seedlings from weeds. Fujian J. Agric. Sci. 2021, 36,
1484–1490. [CrossRef]
31. Wu, Z.N.; Chen, Y.J.; Zhao, B.; Kang, X.B.; Ding, Y.Y. Review of Weed Detection Methods Based on Computer Vision. Sensors 2021,
21, 3647. [CrossRef] [PubMed]
32. Xu, X.; Wang, L.; Shu, M.; Liang, X.; Ghafoor, A.Z.; Liu, Y.; Ma, Y.; Zhu, J. Detection and Counting of Maize Leaves Based on
Two-Stage Deep Learning with UAV-Based RGB Image. Remote Sens. 2022, 14, 5388. [CrossRef]
33. Fan, K.-J.; Su, W.-H. Applications of Fluorescence Spectroscopy, RGB- and MultiSpectral Imaging for Quality Determinations of
White Meat: A Review. Biosensors 2022, 12, 76. [CrossRef]
34. Li, Y.; Al-Sarayreh, M.; Irie, K.; Hackell, D.; Bourdot, G.; Reis, M.M.; Ghamkhar, K. Identification of Weeds Based on Hyperspectral
Imaging and Machine Learning. Front. Plant Sci. 2021, 11, 611622. [CrossRef]
35. Diao, Z.; Yan, J.; He, Z.; Zhao, S.; Guo, P. Corn seedling recognition algorithm based on hyperspectral image and lightweight-3D-
CNN. Comput. Electron. Agric. 2022, 201, 107343. [CrossRef]
36. Dashti, H.; Glenn, N.F.; Ustin, S.; Mitchell, J.J.; Qi, Y.; Ilangakoon, N.T.; Flores, A.N.; Silvan-Cardenas, J.L.; Zhao, K.; Spaete,
L.P.; et al. Empirical Methods for Remote Sensing of Nitrogen in Drylands May Lead to Unreliable Interpretation of Ecosystem
Function. IEEE Trans. Geosci. Remote Sens. 2019, 57, 3993–4004. [CrossRef]
37. Lou, Z.; Quan, L.; Sun, D.; Li, H.; Xia, F. Hyperspectral remote sensing to assess weed competitiveness in maize farmland
ecosystems. Sci. Total Environ. 2022, 844, 157071. [CrossRef] [PubMed]
38. Su, J.; Yi, D.; Coombes, M.; Liu, C.; Zhai, X.; McDonald-Maier, K.; Chen, W.-H. Spectral analysis and mapping of blackgrass weed
by leveraging machine learning and UAV multispectral imagery. Comput. Electron. Agric. 2022, 192, 106621. [CrossRef]
39. Su, J.; Coombes, M.; Liu, C.; Zhu, Y.; Song, X.; Fang, S.; Guo, L.; Chen, W.H. Machine Learning-Based Crop Drought Mapping
System by UAV Remote Sensing RGB Imagery. Unmanned Syst. 2020, 8, 71–83. [CrossRef]
40. Amarasingam, N.; Hamilton, M.; Kelly, J.E.; Zheng, L.; Sandino, J.; Gonzalez, F.; Dehaan, R.L.; Cherry, H. Autonomous Detection
of Mouse-Ear Hawkweed Using Drones, Multispectral Imagery and Supervised Machine Learning. Remote Sens. 2023, 15, 1633.
[CrossRef]
41. Lopez, L.O.; Ortega, G.; Aguera-Vega, F.; Carvajal-Ramirez, F.; Martinez-Carricondo, P.; Garzon, E.M. Multispectral Imaging for
Weed Identification in Herbicides Testing. Informatica 2022, 33, 771–793. [CrossRef]
42. Aguera-Vega, F.; Aguera-Puntas, M.; Aguera-Vega, J.; Martinez-Carricondo, P.; Carvajal-Ramirez, F. Multi-sensor imagery
rectification and registration for herbicide testing. Measurement 2021, 175, 109049. [CrossRef]
43. Allred, B.; Martinez, L.; Fessehazion, M.K.; Rouse, G.; Williamson, T.N.; Wishart, D.; Koganti, T.; Freeland, R.; Eash, N.; Batschelet,
A.; et al. Overall results and key findings on the use of UAV visible-color, multispectral, and thermal infrared imagery to map
agricultural drainage pipes. Agric. Water Manag. 2020, 232, 106036. [CrossRef]
44. Eide, A.; Koparan, C.; Zhang, Y.; Ostlie, M.; Howatt, K.; Sun, X. UAV-Assisted Thermal Infrared and Multispectral Imaging of
Weed Canopies for Glyphosate Resistance Detection. Remote Sens. 2021, 13, 4606. [CrossRef]
45. Pineda, M.; Baron, M.; Perez-Bueno, M.L. Thermal Imaging for Plant Stress Detection and Phenotyping. Remote Sens. 2021, 13, 68.
[CrossRef]
46. Wang, X.; Pan, H.; Guo, K.; Yang, X.; Luo, S. The evolution of LiDAR and its application in high precision measurement. IOP Conf.
Ser. Earth Environ. Sci. 2020, 502, 012008. [CrossRef]
47. Moreno, H.; Valero, C.; Bengochea-Guevara, J.M.; Ribeiro, A.; Garrido-Izard, M.; Andujar, D. On-Ground Vineyard Reconstruction
Using a LiDAR-Based Automated System. Sensors 2020, 20, 1102. [CrossRef] [PubMed]
48. Sudars, K.; Jasko, J.; Namatevs, I.; Ozola, L.; Badaukis, N. Dataset of annotated food crops and weed images for robotic computer
vision control. Data Brief 2020, 31, 105833. [CrossRef]
49. Olsen, A.; Konovalov, D.A.; Philippa, B.; Ridd, P.; Wood, J.C.; Johns, J.; Banks, W.; Girgenti, B.; Kenny, O.; Whinney, J.; et al.
DeepWeeds: A Multiclass Weed Species Image Dataset for Deep Learning. Sci. Rep. 2019, 9, 2058. [CrossRef]
50. Jiang, H.H.; Zhang, C.Y.; Qiao, Y.L.; Zhang, Z.; Zhang, W.J.; Song, C.Q. CNN feature based graph convolutional network for weed
and crop recognition in smart farming. Comput. Electron. Agric. 2020, 174, 105450. [CrossRef]
51. Sa, I.; Chen, Z.T.; Popovic, M.; Khanna, R.; Liebisch, F.; Nieto, J.; Siegwart, R. weedNet: Dense Semantic Weed Classification
Using Multispectral Images and MAV for Smart Farming. IEEE Robot. Autom. Lett. 2018, 3, 588–595. [CrossRef]
52. Binch, A.; Fox, C.W. Controlled comparison of machine vision algorithms for Rumex and Urtica detection in grassland. Comput.
Electron. Agric. 2017, 140, 123–138. [CrossRef]
53. Osorio, K.; Puerto, A.; Pedraza, C.; Jamaica, D.; Rodriguez, L. A Deep Learning Approach for Weed Detection in Lettuce Crops
Using Multispectral Images. Agriengineering 2020, 2, 471–488. [CrossRef]
54. Espejo-Garcia, B.; Mylonas, N.; Athanasakos, L.; Fountas, S.; Vasilakoglou, I. Towards weeds identification assistance through
transfer learning. Comput. Electron. Agric. 2020, 171, 105306. [CrossRef]
55. Alam, M.S.; Alam, M.; Tufail, M.; Khan, M.U.; Guenes, A.; Salah, B.; Nasir, F.E.; Saleem, W.; Khan, M.T. TobSet: A New Tobacco
Crop and Weeds Image Dataset and Its Utilization for Vision-Based Spraying by Agricultural Robots. Appl. Sci. 2022, 12, 1308.
[CrossRef]
56. Champ, J.; Mora-Fallas, A.; Goeau, H.; Mata-Montero, E.; Bonnet, P.; Joly, A. Instance segmentation for the fine detection of crop
and weed plants by precision agricultural robots. Appl. Plant Sci. 2020, 8, e11373. [CrossRef]
57. Di Cicco, M.; Potena, C.; Grisetti, G.; Pretto, A. Automatic Model Based Dataset Generation for Fast and Accurate Crop and
Weeds Detection. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)/Workshop
on Machine Learning Methods for High-Level Cognitive Capabilities in Robotics, Vancouver, BC, Canada, 24–28 September 2017;
pp. 5188–5195.
58. Hasan, A.; Sohel, F.; Diepeveen, D.; Laga, H.; Jones, M.G.K. A survey of deep learning techniques for weed detection from images.
Comput. Electron. Agric. 2021, 184, 106067. [CrossRef]
59. Wang, A.; Xu, Y.; Wei, X.; Cui, B. Semantic Segmentation of Crop and Weed using an Encoder-Decoder Network and Image
Enhancement Method under Uncontrolled Outdoor Illumination. IEEE Access 2020, 8, 81724–81734. [CrossRef]
60. Ramirez, W.; Achanccaray, P.; Mendoza, L.F.; Pacheco, M.A.C. Deep Convolutional Neural Networks for Weed Detection in
Agricultural Crops Using Optical Aerial Images. In Proceedings of the IEEE Latin American GRSS and ISPRS Remote Sensing
Conference (LAGIRS), Santiago, Chile, 21–26 March 2020; pp. 133–137.
61. Vypirailenko, D.; Kiseleva, E.; Shadrin, D.; Pukalchik, M. Deep learning techniques for enhancement of weeds growth classification.
In Proceedings of the IEEE International Instrumentation and Measurement Technology Conference (I2MTC), Glasgow, UK,
17–20 May 2021.
62. Gee, C.; Denimal, E. RGB Image-Derived Indicators for Spatial Assessment of the Impact of Broadleaf Weeds on Wheat Biomass.
Remote Sens. 2020, 12, 2982. [CrossRef]
63. Slaughter, D.C. The Biological Engineer: Sensing the Difference Between Crops and Weeds. In Automation: The Future of Weed
Control in Cropping Systems; Young, S.L., Pierce, F.J., Eds.; Springer: Dordrecht, The Netherlands, 2014; pp. 71–95. [CrossRef]
64. Al-Badri, A.H.; Ismail, N.A.; Al-Dulaimi, K.; Salman, G.A.; Khan, A.R.; Al-Sabaawi, A.; Salam, M.S.H. Classification of weed
using machine learning techniques: A review-challenges, current and future potential techniques. J. Plant Dis. Prot. 2022, 129,
745–768. [CrossRef]
65. Cimpoi, M.; Maji, S.; Kokkinos, I.; Vedaldi, A. Deep Filter Banks for Texture Recognition, Description, and Segmentation. Int. J.
Comput. Vis. 2016, 118, 65–94. [CrossRef] [PubMed]
66. Ashraf, T.; Khan, Y.N. Weed density classification in rice crop using computer vision. Comput. Electron. Agric. 2020, 175, 105590.
[CrossRef]
67. Ayalew, G.; Zaman, Q.U.; Schumann, A.W.; Percival, D.C.; Chang, Y. An investigation into the potential of Gabor wavelet features
for scene classification in wild blueberry fields. Artif. Intell. Agric. 2021, 5, 72–81. [CrossRef]
68. Zhang, L.; Zhang, Z.; Wu, C.; Sun, L. Segmentation algorithm for overlap recognition of seedling lettuce and weeds based on
SVM and image blocking. Comput. Electron. Agric. 2022, 201. [CrossRef]
69. Miao, R.; Yang, H.; Wu, J.; Liu, H. Weed identification of overlapping spinach leaves based on image sub-block and reconstruction.
Trans. Chin. Soc. Agric. Eng. 2020, 36, 178–184.
70. Le, V.N.T.; Ahderom, S.; Alameh, K. Performances of the LBP Based Algorithm over CNN Models for Detecting Crops
and Weeds with Similar Morphologies. Sensors 2020, 20, 2193. [CrossRef]
71. Raja, G.; Dev, K.; Philips, N.D.; Suhaib, S.A.M.; Deepakraj, M.; Ramasamy, R.K. DA-WDGN: Drone-Assisted Weed Detection
using GLCM-M features and NDIRT indices. In Proceedings of the IEEE Conference on Computer Communications Workshops
(IEEE INFOCOM), Vancouver, BC, Canada, 9–12 May 2021.
72. Zaman, M.H.M.; Mustaza, S.M.; Ibrahim, M.F.; Zulkifley, M.A.; Mustafa, M.M. Weed Classification Based on Statistical Features
from Gabor Transform Magnitude. In Proceedings of the International Conference on Decision Aid Sciences and Application
(DASA), Sakheer, Bahrain, 7–8 December 2021.
73. Wang, A.; Zhang, W.; Wei, X. A review on weed detection using ground-based machine vision and image processing techniques.
Comput. Electron. Agric. 2019, 158, 226–240. [CrossRef]
74. Bailey, D.; Chang, Y.; Le Moan, S. Analysing Arbitrary Curves from the Line Hough Transform. J. Imaging 2020, 6, 26. [CrossRef]
[PubMed]
75. Teplyakov, L.; Kaymakov, K.; Shvets, E.; Nikolaev, D. Line detection via a lightweight CNN with a Hough Layer. In Proceedings
of the 13th International Conference on Machine Vision, Rome, Italy, 2–6 November 2021.
76. Qi, M.; Wang, Y.; Chen, Y.; Xin, H.; Xu, Y.; Meng, H.; Wang, A. Center detection algorithm for printed circuit board circular marks
based on image space and parameter space. J. Electron. Imaging 2023, 32, 011002. [CrossRef]
77. Islam, N.; Rashid, M.M.; Wibowo, S.; Xu, C.-Y.; Morshed, A.; Wasimi, S.A.; Moore, S.; Rahman, S.M. Early Weed Detection Using
Image Processing and Machine Learning Techniques in an Australian Chilli Farm. Agriculture 2021, 11, 387. [CrossRef]
78. Fawakherji, M.; Potena, C.; Pretto, A.; Bloisi, D.D.; Nardi, D. Multispectral Image Synthesis for Crop/Weed Segmentation in
Precision Farming. Robot. Auton. Syst. 2021, 146, 103861. [CrossRef]
79. Ustin, S.L.; Jacquemoud, S. How the Optical Properties of Leaves Modify the Absorption and Scattering of Energy and Enhance
Leaf Functionality. Remote Sens. Plant Biodivers. 2020, 14, 349–384.
80. Zhu, W.; Sun, Z.; Huang, Y.; Yang, T.; Li, J.; Zhu, K.; Zhang, J.; Yang, B.; Shao, C.; Peng, J.; et al. Optimization of multi-source UAV
RS agro-monitoring schemes designed for field-scale crop phenotyping. Precis. Agric. 2021, 22, 1768–1802. [CrossRef]
81. Calderon, R.; Montes-Borrego, M.; Landa, B.B.; Navas-Cortes, J.A.; Zarco-Tejada, P.J. Detection of downy mildew of opium poppy
using high-resolution multispectral and thermal imagery acquired with an unmanned aerial vehicle. Precis. Agric. 2014, 15,
639–661. [CrossRef]
82. Bellvert, J.; Zarco-Tejada, P.J.; Girona, J.; Fereres, E. Mapping crop water stress index in a ‘Pinot-noir’ vineyard: Comparing
ground measurements with thermal remote sensing imagery from an unmanned aerial vehicle. Precis. Agric. 2014, 15, 361–376.
[CrossRef]
83. Sabat-Tomala, A.; Raczko, E.; Zagajewski, B. Comparison of Support Vector Machine and Random Forest Algorithms for Invasive
and Expansive Species Classification Using Airborne Hyperspectral Data. Remote Sens. 2020, 12, 516. [CrossRef]
84. Shen, Y.; Yin, Y.; Li, B.; Zhao, C.; Li, G. Detection of impurities in wheat using terahertz spectral imaging and convolutional neural
networks. Comput. Electron. Agric. 2021, 181, 105931. [CrossRef]
85. Guo, X.; Ge, Y.; Liu, F.; Yang, J. Identification of maize and wheat seedlings and weeds based on deep learning. Front. Earth Sci.
2023, 11, 1146558. [CrossRef]
86. Wang, Y.; Zhang, X.; Ma, G.; Du, X.; Shaheen, N.; Mao, H. Recognition of weeds at asparagus fields using multi-feature fusion
and backpropagation neural network. Int. J. Agric. Biol. Eng. 2021, 14, 190–198. [CrossRef]
87. Tannouche, A.; Sbai, K.; Rahmoune, M.; Zoubir, A.; Agounoune, R.; Saadani, R.; Rahmani, A. A Fast and Efficient Shape
Descriptor for an Advanced Weed Type Classification Approach. Int. J. Electr. Comput. Eng. 2016, 6, 1168–1175.
88. Bakhshipour, A.; Jafari, A. Evaluation of support vector machine and artificial neural networks in weed detection using shape
features. Comput. Electron. Agric. 2018, 145, 153–160. [CrossRef]
89. Zhuang, J.; Jin, X.; Chen, Y.; Meng, W.; Wang, Y.; Yu, J.; Muthukumar, B. Drought stress impact on the performance of deep
convolutional neural networks for weed detection in Bahiagrass. Grass Forage Sci. 2023, 78, 214–223. [CrossRef]
90. Li, D.; Shi, G.; Li, J.; Chen, Y.; Zhang, S.; Xiang, S.; Jin, S. PlantNet: A dual-function point cloud segmentation network for multiple
plant species. ISPRS J. Photogramm. Remote Sens. 2022, 184, 243–263. [CrossRef]
91. Schmidhuber, J. Deep learning in neural networks: An overview. Neural Netw. 2015, 61, 85–117. [CrossRef]
92. Zhu, Y.; Wang, M.; Yin, X.; Zhang, J.; Meijering, E.; Hu, J. Deep Learning in Diverse Intelligent Sensor Based Systems. Sensors
2023, 23, 62. [CrossRef]
93. Garibaldi-Marquez, F.; Flores, G.; Mercado-Ravell, D.A.; Ramirez-Pedraza, A.; Valentin-Coronado, L.M. Weed Classification from
Natural Corn Field-Multi-Plant Images Based on Shallow and Deep Learning. Sensors 2022, 22, 3021. [CrossRef]
94. Alzubaidi, L.; Zhang, J.; Humaidi, A.J.; Al-Dujaili, A.Q.; Duan, Y.; Al-Shamma, O.; Santamaría, J.; Fadhel, M.A.; Al-Amidie, M.;
Farhan, L. Review of deep learning: Concepts, CNN architectures, challenges, applications, future directions. J. Big Data 2021, 8,
53. [CrossRef]
95. Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention is all you need.
In Proceedings of the 31st International Conference on Neural Information Processing Systems, Long Beach, CA, USA, 4–9
December 2017; pp. 6000–6010.
96. Dosovitskiy, A.; Beyer, L.; Kolesnikov, A.; Weissenborn, D.; Zhai, X.; Unterthiner, T.; Dehghani, M.; Minderer, M.; Heigold, G.; Gelly, S.; et al. An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale. arXiv 2020, arXiv:2010.11929.
97. Jiang, K.; Afzaal, U.; Lee, J. Transformer-Based Weed Segmentation for Grass Management. Sensors 2023, 23, 65. [CrossRef]
98. Khan, S.; Naseer, M.; Hayat, M.; Zamir, S.W.; Khan, F.S.; Shah, M. Transformers in Vision: A Survey. ACM Comput. Surv. 2022, 54, 200.
[CrossRef]
99. Tao, T.; Wei, X. A hybrid CNN-SVM classifier for weed recognition in winter rape field. Plant Methods 2022, 18, 29. [CrossRef]
100. Zhang, H.; Wang, Z.; Guo, Y.; Ma, Y.; Cao, W.; Chen, D.; Yang, S.; Gao, R. Weed Detection in Peanut Fields Based on Machine
Vision. Agriculture 2022, 12, 1541. [CrossRef]
101. Jin, X.; Sun, Y.; Che, J.; Bagavathiannan, M.; Yu, J.; Chen, Y. A novel deep learning-based method for detection of weeds in
vegetables. Pest Manag. Sci. 2022, 78, 1861–1869. [CrossRef]
102. Abouzahir, S.; Sadik, M.; Sabir, E. Bag-of-visual-words-augmented Histogram of Oriented Gradients for efficient weed
detection. Biosyst. Eng. 2021, 202, 179–194. [CrossRef]
103. Haq, M.A. CNN Based Automated Weed Detection System Using UAV Imagery. Comput. Syst. Sci. Eng. 2022, 42, 837–849.
[CrossRef]
104. Milioto, A.; Lottes, P.; Stachniss, C. Real-Time Blob-Wise Sugar Beets vs. Weeds Classification for Monitoring Fields Using
Convolutional Neural Networks. In Proceedings of the International Conference on Unmanned Aerial Vehicles in Geomatics,
Bonn, Germany, 4–7 September 2017; pp. 41–48.
105. Ong, P.; Teo, K.S.; Sia, C.K. UAV-based weed detection in Chinese cabbage using deep learning. Smart Agric. Technol. 2023, 4, 100181.
[CrossRef]
106. Quan, L.; Feng, H.; Li, Y.; Wang, Q.; Zhang, C.; Liu, J.; Yuan, Z. Maize seedling detection under different growth stages and
complex field environments based on an improved Faster R-CNN. Biosyst. Eng. 2019, 184, 1–23. [CrossRef]
107. Sanchez, P.R.; Zhang, H. Evaluation of a CNN-Based Modular Precision Sprayer in Broadcast-Seeded Field. Sensors 2022, 22, 9723.
[CrossRef]
108. Zhang, W.H.; Hansen, M.F.; Volonakis, T.N.; Smith, M.; Smith, L.; Wilson, J.; Ralston, G.; Broadbent, L.; Wright, G. Broad-Leaf
Weed Detection in Pasture. In Proceedings of the 3rd IEEE International Conference on Image, Vision and Computing (ICIVC),
Chongqing, China, 27–29 June 2018; pp. 101–105.
109. McCool, C.; Perez, T.; Upcroft, B. Mixtures of Lightweight Deep Convolutional Neural Networks: Applied to Agricultural
Robotics. IEEE Robot. Autom. Lett. 2017, 2, 1344–1351. [CrossRef]
110. Asseng, S.; Asche, F. Future farms without farmers. Sci. Robot. 2019, 4, eaaw1875. [CrossRef]
111. Wang, D.S.; Cao, W.J.; Zhang, F.; Li, Z.L.; Xu, S.; Wu, X.Y. A Review of Deep Learning in Multiscale Agricultural Sensing. Remote
Sens. 2022, 14, 559. [CrossRef]
112. Zhang, H.D.; Wang, L.Q.; Tian, T.; Yin, J.H. A Review of Unmanned Aerial Vehicle Low-Altitude Remote Sensing (UAV-LARS)
Use in Agricultural Monitoring in China. Remote Sens. 2021, 13, 1221. [CrossRef]
113. Waldner, F.; Diakogiannis, F.I. Deep learning on edge: Extracting field boundaries from satellite images with a convolutional
neural network. Remote Sens. Environ. 2020, 245, 111741. [CrossRef]
114. Yuan, X.H.; Shi, J.F.; Gu, L.C. A review of deep learning methods for semantic segmentation of remote sensing imagery. Expert
Syst. Appl. 2021, 169, 114417. [CrossRef]
115. Liu, J.; Xiang, J.J.; Jin, Y.J.; Liu, R.H.; Yan, J.N.; Wang, L.Z. Boost Precision Agriculture with Unmanned Aerial Vehicle Remote
Sensing and Edge Intelligence: A Survey. Remote Sens. 2021, 13, 4387. [CrossRef]
116. Khan, S.; Tufail, M.; Khan, M.T.; Khan, Z.A.; Iqbal, J.; Wasim, A. Real-time recognition of spraying area for UAV sprayers using a
deep learning approach. PLoS ONE 2021, 16, e0249436. [CrossRef] [PubMed]
117. De Castro, A.I.; Ehsani, R.; Ploetz, R.; Crane, J.H.; Abdulridha, J. Optimum spectral and geometric parameters for early detection
of laurel wilt disease in avocado. Remote Sens. Environ. 2015, 171, 33–44. [CrossRef]
118. Xie, C.Q.; Yang, C. A review on plant high-throughput phenotyping traits using UAV-based sensors. Comput. Electron. Agric.
2020, 178, 105731. [CrossRef]
119. Allred, B.; Eash, N.; Freeland, R.; Martinez, L.; Wishart, D. Effective and efficient agricultural drainage pipe mapping with UAS
thermal infrared imagery: A case study. Agric. Water Manag. 2018, 197, 132–137. [CrossRef]
120. Guo, A.T.; Huang, W.J.; Dong, Y.Y.; Ye, H.C.; Ma, H.Q.; Liu, B.; Wu, W.B.; Ren, Y.; Ruan, C.; Geng, Y. Wheat Yellow Rust Detection
Using UAV-Based Hyperspectral Technology. Remote Sens. 2021, 13, 123. [CrossRef]
121. Radoglou-Grammatikis, P.; Sarigiannidis, P.; Lagkas, T.; Moscholios, I. A compilation of UAV applications for precision agriculture.
Comput. Netw. 2020, 172, 107148. [CrossRef]
122. Sivakumar, A.N.; Modi, S.; Gasparino, M.V.; Ellis, C.; Velasquez, A.E.B.; Chowdhary, G.; Gupta, S. Learned Visual Navigation for Under-Canopy Agricultural Robots. In Proceedings of Robotics: Science and Systems (RSS), Virtual Event, 12–16 July 2021.
123. Subeesh, A.; Mehta, C.R. Automation and digitization of agriculture using artificial intelligence and internet of things. Artif. Intell.
Agric. 2021, 5, 278–291. [CrossRef]
124. Andreasen, C.; Scholle, K.; Saberi, M. Laser Weeding With Small Autonomous Vehicles: Friends or Foes? Front. Agron. 2022, 4,
841086. [CrossRef]
125. Tran, D.; Schouteten, J.J.; Degieter, M.; Krupanek, J.; Jarosz, W.; Areta, A.; Emmi, L.; De Steur, H.; Gellynck, X. European
stakeholders’ perspectives on implementation potential of precision weed control: The case of autonomous vehicles with laser
treatment. Precis. Agric. 2023, 24, 2200–2222. [CrossRef]
126. Hussain, A.; Fatima, H.S.; Zia, S.M.; Hasan, S.; Khurram, M.; Stricker, D.; Afzal, M.Z. Development of Cost-Effective and Easily
Replicable Robust Weeding Machine-Premiering Precision Agriculture in Pakistan. Machines 2023, 11, 287. [CrossRef]
127. Xu, S.Y.; Wu, J.J.; Zhu, L.; Li, W.H.; Wang, Y.T.; Wang, N. A novel monocular visual navigation method for cotton-picking robot
based on horizontal spline segmentation. In Proceedings of the 9th International Symposium on Multispectral Image Processing
and Pattern Recognition (MIPPR)—Automatic Target Recognition and Navigation, Enshi, China, 31 October–1 November 2015.
128. Jia, W.K.; Zhang, Y.; Lian, J.; Zheng, Y.J.; Zhao, D.; Li, C.J. Apple harvesting robot under information technology: A review. Int. J.
Adv. Robot. Syst. 2020, 17, 1729881420925310. [CrossRef]
129. Jiang, W.; Quan, L.Z.; Wei, G.Y.; Chang, C.; Geng, T.Y. A conceptual evaluation of a weed control method with post-damage
application of herbicides: A composite intelligent intra-row weeding robot. Soil Tillage Res. 2023, 234, 105837. [CrossRef]
130. Mohamed, E.S.; Belal, A.; Abd-Elmabod, S.K.; El-Shirbeny, M.A.; Gad, A.; Zahran, M.B. Smart farming for improving agricultural
management. Egypt. J. Remote Sens. Space Sci. 2021, 24, 971–981. [CrossRef]
131. Darwin, B.; Dharmaraj, P.; Prince, S.; Popescu, D.E.; Hemanth, D.J. Recognition of Bloom/Yield in Crop Images Using Deep
Learning Models for Smart Agriculture: A Review. Agronomy 2021, 11, 646. [CrossRef]
132. Jin, S.; Dai, H.; Peng, J.; He, Y.; Zhu, M.; Yu, W.; Li, Q. An Improved Mask R-CNN Method for Weed Segmentation. In Proceedings
of the 17th Conference on Industrial Electronics and Applications (ICIEA), Chengdu, China, 16–19 December 2022.