
remote sensing

Article
Estimating Bermudagrass Aboveground Biomass Using
Stereovision and Vegetation Coverage
Jasanmol Singh 1, Ali Bulent Koc 1,*, Matias Jose Aguerre 2, John P. Chastain 1 and Shareef Shaik 3

1 Department of Agricultural Sciences, Clemson University, Clemson, SC 29634, USA;


[email protected] (J.S.); [email protected] (J.P.C.)
2 Department of Animal and Veterinary Sciences, Clemson University, Clemson, SC 29634, USA;
[email protected]
3 School of Computing, Clemson University, Clemson, SC 29634, USA; [email protected]
* Correspondence: [email protected]

Abstract: Accurate information about the amount of standing biomass is important in pasture
management for monitoring forage growth patterns, minimizing the risk of overgrazing, and ensuring
the necessary feed requirements of livestock. The morphological features of plants, like crop height
and density, have been proven to be prominent predictors of crop yield. The objective of this
study was to evaluate the effectiveness of stereovision-based crop height and vegetation coverage
measurements in predicting the aboveground biomass yield of bermudagrass (Cynodon dactylon) in
a pasture. Data were collected from 136 experimental plots within a 0.81 ha bermudagrass pasture
using an RGB-depth camera mounted on a ground rover. The crop height was determined based on
the disparity between images captured by two stereo cameras of the depth camera. The vegetation
coverage was extracted from the RGB images using a machine learning algorithm by segmenting
vegetative and non-vegetative pixels. After camera measurements, the plots were harvested and sub-
sampled to measure the wet and dry biomass yields for each plot. The wet biomass yield prediction
function based on crop height and vegetation coverage was generated using a linear regression
analysis. The results indicated that the combination of crop height and vegetation coverage showed a
promising correlation with aboveground wet biomass yield. However, the prediction function based
only on the crop height showed smaller residuals at the extremes compared to the combined prediction
function (crop height and vegetation coverage) and was thus declared the recommended approach (R2 = 0.91; SeY = 1824 kg-wet/ha). The crop height-based prediction function was used to estimate the dry biomass yield using the mean dry matter fraction.

Keywords: stereovision; vegetation coverage; crop height; aboveground biomass; forages; pastures

Citation: Singh, J.; Koc, A.B.; Aguerre, M.J.; Chastain, J.P.; Shaik, S. Estimating Bermudagrass Aboveground Biomass Using Stereovision and Vegetation Coverage. Remote Sens. 2024, 16, 2646. https://doi.org/10.3390/rs16142646

Academic Editor: Clement Atzberger

Received: 17 May 2024; Revised: 11 July 2024; Accepted: 16 July 2024; Published: 19 July 2024

1. Introduction
Grasslands cover 70% of the agricultural lands and 26% of the world's total land surface as of 2010 [1]. The majority of these grasslands have been spread by human intervention to satisfy the needs of increasing human and livestock populations. Bermudagrass is a warm-season grass native to southeast Africa and is widely grown in the southeastern USA [2]. Hansen et al. [2] recommend beginning the grazing of bermudagrass when it becomes 15–20 cm tall and continuing until it reaches a grazed height of 8–10 cm. If it grows taller than 20 cm, it is suggested to be harvested as hay down to a uniform height of 8–10 cm. Depending on the growing season, plant growth is higher in the May to June period than in the July to August period because of greater moisture availability [2]. These recommendations, however, change with the location and environmental conditions. Therefore, it becomes necessary to properly manage such grasslands and forages for optimal growth.
Visual estimation was traditionally practiced by growers and farmers to assess forage availability in grasslands [3]. These estimates accounted for factors like stage of maturity, color, leafiness, foreign material, and overall condition of the pasture. A well-experienced



observer can make an acceptable prediction of the forage availability, but predictions might
vary drastically among observers. Hence, some standardized methods for estimating forage
availability were needed to eliminate the variability in biomass estimations. One such
method is sampling and interpolation. In this method, a small area of 0.1 m2 or a few
such quadrats are harvested in the field and weighed to estimate the average biomass per
unit harvested area. This average yield is then interpolated for the whole pasture. It is an
effective way of measuring the available biomass, but is labor and time intensive, as well
as a destructive method [4].
Several approaches have been used to predict aboveground biomass through indirect
or non-destructive methods. Table 1 provides a summary of the coefficients of determination
(R2) of the biomass prediction functions developed for large scale and farm scale
grasslands with space-borne, aerial-borne, and ground-borne systems and techniques.

Table 1. Biomass estimation techniques/systems and their applicability in forage fields and grasslands.

Technique/System | Platform | Site Scale | Coefficient of Determination (R2) of Biomass Prediction Function
Spectral reflectance | Satellite (Landsat-8, Sentinel-2) | Large scale and farm scale | 0.20–0.92 [5–9]
Spectral reflectance | Unmanned aerial vehicle (hyperspectral camera) | Farm scale | 0.42–0.92 [10,11]
LiDAR | Unmanned aerial vehicle and unmanned ground vehicle | Farm scale and large scale | 0.61–0.74 [12–16]
Structure from motion | Unmanned aerial vehicle | Farm scale | 0.59–0.88 [17–19]
Ultrasound sensor | Unmanned ground vehicle | Farm scale | 0.73–0.80 [20,21]
Meter stick, rising plate meter | Manual measurements | Farm scale | 0.11–0.86 [22,23]

Large-scale applications focus on the regional surveying and temporal analysis of crop
features like yield, nitrogen content, and land cover change over the years on thousands
of acres. On the other hand, farm scale applications facilitate the end users with the yield
assessment for frequent management decisions. The results in Table 1 suggest that the
aerial- and ground-based systems performed better at the lower ends of their R2 ranges.
However, some of the best coefficients of determination were also achieved by the
satellite-based systems (Table 1).
The long-term goal of this study was to develop forage yield prediction systems that
are easy to use and are of practical utility for satisfying the farm scale needs of cattle and
forage producers. For farm scale pasture management, the decisions related to the forage
availability and stocking rate are made daily. The producer needs an update on which
sections of the pasture can be used for cattle grazing and for how many days, before moving
to the next section. Additionally, the decisions related to the harvesting of the forage for
hay and silage are also scheduled based on the real-time conditions of the field produce.
Any delay or unfavorable weather conditions might result in a significant reduction or
loss of forage. Thus, it becomes critical to obtain information about the forage
availability on a daily basis.
In farm scale pasture production, forage management is handled by the producer and is
sometimes supported by consultants for equipment or service requirements. Depending on
consultants for these resources can reduce flexibility and independence, and the
unavailability of a technician or the equipment at critical times might result in delayed
management. Therefore, systems that are fast and easy for the farmer to use, without
requiring much technical knowledge for operating costly equipment or post-processing the
data, are a possible solution.
Based on the above-mentioned constraints, satellite-based systems cannot fit in as
a preferred choice for farm scale applications as they might not provide the daily yield
estimates over a small area, and they involve complex data processing for a farmer to deal
with. Aerial-based systems require a trained technician for flight management and rigorous
data analysis. Moreover, UAV-based data analysis, as in the case of spectral cameras
and structure-from-motion methods, takes hours to process, depending on the field size and
the processor’s capabilities. Thus, they may not be suitable for real-time assessments by the
farmer. Techniques like ground-based LiDARs and ultrasound sensors might be relevant
for such applications as they are easy to manage compared to UAV and satellite-based data
collection and processing techniques.
However, LiDARs might involve challenging post-processing efforts to convert the
raw LiDAR data to the end-user product (yield). Ultrasound sensors provide point-based
distance values, which might not provide enough data to represent the variation in the
crop yield. Koc et al. [21] also used a combination of ultrasound sensors and a compression
ski for predicting aboveground biomass in forage crops. The distance from the center of the
compression ski was measured using the ultrasound sensor while the ski reciprocated over
the crop surface. The results were satisfactory, but the sideward tilts of the ski over the
crop canopy were not captured by the point-based distance measurement of the ultrasound
sensor. Thus, it was not able to explain the spatial variation in the crop height under the
ski. Manual techniques like grazing sticks and pasture plate meters are the simplest to
use but are inefficient [24] and the accuracy of measurements is significantly impacted by
fatigue. Additionally, a limited number of samples can be collected manually which does
not represent the crop traits in the whole field [24].
In addition to the morphological features (crop height), the crop density across the field
may be a helpful input in improving biomass predictions. The crop density or vegetation
coverage (VC) is defined as how much of the region of interest (ROI) is covered with
green vegetation [25]. Flombaum and Sala [26] explained the correlation of the VC with
biomass in their research. Their study was conducted on multiple shrubs and grasses
(shrubs: Mulinum spinosum, Senecio filaginoides, and Adesmia campestris; grasses: Poa ligularis,
Stipa speciosa, and Stipa humilis) and linear relations were used to derive the predictions
of the biomass. Their results showed a significant (p < 0.01) correlation of the VC with
the biomass and coefficients of determination (R2) from 0.53 to 0.85 among various crops.
Schirrmann et al. [27] also studied the relationship between crop height and plant coverage
in predicting wet and dry biomass. Therefore, the use of crop height along with the VC
might improve biomass predictions.
The overall goal of this research was to evaluate the effectiveness of the RGB-depth
camera-based integrated system in predicting the aboveground biomass yield in the
bermudagrass for farm scale applications. The RGB-depth camera was used to mea-
sure the changing distance from the crop surface contour using stereovision with two
monochromatic cameras. The changing distances from the crop surface were averaged
to represent an index of the height of the crop canopy above the ground (called the crop
height) in a region of interest (ROI). The RGB-depth camera also captured the crop surface
images, which were used to extract the vegetation coverage in the ROI. The crop height and
vegetation coverage measurements were correlated with the aboveground wet biomass
yield measurements from these regions of interest. The specific objectives to achieve the
above-mentioned goal are as follows:
1. To investigate the effectiveness of RGB-depth cameras in measuring the height of the
crop above the ground using stereovision and to quantify vegetation coverage from
RGB images using pixel segmentation.
2. To develop aboveground biomass prediction function with crop height and vegetation
coverage as the potential independent variables.

2. Materials and Methods


2.1. Study Site and Experimental Design
The study was conducted on a 10.1 ha bermudagrass field (34°39′26″ N, 82°43′45″ W), out of which two sections of 0.4 ha were used for the data collection on 16 August 2023 (Section 1) and 1 September 2023 (Section 2), respectively (Figure 1). The data was collected at solar noon to ensure optimal light conditions and minimum shadow effects during data capturing. Initially, the plots were marked in the section with survey flags. They were arranged in six rows with thirteen plots (4.6 m × 1.5 m) for Section 1, except for one row with just eleven plots due to an ungrown patch in the section. For Section 2, 6 rows of 10 plots (3.0 m × 1.5 m) were flagged, resulting in a total of 136 plots for both sections. The plot dimensions were adjusted for Section 2 to accommodate the reduced capacity of the mower's collection bag (HRN216PKA, Honda Power Equipment Mfg., Swepsonville, NC, USA). This adjustment was necessitated by the unavailability of the plot harvester (Carter Manufacturing Co., Brookston, IN, USA) used in Section 1 for harvesting the marked plots.
After marking the plots, 10 ground control points (GCPs) were established in the field. These GCPs were used to georeference the UAV images in post-processing. The geolocations of GCPs were recorded using the real-time kinematic global positioning system (RTK-GPS) devices by Emlid (Emlid, Hong Kong, China). The Emlid Reach RS2 was used as the base station and the Reach RS+ was used as the rover. The base station was set up at a fixed location in the field and was allowed to average its static location for 0.5 h. This was performed to establish its fixed coordinates in the field. The geolocations of GCPs recorded with RTK helped align the UAV images with the real-world coordinates. This RTK setup helped in attaining sub-centimeter accuracy in the fix mode. It was followed by recording the geolocation of the center of all the marked plots for locating the plots in the sections during post-processing.

Figure 1. Sections 1 (left) and 2 (right) (red) and plots (black) in the bermudagrass field.

2.2. OAK-D Stereovision Depth Camera and Its Installation
The crop height was measured using the depth camera (Luxonis OAK-D, CO, USA) baseboard with three cameras for stereo and RGB vision. The detailed specifications of the camera [28] are shown in Table 2.

Table 2. The color and stereovision camera specifications.

Camera Specifications | Color Camera * | Stereo Pair **
Sensor | IMX378 (PY011 AF) | OV9282 (PY010 FF)
DFOV/HFOV/VFOV | 81°/69°/55° | 81°/72°/49°
Resolution | 12 MP (4056 × 3040) | 1 MP (1280 × 800)
Focus | AF: 8 cm–∞ or FF: 50 cm–∞ | FF: 19.6 cm–∞
Max framerate | 60 FPS | 120 FPS
F-number | 1.8 ± 5% | 2.0 ± 5%
Lens size | 1/2.3 inch (11 mm) | 1/4 inch (6.4 mm)
Effective focal length | 4.81 mm | 2.35 mm
Pixel size | 1.55 µm × 1.55 µm | 3 µm × 3 µm
* RGB images; ** Distance measurement.

The depth camera was equipped with an RGB camera in the middle along with two monochromatic cameras on the sides for stereovision (Figure 2). The camera was installed on the front of the UGV at a height of 1.5 m above the ground surface using an aluminum frame, as shown in Figure 3. This installation height was selected based on the minimum depth measurement range of 0.7 m for the depth camera. The camera's effective area of coverage (AOC) for stereovision, as per the horizontal and vertical fields of view (FOV) of 72° and 49°, was 2.1 m by 1.3 m. It was calculated using the FOV calculator provided by the manufacturer [29]; thus, the camera covered an area of 2.7 m2.

Figure 2. Depth camera lenses. Monochromatic stereovision cameras (blue encircled); RGB camera (green encircled).

Figure 3. Depth camera mounted on UGV (red encircled).

However, this FOV covered some non-plant regions like UGV tires and frames, so the effective FOV used for the study was cropped during the data-capturing stage. The depth camera was powered using co-axial power connectors connected to the 12 V battery installed on the UGV.
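As a rough cross-check of the reported area of coverage, the stereo footprint can be recomputed from the mounting height and the fields of view. The short Python sketch below uses a flat-ground pinhole approximation with the 1.5 m height and 72°/49° FOV stated above; it is an illustration only, not the manufacturer's FOV calculator, which reports the slightly smaller 2.1 m × 1.3 m (about 2.7 m2).

import math

def stereo_footprint(height_m, hfov_deg, vfov_deg):
    """Approximate ground footprint (width, length, area) of a downward-looking
    camera at height_m with the given horizontal and vertical fields of view."""
    width = 2.0 * height_m * math.tan(math.radians(hfov_deg / 2.0))
    length = 2.0 * height_m * math.tan(math.radians(vfov_deg / 2.0))
    return width, length, width * length

w, l, area = stereo_footprint(1.5, 72.0, 49.0)
# Prints roughly 2.2 m x 1.4 m -> 3.0 m^2 under this simple flat-ground approximation.
print(f"{w:.1f} m x {l:.1f} m -> {area:.1f} m^2")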

2.3. Data Collection Procedures


2.3.1. Measurement of Crop Height and Vegetation Coverage
The two monochromatic cameras were used to measure crop height. These cameras
worked on the principle of stereovision. They measured the distance from the crop canopy
based on the disparity in the image views between the two stereo cameras. This camera
had the potential to measure the distance for every pixel in its field of view and hence
provided the crop height at all these pixels in the effective FOV. The central region of the
effective FOV was used for data acquisition to eliminate any possible edge-effect errors.
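For readers unfamiliar with the principle, depth from a rectified stereo pair reduces to depth = focal length (in pixels) × baseline / disparity. The sketch below illustrates that relation with NumPy; the 7.5 cm value is the nominal OAK-D stereo baseline, and the focal length and disparities are hypothetical numbers, not this study's calibration.

import numpy as np

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Convert a disparity map (pixels) from a rectified stereo pair to depth (m).
    Zero or negative disparities are returned as NaN (no valid match)."""
    disparity = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(disparity, np.nan)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth

# Hypothetical numbers for illustration only: 800 px focal length, 7.5 cm baseline.
disp = np.array([[40.0, 42.5], [0.0, 38.0]])
print(disparity_to_depth(disp, focal_px=800.0, baseline_m=0.075))
# 800 * 0.075 / 40 = 1.5 m, i.e., a canopy point about 1.5 m below the camera.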
The UGV equipped with the depth camera was driven over the marked plots to
measure the distance of the camera from the crop canopy every second. The UGV was
operated at a speed of 0.6 m/s, which allowed it to capture 8 data points in a 4.57 m
long plot. The UGV was also mounted with Emlid Reach RS+ as the rover station which
recorded the real-time corrected geolocation. An ASUS TUF A15 Gaming Laptop (ASUSTeK
Computer Inc., Beitou District, Taipei, Taiwan) with Ryzen 7 4800H mobile processor, 16 GB
RAM and 512 GB storage was used to extract the data from the depth camera and RS+ via
USB 3.2 Gen-1 Type-A ports.
The distance measurements from the depth camera were merged with their geolo-
cations using a Python script. For measuring the vegetation coverage, the RGB camera
(Figure 2) was used to capture the images of the crop canopy. The RGB image and crop
height data were captured at the same geolocation and instant to eliminate any error caused
by temporal delay.
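The merging script itself is not included in the article; the sketch below shows one plausible way to do the described join, assuming each depth record and each RTK fix carry a timestamp (all column names and values here are hypothetical).

import pandas as pd

# Hypothetical column names; the study's actual logging format is not shown in the paper.
depth = pd.DataFrame({
    "time": pd.to_datetime(["2023-08-16 12:00:01", "2023-08-16 12:00:02"]),
    "canopy_distance_m": [1.18, 1.21],
})
rtk = pd.DataFrame({
    "time": pd.to_datetime(["2023-08-16 12:00:00.8", "2023-08-16 12:00:01.9"]),
    "lat": [34.657222, 34.657224],
    "lon": [-82.729167, -82.729165],
})

# Attach the nearest RTK fix (within 0.5 s) to every depth record.
merged = pd.merge_asof(depth.sort_values("time"), rtk.sort_values("time"),
                       on="time", direction="nearest",
                       tolerance=pd.Timedelta("0.5s"))
print(merged)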

2.3.2. Postharvest Field View with UAV


The DJI Mavic Pro (SZ DJI Technology Co. Ltd., Nanshan, Shenzhen, China) was
used with a Survey 3W RGB camera (Mapir Inc., San Diego, CA, USA) to capture the
aerial images of the field. This was performed with a pre-planned flight mission using
the Drone Deploy website (https://www.dronedeploy.com; accessed: 16 May 2024) over
the experimental site. The flight parameters are shown in Table 3. Other settings were
either preset or were kept by default. These UAV images were used for generating the
orthomosaics of the harvested field for serving the following purposes:
1. To identify the region of interest (harvested areas) for post-processing and eliminating
the non-relevant data.
2. To geolocate the plots in the field based on the coordinates measured with RTK-GPS.
3. To measure the exact harvested area of the plots excluding the non-harvested regions
of marked plots.

Table 3. UAV flight parameters.

Parameter | Details
Flight altitude (m) | 18.29
Front overlap (%) | 75
Side overlap (%) | 75
Flight speed (m/s) | 1.8 (Auto Set)
Perimeter 3D | ON
Crosshatch 3D | ON

2.4. Challenges in Data Collection and Solutions


While testing the depth camera in the field, it was observed that the sunlight exposure
on the AOC was causing an error in the height measurements. To solve this issue, the UGV
was mounted with two aluminum side shades on the left and right sides of the AOC. A
cloth cover was also attached on the top extending from the camera mount to the aluminum
side shades. The aluminum was selected to make the shades rigid enough to prevent them from becoming damaged in the field while colliding with the crop canopy.
The cloth cover was used to allow an optimal amount of diffused light inside the covered region to capture the RGB images. The aluminum and cloth shade installed on the UGV are shown in Figure 4. Additionally, some plots (n = 5) in the field missed the data capturing since the laptop that was recording the data was accidentally switched off. Hence, the total number of plots with available data was reduced to 131.

Figure 4. UGV with shades attached.

2.5. Harvesting and Weighing of Plots for Wet Biomass Yield Calculations
After capturing the height data and RGB images, the marked plots were harvested using the employed harvesters. The weight measurements of the harvested forage were simultaneously recorded in the field using the onboard weighing scale of the plot harvester. For lawn mower harvesting, a battery-operated weighing scale (Measuretek Enterprise Ltd., Richmondhill, CA, USA) was used. These weight measurements provided the wet biomass yield (WBY) (kg-wet/ha) for every plot. The plots were harvested at an average height of 5.08 cm. However, due to the field's topography and the presence of anthills, the harvesting height was increased to 6.35 cm to avoid the inclusion of soil or anthills in the harvested forage during harvesting and collection in the case of the plot harvester. For the lawn mower, these issues were not encountered; therefore, a constant cutting height of 5.08 cm was maintained.

2.6. Plot Subsampling for Dry Matter Calculations
To obtain the dry matter fraction (DMF), the sub-samples from each plot (131 harvests) were collected after harvesting and stored in Ziplock bags. These collected samples were taken to the laboratory and dried at 55 °C (forced air oven) for 48 h to determine the DMF for every harvested plot sample. The DMF values were used to derive the dry biomass yield (DBY) predictions. The 3D model view of the harvested plots is shown in Figure 5.

Figure 5. 3D view of harvested plots generated with SfM. The flags represent the locations of the GCPs as visible in the 3D model from SfM.

2.7. Post-Processing of Recorded Data and Development of Prediction Function
The post-processing of the raw data, captured in the form of aerial RGB images, coordinates, crop heights, and images for vegetation coverage, was conducted in various software programs. The detailed post-processing steps for each raw data category and software are explained below.

2.7.1. Generating Orthomosaics from UAV RGB Images
The images captured with the UAV-mounted Survey 3W camera were imported to Agisoft Metashape Pro 1.8.3 (Agisoft LLC, St. Petersburg, Russia) photogrammetry software for generating the orthomosaics. Initially, the images were aligned according to the manually georeferenced GCPs in Metashape, followed by the creation of a dense cloud, mesh, DEM, and orthomosaic. Every image imported to the software was geotagged with reference to these GCPs to align the generated orthomosaic with the real-world coordinates. The preset parameters for each of these steps were kept at default selections, and all models were generated at a high quality and with moderate filtering settings.
After the completion of the photogrammetry process, the orthomosaics were exported as a TIFF file. These orthomosaics were later imported to ArcGIS Pro 3.2.2 (ESRI, Redlands, CA, USA), where a polygon layer was created manually around the boundaries of each harvested plot, ensuring that it outlined only the harvested regions of the plot. This layer was used to calculate the actual harvested area for each plot, excluding the unharvested/missed regions in the plots, which do not contribute to biomass. This harvested area was used in the aboveground wet biomass (kg-wet/ha) calculations by dividing the measured mass of the harvested forage (kg) by the harvested area of each plot (ha). The geolocations of the centers of the plots that were recorded with RTK-GPS helped ensure the location of every plot in the real world. This polygon layer was also used as the spatial boundary of the ROI for calculating the crop height and vegetation coverage for every plot.
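As a small worked example of the yield normalization described above (harvested wet mass divided by the digitized harvested area), consider the sketch below; the mass and area values are hypothetical, and in practice the areas would come from the ArcGIS Pro polygon layer.

def wet_biomass_yield(mass_kg, harvested_area_m2):
    """Wet biomass yield in kg-wet/ha from harvested mass and harvested area."""
    hectares = harvested_area_m2 / 10_000.0
    return mass_kg / hectares

# Hypothetical example: 3.8 kg of forage from a 6.7 m^2 harvested polygon.
print(round(wet_biomass_yield(3.8, 6.7)))  # about 5672 kg-wet/ha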
2.7.2. Extraction of Crop Height from Raw Data
The raw data collected in the field contained the coordinates (Latitude, Longitude), date, time, and the canopy distance values measured using the depth camera. These were imported into ArcGIS Pro 3.2.2 as a point layer, and the plot boundary layer described in Section 2.7.1 was used as a mask for this point layer. All the data points falling into the boundary of the plots were considered as the relevant data for the statistical calculations. The mean values of canopy distances were used to represent that plot. The mean canopy distance was subtracted from the ground distance to obtain the total crop height (TCH), as shown in Figure 6. Next, the height at which the crop was harvested, i.e., the cut canopy height (CCH), was subtracted from the TCH to obtain the height of the actual harvested crop in the field; this is defined as the change in crop height (∆H).

Figure 6. Crop height measurement.
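A compact sketch of the height bookkeeping in Figure 6 is given below: the mean canopy distance for a plot is subtracted from the camera-to-ground distance to obtain the TCH, and the cut canopy height is then removed to obtain ∆H. The 1.5 m camera height and the 6.35/5.08 cm cutting heights come from the text; the sample canopy distances are hypothetical.

import numpy as np

def change_in_crop_height(canopy_distances_m, ground_distance_m=1.5, cut_height_m=0.0635):
    """Return (TCH, dH) in metres for one plot.
    TCH = ground distance - mean canopy distance; dH = TCH - cut canopy height.
    cut_height_m: 0.0635 m for plot-harvester plots, 0.0508 m for mower plots."""
    mean_canopy = float(np.mean(canopy_distances_m))
    tch = ground_distance_m - mean_canopy
    return tch, tch - cut_height_m

# Hypothetical per-plot distance samples (camera to canopy, in metres).
tch, dh = change_in_crop_height([1.18, 1.22, 1.20, 1.17, 1.21, 1.19, 1.23, 1.20])
print(f"TCH = {tch*1000:.0f} mm, dH = {dh*1000:.0f} mm")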

2.7.3. Vegetation Coverage from RGB Images
The RGB images captured with the depth camera were processed through a pre-trained Neural Network-based pixel segmentation machine learning (ML) model called SegVeg, developed and trained by Serouart et al. [30]. This program utilized the U-Net model for segmenting vegetation from the background (generally soil) and later used support vector machines (SVM) for segmenting green and senescent vegetation. For this study, only the vegetation versus background segmentation was performed. The segmentation results were later converted to obtain the percent of green vegetation in each image, which is called the VC. These data, along with the coordinates of every RGB image, were imported into ArcGIS Pro 3.2.2, and in a similar manner to the TCH, the VC values were also statistically summarized to obtain a mean VC for each plot. The segmented image illustration is shown in Figure 7. It was observed that some of the darker vegetative regions in the captured images were identified as the background by the system. These darker regions were the results of the non-uniform illumination of the area of coverage and thus might impact the aboveground yield predictions. The prediction results from the vegetation coverage were compared with the existing studies to evaluate the impact of these issues on the predictions and to investigate the ability of the VC as an independent variable to predict the aboveground biomass yield.

Figure 7. SegVeg pixel segmentation and extraction of VC percentage.
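Converting the SegVeg output into the VC percentage used here amounts to the ratio of vegetation pixels to total pixels; a minimal sketch for a hypothetical binary mask (1 = vegetation, 0 = background) is shown below.

import numpy as np

def vegetation_coverage(mask):
    """Percent of pixels labelled as vegetation in a binary segmentation mask."""
    mask = np.asarray(mask)
    return 100.0 * float(np.count_nonzero(mask)) / mask.size

# Hypothetical 4 x 5 mask standing in for a SegVeg prediction.
mask = np.array([[1, 1, 0, 1, 1],
                 [1, 1, 1, 1, 0],
                 [0, 1, 1, 1, 1],
                 [1, 0, 1, 1, 1]])
print(vegetation_coverage(mask))  # 80.0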

2.7.4. Development of the Prediction Function
The exported data files, described in Sections 2.7.1 and 2.7.2 for the ∆H and VC, respectively, were merged; these included the date, plot number, TCH, CCH, ∆H, VC, DMF, WBY, and DBY from both the harvested sections. The WBY prediction functions were developed from the ∆H, VC, and the combined ∆H and VC as potential independent variables using linear regression analysis. The general equation of the developed prediction function is shown in Equation (1):

βw = b1 × ∆H + b2 × VC + c. (1)

where:
βw = wet biomass yield (kg-wet/ha);
b1 = coefficient of change in crop height;
∆H = change in crop height (mm);
b2 = coefficient of vegetation coverage;
VC = vegetation coverage (%);
c = y-intercept (kg-wet/ha).

The crop height accounts for the vertical growth of the forage above the ground and, hence, contributes significantly to the biomass. Similarly, vegetation coverage explains the crop spread across the field. Higher crop height and vegetation coverage values indicate high biomass availability in the field. But if the crop height or vegetation coverage is zero, it conveys that no crop is available in the field for harvesting, so the corresponding biomass amount should also be zero. Therefore, the prediction function generated for biomass quantification must pass through the origin, justifying zero biomass at zero vegetation coverage and crop height. Thus, the βw (∆H), βw (VC), and βw (∆H, VC) were generated with both approaches, i.e., with the y-intercept (c ≠ 0) and with the intercept set through the origin (c = 0). The results from these prediction functions were statistically evaluated to conclude the best approach. Next, the βw was multiplied by the mean DMF from all the plots to obtain the dry biomass yield predictions, as shown in Equation (2):

βd = DMFm × βw. (2)

where:
βd = dry biomass yield (kg-DM/ha);
DMFm = mean dry matter fraction;
βw = wet biomass yield (kg-wet/ha).
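A minimal sketch of the regression step in Equation (1) is shown below using scikit-learn (the paper does not state which tool was used); setting fit_intercept=False produces the zero-intercept (c = 0) form. The arrays are placeholders, not the 131 plot records.

import numpy as np
from sklearn.linear_model import LinearRegression

# Placeholder per-plot records (dH in mm, VC in %, WBY in kg-wet/ha).
dH = np.array([120.0, 145.0, 160.0, 95.0, 180.0])
VC = np.array([68.0, 75.0, 82.0, 60.0, 90.0])
WBY = np.array([4600.0, 5500.0, 6200.0, 3700.0, 7000.0])

X = np.column_stack([dH, VC])

with_intercept = LinearRegression(fit_intercept=True).fit(X, WBY)
through_origin = LinearRegression(fit_intercept=False).fit(X, WBY)  # c = 0, as used in the paper

print("c != 0:", with_intercept.coef_, with_intercept.intercept_)
print("c  = 0:", through_origin.coef_)  # b1 (kg-wet/ha per mm) and b2 (kg-wet/ha per % VC)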

3. Results and Discussion


3.1. Correlation Analysis
The results for the βw (c ≠ 0) indicated a positive correlation of the independent
variables with the WBY. Highly significant correlation coefficients (p < 0.001) of 0.62 for
the βw (∆H), 0.43 for the βw (VC), and 0.65 for the βw (∆H,VC) were observed. However,
despite the significant correlation, the coefficient of determination was too low to be of
practical use, given none of these relationships described even half (0.50) of the variability
in βw (Table 4).

Table 4. Results from correlation and regression analyses for βw (∆H), βw (VC), and βw (∆H, VC).

Statistic | βw (∆H) | βw (VC) | βw (∆H, VC)
Average WBY (kg-wet/ha) | 5585 | 5585 | 5585
n | 131 | 131 | 131
R2 (c ≠ 0) | 0.38 | 0.18 | 0.42 *
R2 (c = 0) | 0.91 | 0.89 | 0.92 *
p-value of regression (c = 0) | p < 0.001 | p < 0.001 | p < 0.001
SeY (kg-wet/ha) | 1824 | 2040 | 1726
CV | 33% | 37% | 31%
b1 | 38.38 | 0 | 25.07
p-value (b1) | p < 0.001 | NA | p < 0.001
95% confidence interval (b1) | ±2.11 | NA | ±6.85
b2 | 0 | 72.76 | 26.36
p-value (b2) | NA | p < 0.001 | p < 0.001
95% confidence interval (b2) | NA | ±4.47 | ±12.96
* Adjusted R2; NA: Not Applicable.

However, setting the intercept through the origin (c = 0) showed a drastic improvement
in the R2 value. It increased to a minimum of 0.89 for the βw (VC) and to a maximum of 0.92
for the βw (∆H,VC). All these correlations were highly significant (p < 0.001) and indicated
a stronger positive correlation, compared to the prediction functions with the y-intercept
(c ≠ 0).
This approach with the zero-intercept was also adopted by Flombaum and Sala [26] in
their research, who supported its justifications for similar applications. The zero-intercept
compensates for the unknown information about the other unmeasured crop properties that
influence the biomass quantity. Thus, the further regression analysis for all the independent
variables was based on a similar approach (c = 0).

3.2. Regression Analysis


3.2.1. ∆H as the Independent Variable (βw (∆H))
The regression results for the βw (∆H) indicated the standard error of the βw estimates
(SeY) was 1824 kg-wet/ha, with a coefficient of variation (CV) equal to 33%. The coefficient,
b1, of the independent variable (∆H) was 38.38, with a 95% confidence interval of ±2.11.
These findings suggest that the change in crop height can explain 91% of the variability
in the βw (R2 = 0.91) with a highly significant linear relation (p < 0.001). The visual
representation of the regression results for βw (∆H) is shown in Figure 8.

Figure 8. Observed wet biomass yield vs. change in crop height (∆H).

3.2.2. VC as the Independent Variable (βw (VC))
The prediction function for the VC produced an R2 value of 0.89 (Figure 9). The SeY was 2040 kg-wet/ha with a 37% CV. A highly significant regression coefficient, b2, of 72.76 ± 4.47 was observed for the βw (VC) with a 95% confidence interval. The results indicate that the VC can satisfactorily explain 89% of the variability in the βw.

Figure 9. Observed wet biomass yield vs. vegetation coverage (VC).

3.2.3. Compare and Contrast between the βw (∆H) and βw (VC)


It can be inferred from Table 4 that the R2 value was higher (0.91) in the case of the
βw (∆H) compared to the βw (VC), which produced an R2 value of 0.89. The lower SeY
of 1824 kg-wet/ha was observed for the βw (∆H) compared to the βw (VC) with SeY of
2040 kg-wet/ha. The CV was also lower for the βw (∆H) (33%) compared to the βw (VC)
(37%). In addition to that, as shown in Figure 8, it was observed that the data points were
evenly scattered around the line of best fit over the range of measurement for the βw (∆H).
But in the case of the βw (VC), a skewness in the data point spread was observed around
the extremes (Figure 9). For the VC values less than 75%, the function overpredicts the
βw (VC), and for the VC above that, it tends to underpredict the estimates.
Therefore, to obtain additional insight into what was observed from the graphs, the
mean residual of three observations at the upper and lower extremes for both the prediction
functions were compared. The results indicated that the mean residual for the βw (VC)
was 4585 kg-wet/ha at the upper extreme and 2955 kg-wet/ha at the lower extreme. As
expected, these residuals were higher than the βw (∆H), with a mean residual of 3418 kg-
wet/ha at the upper extreme and 2297 kg-wet/ha at the lower extreme. Notably, the higher
residuals imply a higher deviation of the βw from the observed values, indicated by the
skewness in Figure 9. This could be an impact of the darker regions in the captured images
due to a non-uniform illumination in the area of coverage. This might have resulted in the
error in the VC calculation from the RGB images, whose possible impact is visible as the
skewness in the data point scatter.
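The comparison described above amounts to averaging the absolute residuals of the three lowest and three highest observations; a generic sketch follows, with placeholder values and the assumption that the extremes are taken by sorting on the observed WBY.

import numpy as np

def mean_extreme_residuals(observed, predicted, k=3):
    """Mean absolute residual of the k lowest and k highest observations."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    order = np.argsort(observed)
    residuals = np.abs(observed - predicted)
    return residuals[order[:k]].mean(), residuals[order[-k:]].mean()

# Placeholder values for illustration only (kg-wet/ha).
obs  = np.array([1500.0, 2100.0, 2600.0, 5400.0, 5900.0, 9800.0, 10400.0, 11200.0])
pred = np.array([3800.0, 4100.0, 4500.0, 5600.0, 5700.0, 7400.0,  8100.0,  8600.0])
low, high = mean_extreme_residuals(obs, pred)
print(f"lower extreme: {low:.0f} kg-wet/ha, upper extreme: {high:.0f} kg-wet/ha")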
Thus, the above results indicate that the βw (∆H) was better than the βw (VC) in all
aspects for WBY predictions. This favors the βw (∆H) as a standalone method for yield
estimation in bermudagrass when using a single independent variable. This result is
also supported by several other researchers who used crop height as an independent
variable for biomass quantification in various crops and achieved satisfactory results of
its correlation [31–33]. As shown in Table 1, compared to the results from various studies
on biomass estimation, the prediction functions developed in this study using the ∆H and
VC showed higher values of the coefficient of determination (R2). This also conveys that,
despite the darker regions being identified as background pixels, the results from the
βw (VC) were satisfactory and better than those of the related studies (Table 1), indicating
that the observed issue had minimal impact. However, this issue needs to be resolved
for further applications to attain better results.

3.2.4. Incorporation of ∆H and VC into a Multiple Linear Regression Function (βw(∆H, VC))
The regression analysis between the observed vs. predicted biomass values for the
βw (∆H, VC) was conducted, which produced an R2 value of 0.92. The SeY of 1726 kg-wet/ha
and a CV equal to 31% were also observed for the combined prediction function. The
coefficients of the independent variables, b1 and b2 , in the multiple linear regression
function were highly significant (p < 0.001), indicating an acceptable performance of the
prediction function for wet biomass estimation. The WBY vs. βw (∆H, VC) scatter, along
with the line of perfect agreement (y = x), is shown in Figure 10.

3.2.5. Impact of Combining VC with ∆H on βw Performance (βw (∆H) vs. βw (∆H, VC))
Since the R2 value was high for both the βw (∆H) and the βw (VC), the next step was
to determine if the βw (∆H, VC) was superior to the βw (∆H). The results showed that the
βw (∆H, VC) achieved the highest R2 value (0.92) and the lowest SeY (1726 kg-wet/ha)
compared to the βw (∆H) (Table 4). The CV (31%) was also less, indicating that the βw (∆H,
VC) performed better than the βw (∆H). However, after evaluating the magnitude of this
improvement, it was seen that the R2 value incremented by just 0.01 and the CV decreased
by 2% only. The SeY became reduced by only 98 kg-wet/ha compared to the βw (∆H).
These improvements were, however, significantly smaller in magnitude, compared to the
average WBY from all the harvests.

Figure 10. Observed vs. predicted wet biomass yield from the change in crop height and vegetation coverage.

Additionally, an artifact of the skewness in the βw (VC) was observed after combining the VC with ∆H in the βw (∆H, VC), which can be seen in Figure 10. This skewness might influence the accuracy of the WBY prediction in the combined prediction function. This was confirmed by comparing the residuals for the βw (∆H) and βw (∆H, VC) at the upper and lower extremes. A higher magnitude of residuals was observed in the βw (∆H, VC) (upper = 3713 kg-wet/ha, lower = 2606 kg-wet/ha) as compared to the βw (∆H) prediction function. This signifies a higher deviation of the βw (∆H, VC) from the WBY values compared to the βw (∆H).
Therefore, the above results indicate the preferability of the βw (∆H) over the βw (∆H, VC) as a function for WBY estimation. Additionally, dealing with only one independent variable (βw (∆H)) helps in the reduction of cost, labor, time, and data processing complexities, over the βw (∆H, VC), without significantly compromising the prediction results. The recommended prediction function, βw (∆H), is defined in Equation (3) and was used for deriving the DBY along with the DMF values:

βw (∆H) = 38.38 × ∆H. (3)

3.3. Dry Matter Fraction
All the plots were sub-sampled after harvesting, and the DMF values for each plot (n = 131) were measured. It was observed that the CV for the DMF across all the harvests was 23%. The mean DMF was 0.44 with a standard deviation of 0.10. This indicates a large variation in the DMF between harvests. Dore [34] observed that the average DMF for bermudagrass is around 0.41 for 42 days of growth, with a minimum and maximum of 0.22 and 0.50, respectively. Therefore, the DMF value in this study was within the typical range. The common practice for calculating the DBY is multiplying the WBY values with the DMF. The mean DMF (DMFm) from 131 harvests was multiplied with the βw for each plot to obtain the βd, as indicated in Equation (2). Zhang et al. [32] also used average dry matter to obtain the dry biomass yield in maize.

3.4. Prediction of Dry Biomass Yield (βd)


The recommended prediction function, βw(∆H) (Equation (3)), was used as a function
Remote Sens. 2024, 16, 2646 to derive the βd(∆H) values using the mean DMF, as per Equation (2). A linear regression 15 of 17
between the βd and DBY values was used to generate the equation for the line of best fit
(y = bx). The line of best fit was compared with the line of perfect agreement (y = x) to
evaluate thethe
to evaluate performance
performanceof the βd(∆H).
of the The line
βd (∆H). The of perfect
line agreement
of perfect wouldwould
agreement have ahave
slopea
of 1. The prediction function equation (line of best fit) and results from the regression
slope of 1. The prediction function equation (line of best fit) and results from the regression
analysis are illustrated in Figure 11.

Figure 11.
Figure Observed vs.
11. Observed vs. predicted
predicted dry
dry biomass
biomass yield
yield from
from prediction
prediction function
function (β
(βw (∆H)) and mean
w (∆H)) and mean

DMF == 0.44.
0.44.

It was observed that the line fitting through the observed vs. predicted dry biomass yield (the line of best fit) had a slope of 0.98, which was not significantly different from the slope of the line of perfect agreement (slope = 1). Moreover, the 95% confidence interval of the variation in the slope was ±0.06. This places the upper and lower bounds of the slope (b) at 1.04 and 0.91, respectively. Notably, the slope of the line of perfect agreement (b = 1) falls within this 95% CI range. Therefore, the βd(∆H) illustrated a satisfactory performance, with a standard error of 939.52 kg-DM/ha for the DBY estimates. The regression showed an R2 value of 0.87. This additional error and the decrease in the coefficient of determination (R2) were attributed to using the mean DMF of all harvests despite the varying DMF values across the field.
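As an illustration of this agreement analysis, the sketch below fits a zero-intercept regression (y = bx) of observed DBY on predicted βd and checks whether the line of perfect agreement (b = 1) falls inside the 95% confidence interval of the slope. The helper function name and the placeholder arrays are hypothetical; the study's plot-level data are not reproduced here.

```python
# Illustrative sketch of the agreement analysis described above: a zero-intercept
# regression (y = b*x) of observed DBY on predicted beta_d, with the 95% CI of the
# slope compared against the line of perfect agreement (b = 1). Array contents are
# placeholders, not the study's data.
import numpy as np
from scipy import stats

def best_fit_through_origin(predicted: np.ndarray, observed: np.ndarray):
    """Return the slope b, its 95% CI half-width, and the standard error of estimate."""
    n = predicted.size
    b = np.sum(predicted * observed) / np.sum(predicted**2)   # slope of y = b*x
    residuals = observed - b * predicted
    s = np.sqrt(np.sum(residuals**2) / (n - 1))               # standard error of estimate
    se_b = s / np.sqrt(np.sum(predicted**2))                  # standard error of the slope
    half_width = stats.t.ppf(0.975, n - 1) * se_b             # 95% CI half-width
    return b, half_width, s

# Placeholder values (kg-DM/ha) for demonstration only:
pred = np.array([1200.0, 2500.0, 3100.0, 4200.0, 5000.0])
obs = np.array([1100.0, 2600.0, 3000.0, 4400.0, 4800.0])
b, hw, s = best_fit_through_origin(pred, obs)
agrees_with_perfect_line = (b - hw) <= 1.0 <= (b + hw)        # is b = 1 inside the 95% CI?
```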
These results justify the applicability of the ∆H as the independent variable to estimate the aboveground WBY (kg-wet/ha) and the suitability of the wet biomass function, βw(∆H), to estimate the βd using the DMFm. In the study by Koc et al. [35], the seasonal average of the DMF for alfalfa was used to make the DBY predictions for each harvesting season. However, the approach practiced in this research is also recommended for making the DBY estimates. Even if the knowledge of the DMF is enhanced using any suitable method, the maximum coefficient of determination that can be achieved will be 0.91 if the developed function of ∆H (βw(∆H)) is used for the βd estimation.
4. Conclusions
This research focused on the implementation of stereovision for crop height mea-
surement in bermudagrass. The influence of the crop height and ML-based vegetation
coverage was studied for biomass estimation. The findings demonstrate the strong capability of stereovision as a crop height measurement system, and the ability of crop height to explain the variability in the aboveground biomass supports the method. It was observed that both the ∆H and VC correlate highly with the WBY; however, the ∆H provided the best prediction equation when only one independent variable was used.
Combining the ∆H and VC provided slightly improved results, but the scatter about
the regression line was not uniform. This improvement was minimal in contrast to the
time, labor, and resources involved in extracting the vegetation coverage. Thus, for the
development of a ground rover that can predict aboveground biomass, stereovision can be a useful method for measuring crop height.
Future studies should record a greater number of data points when using stereovision-based crop height and vegetation coverage as predictors. This might help increase the system's adaptability across variable environments and crop diversities. A system to measure the real-time dry matter content of the crop canopy could help improve dry biomass predictions. To improve the performance of the VC, a higher-resolution camera is suggested for capturing the canopy images. It is
also recommended to develop a personalized ML model for segmenting vegetative and
non-vegetative pixels, specifically for grasses. This will focus on a specific crop architecture
and may produce enhanced results. Additionally, it is suggested to use a built-in light
source for the illumination of the area of coverage. It should be accompanied by an opaque
shading surface to block diffused sunlight and thus eliminate its effects on data recording.
This artificial light source might help illuminate the darker vegetative regions for better data capture and post-processing of the RGB images.
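As a point of reference for the vegetation-coverage recommendation above, the sketch below shows a simple excess-green-index (ExG) baseline for separating vegetative from non-vegetative pixels. This is not the ML segmentation used in this study; the threshold is an assumed value that would need tuning for grass canopies, and the function name is illustrative.

```python
# Simple baseline for the vegetation-coverage idea discussed above: classify pixels
# as vegetative with the excess-green index (ExG) and report the vegetative fraction.
# This is an illustrative starting point only, not the ML segmentation used in the study;
# the 0.05 threshold is an assumption that would need tuning for grass canopies.
import numpy as np

def vegetation_coverage(rgb: np.ndarray, exg_threshold: float = 0.05) -> float:
    """Fraction of pixels classified as vegetation in an RGB image (H x W x 3, 0-255)."""
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=2) + 1e-9                 # avoid division by zero
    r, g, b = (rgb[..., i] / total for i in range(3))
    exg = 2.0 * g - r - b                          # excess-green index per pixel
    return float(np.mean(exg > exg_threshold))     # share of pixels above the threshold
```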

Author Contributions: Conceptualization: A.B.K. and J.P.C.; methodology: A.B.K., J.P.C. and M.J.A.;
software: J.S. and S.S.; validation: J.S., A.B.K. and J.P.C.; formal analysis: J.S.; investigation: J.S.,
A.B.K., J.P.C. and M.J.A.; resources: A.B.K. and M.J.A.; data curation: J.S.; writing—original draft
preparation: J.S.; writing—review and editing: J.S., J.P.C., A.B.K. and M.J.A.; visualization: J.S. and
J.P.C.; supervision: A.B.K. and J.P.C.; project administration: A.B.K.; funding acquisition: A.B.K. All
authors have read and agreed to the published version of the manuscript.
Funding: This research was funded by the United States Department of Agriculture (USDA), grant
number 2020-670221-31960. Technical Contribution No. 7302 of the Clemson University Experi-
ment Station.
Data Availability Statement: The raw data supporting the conclusions of this article will be made
available by the authors on request.
Conflicts of Interest: The authors declare no conflicts of interest.

References
1. Squires, V.R.; Dengler, J.; Feng, H.; Hua, L. Grasslands of the World: Diversity, Management and Conservation; A Science Publishers
Book: Boca Raton, FL, USA, 2018.
2. Hansen, T.; Mammen, R.; Crawford, R.; Massie, M.; Bishop-Hurley, G.; Kallenbach, R. Agriculture MU Guide- MU Exten-
sion, University of Missouri-Columbia. Available online: https://extension.missouri.edu/publications/g4620# (accessed on
28 May 2024).
3. Drewitz, N.; Goplen, J. Measuring Forage Quality|UMN Extension. Available online: https://extension.umn.edu/forage-
harvest-and-storage/measuring-forage-quality (accessed on 2 January 2024).
4. Whitbeck, M.; Grace, J.B. Evaluation of non-destructive methods for estimating biomass in marshes of the upper Texas, USA
coast. Wetlands 2006, 26, 278–282. [CrossRef]
5. Li, C.; Wulf, H.; Schmid, B.; He, J.S.; Schaepman, M.E. Estimating plant traits of alpine grasslands on the qinghai-tibetan plateau
using remote sensing. IEEE J. Sel. Top. Appl. Earth Obs. Remote. Sens. 2018, 11, 2263–2275. [CrossRef]
6. Semela, M.; Ramoelo, A.; Adelabu, S. Testing and Comparing the Applicability of Sentinel-2 and Landsat 8 Reflectance Data
in Estimating Mountainous Herbaceous Biomass before and after Fire Using Random Forest Modelling. In Proceedings of
the International Geoscience and Remote Sensing Symposium (IGARSS), Waikoloa, HI, USA, 26 September–2 October 2020;
pp. 4493–4496. [CrossRef]
7. Meshesha, D.T.; Ahmed, M.M.; Abdi, D.Y.; Haregeweyn, N. Prediction of grass biomass from satellite imagery in Somali regional
state, eastern Ethiopia. Heliyon 2020, 6, e05272. [CrossRef] [PubMed]
8. Fernandes, M.H.M.d.R.; Fernandes Junior, J.d.S.; Adams, J.M.; Lee, M.; Reis, R.A.; Tedeschi, L.O. Using sentinel-2 satellite images
and machine learning algorithms to predict tropical pasture forage mass, crude protein, and fiber content. Sci. Rep. 2024, 14, 8704.
[CrossRef]
9. Chen, Y.; Guerschman, J.; Shendryk, Y.; Henry, D.; Tom Harrison, M. Estimating Pasture Biomass Using Sentinel-2 Imagery and
Machine Learning. Remote. Sens. 2021, 13, 603. [CrossRef]
10. Cho, M.A.; Skidmore, A.; Corsi, F.; van Wieren, S.E.; Sobhan, I. Estimation of green grass/herb biomass from airborne hyper-
spectral imagery using spectral indices and partial least squares regression. Int. J. Appl. Earth Obs. Geoinf. 2007, 9, 414–424.
[CrossRef]
11. Franceschini, M.H.D.; Becker, R.; Wichern, F.; Kooistra, L. Quantification of Grassland Biomass and Nitrogen Content through
UAV Hyperspectral Imagery—Active Sample Selection for Model Transfer. Drones 2022, 6, 73. [CrossRef]
12. Schulze-Brüninghoff, D.; Hensgen, F.; Wachendorf, M.; Astor, T. Methods for LiDAR-based estimation of extensive grassland
biomass. Comput. Electron. Agric. 2019, 156, 693–699. [CrossRef]
13. Nguyen, P.; Badenhorst, P.E.; Shi, F.; Spangenberg, G.C.; Smith, K.F.; Daetwyler, H.D. Design of an Unmanned Ground Vehicle and
LiDAR Pipeline for the High-Throughput Phenotyping of Biomass in Perennial Ryegrass. Remote Sens. 2021, 13, 20. [CrossRef]
14. Hütt, C.; Bolten, A.; Hüging, H.; Bareth, G. UAV LiDAR Metrics for Monitoring Crop Height, Biomass and Nitrogen Uptake: A
Case Study on a Winter Wheat Field Trial. PFG J. Photogramm. Remote. Sens. Geoinformation Sci. 2023, 91, 65–76. [CrossRef]
15. Schaefer, M.T.; Lamb, D.W.; Ozdogan, M.; Baghdadi, N.; Thenkabail, P.S. A Combination of Plant NDVI and LiDAR Measurements
Improve the Estimation of Pasture Biomass in Tall Fescue (Festuca arundinacea var. Fletcher). Remote Sens. 2016, 8, 109. [CrossRef]
16. Walter, J.D.C.; Edwards, J.; McDonald, G.; Kuchel, H. Estimating Biomass and Canopy Height with LiDAR for Field Crop
Breeding. Front. Plant Sci. 2019, 10, 473161. [CrossRef] [PubMed]
17. Grüner, E.; Astor, T.; Wachendorf, M. Biomass Prediction of Heterogeneous Temperate Grasslands Using an SfM Approach Based
on UAV Imaging. Agronomy 2019, 9, 54. [CrossRef]
18. Batistoti, J.; Marcato, J.; ítavo, L.; Matsubara, E.; Gomes, E.; Oliveira, B.; Souza, M.; Siqueira, H.; Filho, G.S.; Akiyama, T.; et al.
Estimating Pasture Biomass and Canopy Height in Brazilian Savanna Using UAV Photogrammetry. Remote Sens. 2019, 11, 2447.
[CrossRef]
19. Singh, J.; Koc, A.B.; Aguerre, M.J. Aboveground Biomass Estimation of Tall Fescue using Aerial and Ground-based Systems.
In Proceedings of the 2023 ASABE Annual International Meeting, Omaha, NE, USA, 9–12 July 2023. [CrossRef]
20. Legg, M.; Bradley, S. Ultrasonic Arrays for Remote Sensing of Pasture Biomass. Remote Sens. 2020, 12, 111. [CrossRef]
21. Koc, A.B.; Erwin, C.; Aguerre, M.J.; Chastain, J.P. Estimating Tall Fescue and Alfalfa Forage Biomass Using an Unmanned Ground
Vehicle. In Lecture Notes in Civil Engineering; Springer: Cham, Switzerland, 2024; Volume 458, pp. 357–372. [CrossRef]
22. Andersson, K.; Trotter, M.; Robson, A.; Schneider, D.; Frizell, L.; Saint, A.; Lamb, D.; Blore, C. Estimating pasture biomass with
active optical sensors. Adv. Anim. Biosci. 2017, 8, 754–757. [CrossRef]
23. Martin, R.C.; Astatkie, T.; Cooper, J.M.; Fredeen, A.H. A Comparison of Methods Used to Determine Biomass on Naturalized
Swards. J. Agron. Crop Sci. 2005, 191, 152–160. [CrossRef]
24. Shu, M.; Li, Q.; Ghafoor, A.; Zhu, J.; Li, B.; Ma, Y. Using the plant height and canopy coverage to estimation maize aboveground
biomass with UAV digital images. Eur. J. Agron. 2023, 151, 126957. [CrossRef]
25. Kosmas, C.; Kirkby, M.; Geeson, N. Desertification Indicator System for Mediterranean Europe. Manual on: Key Indicators of
Desertification and Mapping Environmentally Sensitive Areas to Desertification. European Commission, Energy, Environment
and Sustainable Development, EUR 18882, 87 p. Available online: https://esdac.jrc.ec.europa.eu/public_path/shared_folder/
projects/DIS4ME/indicator_descriptions/vegetation_cover.htm# (accessed on 2 January 2024).
26. Flombaum, P.; Sala, O.E. A non-destructive and rapid method to estimate biomass and aboveground net primary production in
arid environments. J. Arid. Environ. 2007, 69, 352–358. [CrossRef]
27. Schirrmann, M.; Hamdorf, A.; Garz, A.; Ustyuzhanin, A.; Dammer, K.H. Estimating wheat biomass by combining image clustering
with crop height. Comput. Electron. Agric. 2016, 121, 374–384. [CrossRef]
28. OAK-D—DepthAI Hardware Documentation 1.0.0 Documentation. Available online: https://docs.luxonis.com/projects/
hardware/en/latest/pages/BW1098OAK/ (accessed on 16 May 2024).
29. Luxonis Field of View Calculator. Available online: https://fov.luxonis.com/?horizontalFov=80&verticalFov=55&horizontalResolution=
1280&verticalResolution=800 (accessed on 16 May 2024).
30. Serouart, M.; Madec, S.; David, E.; Velumani, K.; Lozano, R.L.; Weiss, M.; Baret, F. SegVeg: Segmenting RGB Images into Green
and Senescent Vegetation by Combining Deep and Shallow Methods. Plant Phenomics 2022, 2022, 9803570. [CrossRef] [PubMed]
31. Corti, M.; Cavalli, D.; Cabassi, G.; Bechini, L.; Pricca, N.; Paolo, D.; Marinoni, L.; Vigoni, A.; Degano, L.; Gallina, P.M. Improved
estimation of herbaceous crop aboveground biomass using UAV-derived crop height combined with vegetation indices. Precis.
Agric. 2023, 24, 587–606. [CrossRef]
32. Zhang, Y.; Xia, C.; Zhang, X.; Cheng, X.; Feng, G.; Wang, Y.; Gao, Q. Estimating the maize biomass by crop height and narrowband
vegetation indices derived from UAV-based hyperspectral images. Ecol. Indic. 2021, 129, 107985. [CrossRef]
33. Hütt, C.; Isselstein, J.; Komainda, M.; Schöttker, O.; Sturm, A. UAV LiDAR-based grassland biomass estimation for precision
livestock management. J. Appl. Remote Sens. 2024, 18, 017502. [CrossRef]
34. Dore, R.T. Comparing Bermudagrass and Bahiagrass Cultivars at Different Stages of Harvest for Dry Matter Yield and Nutrient
Content. Master’s Thesis, Louisiana State University LSU Scholarly Repository, Baton Rouge, LA, USA, 2006. [CrossRef]
35. Koc, A.B.; MacInnis, B.M.; Aguerre, M.J.; Chastain, J.P.; Turner, A.P. Alfalfa Biomass Estimation Using Crop Surface Modeling
and NDVI. Appl. Eng. Agric. 2023, 39, 251–264. [CrossRef]

Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual
author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to
people or property resulting from any ideas, methods, instructions or products referred to in the content.
