
INTERNATIONAL JOURNAL OF DIGITAL EARTH
2024, VOL. 17, NO. 1, 2365981
https://doi.org/10.1080/17538947.2024.2365981

Index-based forest degradation mapping using high and medium resolution multispectral sensors

Dionisio Rodríguez-Esparragón (a), Javier Marcello (a), Francisco Eugenio (a) and Paolo Gamba (b)

(a) Instituto de Oceanografía y Cambio Global, IOCAG, Unidad Asociada ULPGC-CSIC, Las Palmas de Gran Canaria, Spain; (b) Department of Electrical, Biomedical and Computer Engineering, University of Pavia, Pavia, Italy

ABSTRACT
Monitoring dense forest ecosystems, such as the laurel forest in Garajonay National Park, is vital for biodiversity conservation, carbon storage, and ecological balance. This study employs satellite remote sensing technologies to introduce a novel methodology, based on vegetation indices, aiming to assess and protect the health of the forest. Utilizing the Jeffries-Matusita distance and a histogram-based method, optimal indices to map forest degradation, like the Wide Dynamic Range Vegetation Index (WDRVI) and the Modified Simple Ratio (MSR), were identified among 19 generated indices. The study processed imagery from three satellite sensors (WorldView-2, PlanetScope and Sentinel-2), producing maps distinguishing healthy and degraded areas. The study’s practical significance lies in offering a method to assess the suitability of sensors and indices for effectively mapping forest degradation. This approach aids conservation efforts and provides valuable insights for environmental managers and policymakers, facilitating the implementation of targeted strategies to safeguard Garajonay National Park’s unique laurel forest ecosystem. Emphasizing the role of remote sensing in practical vegetation protection endeavors, the study contributes to on-the-ground initiatives, ensuring the preservation and sustainability of the park’s rich biodiversity.

ARTICLE HISTORY
Received 26 January 2024
Accepted 4 June 2024

KEYWORDS
Laurel forest; forest degradation; vegetation indices; WorldView; PlanetScope; Sentinel-2

1. Introduction
This study focuses on the critical issue of vegetation degradation in Garajonay Park (La Gomera
Island, Spain), a problem exacerbated by global warming and its interconnected effects on forests
and by recent fires. This park stands out for containing half of the mature laurel forests in the entire
Canary Islands Archipelago and is one of the densest laurel forests in the world. However, currently,
the environmental conditions indicated are damaging this ecosystem, making it vulnerable and
causing devitalization phenomena, with a clear loss of health and vigor, and even mortality, in
some areas of the park.
Global warming and its effect on forests are interconnected processes. Each one amplifies the
negative impacts of the other (Abbass et al. 2022; Allen et al. 2010). Phenomena associated with
current meteorology, such as increased temperatures, altered precipitation patterns, melting of glaciers and loss of biodiversity, contribute to the degradation of forests (Larjavaara et al. 2021;
Lawrence et al. 2022; Masson-Delmotte et al. 2019). Recognizing and addressing these links is crucial to mitigating the effects of global warming and preserving the world’s forests for future generations.
In this sense, forest monitoring is a crucial practice that serves multiple purposes and justifies its importance in several respects. Forest monitoring plays a vital role in preserving biodiversity, as forests are home to countless species, many of which are endangered or unique to specific regions. By monitoring forests, we can track the health and population trends of these species, ensuring their survival and contributing to overall conservation efforts (Ferreira et al. 2021; Kerry et al. 2022). Furthermore, forest monitoring is crucial for sustainable forest management: it allows the health, productivity, and regeneration rates of forests to be evaluated. This information helps policymakers, forest managers and scientists make informed decisions about resource allocation, timber harvesting and conservation strategies (Noss 1999; Thompson et al. 2013; Wulder, Kurz, and Gillis 2004). In summary, forest monitoring allows us to understand the state of our forests, identify threats and take appropriate measures to ensure their long-term survival and the well-being of our planet.
In this context, remote sensing is a powerful tool that has revolutionized forest monitoring and management (Pandey and Arellano 2022). By using satellite imagery and other remote sensing technologies, forest researchers and managers can obtain valuable information about the health, composition, and dynamics of forest ecosystems (Franklin 2001; Lechner, Foody, and Boyd 2020; Pitt et al. 1997; Tang and Shao 2015). However, despite its numerous benefits, remote sensing for forest monitoring also has limitations. For example, cloud cover and atmospheric conditions can affect image quality and availability when using passive sensors. Also, the interpretation of remote sensing data requires specialized knowledge and experience, and field validation is often necessary to ensure accurate results (Y. Gao, Wang, et al. 2020; Lausch et al. 2016). Despite these drawbacks, remote sensing is considered an invaluable tool for monitoring forests due to its ability to provide broad coverage, evaluate various forest parameters, detect disturbances, and support decision-making processes, making it an essential component of modern forest conservation and management efforts.
Among other techniques, the classification of vegetation covers using vegetation indices is one of the most widely used approaches in remote sensing and environmental studies (Xue and Su 2017). Vegetation indices are mathematical expressions derived from spectral reflectance measurements obtained by satellite or aerial sensors. These indices can provide valuable information about the health, density, and distribution of vegetation over large areas. Additionally, by analyzing the spatial patterns of these vegetation indices and combining them with other geographic information, such as land use/cover maps or topographic data, it is possible to accurately classify and map vegetation cover over large areas (L. Gao, Wang, et al. 2020; Küchler and Zonneveld 2012). This information is crucial for monitoring ecosystem health, assessing land degradation, studying vegetation dynamics, and supporting decision-making processes in fields such as agriculture, forestry, and environmental management (Avola et al. 2019; L. Gao, Wang, et al. 2020; Giovos et al. 2021; Guerini Filho, Kuplich, and Quadros 2020; Hatfield et al. 2019; Huang et al. 2021; A. R. Huete 2012; Ji et al. 2021; Pôças et al. 2020; Xue and Su 2017). In short, vegetation indices play a vital role in the classification and monitoring of vegetation cover, providing valuable information on the state and distribution of vegetation in extensive regions.
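As a minimal illustration of the kind of computation involved (a sketch, not taken from this paper), the snippet below derives the widely used NDVI from two small, hypothetical bottom-of-atmosphere reflectance arrays; the indices used in this study (Table 3) are built from the same type of band arithmetic.

```python
import numpy as np

# Hypothetical BOA reflectance values for a tiny 2 x 2 image patch
red = np.array([[0.05, 0.12],
                [0.30, 0.08]])   # red band
nir = np.array([[0.45, 0.40],
                [0.32, 0.50]])   # near-infrared band

# NDVI = (NIR - RED) / (NIR + RED); dense, healthy vegetation tends toward 1
ndvi = (nir - red) / (nir + red)
print(ndvi)
```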
On the other hand, in recent years there has been a significant increase in the availability and diversity of remote sensing sensors, offering various spatial and spectral resolutions. These sensors play a crucial role in capturing valuable data about Earth’s surface and atmosphere from space. Technological advances have led to the development of sensors capable of capturing images in unprecedented detail, allowing for better analysis and understanding of our planet. This progress includes improvements in both the spatial and spectral resolution of sensors. Indeed, remote sensing satellites can now capture images at much higher resolutions, allowing scientists and researchers to examine smaller features and study complex patterns (Schmitt et al. 2023; Tamiminia et al. 2020; B. Zhang et al. 2019). This improvement in spatial resolution has been particularly beneficial in applications such as urban planning (Ma et al. 2019; M. Wang, Yu, et al. 2022), disaster management (Gokaraju et al. 2017; Joyce et al. 2009; Voigt et al. 2007), and environmental monitoring (Vizcaya-Martínez et al. 2022; Xu et al. 2021; 2022; J. Zhang 2010), where fine-scale details are of utmost importance. In terms of spectral resolution, remote sensing sensors can now detect a broader range of electromagnetic wavelengths with narrower bandwidths, providing information beyond what is visible to the human eye. This broader spectral coverage allows scientists to investigate diverse phenomena, such as vegetation health (Hernández-Clemente et al. 2019; Kureel et al. 2022; Lausch et al. 2018; Psomiadis et al. 2016), atmospheric composition (Blackwell 2005; X. Zhang et al. 2020), and geological characteristics (Asadzadeh and de Souza Filho 2016; Shirmard et al. 2022; Tripathi and Garg 2021), with greater accuracy and precision.
This work focuses on island areas because, due to their isolation, they usually have remarkable biodiversity and a unique population of endemic species. However, these natural ecosystems are often more sensitive to anthropic pressures and degradation from climate change. In particular, this study addresses the classification of vegetation covers to map areas of healthy and deteriorated vegetation in the Garajonay Park, a ‘Monteverde Canario’ or ‘laurisilva’ forest concentrating one of the densest laurel forests in the world. To meet this goal, firstly, a database of images acquired by three sensors with different spatial and spectral resolutions was generated, namely WorldView-2, PlanetScope and Sentinel-2. Subsequently, these images were processed to generate 19 different vegetation indices. To determine the best classification index, the separability between classes was analyzed using the Jeffries-Matusita distance. Finally, a classification map was generated from each sensor image using a semi-automatic threshold value.
This research aims to classify vegetation cover and map areas of healthy and deteriorated vegetation in the ‘Monteverde Canario’ laurel forest (Santos 1990). Utilizing multiple satellite remote sensing sensors with diverse resolutions and spectral capabilities, the study contributes to a comprehensive analysis of vegetation cover. The process of generating different vegetation indices and evaluating their separability using the Jeffries-Matusita distance forms a strong basis for choosing the most appropriate index. While our focus is not directly on accuracy, this methodology could potentially enhance the precision of the classification. Furthermore, the use of a semi-automatic threshold value in generating the classification map ensured efficiency and reproducibility in the classification process. Overall, the research addresses key questions on the application of remote sensing data for vegetation monitoring in dense forests, emphasizing the importance of optimal sensor and index selection for effective mapping of vegetation health and degradation.

2. Materials and methods


The study area is the Garajonay National Park (GNP). The GNP is a UNESCO World Heritage Site
located on the island of La Gomera in the Canary Islands, Spain (Garajonay National Park –
UNESCO World Heritage Centre, n.d.). It is famous for its exceptional laurel forest, a relic of the
Tertiary period and which provides valuable information about the ecosystems and environmental
history of the Macaronesian region (Fernández-Palacios et al. 2011). This forest is characterized by
its unique biodiversity and the presence of ancient laurel trees, which contributes to its international
importance. The park covers an area of approximately 3,983 hectares and represents a vital reserve
of biodiversity, as it is home to several species of endemic and native plants and animals. The
diverse range of flora and fauna of GNP offers an intriguing subject of study for various scientific
disciplines, including biology, ecology, botany, zoology, conservation, and environmental sciences
(Figure 1).
To carry out our study, three images from three sensors with different spatial, spectral, and temporal resolutions were selected (WorldView-2, PlanetScope and Sentinel-2). The main characteristics of the satellites and sensors are reflected in Table 1. Sentinel-2 and PlanetScope are satellite constellations; the launch dates recorded in Table 1 for the PlanetScope (more than 180 platforms) and Sentinel-2 (two platforms) constellations correspond to the launch of the first platform of each.

Figure 1. Garajonay National Park location (Esri) and panoramic view of the northern part of the laurel forest.

Table 1. Summary of characteristics of WorldView-2, PlanetScope and Sentinel-2 sensors.

Characteristic | WorldView-2 | PlanetScope | Sentinel-2
Launch Date | October 8, 2009 | February 11, 2017 | June 23, 2015
Orbit Type | Sun-synchronous, polar orbit | Sun-synchronous, polar orbit | Sun-synchronous, polar orbit
Spatial Resolution | 0.46 m (panchromatic), 1.84 m (multispectral) | 3 m | 10, 20, 60 m (varies with band)
Multispectral Bands | 8 bands (coastal, blue, green, yellow, red, red edge, NIR1, NIR2) | 8 bands (coastal, blue, green I, green, yellow, red, red edge, NIR) | 13 bands (various wavelengths in the visible, NIR and SWIR)
Revisit Time | 1.1 days | 1–3 days (global average) | 5 days (at the equator)
Swath Width | 16.4 km | 20–24 km | 290 km

Also, the source images provided by each sensor appear in Figure 2. The sensing dates are 22, 17, and 18 August 2020 for WorldView-2, PlanetScope and Sentinel-2, respectively. As can be seen in Table 1, they have different spatial, spectral, and temporal resolutions. In particular, the spatial resolution of the images affects the level of detail and accuracy that can be achieved in the classification and mapping of vegetation cover and health status. Higher spatial resolution images can capture more fine-scale variations and features, but they also require more processing time and storage space. Lower spatial resolution images can cover larger areas and reduce noise, but they also introduce more mixed pixels and spectral confusion.

Figure 2. RGB composite of: (a) WorldView-2 image from August 22, 2020; (b) PlanetScope image from August 17, 2020; (c)
Sentinel-2 image from August 18, 2020.

The WorldView level 2 ortho-ready product was considered. Radiometric calibration was applied to correct the sensor gains and offsets for each band. Next, the Fast Line-of-sight Atmospheric Analysis of Hypercubes (FLAASH) atmospheric correction algorithm was used to derive the Bottom of Atmosphere (BOA) reflectance, setting the appropriate parameters (atmosphere type, aerosol depth and model, flight date and time, sensor geometry, height, adjacency, etc.) (Marcello et al. 2016; 2021). Finally, orthorectification was applied to account for the effects of terrain relief and sensor tilt. On the other hand, the Planet Ortho Scene-Analytics (Level 3B) surface reflectance imagery was used, which is an orthorectified, radiometrically calibrated, and atmospherically corrected product that captures bottom of atmosphere reflectance characteristics (Frazier and Hemingway 2021). Finally, the Sentinel-2 Level-2A product was selected, as it includes geometric corrections and provides BOA imagery. Specifically, the Level-2A product is generated from the Level-1C product after applying orthorectification and the Sen2Cor atmospheric correction model (Main-Knorn et al. 2017). Note that the multispectral bands without spatial improvement (pansharpening or super-resolution) have been considered in the analysis to allow a comparison under normal conditions of use by most remote sensing users. Specifically, pansharpening techniques have not been applied to the WorldView-2 data, nor to the Sentinel-2 and PlanetScope sensors. These last two satellites do not provide a panchromatic (PAN) band and, therefore, pansharpening could only be performed using a different sensor with higher resolution. In this context, WorldView-2 could be a good candidate, but spatial distortions may appear due to its oblique viewing angle and registration mismatches, exacerbated by the complex topography of the Garajonay park. An additional improvement for Sentinel-2 could be to increase the pixel resolution of the low-resolution bands using the 10 m bands with pansharpening or super-resolution algorithms. In any case, this enhancement would only benefit 4 vegetation indices that use the red-edge bands (PSRI, RENDVI, REPI and TCARI) but, as detailed in the results section, these indices do not achieve the best results for the detection of devitalized vegetation or for the discrimination between classes.
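For readers reproducing this kind of data preparation outside the tools mentioned above, the following minimal sketch (an assumption, not the workflow used by the authors) shows how a Sentinel-2 Level-2A band could be loaded and converted from digital numbers to BOA reflectance; the file name and the use of the rasterio library are illustrative only.

```python
import numpy as np
import rasterio  # any raster I/O library would work equally well

# Hypothetical Level-2A band file (red band at 10 m); the path is illustrative
with rasterio.open("T28RBS_20200818T115219_B04_10m.jp2") as src:
    red_dn = src.read(1).astype(np.float32)

# For the 2020 processing baseline, L2A digital numbers encode BOA
# reflectance scaled by 10000
red_boa = red_dn / 10000.0
```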
Under expert supervision, and by integrating field knowledge with tools such as Google Earth and orthophotos of the terrain, a comprehensive set of regions of interest (ROIs) was meticulously generated, each corresponding to a distinct land cover class: healthy vegetation, unhealthy vegetation, bare soil, and built-up soil, which primarily includes roads and constructions, as shown in Figure 3. The delineation of these areas was conducted with great precision and with guidance from domain experts of the GNP, ensuring the accurate representation and differentiation of each class. This systematic labeling approach allowed a robust dataset to be created, laying the foundation for in-depth analysis and understanding of the diverse land cover types present within the study area. The compiled shapefile contains 400 points, 100 for each category.

Figure 3. Representation of the areas of interest corresponding to each of the different classes: healthy vegetation (green),
unhealthy vegetation (yellow and example photograph), bare soil (brown) and built-up soil (gray). Note that some points
were obtained in the area of influence of the park. Therefore, the area diverges from that observable in other figures.

Table 2. Distance Statistics (in meters) for the ROIs of Each Class.
Class Mean Std Min Max
Healthy vegetation 2581,91 1288,30 6,32 4366,70
Unhealthy vegetation 2136,90 1322,80 2,00 3532,56
Built-up Soil 5162,42 3123,81 4,47 7931,57
Bare Soil 3849,52 2727,55 2,83 7393,39

In alignment with the specific objectives of the study, a meticulous selection of the ROIs was undertaken. This selection process was guided by several criteria, including the comprehensive representation of the park’s plant ecosystems, the topographical variations of the terrain, and the availability of pertinent data. As noted, a total of 100 points were chosen for each class, thereby ensuring that the selected ROIs adequately represent both the biodiversity of the park’s vegetation and the distinct characteristics of the terrain. To achieve an equal number of points for each class, we selected some ROIs of built land outside the park (beyond the area indicated by the red border), within its area of influence. Table 2 provides a summary of the distances, expressed in meters, between the points of each class.
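As an illustration of how the distance statistics in Table 2 can be obtained (a sketch assuming the ROI points are available as projected metric coordinates, e.g. UTM eastings and northings; the coordinate values below are made up):

```python
import numpy as np
from itertools import combinations

def roi_distance_stats(points):
    """Mean, standard deviation, minimum and maximum of the pairwise
    Euclidean distances (in metres) between the ROI points of one class."""
    d = np.array([np.hypot(x1 - x2, y1 - y2)
                  for (x1, y1), (x2, y2) in combinations(points, 2)])
    return d.mean(), d.std(), d.min(), d.max()

# Hypothetical ROI point coordinates (easting, northing) for one class
healthy = [(280100.0, 3109950.0), (280430.0, 3110200.0), (281020.0, 3109400.0)]
print(roi_distance_stats(healthy))
```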
The methodology used to compute the vegetation land cover maps is described in the schematic in Figure 4. After the preprocessing operations, 19 vegetation indices were calculated for each image to generate the vegetation indices dataset, with the aim of locating areas of stressed vegetation. Specifically, a thorough review of recent literature was performed to identify forest degradation and to select appropriate vegetation indices to discriminate between devitalized and healthy vegetation, considering the available spectral bands (Dupuis et al. 2020; Medina Machín et al. 2019). The vegetation indices used, as well as their formulation and references, are summarized in Table 3, including the description of the Yellow Normalized Difference Vegetation Index (YNDVI) that the authors introduce in this work to test the capacity of the yellow band provided by the WorldView-2 and PlanetScope sensors.
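As an informal sketch (not the authors’ code), several of the indices listed in Table 3 reduce to simple band arithmetic once the BOA reflectance bands are available as arrays; the function below computes a few of those highlighted later in the paper, with the parameter a being the WDRVI weighting factor (0.1–0.2 per Table 3).

```python
import numpy as np

def selected_indices(nir, red, yellow, a=0.15):
    """Compute a subset of the Table 3 indices from BOA reflectance arrays.
    Band names and the value of 'a' follow Table 3; the yellow band is only
    available for WorldView-2 and PlanetScope."""
    sr = nir / red                                  # Simple Ratio (SR)
    ndvi = (nir - red) / (nir + red)                # NDVI
    msr = (sr - 1.0) / (np.sqrt(sr) + 1.0)          # Modified Simple Ratio (MSR)
    wdrvi = (a * nir - red) / (a * nir + red)       # Wide Dynamic Range VI (WDRVI)
    yndvi = (nir - yellow) / (nir + yellow)         # yellow-band NDVI (YNDVI)
    return {"SR": sr, "NDVI": ndvi, "MSR": msr, "WDRVI": wdrvi, "YNDVI": yndvi}
```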
The auxiliary data from the ROIs were then used to mask, in the vegetation index dataset, the pixel values corresponding to the four classes: healthy vegetation, unhealthy vegetation, built-up soil, and bare soil. A separability analysis, checking the spectral separability of each class in the different vegetation indices, was performed using the Jeffries-Matusita distance. This analysis was carried out in two steps: first between the main vegetation and soil classes; second, among the four classes already described.

Figure 4. Scheme of the methodology used to generate the vegetation cover map.

Table 3. Formulation and references of the different vegetation indices used in this work.

Atmospherically Resistant Vegetation Index (ARVI): $ARVI = \frac{r_{800} - [r_{680} - g(r_{450} - r_{680})]}{r_{800} + [r_{680} - g(r_{450} - r_{680})]}$, where $g = 1$ commonly (Kaufman and Tanre 1992).

Anthocyanin Reflectance Index 2 (ARI2): $ARI2 = r_{800}\left[\frac{1}{r_{550}} - \frac{1}{r_{700}}\right]$ (Gitelson, Merzlyak, and Chivkunova 2001).

Carotenoid Reflectance Index 2 (CRI2): $CRI2 = \frac{1}{r_{510}} - \frac{1}{r_{700}}$ (Gitelson et al. 2002).

Enhanced Vegetation Index (EVI): $EVI = 2.5\,\frac{NIR - RED}{(NIR + 6\,RED - 7.5\,BLUE) + 1}$ (A. Huete et al. 2002).

Global Environmental Monitoring Index (GEMI): $GEMI = n(1 - 0.25\,n) - \frac{RED - 0.125}{1 - RED}$, where $n = \frac{2(NIR^2 - RED^2) + 1.5\,NIR + 0.5\,RED}{NIR + RED + 0.5}$ (Pinty and Verstraete 1992).

Green Normalized Difference Vegetation Index (GNDVI): $GNDVI = \frac{NIR - r_{[540:570]}}{NIR + r_{[540:570]}}$ (Gitelson and Merzlyak 1998).

Modified Chlorophyll Absorption Ratio Index Improved (MCARI2): $MCARI2 = \frac{1.5\,[2.5\,(r_{800} - r_{670}) - 1.3\,(r_{800} - r_{550})]}{\sqrt{(2 r_{800} + 1)^2 - (6 r_{800} - 5\sqrt{r_{670}}) - 0.5}}$ (Haboudane et al. 2004).

Modified Simple Ratio (MSR): $MSR = \frac{RDVI - 1}{\sqrt{RDVI} + 1}$, where $RDVI = \frac{NIR}{RED}$ (Chen 1996).

Modified Triangular Vegetation Index – Improved (MTVI2): $MTVI2 = \frac{1.5\,[1.2\,(r_{800} - r_{550}) - 2.5\,(r_{670} - r_{550})]}{\sqrt{(2 r_{800} + 1)^2 - (6 r_{800} - 5\sqrt{r_{670}}) - 0.5}}$ (Haboudane et al. 2004).

Normalized Difference Vegetation Index (NDVI): $NDVI = \frac{NIR - RED}{NIR + RED}$ (Rouse, Haas, and Schell 1973).

Plant Senescence Reflectance Index (PSRI): $PSRI = \frac{r_{680} - r_{500}}{r_{750}}$ (Merzlyak et al. 1999).

Red Edge Normalized Difference Vegetation Index (RENDVI): $RENDVI = \frac{r_{750} - r_{705}}{r_{750} + r_{705}}$ (Sims and Gamon 2002).

Red Edge Position Index (REPI): $REPI = \frac{r_{[690:740]} - r_{[700:730]}}{r_{[690:740]} + r_{[700:730]}}$ (Curran, Dungan, and Gholz 1990).

Red Green Ratio Index (RGRI): $RGRI = \frac{\sum_{i=600}^{699} R_i}{\sum_{j=500}^{599} R_j}$ (Gamon and Surfus 1999).

Soil Adjusted Vegetation Index (SAVI): $SAVI = \frac{1.5\,(NIR - RED)}{NIR + RED + 0.5}$ (A. R. Huete 1988).

Simple Ratio (SR): $SR = \frac{NIR}{RED}$ (Birth and McVey 1968).

Transformed Chlorophyll Absorption Reflectance Index (TCARI): $TCARI = 3\left[(r_{700} - r_{670}) - 0.2\,(r_{700} - r_{550})\,\frac{r_{700}}{r_{670}}\right]$ (Haboudane et al. 2004).

Wide Dynamic Range Vegetation Index (WDRVI): $WDRVI = \frac{a\,NIR - RED}{a\,NIR + RED}$, where $a$ ranges from 0.1 to 0.2 (Gitelson 2004).

Yellow Normalized Difference Vegetation Index (YNDVI): $YNDVI = \frac{NIR - YELLOW}{NIR + YELLOW}$ (this work).

The Jeffries-Matusita distance is a statistical measure widely used in remote sensing and image analysis to quantify dissimilarity or separability between different classes within multivariate data (Swain and King 1973). Combining elements of the Jeffries divergence and the Matusita distance, it evaluates statistical differences between probability distributions and expresses separability as a function of the Bhattacharyya distance (Choi and Lee 2003). The distance varies from 0 to a maximum value, where higher values indicate greater dissimilarity and better separability between classes. A value of 0 suggests complete overlap or identical distributions, while increasing values represent greater dissimilarity, helping to optimize feature selection and improve classification accuracy in tasks such as remote sensing image classification.
The Jeffries-Matusita distance is used here to measure the spectral separability between two classes and is thus well suited to assessing the discrimination capability of the land cover classes. This distance can be expressed using Equations (1) and (2).

$JM = 2\,(1 - e^{-B}),$ (1)

$B = \frac{1}{8}\,(m_1 - m_2)^2\,\frac{2}{s_1^2 + s_2^2} + \frac{1}{2}\ln\left[\frac{s_1^2 + s_2^2}{2\,s_1 s_2}\right]$ (2)

Figure 5. Scheme of the final steps followed to generate the land cover maps.

where B is the Bhattacharyya distance, and $m_i$ and $s_i$ (i = 1, 2) represent the mean and standard deviation of classes C1 and C2, respectively. As applied, the Jeffries-Matusita distance takes values between 0 and 2, and the closer the value is to 2, the better the separation between the two classes.
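A compact sketch of Equations (1)–(2) in code form (an illustration, not the authors’ implementation), taking as input the per-class samples of one index image extracted with the ROI masks:

```python
import numpy as np

def jeffries_matusita(x1, x2):
    """Jeffries-Matusita distance between two classes, following
    Equations (1)-(2); x1 and x2 are 1-D arrays of index values sampled
    from the ROIs of each class. The result lies in [0, 2]."""
    m1, m2 = x1.mean(), x2.mean()
    s1, s2 = x1.std(), x2.std()
    b = (1.0 / 8.0) * (m1 - m2) ** 2 * 2.0 / (s1 ** 2 + s2 ** 2) \
        + 0.5 * np.log((s1 ** 2 + s2 ** 2) / (2.0 * s1 * s2))
    return 2.0 * (1.0 - np.exp(-b))

# Example with synthetic samples for two well-separated classes
rng = np.random.default_rng(0)
print(jeffries_matusita(rng.normal(0.8, 0.05, 100), rng.normal(0.5, 0.08, 100)))
```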
Once the best vegetation indices were selected based on separability, the image corresponding to each index was processed using the computed thresholds in a stepwise process. First, a threshold was used to discriminate vegetation from soil. Then, two other threshold values were applied: one to separate healthy from unhealthy vegetation, and another to separate bare soil from built-up soil. In all cases, the thresholds were applied to the best vegetation index image according to the Jeffries-Matusita separability criterion (Figure 5).
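The stepwise assignment can be expressed as in the sketch below (with one assumption about orientation: larger index values are taken to correspond to vegetation, to healthy vegetation and to bare soil, which may need to be inverted depending on the index chosen for each split):

```python
import numpy as np

def stepwise_classification(idx_veg_soil, idx_health, idx_soil,
                            t_veg, t_health, t_soil):
    """Four-class map from three index images and three thresholds:
    1 = healthy vegetation, 2 = unhealthy vegetation,
    3 = bare soil, 4 = built-up soil."""
    classes = np.zeros(idx_veg_soil.shape, dtype=np.uint8)
    vegetation = idx_veg_soil > t_veg                   # step 1: vegetation vs soil
    classes[vegetation & (idx_health > t_health)] = 1   # step 2a: healthy
    classes[vegetation & (idx_health <= t_health)] = 2  # step 2a: unhealthy
    classes[~vegetation & (idx_soil > t_soil)] = 3      # step 2b: bare soil
    classes[~vegetation & (idx_soil <= t_soil)] = 4     # step 2b: built-up soil
    return classes
```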
To establish the optimal threshold values, an overlap coefficient was defined, using the histogram
information, as follows:
Let $f_1(x)$ and $f_2(x)$ be the probability density functions corresponding to the pixel values of two images. Then, an overlap coefficient can be defined as:

$C_{overlap} = \int \min\{f_1(x), f_2(x)\}\,dx,$ (3)

In this case, a threshold value $x_{th}$ must be set such that:

$\int_{x_{min}}^{x_{th}} \min\{f_1(x), f_2(x)\}\,dx = \frac{C_{overlap}}{2},$ (4)

Practically, the equations above allow the probability of confusion to be calculated using the inter-class intersection area of the histograms of each index image, extracted from the ROI points described previously. Consequently, with Equations (3) and (4), the optimal threshold value was set so as to ensure a 50% probability of confusion between classes. Finally, two different maps were created, using three or four classes. On the one hand, the classification into three classes allows park managers to know the state and distribution of the vegetation. On the other hand, the classification into four classes provides additional knowledge for the management of natural spaces, including relevant aspects such as an evaluation of the possibility of soil erosion.
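A possible numerical realization of Equations (3)–(4) using discrete histograms is sketched below (illustrative only; the bin count and the use of bin centres are assumptions):

```python
import numpy as np

def overlap_threshold(x1, x2, bins=256):
    """Threshold derived from Equations (3)-(4): build normalised histograms
    of the two classes on a common grid, integrate min{f1, f2}, and return
    the value at which half of the overlap area has been accumulated."""
    lo, hi = min(x1.min(), x2.min()), max(x1.max(), x2.max())
    edges = np.linspace(lo, hi, bins + 1)
    f1, _ = np.histogram(x1, bins=edges, density=True)
    f2, _ = np.histogram(x2, bins=edges, density=True)
    width = edges[1] - edges[0]
    overlap = np.minimum(f1, f2)
    c_overlap = overlap.sum() * width                   # Equation (3)
    cumulative = np.cumsum(overlap) * width             # running overlap integral
    k = np.searchsorted(cumulative, c_overlap / 2.0)    # Equation (4)
    return 0.5 * (edges[k] + edges[k + 1])              # bin centre as threshold
```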
The general process is summarized in Figure 5.
This study underscores the importance of the optimal selection of sensors and indices for the effective mapping of vegetation health and degradation through the application of a robust methodology to dense forest imagery. The comparison of the results enables us to infer the reliability of our maps: if all sensors and indices yield similar classifications, our confidence in the accuracy of the land cover maps increases.

3. Results
The comprehensive analytical approach to land cover classification described in the previous section leads to results structured in four key components. Each of them contributes to refining the classification accuracy and to improving our understanding of the diverse land cover patterns within the study area.

3.1. Vegetation indices dataset


The initial stage involves the acquisition and analysis of the vegetation indices dataset. It is expected
that vegetation indices, derived from remote sensing data, can provide valuable insights into the
health and distribution of vegetation within the study area. However, as can be seen in the example
of Figure 6, the maps resulting from their computation exhibit differences and it is not possible to
determine the best result, in terms of distinguishing the state of the vegetation, through visual
inspection.
The resulting dataset consists of a total of 57 images, which includes 19 index images for WorldView-2, 20 for PlanetScope (as shown in Figure 6) due to the provision of two distinct green bands, and 18 for Sentinel-2, the latter lacking a yellow band in its source images. They are all represented after normalization, and with the same color palette for comparison (nipy_spectral_r).
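The normalization and shared palette mentioned above can be reproduced with a few lines of plotting code (a sketch; matplotlib and min–max normalization are assumptions, not a statement about the authors’ tooling):

```python
import numpy as np
import matplotlib.pyplot as plt

def show_index(index_img, name):
    """Min-max normalise one index image and display it with the palette
    used for Figure 6 (nipy_spectral_r), ignoring NaN pixels."""
    lo, hi = np.nanmin(index_img), np.nanmax(index_img)
    normalised = (index_img - lo) / (hi - lo)
    plt.imshow(normalised, cmap="nipy_spectral_r")
    plt.title(name)
    plt.colorbar()
    plt.show()
```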

3.2. Separability analysis


Following the compilation of the vegetation indices dataset, a thorough separability analysis was conducted using the Jeffries-Matusita distance. The separability results are summarized in Table 4, aiding in the selection of optimal features and enhancing the accuracy of subsequent classification processes. For easier comparison, we also provide the graphical representations in Figures 7 and 8, showing the separability between vegetation and soil and between healthy and unhealthy vegetation, respectively.
As expected, the separability is higher between vegetation and soil than between the two vegetation sub-classes or between the two soil sub-classes. In order to select the indices that demonstrate the best performance for all 3 satellites simultaneously, in Figures 7 and 8 we have included a blue line marking the 80% and 65% thresholds, respectively. We can see that ARVI, MSR, NDVI and WDRVI are the most appropriate for discriminating vegetation and soil, while ARI2, CRI2, MSR, SR and WDRVI are appropriate for separating healthy and unhealthy vegetation. Note that YNDVI also provides satisfactory results although, as mentioned, it cannot be applied to Sentinel-2.
According to these results, for each sensor there are several indices that perform well for discrimination between vegetation and soil. Among them, the MSR and WDRVI indices score among the best for the 3 sensors. Therefore, one of these indices can be a good choice due to its excellent performance and robustness regardless of the satellite selected.

Figure 6. False color representation of vegetation indices obtained from the PlanetScope image of August 17, 2020.
In a more detailed analysis, with respect to the separability between vegetation and soil, good results can be found in both the indices synthesized from the WorldView-2 image and those from the PlanetScope image. In this sense, separability values greater than 85% appear for the MSR, NDVI (90.56%), REPI, SAVI, WDRVI, and YNDVI indices for WorldView-2, as well as for the GNDVI-2, PSRI, RENDVI, and YNDVI indices (87.02%) in the case of PlanetScope. As for the Sentinel-2 image, separability appears to be limited to around 81% with the MSR, NDVI, SAVI, and WDRVI indices (83.02%).
There is a significant drop in separability in the sub-classification of both vegetation and soil, and it is much more pronounced in the case of the soil class. In the first case (healthy and unhealthy vegetation), only the indices generated from the WorldView-2 image exceed 85% separability, namely ARI2, CRI2, GNDVI, MSR, SR, and WDRVI. For both PlanetScope and Sentinel-2, the indices remain well below 80%: around 75% in the first case (GNDVI-1 and SR) and 70% in the second (CRI2). When comparing indices, Sentinel-2 generally exhibits inferior performance.

Table 4. Results of Jeffries-Matusita distance for the indices computed using the WorldView-2 (WV-2), PlanetScope (PS), and Sentinel-2
(S-2) images.
Vegetation/Soil Healthy/Unhealthy Vegetation Built-up/Bare soil
Index WV-2 PS S2 WV-2 PS S2 WV-2 PS S2
ARI2 1,39 (69,71) 1,35 (67,82) 1,38 (69,28) 1,72 (86,28) 1,40 (70,06) 1,36 (68,07) 1,07 (53,45) 0,87 (43,71) 0,58 (28,90)
ARVI 1,79 (89,51) 1,69 (84,45) 1,63 (81,56) 1,61 (80,49) 1,47 (73,47) 1,27 (63,54) 1,16 (58,14) 1,30 (65,04) 0,88 (44,09)
CRI2 1,61 (80,54) 1,33 (66,50) 1,41 (70,80) 1,76 (88,30) 1,39 (69,70) 1,40 (70,37) 0,97 (48,55) 1,11 (55,42) 0,62 (31,09)
EVI 1,51 (75,72) 1,43 (71,78) 1,25 (62,84) 0,81 (40,81) 1,38 (69,13) 0,27 (13,49) 0,98 (49,10) 0,69 (34,38) 0,43 (21,56)
GEMI 0,58 (29,18) 1,08 (54,13) 1,50 (75,20) 0,86 (43,02) 1,34 (67,16) 0,95 (47,51) 1,15 (57,42) 1,05 (52,68) 0,39 (19,59)
GNDVI 1,64 (81,92) 1,63 (81,91) 1,57 (78,58) 1,70 (85,07) 1,51 (75,58) 1,21 (60,81) 1,12 (56,09) 1,18 (58,91) 0,90 (45,06)
GNDVI (PS, second green band) – 1,71 (85,67) – – 1,49 (74,52) – – 1,17 (58,81) –
MCARI2 1,02 (51,23) 1,56 (78,25) 1,53 (76,47) 0,64 (32,17) 1,29 (64,79) 1,15 (57,40) 0,52 (26,07) 0,61 (30,70) 1,00 (50,19)
MSR 1,77 (88,67) 1,65 (82,84) 1,63 (81,54) 1,73 (86,68) 1,48 (74,35) 1,35 (67,49) 0,97 (48,74) 1,12 (55,91) 0,64 (31,98)
MTVI2 1,02 (51,23) 1,56 (78,25) 1,53 (76,47) 0,64 (32,17) 1,29 (64,79) 1,15 (57,40) 0,52 (26,07) 0,61 (30,70) 1,00 (50,19)
NDVI 1,81 (90,56) 1,64 (82,26) 1,63 (81,64) 1,68 (84,30) 1,38 (69,02) 1,25 (62,60) 1,42 (71,27) 1,40 (70,33) 0,90 (45,00)
PSRI 1,50 (75,26) 1,71 (85,56) 1,51 (75,70) 1,06 (53,14) 1,32 (66,37) 1,24 (62,34) 1,31 (65,52) 1,08 (54,15) 0,49 (24,75)
RENDVI 1,50 (75,26) 1,71 (85,56) 1,51 (75,70) 1,06 (53,14) 1,32 (66,37) 1,24 (62,34) 1,31 (65,52) 1,08 (54,15) 0,49 (24,75)
REPI 1,76 (88,00) 1,53 (76,82) 1,46 (73,36) 1,62 (81,31) 1,37 (68,70) 1,22 (61,33) 1,10 (55,23) 0,94 (46,92) 0,81 (40,64)
RGRI 1,46 (73,21) 1,58 (79,00) 0,76 (38,08) 1,19 (59,66) 1,11 (55,77) 0,43 (21,46) 0,90 (44,88) 1,17 (58,48) 0,81 (40,48)
SAVI 1,80 (89,93) 1,44 (72,23) 1,63 (81,64) 1,68 (84,30) 1,38 (69,08) 1,25 (62,60) 1,43 (71,67) 0,75 (37,38) 0,90 (45,00)
SR 1,68 (83,94) 1,63 (81,86) 1,59 (79,90) 1,75 (87,84) 1,54 (76,93) 1,38 (69,30) 0,93 (46,47) 0,89 (44,67) 0,43 (21,35)
TCARI 0,73 (36,48) 0,92 (46,16) 0,97 (48,44) 1,49 (74,56) 0,55 (27,34) 0,23 (11,76) 0,85 (42,83) 0,92 (46,22) 0,55 (27,61)
WDRVI 1,73 (86,63) 1,66 (83,34) 1,66 (83,22) 1,70 (85,28) 1,47 (73,58) 1,38 (69,08) 0,97 (48,71) 1,27 (63,72) 0,64 (32,26)
YNDVI 1,75 (87,84) 1,74 (87,02) - 1,67 (83,91) 1,32 (66,37) - 1,19 (59,48) 1,35 (67,62) -
Notes: They are expressed as value and (percentage). The maximum value for distance of separability is 2, (100). Two GNDVI values can be
observed in the PS column corresponding to each different green band of PlanetScope sensor.

While there are exceptions where Sentinel-2 does not perform poorly, its scores consistently lag behind the optimal results obtained from the other two sensors, probably attributable to its lower spatial resolution. Finally, regarding the discrimination between built-up and bare soils, the best separability results appear above 70% for the WorldView-2 (SAVI, 71.67%) and PlanetScope (NDVI, 70.33%) images, while for the Sentinel-2 image the best values are around 50% (MCARI2 and MTVI2, both with 50.19%).
As seen in the results expressed in Table 4, as well as in Figures 7 and 8, there is no index that obtains the best discrimination capability for all classes and sensors. Consequently, to obtain the most accurate land cover maps, the best indices have been chosen for each pair of classes and each sensor.
The basic statistics of the separability results by source image (Table 5) follow the same trend as the maximum values analyzed in the previous paragraphs.
Figure 7. Separability values between vegetation and soil (in percentage) for the different indices obtained from the images of
the three sensors.

Figure 8. Separability values between healthy and unhealthy vegetation (in percentage) for the different indices obtained from
the images of the three sensors.

Table 5. Basic statistics of the separability analysis for the indices generated from the three source images.
Vegetation/Soil Healthy/Unhealthy vegetation Bare/Built-up soil
WorldView-2 Average 1,48 1,39 1,05
Standard deviation 0,37 0,41 0,25
PlanetScope Average 1,53 1,34 1,03
Standard deviation 0,22 0,21 0,24
Sentinel-2 Average 1,46 1,10 0,68
Standard deviation 0,23 0,37 0,21

They show that the indices based on PlanetScope appear to be more homogeneous than those based on Sentinel-2 and WorldView-2. This causes the average separability between the vegetation and soil classes to be higher for the PlanetScope-based indices, even though the highest value corresponds to those calculated from the WorldView-2 image.

3.3. Thresholding process


Finally, Table 6 shows a summary of the indices applied to generate the final maps for each image, as well as the threshold values resulting from the thresholding process. The variety of these results highlights that using a vegetation index generically in the vegetation cover classification process does not seem to be the best choice to obtain a good result. On the contrary, fine-tuning these processes requires an explicit vegetation index selection step.

3.4. Final classification


Figures 9 and 10 show the final classification maps for 3 classes, merging both types of soil.

Table 6. Thresholds and indices selected for computing the final land cover map.

Sensor | Vegetation/Soil | Healthy/Unhealthy vegetation | Bare/Built-up soil
WorldView-2 | NDVI (0,525582847) | CRI2 (0,001642913) | SAVI (0,370324396)
PlanetScope | YNDVI (0,771155753) | SR (10,0621319) | NDVI (0,574113896)
Sentinel-2 | WDRVI (−0,41137288) | CRI2 (0,002225514) | MCARI2 (0,390601769)

Figure 9. Land cover maps using 3 classes: green means healthy vegetation, yellow represents unhealthy vegetation and brown indicates soil. The first row shows the WorldView-2- and PlanetScope-based products, and the second row the product based on Sentinel-2.

Visually, the PlanetScope image (compared to the WorldView-2 image) appears to overestimate regions with unhealthy vegetation (yellow), much like the Sentinel-2 image. Furthermore, the Sentinel-2 data seem to incorrectly identify some areas of vegetation, which are confused with the ground. We can also appreciate areas of soil located to the south of the Park due to a severe fire that occurred in 2012.
Finally, Table 7 shows the total estimated surface area of each class for all sensors. Regarding the surface covered by vegetation, the data derived from the classification of the WorldView-2 and PlanetScope images are very similar, presenting a difference of approximately 60 hectares with respect to what was obtained with the Sentinel-2 image.
More complex is the evaluation of the calculated surface area of healthy and unhealthy vegetation. It seems clear that PlanetScope tends to overestimate the unhealthy surface if we consider the WorldView image as the reference. In principle, the same trend occurs with the classification obtained from the Sentinel-2 image.
Note, in Table 7, that a small difference of around 4 hectares in the total surface area measured on the Sentinel-2 image is observed compared to the WorldView-2 and PlanetScope sensors. This is attributable to the difference in spatial resolution between them.
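The surface areas in Table 7 follow directly from per-class pixel counts and each sensor’s ground sampling distance; a minimal sketch of that conversion (illustrative, with pixel sizes taken from Table 1) is shown below.

```python
import numpy as np

def class_areas_ha(class_map, pixel_size_m):
    """Convert per-class pixel counts of a classification map into hectares,
    given the multispectral ground sampling distance in metres
    (e.g. 1.84 for WorldView-2, 3 for PlanetScope, 10 for Sentinel-2)."""
    pixel_area_ha = (pixel_size_m ** 2) / 10000.0
    labels, counts = np.unique(class_map, return_counts=True)
    return {int(label): count * pixel_area_ha for label, count in zip(labels, counts)}

# Tiny example: a 3 x 3 Sentinel-2-like map with labels 1 (healthy) and 2 (unhealthy)
demo = np.array([[1, 1, 2], [1, 2, 2], [1, 1, 1]])
print(class_areas_ha(demo, pixel_size_m=10))
```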

Figure 10. Healthy (green) and unhealthy (red) vegetation maps for the three sensors. By row: WorldView-2, PlanetScope, and
Sentinel-2.

4. Discussion
The issue of vegetation cover degradation (Figure 3) is a significant concern for the Garajonay
National Park, and it currently stands as a top priority in its conservation efforts. Identifying
these areas of vegetation decline throughout the park is crucial for addressing this challenge.
This issue is particularly complex, as the affected areas are often not extensive, and only a few individual trees are typically impacted.

Table 7. Summary of the estimated surfaces from the images of each sensor.
Class WorldView-2 PlanetScope Sentinel-2
Vegetation 3003,03 ha. 3000,11 ha. 3060,02 ha.
Healthy vegetation 2450,22 ha. 2197,29 ha. 2293,57 ha.
Unhealthy vegetation 552,81 ha. 802,82 ha. 766,45 ha.
Bare soil 630,46 ha. 531,78 ha. 535,22 ha.
Built-up soil 71,17 ha. 172,76 ha. 113,41 ha.
Total 3704,66 ha. 3704,66 ha. 3708,65 ha.

In the task of detecting unhealthy vegetation in a dense forest (such as the laurisilva forest), the most common approach involves the use of spectral vegetation indices. While there is a wide range of vegetation indices available, it is essential to conduct a comprehensive study to choose the most suitable spectral vegetation index for this purpose. In fact, the visual inspection of the intensity images obtained after applying the set of indices clearly shows a discrepancy in results. In consequence, further quantitative analysis is needed to select the suitable indices to generate the best vegetation land cover map.
In that respect, an additional separability analysis was conducted using the Jeffries-Matusita distance and auxiliary annotated data. As can be expected, the spatial resolution is a key factor in the separability performance, as degraded areas mainly correspond to individual trees or small groups of them. The results presented in Table 4 and Figures 7–8 show that WorldView can be used to distinguish between vegetation and soil with a separability of about 85% (90% in some cases) through several indices (ARVI, MSR, NDVI, REPI, SAVI, WDRVI, and YNDVI). These high values of separability can also be observed in Table 4 for the Planet image, scoring 85% as well in several indices (GNDVI-2, PSRI, RENDVI, and YNDVI). However, the Sentinel-2 image separability limit is about 80% (maximum 83% with WDRVI). Regarding the separability of healthy against unhealthy vegetation, only the WorldView data provide good results (about 85%) with the ARI2, CRI2, GNDVI, MSR, SR, and WDRVI indices. In contrast with these results, the Planet image achieves its best result with the SR index (77%), with the results corresponding to ARI2, ARVI, CRI2, GNDVI-1, GNDVI-2, MSR, and WDRVI also reaching over 70%. The Sentinel-2 results are always under 70% separability, except for the CRI2 index (70%). Finally, although our main objective was to obtain the map of the health of the vegetation, a separability study of the bare and built-up soils was also carried out. NDVI provides the best score for the WorldView and Planet images: 71% and 70%, respectively. The Sentinel-2 results are poor, reaching a maximum of 50% for the MCARI2 and MTVI2 indices.
The analysis showed that TCARI provides, in general, the worst overall performance. In particular, according to Figure 8, the results of several vegetation indices, including EVI, GEMI, MCARI2, MTVI2, RGRI and TCARI, indicate low scores on all three satellites, resulting in poor performance in areas such as the one being studied. Specifically, EVI is often preferred for use in densely forested regions due to its resistance to saturation, but the results in this case are not satisfactory. This discrepancy may be attributed to the park’s complex topography, as previous research has highlighted how topographic features can influence index performance (Matsushita et al. 2007).
On the other hand, a set of indices including YNDVI, SR, WDRVI, MSR, CRI2, ARI2, GNDVI and ARVI seems to be more suitable for distinguishing between healthy and unhealthy vegetation, especially when a detailed study is not required or possible, or when auxiliary information is lacking. These indices can offer a more reliable assessment of vegetation health in the given context. As noted, MSR and WDRVI, in particular, may be a good choice for any of the 3 satellites.
In general, the quantitative results of separability for Planet and Sentinel-2 are more homogeneous than for the WorldView image. This means that the variance of the index values is higher for the WorldView image than for the other two sensors. This seems to indicate that the higher the spatial resolution, the more critical the index selection becomes for fine-tuning the land cover map.
The robust thresholding process, based on histogram analysis and applied to the selected index images, impacts the final vegetation land cover map. Thus, the classification into the vegetation and soil classes for the WorldView and Planet images achieves very close results in terms of estimated area, even when using different indices and, consequently, different threshold values. However, as the quantitative results of the separability analysis suggest, distinguishing between healthy and unhealthy vegetation in this complex scenario is challenging. Furthermore, spatial resolution is a variable that conditions the results.
Several strategies can be explored in future works to improve these results. Among them: applying advanced pansharpening techniques or super-resolution strategies to improve the spatial resolutions of the WorldView-2, PlanetScope and Sentinel-2 source imagery (Dadrass Javan et al. 2021; Salgueiro, Marcello, and Vilaplana 2020; 2021; P. Wang, Bayram, and Sertel 2022; Z. Wang, Ma, and Zhang 2023); exploiting the high temporal resolution of Planet imagery to develop land cover maps based on temporal analysis; fusing different index images to improve the final land cover map using machine or deep learning strategies; or developing a multisource classification approach.

5. Conclusions
This study contributes to effective forest monitoring of dense laurel forests by addressing the problem
of detecting diseased or devitalized areas in the Garajonay National Park. In this sense, the following
main conclusions can be listed regarding the methodology, sensors, and vegetation indices:

(1) The research applies a methodology based on vegetation indices to evaluate the health of the vegetation in the study area using three different sensors. Comparative analysis of these sensors and indices reveals disparities in results, requiring rigorous quantitative analysis for optimal mapping of vegetation cover.
(2) The results of using the Jeffries-Matusita distance and annotated data indicate that spatial resolution is crucial for the separability analysis. Table 4 and Figures 7–8 show that the WorldView and Planet images had higher separability (85%) between vegetation and soil, while Sentinel-2 had lower separability (80%).
(3) WorldView achieved the best separability (85%) for healthy/unhealthy vegetation, while Planet performed well with the SR index (77%). Sentinel-2 had lower separability (<70%) but reached 70% with CRI2. For bare/built-up soils, NDVI was the best index for WorldView and Planet (71% and 70%), while Sentinel-2 had poor results (50% with MCARI2 and MTVI2).
(4) In a broader context, the quantitative results for separability demonstrated greater homogeneity for Planet and Sentinel-2 compared to the WorldView image. This observation suggests a higher variance in index values for the WorldView image, underscoring the critical role of spatial resolution in index selection for refining the land cover map.
(5) The thresholding process based on histogram analysis improved the vegetation map. The WorldView and Planet images had similar results for the vegetation and soil classes, even with different indices and thresholds.

In summary, this study serves as a foundational framework for improving methodologies in monitoring within dense natural forest areas, with the overarching goal of preserving vegetation and contributing to the analysis of climate change impacts and anthropogenic pressures. By building upon the findings and insights derived from this research, we aim to refine and innovate methods for the sustainable management and conservation of these critical ecosystems.

Acknowledgements
We want to acknowledge the Garajonay National Park conservation manager (D. Ángel Fernández López) for his
excellent support and guidance.

Disclosure statement
No potential conflict of interest was reported by the author(s).

Funding
This research was funded by the SPIP2022-02897 project of the Organismo Autónomo Parques Nacionales. Dionisio Rodríguez-Esparragón’s work is supported through the ULPGC, and his stay at the University of Pavia was financed by the Ministry of Universities under Order UNI/501/2021 of May 26, as well as by European Union-Next Generation EU funds.

Data availability statement


Sentinel-2 data can be downloaded from https://scihub.copernicus.eu/dhus/#/home (accessed on 7 November 2023). Restrictions apply to the WorldView-2 data because a single-user license applies. PlanetScope imagery by Planet Labs PBC is used under the Education and Research licensing of the University of Pavia.

Author contributions
Conceptualization, D.R. and J.M.; methodology, D.R.; software, D.R.; validation, D.R. and J.M.; formal analysis, D.R., J.M., F.E. and P.G.; investigation, D.R. and J.M.; resources, F.E. and P.G.; data curation, D.R. and J.M.; writing – original draft preparation, D.R.; writing – review and editing, D.R. and J.M.; visualization, D.R. and J.M.; supervision, D.R., J.M., F.E. and P.G.; project administration, D.R.; funding acquisition, D.R., J.M., F.E. and P.G.

ORCID
Dionisio Rodríguez-Esparragón http://orcid.org/0000-0002-4542-2501

References
Abbass, K., M. Z. Qasim, H. Song, M. Murshed, H. Mahmood, and I. Younis. 2022. “A Review of the Global Climate
Change Impacts, Adaptation, and Sustainable Mitigation Measures.” Environmental Science and Pollution
Research 29 (28): 42539–42559. https://doi.org/10.1007/s11356-022-19718-6.
Allen, C. D., A. K. Macalady, H. Chenchouni, D. Bachelet, N. McDowell, M. Vennetier, T. Kitzberger, et al. 2010. “A
Global Overview of Drought and Heat-Induced Tree Mortality Reveals Emerging Climate Change Risks for
Forests.” Forest Ecology and Management 259 (4): 660–684. https://doi.org/10.1016/j.foreco.2009.09.001.
Asadzadeh, S., and C. R. de Souza Filho. 2016. “A Review on Spectral Processing Methods for Geological Remote
Sensing.” International Journal of Applied Earth Observation and Geoinformation 47: 69–90. https://doi.org/10.
1016/j.jag.2015.12.004.
Avola, G., S. F. Di Gennaro, C. Cantini, E. Riggi, F. Muratore, C. Tornambè, and A. Matese. 2019. “Remotely Sensed
Vegetation Indices to Discriminate Field-Grown Olive Cultivars.” Remote Sensing 11 (10): 1242. https://doi.org/
10.3390/rs11101242.
Birth, G. S., and G. R. McVey. 1968. “Measuring the Color of Growing Turf with a Reflectance Spectrophotometer1.”
Agronomy Journal 60 (6): 640–643. https://doi.org/10.2134/agronj1968.00021962006000060016x.
Blackwell, W. J. 2005. “A Neural-Network Technique for the Retrieval of Atmospheric Temperature and Moisture
Profiles from High Spectral Resolution Sounding Data.” IEEE Transactions on Geoscience and Remote Sensing
43 (11): 2535–2546. https://doi.org/10.1109/TGRS.2005.855071.
Chen, J. M. 1996. “Evaluation of Vegetation Indices and a Modified Simple Ratio for Boreal Applications.” Canadian
Journal of Remote Sensing 22 (3): 229–242. https://doi.org/10.1080/07038992.1996.10855178.
Choi, E., and C. Lee. 2003. “Feature Extraction Based on the Bhattacharyya Distance.” Pattern Recognition 36 (8):
1703–1709. https://doi.org/10.1016/S0031-3203(03)00035-9.
Curran, P. J., J. L. Dungan, and H. L. Gholz. 1990. “Exploring the Relationship between Reflectance red Edge and
Chlorophyll Content in Slash Pine.” Tree Physiology 7 (1-2-3-4): 33–48. https://doi.org/10.1093/treephys/7.1-2-
3-4.33.
Dadrass Javan, F., F. Samadzadegan, S. Mehravar, A. Toosi, R. Khatami, and A. Stein. 2021. “A Review of Image
Fusion Techniques for pan-Sharpening of High-Resolution Satellite Imagery.” ISPRS Journal of
Photogrammetry and Remote Sensing 171: 101–117. https://doi.org/10.1016/j.isprsjprs.2020.11.001.
Dupuis, C., P. Lejeune, A. Michez, and A. Fayolle. 2020. “How Can Remote Sensing Help Monitor Tropical Moist
Forest Degradation?—A Systematic Review.” Remote Sensing 12 (7): 1087. https://doi.org/10.3390/rs12071087.
Fernández-Palacios, J. M., L. De Nascimento, R. Otto, J. D. Delgado, E. García-Del-Rey, J. R. Arévalo, and R. J.
Whittaker. 2011. “A Reconstruction of Palaeo-Macaronesia, with Particular Reference to the Long-Term
Biogeography of the Atlantic Island Laurel Forests.” Journal of Biogeography 38 (2): 226–246. https://doi.org/
10.1111/j.1365-2699.2010.02427.x.
Ferreira, C. C., P. J. Stephenson, M. Gill, and E. C. Regan. 2021. Biodiversity Monitoring and the Role of Scientists in
the Twenty-first Century. Closing the Knowledge-Implementation Gap in Conservation Science: Interdisciplinary
Evidence Transfer Across Sectors and Spatiotemporal Scales, 25–50.
Franklin, S. 2001. Remote Sensing for Sustainable Forest Management. 1st ed. CRC Press.

Frazier, A. E., and B. L. Hemingway. 2021. “A Technical Review of Planet Smallsat Data: Practical Considerations for
Processing and Using PlanetScope Imagery.” Remote Sensing 13 (19): 3930. https://doi.org/10.3390/rs13193930.
Gamon, J., and J. Surfus. 1999. “Assessing Leaf Pigment Content and Activity with a Reflectometer.” The New
Phytologist 143 (1): 105–117. https://doi.org/10.1046/j.1469-8137.1999.00424.x.
Gao, Y., M. Skutsch, J. Paneque-Gálvez, and A. Ghilardi. 2020. “Remote Sensing of Forest Degradation: A Review.”
Environmental Research Letters 15 (10): 103001. https://doi.org/10.1088/1748-9326/abaad7.
Gao, L., X. Wang, B. A. Johnson, Q. Tian, Y. Wang, J. Verrelst, X. Mu, and X. Gu. 2020. “Remote Sensing Algorithms
for Estimation of Fractional Vegetation Cover Using Pure Vegetation Index Values: A Review.” ISPRS Journal of
Photogrammetry and Remote Sensing 159: 364–377. https://doi.org/10.1016/j.isprsjprs.2019.11.018.
Garajonay National Park - UNESCO World Heritage Centre. n.d. Accessed October 19, 2023. https://whc.unesco.
org/en/list/380/.
Giovos, R., D. Tassopoulos, D. Kalivas, N. Lougkos, and A. Priovolou. 2021. “Remote Sensing Vegetation Indices in
Viticulture: A Critical Review.” Agriculture 11 (5): 457. https://doi.org/10.3390/agriculture11050457.
Gitelson, A. A. 2004. “Wide Dynamic Range Vegetation Index for Remote Quantification of Biophysical
Characteristics of Vegetation.” Journal of Plant Physiology 161 (2): 165–173. https://doi.org/10.1078/0176-1617-
01176.
Gitelson, A. A., and M. N. Merzlyak. 1998. “Remote Sensing of Chlorophyll Concentration in Higher Plant Leaves.”
Advances in Space Research 22 (5): 689–692. https://doi.org/10.1016/S0273-1177(97)01133-2.
Gitelson, A. A., M. N. Merzlyak, and O. B. Chivkunova. 2001. “Optical Properties and Nondestructive Estimation of
Anthocyanin Content in Plant Leaves.” Photochemistry and Photobiology 74 (1): 38–45. https://doi.org/10.1562/
0031-8655(2001)074<0038:OPANEO>2.0.CO;2.
Gitelson, A. A., Y. Zur, O. B. Chivkunova, and M. N. Merzlyak. 2002. “Assessing Carotenoid Content in Plant Leaves
with Reflectance Spectroscopy.” Photochemistry and Photobiology 75 (3): 272–281. https://doi.org/10.1562/0031-
8655(2002)075<0272:ACCIPL>2.0.CO;2.
Gokaraju, B., R. A. A. Nobrega, D. A. Doss, A. C. Turlapaty, and R. C. Tesiero. 2017. “Data Fusion of Multi-Source
Satellite Data Sets for Cost-Effective Disaster Management Studies.” Conference Proceedings - IEEE
SOUTHEASTCON. https://doi.org/10.1109/SECON.2017.7925333
Guerini Filho, M., T. M. Kuplich, and F. L. F. D. Quadros. 2020. “Estimating Natural Grassland Biomass by
Vegetation Indices Using Sentinel 2 Remote Sensing Data.” International Journal of Remote Sensing 41 (8):
2861–2876. https://doi.org/10.1080/01431161.2019.1697004.
Haboudane, D., J. R. Miller, E. Pattey, P. J. Zarco-Tejada, and I. B. Strachan. 2004. "Hyperspectral Vegetation Indices and Novel Algorithms for Predicting Green LAI of Crop Canopies: Modeling and Validation in the Context of Precision Agriculture." Remote Sensing of Environment 90 (3): 337–352. https://doi.org/10.1016/j.rse.2003.12.013.
Hatfield, J. L., J. H. Prueger, T. J. Sauer, C. Dold, P. O'Brien, and K. Wacha. 2019. "Applications of Vegetative Indices
from Remote Sensing to Agriculture: Past and Future.” Inventions 4 (4): 71. https://doi.org/10.3390/
inventions4040071.
Hernández-Clemente, R., A. Hornero, M. Mottus, J. Penuelas, V. González-Dugo, J. C. Jiménez, L. Suárez, L. Alonso,
and P. J. Zarco-Tejada. 2019. “Early Diagnosis of Vegetation Health from High-Resolution Hyperspectral and
Thermal Imagery: Lessons Learned from Empirical Relationships and Radiative Transfer Modelling.” Current
Forestry Reports 5 (3): 169–183. https://doi.org/10.1007/s40725-019-00096-1.
Huang, S., L. Tang, J. P. Hupy, Y. Wang, and G. Shao. 2021. "A Commentary Review on the Use of Normalized Difference Vegetation Index (NDVI) in the Era of Popular Remote Sensing." Journal of Forestry Research 32 (1): 1–6. https://doi.org/10.1007/s11676-020-01155-1.
Huete, A. R. 1988. “A Soil-Adjusted Vegetation Index (SAVI).” Remote Sensing of Environment 25 (3): 295–309.
https://doi.org/10.1016/0034-4257(88)90106-X.
Huete, A. R. 2012. “Vegetation Indices, Remote Sensing and Forest Monitoring.” Geography Compass 6 (9): 513–532.
https://doi.org/10.1111/j.1749-8198.2012.00507.x.
Huete, A., K. Didan, T. Miura, E. P. Rodriguez, X. Gao, and L. G. Ferreira. 2002. "Overview of the Radiometric and Biophysical Performance of the MODIS Vegetation Indices." Remote Sensing of Environment 83 (1-2): 195–213. https://doi.org/10.1016/S0034-4257(02)00096-2.
Ji, Z., Y. Pan, X. Zhu, J. Wang, and Q. Li. 2021. “Prediction of Crop Yield Using Phenological Information Extracted
from Remote Sensing Vegetation Index.” Sensors 21 (4): 1406. https://doi.org/10.3390/s21041406.
Joyce, K. E., S. E. Belliss, S. V. Samsonov, S. J. McNeill, and P. J. Glassey. 2009. “A Review of the Status of Satellite
Remote Sensing and Image Processing Techniques for Mapping Natural Hazards and Disasters.” Progress in
Physical Geography 33 (2): 183–207. https://doi.org/10.1177/0309133309339563.
Kaufman, Y., and D. Tanre. 1992. “Atmospherically Resistant Vegetation Index (ARVI) for EOS-MODIS.” IEEE
Transactions on Geoscience and Remote Sensing 30 (2): 261–270. https://doi.org/10.1109/36.134076.
Kerry, R. G., F. J. P. Montalbo, R. Das, S. Patra, G. P. Mahapatra, G. K. Maurya, V. Nayak, et al. 2022. “An Overview of
Remote Monitoring Methods in Biodiversity Conservation.” Environmental Science and Pollution Research 29
(53): 80179–80221. https://doi.org/10.1007/s11356-022-23242-y.
Küchler, A. W., and I. S. Zonneveld, eds. 2012. Vegetation Mapping. Dordrecht: Springer Science & Business Media.
Kureel, N., J. Sarup, S. Matin, S. Goswami, and K. Kureel. 2022. “Modelling Vegetation Health and Stress Using
Hyperspectral Remote Sensing Data." Modeling Earth Systems and Environment 8 (1): 733–748. https://doi.org/
10.1007/s40808-021-01113-8.
Larjavaara, M., X. Lu, X. Chen, and M. Vastaranta. 2021. “Impact of Rising Temperatures on the Biomass of Humid
old-Growth Forests of the World.” Carbon Balance and Management 16 (1): 1–9. https://doi.org/10.1186/s13021-
021-00194-3.
Lausch, A., O. Bastian, S. Klotz, P. J. Leitão, A. Jung, D. Rocchini, M. E. Schaepman, A. K. Skidmore, L. Tischendorf,
and S. Knapp. 2018. “Understanding and Assessing Vegetation Health by in Situ Species and Remote-Sensing
Approaches.” Methods in Ecology and Evolution 9 (8): 1799–1809. https://doi.org/10.1111/2041-210X.13025.
Lausch, A., S. Erasmi, D. J. King, P. Magdon, and M. Heurich. 2016. “Understanding Forest Health with Remote
Sensing -Part I—A Review of Spectral Traits, Processes and Remote-Sensing Characteristics.” Remote Sensing 8
(12): 1029. https://doi.org/10.3390/rs8121029.
Lawrence, D., M. Coe, W. Walker, L. Verchot, and K. Vandecar. 2022. “The Unseen Effects of Deforestation:
Biophysical Effects on Climate.” Frontiers in Forests and Global Change 5: 756115. https://doi.org/10.3389/ffgc.
2022.756115.
Lechner, A. M., G. M. Foody, and D. S. Boyd. 2020. "Applications in Remote Sensing to Forest Ecology and Management." One Earth 2 (5): 405–412. https://doi.org/10.1016/j.oneear.2020.05.001.
Ma, X., C. Li, X. Tong, and S. Liu. 2019. “A New Fusion Approach for Extracting Urban Built-up Areas from
Multisource Remotely Sensed Data.” Remote Sensing 11 (21): 2516. https://doi.org/10.3390/rs11212516.
Main-Knorn, M., B. Pflug, J. Louis, V. Debaecker, U. Müller-Wilm, and F. Gascon. 2017. “Sen2Cor for Sentinel-2.”
Proc. SPIE 10427, Image and Signal Processing for Remote Sensing XXIII 10427 (4 October 2017): 37–48. https://
doi.org/10.1117/12.2278218.
Marcello, J., F. Eugenio, C. Gonzalo-Martin, D. Rodriguez-Esparragon, and F. Marques. 2021. “Advanced Processing
of Multiplatform Remote Sensing Imagery for the Monitoring of Coastal and Mountain Ecosystems.” IEEE Access
9: 6536–6549. https://doi.org/10.1109/ACCESS.2020.3046657.
Marcello, J., F. Eugenio, U. Perdomo, and A. Medina. 2016. “Assessment of Atmospheric Algorithms to Retrieve
Vegetation in Natural Protected Areas Using Multispectral High Resolution Imagery.” Sensors 16 (10): 1624.
https://doi.org/10.3390/s16101624.
Masson-Delmotte, V., P. Zhai, H.-O. Pörtner, D. Roberts, J. Skea, E. Calvo, B. Priyadarshi, et al. 2019. “IPCC Special
Report on Climate Change, Desertification, Land Degradation, Sustainable Land Management, Food Security, and
Greenhouse Gas Fluxes in Terrestrial Ecosystems.” https://spiral.imperial.ac.uk/bitstream/10044/1/76618/2/
SRCCL-Full-Report-Compiled-191128.pdf.
Matsushita, B., W. Yang, J. Chen, Y. Onda, and G. Qiu. 2007. “Sensitivity of the Enhanced Vegetation Index (EVI)
and Normalized Difference Vegetation Index (NDVI) to Topographic Effects: A Case Study in High-Density
Cypress Forest.” Sensors 7 (11): 2636–2651. https://doi.org/10.3390/s7112636.
Medina Machín, A., J. Marcello, A. I. Hernández-Cordero, J. Martín Abasolo, and F. Eugenio. 2019. “Vegetation
Species Mapping in a Coastal-Dune Ecosystem Using High Resolution Satellite Imagery.” GIScience & Remote
Sensing 56 (2): 210–232. https://doi.org/10.1080/15481603.2018.1502910.
Merzlyak, M. N., A. A. Gitelson, O. B. Chivkunova, and V. Y. Rakitin. 1999. “Non-destructive Optical Detection of
Pigment Changes during Leaf Senescence and Fruit Ripening.” Physiologia Plantarum 106 (1): 135–141. https://
doi.org/10.1034/j.1399-3054.1999.106119.x.
Noss, R. F. 1999. “Assessing and Monitoring Forest Biodiversity: A Suggested Framework and Indicators.” Forest
Ecology and Management 115 (2-3): 135–146. https://doi.org/10.1016/S0378-1127(98)00394-6.
Pandey, P. C., and P. Arellano, eds. 2022. Advances in Remote Sensing for Forest Monitoring. Hoboken, NJ: John Wiley & Sons.
Pinty, B., and M. M. Verstraete. 1992. “GEMI: A non-Linear Index to Monitor Global Vegetation from Satellites.”
Vegetatio 101 (1): 15–20. https://doi.org/10.1007/BF00031911.
Pitt, D. G., R. G. Wagner, R. J. Hall, D. J. King, D. G. Leckie, and U. Runesson. 1997. “Use of Remote Sensing for
Forest Vegetation Management: A Problem Analysis.” The Forestry Chronicle 73 (4): 459–476. https://doi.org/
10.5558/TFC73459-4.
Pôças, I., A. Calera, I. Campos, and M. Cunha. 2020. “Remote Sensing for Estimating and Mapping Single and Basal
Crop Coefficients: A Review on Spectral Vegetation Indices Approaches." Agricultural Water Management 233:
106081. https://doi.org/10.1016/j.agwat.2020.106081.
Psomiadis, E., N. Dercas, N. R. Dalezios, and N. V. Spyropoulos. 2016. “The Role of Spatial and Spectral Resolution
on the Effectiveness of Satellite-Based Vegetation Indices.” Proc. SPIE 9998, Remote Sensing for Agriculture,
Ecosystems, and Hydrology XVIII 9998: 509–521. https://doi.org/10.1117/12.2241316.
Rouse, J., R. Haas, and J. Schell. 1973. "Monitoring Vegetation Systems in the Great Plains with ERTS." In The Proceedings of a Symposium Held by Goddard Space Flight Center at Washington, D.C., 10–14. Washington, DC: NASA Scientific and Technical Information Office.
Salgueiro, L., J. Marcello, and V. Vilaplana. 2020. "Super-Resolution of Sentinel-2 Imagery Using Generative Adversarial Networks." Remote Sensing 12 (15): 2424. https://doi.org/10.3390/rs12152424.
Salgueiro, L., J. Marcello, and V. Vilaplana. 2021. "Single-Image Super-Resolution of Sentinel-2 Low Resolution Bands with Residual Dense Convolutional Neural Networks." Remote Sensing 13 (24): 5007. https://doi.org/10.3390/rs13245007.
Santos, A. 1990. “Evergreen Forests in the Macaronesian Region.” Nature and Environment Series 49: 78–84.
Schmitt, M., S. A. Ahmadi, Y. Xu, G. Taskin, U. Verma, F. Sica, and R. Haensch. 2023. "There Are No Data Like More Data: Datasets for Deep Learning in Earth Observation." IEEE Geoscience and Remote Sensing Magazine 11 (3): 63–97. https://doi.org/10.1109/MGRS.2023.3293459.
Shirmard, H., E. Farahbakhsh, R. D. Müller, and R. Chandra. 2022. “A Review of Machine Learning in Processing
Remote Sensing Data for Mineral Exploration.” Remote Sensing of Environment 268: 112750. https://doi.org/10.
1016/j.rse.2021.112750.
Sims, D., and J. Gamon. 2002. "Relationships between Leaf Pigment Content and Spectral Reflectance Across a Wide Range of Species, Leaf Structures and Developmental Stages." Remote Sensing of Environment 81 (2–3): 337–354. https://doi.org/10.1016/S0034-4257(02)00010-X.
Swain, P. H., and R. C. King. 1973. "Two Effective Feature Selection Criteria for Multispectral Remote Sensing." http://docs.lib.purdue.edu/larstech/39.
Tamiminia, H., B. Salehi, M. Mahdianpari, L. Quackenbush, S. Adeli, and B. Brisco. 2020. “Google Earth Engine for
geo-Big Data Applications: A Meta-Analysis and Systematic Review.” ISPRS Journal of Photogrammetry and
Remote Sensing 164: 152–170. https://doi.org/10.1016/j.isprsjprs.2020.04.001.
Tang, L., and G. Shao. 2015. “Drone Remote Sensing for Forestry Research and Practices.” Journal of Forestry
Research 26 (4): 791–797. https://doi.org/10.1007/s11676-015-0088-y.
Thompson, I. D., M. R. Guariguata, K. Okabe, C. Bahamondez, R. Nasi, V. Heymell, and C. Sabogal. 2013. "An Operational Framework for Defining and Monitoring Forest Degradation." Ecology and Society 18 (2): 20. https://doi.org/10.5751/ES-05443-180220.
Tripathi, P., and R. D. Garg. 2021. “Feature Extraction of DESIS and PRISMA Hyperspectral Remote Sensing
Datasets for Geological Applications.” The International Archives of the Photogrammetry, Remote Sensing and
Spatial Information Sciences XLIV-M-3–2021 (M–3): 169–173. https://doi.org/10.5194/isprs-archives-XLIV-M-
3-2021-169-2021.
Vizcaya-Martínez, D. A., F. Flores-de-Santiago, L. Valderrama-Landeros, D. Serrano, R. Rodríguez-Sobreyra, L. F.
Álvarez-Sánchez, and F. Flores-Verdugo. 2022. “Monitoring Detailed Mangrove Hurricane Damage and Early
Recovery Using Multisource Remote Sensing Data.” Journal of Environmental Management 320: 115830.
https://doi.org/10.1016/j.jenvman.2022.115830.
Voigt, S., T. Kemper, T. Riedlinger, R. Kiefl, K. Scholte, and H. Mehl. 2007. “Satellite Image Analysis for Disaster and
Crisis-Management Support.” IEEE Transactions on Geoscience and Remote Sensing 45 (6): 1520–1528. https://doi.
org/10.1109/TGRS.2007.895830.
Wang, P., B. Bayram, and E. Sertel. 2022. “A Comprehensive Review on Deep Learning Based Remote Sensing Image
Super-Resolution Methods.” Earth-Science Reviews 232: 104110. https://doi.org/10.1016/j.earscirev.2022.104110.
Wang, Z., Y. Ma, and Y. Zhang. 2023. “Review of Pixel-Level Remote Sensing Image Fusion Based on Deep
Learning.” Information Fusion 90: 36–58. https://doi.org/10.1016/j.inffus.2022.09.008.
Wang, M., H. Yu, J. Chen, Y. Zhu, R. Neyns, and F. Canters. 2022. “Mapping of Urban Vegetation with High-
Resolution Remote Sensing: A Review.” Remote Sensing 14 (4): 1031. https://doi.org/10.3390/rs14041031.
Wulder, M. A., W. A. Kurz, and M. Gillis. 2004. “National Level Forest Monitoring and Modeling in Canada.”
Progress in Planning 61 (4): 365–381. https://doi.org/10.1016/S0305-9006(03)00069-2.
Xu, D., J. Cheng, S. Xu, J. Geng, F. Yang, H. Fang, J. Xu, et al. 2022. “Understanding the Relationship between China’s
Eco-Environmental Quality and Urbanization Using Multisource Remote Sensing Data.” Remote Sensing 14 (1):
198. https://doi.org/10.3390/rs14010198.
Xu, D., F. Yang, L. Yu, Y. Zhou, H. Li, J. Ma, J. Huang, et al. 2021. "Quantization of the Coupling Mechanism Between Eco-Environmental Quality and Urbanization from Multisource Remote Sensing Data." Journal of Cleaner Production 321: 128948. https://doi.org/10.1016/j.jclepro.2021.128948.
Xue, J., and B. Su. 2017. "Significant Remote Sensing Vegetation Indices: A Review of Developments and Applications." Journal of Sensors 2017: 1353691. https://doi.org/10.1155/2017/1353691.
Zhang, J. 2010. “Multi-source Remote Sensing Data Fusion: Status and Trends.” International Journal of Image and
Data Fusion 1 (1): 5–24. https://doi.org/10.1080/19479830903561035.
Zhang, B., Z. Chen, D. Peng, J. A. Benediktsson, B. Liu, L. Zou, J. Li, and A. Plaza. 2019. “Remotely Sensed Big Data:
Evolution in Model Development for Information Extraction.” Proceedings of the IEEE 107 (12): 2294–2301.
https://doi.org/10.1109/JPROC.2019.2948454.
Zhang, X., F. Wang, W. Wang, F. Huang, B. Chen, L. Gao, S. Wang, et al. 2020. “The Development and Application of
Satellite Remote Sensing for Atmospheric Compositions in China.” Atmospheric Research 245: 105056. https://doi.
org/10.1016/j.atmosres.2020.105056.
