Drought is a recurring natural climatic hazard over terrestrial land; it poses devastating threats to human health, the economy, and the environment. Given the intensifying climate crisis, extreme drought phenomena are likely to become more frequent, and their impacts will probably be more devastating. Drought observations from space therefore play a key role in disseminating timely and accurate information to support early warning, drought management, and mitigation planning, particularly in regions with sparse in-situ data. In this paper, we reviewed drought-related studies based on Earth observation (EO) products in Southeast Asia between 2000 and 2021. The results of this review indicate that drought publications in the region are increasing, with a majority (70%) of the studies undertaken in Vietnam, Thailand, Malaysia, and Indonesia. These countries also accounted for nearly 97% of the economic losses due to drought extremes. Vegetation indices from multispectral optical remote sensing sensors remained a primary data source for drought monitoring in the region. Many studies (~21%) did not provide an accuracy assessment of their drought mapping products, while precipitation was the main data source for validation. We observed a positive association between spatial extent and spatial resolution, with nearly 81% of the articles focusing on the local and national scales. Although drought research interest in the region has increased, challenges remain regarding large-area and long time-series drought measurements, combined drought approaches, machine learning-based drought prediction, and the integration of multi-sensor remote sensing products (e.g., Landsat and Sentinel-2). Satellite EO data could be a substantial part of the future efforts necessary for mitigating drought-related challenges, ensuring food security, establishing a more sustainable economy, and preserving the natural environment in the region.
Public safety and the socio-economic development of the Jharia coalfield (JCF) in India are critically dependent on precise monitoring and a comprehensive understanding of its coal fires, which have been burning underneath for more than a century. This study utilizes the New-Small BAseline Subset (N-SBAS) technique to compute surface deformation time series for 2017–2020 to characterize the spatiotemporal dynamics of coal fires in the JCF. The line-of-sight (LOS) surface deformation estimates from ascending and descending Sentinel-1 SAR data are subsequently decomposed to derive precise vertical subsidence estimates. The most prominent subsidence (~22 cm) is observed in the Kusunda colliery. The subsidence regions also correspond well with the Landsat-8-based thermal anomaly map and field evidence. Subsequently, the vertical surface deformation time series is analyzed to characterize temporal variations within the 9.5 km\(^2\) coal fire area. Results reveal that nearly 10% of the coal fire area is newly formed, while 73% persisted throughout the study period. Vulnerability analyses, performed in terms of the susceptibility of the population to land surface collapse, demonstrate that Tisra, Chhatatanr, and Sijua are the most vulnerable towns. Our results provide critical information for developing early warning systems and remediation strategies.
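The ascending/descending LOS decomposition described above reduces, in its simplest per-pixel form, to solving a small linear system. The sketch below is not the authors' implementation: it assumes a simplified two-component geometry that neglects north-south motion, and the function name and angle values in the test are hypothetical.

```python
import math

def decompose_los(d_asc, d_dsc, inc_asc, inc_dsc):
    """Recover vertical and east-west displacement from ascending and
    descending line-of-sight (LOS) measurements (same units; angles in
    radians). Simplified model neglecting north-south motion:
        d_asc = d_up * cos(inc_asc) + d_east * sin(inc_asc)
        d_dsc = d_up * cos(inc_dsc) - d_east * sin(inc_dsc)
    """
    # 2x2 linear system A @ [d_up, d_east] = [d_asc, d_dsc], via Cramer's rule
    a11, a12 = math.cos(inc_asc), math.sin(inc_asc)
    a21, a22 = math.cos(inc_dsc), -math.sin(inc_dsc)
    det = a11 * a22 - a12 * a21
    d_up = (d_asc * a22 - a12 * d_dsc) / det
    d_east = (a11 * d_dsc - d_asc * a21) / det
    return d_up, d_east
```

With a fixed pair of incidence angles, the same 2x2 inverse could be precomputed once and applied to entire deformation rasters.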
Fresh water is a vital natural resource. Earth observation time-series are well suited to monitor corresponding surface dynamics. The DLR-DFD Global WaterPack (GWP) provides daily information on globally distributed inland surface water based on MODIS (Moderate Resolution Imaging Spectroradiometer) images at 250 m spatial resolution. Operating on this spatiotemporal level comes with the drawback of moderate spatial resolution; only coarse pixel-based surface water quantification is possible. To enhance the quantitative capabilities of this dataset, we systematically access subpixel information on fractional water coverage. For this, a linear mixture model is employed, using classification probability and pure pixel reference information. Classification probability is derived from relative datapoint (pixel) locations in feature space. Pure water and non-water reference pixels are located by combining spatial and temporal information inherent to the time-series. Subsequently, the model is evaluated for different input sets to determine the optimal configuration for global processing and pixel coverage types. The performance of resulting water fraction estimates is evaluated on the pixel level in 32 regions of interest across the globe, by comparison to higher resolution reference data (Sentinel-2, Landsat 8). Results show that water fraction information is able to improve the product's performance regarding mixed water/non-water pixels by an average of 11.6% (RMSE). With a Nash-Sutcliffe efficiency of 0.61, the model shows good overall performance. The approach enables the systematic provision of water fraction estimates on a global and daily scale, using only the reflectance and temporal information contained in the input time-series.
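The linear mixture model at the core of this approach can be sketched in a few lines. This is a schematic two-endmember version, not the GWP processor itself; the reflectance values used below are made up (a dark water endmember and a brighter land endmember, as in the near-infrared):

```python
def water_fraction(pixel_value, pure_water, pure_land):
    """Fractional water coverage from a linear two-endmember mixture model.

    Assumes the observed value is a linear mix of a pure-water and a
    pure-land reference:  pixel = f * water + (1 - f) * land.
    The fraction is clipped to the physically valid range [0, 1].
    """
    if pure_water == pure_land:
        raise ValueError("endmembers must differ")
    f = (pixel_value - pure_land) / (pure_water - pure_land)
    return max(0.0, min(1.0, f))
```

For example, under these assumed endmembers a pixel value halfway between them yields a fraction of about 0.5, and values outside the endmember range are clipped rather than extrapolated.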
Enhancing digital and precision agriculture is now indispensable for overcoming the economic and environmental challenges of agriculture in the 21st century. The purpose of this study was to generate and compare management zones (MZ) based on Sentinel-2 satellite data for the variable-rate application of mineral nitrogen in wheat production, calculated using different remote sensing (RS)-based models under varied soil, yield, and crop data availability. Three models were applied: (1) a modified “RS- and threshold-based clustering”; (2) a “hybrid-based, unsupervised clustering”, in which data from different sources were combined for MZ delineation; and (3) a “RS-based, unsupervised clustering”. Various data processing methods, including machine learning, were used in the model development. Statistical tests such as the paired-sample t-test, Kruskal–Wallis H-test, and Wilcoxon signed-rank test were applied to evaluate the final delineated MZ maps. Additionally, a procedure for improving the models based on information about phenological phases and the occurrence of agricultural drought was implemented. The results showed that information on agronomy and climate enables improving and optimizing MZ delineation. The integration of prior knowledge of new climate conditions (drought) into image selection was tested for effective use of the models; without this information, optimal results could not be obtained. Models that rely solely on remote sensing information are comparatively less expensive than hybrid models. Additionally, remote sensing-based models enable delineating MZ for fertilizer recommendations that are temporally closer to fertilization times.
Supraglacial meltwater accumulation on ice sheets can be a main driver of accelerated ice discharge, mass loss, and global sea-level rise. With further increasing surface air temperatures, meltwater-induced hydrofracturing, basal sliding, or surface thinning will accumulate and most likely trigger unprecedented ice mass loss on the Greenland and Antarctic ice sheets. While the Greenland surface hydrological network, as well as its impacts on ice dynamics and mass balance, has been studied in much detail, Antarctic supraglacial lakes remain understudied, with a circum-Antarctic record of their spatio-temporal development entirely lacking. This study provides the first automated supraglacial lake extent mapping method using Sentinel-1 synthetic aperture radar (SAR) imagery over Antarctica and complements the optical Sentinel-2 supraglacial lake detection algorithm presented in our companion paper. In detail, we propose the use of a modified U-Net for semantic segmentation of supraglacial lakes in single-polarized Sentinel-1 imagery. The convolutional neural network (CNN) is implemented with residual connections for optimized performance as well as an Atrous Spatial Pyramid Pooling (ASPP) module for multiscale feature extraction. The algorithm is trained on 21,200 Sentinel-1 image patches and evaluated on ten spatially or temporally independent test acquisitions. In addition, George VI Ice Shelf is analyzed for intra-annual lake dynamics throughout austral summer 2019/2020, and a decision-level fused Sentinel-1 and Sentinel-2 maximum lake extent mapping product is presented for January 2020, revealing a more complete supraglacial lake coverage (~770 km\(^2\)) than the individual single-sensor products. Classification results confirm the reliability of the proposed workflow with an average Kappa coefficient of 0.925 and an F\(_1\)-score of 93.0% for the supraglacial water class across all test regions.
Furthermore, the algorithm is applied to an additional test region covering supraglacial lakes on the Greenland ice sheet, which further highlights its potential for spatio-temporal transferability. Future work involves the integration of more training data as well as intra-annual analyses of supraglacial lake occurrence across the whole continent, with a focus on supraglacial lake development throughout a summer melt season and into the Antarctic winter.
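The Kappa coefficient and F1-score reported for this workflow both follow from a binary confusion matrix; the sketch below shows the standard formulas (the counts in the test are illustrative, not values from the study):

```python
def binary_metrics(tp, fp, fn, tn):
    """Cohen's Kappa and F1-score for a binary (e.g. water / non-water)
    confusion matrix, as commonly reported for segmentation results."""
    n = tp + fp + fn + tn
    po = (tp + tn) / n  # observed agreement (overall accuracy)
    # chance agreement: product of marginals for each class, summed
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / (n * n)
    kappa = (po - pe) / (1 - pe)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return kappa, f1
```

For a balanced matrix with 90% overall agreement, for instance, this yields Kappa = 0.8 and F1 = 0.9.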
The boreal winter of 2019/2020 was very irregular in Europe. While there was very little snow in Central Europe, the opposite was the case in northern Fenno-Scandia, particularly in the Arctic. The snow cover there was more persistent, and its rapid melting led to flooding in many places. Since the last severe spring floods occurred in the region in 2018, this raises the question of whether more frequent occurrences can be expected in the future. To assess the variability of snowmelt-related flooding, we used snow cover maps (derived from DLR's Global SnowPack MODIS snow product) and freely available data on runoff, precipitation, and air temperature in eight unregulated river catchment areas. A trend analysis (Mann-Kendall test) was carried out to assess the development of the parameters, and the interdependencies of the parameters were examined with a correlation analysis. Finally, a simple snowmelt runoff model was tested for its applicability to this region. We noticed an extraordinary variability in the duration of snow cover. If it extends well into spring, rapid air temperature increases lead to enhanced thawing. Based on the flood years 2005, 2010, 2018, and 2020, we were able to differentiate four synoptic flood types by their specific hydrometeorological and snow situations and to simulate them with the snowmelt runoff model (SRM).
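The Mann-Kendall test used for the trend analysis is straightforward to sketch. The version below is the basic form without a tie correction (a real hydrological analysis would also account for ties and serial correlation):

```python
import math

def mann_kendall(series):
    """Mann-Kendall trend statistic S and its normal approximation Z for
    a time series. Basic form: no tie correction."""
    n = len(series)
    s = 0
    # S counts the sign of every pairwise forward difference
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)
    var_s = n * (n - 1) * (2 * n + 5) / 18  # variance of S under H0 (no ties)
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)  # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z
```

For a strictly increasing series of length 5 this gives S = 10 and Z of roughly 2.2, i.e. a significant upward trend at the 5% level (|Z| > 1.96).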
Recently, locust outbreaks around the world have destroyed agricultural and natural vegetation and caused massive damage, endangering food security. Unusually heavy rainfalls in habitats of the desert locust (Schistocerca gregaria) and a lack of monitoring due to political conflicts or the inaccessibility of those habitats led to massive desert locust outbreaks and swarms migrating over the Arabian Peninsula, East Africa, India, and Pakistan. At the same time, swarms of the Moroccan locust (Dociostaurus maroccanus) in some Central Asian countries and swarms of the Italian locust (Calliptamus italicus) in Russia and China destroyed crops despite developed and ongoing monitoring and control measures. These recent events underline that the risk and damage caused by locust pests are as present as ever and affect 100 million human lives, despite technical progress in locust monitoring, prediction, and control approaches. Remote sensing has become one of the most important data sources in locust management. Since the 1980s, remote sensing data and applications have accompanied many locust management activities and contributed to improved and more effective control of locust outbreaks and plagues. Recently, open-access remote sensing data archives as well as progress in cloud computing have provided unprecedented opportunities for remote sensing-based locust management and research. Additionally, unmanned aerial vehicle (UAV) systems open up new prospects for more effective and faster locust control. Nevertheless, the full capacity of available remote sensing applications and possibilities has not been exploited yet. This review paper provides a comprehensive and quantitative overview of international research articles focusing on remote sensing applications for locust management and research.
We reviewed 110 articles published over the last four decades and categorized them into different aspects and main research topics to summarize achievements and gaps for further research and application development. The results reveal a strong focus on three species — the desert locust, the migratory locust (Locusta migratoria), and the Australian plague locust (Chortoicetes terminifera) — and corresponding regions of interest. There is still a lack of international studies for other pest species such as the Italian locust, the Moroccan locust, the Central American locust (Schistocerca piceifrons), the South American locust (Schistocerca cancellata), the brown locust (Locustana pardalina) and the red locust (Nomadacris septemfasciata). In terms of applied sensors, most studies utilized the Advanced Very-High-Resolution Radiometer (AVHRR), Satellite Pour l’Observation de la Terre VEGETATION (SPOT-VGT), and the Moderate-Resolution Imaging Spectroradiometer (MODIS), as well as Landsat data, focusing mainly on vegetation monitoring or land cover mapping. The application of geomorphological metrics as well as radar-based soil moisture data is comparably rare, despite previous acknowledgement of their importance for locust outbreaks. Despite great advances in the usage of available remote sensing resources, we identify several gaps and potential for future research to further improve the understanding and capacities of remote sensing in supporting locust outbreak research and management.
Effects of climate change-induced events on forest ecosystem dynamics of composition, function and structure call for increased long-term, interdisciplinary and integrated research on biodiversity indicators, in particular within strictly protected areas with extensive non-intervention zones. The long-established concept of forest supersites generally relies on long-term funds from national agencies and goes beyond the logistic and financial capabilities of state- or region-wide protected area administrations, universities and research institutes.
We introduce the concept of data pools as a smaller-scale, user-driven and reasonable alternative for co-developing remote sensing and forest ecosystem science into validated products, biodiversity indicators and management plans. We demonstrate this concept with the Bohemian Forest Ecosystem Data Pool, which has been established as an interdisciplinary, international data pool within the strictly protected Bavarian Forest and Šumava National Parks and currently comprises 10 active partners. We demonstrate how the structure and impact of the data pool differ from comparable cases.
We assessed the international influence and visibility of the data pool with the help of a systematic literature search and a brief analysis of the results. The results primarily suggest an increase in the impact and visibility of published material during the life span of the data pool, with the highest visibility achieved by research on leaf traits, vegetation phenology and 3D-based forest inventory.
We conclude that the data pool makes an efficient contribution to the concept of a global biodiversity observatory by evolving towards a training platform, functioning as a pool of data and algorithms, communicating directly with management for implementation, and providing test fields for feasibility studies on Earth observation missions.
Supraglacial lakes can have a considerable impact on ice sheet mass balance and global sea-level rise through ice shelf fracturing and subsequent glacier speedup. In Antarctica, the distribution and temporal development of supraglacial lakes, as well as their potential contribution to increased ice mass loss, remain largely unknown, requiring a detailed mapping of the Antarctic surface hydrological network. In this study, we employ a machine learning algorithm trained on Sentinel-2 and auxiliary TanDEM-X topographic data for the automated mapping of Antarctic supraglacial lakes. To ensure the spatio-temporal transferability of our method, a Random Forest was trained on 14 training regions and applied over eight spatially independent test regions distributed across the whole Antarctic continent. In addition, we employed our workflow for large-scale application over Amery Ice Shelf, where we calculated interannual supraglacial lake dynamics between 2017 and 2020 at full ice shelf coverage. To validate our supraglacial lake detection algorithm, we randomly created point samples over our classification results and compared them to Sentinel-2 imagery. The point comparisons were evaluated using a confusion matrix for the calculation of selected accuracy metrics. Our analysis revealed widespread supraglacial lake occurrence in all three Antarctic regions. For the first time, we identified supraglacial meltwater features on the Abbott, Hull and Cosgrove Ice Shelves in West Antarctica, as well as for the entire Amery Ice Shelf for the years 2017–2020. Over Amery Ice Shelf, the maximum lake extent varied strongly between the years, with the 2019 melt season characterized by the largest areal coverage of supraglacial lakes (~763 km\(^2\)). The accuracy assessment over the test regions revealed an average Kappa coefficient of 0.86, with the largest Kappa value (0.98) reached over George VI Ice Shelf.
Future developments will involve the generation of circum-Antarctic supraglacial lake mapping products as well as their use for further methodological developments using Sentinel-1 SAR data in order to characterize intra-annual supraglacial meltwater dynamics also during polar night and independently of meteorological conditions. In summary, the implementation of the Random Forest classifier enabled the development of the first automated mapping method applied to Sentinel-2 data distributed across all three Antarctic regions.
Forests in Germany cover around 11.4 million hectares, a share of 32% of Germany's surface area, and thus shape the character of the country's cultural landscape. Germany's forests fulfil a variety of functions for nature and society, and also play an important role in the context of climate levelling. Climate change, manifested in rising temperatures and current weather extremes, has a negative impact on the health and development of forests. Within the last five years, severe storms, extreme drought and heat waves, and the subsequent mass reproduction of bark beetles have seriously affected Germany's forests. Facing the current dramatic extent of forest damage and the emerging long-term consequences, the effort to preserve forests in Germany, along with their diversity and productivity, is an indispensable task for the government. Several German ministries have initiated, and plan to initiate, measures supporting forest health. Quantitative data are one basis for sound decision-making, for monitoring the forest, and for improving the monitoring of forest damage. In addition to existing forest monitoring systems, such as the federal forest inventory, the national crown condition survey, and the national forest soil inventory, systematic surveys of forest condition and vulnerability at the national scale can be expanded with the help of satellite-based Earth observation. In this review, we analysed and categorized all research studies published in the last 20 years that focus on the remote sensing of forests in Germany. For this study, 166 citation-indexed research publications were thoroughly analysed with respect to publication frequency, location of the studies undertaken, spatial and temporal scale and coverage of the studies, satellite sensors employed, thematic foci of the studies, and overall outcomes, allowing us to identify major research and geoinformation product gaps.
A disease is non-communicable when it is not transferred from one person to another. Typical examples include all types of cancer, diabetes, stroke, or allergies, as well as mental diseases. Non-communicable diseases have at least two things in common — environmental impact and chronicity. These diseases are often associated with reduced quality of life, a higher rate of premature deaths, and negative impacts on a country's economy due to healthcare costs and a missing workforce. Additionally, they affect the individual's immune system, which increases susceptibility to communicable diseases, such as the flu or other viral and bacterial infections. Thus, mitigating the effects of non-communicable diseases is one of the most pressing issues of modern medicine, healthcare, and governments in general. Apart from the predisposition toward such diseases (the genome), their occurrence is associated with the environmental parameters that people are exposed to (the exposome). Exposure to stressors such as bad air or water quality, noise, extreme heat, or an overall unnatural surrounding all impact susceptibility to non-communicable diseases. In the identification of such environmental parameters, geoinformation products derived from Earth Observation data acquired by satellites play an increasingly important role. In this paper, we present a review of the joint use of Earth Observation data and public health data for research on non-communicable diseases. We analyzed 146 articles from peer-reviewed journals (Impact Factor ≥ 2) from all over the world that included both Earth Observation data and public health data in their assessments. Our results show that this field of synergistic geohealth analyses is still relatively young, with most studies published within the last five years and within national boundaries.
While the contribution of Earth Observation, and especially of remote sensing-derived geoinformation products on land surface dynamics, is on the rise, there is still huge potential for transdisciplinary integration into studies. We see the necessity for future research and advocate the increased incorporation of thematically profound remote sensing products with high spatial and temporal resolution into the mapping of exposomes, and thus into the vulnerability and resilience assessment of populations regarding non-communicable diseases.
Land cover is a key variable in monitoring applications, and new processing technologies have made deriving this information easier. Yet, classification algorithms remain dependent on samples collected in the field, and field campaigns are limited by financial, infrastructural and political boundaries. Here, animal tracking data could be an asset. By looking at the land cover dependencies of animal behaviour, we can obtain land cover samples over places that are difficult to access. Following this premise, we evaluated the potential of animal movement data to map land cover. Specifically, we used 13 White Stork (Ciconia ciconia) individuals of the same population to map agriculture within three test regions distributed along their migratory track. The White Stork has adapted to foraging over agricultural lands, making it an ideal source of samples to map this land use. We applied a presence-absence modelling approach over a Normalized Difference Vegetation Index (NDVI) time series and validated our classifications with high-resolution land cover information. Our results suggest White Stork movement is useful to map agriculture; however, we identified some limitations. We achieved high accuracies (F1-scores > 0.8) for two test regions, but observed poor results over one region. This can be explained by differences in land management practices. The animals preferred agriculture in every test region, but our data showed a biased distribution of training samples between irrigated and non-irrigated land. When both options occurred, the animals disregarded non-irrigated land, leading to its misclassification as non-agriculture. Additionally, we found differences between the GPS observation dates and the harvest times for non-irrigated crops. Given that the White Stork takes advantage of managed land to search for prey, the inactivity of these fields was the likely culprit of their underrepresentation.
Including more species that are attracted to agriculture, with other land-use dependencies and observation times, can contribute to better results in similar applications.
Large-area remote sensing time-series offer unique features for the extensive investigation of our environment. Since various error sources exist in the acquisition chain of such datasets, only properly validated results can be of value for research and downstream decision processes. This review presents an overview of validation approaches concerning temporally dense time-series of land surface geo-information products that cover the continental to global scale. Categorization according to the utilized validation data revealed that product intercomparisons and comparison to reference data are the conventional validation methods. The reviewed studies are mainly based on optical sensors and oriented towards global coverage, with vegetation-related variables as the focus. Trends indicate an increase in remote sensing-based studies that feature long-term datasets of land surface variables. The corresponding validation efforts, however, show only minor methodological diversification over the past two decades. To sustain comprehensive and standardized validation efforts, the provision of spatiotemporally dense validation data, which allows estimating the actual differences between measurement and the true state, has to be maintained. The promotion of novel approaches can, on the other hand, prove beneficial for various downstream applications, although typically only theoretical uncertainties are provided.
Advances in the remote inventory and analysis of forest resources during the last decade have reached a level at which they can now be considered a crucial complement, if not a surrogate, to the long-existing field-based methods. This is reflected not only in the use of new multiple-band active and passive remote sensing data for forest inventory, but also in the methodological and algorithmic developments and/or adoptions that aim at maximizing predictive or calibration performance, thereby minimizing both random and systematic errors, in particular for multi-scale spatial domains. With this in mind, this editorial note wraps up the recently published Remote Sensing special issue “Remote Sensing-Based Forest Inventories from Landscape to Global Scale”, which hosted a set of state-of-the-art experiments on the remotely sensed inventory of forest resources conducted by a number of prominent researchers worldwide.
Regardless of political boundaries, river basins are a functional unit of the Earth’s land surface and provide an abundance of resources for the environment and humans. They support livelihoods through the typical characteristics of large river basins, such as the provision of freshwater, irrigation water, and transport opportunities. At the same time, they are impacted, e.g., by human-induced environmental changes, boundary conflicts, and upstream–downstream inequalities. In the framework of water resource management, the monitoring of river basins is therefore of high importance, in particular for researchers, stakeholders and decision-makers. However, the land surface and surface water properties of many major river basins remain largely unmonitored at the basin scale. Several inventories exist, yet consistent spatial databases describing the status of major river basins at the global scale are lacking. Here, Earth observation (EO) is a potential source of spatial information providing large-scale data on the status of land surface properties. This review provides a comprehensive overview of existing research articles analyzing major river basins primarily using EO. Furthermore, this review proposes to exploit EO data together with relevant open global-scale geodata to establish such a database and to enable consistent spatial analyses evaluating past and current states of major river basins.
Forest ecosystems fulfill a whole host of ecosystem functions that are essential for life on our planet. However, an unprecedented level of anthropogenic influence is reducing the resilience and stability of our forest ecosystems as well as their ecosystem functions. The relationships between drivers, stress, and ecosystem functions in forest ecosystems are complex, multi-faceted, and often non-linear, and yet forest managers, decision makers, and politicians need to be able to make rapid, data-driven decisions based on short- and long-term monitoring information, complex modeling, and analysis approaches. A huge number of long-standing and standardized forest health inventory approaches already exist, and these are increasingly integrating remote sensing-based monitoring approaches. Unfortunately, these approaches to monitoring, data storage, analysis, prognosis, and assessment still do not satisfy the future requirements of information and digital knowledge processing of the 21st century. Therefore, this paper discusses and presents in detail five sets of requirements, including their relevance, necessity, and possible solutions, that would be necessary for establishing a feasible multi-source forest health monitoring network for the 21st century. Namely, these requirements are: (1) understanding the effects of multiple stressors on forest health; (2) using remote sensing (RS) approaches to monitor forest health; (3) coupling different monitoring approaches; (4) using data science as a bridge between complex and multidimensional big forest health (FH) data; and (5) a future multi-source forest health monitoring network. It became apparent that no existing monitoring approach, technique, model, or platform is sufficient on its own to monitor, model, forecast, or assess forest health and its resilience.
In order to advance the development of a multi-source forest health monitoring network, we argue that in order to gain a better understanding of forest health in our complex world, it would be conducive to implement the concepts of data science with the components: (i) digitalization; (ii) standardization with metadata management after the FAIR (Findability, Accessibility, Interoperability, and Reusability) principles; (iii) Semantic Web; (iv) proof, trust, and uncertainties; (v) tools for data science analysis; and (vi) easy tools for scientists, data managers, and stakeholders for decision-making support.
Burkina Faso ranks amongst the fastest growing countries in the world, with an annual population growth rate of more than three percent. This trend has consequences for food security, since agricultural productivity is still at a comparatively low level in Burkina Faso. To compensate for the low productivity, the agricultural areas are expanding quickly. The mapping and monitoring of this expansion are difficult, even on the basis of remote sensing imagery, since the extensive farming practices and frequent cloud coverage in the area make the delineation of cultivated land from other land cover and land use types a challenging task. However, as the rapidly increasing population could have considerable effects on the natural resources and on the regional development of the country, methods for the improved mapping of land use and land cover change (LULCC) are needed. For this study, we applied the newly developed ESTARFM (Enhanced Spatial and Temporal Adaptive Reflectance Fusion Model) framework to generate high temporal (8-day) and high spatial (30 m) resolution NDVI time series for all of Burkina Faso for the years 2001, 2007, and 2014. For this purpose, more than 500 Landsat scenes and 3000 MODIS scenes were processed with this automated framework. The generated ESTARFM NDVI time series enabled the extraction of per-pixel phenological features that together served as input for the delineation of agricultural areas via random forest classification at 30 m spatial resolution for entire Burkina Faso and the three years. For training and validation, a randomly sampled reference dataset was generated from Google Earth images based on expert knowledge. The overall accuracies of 92% (2001), 91% (2007), and 91% (2014) indicate that the applied methodology works well. The results show an expansion of agricultural area of 91% between 2001 and 2014, to a total of 116,900 km\(^2\).
While rainfed agricultural areas account for the major part of this trend, irrigated areas and plantations also increased considerably, primarily promoted by specific development projects. This expansion parallels the rapid population growth in most provinces of Burkina Faso where land was still available for agricultural expansion. The analysis of agricultural encroachment into protected areas and their surroundings highlights the increasing human pressure on these areas and the future challenges of environmental protection.
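The mapping workflow described in this abstract, deriving per-pixel phenological features from an NDVI time series and classifying them with a random forest, can be sketched as follows. This is a minimal illustration on synthetic NDVI series; the chosen features (seasonal maximum, amplitude, peak timing, integral) and model parameters are assumptions for demonstration, not the study's exact configuration:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def phenology_features(ndvi_series):
    """Per-pixel phenological features from an 8-day NDVI time series:
    seasonal maximum, amplitude, timing of the peak, and the integral
    (a proxy for seasonal productivity)."""
    return np.column_stack([
        ndvi_series.max(axis=1),
        ndvi_series.max(axis=1) - ndvi_series.min(axis=1),
        ndvi_series.argmax(axis=1),
        ndvi_series.sum(axis=1),
    ])

# Synthetic stand-in: 400 pixels x 46 eight-day composites per year.
t = np.linspace(0, 1, 46)
crop = 0.2 + 0.6 * np.exp(-((t - 0.6) ** 2) / 0.01) + rng.normal(0, 0.03, (200, 46))
other = 0.35 + 0.1 * np.sin(2 * np.pi * t) + rng.normal(0, 0.03, (200, 46))
X = phenology_features(np.vstack([crop, other]))
y = np.repeat([1, 0], 200)  # 1 = agriculture, 0 = other land cover

# Train on every second pixel, evaluate on the rest.
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[::2], y[::2])
acc = clf.score(X[1::2], y[1::2])
print(f"hold-out accuracy: {acc:.2f}")
```

The sharp seasonal peak of the synthetic cropland pixels makes the two classes separable by amplitude and peak timing, which is the same intuition that makes phenological features effective for delineating extensively farmed land.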
Maize cropping systems mapping using RapidEye observations in agro-ecological landscapes in Kenya
(2017)
Spatially explicit information on cropping systems is an important but rarely available variable in many crop modeling routines, and it is of utmost importance for understanding pest and disease propagation mechanisms in agro-ecological landscapes. In this study, high spatial and temporal resolution bi-temporal RapidEye data were utilized within a novel two-step hierarchical random forest (RF) classification approach to map areas of mono- and mixed-maize cropping systems. A small-scale maize farming site in Machakos County, Kenya served as the study site. Within the study site, field data on general land use/land cover (LULC) and the two cropping systems were collected during the satellite acquisition period. First, non-cropland areas were masked out using the LULC mapping result (first classification step). Subsequently, an optimized RF model was applied to the remaining cropland layer to map the two cropping systems (second classification step). An overall accuracy of 93% was attained for the LULC classification, while the class accuracies (PA: producer’s accuracy and UA: user’s accuracy) for the two cropping systems were consistently above 85%. We concluded that explicit mapping of different cropping systems is feasible in complex and highly fragmented agro-ecological landscapes if high-resolution, multi-temporal satellite data such as 5 m RapidEye imagery are employed. Further research is needed on the feasibility of using freely available 10–20 m Sentinel-2 data for wide-area assessment of cropping systems as an important variable in numerous crop productivity models.
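The two-step hierarchical classification described above can be sketched as follows: a first RF separates general LULC classes and its cropland predictions act as a mask, after which a second RF is trained only on the cropland layer. All features, class names, and separations here are synthetic assumptions standing in for the RapidEye bands and field labels:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

# Synthetic stand-in for RapidEye features: three LULC classes, with two
# cropping systems (mono- vs. mixed-maize) hidden inside the cropland class.
n = 300
means = {"forest": [0, 0], "water": [4, 0], "cropland": [0, 4]}
X = np.vstack([rng.normal(m, 0.5, (n, 2)) for m in means.values()])
lulc = np.repeat(list(means), n)

# Step 1: general LULC classification; its cropland predictions are the mask.
step1 = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, lulc)
crop_mask = step1.predict(X) == "cropland"

# Illustrative cropping-system labels (only the cropland pixels are used).
system = np.where(X[:, 0] + rng.normal(0, 0.3, len(X)) > 0, "mixed", "mono")

# Step 2: a second, separately optimized RF applied to the cropland layer.
step2 = RandomForestClassifier(n_estimators=100, random_state=0).fit(
    X[crop_mask], system[crop_mask])
acc = step2.score(X[crop_mask], system[crop_mask])
print(f"cropping-system training accuracy: {acc:.2f}")
```

Restricting the second classifier to the cropland mask is the point of the hierarchy: it never has to distinguish cropping systems from forest or water, so its decision boundary can specialize on the subtler within-cropland differences.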
Schistosomiasis is a widespread water-based disease that puts close to 800 million people at risk of infection, with more than 250 million infected, mainly in sub-Saharan Africa. Transmission is governed by the spatial distribution of specific freshwater snails that act as intermediate hosts and by the frequency, duration and extent of human bodies exposed to infested water sources during human water contact. Remote sensing data have been utilized for spatially explicit risk profiling of schistosomiasis. Such risk profiling has a conceptual drawback, however, if school-based disease prevalence data are related directly to remote sensing measurements extracted at the location of the school, because disease transmission usually does not occur at the school itself. We therefore took the local environment around the schools into account by explicitly linking ecologically relevant environmental information at potential disease transmission sites to survey measurements of disease prevalence. Our models were validated at two sites with different landscapes in Côte d’Ivoire using high- and moderate-resolution remote sensing data based on random forest and partial least squares regression. We found that the ecologically relevant modelling approach explained up to 70% of the variation in Schistosoma infection prevalence and performed better than a purely pixel-based modelling approach. Furthermore, our study showed that model performance increased as a function of enlarging the school catchment area, confirming the hypothesis that suitable environments for schistosomiasis transmission rarely occur at the location of survey measurements.
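The catchment-area effect reported above can be reproduced in miniature: if prevalence is driven by environmental conditions around a school rather than at its exact pixel, a predictor aggregated over a catchment buffer explains more variance than the pixel value alone. Everything below is a synthetic sketch; the grid, the single environmental layer, and the square-window "catchment" are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic landscape: one environmental layer on a grid, and schools whose
# infection prevalence depends on conditions *around* them, not at the pixel.
env = rng.normal(0, 1, (100, 100))
schools = rng.integers(10, 90, (60, 2))

def catchment_mean(layer, yx, radius):
    """Mean of the layer within a square window, an illustrative stand-in
    for a circular catchment buffer around a school."""
    y, x = yx
    return layer[y - radius:y + radius + 1, x - radius:x + radius + 1].mean()

# Simulated prevalence driven by the 5-pixel neighbourhood, plus survey noise.
truth = np.array([catchment_mean(env, s, 5) for s in schools])
prev = 0.3 + 0.2 * truth + rng.normal(0, 0.005, len(schools))

def r2(radius):
    """Squared correlation between the buffered predictor and prevalence."""
    x = np.array([catchment_mean(env, s, radius) for s in schools])
    return np.corrcoef(x, prev)[0, 1] ** 2

print(f"pixel-based r2: {r2(0):.2f}, catchment-based r2: {r2(5):.2f}")
```

A plain squared correlation stands in here for the random forest and partial least squares models of the study; the qualitative conclusion, that skill grows with the catchment radius up to the true transmission neighbourhood, is the same.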
Background
Schistosomiasis is the most widespread water-based disease in sub-Saharan Africa. Transmission is governed by the spatial distribution of specific freshwater snails that act as intermediate hosts and human water contact patterns. Remote sensing data have been utilized for spatially explicit risk profiling of schistosomiasis. We investigated the potential of remote sensing to characterize habitat conditions of parasite and intermediate host snails and discuss the relevance for public health.
Methodology
We employed high-resolution remote sensing data, environmental field measurements, and ecological data to model environmental suitability for schistosomiasis-related parasite and snail species. The model was developed for Burkina Faso using a habitat suitability index (HSI). The plausibility of remote sensing habitat variables was validated using field measurements. The established model was transferred to different ecological settings in Côte d’Ivoire and validated against readily available survey data from school-aged children.
Principal Findings
Environmental suitability for schistosomiasis transmission was spatially delineated and quantified by seven habitat variables derived from remote sensing data. The strengths and weaknesses highlighted by the plausibility analysis showed that temporally dynamic water and vegetation measures were particularly useful for modelling parasite and snail habitat suitability, whereas water surface temperature and topographic variables did not perform adequately. The transferred model showed significant relations between the HSI and infection prevalence at the study sites in Côte d’Ivoire.
Conclusions/Significance
A predictive map of environmental suitability for schistosomiasis transmission can support measures to gain and sustain control. This is particularly relevant as emphasis is shifting from morbidity control to interrupting transmission. Further validation of our mechanistic model needs to be complemented by field data of parasite- and snail-related fitness. Our model provides a useful tool to monitor the development of new hotspots of potential schistosomiasis transmission based on regularly updated remote sensing data.
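A habitat suitability index of the kind used in this study combines per-variable suitability scores into a single pixel value. The sketch below uses a geometric mean, a common HSI formulation chosen here as an assumption (the paper's exact aggregation rule and its seven variable names are not reproduced; the keys below are illustrative stand-ins):

```python
import numpy as np

def hsi(scores):
    """Geometric mean of per-variable suitability scores in [0, 1].
    Any fully unsuitable variable (score 0) forces the index to 0."""
    s = np.asarray(scores, dtype=float)
    return float(np.prod(s) ** (1.0 / len(s)))

# Hypothetical per-variable suitability scores for one pixel.
pixel = {
    "water_persistence": 0.9,
    "vegetation_cover": 0.7,
    "water_temperature": 0.8,
    "distance_to_water": 0.6,
    "slope": 0.9,
    "flow_velocity": 0.5,
    "seasonality": 0.8,
}
value = hsi(list(pixel.values()))
print(f"HSI = {value:.2f}")
```

The multiplicative form encodes the ecological logic of such indices: a snail habitat that fails on one essential condition (e.g. no persistent water) is unsuitable regardless of how favourable the remaining variables are, which an additive average would mask.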