In a rapidly evolving digital landscape, satellite and drone-based remote sensing (RS) technologies are emerging as powerful tools for understanding agricultural systems. By capturing multi-resolution, multi-temporal, and multi-spectral imagery of the Earth's surface, RS enables researchers to monitor land use changes, assess crop mapping and productivity, evaluate environmental impacts, and ultimately inform policy and innovation at scale.
The Standing Panel on Impact Assessment (SPIA), part of CGIAR, has been supporting the integration of remote sensing into innovation evaluation across several country studies. Recent webinars brought together researchers working at the intersection of remote sensing, agricultural innovation, and impact evaluation. The sessions featured three case studies: one from Ethiopia presented by Binyam Tesfaw (Addis Ababa University) and Mariana Belgiu (University of Twente), one from Vietnam presented by David Wüpper (University of Bonn), and one from Bangladesh presented by Jeff Michler and Beth Tellman (University of Arizona). Together, these illustrated how tools such as RS can help us better understand the adoption and effects of climate-smart agriculture and other innovations.
Advancing agricultural and environmental mapping in Ethiopia
The Ethiopia country study, led by Binyam Tesfaw (Addis Ababa University) and Mariana Belgiu (University of Twente), is a showcase of methodological ambition and technological integration.
Their RS approach combines satellite imagery (Sentinel, Landsat, PlanetScope), unmanned aerial vehicles (UAVs), and field survey data to address three core objectives: (i) assessing food security through yield estimation; (ii) mapping forage crop expansion, estimating dry matter productivity and carrying capacity for animal units, and evaluating changes in landscape configuration and environmental conditions associated with the spread of forage grasses; and (iii) evaluating natural resource management (NRM) practices by mapping and monitoring intercropping and residue cover.
Several innovations stand out:
- Crop yield prediction: Field boundaries are delineated and crops classified using deep learning on high-resolution UAV, WorldView, and PlanetScope imagery. Machine learning models are trained on field samples and crop survey data, environmental covariates (e.g., temperature, rainfall), and multi-temporal imagery (Sentinel-2, PlanetScope) to predict yields for crops like maize and wheat.
- Fodder crop monitoring and dry matter productivity: By integrating field labels, UAV-based visual interpretation, multi-temporal optical and radar images, dry matter productivity datasets, and ancillary data (e.g., slope, temperature), the team will monitor fodder crops and estimate carrying capacity for livestock.
- Mapping intercropping systems: Algorithms are trained to differentiate mixed cropping arrangements using phenological patterns and spectral-temporal signatures, despite the complexity of overlapping plant species, and to estimate the percentage of residue cover at large scale.
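As an illustration of the yield-prediction step described above, the sketch below regresses plot-level yield on a single spectral covariate (seasonal mean NDVI) using ordinary least squares. All values are hypothetical; the actual workflow uses multi-temporal imagery, many covariates, and richer machine-learning models.

```python
# Minimal sketch: predict crop yield (t/ha) from seasonal mean NDVI per plot.
# All numbers are hypothetical; real pipelines use multi-temporal Sentinel-2 /
# PlanetScope features and trained ML models rather than a one-variable fit.

def fit_ols(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

# Hypothetical training plots: (mean NDVI, observed maize yield in t/ha)
ndvi = [0.45, 0.55, 0.60, 0.70, 0.75]
yield_tha = [2.1, 2.9, 3.2, 4.0, 4.3]

a, b = fit_ols(ndvi, yield_tha)
pred = a + b * 0.65  # predicted yield for an unsurveyed plot with NDVI 0.65
```

In practice, such models are validated against held-out crop-cut or survey data before being applied wall-to-wall.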
Ethiopia’s work is further enriched by a strong emphasis on data quality, reproducibility, and methodological transparency. The team is committed to sharing annotated datasets, training materials, and code to support learning and collaboration across countries.
Using Remote Sensing to Track Climate-Smart Agriculture in Vietnam
David Wüpper’s project showed an ambitious research agenda to evaluate the impact of climate-smart agricultural practices in Vietnam, specifically alternate wetting and drying (AWD) in rice cultivation. This technique, which involves periodically draining rice paddies, is known to reduce both water use and methane emissions. Remote sensing, Wüpper argued, offers a promising tool for monitoring such practices at scale without relying solely on costly or limited household surveys.
Wüpper outlined how satellite data can be used not only to track AWD adoption but also to estimate yields, detect environmental changes such as pollution or algae blooms, and monitor forest degradation for evaluating payments for ecosystem services. The research aims to use high-frequency satellite imagery in combination with econometric methods like difference-in-differences to assess the diffusion and impact of new technologies.
Still, there are challenges. Persistent cloud cover in tropical regions like Vietnam limits the usefulness of optical satellite data, so the team plans to rely on radar data from Sentinel-1, which penetrates clouds, and to triangulate remote sensing results with survey data to better understand farmer behavior. Field-level validation remains crucial for accurate assessments, and Wüpper emphasized the need to move beyond simple binary measures of adoption and to link remote sensing with on-the-ground implementation details.
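The difference-in-differences logic mentioned above reduces, in its simplest two-group, two-period form, to comparing the change in a remotely sensed outcome for adopters against the change for non-adopters. The numbers below are illustrative, not from the Vietnam study.

```python
# Minimal two-group, two-period difference-in-differences sketch.
# Outcome: a remotely sensed proxy (e.g., mean EVI per commune).
# All numbers are illustrative; the real analysis uses panel regressions
# over many communes and time periods.

def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical commune-level outcomes before/after AWD rollout
treated_pre, treated_post = [0.42, 0.44, 0.40], [0.50, 0.53, 0.49]
control_pre, control_post = [0.41, 0.43, 0.42], [0.44, 0.45, 0.43]

# DiD estimate: change among adopters minus change among non-adopters,
# netting out trends common to both groups
did = (mean(treated_post) - mean(treated_pre)) - \
      (mean(control_post) - mean(control_pre))
```

The identifying assumption is that, absent adoption, both groups would have followed parallel trends, which is why triangulation with survey data matters.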
Measuring the Impact of Submergence-Tolerant Rice in Bangladesh
The Bangladesh country study (2019-2023), led by Jeff Michler and Beth Tellman, examined the large-scale rollout of submergence-tolerant rice varieties, specifically Swarna-Sub1, introduced in 2010. These varieties can survive short periods of flooding, a critical feature in flood-prone regions. However, identifying the impact of such technologies at scale presents a unique challenge, especially in data-poor settings with little historical information on where and how adoption took place.
To address this, Michler and Tellman’s team combined Earth observation data with three waves of household panel surveys. They developed deep learning algorithms using convolutional neural networks (CNNs) to overcome the limitations of older satellite data like MODIS, which is lower resolution and often obstructed by clouds. Their approach involved training models on Sentinel-1 radar imagery to reconstruct past flooding patterns with greater accuracy.
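The core building block of the CNNs mentioned above is the 2-D convolution, in which a small filter slides over an image to produce a feature map. The toy example below applies a fixed vertical-edge filter to a tiny grid standing in for radar backscatter (low over water, high over land); real flood-mapping models stack many such filters with learned weights.

```python
# Toy 2-D convolution: the basic operation inside a CNN.
# The grid and filter are illustrative, not from the Bangladesh study.

def conv2d(img, kernel):
    """Valid (no-padding) 2-D convolution of img by a square kernel."""
    k = len(kernel)
    h, w = len(img), len(img[0])
    out = []
    for i in range(h - k + 1):
        row = []
        for j in range(w - k + 1):
            row.append(sum(img[i + di][j + dj] * kernel[di][dj]
                           for di in range(k) for dj in range(k)))
        out.append(row)
    return out

# Toy 5x5 "backscatter" grid: water (0) on the left, land (1) on the right
img = [[0, 0, 1, 1, 1]] * 5
edge = [[-1, 0, 1]] * 3  # fixed vertical-edge filter

# High responses in the feature map mark the water-land boundary
feature_map = conv2d(img, edge)
```

A trained CNN learns thousands of such filter weights from labeled flood masks instead of using hand-designed ones.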
Their findings suggest that submergence-tolerant rice varieties do improve yields, but only under specific flooding conditions—what they called a “Goldilocks zone” of not-too-short and not-too-long flood durations. By constructing hundreds of different flooding scenarios, the team identified a cluster of conditions under which the new rice varieties had a statistically significant positive effect on vegetation health, as measured by Enhanced Vegetation Index (EVI).
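The Enhanced Vegetation Index used as the outcome above is computed from surface reflectance in the near-infrared, red, and blue bands. The standard MODIS-style formulation is:

```python
def evi(nir, red, blue):
    """Enhanced Vegetation Index with standard MODIS coefficients.

    Inputs are surface reflectances in [0, 1]. G=2.5 is the gain,
    C1=6 and C2=7.5 correct for aerosol effects via the blue band,
    and L=1 adjusts for canopy background.
    """
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

# Illustrative reflectances for a healthy rice canopy (values assumed)
value = evi(nir=0.45, red=0.08, blue=0.04)
```

Unlike NDVI, EVI does not saturate as quickly over dense canopies, which makes it a common proxy for vegetation health in rice systems.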
Tellman emphasized the importance of matching machine learning approaches to the context and data quality. While advanced methods like CNNs and emerging foundation models (e.g., GeoFoundation Models, or GFMs) can deliver powerful results, they also require skilled coders, significant computing power, and well-constructed training data. For many researchers, especially when timelines are tight or technical capacity is limited, simpler methods may be more practical and cost-effective.
Key reflections from a remote sensing perspective:
- Deep learning models (e.g., CNNs) often outperform traditional RS methods but require: (i) High-quality, labeled training data; (ii) Skilled programmers; (iii) Large computing resources and image downloads.
- Data quality matters more than model complexity: Better training data > fancier ML architecture
- If training data are limited, consider GeoFoundation Models (GFMs), which can generalize better with less training data.
- Higher spatial resolution ≠ better results: Spectral and temporal resolution often contribute more to impact evaluation accuracy.
- Prioritize relevant variables: (i) Focus on what matters most to the research question; (ii) Collaborate with RS experts to balance feasibility and acceptable error margins.
Reflections and Implications
These case studies from Vietnam, Bangladesh, and Ethiopia highlight the growing potential of remote sensing to transform agricultural research. With access to high-frequency, spatially detailed satellite and drone data, researchers are now able to evaluate the adoption and effectiveness of innovations like alternate wetting and drying (AWD), submergence-tolerant rice, and natural resource management practices at scales previously unimaginable. Yet, as presenters emphasized throughout the webinars, these tools are only as powerful as the context in which they are applied. Remote sensing must be carefully aligned with agronomic science, local conditions, and farmers' lived experiences. Interpreting signals like RGB and NIR bands or vegetation indices such as EVI, or defining events such as field drying, cannot be done in isolation from on-the-ground realities.
These webinars were part of a broader effort to build a dynamic Community of Practice—an open, interdisciplinary network where researchers can learn from each other, share tools and data, and collaboratively tackle methodological challenges. The community is not only fostering technical exchange through shared code repositories, annotated datasets, and training sessions, but also creating space for thoughtful reflection on how remote sensing can support inclusive, evidence-based agricultural policy. As satellite and machine learning technologies evolve, this community is helping ensure they are applied with rigor, transparency, and a commitment to grounding innovation in the complex realities of farming systems across the globe.