The challenge of comparing crop imagery over space and time

O. Hall and M.F. Archila Bustos

Imagery collected by drones can help agricultural experts identify the causes of low crop productivity. But the technology must be adapted to distinguish different crop varieties in multispectral images. And problems of image calibration must be resolved.

Lund University and the Swedish University of Agricultural Sciences initiated research projects in Kenya and Ghana on the use of unmanned aerial vehicles (UAVs) – also known as drones – for agricultural monitoring. The researchers made use of a framework that combines UAV-based observations, biophysical investigations and conventional infrared spectrometer technology with existing survey data on village and farm household characteristics collected between 2002 and 2014.

Numerous methods exist for estimating crop yields with remote sensing technology. Commonly, researchers compare a vegetation index calculated from remotely sensed data with ground-based yield measures. This allows them to estimate crop yields throughout the study area. Vegetation indices are a measure of plant vigour and health. They build upon the fact that green vegetation strongly absorbs visible light but strongly reflects near-infrared (NIR) light. While many different vegetation indices have been developed, the most common is the Normalized Difference Vegetation Index (NDVI), which is used in the projects in Kenya and Ghana.
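As a concrete illustration, NDVI can be computed per pixel from the NIR and red bands. The reflectance values below are hypothetical, chosen only to contrast vigorous vegetation with bare soil:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red).

    Healthy green vegetation reflects strongly in NIR and absorbs red
    light, so NDVI approaches +1; bare soil and water sit near zero
    or below. A small eps avoids division by zero over dark pixels.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Hypothetical reflectance values:
print(ndvi(0.50, 0.08))  # vigorous crop pixel -> high NDVI
print(ndvi(0.30, 0.25))  # bare soil pixel -> NDVI near zero
```

Because the difference is normalised by the sum, NDVI is bounded between -1 and +1 and is less sensitive to overall illumination than the raw band values.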

Different farming practices

A key step in this research project is the use of UAVs for remote sensing instead of satellites to create crop yield maps. The researchers decided that the low spatial resolution of traditional satellite remote sensing and NDVI data was not suitable for the research areas in Kenya and Ghana. This imagery cannot adequately capture the intricate agrarian landscapes and farming systems of sub-Saharan Africa, which are among the most complex in the world.

In fact, most satellite and NDVI technologies are developed with mechanised conventional agriculture in mind – the large rectangular fields and single crops that are common in industrialised nations. Farming practices are considerably different in sub-Saharan Africa, where many farmers grow multiple crops with similar plant cycles and where intercropping is a common practice. And most sub-Saharan African farming plots are considerably smaller than those found in industrialised areas.

These different farming practices all call for the use of higher-resolution satellite platforms. But unfortunately, these rarely capture imagery of the sub-Saharan regions. Where high-resolution data is available, the satellites revisit these sub-Saharan African regions so rarely that controlled time-series measurements become impossible. Furthermore, due to the study areas’ proximity to the equator, the images are often masked by clouds, making them unusable for analysis. Therefore, it is difficult to access satellite imagery to address the specific problems which characterize the complex farming systems of sub-Saharan Africa.

High spatial resolution images

To address this lack of remotely sensed data, the decision was made to use UAVs to collect high-quality aerial data. For this work, autonomous quadcopters equipped with consumer-grade cameras have been used; these can produce high-resolution NIR and RGB (red, green, blue) images of under-served agricultural areas.

The aerial images that these camera-equipped quadcopters produce have a spatial resolution of three to four centimetres, which is much higher than the spatial resolution available from most satellite platforms. These high-resolution images are so sharp that they show crop detail even within small fields.

The drones can produce images of such high spatial resolution because they fly at 100 metres, which is a relatively low elevation for aerial photography. Since the drone cameras collect both NIR and visible light imagery, they can also be used to develop the vegetation indices mentioned above, which in turn can be used for more detailed analyses.
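The link between flying height and pixel footprint follows the standard ground sample distance relation: GSD = altitude × sensor pixel pitch / focal length. The camera parameters below are hypothetical (not from the project), chosen only to show that a flight at 100 metres can plausibly yield a footprint of a few centimetres per pixel:

```python
def ground_sample_distance(altitude_m, pixel_pitch_m, focal_length_m):
    """Ground footprint of one image pixel for a nadir-pointing camera,
    by similar triangles: GSD = altitude * pixel_pitch / focal_length."""
    return altitude_m * pixel_pitch_m / focal_length_m

# Hypothetical consumer-camera parameters: 1.55 micron pixel pitch,
# 4.5 mm focal length, flown at 100 m altitude.
gsd = ground_sample_distance(100.0, 1.55e-6, 4.5e-3)
print(f"{gsd * 100:.1f} cm per pixel")  # a few centimetres, as in the article
```

The same relation shows the trade-off the project exploits: halving the flying height halves the pixel footprint, at the cost of covering less ground per flight.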

Reflectance values

In order to construct robust crop yield maps, at least two observations over the growing season are needed. However, UAV image quality is highly dependent on environmental conditions at the time of flight and the camera settings used to address them. Unless these conditions are controlled for, images cannot be compared over time. Since it is difficult to standardize the environment and settings at the time of each flight (flights take place at different times of year, at different times of day and under varying weather conditions), a calibration method has been developed to standardize the images after capture.

This method involves converting the camera’s digital number pixel values to what are called “reflectance values”. Reflectance values are related to the object itself (such as a specific kind of crop), rather than to the camera model used for the flight. By using this conversion method, researchers can not only make comparisons between different missions, but also compare UAV-gathered data with other forms of remotely sensed data, where available. These reflectance values can also be used as the basis for image classification and change detection, which means observing differences in the state of land features over time.
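The article does not detail the exact calibration procedure, but a common way to convert digital numbers (DN) to reflectance is the empirical line method: reference panels of known reflectance are imaged in the same scene, and a linear DN-to-reflectance mapping is fitted from them. A minimal sketch with hypothetical panel values, not necessarily the project's exact procedure:

```python
import numpy as np

def empirical_line(dn, panel_dn, panel_reflectance):
    """Fit reflectance = gain * DN + offset from calibration panels of
    known reflectance imaged in the scene, then apply it per pixel.
    With two panels the linear fit passes exactly through both."""
    gain, offset = np.polyfit(panel_dn, panel_reflectance, deg=1)
    return gain * np.asarray(dn, dtype=float) + offset

# Hypothetical panels: a dark target (5 % reflectance) and a bright
# target (60 %), with the DN values the camera recorded over them.
panel_dn = [30.0, 210.0]
panel_refl = [0.05, 0.60]

# A tiny 2x2 patch of raw DN values from the same band:
band_dn = np.array([[30.0, 120.0],
                    [210.0, 75.0]])
reflectance = empirical_line(band_dn, panel_dn, panel_refl)
```

Because the fitted gain and offset absorb the scene illumination and camera settings of that particular flight, the resulting reflectance images can be compared across missions, which is exactly what the time-series analysis requires.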

The current aim of this research project is two-fold. The first is related to image classification – the process of identifying what agricultural experts are looking at in a remotely sensed image. The researchers involved in this project hope to classify different types of crops in the aerial imagery and to distinguish these crops from non-vegetation and confounding vegetation such as weeds. The second aim is to develop a calibration method that can be used to derive accurate reflectance values from these remotely sensed images.

Automatic crop yield estimation

These classification and calibration tests will serve as a jumping-off point for developing a methodology for the automatic or semi-automatic estimation of maize crop yields. The plan is to use this methodology throughout the study, with researchers expecting to conduct at least three to four more flights per field. The methodology developed for classifying maize could then be applied to other crops.

The primary challenge to date has been developing processing frameworks for identifying maize plants in an automatic or semi-automatic way. This process involves separating maize from confounding vegetation, such as intercropped beans, weeds or small bushes (see figure 1). Preliminary results indicate that this maize plant classification process is not only possible, but can be done with relatively high accuracy. Based on these results, the research team thinks that UAV technology could be very promising in regions that are currently under-served in terms of high-resolution, remotely sensed image data.
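A first step in such a processing framework is typically a per-pixel vegetation mask. Below is a minimal sketch using an illustrative NDVI threshold; the threshold value and band arrays are hypothetical, and separating maize from intercropped beans or weeds would then require additional features such as texture, plant height or row geometry:

```python
import numpy as np

def vegetation_mask(nir, red, ndvi_threshold=0.4):
    """Separate vegetation from soil/background by thresholding NDVI.

    Pixels with NDVI above the threshold are flagged as vegetation.
    The 0.4 default is illustrative; a real workflow would tune it
    per scene and add further features to tell crop species apart.
    """
    ndvi = (nir - red) / (nir + red + 1e-9)
    return ndvi > ndvi_threshold

# Hypothetical 2x2 band patches: two crop pixels, two soil pixels.
nir = np.array([[0.50, 0.30],
                [0.48, 0.28]])
red = np.array([[0.08, 0.25],
                [0.10, 0.24]])
mask = vegetation_mask(nir, red)
print(mask.sum(), "vegetation pixels of", mask.size)
```

The mask can then be intersected with field boundaries, and the remaining vegetation pixels passed to a classifier that assigns them to maize or to confounding species.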

While inexpensive consumer-grade cameras have been used in this project, the researchers still aim to produce robust and relevant measurements by applying well-known vegetation indices and calibration techniques, which normally require expensive equipment. The aim is to reduce the cost of conducting these measurements so that more local data can be gathered and more farmers included in the analyses. Each farmer was provided with copies of the NDVI and yield maps as well as the drone images, which they can use to understand their fields and crops and to improve their farming practices.

Ultimately, time-series crop yield maps could be created that can be used alongside survey data related to socio-economic conditions and management practices as well as biophysical field data. By combining and comparing these different kinds of data, it would be possible to understand in more detail why yield gaps in sub-Saharan Africa occur. With this new understanding, experts can develop strategies to increase agricultural productivity across this region.


Related Links

Yieldgap: website of the research project.

Sub-Saharan Africa data from the Global Yield Gap Atlas.

FAO’s information on the unique and complex qualities of sub-Saharan African mixed farming systems.

IFPRI on mapping crops to improve food security.



Copyright © 2016, CTA. Technical Centre for Rural and Agricultural Cooperation

CTA is a joint international institution of the African, Caribbean and Pacific (ACP) Group of States and the European Union (EU). CTA operates under the framework of the Cotonou Agreement and is funded by the EU.