Authors of a recent Crop Science article leveraged unmanned aerial vehicles (UAVs) to record the normalized difference vegetation index (NDVI), a measure of plant health, at the seed increase stage of the International Maize and Wheat Improvement Center’s (CIMMYT) wheat breeding program.
Francelino Rodrigues prepares a UAV for radiometric calibration ahead of a multispectral flight over a maize tar spot complex screening trial at CIMMYT’s Agua Fría experimental station, Mexico. (Photo: Alexander Loladze/CIMMYT)
A new study from researchers at the International Maize and Wheat Improvement Center (CIMMYT) shows that remote sensing can speed up and improve the effectiveness of disease assessment in experimental maize plots, a process known as phenotyping.
The study marks the first time that unmanned aerial vehicles (UAVs, commonly known as drones) carrying cameras that capture non-visible electromagnetic radiation have been used to assess tar spot complex in maize.
The interdisciplinary team found, among other things, that potential yield losses under heavy tar spot complex infections could reach 58%, more than 10% higher than reported in previous studies.
Caused by the interaction of two fungal pathogens that thrive in warm, humid conditions, tar spot complex is diagnosed by the telltale black spots that cover infected plants. (Photo: Alexander Loladze/CIMMYT)
“Plant disease resistance assessment in the field is becoming difficult because breeders’ trials are larger, are conducted at multiple locations, and there is a lack of personnel trained to evaluate diseases,” said Francelino Rodrigues, CIMMYT precision agriculture specialist and co-lead author of the study. “In addition, disease scoring based on visual assessments can vary from person to person.”
A major foliar disease that affects maize throughout Latin America, tar spot complex results from the interaction of two species of fungus that thrive in warm, humid conditions. The disease causes telltale black spots on infected plants, killing leaves, weakening the plant, and impairing ear development.
Phenotyping has traditionally involved breeders walking through crop plots and visually assessing each plant, a labor-intensive and time-consuming process. As remote sensing technologies become more accessible and affordable, scientists are applying them more often to assess experimental plants for desired agronomic or physical traits, according to Rodrigues. He said these tools can facilitate accurate, high-throughput phenotyping for resistance to foliar diseases in maize and help reduce the cost and time needed to develop improved maize germplasm.
“To phenotype maize for resistance to foliar diseases, highly trained personnel must spend hours in the field to complete visual crop evaluations, which requires substantial time and resources and may result in biased or inaccurate results between surveyors,” said Rodrigues. “The use of UAVs to gather multispectral and thermal images allows researchers to cut down the time and expenses of evaluations, and perhaps in the future it could also improve accuracy.”
Color-infrared image of maize hybrids in the experimental trials with fungicide treatment (A1) and without fungicide treatment (A2) against tar spot complex. Image data were extracted from two polygons covering the two central rows of each plot (B).
Technology sheds new light on phenotyping
Receptors in the human eye detect a limited range of wavelengths in the electromagnetic spectrum — the area we call visible light — consisting of three bands that our eyes perceive as red, green and blue. The colors we see are the combination of the three bands of visible light that an object reflects.
Remote sensing takes advantage of how the surface of a leaf differentially absorbs, transmits and reflects light or other electromagnetic radiation, depending on its composition and condition. The reflectance of diseased plant tissue differs from that of healthy tissue, provided the plants are not stressed by other factors, such as heat, drought or nutrient deficiencies.
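To make this concrete, here is a minimal sketch of how the normalized difference vegetation index (NDVI) mentioned earlier is computed from red and near-infrared reflectance; the reflectance values shown are illustrative assumptions, not measurements from the study.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index from NIR and red reflectance.

    Healthy, leafy canopies reflect strongly in the near infrared and absorb
    red light, so NDVI is high; damaged or diseased tissue pushes it lower.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-9)  # small epsilon avoids division by zero

# Illustrative per-pixel reflectance values (0-1 scale), not data from the study.
nir_band = np.array([0.45, 0.40, 0.22, 0.18])
red_band = np.array([0.05, 0.07, 0.15, 0.16])
print(ndvi(nir_band, red_band))  # higher values suggest a healthier canopy
```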
In this study, researchers planted 25 tropical and subtropical maize hybrids of known agronomic performance and resistance to tar spot complex at CIMMYT’s experimental station in Agua Fría, central Mexico. They then carried out disease assessments by eye and gathered multispectral and thermal imagery of the plots.
This allowed them to compare remote sensing with traditional phenotyping methods. Calculations revealed a strong relationship between grain yield, canopy temperature, vegetation indices and the visual assessment.
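A minimal sketch of that kind of comparison, using made-up plot-level values rather than the study’s data, might look like this: it computes pairwise Pearson correlations between a vegetation index, canopy temperature, visual disease scores and grain yield.

```python
import numpy as np

# Hypothetical plot-level means (one value per plot); not data from the study.
ndvi_mean    = np.array([0.78, 0.74, 0.69, 0.61, 0.55, 0.50])
canopy_temp  = np.array([27.1, 27.8, 28.5, 29.6, 30.4, 31.0])  # degrees C
visual_score = np.array([1, 2, 2, 3, 4, 5])                    # disease severity scale
grain_yield  = np.array([7.9, 7.4, 6.8, 5.9, 5.1, 4.6])        # t/ha

variables = {
    "NDVI": ndvi_mean,
    "canopy temperature": canopy_temp,
    "visual score": visual_score,
    "yield": grain_yield,
}

# Pairwise Pearson correlations between the plot-level traits.
names = list(variables)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        r = np.corrcoef(variables[a], variables[b])[0, 1]
        print(f"{a} vs {b}: r = {r:+.2f}")
```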
Future applications
“The results of the study suggest that remote sensing could be used as an alternative method for assessment of disease resistance in large-scale maize trials,” said Rodrigues. “It could also be used to calculate potential losses due to tar spot complex.”
Accelerated breeding for agriculturally relevant crop traits is fundamental to the development of improved varieties that can face mounting global agricultural threats. It is likely that remote sensing technologies will have a critical role to play in overcoming these challenges.
“An important future area of research encompasses pre-symptomatic detection of diseases in maize,” explained Rodrigues. “If successful, such early detection would allow appropriate disease management interventions before the development of severe epidemics. Nevertheless, we still have a lot of work to do to fully integrate remote sensing into the breeding process and to transfer the technology into farmers’ fields.”
Funding for this research was provided by the CGIAR Research Program on Maize (MAIZE).
Signing ceremony (L-R) with Pierre Defourny, Urs Schulthess, Kai Sonder, Bruno Gérard and Francelino Rodrigues, sealing the agreement that gives CIMMYT access to the pilot version of the Sen2-Agri processing system and training on its use. (Photo: Liliana Díaz Ramírez)
EL BATAN, Mexico (CIMMYT) – The International Maize and Wheat Improvement Center (CIMMYT) has been selected by the European Space Agency (ESA) to have access to the pilot version of the Sen2-Agri processing system and receive training on its use.
As an ESA “champion user,” CIMMYT will test the ESA prototype system in Bangladesh and Mexico. These two sites cover a wide range of farming systems, from the large wheat fields of the Yaqui Valley to a more diverse system in Bangladesh, where parcel sizes can be as small as 0.05 hectares and farmers grow two to three crops per year on a single field.
“The great unmanned aerial vehicle (UAV) expertise acquired by CIMMYT is very complementary to the full exploitation of the new satellite generation capabilities,” says Pierre Defourny, professor at the Université catholique de Louvain in Belgium who is leading the Sen2-Agri project. “CIMMYT’s two cases will generate products that will support our joint efforts for wheat blast monitoring in Bangladesh and improve data availability for GreenSat in Mexico.”
In the early days of remote sensing, limited availability of data was a major constraint for putting the data to good use. Basic processing of the coarse data was also time consuming and tedious.
Fortunately, this has changed greatly in recent years. Open and free satellite data, such as Landsat 8 and Sentinel-1 and -2, allow almost weekly coverage at resolutions as fine as 10 meters. Thanks to this new speed and precision, users can now focus on applying the data, deriving information products even for smallholder farmers in remote areas.
The Sentinel-2 satellites have a swath width of 290 km. Sentinel-2A is already operational, while Sentinel-2B is scheduled for launch in 2017. Together, they will be able to cover the Earth every five days.
For example, the CIMMYT-led STARS project in Bangladesh developed an irrigation scheduling app called PANI, which uses remotely sensed data to estimate crop water use. Based on this estimate, the farmer receives a simple text message on their cell phone recommending whether or not a particular field needs to be irrigated.
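As a rough illustration only, and not the PANI algorithm itself, the sketch below shows how an estimate of crop water use and recent water inputs could be turned into a short SMS-style recommendation; the field name, threshold and numbers are assumptions.

```python
# A simplified sketch of the kind of logic behind an irrigation advisory text
# message: compare estimated crop water use with recent water inputs and the
# remaining soil water. This is an illustration only, not the PANI algorithm.

def irrigation_message(field_id, crop_water_use_mm, rain_mm, irrigation_mm,
                       available_soil_water_mm, threshold_mm=20.0):
    """Return a short SMS-style recommendation for one field.

    crop_water_use_mm: estimated water used by the crop since the last message
    rain_mm, irrigation_mm: water added over the same period
    available_soil_water_mm: water the root zone held at the start of the period
    threshold_mm: assumed refill point below which irrigation is advised
    """
    remaining = available_soil_water_mm + rain_mm + irrigation_mm - crop_water_use_mm
    if remaining < threshold_mm:
        return f"Field {field_id}: soil water is low ({remaining:.0f} mm). Irrigate soon."
    return f"Field {field_id}: soil water is sufficient ({remaining:.0f} mm). No irrigation needed."

print(irrigation_message("A-12", crop_water_use_mm=35, rain_mm=5,
                         irrigation_mm=0, available_soil_water_mm=45))
```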
Sen2-Agri is unique compared to other systems in that it simplifies and automates satellite data processing. The system allows for the semi-automated generation of products such as cropland detection, crop classification, the normalized difference vegetation index (NDVI) and the leaf area index (LAI), based on images taken periodically by the Sentinel-2 and Landsat 8 satellites.
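As a toy illustration of how such per-pixel products relate to one another (the actual Sen2-Agri chain relies on trained classifiers and physically based retrievals), the sketch below flags likely cropland with a simple NDVI threshold and estimates LAI from NDVI using one commonly cited empirical exponential form; all values and coefficients are assumptions.

```python
import numpy as np

# Hypothetical NDVI image (values 0-1); not Sentinel-2 data.
ndvi = np.array([[0.15, 0.42, 0.71],
                 [0.68, 0.80, 0.35],
                 [0.22, 0.55, 0.77]])

# Crude vegetated/cropland flag via a threshold (assumed value).
cropland_mask = ndvi > 0.5

# Empirical NDVI-LAI inversion of the form
#   NDVI = ndvi_inf - (ndvi_inf - ndvi_soil) * exp(-k * LAI)
# with assumed coefficients; real retrievals are calibrated per sensor and crop.
ndvi_inf, ndvi_soil, k = 0.93, 0.10, 0.6
clipped = np.clip(ndvi, ndvi_soil, ndvi_inf - 1e-3)
lai = -np.log((ndvi_inf - clipped) / (ndvi_inf - ndvi_soil)) / k

print(cropland_mask)
print(np.round(lai, 2))
```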
A signing ceremony was held on 15 August, 2016 to seal the cooperation between ESA and CIMMYT. Bruno Gérard, Director of CIMMYT’s Sustainable Intensification Program, sees this agreement as a fundamental game changer for CIMMYT’s geo-spatial work.
“Sen2-Agri will give CIMMYT access to high spatial and temporal resolution quality imagery and related ‘know-how,’ which in turn will enable us to further develop partnership with top-notch institutions in the earth observation field,” says Gérard.
Interface of the Sen2-Agri system, which allows for a semi-automated generation of cropland, crop type, LAI and NDVI maps.
The benefits of Sen2-Agri are likely to extend far beyond the Yaqui Valley and Bangladesh. After the pilot phase of this project, the high-resolution imagery gathered could be applied to other areas where CIMMYT projects are implemented.
In combination with bio-physical and socio-economic data, this will allow CIMMYT and other organizations to improve monitoring and evaluation, better assess and understand changes and shocks in crop-based farming systems and improve technology targeting across farmer communities.
The Sen2-Agri test program is being coordinated by Urs Schulthess. Please feel free to contact him at u.schulthess@cgiar.org if you have questions about or suggestions for future applications of the system.
Think of all the things you do with your cell phone on any given day. You can start your car, buy a coffee and even measure your heart rate. Cell phones are our alarm clocks and our cameras, our gyms and our banks. Cell phones are not just relevant for urban living; they offer an unparalleled opportunity to transform the lives of smallholders. Even the most basic handset can empower farmers by providing them with instant information on weather, crop prices and farming techniques.
For many farmers in the developing world, cell phones are the most accessible form of technology, but are only one of many technologies changing agriculture. Innovations such as the plow, irrigation and fertilizer have shaped the history of humankind. Today, technologies continue to play an essential role in agricultural production and impact the life of farmers everywhere.
Enter the era of hyper precision
Precision farming has been around for more than 30 years, but cheaper and more robust technologies are ushering in an era of hyper precision. With increasing climate uncertainties and price fluctuations, farmers cannot afford risk, and precision agriculture enables them to increase production and profits by linking biophysical determinants to variations in crop yield. Farm machinery is increasingly fitted with GPS and sensors that measure crop water needs and soil nutrient levels, and dispense exactly the right amount of fertilizer and water where it is needed.
Precision agriculture may have originated on large-scale, well-resourced farms, but the concept is highly transferable and scale independent. The pocket-sized active crop canopy sensor is already a game-changing technology with the potential to bring precision agriculture within the reach of smallholders. Using such sensors to read crop health gives farmers basic information that can be turned into a nitrogen application recommendation, as sketched below. This serves a dual purpose: it helps smallholder farmers in areas where soils typically lack nitrogen, and it helps those who over-fertilize, which cuts profitability and causes environmental pollution.
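One widely used way to turn a canopy sensor reading into a nitrogen recommendation is a “sufficiency index”: the field reading divided by the reading from a well-fertilized reference strip. The sketch below illustrates that generic approach; the thresholds and rates are assumptions, not CIMMYT recommendations.

```python
# A generic sketch of how a hand-held canopy sensor reading can be turned into
# a nitrogen top-dressing suggestion using a "sufficiency index": the field
# reading divided by the reading from a well-fertilized reference strip.
# Thresholds and rates below are illustrative assumptions only.

def nitrogen_advice(field_ndvi, reference_ndvi, max_topdress_kg_ha=60.0):
    """Suggest an N top-dressing rate (kg N/ha) from canopy sensor readings."""
    sufficiency = field_ndvi / reference_ndvi
    if sufficiency >= 0.95:
        return 0.0  # crop looks as good as the reference strip
    # Scale the rate with the shortfall relative to the reference strip.
    shortfall = min(1.0, (0.95 - sufficiency) / 0.95)
    return round(max_topdress_kg_ha * shortfall, 1)

print(nitrogen_advice(field_ndvi=0.62, reference_ndvi=0.78))
```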
In Bangladesh, CIMMYT researchers are developing an irrigation scheduling app that predicts a week ahead of time whether a particular field will require irrigation. The underlying algorithm, which combines satellite-derived estimates of crop water use, a soil water model and weather forecasts, is also being tested in northern Mexico.
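A highly simplified sketch of the kind of daily root-zone water balance such an algorithm builds on is shown below; the refill point and daily figures are assumptions, not values from the app.

```python
# An illustrative, highly simplified daily root-zone water balance: water
# leaves through crop water use (e.g., satellite-derived) and arrives as rain
# or irrigation. If the balance is forecast to drop below a refill point
# within seven days, irrigation is flagged. All numbers are assumptions.

def needs_irrigation(initial_soil_water_mm, daily_crop_water_use_mm,
                     forecast_rain_mm, refill_point_mm=25.0):
    """Return the first day (1-7) the root zone is expected to fall below
    the refill point, or None if no irrigation is needed this week."""
    soil_water = initial_soil_water_mm
    for day, (use, rain) in enumerate(zip(daily_crop_water_use_mm, forecast_rain_mm), start=1):
        soil_water = soil_water - use + rain
        if soil_water < refill_point_mm:
            return day
    return None

day = needs_irrigation(
    initial_soil_water_mm=60,
    daily_crop_water_use_mm=[5.5, 6.0, 6.2, 5.8, 6.1, 6.3, 6.0],
    forecast_rain_mm=[0, 0, 3, 0, 0, 0, 0],
)
print("Irrigation needed on day:", day)  # None means no irrigation this week
```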
The eyes in the sky
The human eye is a remote sensor, but on a farm there are many things it cannot see unaided, including surface temperatures and crop changes caused by extreme weather. At CIMMYT, remote sensing devices allow researchers to obtain information, without physical contact, about large areas that would otherwise be difficult to monitor. Indeed, last month I joined researchers at CIMMYT headquarters in El Batan, Mexico, to learn more about the use of an unmanned aerial vehicle (UAV) with built-in GPS and thermal and multispectral sensors that captures aerial imagery at a resolution of 3 cm. This device is being used to capture the canopy temperature and nitrogen status of crops.
Remote sensing alone is not going to teach a farmer how to properly sow a field, take the best care of their crops or optimize returns. Remote sensing explores spatial and temporal dimensions to provide a diagnosis, but the crucial next step is to turn that diagnosis into recommendations on nutrient management, irrigation and crop protection. The question then is how to bring these recommendations to small farms. In a low-tech setting, this depends on knowledge transfer to get recommendations to farmers.
Learning about the use of UAV with CIMMYT scientists including (L-R) Francelino Rodrigues, Zia Ahmed, Martin Kropff, Lorena Gonzalez, Alex Park, Kai Sonder, Bruno Gérard and Juan Arista. (Photo: CIMMYT)
In late June, while the great majority of the conservation agriculture community converged on Winnipeg, Canada, in the Northern Hemisphere, Dr. Francelino Rodrigues, a CIMMYT post-doctoral fellow in precision agriculture in the Biometric and Statistics Unit of the Genetic Resources Program, and Dr. Jack McHugh, a CIMMYT cropping systems agronomist in the Global Conservation Agriculture Program, ventured into the much colder Southern Hemisphere to take part in the Digital Rural Futures Conference at the University of Southern Queensland (USQ) in Toowoomba, Queensland, Australia.
Although the conference itself was a strong incentive to visit Australia, it was the National Centre for Engineering in Agriculture (NCEA) at USQ that was of greater interest, given the possibilities for future collaboration in precision farming research and development (R&D). The NCEA was established in 1994 and specializes in engineering research relevant to the agribusiness sector and the natural resource base it utilizes. The center promotes research through extension, training and commercialization. Having worked at the NCEA prior to joining CIMMYT, McHugh saw clear benefits in closer collaboration between CIMMYT and the NCEA to take advantage of the precision agriculture R&D being conducted there.
Prior to the conference, Rodrigues and McHugh presented their work from Mexico and China, respectively, to NCEA staff. The discussion highlighted the complementary nature of the two organizations in the areas of precision agriculture, field monitoring, smart technologies and remote sensing. A tour of the NCEA ‘smart farms’ was the highlight of the conference for McHugh, who was able to see that much of his earlier work had been developed into significant applied instrumentation.
Rodrigues commented on the versatile multi-proximal sensor platform developed by McHugh at the NCEA: “The platform [on a motorbike] allows simultaneous on-the-go measurements of apparent soil electro-conductivity and the normalized difference vegetation index (NDVI), which gives a tremendous advantage compared with stop-and-go measurements. It’s something we started to do with a wood sled in the past year at CIMMYT’s experiment station in Obregón, but the motorbike would definitely create a new opportunity for fast and efficient measurements during crop growth.”
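A small sketch of what simultaneous logging makes possible: two time-stamped sensor streams recorded on the same pass, soil electro-conductivity and NDVI, can be matched by nearest timestamp into a single georeferenced table. The column names and readings below are hypothetical.

```python
import pandas as pd

# Hypothetical logs from one pass of the platform; timestamps, coordinates and
# readings are made up for illustration.
ec_log = pd.DataFrame({
    "time": pd.to_datetime(["2015-07-01 10:00:00", "2015-07-01 10:00:02",
                            "2015-07-01 10:00:04"]),
    "lat": [27.3701, 27.3702, 27.3703],
    "lon": [-109.9301, -109.9302, -109.9303],
    "soil_ec": [1.21, 1.18, 1.25],
})
ndvi_log = pd.DataFrame({
    "time": pd.to_datetime(["2015-07-01 10:00:01", "2015-07-01 10:00:03",
                            "2015-07-01 10:00:05"]),
    "ndvi": [0.71, 0.69, 0.72],
})

# Match each EC reading with the NDVI reading closest in time (within 2 s).
merged = pd.merge_asof(ec_log.sort_values("time"), ndvi_log.sort_values("time"),
                       on="time", direction="nearest",
                       tolerance=pd.Timedelta("2s"))
print(merged)
```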
According to the NCEA, the farming system of the future will have robotic sensing systems and decision support tools that interface seamlessly with commercial on-farm operations to optimize resource use. The NCEA is working on components of this, but much of what the CIMMYT researchers saw could be applied immediately to current farming systems and already includes considerable integration. Some of the systems displayed were controlled remotely from tablets and interfaced with large screen monitors showing real-time feedback from sensors, machinery and field activities. These included smart weed spot sprayers able to differentiate crops from weeds based on reflectance and leaf shape; aerial vehicles with multispectral and thermal sensors; and irrigation monitoring for water scheduling.
Smart weed spot sprayer working with reflectance and leaf shapes to differentiate crops from weeds.
Other sensors on display included NDVI sensor platforms, automated cone penetrometers, sensor-equipped bee traps and automated adaptive control of furrow irrigation systems. Of particular note was the use of augmented reality (AR) for real-time interaction with on-farm devices and information. AR automatically filters information from online sources based on the user’s current location and viewing perspective, using the camera in a tablet or smartphone. AR markers in the ‘real world’ (e.g., weather stations, pumps, field sensors, crops and more) can be discovered and online information can be retrieved. The data is merged into the device’s real-world view, and the user can interact with the content to control and configure machinery.

The next step is to build collaboration between the two institutions. McHugh and Rodrigues look forward to identifying and applying NCEA technology through future research exchanges and project development.
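As a rough illustration of the location-and-view filtering described above, the sketch below keeps only those field markers that lie within a given range of the user and inside the camera’s horizontal field of view; the marker list, range and field-of-view angle are assumptions.

```python
import math

# Hypothetical on-farm markers: (name, latitude, longitude).
markers = [
    ("weather station", -27.5610, 151.9510),
    ("irrigation pump", -27.5650, 151.9600),
    ("field sensor 3",  -27.5800, 151.9900),
]

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (m) and initial bearing (deg) from point 1 to 2."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat, dlon = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    dist = 2 * r * math.asin(math.sqrt(a))
    bearing = math.degrees(math.atan2(math.sin(dlon) * math.cos(p2),
                                      math.cos(p1) * math.sin(p2)
                                      - math.sin(p1) * math.cos(p2) * math.cos(dlon)))
    return dist, bearing % 360

def visible_markers(user_lat, user_lon, heading_deg, fov_deg=60, max_range_m=800):
    """Markers within range and inside the camera's horizontal field of view."""
    hits = []
    for name, lat, lon in markers:
        dist, bearing = distance_and_bearing(user_lat, user_lon, lat, lon)
        off_axis = abs((bearing - heading_deg + 180) % 360 - 180)
        if dist <= max_range_m and off_axis <= fov_deg / 2:
            hits.append((name, round(dist), round(bearing)))
    return hits

print(visible_markers(user_lat=-27.5600, user_lon=151.9500, heading_deg=120))
```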