Evaluating Maize Genotype Performance under Low Nitrogen Conditions Using RGB UAV Phenotyping Techniques
Ma. Luisa Buchaillot, Adrian Gracia-Romero, Omar Vergara-Diaz, Mainassara A. Zaman-Allah, Amsal Tarekegne, Jill E. Cairns, Boddupalli M. Prasanna, Jose Luis Araus and Shawn C. Kefauver*
Maize is the most cultivated cereal in Africa in terms of land area and production, but low soil nitrogen availability often constrains yields. Developing new maize varieties with high and reliable yields using traditional crop breeding techniques in field conditions can be slow and costly. Remote sensing has become an important tool in the modernization of field-based high-throughput plant phenotyping (HTPP), providing faster gains towards the improvement of yield potential and adaptation to abiotic and biotic limiting conditions. We evaluated the performance of a set of remote sensing indices derived from red-green-blue (RGB) images, along with field-based multispectral normalized difference vegetation index (NDVI) and leaf chlorophyll content (SPAD values), as phenotypic traits for assessing maize performance under managed low-nitrogen conditions. HTPP measurements were conducted from the ground and from an unmanned aerial vehicle (UAV). For the ground-level RGB indices, the strongest correlations with yield were observed for hue, greener green area (GGA), and a newly developed RGB HTPP index, NDLab (a normalized difference index in the Commission Internationale de l'Éclairage (CIE) Lab color space), while GGA and the crop senescence index (CSI) correlated better with grain yield from the UAV. Among the ground sensors, SPAD exhibited the closest correlation with grain yield, with a notably stronger correlation when measured during the vegetative stage. Additionally, we evaluated how different HTPP indices contributed to the explanation of yield in combination with agronomic data, such as anthesis-silking interval (ASI), anthesis date (AD), and plant height (PH). Multivariate regression models including RGB indices (R2 > 0.60) outperformed models using only agronomic parameters or field sensors (R2 > 0.50), reinforcing the potential of RGB HTPP to improve yield assessments.
Finally, we compared the low-N results to the same panel of 64 maize genotypes grown under optimal conditions, noting that only 11% of the genotypes appeared in the highest-yielding quartile of both trials. Furthermore, we calculated the grain yield loss index (GYLI) for each genotype, which showed a large range of variability, suggesting that good performance under low N is not necessarily exclusive of high productivity under optimal conditions.
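The ground-level RGB indices above are simple colour-space transforms of canopy photographs. As an illustration, the following is a minimal sketch of a GGA-style "greener area" fraction; the 80–180° hue window follows the common Breedpix-style convention and is an assumption here, and the paper's exact NDLab formulation is not reproduced:

```python
import colorsys
import numpy as np

def greener_area_fraction(rgb, hue_lo=80.0, hue_hi=180.0):
    """Fraction of pixels whose hue falls in a 'greener' window.

    `rgb` is an (H, W, 3) float array in [0, 1]. The 80-180 degree
    hue window is an assumed, Breedpix-style convention, not the
    paper's exact definition.
    """
    flat = rgb.reshape(-1, 3)
    # Hue from HSV, rescaled from [0, 1) to degrees
    hues = np.array([colorsys.rgb_to_hsv(r, g, b)[0] * 360.0
                     for r, g, b in flat])
    return float(np.mean((hues >= hue_lo) & (hues <= hue_hi)))
```

Applied per plot, the resulting fraction can then be correlated against grain yield in the same way as the other vegetation indices.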
Automatic Wheat Ear Counting Using Thermal Imagery
Jose A. Fernandez-Gallego, Ma. Luisa Buchaillot, Nieves Aparicio Gutiérrez, María Teresa Nieto-Taladriz, José Luis Araus*, and Shawn C. Kefauver*
Abstract: Ear density is one of the most important agronomic yield components in wheat. Ear counting is time-consuming and tedious, as it is most often conducted manually in field conditions. Moreover, different sampling techniques are often used, resulting in a lack of a standard protocol, which may eventually affect the inter-comparability of results. Thermal sensors capture crop canopy features with more contrast than RGB sensors for image segmentation and classification tasks. An automatic thermal ear counting system is proposed to count the number of ears using zenithal/nadir thermal images acquired from a moderately high-resolution handheld thermal camera. A set of 24 durum wheat varieties was studied at three experimental sites under different growing conditions in Spain. The automatic pipeline developed uses contrast enhancement and filtering techniques to segment image regions detected as ears. The approach is based on the temperature differential between the ears and the rest of the canopy, given that ears usually have higher temperatures due to their lower transpiration rates. Thermal images were acquired together with RGB images and in situ (i.e., directly in the plot) visual ear counts from the same plot segment for validation purposes. The relationship between the thermal counting values and the in situ visual counts was fairly weak (R2 = 0.40), which highlights the difficulty of estimating ear density from a single image perspective. However, the results show that the automatic thermal ear counting system performed quite well in counting the ears that do appear in the thermal images, exhibiting high correlations with the manual image-based counts from both thermal and RGB images in the sub-plot validation ring (R2 = 0.75–0.84). Automatic ear counting also exhibited a high correlation with the manual counting from thermal images when considering the complete image (R2 = 0.80).
The results also show a high correlation between the thermal and the RGB manual counting using the validation ring (R2 = 0.83). Methodological requirements and potential limitations of the technique are discussed.
Keywords: thermal images; ear counting; digital image processing; wheat
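The temperature-differential principle described above can be sketched as thresholding pixels warmer than the surrounding canopy and counting connected components. The temperature offset and minimum region size below are illustrative assumptions, not the paper's calibrated pipeline:

```python
import numpy as np
from scipy import ndimage

def count_warm_regions(temp, delta=1.5, min_pixels=20):
    """Count candidate ears in a thermal image.

    `temp` is a 2-D array of canopy temperatures (degC). Ears are
    assumed warmer than the canopy, so pixels exceeding the image
    mean by `delta` degC are segmented and connected components
    above `min_pixels` are counted. Both parameter values are
    illustrative assumptions.
    """
    mask = temp > temp.mean() + delta
    labels, n = ndimage.label(mask)                    # 4-connected components
    sizes = ndimage.sum(mask, labels, range(1, n + 1)) # pixels per component
    return int(np.sum(sizes >= min_pixels))
```

A production pipeline would add the contrast enhancement and filtering steps mentioned in the abstract before segmentation.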
Low-cost assessment of grain yield in durum wheat using RGB images
Jose A. Fernandez-Gallego, Shawn C. Kefauver*, Thomas Vatter, Nieves Aparicio Gutiérrez, María Teresa Nieto-Taladriz, José Luis Araus*
The pattern of photosynthetic area of the canopy throughout the crop cycle is an important factor for determining grain yield in wheat. This work proposes the use of zenithal RGB images of the canopy taken in natural light conditions to derive vegetation indices as a low-cost approach to predict grain yield. A set of 23 varieties of durum wheat was monitored in three growing conditions (support irrigation, rainfed and late planting) and two sites (Aranjuez and Valladolid, Spain), totaling six field trials. For each plot, digital RGB images were taken periodically from seedling emergence to late grain filling. RGB-based Green Area (GA), Greener Area (GGA), Normalized Green Red Difference Index (NGRDI), Triangular Greenness Index (TGI) and a novel photosynthetic area index based on the CIE L*u*v* color space (u*v*A) were compared to handheld spectroradiometer Normalized Difference Vegetation Index (NDVI) for reference. In the case of the irrigated and late planting trials, the best phenotypic predictions of grain yield were achieved with the vegetation indices measured during the last part of the crop cycle (i.e. grain filling). For the rainfed trials, the best phenotypic predictions were achieved with indices measured earlier (around heading). Among all the evaluated indices, the novel index performed the best. Considering the heritabilities of the evaluated RGB indices and their genetic correlations with grain yield, index-based predictions of grain yield were best in the early crop stages for both rainfed and irrigated conditions, while for late planting indices measured at different crop stages performed equally well.
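Of the indices listed, NGRDI and TGI have widely used closed forms (the u*v*A index is novel to this work and its formula is not reproduced here). A minimal sketch, computing both from per-pixel or plot-mean values:

```python
import numpy as np

def ngrdi(r, g):
    """Normalized Green Red Difference Index: (G - R) / (G + R)."""
    r, g = (np.asarray(x, dtype=float) for x in (r, g))
    return (g - r) / (g + r)

def tgi(r, g, b):
    """Triangular Greenness Index, simplified digital-number form:
    TGI = G - 0.39*R - 0.61*B (an approximation of the original
    spectral triangle definition)."""
    r, g, b = (np.asarray(x, dtype=float) for x in (r, g, b))
    return g - 0.39 * r - 0.61 * b
```

Both accept scalars or arrays, so a plot-level value can be obtained either by averaging per-pixel index values or by applying the index to channel means.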
Cereal Crop Ear Counting in Field Conditions Using Zenithal RGB Images
José A. Fernandez-Gallego, María Luisa Buchaillot, Adrian Gracia-Romero, Thomas Vatter, Omar Vergara Diaz, Nieves Aparicio Gutiérrez, María Teresa Nieto-Taladriz, Samir Kerfal, Maria Dolors Serre, José Luis Araus, Shawn C. Kefauver*
Ear density, or the number of ears per square meter (ears/m2), is a central focus in many cereal crop breeding programs, such as wheat and barley, representing an important agronomic yield component for estimating grain yield. Therefore, a quick, efficient, and standardized technique for assessing ear density would aid in improving agricultural management, providing improvements in preharvest yield predictions, or could even be used as a tool for crop breeding where it has been defined as a trait of importance. Not only are the current techniques for manual ear density assessments laborious and time-consuming, but they are also without any official standardized protocol, whether by linear meter, area quadrant, or extrapolation based on plant ear density and plant counts postharvest. An automatic ear counting algorithm is presented in detail for estimating ear density under natural sunlight illumination in field conditions based on zenithal (nadir) natural color (red, green, and blue [RGB]) digital images, allowing for high-throughput standardized measurements. Different field trials of durum wheat and barley distributed geographically across Spain during the 2014/2015 and 2015/2016 crop seasons in irrigated and rainfed trials were used to provide representative results. The three-phase protocol includes crop growth stage and field condition planning, image capture guidelines, and a computer algorithm of three steps: (i) a Laplacian frequency filter to remove low- and high-frequency artifacts, (ii) a median filter to reduce high noise, and (iii) segmentation and counting using local maxima peaks for the final count. Minor adjustments to the algorithm code must be made corresponding to the camera resolution, focal length, and distance between the camera and the crop canopy. The results demonstrate a high success rate (higher than 90%) and R2 values of 0.62–0.75 between the algorithm counts and the manual image-based ear counts for both durum wheat and barley.
Wheat ear counting in-field conditions: High throughput and low-cost approach using RGB images
Jose A. Fernandez-Gallego, Shawn C. Kefauver*, Nieves Aparicio Gutiérrez, María Teresa Nieto-Taladriz and José Luis Araus
Background: The number of ears per unit ground area (ear density) is one of the main agronomic yield components in determining grain yield in wheat. A fast evaluation of this attribute may contribute to monitoring the efficiency of crop management practices, to an early prediction of grain yield, or as a phenotyping trait in breeding programs. Currently, the number of ears is counted manually, which is time-consuming. Moreover, there is no single standardized protocol for counting the ears. An automatic ear-counting algorithm is proposed to estimate ear density under field conditions based on zenithal color digital images taken from above the crop in natural light conditions. Field trials were carried out at two sites in Spain during the 2014/2015 crop season on a set of 24 varieties of durum wheat with two growing conditions per site. The counting algorithm uses three steps: (1) a Laplacian frequency filter chosen to remove low- and high-frequency elements appearing in an image, (2) a median filter to reduce high noise still present around the ears and (3) segmentation using Find Maxima to detect local peaks and determine the ear count within the image.
Results: The results demonstrate a high success rate (higher than 90%) between the algorithm counts and the manual (image-based) ear counts, with high precision (a standard deviation of around 5%). The correlations between algorithm ear counts and grain yield were also significant, and greater than the correlation of grain yield with manual (field-based) ear counts. Automatic ear counting performed on images captured around anthesis correlated better with grain yield than counts from images captured at later stages; the lower performance of ear counting at late grain-filling stages was associated with the loss of contrast between canopy and ears.
Conclusions: Developing robust, low-cost and efficient field methods to assess wheat ear density, as a major agronomic component of yield, is highly relevant for phenotyping efforts towards increases in grain yield. Although the phenological stage at which measurements are taken is important, the robust image analysis algorithm presented here appears amenable to deployment from aerial or other automated platforms.
Keywords: Digital image processing, Ear counting, Field phenotyping, Laplacian frequency filter, Median filter, Find Maxima, Wheat