Agricultural Reviews

  • Chief Editor: Pradeep K. Sharma

  • Print ISSN 0253-1496

  • Online ISSN 0976-0741

  • NAAS Rating 4.84

Frequency:
Quarterly (March, June, September & December)
Indexing Services:
AGRICOLA, Google Scholar, CrossRef, CAB Abstracting Journals, Chemical Abstracts, Indian Science Abstracts, EBSCO Indexing Services, Index Copernicus
Agricultural Reviews, Volume 42, Issue 2 (June 2021): 121-132

Crop Monitoring using Unmanned Aerial Vehicles: A Review

Jose Cuaran1,*, Jose Leon1
1Department of Electronics Engineering, Universidad Catolica de Colombia, Diagonal 46 A # 15 B - 10, Bogotá D.C.
Cite article: Cuaran, J. and Leon, J. (2021). Crop Monitoring using Unmanned Aerial Vehicles: A Review. Agricultural Reviews. 42(2): 121-132. doi: 10.18805/ag.R-180.
Unmanned aerial vehicles (UAVs), or drones, have developed significantly over the past two decades for a wide variety of applications, such as surveillance, geographic studies, fire monitoring, security, military operations, search and rescue and agriculture. In agriculture, remote sensing by means of unmanned aerial vehicles has proven to be an efficient way to monitor crops from images. Unlike remote sensing from satellites or manned aircraft, UAVs can capture images of high spatial and temporal resolution thanks to their maneuverability and ability to fly at low altitude. This article presents an extensive review of the literature on crop monitoring by UAV, identifying specific applications, types of vehicles, sensors and image processing techniques, among other aspects. A total of 50 articles related to crop monitoring applications of UAVs in agriculture were reviewed; only journal articles indexed in the Scopus database with more than 50 citations were considered. It was found that cereals are the crops to which remote sensing has been applied most often so far. In addition, the most common crop remote sensing applications are related to precision agriculture, which includes the management of weeds, pests, diseases, nutrients and others. Crop phenotyping is also a common application of remote sensing; it consists of evaluating a crop's physical characteristics under environmental changes in order to select the plants or seeds with a favorable genotype and phenotype. Furthermore, multirotors are the most common type of UAV used for remote sensing, and RGB and multispectral cameras are the sensors most often used for this application. Finally, there is great opportunity for research in remote sensing across a wide variety of crops, crop monitoring applications, vegetation indexes and photogrammetry techniques.
It is estimated that the world population in 2050 will exceed 9 billion people, and food demand could nearly double as a result. In the face of this, the need arises to increase agricultural production. Nonetheless, conventional agriculture applies agrochemical products uniformly over a whole crop, without considering its specific variability in terms of climate, topography, soil properties, humidity, weeds, pests and diseases. This type of management leads to the inefficient use of natural resources and agricultural inputs, which in turn generates low profitability and environmental degradation (Leiva, 2008).
       
There is therefore a need to optimize agricultural production, which can be achieved by adopting new crop management techniques and practices that, based on the spatial and temporal variations within a field, make it possible to diagnose and apply the adequate agrochemical doses in its different sectors. This form of agricultural production, known as precision agriculture, has been developed in different countries since the 1980s, with remarkable advances in recent years (Gebbers and Adamchuk, 2010).
       
Precision agriculture relies on geographic information systems (GIS), remote sensors, digitized maps, databases, global positioning systems (GPS) and robotics, among other technologies. The latter includes ground and aerial robots capable of carrying out operations such as planting, crop management (pest, weed and disease control), phytosanitary product spraying, harvesting and remote sensing (Tom et al., 2018).
       
The literature reports several developments of unmanned aerial vehicles for application in precision agriculture. One of these applications is crop monitoring through high-resolution multispectral images, from which different vegetation indexes can be extracted to locate weeds, pests, diseases, nutrient deficiencies and water stress, among others (Bendig et al., 2014; López-Granados, 2011; Shi et al., 2016; Vega et al., 2015; Kumar et al., 2020). With this information it is possible to take localized control actions, applying, for example, agrochemicals at a specific site and in the adequate amount (Gonzalez et al., 2016; Huang et al., 2009), which leads to savings in agricultural inputs, reducing the environmental impact and improving profitability. These intelligent spraying systems may be complemented with computer vision and artificial intelligence techniques that guarantee the effective recognition of weeds or diseases in the crop (Gao et al., 2019).
       
Unlike manual on-site monitoring, remote sensing allows non-invasive, fast and efficient crop monitoring. This is possible thanks to significant advances in unmanned aerial vehicles, different types of sensors, georeferencing systems and image processing algorithms (Gogoi et al., 2018). Remote sensing using unmanned aerial vehicles, unlike satellite sensing, allows images of higher spatial and temporal resolution to be taken at a low cost, with less interference from atmospheric conditions.
       
A bibliographic review on the application of unmanned aerial vehicles in agriculture is presented below, with emphasis on remote sensing for crop monitoring, which turns out to be their main application. It starts by describing the methodology applied to compile and filter the articles. It then examines in depth the most common remote sensing applications, such as pest and disease detection and phenotyping. The main image processing techniques and the corresponding sensors are described as well. A summary of the crops monitored using UAVs and of the types of aerial vehicle used is also included.
       
To provide a vision of the development of precision agriculture and contribute to this field of food security, a search was carried out in a high-impact academic database using a search equation with relevant keywords. After applying the search equation, the articles, authors and years were analyzed to filter those with greater impact and to identify the authors who have written the most on precision agriculture. The resulting set was then examined with a bibliometric analysis program, which made it possible to interpret the search results and to reduce and eliminate data not relevant to the research.
       
An initial search was carried out in the Scopus database with the following equation, derived from the authors' expertise in the field: (quadrotor OR multirotor OR drone OR uav OR *copter) AND (agriculture OR farming OR crop). In this way, 2076 articles were found. To filter them, keywords were analyzed and those not related to the topic of interest were left out, yielding 1519 articles. Some of the keywords considered were "remote sensing", "monitoring" and "mapping".
       
Because of the large number of articles, only journal articles were selected, rejecting those from conferences, book chapters, general reviews and others. In this way, the 1519 articles were reduced to 692, a reduction of about 55%. The next filtering criterion was the number of citations of each article: those with fewer than 50 citations were rejected, so that only the articles with the greatest impact in the academic field were reviewed. In this way, 51 articles were obtained, on which this review is based.
 
Crop monitoring
 
Crop monitoring by means of remote sensing has different objectives. One of them is precision agriculture, to determine and manage weeds, pests, diseases, nutrient deficiencies and water stress, among others. Another objective is crop phenotyping, where the seeds with the best phenotypic characteristics, such as height and biomass production, are selected. The literature shows that the crops monitored most through remote sensing are cereals such as wheat, corn and barley, as shown in Fig 1. The corresponding references are shown in Table 1.
 

Fig 1: Crops monitored using remote sensing.


 

Table 1: Sample references for different crops monitored using remote sensing.


 
Crop monitoring for precision agriculture
 
Precision agriculture is a set of techniques through which, by monitoring the temporal and spatial variation in a crop, it is possible to apply a prompt treatment at a specific site. Precision agriculture comprises the following stages: data collection, variability mapping, decision making and application of the management practices. Remote sensing of crop variables is important in the first three stages and is complemented with other technologies, such as geographic information systems (GIS) and machinery for the adequate treatment (Vega et al., 2015). Applications of this type include the management of weeds, pests, diseases, nutrients and irrigation, among others.
 
Weed management
 
Weed management under a precision agriculture approach initially requires detecting the areas where weeds are located and then taking a control action through spraying or mechanical removal. This can be done in real time by combining the capture of georeferenced images, a weed recognition system and a weed management system (López-Granados, 2011). These practices increase profitability and reduce environmental impact (López-Granados, 2011), since, unlike conventional treatments in which herbicides are uniformly distributed (Gómez-Candón et al., 2014), with precision agriculture agrochemical spraying is done at a specific site, thus optimizing the amount of agrochemical required.
       
Weed management is a priority in early-stage crops. To recognize weeds at this stage, high spatial resolution is required (less than 5 cm per pixel), since the reflectance of crop and weeds is quite similar at this point (Gómez-Candón et al., 2014). This requires a low-altitude survey (less than 100 m) that permits discriminating each of the plants at high resolution (Torres-Sánchez et al., 2013; Gómez-Candón et al., 2014; Pérez-Ortiz et al., 2016). This implies a challenge, since the lower the altitude, the more images are required and the greater the UAV endurance needed (Gómez-Candón et al., 2014). Computer vision and machine learning techniques may be used for this task (Pérez-Ortiz et al., 2016).
       
In some cases, when it is only required to differentiate vegetation from soil, it is sufficient to determine the excess green index (ExG), which is obtained from RGB images (Pérez-Ortiz et al., 2016). In other cases it may be necessary to distinguish the different plant species (weeds and crop) and identify, for example, the dominant plants and their size (Shi et al., 2016). For this, a multispectral sensor may be useful, or even a hyperspectral one that can reveal very small reflectance variations (López-Granados, 2011; Torres-Sánchez et al., 2013).
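As a minimal illustration of how such an index is computed (a sketch with hypothetical pixel values, not code from the reviewed studies; the 0.2 threshold is an arbitrary assumption), ExG is usually defined on per-pixel normalized chromatic coordinates as ExG = 2g - r - b:

```python
import numpy as np

def excess_green(rgb):
    """Excess green index, ExG = 2g - r - b, computed on per-pixel
    normalized chromatic coordinates (so that r + g + b = 1)."""
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=2, keepdims=True)
    total[total == 0] = 1.0  # avoid dividing by zero on black pixels
    r, g, b = np.moveaxis(rgb / total, 2, 0)
    return 2.0 * g - r - b

# One green (vegetation-like) pixel and one brownish (soil-like) pixel
img = np.array([[[20, 200, 30], [120, 100, 80]]])
exg = excess_green(img)
veg_mask = exg > 0.2  # arbitrary threshold separating vegetation from soil
```

In practice the threshold is often chosen automatically (e.g. by Otsu's method) rather than fixed by hand.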
 
Management of pests, diseases and crop condition               
The localized and timely detection of crop pests and diseases permits corrective and preventive decisions to be taken and applied effectively. For example, a commonly monitored parameter is biomass content, as it is strongly related to crop production. Early detection of problems in the crop, such as nutrient deficiency, permits timely measures to be taken without having to wait until harvest to detect low-production zones (Vega et al., 2015). Conversely, detecting zones with sufficient nutrient content (nitrogen, for example) makes it possible to avoid over-fertilizing the crop, saving agricultural inputs (Bendig et al., 2014). Disease detection may be carried out through vegetation indexes, since some diseases produce changes in reflectance (Garcia-Ruiz et al., 2013).
       
Pests, diseases and nutrient deficits can be treated through agrochemical spraying, including pesticides, fungicides, fertilizers, etc. Using a UAV for spraying is ideal for small lots or areas of difficult access (Huang et al., 2009). Faiçal et al. (2014), for example, combine a UAV spraying technique with a ground-based sensor network. As a control strategy, they adapt the route of the agrochemical spraying vehicle depending on wind speed and intensity. To do so, wireless sensors located on the ground feed back data on the amount of agrochemicals they are receiving.
       
The crop condition is evaluated not only by means of pest and disease detection but also by estimating other parameters such as plant density (Jin et al., 2017), biomass content (Vega et al., 2015), crop height (Geipel et al., 2014), canopy area, nitrogen content (Bendig et al., 2014), soil clay content (Shi et al., 2016) and crop water stress (Berni et al., 2009), among others.
 
Crop monitoring for phenotyping
 
Remote sensing has shown great potential in crop phenotyping processes, where a species' response to different environmental conditions is evaluated. Crop phenotyping requires measuring and evaluating observable physical characteristics (Holman et al., 2016) for a species' different phenotypes, across generations and growth stages (Tattaris et al., 2016a). Traditional phenotyping based on direct field observations demands much time and cost (Haghighattalab et al., 2016). Non-invasive phenotyping by means of unmanned aerial vehicles is therefore increasingly common nowadays, given the possibility of taking images of high temporal and spatial resolution that capture the different traits making up a crop's phenotype (Shi et al., 2016).
       
Thanks to phenotyping it is possible to select the plant or seed varieties that adapt best, those with higher production and, overall, those with a favorable genotype and phenotype. Thus, for example, it can be evaluated how a crop responds to different fertilizers (Holman et al., 2016) or how it performs against pests and diseases (Chapman et al., 2014).
 
The most common phenotypic traits considered are:
       
-Canopy height (m), biomass amount (kg/m2) and crop crown volume (m3). These are indicators of growth, crop vitality, efficient use of light, production at harvest, carbon reserves and nutrient availability (Bendig et al., 2014; Holman et al., 2016; Li et al., 2016; Shi et al., 2016; Torres-Sánchez et al., 2015).
       
-Leaf area index (LAI) and canopy cover are good indicators of how much vegetation there is per unit of surface (Liebisch et al., 2015; Shi et al., 2016). They are related to the plant's photosynthetic capacity, respiration, evapotranspiration and efficiency in the use of light (Córcoles et al., 2013; Verger et al., 2014). The leaf area index reflects the crop's growth potential and permits estimating biomass production (Córcoles et al., 2013; Hunt et al., 2010; Lelong et al., 2008; Liebisch et al., 2015; Shi et al., 2016; Verger et al., 2014).
       
Crop phenotyping from remote sensing can be carried out in two complementary ways: from vegetation indexes and from photogrammetry. Each approach permits determining different phenotypic traits, as described in the following sections.
 
Vegetation indexes
 
Vegetation indexes are quantitative indicators calculated from crop images. These indexes make it possible to monitor the condition of a crop or soil, providing information on growth, biomass content, crop health, weed presence, etc.
       
Some indexes obtained from the red or infrared bands are related to biomass content (Aasen et al., 2015), the canopy structure and the leaf area index (LAI) (Aasen et al., 2015; Lelong et al., 2008; Tattaris et al., 2016). Other indexes depend only on visible bands and are related to pigment concentration in the leaves and nitrogen content (Lelong et al., 2008). On the other hand, some indicators such as canopy temperature (Berni et al., 2009), calculated from infrared radiation, permit an indirect evaluation of a plant's transpiration rate, a performance indicator of plants under low water stress (Tattaris et al., 2016b).
       
Henrich et al. (2009) have compiled ample data on different sensors, vegetation indexes and remote sensing applications, currently available on a free-access website. Fig 2 shows the most common vegetation indexes used in remote sensing. Table 2 summarizes some vegetation indexes, with applications found in the literature and the corresponding formulas to estimate them from different spectral bands.
 

Table 2: Vegetation indexes with some applications.


       
Capturing images to extract vegetation indexes is a great challenge, as it demands stable lighting conditions during the flight. Since these indexes depend on reflectance, the time of day when the images are captured may influence them (Vega et al., 2015), as well as the angle at which the images are taken (Rasmussen et al., 2016). It must also be taken into account that plant reflectance varies with the growth stage (López-Granados, 2011). Finally, to guarantee a good spatial resolution, images should be taken at a height of 30 to 100 m (Rasmussen et al., 2016).
 
Photogrammetry
 
Photogrammetry is a technique for the three-dimensional reconstruction of an object from multiple images of it. Photogrammetry permits generating digital models of a 3D surface, from which distances, areas and volumes can be measured with high accuracy. UAVs have great potential here, since they permit taking high-resolution aerial images at low altitude.
       
The photogrammetry process is carried out in the following phases: flight planning, location and measurement of control points, flight execution to capture images and processing of the georeferenced images (Córcoles et al., 2013). For georeferencing, several ground control points (GCPs) are commonly used in order to ensure accuracy in the mosaicking process (Gómez-Candón et al., 2014).
       
By means of the three-dimensional reconstruction of terrain and crops (crop surface models, CSMs) it is possible to determine crop height, canopy volume and crop area (Bendig et al., 2014; Díaz-Varela et al., 2015; Li et al., 2016; Torres-Sánchez et al., 2015), parameters that cannot be obtained from vegetation indexes (Geipel et al., 2014). It is also possible to determine a crop's growth rate if the process is repeated at different growth stages (Bendig et al., 2014; Holman et al., 2016).
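Crop height is typically derived from a CSM as the per-cell difference between a digital surface model (canopy top) and a digital terrain model (bare ground). A minimal sketch of that step, with hypothetical grids rather than data from the cited studies:

```python
import numpy as np

def crop_height_model(dsm, dtm):
    """Per-cell crop height: digital surface model (canopy top) minus
    digital terrain model (bare ground), clipped at zero. Both rasters
    must be co-registered and share the same cell size."""
    return np.clip(np.asarray(dsm) - np.asarray(dtm), 0.0, None)

# Hypothetical 2 x 2 grid: terrain at ~100 m elevation with a crop canopy on top
dtm = np.full((2, 2), 100.0)
dsm = np.array([[100.9, 100.8],
                [100.0, 100.7]])
chm = crop_height_model(dsm, dtm)
mean_height = chm.mean()  # plot-level height estimate
```

In practice the DSM comes from the structure-from-motion reconstruction and the DTM from a bare-ground survey before crop emergence.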
       
For the three-dimensional reconstruction, a computer vision technique known as structure from motion (SfM) is normally used, which offers high topographic resolution from sequences of overlapping images (Bendig et al., 2014; Geipel et al., 2014; Holman et al., 2016; Lucieer et al., 2014; Zahawi et al., 2015). Generating 3D hyperspectral maps makes it possible to obtain per-pixel hyperspectral data in an image from a camera previously characterized and calibrated using radiometric methods (Aasen et al., 2015). Some software tools used in the literature to carry out this process are Agisoft PhotoScan Professional (Bendig et al., 2014; Geipel et al., 2014; Haghighattalab et al., 2016), Smart3DCapture (Li et al., 2016) and Leica Photogrammetry Suite (Gómez-Candón et al., 2014).
       
For image georeferencing, UAVs use GPS. Nonetheless, since the GPS receiver provides the UAV position with a wide error margin (of several meters), ground control points (GCPs) are normally used in order to reduce the error to less than 10 cm (Li et al., 2016; Lucieer et al., 2014; Vega et al., 2015).
 
Sensors for remote sensing
 
According to the reviewed articles, crop monitoring can be done through remote sensing, based mainly on RGB, multispectral and hyperspectral sensors, as shown in Fig 3. The corresponding references are shown in Table 3.
 

Fig 3: Image sensors used for remote sensing.


 

Table 3: Sample references for different image sensors used in remote sensing.


 
RGB sensors provide 3 bands (red, green and blue) of the visible spectrum, as shown in Fig 4, with a wavelength ranging from 380 nm up to 750 nm. These bands allow the calculation of some of the vegetation indexes shown in Table 2, which are mainly intended to discriminate vegetation from the soil and estimate biomass amount.
 

Fig 4: Spectrum representation showing examples of RGB, multispectral and hyperspectral sensors. Adapted from (Adão et al., 2017).


       
Multispectral sensors are the most used in remote sensing with UAVs, as they are more accessible and offer a good number of bands in addition to RGB (Verger et al., 2014). These sensors capture between 3 and 12 bands, each around 100 nm wide (Adão et al., 2017; López-Granados, 2011). These bands allow the calculation of different vegetation indexes, as shown in Table 2. One of the most common is NDVI, which requires the near-infrared band and a visible band, allowing vegetation to be detected more effectively than with the ExG index, which is based on the RGB bands alone.
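As a minimal sketch of the NDVI calculation, (NIR - R)/(NIR + R), using hypothetical reflectance values rather than data from the reviewed studies:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - R) / (NIR + R).

    nir, red: co-registered reflectance arrays from the near-infrared
    and red bands; eps avoids division by zero over dark pixels."""
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    return (nir - red) / (nir + red + eps)

# Healthy vegetation reflects strongly in NIR and absorbs red,
# so it scores near +1; bare soil stays close to 0.
nir = np.array([[0.60, 0.30]])
red = np.array([[0.05, 0.25]])
vals = ndvi(nir, red)
```

The strong NIR reflectance of healthy canopy is what makes NDVI more discriminating than RGB-only indexes such as ExG.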
       
On the other hand, hyperspectral sensors detect up to hundreds of narrow bands, each less than 10 nm wide (López-Granados, 2011; Lucieer et al., 2014). Thanks to this, they permit analyzing variables that manifest in a very narrow spectral range, such as chlorophyll content in the 670-780 nm range, and obtaining high-resolution thermal images (Berni et al., 2009). Overall, hyperspectral sensors permit discriminating components that would be merged within broader multispectral bands. A wide review of hyperspectral sensors is provided by Adão et al. (2017).
       
Both multispectral and hyperspectral sensors may be useful for the detection of weeds, diseases and pests, since these produce small spectral variations that require high spectral resolution.
 
Aerial vehicles for remote sensing
 
Remote sensing has traditionally relied on satellite images or images taken from manned aerial vehicles. These present limitations in temporal and spatial resolution and depend on weather conditions, such as cloudiness (Torres-Sánchez et al., 2013). In recent years, UAVs have shown great potential to carry out this task at a lower cost and with better performance. Unmanned aerial vehicles permit taking images of high spatial resolution thanks to their maneuverability and ability to fly at low altitude. Fig 5 summarizes the types of UAV used in different studies on remote sensing. The corresponding references are shown in Table 4.
 

Fig 5: Types of UAV used in remote sensing.


 

Table 4: Sample references for types of UAV used in remote sensing.


       
Multirotors are the most common type of UAV used for remote sensing, since they allow multispectral images to be taken at low altitude and high spatial resolution, they have low operating costs and they offer high maneuverability. However, multirotors have lower flight endurance, speed and payload capacity than fixed-wing UAVs. The latter are intended for large areas and flat terrain, whereas multirotors are more suitable for small areas regardless of the terrain topography. Table 5 summarizes the overall characteristics of the different types of UAV that may be considered for remote sensing applications.
 

Table 5: UAVs for remote sensing and their characteristics. Adapted from (García, 2017).

This review has evidenced the great variety of crops monitored by remote sensing. However, cereals like corn, wheat and barley account for about 60% of the articles reviewed, which indicates the need for further research on other crops. Likewise, RGB and multispectral sensors are the most used, accounting for about 70% of the studied cases. While RGB cameras permit calculating some simple vegetation indexes that are good at detecting the vegetated area in a crop, multispectral cameras can capture images within specific wavelength ranges that provide details on plant diseases and plant types.
       
Among the most common crop remote sensing applications by means of UAVs are precision agriculture, which includes the management of weeds, pests, diseases, nutrients and others, and crop phenotyping, which consists of evaluating a crop's physical characteristics in the face of environmental changes. In any case, images of high spatial and temporal resolution turn out to be essential since, from them, it is possible to determine vegetation indexes and/or three-dimensional surface models that provide data on different variables of both the crop and the soil. Some of the most common vegetation indexes in the literature are NDVI, ExG, GNDVI, NGRDI and SAVI, accounting for more than 60% of the studied cases. Among the most common study variables are the leaf area index, foliar biomass, water stress, vegetation cover on the soil and nitrogen content.
               
Finally, almost 50% of the unmanned aerial vehicles used in remote sensing are multirotors, thanks to their ease of construction and control. Their maneuverability permits taking images at low altitude, guaranteeing a good spatial resolution. Unlike fixed-wing vehicles, they can fly over areas of difficult access or mountainous terrain. However, almost 20% of the reviewed articles were based on fixed-wing UAVs, since this type of aerial vehicle allows the monitoring of large areas at higher speed.

  1. Aasen, H., Burkart, A., Bolten, A. and Bareth, G. (2015). Generating 3D hyperspectral information with lightweight UAV snapshot cameras for vegetation monitoring: From camera calibration to quality assurance. ISPRS Journal of Photogrammetry and Remote Sensing. 108: 245-259. https://doi.org/10.1016/j.isprsjprs.2015.08.002.

  2. Adão, T., Hruška, J., Pádua, L., Bessa, J., Peres, E., Morais, R. and Sousa, J. (2017). Hyperspectral Imaging: A Review on UAV-Based Sensors, Data Processing and Applications for Agriculture and Forestry. Remote Sensing. 9(11): 1110. https://doi.org/10.3390/rs9111110.

  3. Barrientos, A., Colorado, J., Cerro, J., Martínez-Álvarez, A., Rossi, C., Sanz, D. and Valente, J. (2011). Aerial Remote Sensing in Agriculture: A Practical Approach to Area Coverage and Path Planning for Fleets of Mini Aerial Robots. Journal of Field Robotics. 28: 667-689. https://doi.org/10.1002/rob.20403.

  4. Bendig, J., Bolten, A., Bennertz, S., Broscheit, J., Eichfuss, S. and Bareth, G. (2014). Estimating Biomass of Barley Using Crop Surface Models (CSMs) Derived from UAV-Based RGB Imaging. Remote Sensing. 6(11): 10395-10412. https://doi.org/10.3390/rs61110395.

  5. Bendig, J., Yu, K., Aasen, H., Bolten, A., Bennertz, S., Broscheit, J., et al. (2015). Combining UAV-based plant height from crop surface models, visible and near infrared vegetation indices for biomass monitoring in barley. International Journal of Applied Earth Observation and Geoinformation. 39: 79-87. https://doi.org/10.1016/j.jag.2015.02.012.

  6. Berni, J.A.J., Zarco-Tejada, P.J., Sepulcre-Cantó, G., Fereres, E. and Villalobos, F. (2009). Mapping canopy conductance and CWSI in olive orchards using high resolution thermal remote sensing imagery. Remote Sensing of Environment. 113(11): 2380-2388. https://doi.org/10.1016/j.rse.2009.06.018.

  7. Berni, J.A.J., Zarco-Tejada, P.J., Suarez, L. and Fereres, E. (2009). Thermal and Narrowband Multispectral Remote Sensing for Vegetation Monitoring From an Unmanned Aerial Vehicle. IEEE Transactions on Geoscience and Remote Sensing. 47(3): 722-738. https://doi.org/10.1109/TGRS.2008.2010457.

  8. Chapman, S.C., Merz, T., Chan, A., Jackway, P., Hrabar, S., Dreccer, M.F. and Jimenez-Berni, J. (2014). Pheno-Copter: A Low-Altitude, Autonomous Remote-Sensing Robotic Helicopter for High-Throughput Field-Based Phenotyping. Agronomy. 4(2): 279-301. https://doi.org/10.3390/agronomy4020279.

  9. Córcoles, J.I., Ortega, J.F., Hernández, D. and Moreno, M.A. (2013). Estimation of leaf area index in onion (Allium cepa L.) using an unmanned aerial vehicle. Biosystems Engineering. 115(1): 31-42. https://doi.org/10.1016/j.biosystemseng.2013.02.002.

  10. Díaz-Varela, R., de la Rosa, R., León, L. and Zarco-Tejada, P. (2015). High-Resolution Airborne UAV Imagery to Assess Olive Tree Crown Parameters Using 3D Photo Reconstruction: Application in Breeding Trials. Remote Sensing. 7(4): 4213-4232. https://doi.org/10.3390/rs70404213.

  11. Faiçal, B.S., Costa, F.G., Pessin, G., Ueyama, J., Freitas, H., Colombo, A. and Braun, T. (2014). The use of unmanned aerial vehicles and wireless sensor networks for spraying pesticides. Journal of Systems Architecture. 60(4): 393-404. https://doi.org/10.1016/j.sysarc.2014.01.004.

  12. Gao, P., Zhang, Y., Zhang, L., Noguchi, R. and Ahamed, T. (2019). Development of a Recognition System for Spraying Areas from Unmanned Aerial Vehicles Using a Machine Learning Approach. Sensors. 19(2): 313. https://doi.org/10.3390/s19020313.

  13. Garcia, I. (2017). Estudio sobre vehículos aéreos no tripulados y sus aplicaciones. Universidad de Valladolid, Escuela de Ingenierías Industriales. p. 196.

  14. Garcia-Ruiz, F., Sankaran, S., Maja, J.M., Lee, W.S., Rasmussen, J. and Ehsani, R. (2013). Comparison of two aerial imaging platforms for identification of Huanglongbing-infected citrus trees. Computers and Electronics in Agriculture. 91: 106-115. https://doi.org/10.1016/j.compag.2012.12.002.

  15. Gebbers, R. and Adamchuk, V.I. (2010). Precision Agriculture and Food Security. Science. 327(5967): 828-831. https://doi.org/10.1126/science.1183899.

  16. Geipel, J., Link, J. and Claupein, W. (2014). Combined Spectral and Spatial Modeling of Corn Yield Based on Aerial Images and Crop Surface Models Acquired with an Unmanned Aircraft System. Remote Sensing. 6(11): 10335-10355. https://doi.org/10.3390/rs61110335.

  17. Gogoi, N.K., Deka, B. and Bora, L.C. (2018). Remote sensing and its use in detection and monitoring plant diseases: A review. Agricultural Reviews. 39(4): 307-313. https://doi.org/10.18805/ag.R-1835.

  18. Gómez-Candón, D., De Castro, A.I. and López-Granados, F. (2014). Assessing the accuracy of mosaics from unmanned aerial vehicle (UAV) imagery for precision agriculture purposes in wheat. Precision Agriculture. 15(1): 44-56. https://doi.org/10.1007/s11119-013-9335-4.

  19. Gonzalez-de-Soto, M., Emmi, L., Perez-Ruiz, M., Aguera, J. and Gonzalez-de-Santos, P. (2016). Autonomous systems for precise spraying – Evaluation of a robotised patch sprayer. Biosystems Engineering. 146: 165-182. https://doi.org/10.1016/j.biosystemseng.2015.12.018.

  20. Haghighattalab, A., González Pérez, L., Mondal, S., Singh, D., Schinstock, D., Rutkoski, J., Poland, J. (2016). Application of unmanned aerial systems for high throughput phenotyping of large wheat breeding nurseries. Plant Methods. 12(1): 35. https://doi.org/10.1186/s13007-016-0134-6.

  21. Henrich, V., Jung, A., Götze, C., Sandow, C., Thürkow, D. and Gläßer, C. (2009, March 16). Development of an online indices database: Motivation, concept and implementation. Presented at the 6th EARSeL Imaging Spectroscopy SIG Workshop: Innovative Tool for Scientific and Commercial Environment Applications, Tel Aviv, Israel. Retrieved from https://www.indexdatabase.de/.

  22. Holman, F., Riche, A., Michalski, A., Castle, M., Wooster, M. and Hawkesford, M. (2016). High Throughput Field Phenotyping of Wheat Plant Height and Growth Rate in Field Plot Trials Using UAV Based Remote Sensing. Remote Sensing. 8(12): 1031. https://doi.org/10.3390/rs8121031.

  23. Hunt, E.R., Hively, W.D., Fujikawa, S., Linden, D., Daughtry, C.S. and McCarty, G. (2010). Acquisition of NIR-Green-Blue Digital Photographs from Unmanned Aircraft for Crop Monitoring. Remote Sensing. 2(1): 290-305. https://doi.org/10.3390/rs2010290.

  24. Jin, X., Liu, S., Baret, F., Hemerlé, M. and Comar, A. (2017). Estimates of plant density of wheat crops at emergence from very low altitude UAV imagery. Remote Sensing of Environment. 198: 105-114. https://doi.org/10.1016/j.rse.2017.06.007.

  25. Kumar, S., Singh, S.K., Obi Reddy, G.P., Mishra, V.N. and Bajpai, R.K. (2020). Remote Sensing and Geographic Information System in Water Erosion Assessment. Agricultural Reviews. 41: 116-123. https://doi.org/10.18805/ag.R-1968.

  26. Leiva, F. (2008). Agricultura de precisión en cultivos transitorios [Precision agriculture in annual crops]. Bogotá, D.C.: Universidad Nacional de Colombia.

  27. Lelong, C., Burger, P., Jubelin, G., Roux, B., Labbé, S., and Baret, F. (2008). Assessment of Unmanned Aerial Vehicles Imagery for Quantitative Monitoring of Wheat Crop in Small Plots. Sensors. 8(5): 3557-3585. https://doi.org/10.3390/s8053557.

  28. Li, W., Niu, Z., Chen, H., Li, D., Wu, M. and Zhao, W. (2016). Remote estimation of canopy height and aboveground biomass of maize using high-resolution stereo images from a low-cost unmanned aerial vehicle system. Ecological Indicators. 67: 637-648. https://doi.org/10.1016/j.ecolind.2016.03.036.

  29. Liebisch, F., Kirchgessner, N., Schneider, D., Walter, A. and Hund, A. (2015). Remote, aerial phenotyping of maize traits with a mobile multi-sensor approach. Plant Methods. 11(1): 9. https://doi.org/10.1186/s13007-015-0048-8.

  30. López-Granados, F. (2011). Weed detection for site-specific weed management: Mapping and real-time approaches. Weed Research. 51(1): 1-11. https://doi.org/10.1111/j.1365-3180.2010.00829.x.

  31. Lucieer, A., Malenovský, Z., Veness, T. and Wallace, L. (2014). HyperUAS-Imaging Spectroscopy from a Multirotor Unmanned Aircraft System. Journal of Field Robotics. 31(4): 571-590. https://doi.org/10.1002/rob.21508.

  32. Matese, A., Toscano, P., Di Gennaro, S.F., Genesio, L., Vaccari, F.P., Primicerio, J., Gioli, B. (2015). Intercomparison of UAV, Aircraft and Satellite Remote Sensing Platforms for Precision Viticulture. Remote Sensing. 7(3): 2971-2990. https://doi.org/10.3390/rs70302971.

  33. Muñoz-Marí, J., Bovolo, F., Gómez-Chova, L., Bruzzone, L. and Camps-Valls, G. (2010). Semisupervised One-Class Support Vector Machines for Classification of Remote Sensing Data. IEEE Transactions on Geoscience and Remote Sensing. 48(8): 3188-3197. https://doi.org/10.1109/TGRS.2010.2045764.

  34. Peña-Barragán, J.M., Torres-Sánchez, J., De Castro, A., Kelly, M. and López-Granados, F. (2013). Weed Mapping in Early-Season Maize Fields Using Object-Based Analysis of Unmanned Aerial Vehicle (UAV) Images. PloS one. 8: e77151. https://doi.org/10.1371/journal.pone.0077151.

  35. Pérez-Ortiz, M., Peña, J.M., Gutiérrez, P.A., Torres-Sánchez, J., Hervás-Martínez, C. and López-Granados, F. (2016). Selecting patterns and features for between- and within- crop-row weed mapping using UAV-imagery. Expert Systems with Applications. 47: 85-94. https://doi.org/10.1016/j.eswa.2015.10.043.

  36. Rasmussen, J., Ntakos, G., Nielsen, J., Svensgaard, J., Poulsen, R.N. and Christensen, S. (2016). Are vegetation indices derived from consumer-grade cameras mounted on UAVs sufficiently reliable for assessing experimental plots? European Journal of Agronomy. 74: 75-92. https://doi.org/10.1016/j.eja.2015.11.026.

  37. Schmale III, D.G., Dingus, B.R. and Reinholtz, C. (2008). Development and application of an autonomous unmanned aerial vehicle for precise aerobiological sampling above agricultural fields. Journal of Field Robotics. 25(3): 133-147. https://doi.org/10.1002/rob.20232.

  38. Shi, Y., Thomasson, J.A., Murray, S.C., Pugh, N.A., Rooney, W.L., Shafian, S., et al. (2016). Unmanned Aerial Vehicles for High-Throughput Phenotyping and Agronomic Research. PLOS ONE. 11(7): e0159781. https://doi.org/10.1371/journal.pone.0159781.

  39. Stagakis, S., González-Dugo, V., Cid, P., Guillén-Climent, M.L. and Zarco-Tejada, P.J. (2012). Monitoring water stress and fruit quality in an orange orchard under regulated deficit irrigation using narrow-band structural and physiological remote sensing indices. ISPRS Journal of Photogrammetry and Remote Sensing. 71: 47-61. https://doi.org/10.1016/j.isprsjprs.2012.05.003.

  40. Sugiura, R., Noguchi, N. and Ishii, K. (2005). Remote-sensing Technology for Vegetation Monitoring using an Unmanned Helicopter. Biosystems Engineering. 90(4): 369-379. https://doi.org/10.1016/j.biosystemseng.2004.12.011.

  41. Suomalainen, J., Anders, N., Iqbal, S., Roerink, G., Franke, J., Wenting, P. and Kooistra, L. (2014). A Lightweight Hyperspectral Mapping System and Photogrammetric Processing Chain for Unmanned Aerial Vehicles. Remote Sensing. 6(11): 11013-11030. https://doi.org/10.3390/rs61111013.

  42. Swain, K.C., Thomson, S.J. and Jayasuriya, H.P.W. (2010). Adoption of an Unmanned Helicopter for Low-Altitude Remote Sensing to Estimate Yield and Total Biomass of a Rice Crop. Transactions of the ASABE. Retrieved from http://agris.fao.org/agris-search/search.do?recordID=US201301826860.

  43. Tattaris, M., Reynolds, M.P. and Chapman, S.C. (2016a). A Direct Comparison of Remote Sensing Approaches for High-Throughput Phenotyping in Plant Breeding. Frontiers in Plant Science. 7. https://doi.org/10.3389/fpls.2016.01131.

  44. Tattaris, M., Reynolds, M.P. and Chapman, S.C. (2016b). A Direct Comparison of Remote Sensing Approaches for High-Throughput Phenotyping in Plant Breeding. Frontiers in Plant Science. 7. https://doi.org/10.3389/fpls.2016.01131.

  45. Tokekar, P., Vander Hook, J., Mulla, D. and Isler, V. (2013). Sensor Planning for a Symbiotic UAV and UGV System for Precision Agriculture. 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems. 5321-5326. https://doi.org/10.1109/IROS.2013.6697126.

  46. Duckett, T., Pearson, S., Blackmore, S. and Grieve, B. (2018). Agricultural Robotics: The Future of Robotic Agriculture. UK-RAS Network Robotics and Autonomous Systems.

  47. Torres-Sánchez, J., López-Granados, F. and Peña, J.M. (2015). An automatic object-based method for optimal thresholding in UAV images: Application for vegetation detection in herbaceous crops. Computers and Electronics in Agriculture. 114: 43-52. https://doi.org/10.1016/j.compag.2015.03.019.

  48. Torres-Sánchez, J., Peña, J.M., de Castro, A.I. and López-Granados, F. (2014). Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Computers and Electronics in Agriculture. 103: 104-113. https://doi.org/10.1016/j.compag.2014.02.009.

  49. Torres-Sánchez, J., López-Granados, F., De Castro, A.I. and Peña-Barragán, J.M. (2013). Configuration and Specifications of an Unmanned Aerial Vehicle (UAV) for Early Site Specific Weed Management. PLoS ONE. 8(3): e58210. https://doi.org/10.1371/journal.pone.0058210.

  50. Torres-Sánchez, J., López-Granados, F., Serrano, N., Arquero, O. and Peña, J.M. (2015). High-Throughput 3-D Monitoring of Agricultural-Tree Plantations with Unmanned Aerial Vehicle (UAV) Technology. PLOS ONE. 10(6): e0130479. https://doi.org/10.1371/journal.pone.0130479.

  51. Vega, F.A., Ramírez, F.C., Saiz, M.P., and Rosúa, F.O. (2015). Multi-temporal imaging using an unmanned aerial vehicle for monitoring a sunflower crop. Biosystems Engineering. 132: 19-27. https://doi.org/10.1016/j.biosystemseng.2015.01.008.

  52. Verger, A., Vigneau, N., Chéron, C., Gilliot, J.-M., Comar, A. and Baret, F. (2014). Green area index from an unmanned aerial system over wheat and rapeseed crops. Remote Sensing of Environment. 152: 654-664. https://doi.org/10.1016/j.rse.2014.06.006.

  53. von Bueren, S.K., Burkart, A., Hueni, A., Rascher, U., Tuohy, M.P., and Yule, I.J. (2015). Deploying four optical UAV-based sensors over grassland: Challenges and limitations. Biogeosciences. 12(1): 163-175. https://doi.org/10.5194/bg-12-163-2015.

  54. Xiang, H., and Tian, L.F. (2011). Method for automatic georeferencing aerial remote sensing (RS) images from an unmanned aerial vehicle (UAV) platform. Biosystems Engineering - BIOSYST ENG. 108: 104-113. https://doi.org/10.1016/j.biosystemseng.2010.11.003.

  55. Huang, Y., Hoffmann, W.C., Lan, Y., Wu, W. and Fritz, B.K. (2009). Development of a Spray System for an Unmanned Aerial Vehicle Platform. Applied Engineering in Agriculture. 25(6): 803-809. https://doi.org/10.13031/2013.29229.

  56. Zahawi, R.A., Dandois, J.P., Holl, K.D., Nadwodny, D., Reid, J.L., and Ellis, E.C. (2015). Using lightweight unmanned aerial vehicles to monitor tropical forest recovery. Biological Conservation. 186: 287-295. https://doi.org/10.1016/j.biocon.2015.03.031.

  57. Zarco-Tejada, P.J., González-Dugo, V., Williams, L.E., Suárez, L., Berni, J.A.J., Goldhamer, D. and Fereres, E. (2013). A PRI-based water stress index combining structural and chlorophyll effects: Assessment using diurnal narrow-band airborne imagery and the CWSI thermal index. Remote Sensing of Environment. 138: 38-50. https://doi.org/10.1016/j.rse.2013.07.024.
