2019-08-13T18:25:19Z (GMT) by Aaron Etienne

Current methods of broadcast herbicide application have negative environmental and economic impacts. Computer vision methods, specifically those related to object detection, have been reported to aid site-specific weed management by targeting herbicide application on a per-weed basis within a field. However, a major challenge in developing a weed detection system is the requirement for properly annotated training data to differentiate between weeds and crops under field conditions. This research involved creating an annotated database of weeds using UAS-acquired imagery from corn and soybean research plots located in north-central Indiana. A total of 27,828 RGB, 108,398 multispectral, and 23,628 thermal images were acquired using a FLIR Duo Pro R sensor attached to a DJI Matrice 600 Pro UAS. An annotated database of 306 RGB images, organized into monocot and dicot weed classes, was used for network training. Two deep learning networks, DetectNet and You Only Look Once version 3 (YOLOv3), were subjected to five training stages using four annotated image sets. Precision ranged from 3.63% to 65.37% for monocot weed detection and from 4.22% to 45.13% for dicot weed detection. This research demonstrated the need for a large annotated weed database to improve the precision of deep learning algorithms through better training of the networks.
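Precision figures like those reported above are conventionally computed for object detectors as the fraction of predicted bounding boxes that match a ground-truth annotation. A minimal sketch of that calculation (the function and the example counts are illustrative, not taken from the thesis):

```python
def detection_precision(true_positives: int, false_positives: int) -> float:
    """Precision = TP / (TP + FP): the share of predicted weed boxes
    that actually correspond to an annotated weed."""
    total_predictions = true_positives + false_positives
    if total_predictions == 0:
        return 0.0  # no predictions made; define precision as 0
    return true_positives / total_predictions

# Hypothetical counts for illustration only:
# 29 correct monocot detections out of 800 predicted boxes.
monocot_precision = detection_precision(29, 771)
print(f"{100 * monocot_precision:.2f}%")
```

In practice a predicted box counts as a true positive only if its intersection-over-union with a ground-truth box exceeds a threshold (commonly 0.5), which is how detector evaluation tools such as those bundled with DetectNet and YOLOv3 score matches.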