Detecting weeds from crops under complex field environments based on faster RCNN
2020 IEEE Eighth International Conference on Communications and Electronics (ICCE)
School of Science / Electron Science Research Institute / Graduate Research School
Grains Research and Development Corporation / Photonic Detection Systems Pty Ltd, Australia
The power of deep learning in object detection has been widely investigated, demonstrating promising results in recent years. In precision agriculture, weed detection plays an indispensable part in site-specific weed management. Published crop and weed datasets that cover complex field conditions, including varied lighting and weather, different growth stages, heavy occlusion and overlap, and weeds with crop-like appearance, remain limited. In this paper, the FT_BRC image dataset (published online with 3380 images) was collected by a camera mounted on a portable trolley under practical field conditions at a commercial farm in Cunderdin, Western Australia. Motivated by their harmful effects on crop yield, detection of wild radish (Raphanus raphanistrum) and capeweed (Arctotheca calendula) in barley crops (Hordeum vulgare) is investigated. To locate targeted weeds and estimate weed density, a subset of the FT_BRC dataset was fully annotated and used to train Faster RCNN models with different feature extractors for in-field weed detection. Experimental results show that the Faster RCNN model with the Inception-ResNet-V2 backbone achieves a mean average precision (mAP) of 0.555 (at IoU = 0.5), higher than the other networks, with an inference time of approximately 0.38 seconds per image. These results can support the further development of real-time weed detection solutions for precision agriculture.
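The mAP figure reported above counts a detection as a true positive when its predicted box overlaps a ground-truth box with intersection over union (IoU) of at least 0.5. As an illustration of that criterion (a minimal helper sketched here, not code from the paper), the IoU of two axis-aligned boxes can be computed as:

```python
def iou(box_a, box_b):
    """IoU of two boxes given as (x1, y1, x2, y2) corner coordinates."""
    # Intersection rectangle
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    # Union = sum of areas minus the intersection
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# A predicted weed box matching a ground-truth box with IoU >= 0.5
# would count as a correct detection at this threshold.
print(iou((0, 0, 10, 10), (5, 0, 15, 10)))  # overlapping boxes
```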