Developing an Algorithm for Robotic Precision Application of Crop Protection Products
https://doi.org/10.22314/2073-7599-2022-16-3-74-80
Abstract
Existing plant identification methods and tools remain of limited use in real agrotechnical tasks, since image parameters differ significantly across applied solutions. (Research purpose) To develop an algorithm for crop plant recognition by a robotic device using a region-based convolutional neural network (R-CNN) and deep learning technology. (Materials and methods) A robotic device was developed for variable-rate application of plant protection products; it recognizes both crops and weeds and determines the processing area, namely the coordinates of the processing center and the processing radius. The Mask R-CNN and DeepLabv3+ segmentation networks were chosen for detecting the crop (white head cabbage). The network-based algorithm detects, segments, and localizes plants using a dataset assembled in both image-mask and COCO formats. The dataset was collected by aerial photography from an unmanned aerial vehicle. The source images were captured with a Xiaovv HD Web USB 150-degree Full HD 1080P webcam and a Logitech C270 HD 720p webcam. The trained neural network was deployed on the robotic device's Nvidia Jetson AGX Xavier platform. (Results and discussion) Evaluation on the test data yielded a plant detection rate of 98 percent and a contour detection accuracy of 94 percent. (Conclusions) It is shown that the trained neural network can be applied to other cultivated crops, accounting for the heterogeneity of their placement in the field, soil types, and weed density. The model learns to extract bounding-box coordinates and the pixel-wise location of the object (cabbage) with the required accuracy on both synthetic and real data.
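As an illustration of the step described above, the sketch below shows how the output of an instance segmentation network can be converted into a processing center and processing radius. It uses a generic torchvision Mask R-CNN and OpenCV; the weights, score threshold, and input file name are illustrative placeholders, not the author's actual pipeline (which is fine-tuned on the cabbage dataset and deployed on the Jetson AGX Xavier).

```python
# Minimal sketch: run a Mask R-CNN on a camera frame and derive a
# "processing center" and "processing radius" from each instance mask.
# Model, thresholds, and file names are illustrative assumptions.
import cv2
import numpy as np
import torch
from torchvision.models.detection import maskrcnn_resnet50_fpn

# COCO-pretrained placeholder; the real model would be fine-tuned on cabbage.
model = maskrcnn_resnet50_fpn(weights="DEFAULT").eval()

def detect_processing_zones(image_bgr, score_thr=0.7, mask_thr=0.5):
    """Return a list of (cx, cy, radius) tuples, one per detected plant."""
    rgb = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB)
    tensor = torch.from_numpy(rgb).permute(2, 0, 1).float() / 255.0
    with torch.no_grad():
        out = model([tensor])[0]
    zones = []
    for score, mask in zip(out["scores"], out["masks"]):
        if score < score_thr:
            continue
        # Binarize the soft mask and fit the minimal enclosing circle:
        # its center is the processing center, its radius the processing radius.
        binary = (mask[0].numpy() > mask_thr).astype(np.uint8)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            continue
        largest = max(contours, key=cv2.contourArea)
        (cx, cy), radius = cv2.minEnclosingCircle(largest)
        zones.append((cx, cy, radius))
    return zones

frame = cv2.imread("field_frame.jpg")  # e.g. a frame from the onboard webcam
print(detect_processing_zones(frame))
```

The minimal enclosing circle of the largest mask contour gives a conservative processing radius, since the sprayed zone then fully covers the detected plant.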
About the Author
M. A. Mirzaev, Russian Federation
Maksim A. Mirzaev, Ph.D. student, junior researcher
Moscow
For citations:
Mirzaev M.A. Developing an Algorithm for Robotic Precision Application of Crop Protection Products. Agricultural Machinery and Technologies. 2022;16(3):74-80. (In Russ.) https://doi.org/10.22314/2073-7599-2022-16-3-74-80