INTELLIGENT CONTROL OF ROBOTIC SYSTEMS FOR RODENT BURROW SEGMENTATION USING DEEP CONVOLUTIONAL ARCHITECTURES
Abstract
In this paper, we investigate the application of neural network architectures to the semantic segmentation of rodent burrows for monitoring rodent populations in agricultural fields. Three semantic segmentation models are considered: a convolutional autoencoder (CAE), SegNet, and U-Net. The models are applied to images obtained from unmanned aerial vehicles (UAVs) and ground robotic platforms, which enables automatic burrow detection and minimizes the labor costs of processing large volumes of data. A dataset of 247 RGB images containing 1098 labeled burrows was prepared for training and testing the models. Segmentation quality was assessed with the Jaccard metric (Intersection over Union, IoU), yielding 0.511 for the CAE, 0.548 for SegNet, and 0.529 for U-Net. The computational resources required to deploy these models on the on-board computing units (OCUs) of mobile robotic platforms were also evaluated using two criteria: the number of floating-point operations (GFLOPS) and the number of model parameters. SegNet requires 2.23 GFLOPS and has 0.76 million parameters; its parameter count is 2.58 and 2.33 times lower than that of the CAE and U-Net, respectively, and its number of floating-point operations is 2.43 and 1.88 times lower. As a result, SegNet outperformed the CAE and U-Net in both segmentation quality and required computational resources. This work was carried out as part of the implementation of a computer vision system for an agricultural robotic platform.
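For reference, the two quantities used above, the Jaccard metric (IoU) and the number of trainable parameters, can be computed as in the following minimal Python sketch. This is an illustration assuming PyTorch tensors and a trained torch.nn.Module, not the authors' implementation; the function names, the 0.5 binarization threshold, and the eps constant are illustrative assumptions, and GFLOPS are normally estimated with an external profiler rather than by hand.

    import torch

    def iou_score(pred: torch.Tensor, target: torch.Tensor,
                  threshold: float = 0.5, eps: float = 1e-7) -> float:
        """Jaccard metric: IoU = |A & B| / |A | B| for binary segmentation masks."""
        pred_bin = (pred > threshold).bool()      # binarize predicted probabilities
        target_bin = target.bool()                # ground-truth burrow mask
        intersection = (pred_bin & target_bin).sum().item()
        union = (pred_bin | target_bin).sum().item()
        return intersection / (union + eps)

    def count_parameters_millions(model: torch.nn.Module) -> float:
        """Trainable parameter count in millions (one of the two OCU criteria)."""
        return sum(p.numel() for p in model.parameters() if p.requires_grad) / 1e6

For example, applying count_parameters_millions to the SegNet model described above would be expected to return a value of approximately 0.76.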