METHODOLOGY AND PRACTICE OF INCREASING THE AUTONOMY OF GROUND-BASED ROBOTIC COMPLEXES
Abstract
Trends in modern robotics and the demands of practice call for increasing the degree of autonomy of robotic complexes. Greater autonomy, in turn, requires better situational awareness and, consequently, larger volumes of data processed efficiently in real time on onboard resources, while the requirement of economic feasibility of the proposed solutions remains in force. Given that domestic practice relies, in most cases, on remotely controlled robots, their autonomy has to be increased on the basis of existing technical solutions. This direction of development of robotic complexes is known as the transition from remote control to supervisory control: along this path, a growing number of information-control functions are transferred from the operator to the onboard information-control system. Based on an analysis of world experience and our own experience in developing robotic complexes, and on the author's methodology for creating robots with an increased degree of autonomy, we have identified a key element in ensuring the intellectual autonomy of mobile vehicles. A unified software and hardware module for the information support of mobile robots is proposed, built around a vision system with an open software and hardware architecture. The module makes it possible to raise the degree of autonomy of ground-based robotic complexes, in terms of intellectual autonomy, gradually, while remaining within the bounds of economic feasibility. Its open software architecture accommodates the de facto variety of hardware in existing remotely controlled mobile vehicles and allows the degree of autonomy to be increased, that is, the switch from remote to supervisory control to be made, step by step, according to the tasks being solved and the means available. A technique for creating new, or reengineering existing, samples of robotic complexes (RTCs) is proposed.
The methodology includes an analysis of the overall layout of the RTC, with an emphasis on the software and algorithmic part of the onboard information and control system, taking into account the conditions for matching the requirements imposed on the sensor and computing parts. The paper considers examples of applying this technique to the improvement of existing samples of ground-based RTCs. Practical tasks and examples of their solution using the proposed module are presented.
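The stepwise transition from remote to supervisory control described above can be illustrated with a minimal sketch, under stated assumptions: the class and function names below (`InformationSupportModule`, `delegate`, `autonomy_share`) are hypothetical and do not correspond to any interface published for the proposed module. The idea shown is only that each information-control function can be handed over to the onboard system independently, so autonomy grows incrementally rather than all at once.

```python
from enum import Enum, auto

class ControlMode(Enum):
    REMOTE = auto()       # the operator performs this function
    SUPERVISORY = auto()  # the onboard system performs it; the operator supervises

class InformationSupportModule:
    """Hypothetical sketch: each information-control function can be
    switched independently from remote to supervisory mode, modeling a
    stepwise increase in the degree of autonomy."""

    def __init__(self, functions):
        # All functions start under remote (operator) control.
        self.modes = {f: ControlMode.REMOTE for f in functions}

    def delegate(self, function):
        # Hand a single function over to the onboard system.
        self.modes[function] = ControlMode.SUPERVISORY

    def autonomy_share(self):
        # Fraction of functions handled onboard: a crude autonomy measure.
        onboard = sum(m is ControlMode.SUPERVISORY for m in self.modes.values())
        return onboard / len(self.modes)

module = InformationSupportModule(
    ["obstacle_detection", "visual_odometry", "path_following"])
module.delegate("obstacle_detection")      # first step toward supervisory control
print(round(module.autonomy_share(), 2))   # prints 0.33
```

The point of the sketch is the granularity: because each function is switched separately, an existing remotely controlled vehicle can be re-equipped one capability at a time, which is what keeps the transition economically feasible.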