ORGANIZATION OF GOAL-DIRECTED MOVEMENTS OF VEHICLES USING VISUAL LANDMARKS

Abstract

The report considers the solution of the navigation problem with the help of a vision system (VS) that determines the position of a mobile vehicle relative to landmarks designated in the surrounding space. Navigation by landmarks provides the most objective criterion for the location of a mobile vehicle in its surroundings: the measurement of the parameters that characterize the vehicle's position relative to the landmarks is almost independent of other navigation measurements. Data input for correcting coordinates and other motion parameters can therefore be performed not continuously, but at discrete and, in general, fairly rare moments of time. The general scheme of the solution is considered, from problem statement to obtaining navigation information. The integration of the obtained data with data from other navigation tools is briefly described, and the key problems and parameters of the VS that affect the accuracy of the results are analyzed. The key point of the method is the solution of a system of equations describing the position of the robotic complex relative to the specified landmarks. This system is solved by a modified Gauss-Newton method for a nonlinear overdetermined system of equations: linearization is performed by replacing the left-hand side of each equation with its differential at the point of the initial approximation. The values of the unknowns in the overdetermined system of linear equations that minimize the sum of the squared residuals can be obtained either by singular value decomposition (SVD) or by symmetrization of the system (the normal equations). SVD is more resistant to the accumulation of computational error, but it is somewhat more demanding of computing resources and more difficult to implement, so we used the symmetrization solution as the simpler one; the resulting symmetric system is solved by the square-root (Cholesky) method.
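The Gauss-Newton scheme described above can be sketched as follows. This is a minimal illustration only, not the authors' implementation: it assumes a planar vehicle position estimated from range measurements to landmarks with known coordinates, and all function and variable names are hypothetical. Linearization of each equation gives a Jacobian row, and the overdetermined linear step is solved through the normal equations by Cholesky factorization:

```python
import numpy as np

def gauss_newton_position(landmarks, ranges, x0, iters=10):
    """Estimate a 2-D vehicle position from ranges to known landmarks.

    Each measurement contributes one equation ||x - l_i|| = r_i; with more
    landmarks than unknowns the nonlinear system is overdetermined.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        diffs = x - landmarks                  # (n, 2) vectors to landmarks
        dists = np.linalg.norm(diffs, axis=1)  # predicted ranges
        r = dists - ranges                     # residuals of the equations
        J = diffs / dists[:, None]             # Jacobian of ||x - l_i|| wrt x
        # Symmetrization (normal equations), then the Cholesky (square-root) method
        A = J.T @ J
        b = -J.T @ r
        L = np.linalg.cholesky(A)
        y = np.linalg.solve(L, b)              # forward substitution
        dx = np.linalg.solve(L.T, y)           # back substitution
        x = x + dx
        if np.linalg.norm(dx) < 1e-10:         # converged
            break
    return x
```

With exact ranges to four non-collinear landmarks the iteration converges to the true position in a few steps; in practice the normal-equations matrix squares the condition number of the Jacobian, which is the accuracy trade-off against SVD noted above.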
To detect landmarks, two types of VS modules are used: a panoramic module, based on a camera with a fish-eye lens, and a stereo module. The proposed method makes it possible to refine the motion parameters from separate, sparse measurements of the vehicle's own position and velocity relative to landmarks in the surrounding space. Both independently and in combination with other navigation tools, the described approach provides high-precision determination of navigation parameters in various driving conditions. The results of field experiments with a model of the proposed system in motion under various conditions are described, and ways of improving and developing the considered approach are discussed.
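The combination with other navigation tools can be illustrated schematically: a continuously integrated source (e.g. odometry) accumulates drift, and the rare landmark fixes correct it at discrete moments. The sketch below is a deliberately simplified blend with a fixed gain, under assumed names and data, not the fusion scheme of the paper:

```python
import numpy as np

def dead_reckon_with_fixes(odometry, fixes, gain=0.8):
    """Propagate a 2-D position by odometry increments; when a (rare)
    landmark fix is available for a step, blend it in with a fixed gain.

    odometry: sequence of 2-D displacement increments
    fixes: dict mapping step index -> absolute position from landmarks
    """
    x = np.zeros(2)
    track = []
    for k, dx in enumerate(odometry):
        x = x + np.asarray(dx, dtype=float)        # continuous dead reckoning
        if k in fixes:                             # sparse landmark correction
            x = x + gain * (np.asarray(fixes[k], dtype=float) - x)
        track.append(x.copy())
    return np.array(track)
```

Even with corrections at only two of ten steps, the final position error stays well below that of pure dead reckoning with a systematic odometry bias, which is the practical benefit of discrete landmark measurements noted in the abstract.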


Published:

2021-04-04

Section:

SECTION IV. COMMUNICATION, NAVIGATION, AND GUIDANCE

Keywords:

Vision systems, unmanned mobile vehicles, navigation, landmarks, software framework, Gauss-Newton method