The accuracy dependency investigation of simultaneous localization and mapping on the errors from mobile device sensors
Myasnikov V.V., Dmitriev E.A.

 

Samara National Research University, 443086, Samara, Russia, Moskovskoye Shosse 34;

IPSI RAS – Branch of the FSRC “Crystallography and Photonics” RAS, 443001, Samara, Russia, Molodogvardeyskaya 151


Abstract:
Monocular simultaneous localization and mapping (SLAM) is one of the most complex and well-known problems, spanning several scientific fields: robotics, computer vision and virtual reality. This paper studies the SLAM problem for a mobile device equipped with a monocular camera and additional sensors: an accelerometer, a gyroscope and a digital compass. The latter make it possible to obtain an additional estimate of the mobile device's position and orientation. The aim is to assess the potential suitability and efficiency of using this extra information from the inertial sensors to improve the quality of the solution and to reduce the time needed to obtain it. The experimental part of the study, which includes both model and field experiments, allowed us to determine the requirements on the permissible errors introduced by the mobile device's sensors. For a specific mobile device model, it is shown that the electronic compass meets these requirements, whereas the errors of the inertial sensors used to estimate the device's movements are unacceptably large.
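
To illustrate why inertial-sensor errors matter far more for position than for orientation, the following is a minimal planar dead-reckoning sketch in Python. It is not the method of the paper; the function name dead_reckon, the 100 Hz sample rate and the 0.05 m/s^2 accelerometer bias are illustrative assumptions. Heading is obtained by a single integration of the gyroscope rate, whereas position requires a double integration of the accelerometer, so even a small constant bias produces a position error that grows roughly quadratically with time.

import numpy as np

def dead_reckon(accel, gyro, dt, heading0=0.0):
    # accel: (N, 2) device-frame horizontal accelerations in m/s^2 (gravity removed)
    # gyro:  (N,)   yaw rates in rad/s
    # dt:    sampling period in s
    # Returns world-frame positions (N, 2) and headings (N,).
    heading = heading0 + np.cumsum(gyro) * dt            # orientation: one integration
    cos_h, sin_h = np.cos(heading), np.sin(heading)
    # Rotate each device-frame acceleration sample into the world frame.
    world_acc = np.stack([cos_h * accel[:, 0] - sin_h * accel[:, 1],
                          sin_h * accel[:, 0] + cos_h * accel[:, 1]], axis=1)
    velocity = np.cumsum(world_acc, axis=0) * dt         # first integration
    position = np.cumsum(velocity, axis=0) * dt          # second integration
    return position, heading

if __name__ == "__main__":
    # Hypothetical numbers: 10 s at 100 Hz, a constant 0.05 m/s^2 accelerometer
    # bias and no true motion; the position error grows roughly as 0.5*b*t^2.
    dt, n = 0.01, 1000
    accel = np.full((n, 2), [0.05, 0.0])
    gyro = np.zeros(n)
    pos, _ = dead_reckon(accel, gyro, dt)
    print("drift after 10 s: %.2f m" % np.linalg.norm(pos[-1]))  # about 2.5 m

With the assumed bias alone and no true motion, the sketch drifts by about 2.5 m after 10 s (roughly 0.5·b·t^2), which is consistent with the paper's finding that compass-based orientation is usable while accelerometer-based displacement estimates are not.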

Keywords:
SLAM, visual odometry, scene reconstruction, mapping, mobile device, inertial sensors, compass

Citation:
Myasnikov VV, Dmitriev EA. The accuracy dependency investigation of simultaneous localization and mapping on the errors from mobile device sensors. Computer Optics 2019; 43(3): 492-503. DOI: 10.18287/2412-6179-2019-43-3-492-503.

References:

  1. Horn BKP. Robot vision. London, Cambridge: The MIT Press; 1986.
  2. Forsyth D, Ponce J. Computer vision: A modern approach. Upper Saddle River, NJ: Prentice-Hall; 2003.
  3. Shapiro L. Computer vision and image processing. Academic Press; 1992.
  4. Durrant-Whyte H, Bailey T. Simultaneous localization and mapping (SLAM): Part I. The essential algorithms. IEEE Robot Automat Mag 2006; 13(2): 99-110.
  5. Durrant-Whyte H, Bailey T. Simultaneous localization and mapping (SLAM): Part II. State of the art. IEEE Robot Automat Mag 2006; 13(3): 108-117.
  6. Cadena C, Carlone L, Carrillo H, Latif Y, Scaramuzza D, Neira J, Reid I, Leonard JJ. Past, present, and future of simultaneous localization and mapping: Toward the robust-perception age. IEEE Transactions on Robotics 2016; 32(6): 1309-1332.
  7. Younes G, Asmar D, Shammas E, Zelek J. Keyframe-based monocular SLAM: design, survey, and future directions. Robot Auton Syst 2017; 98: 67-88.
  8. Goshin YeV, Fursov VA. Solving a camera autocalibration problem with a conformed identification method [In Russian]. Computer Optics 2012; 36(4): 605-611.
  9. Kotov AP, Fursov VA, Goshin YeV. Technology for fast 3D-scene reconstruction from stereo images [In Russian]. Computer Optics 2015; 39(4): 600-605. DOI: 10.18287/0134-2452-2015-39-4-600-605.
  10. Myasnikov VV. Model-based gradient field descriptor as a convenient tool for image recognition and analysis [In Russian]. Computer Optics 2012; 36(4): 596-604.
  11. Fischler MA, Bolles RC. Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Communications of the ACM 1981; 24(6): 381-395.
  12. Levenberg K. A method for the solution of certain non-linear problems in least squares. Quarterly of Applied Mathematics 1944; 2(2): 164-168.
  13. Marquardt D. An algorithm for least-squares estimation of nonlinear parameters. SIAM Journal on Applied Mathematics 1963; 11(2): 431-441.
  14. Montemerlo M, Thrun S, Koller D, Wegbreit B. FastSLAM: A factored solution to the simultaneous localization and mapping problem. Proc AAAI Nat Conf Artif Intell 2002: 593-598.
  15. Klein G, Murray D. Parallel tracking and mapping for small AR workspaces. Proc IEEE and ACM Int Symp Mixed Augmented Reality (ISMAR) 2007: 225-234.
  16. Engel J, Sturm J, Cremers D. Semi-dense visual odometry for a monocular camera. Int Conf Computer Vision (ICCV) 2013: 1449-1456.
  17. Engel J, Schöps T, Cremers D. LSD-SLAM: Large-scale direct monocular SLAM. European Conference on Computer Vision (ECCV) 2014: 834-849.
  18. Mur-Artal R, Montiel JMM, Tardos JD. ORB-SLAM: A versatile and accurate monocular SLAM system. IEEE Transactions on Robotics 2015; 31(5): 1147-1163.
  19. Newcombe RA, Lovegrove SJ, Davison AJ. DTAM: dense tracking and mapping in real-time. IEEE Int Conf Computer Vision 2011: 2320-2327.
  20. Stühmer J, Gumhold S, Cremers D. Real-time dense geometry from a handheld camera. Pattern Recognition (DAGM) 2010: 11-20.
  21. Tanskanen P, Kolev K, Meier L, Paulsen FC, Saurer O, Pollefeys M. Live metric 3D reconstruction on mobile phones. IEEE Int Conf Computer Vision 2013: 65-72.
  22. Roxas M, Oishi T. Real-time simultaneous 3D reconstruction and optical flow estimation. IEEE Winter Conference on Applications of Computer Vision (WACV) 2018: 885-893.
  23. Schuster R, Wasenmüller O, Stricker D. Dense scene flow from stereo disparity and optical flow. Computer Science in Cars Symposium 2018.
  24. Kummerle R, Steder B, Dornhege C, Ruhnke M, Grisetti G, Stachniss C, Kleiner A. On measuring the accuracy of SLAM algorithms. Autonomous Robots 2009; 27(4): 387-407.
  25. Ma Z, Qiao Y, Lee B, Fallon E. Experimental evaluation of mobile phone sensors. 24th IET Irish Signals and Systems Conference 2013: 49.
  26. Kok M, Hol JD, Schon TB. Using inertial sensors for position and orientation estimation. Foundations and Trends in Signal Processing 2017; 11(1-2): 1-153.
  27. Titterton DH, Weston JL. Strapdown inertial navigation technology. London, UK, Reston, Virginia: Institution of Engineering and Technology; 1996. ISBN: 978-0-86341-358-2.
  28. Android. Source. Develop. Sensor types. Source: <https://source.android.com/devices/sensors/sensor-types#rotation_vector>.
