Systems Engineering and Electronics ›› 2022, Vol. 44 ›› Issue (3): 977-985.doi: 10.12305/j.issn.1001-506X.2022.03.30

• Guidance, Navigation and Control •

Visual-inertial SLAM method based on multi-scale optical flow fused with feature points

Tongdian WANG, Jieyu LIU*, Zongshou WU, Qiang SHEN, Erliang YAO   

  1. College of Missile Engineering, Rocket Force University of Engineering, Xi'an 710025, China
  • Received: 2021-07-02 Online: 2022-03-01 Published: 2022-03-10
  • Contact: Jieyu LIU

Abstract:

To improve the robustness and accuracy of visual-inertial navigation in weakly textured environments, a visual-inertial simultaneous localization and mapping (SLAM) method is proposed that fuses a multi-scale, homogenized optical flow with the feature-point method, combining the high accuracy of the feature-point method, the speed of the optical flow method, and inertial information. Firstly, the feature extraction of the oriented FAST and rotated BRIEF (ORB) algorithm is improved: ORB feature points are extracted with a multi-scale grid, and a quadtree is used to distribute them evenly, improving the spatial dispersion of the features. Secondly, the Lucas-Kanade (LK) optical flow method tracks feature points between consecutive frames for data association, while descriptors are computed and matched only on keyframes to establish data association between keyframes; this preserves the speed of the algorithm while improving positioning accuracy and robustness. Finally, the initial pose obtained from the optical-flow data association provides the initial value for back-end optimization; the reprojection error of the ORB feature points, the pre-integration error of the inertial measurement unit (IMU), and the prior error of the sliding window are combined into a minimization objective function, which is solved by sliding-window nonlinear optimization. Experiments show that the proposed method achieves higher positioning accuracy and robustness than the monocular visual-inertial system (VINS-Mono), with positioning accuracy improved by 16.7% on average.
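The back-end objective combining the three error terms can be sketched in VINS-Mono-style notation; the symbols below are illustrative, not taken from the paper:

```latex
\min_{\mathcal{X}} \Big\{
  \underbrace{\big\| \mathbf{r}_p - \mathbf{H}_p \mathcal{X} \big\|^2}_{\text{sliding-window prior}}
  + \underbrace{\sum_{k \in \mathcal{B}} \big\| \mathbf{r}_{\mathcal{B}}\big(\hat{\mathbf{z}}_{b_k b_{k+1}}, \mathcal{X}\big) \big\|^2_{\mathbf{P}_{b_k b_{k+1}}}}_{\text{IMU pre-integration}}
  + \underbrace{\sum_{(l,j) \in \mathcal{C}} \rho\Big( \big\| \mathbf{r}_{\mathcal{C}}\big(\hat{\mathbf{z}}^{c_j}_l, \mathcal{X}\big) \big\|^2_{\mathbf{P}^{c_j}_l} \Big)}_{\text{ORB reprojection}}
\Big\}
```

Here $\mathcal{X}$ stacks the states in the sliding window, $\mathbf{r}_p$, $\mathbf{r}_{\mathcal{B}}$, and $\mathbf{r}_{\mathcal{C}}$ are the prior, pre-integration, and reprojection residuals, the norms are Mahalanobis norms weighted by the corresponding covariances, and $\rho(\cdot)$ is a robust kernel on the visual term.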
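The quadtree homogenization step described above can be sketched as follows. This is a minimal illustration of the ORB-SLAM-style idea, not the paper's implementation: the most crowded cell is split into four quadrants until the number of cells reaches the target, and only the highest-response keypoint survives in each cell. The function name and the `(x, y, response)` keypoint representation are assumptions for this sketch.

```python
def quadtree_distribute(keypoints, x0, y0, x1, y1, target):
    """Spatially homogenize feature points with a quadtree.

    keypoints: list of (x, y, response) tuples.
    (x0, y0, x1, y1): image region covered by the root cell.
    target: approximate number of points to keep.
    """
    # Start with one node covering the whole image.
    nodes = [(x0, y0, x1, y1, list(keypoints))]
    while len(nodes) < target:
        # Split the most crowded cell into four quadrants.
        nodes.sort(key=lambda n: len(n[4]), reverse=True)
        cx0, cy0, cx1, cy1, pts = nodes[0]
        if len(pts) <= 1 or (cx1 - cx0) < 1.0:
            break  # every cell holds one point, or cells are sub-pixel
        nodes.pop(0)
        mx, my = (cx0 + cx1) / 2.0, (cy0 + cy1) / 2.0
        for qx0, qy0, qx1, qy1 in ((cx0, cy0, mx, my), (mx, cy0, cx1, my),
                                   (cx0, my, mx, cy1), (mx, my, cx1, cy1)):
            sub = [p for p in pts if qx0 <= p[0] < qx1 and qy0 <= p[1] < qy1]
            if sub:  # drop empty quadrants
                nodes.append((qx0, qy0, qx1, qy1, sub))
    # Keep only the strongest response in each cell.
    return [max(pts, key=lambda p: p[2]) for *_, pts in nodes]
```

A cluster of nearby detections thus collapses to its single strongest member, while isolated points in sparse regions are retained, which is what improves the spatial dispersion of the extracted features.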

Key words: visual-inertial simultaneous localization and mapping (SLAM), optical flow method, state estimation

