Systems Engineering and Electronics ›› 2023, Vol. 45 ›› Issue (4): 1127-1133.doi: 10.12305/j.issn.1001-506X.2023.04.21

• Guidance, Navigation and Control •

UAV localization method with multi-view fusion

Yang PANG1, Ming WANG2,*, Ziyi YAN3, Tongyao YUE4, Zhe ZHOU1   

  1. School of Instrument Science and Optoelectronic Engineering, Beihang University, Beijing 100191, China
    2. Institute of Unmanned Systems, Beihang University, Beijing 100191, China
    3. Xi'an Flight Automatic Control Research Institute, Xi'an 710018, China
    4. Beijing Institute of Control and Electronics Technology, Beijing 100082, China
  • Received: 2022-04-24 Online: 2023-03-29 Published: 2023-03-28
  • Contact: Ming WANG

Abstract:

Unmanned aerial vehicle (UAV) technology is widely used in military and civilian fields. Early warning, flight control, and trajectory tracking all require navigation equipment that provides accurate position information. Traditional monocular vision measurement cannot accurately recover the depth of the target, and binocular vision measurement is sensitive to distance and viewpoint, which limits its accuracy. To address this problem, an all-round multi-camera, multi-view fusion positioning method is proposed: the camera with the largest imaging area of the target serves as the main camera, the cameras on either side serve as auxiliary cameras, an object-point positioning method computes the 3D position of the UAV from each view, and the average of these estimates is taken as the UAV's positioning result. Experimental verification shows that the multi-view fusion object-point measurement method achieves a positioning accuracy of 3 cm.
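To make the fusion strategy concrete, the sketch below shows one way the described pipeline could be realized: pick the view with the largest imaging (bounding-box) area as the main camera, pair it with each side camera, compute a per-pair 3D estimate, and average the results. The function names, the use of linear (DLT) triangulation as a stand-in for the paper's object-point positioning step, and the input conventions are all assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation from two 3x4 projection matrices and pixel points.
    Stand-in for the object-point positioning step described in the abstract."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]          # homogeneous -> Euclidean 3D point

def fuse_views(proj_mats, pixels, bbox_areas):
    """Fuse per-view estimates into one UAV position.

    proj_mats  : list of 3x4 camera projection matrices (assumed calibrated).
    pixels     : per-camera pixel coordinates (u, v) of the UAV.
    bbox_areas : per-camera imaging (bounding-box) areas of the UAV.
    """
    main = int(np.argmax(bbox_areas))   # main view: largest imaging area
    estimates = [
        triangulate(proj_mats[main], proj_mats[i], pixels[main], pixels[i])
        for i in range(len(proj_mats)) if i != main
    ]
    # Average of the per-pair estimates as the fused positioning result.
    return np.mean(np.asarray(estimates), axis=0)
```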

Key words: unmanned aerial vehicle (UAV), multi-view fusion, positioning, object-point positioning

CLC Number: 
