Systems Engineering and Electronics

• Electronic Technology •

Self-calibration method of gain/phase error based on linear transformation

QU Zhi-Yu1, WU Di2, WANG Yan1   

1. College of Information and Communication Engineering, Harbin Engineering University, Harbin 150001, China; 2. Radar Research (Beijing) Leihua Electronic Technology Institute, Beijing 100012, China
• Online: 2016-05-25  Published: 2010-01-03

Abstract:

Processing the signals received by an array of sensors to localize multiple emitters is of great interest. However, the gain/phase errors that are common in practical arrays severely degrade the estimation performance of direction-finding algorithms. In this paper, a self-calibration method for array gain/phase errors based on linear transformation is proposed. Both the gain/phase error coefficients and the directions of arrival (DOA) of the incident signals are estimated effectively by exploiting a group of instrumental sensors with consistent gain/phase responses, applying an orthogonal linear transformation to the data matrix, and solving the resulting equations with the least squares method. The proposed algorithm requires neither spectral peak searching nor eigendecomposition, so its computational load and complexity are low and the array gain/phase errors can be calibrated quickly. Computer simulation results verify the effectiveness and superiority of the algorithm.
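
The abstract only outlines the approach. As a rough illustration of the general idea (a few instrumental sensors assumed free of gain/phase errors, a least-squares fit, and no spectral peak search or eigendecomposition), the following minimal NumPy sketch calibrates a half-wavelength uniform linear array against a single narrowband source. The array size, SNR, source angle, the single-source restriction, and the phase-slope DOA step are all assumptions made for this illustration; they are not the orthogonal-linear-transformation algorithm derived in the paper.

import numpy as np

# Minimal sketch (illustrative assumptions, not the paper's algorithm):
# data model x(t) = diag(gamma) * a(theta) * s(t) + n(t), where gamma holds
# the unknown gain/phase error coefficients and the first K sensors are
# error-free "instrumental" sensors.

rng = np.random.default_rng(0)

M, K, N = 8, 3, 2000           # sensors, instrumental (error-free) sensors, snapshots
theta = np.deg2rad(20.0)       # true direction of arrival of one narrowband source
d = 0.5                        # element spacing in wavelengths (half-wavelength ULA)
snr_db = 20.0

m_idx = np.arange(M)
a_true = np.exp(-2j * np.pi * d * m_idx * np.sin(theta))   # ideal steering vector

# Unknown gain/phase error coefficients; the first K sensors are assumed error-free.
gamma = np.ones(M, dtype=complex)
gamma[K:] = (1.0 + 0.2 * rng.standard_normal(M - K)) \
            * np.exp(1j * np.deg2rad(10.0) * rng.standard_normal(M - K))

# Array snapshots with additive white noise.
s = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
sigma_n = 10.0 ** (-snr_db / 20.0)
noise = sigma_n * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2)
X = np.outer(gamma * a_true, s) + noise

# Step 1: DOA from the instrumental subarray via the phase slope of the
# cross-correlations with the reference sensor (no spectral search, no EVD).
r = (X[:K] @ X[0].conj()) / N
slope = np.polyfit(np.arange(K), np.unwrap(np.angle(r)), 1)[0]
theta_hat = np.arcsin(-slope / (2.0 * np.pi * d))

# Step 2: reconstruct the source waveform from the instrumental sensors, then
# solve X[m, :] ~ gamma_m * a_m(theta_hat) * s_hat in the least-squares sense.
a_hat = np.exp(-2j * np.pi * d * m_idx * np.sin(theta_hat))
s_hat = (a_hat[:K].conj() @ X[:K]) / K
gamma_hat = (X @ s_hat.conj()) / (a_hat * (s_hat @ s_hat.conj()))

print("DOA (deg): true %.2f, estimated %.2f" % (np.rad2deg(theta), np.rad2deg(theta_hat)))
print("max gain/phase coefficient error: %.3f" % np.max(np.abs(gamma_hat - gamma)))

The sketch only shares the partly calibrated data model with the paper; substituting the orthogonal linear transformation and least-squares formulation described in the abstract would replace steps 1 and 2 while keeping the same instrumental-sensor setup.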