Systems Engineering and Electronics ›› 2021, Vol. 43 ›› Issue (10): 2775-2781. doi: 10.12305/j.issn.1001-506X.2021.10.09

• Special Column: New Advances in Electromagnetic Scattering and Inverse Scattering Research •

Radar HRRP sequence target recognition method using an attention-based stacked LSTM network

Yifan ZHANG1,*, Shuanghui ZHANG2, Yongxiang LIU2, Feng JING1

  1. School of Information and Communication, National University of Defense Technology, Xi'an, Shaanxi 710106, China
    2. School of Electronic Science and Technology, National University of Defense Technology, Changsha, Hunan 410073, China
  • Received: 2021-02-15 Online: 2021-10-01 Published: 2021-11-04
  • Corresponding author: Yifan ZHANG
  • About the authors: ZHANG Yifan (1994—), male, teaching assistant, M.S.; research interests: radar target recognition and deep learning. ZHANG Shuanghui (1989—), male, associate research fellow, Ph.D.; research interests: radar imaging, compressed sensing, and Bayesian inference. LIU Yongxiang (1976—), male, professor, Ph.D.; research interests: target micro-motion characteristic analysis and recognition. JING Feng (1979—), male, associate professor, Ph.D.; research interests: intelligent information processing.

Abstract:

Traditional radar high resolution range profile (HRRP) sequence recognition methods rely on handcrafted feature extraction, and existing deep learning methods suffer from the vanishing gradient problem, which leads to slow convergence and low recognition accuracy. To address these problems, an attention-based stacked long short-term memory (Attention-SLSTM) network model is proposed. The model extracts deeper abstract features from HRRP sequences by stacking multiple long short-term memory (LSTM) network layers; it alleviates the vanishing gradient problem of the stacked LSTM (SLSTM) by replacing the model's activation function; and it introduces an attention mechanism that computes distribution weights over the feature sequence and applies them in the classification step, enhancing the nonlinear representation capability of the hidden-layer features. Experimental results for several different purposes on the standard radar target recognition dataset MSTAR show that the proposed method converges faster and recognizes targets more accurately, achieving a higher recognition rate than several existing methods, which demonstrates its correctness and effectiveness.
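To make the described pipeline concrete, the following is a minimal PyTorch sketch (not the authors' code) of an attention-based stacked LSTM classifier for HRRP sequences: several LSTM layers are stacked, an additive attention module weights the hidden states over time, and the weighted feature vector is fed to a classifier. The layer sizes, sequence length, number of classes, and the exact attention form are illustrative assumptions; the activation-function replacement described in the abstract would require a custom LSTM cell and is not reproduced here.

# Minimal sketch of an attention-based stacked LSTM (Attention-SLSTM-style) classifier.
# All hyperparameters below are illustrative assumptions, not values from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionStackedLSTM(nn.Module):
    def __init__(self, range_bins=128, hidden_size=256, num_layers=3, num_classes=10):
        super().__init__()
        # Stacked LSTM: each time step consumes one HRRP of the input sequence.
        self.lstm = nn.LSTM(input_size=range_bins, hidden_size=hidden_size,
                            num_layers=num_layers, batch_first=True)
        # Additive attention: score each hidden state, then softmax over time.
        self.attn_score = nn.Sequential(
            nn.Linear(hidden_size, hidden_size), nn.Tanh(),
            nn.Linear(hidden_size, 1))
        self.classifier = nn.Linear(hidden_size, num_classes)

    def forward(self, x):
        # x: (batch, seq_len, range_bins) -- a sequence of HRRPs per target
        h, _ = self.lstm(x)                                   # (batch, seq_len, hidden)
        scores = self.attn_score(h).squeeze(-1)               # (batch, seq_len)
        alpha = F.softmax(scores, dim=1)                      # attention weights over time
        context = torch.sum(alpha.unsqueeze(-1) * h, dim=1)   # weighted sequence feature
        return self.classifier(context)                       # class logits

if __name__ == "__main__":
    model = AttentionStackedLSTM()
    dummy = torch.randn(4, 16, 128)   # 4 targets, 16 HRRPs each, 128 range cells
    print(model(dummy).shape)         # torch.Size([4, 10])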

Key words: high-resolution range profile sequence (HRRPs), attention mechanism, long short-term memory (LSTM) network, radar automatic target recognition (RATR)

CLC number: