Systems Engineering and Electronics ›› 2019, Vol. 41 ›› Issue (3): 509-514. doi: 10.3969/j.issn.1001-506X.2019.03.07

• Electronic Technology •

Ship detection in GaoFen-2 remote sensing imagery based on DPM and R-CNN

LOU Lizhi, ZHANG Tao, ZHANG Shaoming   

  1. College of Surveying, Mapping and Geo-Informatics, Tongji University, Shanghai 200092, China
  • Online: 2019-02-25    Published: 2019-02-27

Abstract:

A ship detection method for GaoFen-2 (GF2) remote sensing imagery is proposed based on the deformable part model (DPM) and compared with the region-based convolutional neural network (R-CNN). The GF2 images are first segmented to obtain rough regions of interest (ROIs) around candidate ships. Histogram of oriented gradients (HOG) features and multi-layer convolutional features are then computed within the ROIs and fed to the DPM and the R-CNN, respectively. To test the best achievable performance of the R-CNN, a shallower network (ZF-net, five convolutional layers) and a deeper one (VGG-net, 13 convolutional layers) are both applied to ship detection. Experimental results on eight GF2 images containing 3523 ships show that both the DPM and the R-CNN detect ships surrounded by open water with high recall and precision; for ships moored tightly together, however, the DPM outperforms the R-CNN. The average precisions of the methods based on HOG+DPM, ZF-net and VGG-net are 95.031%, 93.282% and 93.683%, respectively.
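
The abstract describes a two-stage pipeline: segment the image to obtain rough ROIs, then compute HOG features (for the DPM) and convolutional features (for the R-CNN) inside those ROIs. The Python sketch below illustrates only the first stage, under assumptions the abstract does not confirm: the segmentation algorithm (Otsu thresholding is a stand-in), the minimum blob area, and the HOG cell/block sizes are all illustrative choices, not the authors' implementation.

    # Minimal sketch of the ROI + HOG stage, assuming scikit-image.
    # Segmentation method (Otsu) and all parameter values are illustrative
    # guesses; the paper does not disclose its actual implementation.
    from skimage.feature import hog
    from skimage.filters import threshold_otsu
    from skimage.measure import label, regionprops

    def rough_ship_rois(gray, min_area=50):
        """Separate bright objects from darker water; return rough ROI boxes."""
        mask = gray > threshold_otsu(gray)      # assumed: ships brighter than water
        boxes = []
        for region in regionprops(label(mask)): # connected components of the mask
            if region.area >= min_area:         # drop speckle and small noise blobs
                boxes.append(region.bbox)       # (min_row, min_col, max_row, max_col)
        return boxes

    def hog_in_rois(gray, boxes):
        """Compute one HOG descriptor per ROI, as input to a DPM-style detector."""
        feats = []
        for r0, c0, r1, c1 in boxes:
            patch = gray[r0:r1, c0:c1]
            if min(patch.shape) < 16:           # too small for a 2x2-cell HOG block
                continue
            feats.append(hog(patch,
                             orientations=9,    # standard 9 orientation bins
                             pixels_per_cell=(8, 8),
                             cells_per_block=(2, 2)))
        return feats

In the full method, the per-ROI HOG features would be scored by a DPM (root plus part filters), while the R-CNN branch would instead pool multi-layer convolutional features over the same ROIs; both detectors lie outside this sketch.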