Systems Engineering and Electronics ›› 2025, Vol. 47 ›› Issue (5): 1718-1727. doi: 10.12305/j.issn.1001-506X.2025.05.34

• Communications and Networks •

Edge model collaboration and data compression algorithm for unmanned systems at sea

Jun JIANG1, Jiarui ZHANG1,*, Jilong PAN2, Guolin SUN2

  1. Department of Combat Operation and Planning, Naval University of Engineering, Wuhan 430033, China
    2. Intelligent Computing Research Institute, University of Electronic Science and Technology of China, Chengdu 611731, China
  • Received: 2024-06-27 Online: 2025-06-11 Published: 2025-06-18
  • Contact: Jiarui ZHANG
  • About the authors: JIANG Jun (1977—), male, associate professor, Ph.D. His research interests include combat software and simulation, and combat command decision-making and operations analysis.
    ZHANG Jiarui (1994—), male, master's degree candidate. His research interests include combat software and simulation, and combat command decision-making and operations analysis.
    PAN Jilong (2000—), male, master's degree candidate. His research interest is federated learning.
    SUN Guolin (1978—), male, professor, Ph.D. His research interests include artificial intelligence, edge computing, and resource management.



Abstract:

In maritime environments, unmanned systems rely on edge artificial intelligence (AI) model collaboration to perform data collection and edge processing tasks. To cope with poor communication conditions, limited bandwidth, and interference-prone links, this paper first proposes a collaborative model training method, federated mutual distillation, which reduces the bandwidth required to transmit training data among unmanned aerial vehicles. Second, from the perspective of de-redundant data transmission, a differential dynamic data compression method is proposed to reduce the frequency of data transmission. Simulation results show that models trained with the federated mutual distillation method clearly outperform models trained in a distributed manner, while saving substantial communication bandwidth compared with centralized training. The proposed differential dynamic data compression method greatly reduces both the length and the sending frequency of communication messages, making it well suited to bandwidth-limited, weakly connected communication environments.
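
The mechanics of federated mutual distillation are not spelled out on this page; as an illustration only, the following minimal PyTorch sketch shows one common form of the idea, in which edge nodes exchange soft predictions (logits) on a small shared reference batch instead of raw data or full model weights. All names (local_round, shared_x, peer_avg_logits) and the loss weighting are hypothetical assumptions, not the paper's implementation.

```python
# Illustrative sketch of federated mutual distillation (hypothetical API,
# not the paper's code). Each edge node trains a local model on its private
# data and distills from the peers' averaged soft labels on a shared
# reference batch; only logits cross the communication link.
import torch
import torch.nn.functional as F

def local_round(model, optimizer, private_loader, shared_x, peer_avg_logits,
                temperature=2.0, alpha=0.5):
    """One local round: supervised loss on private data plus a KL
    mutual-distillation term toward the peers' averaged predictions."""
    model.train()
    for x, y in private_loader:
        optimizer.zero_grad()
        ce = F.cross_entropy(model(x), y)                    # local supervised loss
        log_p = F.log_softmax(model(shared_x) / temperature, dim=1)
        q = F.softmax(peer_avg_logits / temperature, dim=1)  # averaged peer soft labels
        kd = F.kl_div(log_p, q, reduction="batchmean") * temperature ** 2
        loss = (1 - alpha) * ce + alpha * kd
        loss.backward()
        optimizer.step()
    # Only this logits tensor is transmitted, not data or weights.
    with torch.no_grad():
        return model(shared_x)
```

Under this scheme, per-round traffic scales with the reference-batch size rather than with the dataset or the model size, which is what makes such approaches attractive under weak connectivity.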

Key words: weak communication link, edge model collaboration, federated mutual distillation, data compression
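
Likewise, the differential dynamic data compression method is only named in the abstract; the sketch below illustrates the general differential idea under assumed details: the sender tracks the last transmitted value of each field and emits only the fields whose change exceeds a per-field threshold, suppressing the message entirely when nothing has changed. The class, field names, and thresholds are hypothetical.

```python
# Illustrative sketch of differential dynamic compression (assumed message
# format, not the paper's protocol). Sending only significantly changed
# fields shortens messages and skips transmissions outright, reducing both
# message length and sending frequency on a bandwidth-limited link.
class DifferentialCompressor:
    def __init__(self, thresholds):
        self.thresholds = thresholds   # per-field change thresholds
        self.last_sent = {}            # last transmitted value per field

    def encode(self, reading):
        """Return a dict of only the fields that changed significantly, or None."""
        delta = {}
        for key, value in reading.items():
            prev = self.last_sent.get(key)
            if prev is None or abs(value - prev) >= self.thresholds.get(key, 0.0):
                delta[key] = value
                self.last_sent[key] = value
        return delta or None           # None means: skip this transmission

    def decode(self, delta, state):
        """Receiver side: patch the last known full state with the delta."""
        state.update(delta)
        return state

# Example: only 'speed' crosses its threshold, so only one field is sent.
c = DifferentialCompressor({"lat": 1e-4, "lon": 1e-4, "speed": 0.5})
c.encode({"lat": 30.5812, "lon": 114.3055, "speed": 12.0})        # first full report
msg = c.encode({"lat": 30.5812, "lon": 114.3055, "speed": 13.1})  # -> {"speed": 13.1}
```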

CLC number: