Systems Engineering and Electronics ›› 2025, Vol. 47 ›› Issue (4): 1374-1383. doi: 10.12305/j.issn.1001-506X.2025.04.34

• Communications and Networks •

Communication-efficient decentralized federated learning with resource heterogeneity

Shujia PAN, Siguang CHEN   

  1. School of Internet of Things, Nanjing University of Posts and Telecommunications, Nanjing 210003, China
  • Received:2023-11-03 Online:2025-04-25 Published:2025-05-28
  • Contact: Siguang CHEN

Abstract:

To alleviate the negative impact of data heterogeneity across terminal nodes in decentralized federated learning, and to improve asynchronous compatibility while reducing overall communication overhead, a decentralized federated learning algorithm based on a mask location graph is proposed. Specifically, an asymmetric mask updating scheme is designed that gradually increases sparsity so that the mask norm is bound to the degree of training; it exploits sparse gradients effectively and, combined with trusted sparse federated aggregation, guarantees system security. Secondly, a dynamic mask community partitioning algorithm is designed that combines gradient masks with community partitioning; it exploits the similarity between gradients across the entire network and actively selects similar aggregation targets, thereby improving model performance. Furthermore, separating the model layer from the mask layer in the network structure reduces the impact of computing-power heterogeneity on system scalability. Finally, a single-threaded experimental scheme is developed to simultaneously simulate data heterogeneity, computing-power heterogeneity, and terminal-node asynchrony. Experimental results show that, compared with existing related methods, the proposed algorithm maintains high accuracy on both datasets and under strict asynchronous settings, while reducing communication overhead by 14% to 21%.
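To make two of the ideas summarized above more concrete, the sketch below illustrates a gradually increasing sparsity schedule for gradient masks and the selection of aggregation targets by mask similarity. It is a minimal, hypothetical illustration rather than the authors' algorithm: the functions, the linear schedule, the top-k masking rule, and the Jaccard similarity measure are all assumptions introduced here for clarity.

```python
# Hypothetical sketch, not the paper's implementation: a sparsity schedule that grows
# with training progress, a top-k gradient mask, and neighbour selection by mask overlap.
import numpy as np


def sparsity_at_round(t: int, total_rounds: int,
                      s_init: float = 0.1, s_final: float = 0.9) -> float:
    """Sparsity increases with the training round, tying the mask to training progress."""
    frac = min(t / max(total_rounds, 1), 1.0)
    return s_init + (s_final - s_init) * frac


def topk_mask(grad: np.ndarray, sparsity: float) -> np.ndarray:
    """Keep only the largest-magnitude gradient entries; return a binary mask."""
    k = max(1, int(round((1.0 - sparsity) * grad.size)))
    thresh = np.sort(np.abs(grad).ravel())[-k]
    return (np.abs(grad) >= thresh).astype(np.float32)


def mask_similarity(m1: np.ndarray, m2: np.ndarray) -> float:
    """Jaccard overlap of two binary masks, used as a cheap proxy for gradient similarity."""
    inter = np.logical_and(m1, m2).sum()
    union = np.logical_or(m1, m2).sum()
    return float(inter) / float(union) if union > 0 else 0.0


def select_neighbours(masks: dict, node: int, n_peers: int = 2) -> list:
    """Pick the peers whose masks overlap most with this node's mask as aggregation targets."""
    scores = {peer: mask_similarity(masks[node], m)
              for peer, m in masks.items() if peer != node}
    return sorted(scores, key=scores.get, reverse=True)[:n_peers]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    total_rounds, t = 100, 30
    s = sparsity_at_round(t, total_rounds)
    # Each node derives a sparse mask from its local gradient at the current sparsity level.
    masks = {i: topk_mask(rng.normal(size=1000), s) for i in range(5)}
    print("sparsity at round 30:", round(s, 3))
    print("node 0 aggregates with:", select_neighbours(masks, node=0))
```

In this toy setting, each node only exchanges the sparse (masked) portion of its gradient and preferentially aggregates with peers whose masks, and hence gradients, look similar, which is the intuition behind the communication savings and the community-based target selection described in the abstract.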

Key words: edge computing, federated learning, decentralized system, sparse training

CLC Number: 
