Communication efficiency optimization in federated learning based on multi-objective evolutionary algorithm

Bibliographic Details
Published in: Evolutionary Intelligence, Vol. 16, No. 3, pp. 1033-1044
Main Authors: Chai, Zheng-yi; Yang, Chuan-dong; Li, Ya-lun
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg (Springer Nature B.V.), 01.06.2023
ISSN: 1864-5909, 1864-5917
DOI: 10.1007/s12065-022-00718-x

Summary: Federated learning is an emerging technology that can effectively safeguard personal information. Unlike traditional centralized learning, federated learning avoids data sharing while still training a global model. However, updating the global model consumes substantial client communication resources, which hinders the wide application of this technology. To reduce communication overhead without severely reducing the accuracy of the global model, we use the decomposition-based multi-objective evolutionary algorithm (MOEA/D) within the federated learning framework to optimize the structure of the global model. The structure of the global model is encoded with a highly scalable coding method, which improves the efficiency of the evolutionary neural network search. For comparison, we apply the non-dominated sorting genetic algorithm II (NSGA-II) to the same problem under the same conditions and verify the effectiveness of both algorithms according to the obtained Pareto solutions. We show that MOEA/D converges better when multilayer and convolutional neural networks are used as the global model. Overall, MOEA/D can further strengthen the structure optimization of the federated learning model and reduce communication costs.
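The bi-objective setup the summary describes (minimizing global-model error while minimizing communication cost, then comparing algorithms by their Pareto solutions) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the binary connection mask, the toy objective functions, and all names here are assumptions standing in for the paper's scalable structure encoding and real federated evaluation.

```python
import random

def evaluate(mask):
    """Toy objectives for a candidate structure encoded as a binary mask.
    More active connections -> lower (proxy) error but higher (proxy)
    communication cost; both objectives are minimized."""
    active = sum(mask)
    error = 1.0 / (1 + active)   # stand-in for global-model test error
    cost = float(active)         # stand-in for parameters transmitted per round
    return (error, cost)

def dominates(a, b):
    """Pareto dominance for minimization: a is no worse than b in every
    objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population):
    """Return the non-dominated candidates (the first front, as used when
    comparing MOEA/D and NSGA-II results)."""
    scored = [(ind, evaluate(ind)) for ind in population]
    return [(ind, f) for ind, f in scored
            if not any(dominates(g, f) for _, g in scored)]

random.seed(0)
# 20 random candidate structures over 8 prunable connections.
population = [[random.randint(0, 1) for _ in range(8)] for _ in range(20)]
front = pareto_front(population)
```

In the paper's setting, `evaluate` would involve federated training rounds, so evolutionary algorithms that need few evaluations per generation are preferred; the Pareto front exposes the accuracy/communication trade-off to the practitioner.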