Linear Coded Federated Learning under Multiple Stragglers over Heterogeneous Clients

Bibliographic Details
Published in: 2022 IEEE 25th International Conference on Computer Supported Cooperative Work in Design (CSCWD), pp. 1221-1226
Main Authors: Yang, Yingyao; Wang, Jin; Gu, Fei
Format: Conference Proceeding
Language: English
Published: IEEE, 04.05.2022
DOI: 10.1109/CSCWD54268.2022.9776309

Summary: Recently, federated learning (FL) has become an emerging research area, and combining edge computing with FL is an important research direction. However, heterogeneous federated learning involves many kinds of edge devices, such as personal computers and embedded devices, and resource-limited devices reduce the efficiency of FL. In this paper, we propose efficient linear coded federated learning under multiple stragglers (LCFLMS) to (1) accelerate training and improve the efficiency of heterogeneous FL under multiple stragglers and (2) provide a certain level of privacy protection. We design a client-based multiple-stragglers task outsourcing (C-MSTO) algorithm and a server-based multiple-stragglers task outsourcing (S-MSTO) algorithm to accelerate model computation in general environments with multiple stragglers. During outsourcing, the raw data are protected with a linear coded computing (LCC) scheme. Finally, experimental results demonstrate that LCFLMS reduces training time by 90.22% when the performance differences between clients in the FL system are large.
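The abstract does not detail the encoding, but the privacy-by-coding idea it describes can be illustrated. Below is a minimal Python sketch, not the authors' implementation, of how a resource-limited client might outsource one linear step of its local computation (here a matrix-vector product X @ w) to n helper devices using a random linear code: any k of the n coded results suffice to decode, so up to n - k stragglers can be ignored, and no helper sees a raw data partition. All names here (k, n, G, the helper set) are illustrative assumptions, not values from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    k, n = 3, 5        # k data partitions; n helpers, tolerating n - k = 2 stragglers
    rows, d = 6, 4     # rows per partition; model dimension

    # Raw data partitions and current model held by the client (illustrative).
    X_parts = [rng.standard_normal((rows, d)) for _ in range(k)]
    w = rng.standard_normal(d)

    # Encode: helper i receives a random linear combination of the partitions,
    # which masks any individual raw partition (the privacy claim in the abstract).
    G = rng.standard_normal((n, k))   # generator matrix; any k rows invertible w.h.p.
    coded = [sum(G[i, j] * X_parts[j] for j in range(k)) for i in range(n)]

    # Each helper computes on its coded block only; the operation is linear in X,
    # so it commutes with the coding.
    results = [Xc @ w for Xc in coded]

    # Decode from the k fastest helpers (say helpers 1 and 4 straggled).
    fast = [0, 2, 3]
    R = np.stack([results[i] for i in fast])   # R = G[fast] @ truth
    decoded = np.linalg.solve(G[fast, :], R)   # recovers X_j @ w for each partition j

    assert np.allclose(decoded, np.stack([Xp @ w for Xp in X_parts]))

Because decoding needs only k responses, the slowest n - k helpers never block the update; this straggler tolerance is the mechanism behind the training-time reduction the abstract reports.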