Optimal Time and Energy-Aware Client Selection Algorithms for Federated Learning on Heterogeneous Resources

Bibliographic Details
Published in: 2024 IEEE 36th International Symposium on Computer Architecture and High Performance Computing (SBAC-PAD), pp. 148-158
Main Authors: Nunes, Alan L.; Boeres, Cristina; Drummond, Lucia M. A.; Pilla, Laercio L.
Format: Conference Proceeding
Language: English
Published: IEEE, 13.11.2024
ISSN: 2643-3001
DOI: 10.1109/SBAC-PAD63648.2024.00021

More Information
Summary: Federated Learning systems allow training machine learning models distributed across multiple clients, each using private local data. Iteratively, the clients send their training contributions to a server, which merges them to produce an enhanced global model. Due to resource and data heterogeneity, client selection is crucial for optimizing system efficiency and improving the generalization of the global model. Selecting more clients is likely to increase the overall energy consumption, while selecting too few may degrade the performance of the trained model or require longer training time. We propose two time- and energy-aware client selection algorithms, MEC and ECMTC, whose optimality is proven and which are evaluated against state-of-the-art algorithms in an extensive series of experiments in both simulation and HPC platform scenarios. The results indicate the benefits of jointly optimizing the time and energy consumption metrics using our proposals.
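
The abstract does not describe how MEC or ECMTC operate, so the sketch below is not the paper's method. It is only a minimal illustration of the kind of time/energy trade-off the summary describes, assuming each client has hypothetical per-round time and energy estimates and that a round's duration is set by the slowest selected client. The Client fields, the deadline, and the greedy lowest-energy rule are all assumptions introduced here for illustration.

# Illustrative sketch only: a toy time- and energy-aware client selection.
# NOT the MEC/ECMTC algorithms from the paper; it just shows the trade-off
# the summary mentions. Assumptions (not from the source): each client has
# an estimated local training time and energy cost per round; the round
# finishes when the slowest selected client finishes.
from dataclasses import dataclass
from typing import List

@dataclass
class Client:
    cid: int
    time: float    # estimated local training time (s), assumed known
    energy: float  # estimated energy per round (J), assumed known

def select_clients(clients: List[Client], deadline: float, min_clients: int) -> List[Client]:
    """Greedy heuristic: among clients that fit the deadline, keep the
    lowest-energy ones until min_clients are selected."""
    feasible = [c for c in clients if c.time <= deadline]
    feasible.sort(key=lambda c: c.energy)
    if len(feasible) < min_clients:
        raise ValueError("deadline too tight for the requested number of clients")
    return feasible[:min_clients]

if __name__ == "__main__":
    pool = [Client(0, 12.0, 30.0), Client(1, 8.0, 55.0),
            Client(2, 20.0, 18.0), Client(3, 9.5, 40.0)]
    chosen = select_clients(pool, deadline=15.0, min_clients=2)
    makespan = max(c.time for c in chosen)        # round time = slowest client
    total_energy = sum(c.energy for c in chosen)  # energy of the selection
    print([c.cid for c in chosen], makespan, total_energy)

With these made-up numbers the heuristic picks clients 0 and 3 (the cheapest in energy among those meeting the 15 s deadline), yielding a 12 s round and 70 J, illustrating how a selection can trade energy against round time; the paper's algorithms optimize this jointly and with proven optimality, which this greedy sketch does not.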