Open Access
Article
Distillation-Based User Selection for Heterogeneous Federated Learning
Bowen Li
Wenling Li*
Submitted: 7 Sept 2023 | Accepted: 9 Nov 2023 | Published: 26 Jun 2024

Abstract

Federated learning is an emerging distributed machine learning paradigm that enables users to train machine learning models collaboratively on decentralized, private data. To limit the large communication overhead, the traditional federated learning algorithm samples users randomly, which may degrade model performance due to the statistical heterogeneity among users. In this paper, we propose a distillation-based user selection algorithm for federated learning in heterogeneous settings. Based on knowledge distillation, the soft targets of users are uploaded to the server and serve as the basis for user selection. Our algorithm reduces the statistical heterogeneity of the selected users while incurring low additional communication and computation overhead. Experiments on MNIST and Fashion-MNIST show that the proposed algorithm achieves better model performance than the federated averaging algorithm and several other user selection algorithms.
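The abstract does not specify the exact selection criterion, but the idea of using uploaded soft targets to pick a less heterogeneous cohort can be illustrated with a minimal sketch. In this hypothetical example, each user's soft target is its averaged softmax output over the classes, and the server greedily selects users whose mixed soft-target distribution is closest (in KL divergence) to uniform, as a proxy for low statistical heterogeneity; the function name and the uniform-target choice are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def greedy_select_users(soft_targets, k):
    """Greedily pick k users whose averaged soft targets, mixed
    together, are closest to the uniform class distribution.

    soft_targets: (n_users, n_classes) array; each row is one user's
    averaged softmax output (the "soft target" uploaded to the server).
    Returns the list of selected user indices.
    """
    n_users, n_classes = soft_targets.shape
    uniform = np.full(n_classes, 1.0 / n_classes)
    selected = []
    remaining = set(range(n_users))
    for _ in range(k):
        best, best_div = None, np.inf
        for u in sorted(remaining):
            # Mixture of soft targets if user u were added to the cohort
            mix = np.mean(soft_targets[selected + [u]], axis=0)
            # KL divergence of the mixture from the uniform distribution
            div = np.sum(mix * (np.log(mix + 1e-12) - np.log(uniform)))
            if div < best_div:
                best, best_div = u, div
        selected.append(best)
        remaining.remove(best)
    return selected
```

With three users skewed toward classes 0, 1, and 2 respectively, this greedy rule selects the complementary trio rather than two similar users, since their mixture is closest to uniform.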

How to Cite
Li, B., & Li, W. (2024). Distillation-Based User Selection for Heterogeneous Federated Learning. International Journal of Network Dynamics and Intelligence, 3(2), 100007. https://doi.org/10.53941/ijndi.2024.100007
Copyright & License
Copyright (c) 2024 by the authors.

This work is licensed under a Creative Commons Attribution 4.0 International License.

