ArXiv Preprint
With the booming deployment of the Internet of Things (IoT), health monitoring
applications have flourished. During the recent COVID-19 pandemic, interest in
permanent remote health monitoring solutions has risen, with the aim of
reducing contact and preserving limited medical resources. Among
the technological methods to realize efficient remote health monitoring,
federated learning (FL) has drawn particular attention due to its robustness in
preserving data privacy. However, FL can incur high communication costs due
to frequent transmissions between the FL server and clients. To tackle this
problem, we propose in this paper a communication-efficient federated learning
(CEFL) framework that combines client clustering and transfer learning. First,
we group clients by computing similarity factors based on the characteristics
of their neural network models. Then, a representative client in
each cluster is selected as the cluster leader. Unlike conventional FL, our
method performs FL training only among the cluster leaders. Subsequently, each
leader uses transfer learning to update its
cluster members with the trained FL model. Finally, each member fine-tunes the
received model with its own data. To further reduce communication costs, we
adopt a partial-layer FL aggregation approach, in which only part of the
neural network model is updated and exchanged, rather than the full model.
Through experiments, we show that CEFL can save up to 98.45% in communication
costs while incurring an accuracy loss of less than 3%, compared to
conventional FL. Moreover, CEFL achieves high accuracy for clients with small
or unbalanced datasets.
Dong Chu, Wael Jaafar, Halim Yanikomeroglu
2022-11-30
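
For illustration, the snippet below sketches two of the ideas summarized in the abstract: grouping clients by a similarity factor computed over their local model weights, and partial-layer aggregation among cluster leaders. This is a minimal sketch, not the authors' implementation; the cosine-similarity measure, the greedy clustering with a threshold parameter, the first-member leader rule, and the shared_layers choice are all illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions, not the paper's released code):
# (i) cluster clients by a similarity factor over their model weights,
# (ii) aggregate only a subset of layers across cluster leaders.
import numpy as np

def flatten(model):
    """Concatenate a model's layer weights into a single vector."""
    return np.concatenate([w.ravel() for w in model.values()])

def similarity_factor(model_a, model_b):
    """Cosine similarity between two clients' flattened weights
    (one possible choice of similarity factor; an assumption here)."""
    a, b = flatten(model_a), flatten(model_b)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def cluster_clients(models, threshold=0.9):
    """Greedy clustering: a client joins the first cluster whose leader
    (its first member, an illustrative leader rule) is sufficiently
    similar; otherwise it starts a new cluster."""
    clusters = []
    for cid, model in models.items():
        for cluster in clusters:
            if similarity_factor(model, models[cluster[0]]) >= threshold:
                cluster.append(cid)
                break
        else:
            clusters.append([cid])
    return clusters

def partial_layer_aggregate(leader_models, shared_layers):
    """FedAvg restricted to `shared_layers`: only these layers are
    averaged and exchanged, reducing communication; the remaining
    layers stay local to each leader."""
    return {layer: np.mean([m[layer] for m in leader_models], axis=0)
            for layer in shared_layers}

# Toy usage: 4 clients, each with a 2-layer model.
rng = np.random.default_rng(0)
models = {c: {"conv": rng.normal(size=(3, 3)), "fc": rng.normal(size=(4,))}
          for c in range(4)}
clusters = cluster_clients(models, threshold=0.0)
leaders = [models[c[0]] for c in clusters]
update = partial_layer_aggregate(leaders, shared_layers=["fc"])
print(clusters, update["fc"].shape)
```

In this sketch, only the "fc" layer is averaged and transmitted, which mirrors how partial-layer aggregation trades a small amount of model synchronization for a large cut in communication volume.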