Robust Federated Learning With Noisy Labeled Data Through Loss Function Correction

Li Chen, Fan Ang, Yunfei Chen, Weidong Wang

Research output: Contribution to journal › Article › peer-review


Abstract

Federated learning (FL) is a communication-efficient machine learning paradigm that leverages distributed data at the network edge. Nevertheless, FL usually fails to train a high-quality model when the edge nodes collect noisy labeled data. To tackle this challenge, this paper develops an innovative robust FL scheme. We consider two kinds of networks with different data distributions. First, we design a reweighted FL for a full-data network, where every edge node holds both a large noisy labeled dataset and a small clean dataset. The key idea is that edge devices learn to assign local weights to the loss functions over the noisy labeled dataset and cooperate with the central server to update global weights. Second, we consider a part-data network in which some edge nodes lack a clean dataset and therefore cannot compute the weights locally. Broadcasting the global weights helps those edge nodes reweight their noisy loss functions. Both designs achieve a convergence rate of O(1/T²). Simulation results illustrate that both proposed training processes improve prediction accuracy owing to the proper weight assignment for the noisy loss functions.
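The reweighting idea in the abstract can be sketched in code. The snippet below is a minimal illustration, not the authors' algorithm: it assumes a hypothetical heuristic in which a client with a small clean dataset downweights noisy samples whose loss deviates from a clean-set reference loss, and a server averages the resulting weight vectors so they can be broadcast to clients without clean data (the part-data network). All function names and the weighting rule are illustrative assumptions.

```python
import numpy as np

def local_sample_weights(noisy_losses, clean_loss_ref, temperature=1.0):
    # Hypothetical heuristic: downweight noisy samples whose per-sample
    # loss deviates far from the reference loss measured on the small
    # clean dataset. (The paper's actual weight-learning rule may differ.)
    gaps = np.abs(noisy_losses - clean_loss_ref)
    w = np.exp(-gaps / temperature)
    return w / w.sum()  # normalize to a convex combination

def reweighted_loss(noisy_losses, weights):
    # Weighted sum of per-sample losses used for the local update.
    return float(np.dot(weights, noisy_losses))

def server_aggregate(weight_vectors):
    # Global weights: average the local weight vectors; in a part-data
    # network these would be broadcast to clients lacking a clean set.
    w = np.mean(np.stack(weight_vectors), axis=0)
    return w / w.sum()
```

Because the weights form a convex combination, the reweighted loss always lies between the smallest and largest per-sample loss, and outlier (likely mislabeled) samples contribute less to the local gradient step.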
Original language: English (US)
Pages (from-to): 1501-1511
Number of pages: 11
Journal: IEEE Transactions on Network Science and Engineering
Volume: 10
Issue number: 3
DOIs
State: Published - Dec 7 2022
Externally published: Yes

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Computer Science Applications
  • Computer Networks and Communications
