Asymmetric Loss Functions for Learning with Noisy Labels

Xiong Zhou, Xianming Liu, Junjun Jiang, Xin Gao, Xiangyang Ji

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

26 Scopus citations


Robust loss functions are essential for training deep neural networks with better generalization power in the presence of noisy labels. Symmetric loss functions have been confirmed to be robust to label noise. However, the symmetric condition is overly restrictive. In this work, we propose a new class of loss functions, namely asymmetric loss functions, which are robust to various types of label noise. We investigate general theoretical properties of asymmetric loss functions, including classification calibration, excess risk bounds, and noise tolerance. Meanwhile, we introduce the asymmetry ratio to measure the asymmetry of a loss function. Empirical results show that a higher ratio provides better noise tolerance. Moreover, we modify several commonly used loss functions and establish the necessary and sufficient conditions for them to be asymmetric. Experimental results on benchmark datasets demonstrate that asymmetric loss functions can outperform state-of-the-art methods.
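To make the idea concrete, here is a minimal numpy sketch of one asymmetric-style loss in the spirit the abstract describes: an exponential loss of the true-class probability, exp(-p_y / a). The exact functional form and the hyperparameter value `a` are illustrative assumptions for this sketch, not necessarily the paper's definitions; see the paper itself for the loss families it proposes and their asymmetry conditions.

```python
import numpy as np

def asymmetric_exponential_loss(probs, labels, a=3.0):
    """Illustrative asymmetric-style loss: mean of exp(-p_y / a).

    probs  : (N, K) array of softmax class probabilities
    labels : (N,) integer class labels
    a      : scale hyperparameter (assumed value; would need tuning)
    """
    # Pick out the predicted probability of each sample's labeled class.
    p_y = probs[np.arange(len(labels)), labels]
    # Loss decreases as the model grows confident in the labeled class.
    return float(np.exp(-p_y / a).mean())

# Toy usage: confident, correct predictions incur a lower loss.
probs = np.array([[0.9, 0.05, 0.05],
                  [0.1, 0.8, 0.1]])
labels = np.array([0, 1])
loss = asymmetric_exponential_loss(probs, labels)
```

Note that, unlike cross entropy, this loss is bounded on [0, 1] for probabilities, which is one ingredient behind the robustness of such losses to noisy labels.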
Original language: English (US)
Title of host publication: 38th International Conference on Machine Learning, ICML 2021
Publisher: ML Research Press
Number of pages: 11
ISBN (Print): 9781713845065
State: Published - Jan 1 2021


