Benign Overfitting in Multiclass Classification: All Roads Lead to Interpolation

Ke Wang, Vidya Muthukumar, Christos Thrampoulidis

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

11 Scopus citations

Abstract

The growing literature on “benign overfitting” in overparameterized models has been mostly restricted to regression or binary classification settings; however, most success stories of modern machine learning have been recorded in multiclass settings. Motivated by this discrepancy, we study benign overfitting in multiclass linear classification. Specifically, we consider the following popular training algorithms on separable data: (i) empirical risk minimization (ERM) with cross-entropy loss, which converges to the multiclass support vector machine (SVM) solution; (ii) ERM with least-squares loss, which converges to the min-norm interpolating (MNI) solution; and, (iii) the one-vs-all SVM classifier. Our first key finding is that under a simple sufficient condition, all three algorithms lead to classifiers that interpolate the training data and have equal accuracy. When the data is generated from Gaussian mixtures or a multinomial logistic model, this condition holds under high enough effective overparameterization. Second, we derive novel error bounds on the accuracy of the MNI classifier, thereby showing that all three training algorithms lead to benign overfitting under sufficient overparameterization. Ultimately, our analysis shows that good generalization is possible for SVM solutions beyond the realm in which typical margin-based bounds apply.
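To make the abstract's setup concrete, here is a minimal sketch (an illustration, not the paper's exact construction) of the min-norm interpolating (MNI) solution that ERM with least-squares loss converges to in the overparameterized regime, together with a check that it interpolates the training data. The dimensions and data below are hypothetical.

```python
import numpy as np

# Overparameterized regime: d features >> n samples, k classes.
rng = np.random.default_rng(0)
n, d, k = 20, 200, 3

X = rng.standard_normal((n, d))   # feature matrix, full row rank w.h.p.
y = rng.integers(0, k, size=n)    # class labels
Y = np.eye(k)[y]                  # one-hot label matrix, shape (n, k)

# Min-norm solution of X @ W = Y, via the Moore-Penrose pseudoinverse.
W = np.linalg.pinv(X) @ Y

# The MNI classifier interpolates: it fits every one-hot label exactly.
assert np.allclose(X @ W, Y, atol=1e-8)

# Multiclass prediction: argmax over the k per-class scores.
preds = np.argmax(X @ W, axis=1)
assert np.array_equal(preds, y)   # zero training error
```

Since `X` has full row rank when `d > n`, the pseudoinverse solution satisfies `X @ W = Y` exactly, which is the interpolation property the paper's sufficient condition guarantees is shared by the cross-entropy SVM and one-vs-all SVM classifiers as well.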
Original language: English (US)
Title of host publication: 35th Conference on Neural Information Processing Systems, NeurIPS 2021
Publisher: Neural Information Processing Systems Foundation
Pages: 24164-24179
Number of pages: 16
ISBN (Print): 9781713845393
State: Published - Jan 1 2021
Externally published: Yes
