Constrained submodular minimization for missing labels and class imbalance in multi-label learning

Baoyuan Wu, Siwei Lyu, Bernard Ghanem

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

In multi-label learning, there are two main challenges: missing labels and class imbalance (CIB). The former assumes that only a partial set of labels is provided for each training instance while the other labels are missing. CIB is observed from two perspectives: first, the number of negative labels of each instance is much larger than the number of its positive labels; second, the rates of positive instances (i.e., the number of positive instances divided by the total number of instances) of different classes differ significantly. Both missing labels and CIB lead to significant performance degradation. In this work, we propose a new method to handle these two challenges simultaneously. We formulate the problem as a constrained submodular minimization, composed of a submodular objective function that encourages label consistency and smoothness, as well as class cardinality bound constraints that handle class imbalance. We further present a convex approximation based on the Lovász extension of submodular functions, leading to a linear program, which can be efficiently solved by the alternating direction method of multipliers (ADMM). Experimental results on several benchmark datasets demonstrate the improved performance of our method over several state-of-the-art methods.
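
The convex approximation described in the abstract rests on the Lovász extension, which turns a submodular set function into a convex, piecewise-linear function on the unit cube via the standard greedy (sorting) formula. The sketch below is illustrative only: the function names, the toy cut-style "smoothness" objective, and the use of NumPy are assumptions for exposition, not the authors' implementation.

import numpy as np

def lovasz_extension(F, x):
    # Evaluate the Lovász extension of a submodular set function F at a point x in [0, 1]^n.
    # F maps a set of indices to a float, with F(empty set) == 0; x is a 1-D array.
    # Greedy formula: visit coordinates in decreasing order of x and accumulate the
    # marginal gains of F, each weighted by the corresponding coordinate of x.
    order = np.argsort(-x)
    value, prefix, prev = 0.0, set(), 0.0
    for i in order:
        prefix.add(int(i))
        cur = F(prefix)
        value += x[i] * (cur - prev)
        prev = cur
    return value

# Toy smoothness objective (illustrative only): a graph cut on a chain of four labels.
edges = [(0, 1), (1, 2), (2, 3)]
def cut(S):
    return float(sum((u in S) != (v in S) for u, v in edges))

print(lovasz_extension(cut, np.array([0.9, 0.8, 0.1, 0.0])))  # 0.9, the total variation along the chain

Because the extension is piecewise linear, minimizing it over the box [0, 1]^n subject to linear class-cardinality bounds can be cast as a linear program, which is consistent with the abstract's description of an LP solved by ADMM.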
Original language: English (US)
Title of host publication: 30th AAAI Conference on Artificial Intelligence, AAAI 2016
Publisher: AAAI Press
Pages: 2229-2236
Number of pages: 8
ISBN (Print): 9781577357605
State: Published - Jan 1 2016
