Classification algorithms using adaptive partitioning

Peter Binev, Albert Cohen, Wolfgang Dahmen, Ronald DeVore

Research output: Contribution to journal › Article › peer-review

8 Scopus citations

Abstract

© 2014 Institute of Mathematical Statistics. Algorithms for binary classification based on adaptive tree partitioning are formulated and analyzed for both their risk performance and their friendliness to numerical implementation. The algorithms can be viewed as generating a set approximation to the Bayes set and thus fall into the general category of set estimators. In contrast with the most studied tree-based algorithms, which utilize piecewise constant approximation on the generated partition [IEEE Trans. Inform. Theory 52 (2006) 1335–1353; Mach. Learn. 66 (2007) 209–242], we consider decorated trees, which allow us to derive higher order methods. Convergence rates for these methods are derived in terms of the parameter α of margin conditions and a rate s of best approximation of the Bayes set by decorated adaptive partitions. They can also be expressed in terms of the Besov smoothness β of the regression function, which governs its approximability by piecewise polynomials on adaptive partitions. The execution of the algorithms does not require knowledge of the smoothness or margin conditions. Besov smoothness conditions are weaker than the commonly used Hölder conditions, which govern approximation by nonadaptive partitions, and therefore for a given regression function can result in a higher rate of convergence. This in turn mitigates the compatibility conflict between smoothness and margin parameters.
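The abstract describes set estimators built by adaptively refining a partition of the domain and labeling each cell, so that the union of label-1 cells approximates the Bayes set. As a minimal sketch only (not the paper's algorithm, which uses decorated trees with higher-order polynomial labels and does not require tuning to smoothness or margin parameters), the following toy Python code grows a dyadic tree partition greedily and decorates each leaf with the locally dominant class; all function names and parameters here are illustrative assumptions:

```python
# Toy adaptive dyadic partition classifier on [0, 1]^d (illustrative only).
# Cells are split at the midpoint of their longest axis until a depth,
# size, or purity criterion is met; each leaf carries the majority label,
# so the union of label-1 leaves is a crude set estimate of the Bayes set.
import numpy as np

def grow(X, y, lo, hi, depth, max_depth=6, min_leaf=5):
    """Recursively split the cell [lo, hi]; return a nested dict tree."""
    label = int(y.mean() >= 0.5) if len(y) else 0
    if depth >= max_depth or len(y) <= min_leaf or y.min() == y.max():
        return {"leaf": True, "label": label}
    axis = int(np.argmax(hi - lo))        # split along the longest side
    mid = 0.5 * (lo[axis] + hi[axis])     # dyadic midpoint split
    left = X[:, axis] < mid
    if left.all() or (~left).all():       # degenerate split: stop here
        return {"leaf": True, "label": label}
    hi_l, lo_r = hi.copy(), lo.copy()
    hi_l[axis] = mid
    lo_r[axis] = mid
    return {"leaf": False, "axis": axis, "mid": mid,
            "lt": grow(X[left], y[left], lo, hi_l, depth + 1, max_depth, min_leaf),
            "rt": grow(X[~left], y[~left], lo_r, hi, depth + 1, max_depth, min_leaf)}

def predict(tree, x):
    """Descend to the leaf containing x and return its label."""
    while not tree["leaf"]:
        tree = tree["lt"] if x[tree["axis"]] < tree["mid"] else tree["rt"]
    return tree["label"]

# Hypothetical example: the Bayes set is the half-plane x0 + x1 >= 1.
rng = np.random.default_rng(0)
X = rng.random((2000, 2))
y = (X.sum(axis=1) >= 1.0).astype(int)
tree = grow(X, y, np.zeros(2), np.ones(2), 0)
acc = np.mean([predict(tree, x) == t for x, t in zip(X, y)])
```

A piecewise-constant leaf decoration like this corresponds to the previously studied tree methods the abstract contrasts with; the paper's contribution is to decorate leaves with higher-order pieces, which is what enables the improved rates under Besov smoothness.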
Original language: English (US)
Pages (from-to): 2141-2163
Number of pages: 23
Journal: The Annals of Statistics
Volume: 42
Issue number: 6
DOIs
State: Published - Dec 2014
Externally published: Yes
